Real Coffee with Scott Adams - December 01, 2022


Episode 1944 Scott Adams: The Gaslighting Around Politics, Nutrition And Everything Else


Episode Stats

Length: 1 hour and 22 minutes
Words per Minute: 143.6
Word Count: 11,882
Sentence Count: 952
Misogynist Sentences: 33
Hate Speech Sentences: 26


Summary

A tip inspired by the Instagram account mystepdad55 about how to use social media to your advantage: not just for fun and entertainment, but as a source of useful information and a tool to help you improve your life.


Transcript

00:00:00.000 Good morning, everybody, and welcome to another Highlight of Civilization.
00:00:06.620 Coffee with Scott Adams.
00:00:09.320 For a moment there, I did forget my own name.
00:00:12.000 That's probably not good foreshadowing, is it?
00:00:14.700 But I think we're going to take it up a level,
00:00:17.320 even though maybe you were not expecting such excellence this morning.
00:00:21.600 But, you know, we're going to start fast.
00:00:24.940 And all you need to take it up another level is,
00:00:27.340 do you know, do you know, do you know?
00:00:29.000 Yes, you do.
00:00:30.360 All you need is a cup or a mug or a glass, a tankard, chalice or stein,
00:00:33.960 a canteen, jug or flask, a vessel of any kind.
00:00:36.700 Fill it with your favorite liquid.
00:00:38.100 I like coffee.
00:00:39.860 Join me now for the unparalleled pleasure, the dopamine hit
00:00:43.340 of the day, the thing that makes everything better.
00:00:45.520 It's called the simultaneous sip.
00:00:47.760 And it happens just about now.
00:00:54.400 Ah.
00:00:54.960 The gloriousness of this moment cannot be understated.
00:01:04.540 Or overstated.
00:01:06.820 Choose one.
00:01:08.440 All right.
00:01:08.880 I have a recommendation for you.
00:01:15.000 You know how social media is all evil and it's killing us all?
00:01:19.200 Well, there's an account on Instagram that I'm going to recommend.
00:01:22.900 It goes by mystepdad55.
00:01:25.340 55 might be his age or when he was born.
00:01:31.020 I don't know.
00:01:32.040 But mystepdad55 is literally this friendly guy
00:01:37.240 who just does a video every, looks like every day,
00:01:42.440 in which he teaches you things your father didn't teach you.
00:01:47.020 It's frickin' amazing.
00:01:48.600 It's like one of the best things on the whole internet.
00:01:53.820 And all he does is he's this super, super, like, pleasant guy.
00:02:01.180 You know, you immediately wish this were your dad.
00:02:04.780 He is the most dad, dad of all dads.
00:02:08.560 Like, he takes dadness to levels of excellence
00:02:11.560 that you've never seen before.
00:02:12.820 And all he does is teach you stuff like,
00:02:15.780 all right, I think he's teaching you how to wash the dishes,
00:02:18.380 how to make some meals,
00:02:20.320 how to do some, how to change your wiper blades.
00:02:23.240 I actually didn't know how to change my wiper blades.
00:02:27.400 But he teaches you how.
00:02:28.540 I just have the dealer do it.
00:02:31.920 Let's see what else.
00:02:36.460 So, how to sew by hand.
00:02:38.700 How to sew.
00:02:39.420 How many of you learned how to sew?
00:02:43.180 To, like, fix a sock or something when you were a kid.
00:02:46.500 How many of the men?
00:02:48.500 Talk about the men.
00:02:50.080 How many of the men learned to sew?
00:02:53.340 I did.
00:02:54.720 Yeah.
00:02:55.260 My mother taught me all those skills.
00:02:57.020 Or my grandmother.
00:02:57.880 My mother and my grandmother.
00:02:59.620 Eagle Scout, right.
00:03:01.500 Yeah, I learned all those things.
00:03:02.840 I don't know if anybody learns them now.
00:03:04.100 But it made me think about Instagram and social media,
00:03:10.540 as damaging as they can be,
00:03:13.620 and how they've changed our, let's say, our brains and our attention spans.
00:03:19.000 And it made me think this.
00:03:21.760 Maybe instead of condemning and complaining about social media,
00:03:25.720 we should take a tip from mystepdad55,
00:03:30.940 and use it for information.
00:03:34.820 Because if that's how people want to consume things,
00:03:38.400 here he's made it interesting.
00:03:39.680 Like, the reason I recommend it is not just because the information is good,
00:03:44.360 but he figured out how to make it enjoyable and short.
00:03:47.900 Think about how much you could teach young people and adults, too,
00:03:52.800 if you created content that was meant to be useful
00:03:56.500 and was interesting and short.
00:04:00.200 Think about it.
00:04:01.460 There's almost nothing I wouldn't want to learn
00:04:03.800 if it were interesting and short.
00:04:07.300 One of the things that YouTube does so well
00:04:09.580 is I will watch a YouTube history lesson on almost anything
00:04:17.960 because it's short and entertaining and well-made.
00:04:21.580 But if you said, hey, you learned this from your history book,
00:04:24.020 I'd be like, yeah, you've got other things to do.
00:04:26.960 So I think the potential for Instagram in particular,
00:04:33.520 maybe Twitter later,
00:04:34.980 is some serious, useful information.
00:04:39.900 You've seen all the relationship advice that tends to be terrible.
00:04:43.760 And apparently there was some guy selling some kind of,
00:04:47.480 I don't know, the Liver King, that turned out to be a fake.
00:04:51.180 So there's tons of bad advice on the Internet.
00:04:56.120 But what if it were good?
00:04:58.260 So right now you can't tell the good advice from the bad advice.
00:05:01.920 And free speech means you're not going to change that.
00:05:05.260 But what if you've got good advice on social media?
00:05:09.140 It's possible.
00:05:10.640 There might be some way that that could actually happen.
00:05:12.640 Here's the funniest thing about social media.
00:05:18.320 You know, I talk about the same characters practically every day
00:05:21.700 because they're the interesting ones.
00:05:24.320 If somebody's interesting, I can't help it.
00:05:27.780 They're just out there being interesting.
00:05:30.040 And while I don't personally like Andrew Tate,
00:05:32.460 I am so fascinated by the, let's say, the phenomenon of him.
00:05:39.120 All right?
00:05:39.260 Number one, he does a great job of getting attention and building an audience.
00:05:45.300 Great job.
00:05:46.640 But this is how good a job he does.
00:05:49.800 I just want you to hold this in your head.
00:05:52.240 And I don't need to add any commentary to it.
00:05:55.480 I'm just going to describe it accurately.
00:05:57.440 And it goes like this.
00:06:00.620 So Andrew Tate is, at the moment, one of the most, you know,
00:06:05.300 let's say he's getting a lot of heat on social media, good and bad.
00:06:09.060 But he's getting tons of attention, and his audience is growing and stuff.
00:06:13.500 And he basically is a cigar-smoking, whiskey-drinking guy
00:06:19.200 who doesn't have a stable relationship,
00:06:21.440 who gives young men advice on health and relationships.
00:06:28.340 Do I have to add anything to that to make it funny?
00:06:31.840 That's actually a real thing.
00:06:34.100 And just think about the world you live in,
00:06:36.900 where I don't have to add any punchlines to anything.
00:06:39.320 I can literally just describe it.
00:06:43.700 He's someone who is held up as a role model
00:06:46.120 for health, fitness, and relationship information,
00:06:50.020 who smokes cigars and drinks whiskey on video,
00:06:53.580 you know, as part of his brand.
00:06:56.400 And then he clearly doesn't have any kind of successful relationship in his life.
00:07:03.660 So, oh, and he's becoming Muslim.
00:07:07.040 And he puts his money in depreciating assets, cars.
00:07:14.080 So he has 17 expensive cars.
00:07:17.800 Those are depreciating assets.
00:07:20.280 Worst investment you could possibly make.
00:07:23.060 Do you think you can't get laid with one Lamborghini?
00:07:27.860 Seriously?
00:07:28.920 Do you need 17 sports cars to get laid?
00:07:34.540 They're depreciating assets.
00:07:36.140 It's not going to help you in your relationship to have 17.
00:07:39.920 It's not going to help you financially.
00:07:41.520 It's the worst financial decision anybody ever made.
00:07:44.600 So the other thing he does is give financial advice
00:07:48.220 while he invests in depreciable assets.
00:07:53.380 Okay.
00:07:55.680 So I have nothing but compliments for his skill set.
00:08:00.160 His skill set is really through the roof.
00:08:02.320 I mean, he definitely has the skill.
00:08:04.320 There's no question about that.
00:08:05.680 But it's just, it's insanely educational
00:08:11.080 to know that he can build a brand
00:08:13.640 around telling people the opposite of what they're observing with their own eyes.
00:08:18.560 I don't know how he does it.
00:08:20.900 But he's doing it well.
00:08:22.520 So it's working for him.
00:08:23.500 Here's a story I saw just before I went live.
00:08:28.020 Somebody has developed insects that can eat plastic.
00:08:33.340 And he's modified their gut biome somehow.
00:08:37.740 So they actually digest plastic and turn it back into its organic precursors, I guess,
00:08:44.860 if that's the right word.
00:08:47.040 Now, isn't that great?
00:08:49.360 Imagine if we could solve all of our plastic waste problems with insects that can eat our plastic.
00:08:54.940 That's all good, right?
00:08:59.160 Unless they get out of their little cage.
00:09:03.260 What happens if one gets out?
00:09:06.440 Has anybody noticed that our whole planet is made of plastic?
00:09:09.840 They would basically destroy civilization.
00:09:12.120 They would just eat all their, they'd eat all your car parts
00:09:16.300 and, you know, your computer would be destroyed.
00:09:20.920 It's the best and worst idea I've ever heard in my life.
00:09:24.400 Okay.
00:09:25.400 But that's happening.
00:09:26.420 All right.
00:09:29.000 This is the dog not barking theme for today.
00:09:33.740 The dog not barking, most of you know, comes from a Shakespeare, Shakespeare, not Shakespeare,
00:09:41.380 what the fuck, all British things look alike to me.
00:09:44.640 Sherlock Holmes, not Shakespeare.
00:09:47.740 Sherlock, not Shakespeare.
00:09:49.380 See, it was an obvious mistake.
00:09:51.200 Both British, both smart, both have a sh in their name.
00:09:57.240 It was an understandable mistake.
00:10:00.160 Anyway, yeah, Sir Arthur Conan Doyle was the author.
00:10:04.540 So in the Sherlock Holmes, one of the cases was solved by noting that the dogs were not barking.
00:10:13.460 And instead of noticing something happening,
00:10:16.560 Sherlock Holmes was smart enough to notice something that should have been happening that wasn't.
00:10:20.820 And so this is a lesson to all of you.
00:10:23.660 Don't just look at what's happening.
00:10:25.220 Ask yourself what should be happening that isn't.
00:10:28.760 And this screams out for this Balenciaga story.
00:10:35.500 So as you know, there was one of the photographs in a Balenciaga photo shoot.
00:10:40.980 It's a fashion house.
00:10:42.340 In which some police tape showed Balenciaga, or the B-A-A-L part, misspelled with two A's.
00:10:52.620 B-A-A-L.
00:10:54.560 And now B-A-A-L, coincidentally, or not, happens to be a Canaanite demon god.
00:11:04.660 Pagans ritualistically sacrifice children to this demon.
00:11:08.040 So it's the demon of child sacrifice, right?
00:11:11.960 Now Balenciaga, being accused, rightly or wrongly, we do not know,
00:11:16.780 of putting what appears to be pedophile positive images in their photo shoots.
00:11:23.380 Now that's the accusation.
00:11:24.700 But part of the accusation is that there's no way in the world it could be an accident
00:11:30.520 that this B-A-A-L thing was part of the shoot.
00:11:35.560 And then it turns out that the name Balenciaga, if you break out the parts,
00:11:45.620 and if you make the first part the B-A-A-L, if you turn it into B-A-A-L,
00:11:51.720 then the second part of it, the N-C-A-G-A part, means something like, you know, praising that.
00:11:58.120 So it means like praising the god of child sacrifice.
00:12:01.860 Now, is that a coincidence?
00:12:03.620 Could be.
00:12:06.340 That could be it.
00:12:07.380 Oh, Baal is king, right.
00:12:10.280 Balenciaga apparently means, in whatever language, that that Baal guy is king.
00:12:17.760 Now, could that be a coincidence?
00:12:20.240 Absolutely.
00:12:21.760 Oh, is it Latin?
00:12:22.500 Is it Latin?
00:12:23.640 So could it be a coincidence that the name Balenciaga sounds so terrible in this context?
00:12:31.120 That could be a coincidence.
00:12:32.560 Coincidence, yeah.
00:12:34.160 To me, that would not be a super big coincidence.
00:12:38.060 It would be a funny one, but I would still call it a coincidence, right?
00:12:42.180 We see these all the time.
00:12:44.560 But where's the dog that's not barking?
00:12:47.340 Does anybody see the dog not barking?
00:12:49.120 Or do you hear the dog not barking?
00:12:51.180 What's missing in the story so far?
00:12:53.180 What's the most obvious thing that you would expect to know by now that you do not know?
00:13:05.320 Are you really all missing this?
00:13:07.040 Where the name came from is interesting, but I think that's going to be a dry hole.
00:13:18.960 Thank you, Erica the Excellent.
00:13:22.080 Yeah.
00:13:23.520 Here's what's missing.
00:13:25.360 Erica said who ordered it, like who bought that tape and who put it there.
00:13:30.280 Yes, but I would generalize that a little bit more and I would say, how did we get to today without Balenciaga issuing a statement that says, oh, about that tape?
00:13:43.880 There's an innocent explanation, and here it is.
00:13:49.560 Where is it?
00:13:52.000 Isn't that the most obvious thing that should have happened by now?
00:13:56.880 I mean, it's been days.
00:13:58.460 You don't think Balenciaga knows internally where that tape came from and why it's there?
00:14:04.220 Of course they do.
00:14:05.840 Why are they not telling you?
00:14:07.980 Well, it can't be a good reason, right?
00:14:11.100 Do you think that the heads of Balenciaga know that the tape is a complete coincidence, accidental thing, and they've just decided not to mention it in public?
00:14:23.900 They have a perfectly innocent explanation.
00:14:27.580 They just decided not to tell you.
00:14:30.340 Does that sound right?
00:14:32.000 Now, how many of you didn't notice that dog that wasn't barking?
00:14:35.740 You know, raise your hand in the comments if you didn't notice that the most obvious thing is missing.
00:14:44.220 Which one of your news sources, which one of your news sources, yeah, yeah, I think yay said it.
00:14:51.700 Which one of your news sources told you that's missing?
00:14:55.380 None of them.
00:14:57.280 None of them.
00:14:58.300 Nobody said that's missing.
00:14:59.960 Nobody said we asked for a comment and we didn't get one.
00:15:02.520 Did you?
00:15:03.760 Have you heard anybody say we asked Balenciaga to comment on the B.A.A.L. tape and they decided not to or they gave us this response?
00:15:14.240 Right?
00:15:15.240 Where?
00:15:15.920 What the hell is going on?
00:15:18.260 What the hell is going on that nobody asked the most obvious question?
00:15:24.120 Well, and weirdly, most of you didn't even notice it wasn't asked.
00:15:28.040 You think Kim Kardashian asked?
00:15:32.120 She may have asked, but we didn't hear any answer.
00:15:36.620 Styx did?
00:15:37.460 Of course.
00:15:38.660 Right.
00:15:39.280 Yeah, Styxhexenhammer did because it's an obvious question for anybody who's an informed consumer of news.
00:15:47.400 Give us some news.
00:15:49.740 All right.
00:15:50.340 Well, there's going to be another one of those coming up.
00:15:51.980 It'll be just as shocking.
00:15:53.160 How did we get to 2022?
00:15:58.240 And I actually had an exchange on Twitter this morning with an actual doctor about whether white potatoes are good in your diet or not.
00:16:10.500 How in the world could that still be a question that's up in the air?
00:16:15.440 Now, you might say to me, Scott, the doctor told you they're good.
00:16:21.720 I'm talking about russet white potatoes.
00:16:24.480 The doctor says they're good, and the reason is, apparently they rate high, according to one study at least, in satiety.
00:16:33.820 Satiety.
00:16:35.020 In other words, it satiates you or makes you feel like you've eaten enough.
00:16:39.600 And apparently it's way high on that list on at least one study that somebody showed me.
00:16:46.000 And therefore, eating white potatoes, it's a good, solid element of your diet.
00:16:53.140 How many of you would believe that?
00:16:59.760 Satiety?
00:17:00.880 Satiety.
00:17:03.020 Thank you.
00:17:04.900 It's pronounced satiety?
00:17:07.160 Okay, thank you.
00:17:08.280 All right, something like that.
00:17:11.340 All right, so, but just hold this in your head for a second.
00:17:15.660 Imagine that it's 2022, and we can't agree whether a potato is good for you or bad for you.
00:17:24.040 Now, here's what I've been taught, that potatoes spike your glycemic index, white ones do.
00:17:31.820 The sweet potatoes, not so much.
00:17:34.460 But the white ones spike your glycemic index.
00:17:37.320 And if you spike your glycemic index, it will make you hungrier later.
00:17:43.640 All right?
00:17:44.440 Now, do you think there was anything wrong with the study that says that white potatoes are great for satiety?
00:17:53.560 Satiety.
00:17:54.180 In other words, they satiate you.
00:17:55.860 Do you think there was anything wrong with the study?
00:17:57.420 Well, do you think it looked at potatoes the way we really eat them?
00:18:03.960 Or did it look like a baked potato or a boiled potato?
00:18:09.180 Baked or boiled, I think.
00:18:13.000 And it did not include French fries.
00:18:17.600 Did not include French fries.
00:18:19.220 Did not include any fried potatoes, you know, like hash browns or anything like that.
00:18:24.600 Did not include, or at least I didn't see mentioned, what kind of toppings the potato had.
00:18:30.980 If somebody gave you some French fries, and they checked with you an hour later and said,
00:18:38.060 would you like some more French fries, you'd probably say yes.
00:18:40.660 I wouldn't mind another French fry, no matter how full I was, because I really, really like them.
00:18:46.540 But suppose somebody said to you, this is a trial.
00:18:49.360 We'd like you to eat this baked potato.
00:18:51.580 And then you say, hey, excellent, I like potatoes.
00:18:54.940 And what am I putting on it for topping?
00:18:58.240 Sour cream and butter and all that?
00:19:01.760 And they go, no, no, no.
00:19:02.880 We're only testing the potato.
00:19:05.620 And you say, what?
00:19:07.280 Yeah, just the potato.
00:19:09.040 So just eat the potato.
00:19:10.940 And then we're going to check with you an hour later to see if you want some more potato.
00:19:15.740 So you eat the potato with nothing on it.
00:19:21.580 Okay, I'm glad they're paying me to be in this trial.
00:19:26.220 Okay.
00:19:26.880 And then they check with you an hour later and they say, hey, Scott, would you like some more potato?
00:19:32.660 You know what I'm going to say?
00:19:34.100 I'm good.
00:19:35.460 I am so satiated with potato.
00:19:39.200 I could not possibly eat another potato.
00:19:42.020 Man, those potatoes hit the spot.
00:19:44.020 Don't get me near another potato.
00:19:45.880 No, no, no.
00:19:46.980 I've had all the potatoes I need for a week.
00:19:49.520 Oh, I'm good on potato.
00:19:51.580 In the real world, nobody eats a plain potato ever, ever, ever.
00:20:01.600 Now, here's my bullshit detector rule, which I wrote about in Loserthink, I think.
00:20:09.020 Here's the rule, and it goes like this.
00:20:10.940 One of the red flags that doesn't tell you for sure if something's bullshit, but it tells you to look deeper, is if your observation, your common lived experience, is at odds with the science.
00:20:25.080 Whenever that's true, that should raise a flag and say, wait a minute, is my observation wrong or is the science wrong?
00:20:33.120 Now, my observation about eating white potatoes is if I put them in my diet, I gain weight immediately, and if I take them out of my diet, I immediately lose weight.
00:20:44.340 And I've been doing it for years, for years, for years, all I have to do is put potatoes in my diet, weight creeps up, take them out of my diet, weight goes down, and it's just one-to-one.
00:20:57.280 Now, if you're telling me that 50 years of experimenting have all been wrong, would you be surprised to know that even in that study that shows potatoes are great at satiating you, it says that other studies say the opposite?
00:21:19.620 No, of course you would not be.
00:21:20.760 That wouldn't surprise you at all, because science on food is complete bullshit.
00:21:28.000 Now, who was it who funded the study on satiety of potatoes?
00:21:33.200 Who do you think funded the study?
00:21:35.480 I don't know.
00:21:36.820 I didn't look.
00:21:37.900 Do you think it was big potato?
00:21:40.100 Do you think there was anybody else in the whole fucking world who cared about the potato study?
00:21:46.880 No.
00:21:47.220 No, it wasn't big carrot.
00:21:50.580 It wasn't the carrot people who said, you know, I'd really like to fund a big study to find out if potatoes that we do
00:21:57.280 not sell are good for people.
00:22:00.200 No.
00:22:01.100 It was probably big potato.
00:22:05.440 Right?
00:22:06.640 Like, who else would even care?
00:22:09.320 Do you think anybody else cared?
00:22:10.740 Do you think the scientists were sitting around saying, all right, we got to, let's come up with an idea.
00:22:15.520 I'm really concerned about potatoes.
00:22:17.480 I've looked at all the problems in the world, all the many things we could concentrate on, and I'm thinking a little bit more information about white potatoes would be where I'd like to put all of my scientific expertise.
00:22:30.280 No, somebody was willing to pay somebody to study a white potato.
00:22:36.260 Who?
00:22:37.240 Who would be willing to pay you to study a white potato?
00:22:40.220 There's literally nothing you could depend on, on food science.
00:22:48.120 Many of you know my story.
00:22:50.100 Years ago, I tried to start a healthy food product company to make a burrito that had all of your daily vitamins and minerals.
00:22:59.380 And so, of course, I wanted to be as well-informed as I could to know what was the right recommendation of those things.
00:23:06.420 And do you know what I found out?
00:23:08.440 Very quickly, that food science is not science at all.
00:23:14.700 It's not even close to science.
00:23:16.700 Pretty much everything you know about food is a guess or bullshit or somebody paid for a fucked up study.
00:23:23.380 None of it's true.
00:23:24.160 The only, as far as I can tell, the only stuff that's true is the stuff you can immediately test on your own body.
00:23:31.860 Right?
00:23:33.000 You know that if you eat a bunch of sugar, you feel different.
00:23:36.920 You know that.
00:23:38.260 You know if you drink a lot of alcohol, you feel different.
00:23:41.180 And you look different.
00:23:42.780 Right?
00:23:43.480 You know if you eat clean, you feel different and you look different.
00:23:47.960 You can see it.
00:23:49.740 You can see it with your own eyes.
00:23:51.440 And when the stuff you can see with your own eyes differs from the science, which one are you going to believe?
00:23:59.280 If it were a different topic, I might say believe the science.
00:24:02.820 You know, because science is the way that you make sure that your personal observations are not biased.
00:24:08.740 But the personal observations are so consistent.
00:24:12.260 And the science is so bullshit and not credible.
00:24:15.460 That I would actually believe my personal observation when it comes to food.
00:24:20.540 I would actually believe that over science.
00:24:23.620 Because I don't think there's any reliable science on food right now.
00:24:28.360 All right.
00:24:30.020 Let me tell you something that happened years ago.
00:24:33.100 I dated a woman who was just crazy.
00:24:37.620 And I'll put that in quotes, crazy.
00:24:40.360 About eating clean.
00:24:43.180 Eating food that was organic and had no additives and no preservatives.
00:24:47.600 And I've got to tell you, she was the healthiest looking human being I've ever seen in my life.
00:24:54.840 Like she screamed healthy and her diet was just perfect.
00:25:00.040 Now, you want to make a billion dollars in the food business?
00:25:04.140 I'm going to give you an idea for making a billion dollars.
00:25:06.900 And this will be a persuasion tip.
00:25:11.440 Start a company that's organic food with no preservatives.
00:25:16.780 And mostly it's just fresh stuff.
00:25:18.560 It's like a farmer's market, but it's there every day.
00:25:21.200 So you're getting like direct from the farm.
00:25:23.280 Just like a farmer's market, but it's just there all the time.
00:25:26.800 And here's how you totally put every other company out of business.
00:25:32.280 You name it, clean food.
00:25:35.800 Clean food.
00:25:36.680 And you mean it two different ways.
00:25:39.480 The food is literally washed.
00:25:41.580 And nobody can get near it until they're buying it.
00:25:45.520 How much do you not love the fact that when you go to the supermarket,
00:25:49.800 everybody is standing next to the broccoli that you're going to pick up and eat?
00:25:53.920 I don't like it.
00:25:55.280 I don't like everybody breathing on my broccoli before I buy it.
00:25:58.600 Even if I wash it.
00:25:59.780 I don't like that.
00:26:00.960 So suppose you said all of our food is behind glass.
00:26:03.680 But if you reach through with, I don't know, with your plastic gloved hand or something,
00:26:09.780 you can put it in your basket.
00:26:11.300 So imagine that, first of all, nobody else can touch your food after it's been washed.
00:26:17.340 So the company washes it.
00:26:20.360 Company washes it.
00:26:21.700 And then you can get it.
00:26:23.280 Now, hold on.
00:26:24.080 I haven't made my whole case there, so don't get ahead of me.
00:26:26.920 Secondly, you say that your food is clean because it doesn't have preservatives or additives.
00:26:31.960 Now, here's why this would put every other food company out of business.
00:26:38.720 Because as soon as you create that frame that some food is clean and some is not,
00:26:44.900 you can never eat the other food again.
00:26:48.420 It would be the kill shot of all kill shots.
00:26:51.340 So do you want some clean food or, you know, the other kind?
00:26:55.860 Just think about it.
00:26:56.960 But everybody, we're all designed to prefer clean food, meaning it's as much food as it could be
00:27:04.420 and there's less entertainment and less additives and stuff.
00:27:08.140 If you named your company clean food, you would just put everybody out of business.
00:27:14.460 Think about it.
00:27:15.040 Now, do I believe that you should avoid all of those germs?
00:27:20.120 No.
00:27:21.120 Nope.
00:27:21.780 I've told you before that a baseline level of germs in your life is something you should see as creating good health.
00:27:29.940 They make you healthier because they challenge your system on a regular basis.
00:27:34.200 So what's healthiest for you is that your food is a little bit imperfect, but that's hard to sell.
00:27:43.000 All right.
00:27:44.020 Do you remember a long time ago, I've said this so many times, you're bored with it,
00:27:47.460 but I told you the reason we can't get AI to sound exactly like a person
00:27:51.860 is that we don't realize that AI is already smarter than us.
00:27:56.820 And in order for AI to act like a person, it would have to act like an asshole who didn't know too much.
00:28:01.500 And then you'd think, oh, that's a real person.
00:28:04.720 And you can't fool me.
00:28:05.880 That sounds real to me.
00:28:07.340 That person's a jerk, very selfish, and seems to only care about themselves.
00:28:13.440 Yeah, that's a person.
00:28:14.700 And I actually heard an AI expert finally agree with me.
00:28:18.960 An AI expert said, one of the ways that you could determine if you were talking to an AI
00:28:25.600 is you could ask it a hard question.
00:28:27.660 And I thought, oh, shit, that's true.
00:28:33.260 Because AI can answer hard questions.
00:28:36.160 So if you said to the AI, all right, what's the capital of Elbonia?
00:28:42.440 And the AI would know, but no human would.
00:28:46.540 So you could immediately detect an AI just by asking any hard question.
00:28:50.900 And unless it lied, you'd say, I know a person wouldn't have known the answer to that.
00:28:57.260 Have you seen the interview on the street series where there's a guy who asks, usually young people,
00:29:03.860 asks them questions about general knowledge questions about the country?
00:29:07.760 And then whatever they say, no matter how ridiculous, he goes, right.
00:29:13.480 And he agrees with them, and that's the punchline.
00:29:16.020 And then they feel like they got it right.
00:29:17.920 It's hilarious every time he does it.
00:29:19.960 He says right, right?
00:29:21.460 No, he says yes.
00:29:22.420 He says yes.
00:29:23.560 So he'll go up to somebody and say, can you name two states in the United States?
00:29:29.220 And they'll be like, oh, two states, wow.
00:29:32.020 States, there's a New York.
00:29:35.000 Is New York one?
00:29:36.400 And he'll say, yes, name another.
00:29:38.380 And they'll be like, Philadelphia?
00:29:42.220 Philadelphia is a state?
00:29:43.220 And he'll look at them and go, yes.
00:29:47.880 It never gets less funny.
00:29:50.260 Every single time he looks at them and goes, yes.
00:29:54.460 Because you watch the person being so happy that they got the answer right, you know, on film.
00:29:59.680 It's diabolical.
00:30:01.680 It's diabolical.
00:30:03.040 Anyway.
00:30:06.480 Mayor Eric Adams, no relationship to me, said that they need to remove the seriously mentally ill people from the streets.
00:30:15.600 So they're going to remove all the seriously mentally ill people from the streets.
00:30:22.000 Now, here's a question.
00:30:26.400 Did somebody just have that idea?
00:30:29.600 Is that an idea that nobody had until now?
00:30:33.020 Or is that an idea that Dr. Drew has been screaming at the top of his lungs for, I don't know, five years that I've noticed anyway?
00:30:41.400 Probably longer.
00:30:42.900 That you have to physically take them off the streets if they're mentally ill.
00:30:46.760 And maybe the drug addicts as well.
00:30:48.460 Because there's some people who just can't handle it, can't take care of themselves.
00:30:52.280 So, when I was reading this story, I had the following thought.
00:30:58.980 If you were to collect up all of the street people, and then you were to poll them, how many would be Republicans?
00:31:05.900 Have you ever thought of that?
00:31:13.540 It might be zero.
00:31:17.740 25%.
00:31:18.180 It might be zero.
00:31:20.240 Have you ever heard of a Republican street person?
00:31:25.800 I don't know.
00:31:26.560 I don't think I have.
00:31:28.540 Why is that?
00:31:29.380 Do you have any idea why that is?
00:31:35.540 David DePape.
00:31:36.620 I don't know.
00:31:38.420 Maybe.
00:31:39.380 Veterans.
00:31:40.020 Maybe.
00:31:41.340 Maybe.
00:31:43.840 Well, it could be that the homeless are not very political, so it's hard to say.
00:31:48.660 But the other possibility is if you're a Republican, you probably come from a Republican family.
00:31:54.020 Maybe there's more support there or something.
00:31:55.900 Maybe there's more tough love.
00:31:57.400 I don't know.
00:31:57.760 I just think it's the homeless are mostly Democrats, it feels like.
00:32:05.400 So, I'm not sure that the homeless situation is a homeless situation as much as it's a homeless Democrat situation.
00:32:13.180 So, suppose Republicans started framing it as homeless Democrats.
00:32:19.100 What would that do?
00:32:23.100 Number one, it would make it look like it wasn't the Republican problem to solve.
00:32:30.220 Right?
00:32:32.180 If the problem is homeless Democrats, is that for the Republicans to solve?
00:32:38.160 Because the Republicans just move out of the city.
00:32:40.460 They do solve it.
00:32:41.320 Republicans do have a solution for homelessness.
00:32:44.260 They move.
00:32:46.000 Am I wrong?
00:32:47.640 Every Republican has a solution for a situation that isn't good.
00:32:52.000 They fix it or they move.
00:32:55.480 So, it's not a Republican problem because Republicans have a solution.
00:32:59.840 It's called moving.
00:33:01.220 Or carrying a gun.
00:33:03.120 Maybe carrying a gun.
00:33:04.100 But Republicans have solutions for their own problems.
00:33:08.100 They're taking care of it.
00:33:09.320 The government doesn't need to solve all their problems.
00:33:12.160 Do you know why I live where I live?
00:33:14.660 One of the biggest reasons?
00:33:16.380 Part of it is I live in the most survivable place in the world.
00:33:21.560 Because it's not so cold outside that I'll die from the elements most days.
00:33:25.660 And I'm not so close to the ocean that I'll get killed by a tidal wave.
00:33:30.680 I don't have hurricanes.
00:33:32.760 I don't have snowstorms.
00:33:35.260 I do have earthquakes.
00:33:36.700 But I'm earthquake proof and I'm not on a fault.
00:33:40.100 So, basically, I've, you know, picked a life that's the safest it could be.
00:33:47.100 But one of the main things I picked was I was far away from homeless people.
00:33:52.740 Far away from homeless people.
00:33:54.140 Far away from homeless people.
00:33:55.700 Intentionally.
00:33:57.140 And so I just said, huh, if I don't want to live around homeless people, what can I do about that?
00:34:04.180 Can I change all the homeless people into non-homeless people?
00:34:07.940 Well, if I could, if I could, and that was good for them, I would.
00:34:11.960 But I can't.
00:34:13.680 So I'm solving my problem.
00:34:16.460 Now, did I not solve my problem?
00:34:19.000 Did I not solve my homelessness problem?
00:34:21.640 Yeah, I moved away from it.
00:34:22.660 And when people say, Scott, how would you like to go to San Francisco for some fun?
00:34:28.240 I say, no, no.
00:34:30.200 How about we don't go to San Francisco and that's our fun?
00:34:33.720 How about instead of going to the pumpkin patch or some fucking thing that nobody, no male wants to do,
00:34:41.820 how about we just don't do that?
00:34:43.660 How about that?
00:34:44.640 Solve my problem.
00:34:46.300 Problem solved.
00:34:46.900 I saw the funniest stand-up comedy about some guy who said that no guy wants to go to the pumpkin patch.
00:34:56.640 Is that true?
00:35:02.100 That no husband wants to go to the pumpkin patch?
00:35:05.260 Oh, we'll go.
00:35:06.500 We will go to the pumpkin patch.
00:35:08.560 Uh-huh.
00:35:09.200 We will.
00:35:10.000 But no man wants to go to the pumpkin patch.
00:35:12.840 Because once you get to the pumpkin patch, it's just looking at pumpkins.
00:35:16.840 And it's not really interesting.
00:35:18.920 All right.
00:35:22.520 I saw an article on CNN that correctly, to their credit, criticized the fact that Merriam-Webster has entered gaslighting into the dictionary,
00:35:31.940 but they did it wrong, which I should have mentioned, but CNN's opinion piece did.
00:35:40.240 And here's what's wrong with the Merriam-Webster dictionary.
00:35:48.240 Gaslighting is not just lying.
00:35:50.820 Now, you probably remember I tried to tell you this and I just gave up.
00:35:55.400 Because common usage does change the word, right?
00:36:00.220 So gaslighting used to mean not just lying, but lying about something that's so obviously a lie that you question your own sanity, because someone could say it with a straight face.
00:36:13.360 So it's the questioning your own sanity that makes it gaslighting.
00:36:16.920 Otherwise, we already had a word for lying.
00:36:19.560 The word for lying was lying.
00:36:22.560 We didn't need a new word.
00:36:24.220 So gaslighting was not a synonym for lying.
00:36:26.920 It was a whole different concept.
00:36:28.200 But because the Democrats used it for three years against Trump for just regular lying,
00:36:35.460 Merriam-Webster, I think, finally just gave up and said, all right, well, I guess that's the definition now.
00:36:39.840 So now it's closer to regular lying.
00:36:43.560 So I'm going to, you know, as an author, I accept that language changes.
00:36:48.340 And I guess this is a change that happened.
00:36:50.300 You know, it wasn't my first choice.
00:36:53.040 But it looks like it happened.
00:36:54.440 However, we should not lose sight of the fact that the original gaslighting is still a thing, even if it lost its word.
00:37:04.920 It no longer has its own word, but it's still a thing.
00:37:08.220 And here are some examples of it in the news.
00:37:11.640 So what is her name, the spokesperson, Biden's spokesperson?
00:37:19.760 What's her name?
00:37:24.300 Karine Jean-Pierre.
00:37:28.000 Now, the reason I cannot remember her name, if you haven't noticed, she has a three-name name, which, first of all, is abusive.
00:37:36.480 That's abusive.
00:37:37.220 You know, I can remember one name, sometimes two.
00:37:40.880 You tell me I have to remember three names, and I'm probably out.
00:37:45.540 And I'm never, and I'm never, let me say two things.
00:37:50.320 If you have a name that could be pronounced, plausibly pronounced two different ways, that's not my fault.
00:37:57.040 That's not my fault.
00:37:58.980 All right.
00:37:59.300 And likewise, if you have a name where all three of your names are really a first name, right?
00:38:09.100 Because Karine is a first name, Jean is a first name, and Pierre is a first name.
00:38:14.340 And Jean and Pierre are a guy's first name.
00:38:18.140 So, I mean, it's really hard for me to store this memory.
00:38:22.020 You know, everybody stores memory differently.
00:38:23.580 Like, I never store exact memories, like phone numbers.
00:38:27.680 I can't remember names in phone numbers, if they're unusual.
00:38:30.780 But I can remember anything that fits any kind of a familiar form.
00:38:35.980 Because first I'll think of the familiar form, and then it will remind me of the specific.
00:38:41.740 So if somebody has three first names, I can't remember that.
00:38:46.280 Because my brain says, all right, what's her last name?
00:38:48.780 And then there is none.
00:38:49.560 So it doesn't fit into my form, so I can't remember it.
00:38:54.100 So it's not my fault.
00:38:56.000 But anyway, she was asked about when was Biden going to visit the border.
00:39:02.580 And she said, he's been there.
00:39:06.940 He's been to the border
00:39:08.720 since he took office.
00:39:08.720 Now, apparently, that is the world's easiest thing to fact check.
00:39:12.820 He has not been to the border since he took office.
00:39:15.660 And yet she says it, like, straight out, and then doesn't correct it.
00:39:22.020 No correction.
00:39:24.520 That's very close to gaslighting.
00:39:27.320 Now, it's just short of intentionally trying to make you feel insane.
00:39:32.580 But it's a weird kind of lie.
00:39:36.100 Now, I think Trump and maybe even Biden were, you know, pioneers in this kind of a lie.
00:39:43.280 The kind of a lie where it's easy to know it's a lie, it still works.
00:39:47.920 It still works.
00:39:49.340 Because the Democrats who watch that, what do you think the Democrats concluded when she said,
00:39:54.540 he has been to the border?
00:39:56.380 I'll bet 100% of Democrats, 100% said, oh, well, he's been to the border.
00:40:01.840 I guess that was a dumb question, Fox News.
00:40:04.580 Don't you think?
00:40:05.180 Do you think that there was even one Democrat in the entire United States
00:40:10.540 who saw her claim with a straight face, he's been there since he was elected?
00:40:15.580 I'll bet there wasn't even one Democrat who questioned that.
00:40:19.120 Do you?
00:40:19.900 Like, literally, I'll bet not even one.
00:40:22.260 Maybe within the professional class.
00:40:24.880 But within normal voters, I'll bet not even one could tell the difference.
00:40:29.960 That's my bet.
00:40:30.600 And of, you know, 70, 80 million people, I bet not even one knew the truth.
00:40:36.680 How many of the Republicans knew the truth?
00:40:40.980 Probably half.
00:40:42.800 There's a really big difference in knowledge.
00:40:45.300 All right.
00:40:50.940 Kristi Noem, governor of South Dakota, is banning TikTok, the app, on state devices.
00:40:59.100 So individuals can still use TikTok in South Dakota,
00:41:02.580 but not government agencies on their government equipment.
00:41:06.800 Now, what does that remind you of?
00:41:13.660 What is my analogy for what Christy Noem just did?
00:41:17.860 She did something that dissentists does all the time.
00:41:23.240 What's the free money?
00:41:24.480 Thank you.
00:41:24.900 Yes.
00:41:25.520 She picked up the free money.
00:41:28.200 Who exactly was going to argue that government employees should have TikTok on their government devices?
00:41:35.200 Was there somebody going to come out and say,
00:41:37.320 hold on, hold on, governor.
00:41:40.360 We would like to use TikTok on our government phone.
00:41:43.720 Like, nobody.
00:41:46.280 Literally nobody could make that argument.
00:41:48.740 This was free money.
00:41:50.540 There will be people who agree with her,
00:41:52.820 and then people who shut the fuck up,
00:41:55.020 because they don't want to argue the case, even if they think they should.
00:41:58.200 This is the freest of free money.
00:42:01.140 Right?
00:42:01.780 Now, remember, I told you this when Trump started doing it early in, you know, 2015, 16.
00:42:08.060 He always picks up the free money.
00:42:10.440 He stopped doing that, by the way, which concerns me.
00:42:13.840 But DeSantis, he was picking up free money every day.
00:42:17.620 Free money?
00:42:18.320 Sure.
00:42:19.340 And nobody's going to say this thing out loud,
00:42:21.180 and I get free money if I say it out loud.
00:42:23.020 I'll just say it out loud.
00:42:24.160 And then he gets his free money.
00:42:26.380 So that's a real good sign for a future leader,
00:42:30.960 if they can at least recognize the easy stuff.
00:42:34.620 Right?
00:42:35.280 Here's another one.
00:42:36.220 Apparently, the Department of Agriculture, the federal government,
00:42:40.840 is still buying and using Chinese drones for their agricultural surveying.
00:42:46.320 And we know that the Chinese-made drones can send their data back to China,
00:42:51.880 and they can have all kinds of, you know, information.
00:42:55.800 Now, I don't know if the agricultural information really is helpful to...
00:43:02.520 I mean, I'm not sure that China can get enough from, you know,
00:43:05.600 the information from a drone over a farm.
00:43:08.560 I don't know if that helps them or not.
00:43:10.560 But don't you think we should ban that?
00:43:13.340 Don't you think you'd like to know who is the biggest drone manufacturer
00:43:20.260 that's completely American?
00:43:23.620 Why don't you know that?
00:43:26.100 Why are we sitting here, and if I said to you,
00:43:29.020 oh, the biggest drone maker by far is DJI?
00:43:34.240 Is that the name of the Chinese company?
00:43:36.680 I think it is.
00:43:37.280 But what's the name of the biggest American drone maker?
00:43:43.340 I don't know.
00:43:47.240 And why don't you know that?
00:43:50.060 Yeah, well, I'm talking about the hobby-sized drones,
00:43:53.040 the ones that an individual can purchase.
00:43:55.420 Yeah, I'm not talking about the big defense contractors.
00:44:01.180 So DJI is the Chinese manufacturer.
00:44:04.740 But what's the biggest American-only manufacturer of drones?
00:44:08.660 You don't know.
00:44:10.280 Why don't you know?
00:44:11.260 Let me ask you this.
00:44:14.940 All of you are smart enough by now.
00:44:17.180 You're all well-informed enough to know that if you had a choice of buying an American drone
00:44:22.660 versus a Chinese drone, you might even pay more to get the American drone.
00:44:27.800 Am I right?
00:44:28.360 You would pay a little bit more to get the American one because you might think that it's a little more data secure.
00:44:35.160 So what's the name of that company?
00:44:39.780 So China wants to sell you Chinese drones, and your press has never told you the name of their competitor.
00:44:46.200 Think about it.
00:44:48.920 Think about the dog that's not barking.
00:44:51.680 It's a huge story.
00:44:53.440 You see it in the headlines everywhere.
00:44:55.620 People should not buy Chinese drones from a company named DJI.
00:45:00.200 And what's the name of the other company they should use?
00:45:03.360 I don't know.
00:45:03.920 I've never heard it.
00:45:05.720 Do you think that's an accident?
00:45:07.960 Is it an accident that none of the stories about the drones that you should not buy includes any reference to the one that maybe you should buy?
00:45:17.600 What the hell is up with that?
00:45:19.060 No, I don't think Lockheed, Boeing, and Raytheon, I think they make the big drones, the military drones.
00:45:29.660 I'm not talking about that.
00:45:31.120 I'm talking about the little ones that fly over your house.
00:45:35.040 Is Acme real?
00:45:36.580 Or are you joking?
00:45:38.040 I'm seeing people say Acme drones, but Mattel?
00:45:42.500 No, I'm talking about serious industrial drones, but not military.
00:45:49.060 But seriously, is anybody's mind blown that you don't automatically know the name of the competitor to DJI?
00:45:59.000 Never even heard it.
00:46:01.320 Now, some of you are saying Exo, but I've never heard that name.
00:46:05.920 E-X-O?
00:46:07.960 So that's the name of a drone.
00:46:10.400 So you'd have to be sort of in the drone business or a hobbyist to actually have heard that.
00:46:15.940 So it looks like E-X-O.
00:46:17.060 Well, let's do a little search on that.
00:46:21.300 Okay?
00:46:21.880 Let's find this out right now.
00:46:24.880 So E-X-O drones.
00:46:28.260 E-X-O drones.
00:46:31.480 Give them a little commercial here if this is what I think it is.
00:46:35.840 Yeah, so that is EXO Drones' official site.
00:46:38.440 And let's see if the drone, let's see if it comes up.
00:46:47.880 Okay.
00:46:48.460 Yep.
00:46:48.840 There we go.
00:46:49.960 So you say this is an American company, right?
00:46:52.540 E-X-O?
00:46:55.640 Can you confirm this is an American company?
00:46:57.900 I don't want to mislead you.
00:46:58.860 Well, you can check for yourself, but it's amazing.
00:47:05.540 You know, we didn't all know that at the top of our, tip of our tongues.
00:47:09.960 All right.
00:47:10.320 But, so Musk apparently met with Apple's Tim Cook, and the outcome is that Tim Cook says
00:47:21.160 Apple never was considering banning Twitter from the App Store.
00:47:25.800 They never considered banning it from the App Store.
00:47:32.140 And it took, wait a minute, what do I hear?
00:47:36.240 I hear something.
00:47:37.620 Ah, yes.
00:47:38.760 It's the dog not barking.
00:47:41.300 Can you tell me how many days it took Apple to respond that they were not considering taking
00:47:47.560 Twitter out of the store?
00:47:49.140 How hard was it for Tim Cook to say, oh, no, we're not considering that?
00:47:55.800 Three days, right?
00:47:58.320 So for three days, Tim Cook was never asked by any journalist, was this true?
00:48:04.920 There was no journalist in the entire world who asked Tim Cook, hey, is this true?
00:48:10.120 In three days.
00:48:13.000 Come on.
00:48:14.860 Come on.
00:48:16.540 Come on.
00:48:19.060 Really?
00:48:20.540 Really?
00:48:22.180 Like, what's going on here?
00:48:23.480 Now, did you also notice that dog not barking?
00:48:28.040 Did you notice that for three days, the simplest thing that a person could do?
00:48:32.880 Do you know how easily Tim Cook could have fixed it?
00:48:36.060 Oh, looks like there's a rumor that's not true.
00:48:38.720 Well, let me open up my Twitter app.
00:48:41.300 At Elon Musk.
00:48:42.960 That's a false rumor.
00:48:44.140 We are not considering banning Twitter.
00:48:47.720 One tweet.
00:48:48.500 And would anyone have said, oh, that's unprofessional?
00:48:54.540 No.
00:48:55.620 That would be exactly the way to do it.
00:48:58.200 One tweet?
00:48:58.900 No, no, that's not true.
00:49:00.580 Has anybody ever tested the theory that you can shoot down a rumor quickly with one tweet?
00:49:07.080 Yes.
00:49:08.040 Elon Musk does it almost every day.
00:49:10.960 Almost every day, Elon Musk tweets, that's a rumor, that's not true.
00:49:15.220 And as soon as you read it, do you say to yourself, oh, he's lying?
00:49:20.720 Never.
00:49:21.960 Never.
00:49:22.780 You read it, you go, oh, that's true.
00:49:25.880 He just shoots down the rumors, like, with no ambiguity.
00:49:29.560 Nope.
00:49:30.200 Never happened.
00:49:31.580 Who do you believe?
00:49:32.380 I always believe Elon Musk, every time.
00:49:34.880 Because when somebody is that quick to shut something down and they don't leave any ambiguity,
00:49:40.600 no, that did not happen.
00:49:42.600 You know, there's no wiggle room there.
00:49:43.840 As soon as I hear that, I go, okay, that didn't happen.
00:49:47.280 And Tim Cook didn't think that watching half of the United States say they were going to
00:49:54.440 throw their iPhones into the sea, like that didn't get him going in two days?
00:50:01.000 He had something better to do?
00:50:03.700 All right, there's something very unexplained.
00:50:06.840 Here's what I think it is.
00:50:08.360 Let us speculate.
00:50:09.640 What is one sentence that's always true and always false at the same time?
00:50:18.620 Now, I asked this on Twitter.
00:50:20.020 I got a whole bunch of funny answers that are worth looking at, by the way.
00:50:22.900 Look at that tweet.
00:50:23.680 You'll see a whole bunch of funny answers.
00:50:25.400 Things that are true and false at the same time.
00:50:28.120 And here was my entry.
00:50:31.520 This is always true and always a lie at the same time.
00:50:34.560 We were considering it.
00:50:37.680 We considered it.
00:50:39.460 It's always true and it's always a lie.
00:50:44.060 Always true and always a lie.
00:50:46.360 Do you know why I know this for sure?
00:50:49.720 I made a very big mistake once when talking to my doctor years ago.
00:50:54.660 I forget what the context was.
00:50:57.720 This was years ago.
00:50:59.360 And the doctor asked, had I ever considered ending my own life?
00:51:07.180 And unfortunately, I didn't think this through.
00:51:11.700 And so I said, yeah, every day.
00:51:15.040 Of course.
00:51:16.620 I considered every day.
00:51:17.980 So that immediately got me into the mental health process.
00:51:24.460 I managed to get myself out of it.
00:51:27.060 But it immediately puts you into the mental health process.
00:51:31.720 And now you've got to talk to a psychologist and you'd better get on drugs.
00:51:35.280 And, you know, basically everything's in question at this point.
00:51:38.740 Right.
00:51:39.460 Stop everything.
00:51:40.500 They're going to have to help you.
00:51:41.600 And then I said, hold on, hold on.
00:51:44.260 We must be using language differently here.
00:51:46.560 You asked me if I ever considered it.
00:51:49.560 And I answered you honestly every day.
00:51:52.420 And then I decide that it would be a bad idea.
00:51:54.940 And then I go on with my day.
00:51:56.620 I consider everything every day.
00:51:58.920 When I run into somebody in the street, I think about killing them.
00:52:02.580 I've just never been serious about it.
00:52:05.480 If I see an attractive woman, I think about having sex with her.
00:52:09.880 I just don't act on it.
00:52:11.920 I consider everything.
00:52:13.500 Considering is just my mental internal process.
00:52:18.000 Considering has nothing to do with the likelihood of something that's going to happen.
00:52:21.840 Those are completely unrelated concepts.
00:52:24.460 Yes, I literally consider everything.
00:52:28.280 And then I choose what makes sense.
00:52:31.360 So did Apple ever consider knocking Twitter off?
00:52:36.140 Of course they did.
00:52:37.920 Of course they considered it.
00:52:39.600 The same way I consider ending my life every day, even when I'm nowhere near any kind of suicidal thoughts.
00:52:46.820 So just to be clear, I have zero intention of ending my life.
00:52:50.780 Zero.
00:52:52.100 But do I consider it?
00:52:54.220 Yeah, every day.
00:52:55.580 Every single day.
00:52:56.900 Along with every other possibility in my life I consider.
00:53:00.920 Everything.
00:53:01.440 So I think it's true that Apple considered it.
00:53:06.340 And I think that when they were asked, are you serious?
00:53:10.140 They said, no.
00:53:12.520 Which could be construed as not considering it.
00:53:16.780 So there are things that you consider and you also don't consider them.
00:53:21.520 Because you're meaning the terms in slightly different ways, right?
00:53:25.780 Considering means I thought about it in my mind.
00:53:28.560 But in your mind it might mean I took it seriously.
00:53:30.680 That's different.
00:53:31.940 I think that's all that happened.
00:53:33.440 But I don't know.
00:53:34.600 Just a guess.
00:53:38.860 All right.
00:53:40.940 Lee Zeldin said the smartest thing that a Republican has said in, I don't know, a year.
00:53:49.220 Maybe a year.
00:53:51.120 Every once in a while you'll see a Republican say exactly the right thing.
00:53:54.640 And honestly, I'm surprised.
00:53:57.640 Because Republicans are not super good at messaging.
00:54:00.640 Nobody is really.
00:54:01.880 None of our politicians are, except maybe AOC every now and then and Trump every now and then.
00:54:07.200 But here's Lee Zeldin getting it exactly right.
00:54:11.780 Here's a tweet.
00:54:12.460 He says, whenever they, they, meaning Democrats, whenever they propose ballot harvesting, totally oppose it.
00:54:20.620 Whenever they pass ballot harvesting, do it so much better than them that they deeply regret ever passing it in the first place.
00:54:28.880 All right.
00:54:33.520 Standing ovation.
00:54:40.880 So, here's the answer to your question.
00:54:43.520 Did you ever wonder how Lee Zeldin did so well?
00:54:48.740 He lost.
00:54:49.920 But he came incredibly close in New York.
00:54:53.460 Did you ever wonder how a Republican could get that close to a Democrat state governorship?
00:55:00.580 This is why.
00:55:02.720 This is why.
00:55:04.500 Everything you need to know about Lee Zeldin is in this one tweet.
00:55:08.660 He just told you, I'm smarter than all the other Republicans.
00:55:12.520 That's what I heard.
00:55:14.400 I just heard, I'm smarter than all of the other Republicans.
00:55:17.980 Because I just told you exactly what to do, which is the right thing to do.
00:55:23.380 That is leadership.
00:55:26.120 Right?
00:55:27.220 That's leadership.
00:55:28.600 This is exactly, exactly the leadership I want.
00:55:32.020 And you know what?
00:55:33.180 I would agree with this, whether I'm Democrat, Republican, or Independent.
00:55:40.920 Because we have a competitive system.
00:55:42.740 Our competitive system, we thought, was about getting the votes.
00:55:46.600 But it turns out that our competitive system is also about getting out the vote.
00:55:51.560 I mean, we knew that, but we need to extend that to getting out all kinds of votes.
00:55:56.140 So Lee Zeldin is saying, this isn't the game we want to play.
00:56:00.700 But if you're in the game, don't fucking stand there while the ball rolls past you.
00:56:08.740 I mean, I feel like the Republicans were like, we don't want to play baseball.
00:56:12.500 No, we don't want to.
00:56:13.400 So, okay, we're on the field.
00:56:15.400 Okay, we're on the team.
00:56:16.400 We are playing baseball.
00:56:17.480 But we don't want to be here.
00:56:19.140 And then somebody hits the ball and it rolls through their feet.
00:56:21.800 And they're like, ah, I could have picked that up and gotten you out.
00:56:24.960 But I didn't want to be here.
00:56:28.100 So fucking dumb.
00:56:30.020 Pick the ball up.
00:56:31.640 Right?
00:56:31.740 Lee Zeldin is telling you the most obvious, logical, strategic thing to do.
00:56:38.000 There's no complexity here at all.
00:56:40.740 And I will even go further.
00:56:43.520 So I don't think Lee Zeldin can say this directly because he's in the political sphere.
00:56:48.860 But I can.
00:56:50.480 So here's my strategy for the Republicans.
00:56:53.700 Whether or not they actually care about their people voting by mail,
00:56:59.460 no matter whether they care about it or not,
00:57:03.600 they should say it's their number one priority.
00:57:06.420 And they should tell Republicans, this is a vote-by-mail election coming up.
00:57:11.460 You can also vote in person, but you should consider it a vote-by-mail election.
00:57:17.200 If you're a Republican, we're voting by mail this time.
00:57:20.620 If there's anything you can do, make it a vote-by-mail
00:57:24.200 and make sure that you've legally, legally helped everybody vote-by-mail
00:57:30.100 who's anywhere in your environment.
00:57:33.280 And you should check with everybody, especially the elderly,
00:57:36.700 especially the people with special needs.
00:57:38.800 You should make sure that all of their votes are heard.
00:57:42.640 That would scare the living shit out of the Democrats.
00:57:46.420 But right now, the Republicans are playing as stupid.
00:57:49.140 They're like, oh, we hate it that you do that ballot harvesting so successfully.
00:57:55.460 What is that?
00:57:57.260 Is that the leadership you want?
00:57:59.520 The leadership you want is complaining that the other team is better.
00:58:03.420 That's it.
00:58:04.600 Oh, I'd like to vote for the team that doesn't do a good job on ballot harvesting,
00:58:09.840 but rather complains that the other team does a good job
00:58:13.240 in a completely legal process, as far as I can tell.
00:58:17.180 Right?
00:58:17.280 Like, there's no leadership there.
00:58:20.520 Lee Zeldin just showed you leadership.
00:58:24.920 So, now, who do you want to be your Speaker of the House?
00:58:30.840 Too late.
00:58:32.520 Did McCarthy already get picked?
00:58:34.400 Or is that not until the actual...
00:58:36.700 That can't happen until the actual majority happens, right,
00:58:39.380 after January something?
00:58:44.600 But am I wrong?
00:58:45.780 Like, Lee Zeldin is the best leader who doesn't have another job.
00:58:52.800 Well, I guess he has another job, but I don't know.
00:58:55.940 They should think about him as a Speaker.
00:59:03.200 Neuralink is ready for human testing, if the FDA approves.
00:59:08.720 So, Musk says that that's a go.
00:59:13.620 And the potential uses for Neuralink, which is literally a chip,
00:59:17.740 which they literally will drill into your skull.
00:59:20.600 Not into your brain, but they'll put a little dent in your skull
00:59:24.440 and put a little chip in there.
00:59:27.060 I'm not sure I'll ever be able to sign up for that, but maybe.
00:59:30.820 Who knows?
00:59:32.020 Anything is possible.
00:59:32.820 But among the things that might be possible with this little chip
00:59:36.580 is restoring sight to the blind.
00:59:40.660 Is that amazing or what?
00:59:43.460 Restoring sight to the blind.
00:59:48.280 Holy cow.
00:59:55.580 But also maybe restoring use of the limbs to paralyzed people.
00:59:55.580 What?
01:00:03.960 Restoring the use of their limbs to paralyzed people?
01:00:03.960 Like, this is so futuristic, I can't even...
01:00:06.960 It's hard to even wrap your head around it.
01:00:09.340 But what if it works?
01:00:11.400 So, I think here's...
01:00:13.360 You know, this is certainly a signal that the cyborg age is upon us.
01:00:19.380 You know, clearly we're already cyborgs
01:00:21.260 because even though your phone is not physically attached to your hand,
01:00:25.680 it sort of is.
01:00:27.180 You know, it's sort of attached to your hand.
01:00:30.080 So you're already cyborgs,
01:00:31.380 but this chip would make it a little bit better.
01:00:33.400 Better even than the Apple Watch
01:00:35.300 to turn you into a cyborg.
01:00:40.600 Well.
01:00:42.220 Oh, by the way, what Lee Zeldin was doing
01:00:44.760 by saying that you should ballot harvest better than the other side,
01:00:49.960 that's embrace and amplify.
01:00:53.080 So that's your lesson on persuasion today.
01:00:57.420 So what...
01:00:58.380 Instead of complaining about the thing you don't like,
01:01:01.680 embrace it as hard as you can
01:01:03.720 and use it as hard as you can,
01:01:06.300 and the other side will immediately see it was a bad idea.
01:01:10.300 And that's what Lee Zeldin is correctly, correctly suggesting.
01:01:14.540 There was a YouTube presentation last night of Neuralink.
01:01:21.840 I bet that was interesting.
01:01:23.960 Yeah.
01:01:24.560 Sometimes I think we're all Captain Pike.
01:01:28.000 No one more than me.
01:01:29.800 Oh, you tagged me on that?
01:01:31.060 Okay, I'll look for it.
01:01:34.360 Yeah, if the refs aren't calling fouls...
01:01:36.720 Exactly.
01:01:38.000 So there was a good analogy, which I approve.
01:01:40.420 If the refs are calling it loose
01:01:44.380 and they're not calling fouls,
01:01:46.260 you should be fouling.
01:01:48.580 Right?
01:01:49.340 If the other team is fouling you and not getting called,
01:01:51.760 you should foul as soon as you can.
01:01:54.220 Immediately foul.
01:01:55.580 That is correct strategy.
01:01:57.160 Now, do you think that Apple may have blinked
01:02:06.160 because Elon Musk saying he might have to build his own phone
01:02:12.000 doesn't sound like a bluff, does it?
01:02:17.120 How would you like to play poker with Elon Musk
01:02:21.140 where nothing he says is really ever a bluff?
01:02:25.200 I'm going to build a rocket to Mars.
01:02:29.120 No, you're not.
01:02:31.440 You're not going to build a rocket to Mars.
01:02:33.760 No, actually, I'm building a rocket to Mars.
01:02:36.240 Yeah.
01:02:36.820 I'm going to build a chip
01:02:38.200 that people will drill into their skull
01:02:40.520 and be happy about it.
01:02:41.920 No, you're not.
01:02:43.680 That's freaking crazy.
01:02:45.380 Nobody's going to put a chip in their skull.
01:02:48.120 Maybe they will.
01:02:49.980 So I'm going to build an electric car
01:02:52.180 when nobody thinks that's economical.
01:02:54.020 And now I'm the most valuable car company of all time.
01:02:58.640 So the last thing I would do
01:03:01.220 is say to myself,
01:03:03.340 you know,
01:03:04.460 nobody can compete with Apple on phones at this point.
01:03:08.200 I mean, Google and Apple have it all wrapped up.
01:03:10.720 There's nobody who could enter the market at this late stage
01:03:14.220 and make an impression.
01:03:18.360 Except the one person in the world
01:03:21.260 that you would be afraid of
01:03:22.520 would be Elon Musk.
01:03:24.280 I'm going to make another prediction.
01:03:29.100 I don't think we'll find out if this is true.
01:03:32.200 But my prediction is this.
01:03:34.700 Elon Musk already has a phone design.
01:03:38.560 The specs.
01:03:39.840 Not a full design.
01:03:41.540 Not a buildable design.
01:03:42.680 But I'll bet you somewhere in his messaging
01:03:47.060 or in his files,
01:03:48.900 he's got an actual design for a new kind of phone
01:03:52.080 that would not use apps.
01:03:55.380 I'll bet you.
01:03:57.740 I'll bet you he's at least
01:03:59.160 intellectually engaged enough
01:04:02.180 that he would be super interested
01:04:04.400 in what that would look like.
01:04:05.660 Like from an engineering perspective,
01:04:08.200 how would you build a phone
01:04:09.640 that would leapfrog current phones?
01:04:13.860 Here's the first way you do it.
01:04:16.580 How often do you want to use your phone
01:04:19.100 but your hands and your mouth
01:04:21.420 are busy doing something else?
01:04:23.440 It's like all the time, isn't it?
01:04:25.400 Do you know how many times
01:04:26.140 I want to Google something
01:04:27.280 while I'm walking to my car?
01:04:29.760 Like all the time.
01:04:31.340 And I don't always want to take out my phone
01:04:33.060 and look for it.
01:04:34.040 Here's what I like to do.
01:04:35.000 On the way to the car,
01:04:36.440 I'd like to have a question
01:04:37.460 and I'd like to just think it
01:04:39.060 and then have it spoken back to my brain.
01:04:42.500 Or maybe I speak it out loud
01:04:44.260 when nobody's hearing.
01:04:45.220 Maybe it's an audio thing in my ear.
01:04:47.640 But I'd like to just be walking along
01:04:49.360 and say,
01:04:51.540 huh, what is the capital of Elbonia?
01:04:54.180 And then in my ear it says,
01:04:55.840 I'm wrong, gang.
01:04:56.740 It's the capital of Elbonia.
01:04:58.260 Or something.
01:05:00.040 Like all day long,
01:05:01.400 I want to know things
01:05:02.680 that I should Google
01:05:03.820 but I'm not going to.
01:05:05.000 So if Neuralink,
01:05:08.320 if Neuralink could get to the point
01:05:11.080 where I don't need a physical phone,
01:05:13.680 then your Apple phone is worthless.
01:05:17.560 Imagine if Neuralink could get to the point
01:05:19.820 where you could see a screen
01:05:22.520 that other people can't see.
01:05:25.500 If Neuralink can give vision to the sightless,
01:05:29.540 it can also put things in your environment
01:05:32.240 that aren't there.
01:05:34.280 I'm assuming.
01:05:35.760 So couldn't it do,
01:05:36.920 yeah,
01:05:37.320 couldn't it do augmented reality,
01:05:39.520 where if you've got the chip in your head,
01:05:43.160 anytime you want,
01:05:44.220 a virtual screen pops up
01:05:45.660 that only you can see.
01:05:47.480 And you turn around
01:05:48.220 and it just follows you.
01:05:49.060 And you can just make it go away
01:05:50.560 by wanting it to go away.
01:05:52.220 So imagine I'm walking to the car
01:05:53.960 and I think,
01:05:55.580 ah,
01:05:56.600 who is,
01:05:57.320 what's the capital of Elbonia?
01:05:59.540 Boop.
01:06:00.080 Screen pops up
01:06:01.020 and I see it on a map.
01:06:03.260 I think that's where we're going.
01:06:06.460 Now,
01:06:06.740 would you have a smartphone
01:06:07.740 that you had to carry around in your hand
01:06:10.300 if you could do 100% of all the things
01:06:13.000 you need to do
01:06:13.700 just by thinking it?
01:06:16.920 That's where we're going.
01:06:18.220 So I think,
01:06:18.600 I think Elon Musk
01:06:19.480 is going to put Apple out of business
01:06:21.140 one way or the other.
01:06:24.160 It's just going to take,
01:06:24.920 you know,
01:06:25.160 it might take 15 years.
01:06:26.680 But I think,
01:06:27.560 but I think you're,
01:06:29.280 do you think,
01:06:30.300 does anybody think you're going to be
01:06:31.380 carrying a physical phone in 15 years?
01:06:34.440 Does anybody think that's going to happen?
01:06:38.700 I don't think so.
01:06:41.520 That seems the least likely future possibility.
01:06:45.720 Yeah.
01:06:46.440 I don't know what it will be,
01:06:47.600 but it's not going to be a phone.
01:06:49.480 It might be,
01:06:49.680 it might be glasses with a chip
01:06:51.420 or something like that.
01:06:54.260 Yeah.
01:06:57.060 I feel like some kind of
01:06:59.080 technical glasses
01:07:01.740 are in the future
01:07:03.720 because they have so many
01:07:05.000 potential uses,
01:07:06.060 everything from sunglasses
01:07:07.260 to whatever.
01:07:14.960 Oh,
01:07:15.420 could it help with brain chemistry?
01:07:17.060 I would think so.
01:07:20.520 Imagine if Neuralink
01:07:21.800 could help you
01:07:22.900 change what you think about.
01:07:25.820 Let's say you could just tell it,
01:07:27.080 hey,
01:07:27.660 make me see images
01:07:29.360 of happy things,
01:07:30.340 puppies and kittens.
01:07:32.240 I think it would make you happier.
01:07:35.440 Now,
01:07:37.060 is anybody concerned,
01:07:39.500 I'm going to just change the subject
01:07:40.840 totally here.
01:07:42.520 Is anybody concerned
01:07:43.540 that the
01:07:44.740 relationships
01:07:46.780 can't work anymore?
01:07:49.920 Like,
01:07:50.520 they sort of can't?
01:07:53.200 I feel like we've crossed
01:07:54.800 some kind of
01:07:55.760 a bridge here
01:07:57.000 socially
01:07:58.260 and it has something
01:07:59.220 to do with social media,
01:08:01.260 something to do with,
01:08:02.580 I don't know,
01:08:03.100 wokeness,
01:08:03.920 something to do with
01:08:04.640 division maybe,
01:08:05.500 I don't know.
01:08:06.400 But
01:08:06.700 it doesn't,
01:08:09.540 I don't see
01:08:10.340 men and women
01:08:11.020 agreeing that they
01:08:11.880 should even be together.
01:08:14.260 I'm seeing men and women
01:08:15.460 both agree
01:08:16.080 that the other
01:08:16.640 doesn't offer them enough.
01:08:18.940 Right?
01:08:19.340 And the general argument
01:08:21.740 that I'm seeing everywhere
01:08:22.840 from both men and women
01:08:24.140 on social media
01:08:24.900 goes like this.
01:08:26.440 In the old days,
01:08:27.940 there was a definite benefit
01:08:29.460 to getting married
01:08:30.140 and having kids.
01:08:31.740 The biggest benefit
01:08:32.800 was that your kids were your retirement.
01:08:35.100 You had to have kids
01:08:36.000 to take care of you
01:08:36.800 when you're old
01:08:37.440 and to run the farm
01:08:38.880 and to give you
01:08:40.000 physical defense
01:08:41.000 if you had sons,
01:08:42.180 et cetera.
01:08:43.080 So,
01:08:44.160 it certainly made sense.
01:08:46.080 And then there was a period,
01:08:47.300 let's say the 50s,
01:08:48.340 where if the man
01:08:50.660 was working
01:08:51.360 and the woman
01:08:53.360 was raising the kids,
01:08:54.980 as long as everybody
01:08:56.020 was okay with that,
01:08:57.860 that was a system
01:08:58.780 that seemed to be successful.
01:09:00.700 But we're in a
01:09:01.760 different world now
01:09:02.580 and today
01:09:04.300 if a woman
01:09:05.320 offers,
01:09:06.900 let's say,
01:09:07.920 herself to a man,
01:09:09.860 this is what I'm seeing
01:09:10.760 on social media,
01:09:11.900 the man says,
01:09:12.800 what are you offering?
01:09:14.760 And the woman says,
01:09:15.580 well,
01:09:16.580 of course,
01:09:17.440 I will be expensive
01:09:18.640 because I would expect you
01:09:19.900 to take most
01:09:21.500 of the financial burden
01:09:22.440 because that appears
01:09:24.120 to be the thing
01:09:25.500 that we hear the most.
01:09:27.020 And the guy says,
01:09:27.700 okay,
01:09:28.020 so that's what you're
01:09:28.640 going to take from me.
01:09:29.480 You're going to take my money.
01:09:30.520 Now,
01:09:30.720 what are you giving me?
01:09:32.160 And she'll say,
01:09:32.700 well,
01:09:33.440 I'll give you
01:09:34.340 children.
01:09:37.080 And then the guy says,
01:09:38.180 okay,
01:09:38.640 so you can guarantee
01:09:39.720 that no matter
01:09:40.900 what happens with us,
01:09:41.960 I get to keep
01:09:42.600 those children?
01:09:43.800 And she'll say,
01:09:44.340 no,
01:09:44.520 no,
01:09:44.620 if something happens
01:09:45.680 to us,
01:09:46.960 I'll probably
01:09:47.660 keep the children.
01:09:49.160 So he goes,
01:09:49.620 okay,
01:09:49.800 you're not really
01:09:50.280 giving me children.
01:09:51.760 Like,
01:09:52.200 you can't even offer that.
01:09:53.540 It's not within your power.
01:09:56.200 And then she says,
01:09:57.160 but,
01:09:58.200 you don't have to worry
01:09:58.940 about that
01:09:59.400 because I will offer
01:10:00.420 you faithfulness.
01:10:02.100 I will be true to you
01:10:03.460 and only physical with you
01:10:04.920 and then you don't have
01:10:06.080 to worry about us
01:10:06.700 breaking up
01:10:07.200 and taking the kids.
01:10:08.700 Then the man says,
01:10:09.560 I live in the real world.
01:10:12.380 As soon as you're
01:10:13.560 financially capable
01:10:15.020 of taking care of yourself
01:10:16.100 and your kids,
01:10:16.900 why would you stay with me
01:10:18.220 when you could find
01:10:19.400 somebody better?
01:10:21.080 And then she would say,
01:10:22.340 well,
01:10:22.460 that's a good point.
01:10:23.080 I could find somebody better
01:10:24.300 because once I have
01:10:25.440 enough of your money
01:10:26.200 and I have kids,
01:10:27.980 you know who's going
01:10:28.760 to look not so good?
01:10:30.740 Whoever you're living with
01:10:32.160 is going to not look good
01:10:33.700 because your co-worker
01:10:35.540 is always,
01:10:36.260 you know,
01:10:36.600 clean and well-dressed
01:10:37.780 and that's the only time
01:10:39.360 you ever see them.
01:10:40.980 But you're at home,
01:10:41.980 you're seeing,
01:10:42.520 you know,
01:10:42.740 the worst of your mate.
01:10:46.600 So,
01:10:47.500 so women and men
01:10:50.840 are asking the same question.
01:10:52.080 What are women
01:10:52.660 offering to men
01:10:54.140 in 2022?
01:10:56.320 And I actually don't know
01:10:57.260 the answer to the question.
01:10:59.200 I literally,
01:11:00.160 I'm not being argumentative.
01:11:01.880 I don't know.
01:11:03.400 What do they offer?
01:11:04.960 If the most they can offer
01:11:06.380 is a 25% chance
01:11:09.700 that you'll live
01:11:10.360 a happy life
01:11:11.060 and get to see your kids
01:11:12.180 the whole time,
01:11:13.900 is that,
01:11:14.540 is that something
01:11:15.140 you want to trade
01:11:16.320 your whole,
01:11:18.080 your entire net worth for?
01:11:21.160 Your,
01:11:21.640 your entire financial future,
01:11:23.540 you're going to trade
01:11:24.380 for a 25% chance
01:11:25.940 of being able to see
01:11:26.720 your kids in 20 years?
01:11:28.400 I mean,
01:11:28.940 it just doesn't sound like
01:11:29.880 anything that anybody
01:11:30.940 would do
01:11:31.400 if they were rational.
01:11:32.760 Now,
01:11:33.240 I think that we're
01:11:34.080 a zombie culture.
01:11:35.180 A zombie culture
01:11:37.400 in that we're doing
01:11:38.360 what we used to do
01:11:39.380 even though it doesn't
01:11:40.080 make sense anymore.
01:11:42.000 So,
01:11:42.460 we're,
01:11:42.660 we're still sort of
01:11:43.500 running through the,
01:11:44.620 oh,
01:11:44.980 I guess I'm dating
01:11:45.760 and I guess I'm supposed
01:11:47.460 to get myself a wife
01:11:49.460 and,
01:11:50.420 you know.
01:11:51.340 So,
01:11:51.800 I think we're just
01:11:52.540 zombie-ing our way
01:11:53.660 toward it
01:11:54.160 with,
01:11:54.760 with no sense
01:11:55.580 that it makes any sense
01:11:56.580 at all
01:11:56.920 or that it even could.
01:12:00.720 You're saying
01:12:01.300 feminists ruined everything?
01:12:02.720 I don't know about that.
01:12:03.620 Maybe it was the economy
01:12:05.560 that did.
01:12:06.860 Maybe it was consumerism
01:12:08.320 that ruined everything
01:12:09.160 because then you needed
01:12:10.080 two jobs.
01:12:12.020 You could take this
01:12:13.040 a lot of ways.
01:12:15.020 But,
01:12:15.760 am I wrong
01:12:16.420 that women
01:12:16.840 are not offering
01:12:17.740 anything to men
01:12:19.520 that is reliable?
01:12:21.940 They do offer
01:12:22.620 but nothing reliable
01:12:23.780 or lasting.
01:12:25.280 Would you agree?
01:12:27.020 Now,
01:12:27.580 is it the same
01:12:28.220 the other way?
01:12:29.880 Would you say
01:12:30.620 that men
01:12:31.520 do not offer
01:12:32.900 anything to women?
01:12:36.060 Because I feel
01:12:36.860 like they offer
01:12:37.440 money sometimes
01:12:38.900 and physical protection
01:12:41.660 sometimes
01:12:42.660 but do men
01:12:44.740 really offer
01:12:45.280 physical protection
01:12:46.180 to women?
01:12:48.240 Think about it.
01:12:49.740 Do men
01:12:50.280 really offer
01:12:51.260 physical protection
01:12:52.200 to women
01:12:52.740 within the house?
01:12:56.700 All right,
01:12:56.900 women,
01:12:57.160 tell me
01:12:59.100 who has been
01:13:00.500 your biggest
01:13:01.120 physical threat
01:13:02.260 over the course
01:13:02.880 of your life?
01:13:03.780 Somebody inside
01:13:04.800 your house
01:13:05.260 or somebody
01:13:05.720 outside your house?
01:13:07.380 There's no competition.
01:13:09.380 The biggest
01:13:09.820 physical threat
01:13:10.820 to a woman
01:13:11.380 is a male
01:13:12.500 inside their house.
01:13:14.940 Right?
01:13:15.900 I don't think
01:13:17.040 in 2022
01:13:17.680 men protect women.
01:13:19.260 I think men
01:13:19.820 are a bigger threat
01:13:20.680 to women
01:13:21.120 than they are
01:13:22.220 a protection.
01:13:23.160 Now,
01:13:23.360 obviously,
01:13:23.740 it depends
01:13:24.040 on the situation.
01:13:25.360 Plenty of cases
01:13:26.280 where men
01:13:27.020 are legitimately
01:13:27.680 protecting women.
01:13:29.620 Right?
01:13:30.020 Don't,
01:13:30.340 don't,
01:13:31.760 don't have
01:13:33.220 a heart attack
01:13:33.780 over this,
01:13:34.300 right?
01:13:35.200 Plenty,
01:13:35.860 plenty of cases
01:13:36.680 where the man
01:13:37.540 is absolutely
01:13:38.540 protecting the woman.
01:13:40.840 But what I'm saying
01:13:41.600 is,
01:13:43.140 is the woman,
01:13:44.260 is the woman
01:13:44.980 really seeing
01:13:45.680 that benefit?
01:13:46.920 If you're
01:13:47.560 a 20-year-old woman
01:13:48.480 and the only time
01:13:50.640 you've ever been abused
01:13:51.560 is by your own boyfriend,
01:13:52.760 maybe once or twice,
01:13:55.700 are you saying
01:13:56.260 to yourself,
01:13:56.700 hmm,
01:13:56.880 I really need a man
01:13:57.700 to protect me?
01:13:58.940 Or are you saying
01:13:59.440 I'd better live alone
01:14:00.380 because the most
01:14:00.940 dangerous thing
01:14:01.520 I can do
01:14:01.900 is live with a man?
01:14:04.820 I don't know.
01:14:05.820 But,
01:14:06.080 so I think
01:14:06.700 the entire,
01:14:07.620 what do we call it
01:14:09.700 in business,
01:14:10.280 the value proposition.
01:14:12.260 I think the value
01:14:13.440 proposition
01:14:13.940 for both the men
01:14:15.060 and the women
01:14:15.620 is that they're
01:14:16.340 not offering
01:14:16.880 what they used to.
01:14:18.660 Not offering
01:14:19.400 what they used to.
01:14:20.600 And how about
01:14:21.020 having kids?
01:14:23.420 Let me be brutally honest.
01:14:26.300 In the old days,
01:14:27.940 like what I imagined
01:14:28.860 the old days
01:14:29.400 in the 60s,
01:14:30.360 I think having a kid
01:14:31.780 was sort of a good deal
01:14:32.780 because you could
01:14:34.100 interact with your kids
01:14:35.220 and sometimes
01:14:35.980 you could do
01:14:36.400 some family stuff
01:14:37.240 that everybody liked.
01:14:38.960 Do you know
01:14:39.320 what it's like having
01:14:39.940 a kid today?
01:14:42.160 If you don't have a kid,
01:14:43.880 allow me to give you
01:14:45.180 my impression
01:14:46.820 in one act
01:14:47.600 of all of your
01:14:49.180 quality interaction
01:14:50.280 with your kid
01:14:51.020 after the age
01:14:52.400 of 10.
01:14:56.760 Uh-huh.
01:14:58.880 Yep.
01:14:59.660 Uh-huh.
01:15:01.660 Right, right.
01:15:02.360 Uh-huh.
01:15:05.240 Uh-huh.
01:15:07.920 Pretty satisfying,
01:15:09.020 wasn't it?
01:15:10.740 Now, that's the sort
01:15:11.800 of thing you want
01:15:12.440 to organize
01:15:13.440 your entire life
01:15:14.640 around,
01:15:15.020 because if you do
01:15:16.720 everything right,
01:15:18.480 you can get that.
01:15:20.460 Yeah.
01:15:20.760 You can have
01:15:21.440 the satisfying experience
01:15:22.860 of having your kid
01:15:24.340 not give a fucking shit
01:15:26.000 about whether you
01:15:27.220 live or die.
01:15:28.360 They only care
01:15:29.040 what's on social media.
01:15:30.400 You get to make
01:15:31.080 one of those
01:15:31.600 and then get to sit there
01:15:33.040 and feel like
01:15:33.560 what you did wrong
01:15:34.300 because every moment
01:15:35.620 you sit with that kid,
01:15:36.560 you'll fucking hate
01:15:37.120 your life
01:15:37.600 because you're like,
01:15:38.680 I love the kid,
01:15:39.940 but I fucking hate this.
01:15:42.120 I hate this moment.
01:15:43.260 I hate dinner.
01:15:44.380 I hate sitting in the car.
01:15:45.900 I hate walking with them.
01:15:47.460 I hate having
01:15:48.080 a conversation.
01:15:51.820 Shut off their phones.
01:15:53.280 Try shutting off a phone.
01:15:56.000 Has anybody tried
01:15:56.780 punishing a kid
01:15:57.580 by turning off their phone?
01:16:00.220 How'd that work out?
01:16:03.940 I mean,
01:16:04.600 it happens all the time.
01:16:06.460 Have you ever seen
01:16:07.200 anybody change
01:16:07.880 their behavior
01:16:08.460 because you turned
01:16:09.340 off their phone
01:16:09.920 for a week?
01:16:11.400 I haven't.
01:16:13.320 I've never seen
01:16:14.120 any kid change
01:16:14.840 their behavior
01:16:15.340 because they lose
01:16:16.680 their phone.
01:16:18.820 Because they just
01:16:19.340 find a laptop
01:16:21.440 or an iPad usually.
01:16:27.360 Cernovich's kids,
01:16:28.260 let's see,
01:16:28.620 don't have screen time.
01:16:31.100 Yeah.
01:16:31.800 So,
01:16:32.760 that's also true
01:16:33.760 of people
01:16:34.220 in the tech business.
01:16:35.500 The people
01:16:36.160 who are least likely
01:16:37.240 to allow their kids
01:16:38.200 to use technology
01:16:39.120 are the people
01:16:39.780 who make it.
01:16:40.280 You know that,
01:16:41.460 right?
01:16:42.500 The people
01:16:43.180 who are least likely
01:16:44.180 to get hernia surgery
01:16:45.360 are who?
01:16:47.400 What class of people
01:16:48.600 are the least likely
01:16:49.720 to get hernia surgery?
01:16:51.740 Doctors who perform
01:16:53.000 hernia surgeries
01:16:53.800 because they know
01:16:55.200 too much.
01:16:57.000 The people
01:16:57.780 who make your technology
01:16:58.920 are the ones
01:16:59.460 who don't let
01:16:59.900 their kids see it.
01:17:01.580 China doesn't let
01:17:03.140 Chinese citizens
01:17:04.500 see TikTok.
01:17:05.140 They have a version
01:17:07.440 of TikTok
01:17:07.840 that's just like
01:17:08.600 a propaganda version.
01:17:11.620 China doesn't let
01:17:13.040 their own citizens
01:17:13.900 use TikTok.
01:17:15.060 The people
01:17:15.460 who know the most
01:17:16.260 keep their kids
01:17:17.820 away from it.
01:17:22.140 That is not true.
01:17:23.380 Well, it's not true
01:17:24.000 for every person.
01:17:27.660 Your wife quit
01:17:28.720 using her phone
01:17:29.420 when she's with you?
01:17:30.260 Wow.
01:17:37.500 Yeah, they use
01:17:38.220 the phone in school
01:17:39.280 now too.
01:17:39.920 That's right.
01:17:41.500 Scott needs a hug.
01:17:46.460 All right.
01:17:46.980 Well, I don't have
01:17:47.680 anything else to talk
01:17:48.320 about, so I'm going
01:17:49.160 to say that's it
01:17:50.160 for today.
01:17:50.560 I'm saying...
01:17:57.100 All true,
01:17:58.620 but you're taking your personal hatred,
01:18:00.160 I don't know if this is to me or to somebody else,
01:18:01.880 your personal hatred,
01:18:03.140 yes, we can be brutally honest too,
01:18:04.980 of women, to an unacceptable level.
01:18:07.500 You're going to lose
01:18:08.020 people unnecessarily.
01:18:09.420 Tone it down, please.
01:18:11.060 Do I sound like...
01:18:12.260 does that sound like me?
01:18:17.720 Do I have...
01:18:18.820 Here's what I think
01:18:20.340 I'm doing.
01:18:21.820 I think I'm just being
01:18:23.280 a straight caller
01:18:25.600 of balls and strikes.
01:18:27.360 I don't think
01:18:28.140 that men are adding
01:18:29.020 enough to women
01:18:29.700 and women are not
01:18:30.500 adding enough to men.
01:18:31.800 So I don't believe
01:18:32.640 that I'm showing
01:18:33.480 a bias against women,
01:18:34.980 am I?
01:18:35.980 Because I'm saying
01:18:36.660 that neither is
01:18:37.520 giving enough
01:18:38.040 to the other.
01:18:39.160 I just did a whole
01:18:39.960 explanation of how
01:18:41.300 the most dangerous
01:18:42.000 thing to a woman
01:18:42.700 is a man in her own house.
01:18:44.520 Does that sound like
01:18:45.440 woman hating?
01:18:48.000 Maybe it does.
01:18:48.780 Maybe the truth
01:18:50.400 is too brutal
01:18:51.500 or maybe you were
01:18:53.040 talking to somebody
01:18:53.700 else, I can't tell.
01:18:57.620 What about Steve Jobs?
01:19:01.100 Yeah, Steve Jobs
01:19:02.120 didn't let his kids
01:19:02.960 use stuff.
01:19:03.380 You married an Instagram
01:19:12.000 model, yes?
01:19:14.120 What exactly is the
01:19:16.280 point of that comment?
01:19:19.480 Your equal blame is not
01:19:20.920 the same experience
01:19:21.860 for everyone, of course.
01:19:24.860 You've been a little
01:19:25.660 hostile toward the ladies
01:19:27.080 going back to November 9th
01:19:30.880 when you said women
01:19:31.880 were toxic to men.
01:19:34.780 Women are toxic to men.
01:19:37.260 Yeah, but that's
01:19:38.260 a biological explanation.
01:19:41.740 The biological explanation
01:19:43.220 is that the biological
01:19:45.180 purpose of a woman
01:19:46.460 is to transfer resources
01:19:49.180 from a man to a child.
01:19:53.140 So, there you go.
01:19:54.500 I mean, from a machine
01:19:58.640 perspective, that's what
01:20:00.200 the machine is doing.
01:20:01.540 The machine has created
01:20:02.860 these creatures where
01:20:04.200 the shift of resources
01:20:06.520 works very well for
01:20:07.920 keeping the next
01:20:08.700 generation alive.
01:20:13.500 All right.
01:20:14.940 What?
01:20:23.860 Where is Musk's
01:20:24.860 election interference
01:20:25.720 release?
01:20:26.400 Yeah, we're still waiting
01:20:27.080 for the big report
01:20:28.020 out of Twitter.
01:20:29.840 Man.
01:20:32.080 You're calling
01:20:32.700 Christina to yell
01:20:33.560 at her.
01:20:36.560 Now, don't
01:20:37.360 yell at Christina.
01:20:39.160 Remember when you
01:20:45.600 called your
01:20:46.180 relationships
01:20:46.680 rental wives?
01:20:47.660 I do.
01:20:48.600 Yeah, I think
01:20:49.140 rental is going
01:20:51.480 to be the model
01:20:52.100 that has to
01:20:52.700 be at least
01:20:55.700 tried by some
01:20:56.300 people.
01:20:57.760 Yeah.
01:21:05.560 New York Post
01:21:06.480 on Elon Musk.
01:21:07.240 I didn't see it.
01:21:07.820 Oh, yeah, maybe.
01:21:16.460 No, I think
01:21:18.020 the trouble is
01:21:18.700 that marriage
01:21:19.480 is a rental
01:21:20.140 that we don't
01:21:20.960 acknowledge.
01:21:23.360 I think people
01:21:24.360 look at the
01:21:25.120 statistics and
01:21:25.980 they say,
01:21:26.380 yeah, I'm going
01:21:26.960 to try to be
01:21:27.620 the one that
01:21:28.200 lasts forever,
01:21:29.740 but realistically,
01:21:31.000 it's probably
01:21:31.540 a rental.
01:21:31.940 Like, if you
01:21:35.760 got married
01:21:36.320 thinking that
01:21:37.000 you were going
01:21:37.440 to be the
01:21:37.820 one who
01:21:38.120 beat the
01:21:38.560 odds,
01:21:40.580 I would
01:21:41.460 question that
01:21:42.220 point of view.
01:21:48.300 I should
01:21:48.940 follow Andrew
01:21:49.640 Tate and go
01:21:50.700 Muslim.
01:21:51.080 You know,
01:21:54.540 wouldn't you
01:21:54.840 love to see
01:21:55.500 an honest
01:21:56.700 poll
01:21:57.780 of Muslim
01:21:59.480 women versus
01:22:00.280 other women
01:22:01.600 to see who's
01:22:02.620 happier?
01:22:04.220 What do you
01:22:04.880 think that
01:22:05.260 would show?
01:22:06.600 If you did
01:22:07.260 an honest
01:22:07.800 poll, I don't
01:22:08.440 know if you
01:22:08.760 could do it
01:22:09.080 unbiased,
01:22:09.940 but if you
01:22:10.360 could find a
01:22:10.800 way to do
01:22:11.160 it, do you
01:22:12.000 think that
01:22:12.340 Muslim women
01:22:12.900 would be
01:22:13.340 less happy?
01:22:18.480 I don't
01:22:18.940 know.
01:22:20.040 I do not
01:22:20.760 know.
01:22:23.540 Yeah, it
01:22:24.700 could go
01:22:25.080 either way.
01:22:25.540 I wouldn't
01:22:25.780 even know
01:22:26.080 how to
01:22:26.280 predict that.
01:22:29.680 Yeah, Iran's
01:22:30.580 a special
01:22:30.940 case, but...
01:22:34.480 All right.
01:22:35.760 All right,
01:22:36.260 that's all for
01:22:36.680 now.
01:22:37.040 I'm going to
01:22:37.340 talk to you
01:22:37.820 tomorrow,
01:22:38.520 YouTube.
01:22:39.020 I'll stay
01:22:39.280 with the
01:22:39.540 Locals people
01:22:40.100 for a little
01:22:40.800 bit.
01:22:41.780 Bye for
01:22:42.120 now.
01:22:42.800 Spotify,
01:22:43.340 YouTube,
01:22:44.040 and
01:22:44.160 Rumble.