Real Coffee with Scott Adams - April 26, 2022


Episode 1725 Scott Adams: The Best Take On Elon Musk And Twitter You'll Ever Hear. With Whiteboard


Episode Stats

Length: 57 minutes
Words per Minute: 140.74
Word Count: 8,097
Sentence Count: 579
Misogynist Sentences: 4
Hate Speech Sentences: 5
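
(A quick arithmetic check, assuming the words-per-minute figure is word count divided by the un-rounded runtime: 8,097 words ÷ 140.74 words per minute ≈ 57.5 minutes, so the 57-minute length above appears to be rounded.)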


Summary

In this episode, I talk about a Rasmussen poll showing that 66% of likely U.S. voters believe America is now more divided than it was before the 2020 election, why picking winners and losers is essentially all that governments do (the Disney-DeSantis situation in Florida), and my take on Elon Musk buying Twitter: Jack Dorsey's "global consciousness" framing, the case for a transparent algorithm, the China question, and the media reaction. I also get into Representative Jayapal's wealth-tax tweet, the "drinking bleach" hoax, Internet Dads, and whether hypnosis can increase sex drive.


Transcript

00:00:00.000 Good morning, everybody.
00:00:04.840 And I was checking the news to see if anything big has happened lately, and I don't know.
00:00:12.480 Anything? Is there anything on your mind?
00:00:15.000 Are there any topics of great interest to you?
00:00:19.440 Is there anything that everybody's talking about?
00:00:23.100 Well, we might get to that, too.
00:00:26.160 But first, do you feel it yet?
00:00:30.720 Is it starting to tingle?
00:00:33.040 That incredible feeling you get when you're here live, and even when you're watching it recorded,
00:00:40.980 and you're imagining what it would have been like if you were watching this live right now,
00:00:46.560 it starts as a small little good feeling, and you might feel it in your arms first, like a little tingle on your arms.
00:00:53.780 That's the beginning of an incredible day.
00:00:56.560 Because people who watch this live stream, you're not like regular people.
00:01:01.900 You're sexier. You're smarter.
00:01:03.740 You're learning machines.
00:01:05.960 My God, you're getting better every day.
00:01:08.260 Your skill stacks are growing.
00:01:09.640 Your systems are working.
00:01:11.720 I admire you.
00:01:15.460 Congratulations to you for being awesome.
00:01:17.500 Now, because you were so awesome, today will not just be...
00:01:22.640 I mean, normally, it's the best day you've ever had when you come here.
00:01:27.420 But I think we can take it up a level.
00:01:30.560 I think we can.
00:01:31.900 And it's going to happen in a moment.
00:01:33.480 And all you need is a cup or a mug or a glass, a tankard, chalice, or stein, a canteen, jug, or a flask,
00:01:39.020 a vessel of any kind.
00:01:41.080 Fill it with your favorite liquid.
00:01:43.620 I like coffee.
00:01:45.800 And join me now for the unparalleled pleasure.
00:01:49.520 It's the dopamine to the day, the thing that makes everything better.
00:01:54.560 It's called the simultaneous sip.
00:01:57.140 And it happens now.
00:01:58.420 Go.
00:01:58.660 Go.
00:01:58.720 Go.
00:02:03.480 That was so good that even though I noted a little pushback from what I would call an anti-sipper,
00:02:15.000 don't grab the pitchforks.
00:02:17.120 No, let them go.
00:02:18.140 Let them go.
00:02:19.020 That's not the kind of people we are.
00:02:20.920 People can be pro-sip.
00:02:23.480 They can be anti-sip.
00:02:25.760 We're a big tent kind of a thing.
00:02:28.040 So really, you should not be bigoted against the anti-sippers.
00:02:34.920 Even though we like sipping, we like it.
00:02:38.600 We're sort of on team sip.
00:02:41.240 The anti-sippers, we shouldn't hate them because they're worth less than us.
00:02:49.320 I'm not saying that.
00:02:50.400 And I'm not saying that you should feel any animus toward them at all, the anti-sippers.
00:02:58.140 I hate the anti-sippers.
00:02:59.660 I hate them.
00:03:01.320 I'm going to tweet about those idiots, anti-sippers.
00:03:05.640 Ah!
00:03:06.800 Ah!
00:03:07.920 Gone!
00:03:09.820 All right.
00:03:11.320 Rasmussen has a poll.
00:03:12.420 It says that 66% of likely U.S. voters believe America is now more divided than it was before
00:03:20.240 the 2020 election.
00:03:22.440 So good job, Joe Biden.
00:03:24.380 I believe there was one thing that Joe Biden promised us above all other things.
00:03:31.300 What was that?
00:03:32.020 What was that one thing he said?
00:03:34.120 He said, man, you know, there's one thing I'm going to give you.
00:03:38.760 Let me tell you.
00:03:39.400 Let me tell you there's one thing you're definitely going to get.
00:03:42.420 The one thing you're absolutely going to get, we're not going to be so divided.
00:03:48.240 Not with a Joe Biden president.
00:03:50.980 Nope.
00:03:52.660 Only to discover that it doesn't really matter who the president is, the country will get
00:03:58.860 more divided.
00:04:01.360 Because?
00:04:02.900 Why?
00:04:04.280 Because it's probably not anything to do with the president.
00:04:07.340 I don't think it had anything to do with Trump.
00:04:10.840 Not really.
00:04:11.460 And I don't think it has anything to do with Biden either.
00:04:14.440 I think that the, you know, the media is such that all presidents will look worse and
00:04:20.900 worse until we immediately elect them and then execute them without any gap in between.
00:04:28.460 Because, you know, everybody just looks like a worse and worse version of, it looks like
00:04:35.360 a photocopy of a photocopy of a photocopy.
00:04:37.580 Like every new president seems worse than the last in ways that we can't even imagine.
00:04:42.340 It's like, oh my God, this one was so bad.
00:04:45.800 You know, not as bad as that last one, but when do you see the next one?
00:04:49.280 So if the trend continues and each president is devalued by social media, eventually you'll
00:04:57.140 get to a point where the only person who could be elected president is so bad that they just
00:05:04.840 have to be immediately executed.
00:05:06.260 So that might make the primary season shorter, fewer people.
00:05:15.100 So there could be some good elements of that.
00:05:17.220 We're going to get to Elon Musk.
00:05:21.480 Of course we are.
00:05:23.000 Of course we are.
00:05:23.780 I'm just waiting for some more people to get on here.
00:05:27.520 What do you think about governments picking winners and losers?
00:05:32.000 This whole Disney-DeSantis thing was about that.
00:05:35.280 I'll just ask this one question.
00:05:38.280 I'll just leave it at this.
00:05:39.920 Just a general question.
00:05:41.080 And this has more to do with Disney having a special status that they had in Florida.
00:05:47.940 Because they'd made a deal to have this special status.
00:05:51.260 So, what do you think?
00:05:55.260 Should governments pick winners and losers?
00:05:58.320 Go.
00:05:59.240 In the comments.
00:06:01.720 Should governments be in the business of choosing winners and losers?
00:06:07.380 Solid no's, I'm seeing.
00:06:08.940 No, no, no, no, no, no.
00:06:11.420 No, no, no.
00:06:13.620 Somebody says, I love it, but I think you're kidding.
00:06:16.360 A few people say yes.
00:06:19.340 I'm sort of surprised.
00:06:21.660 Sort of surprised.
00:06:23.320 Now, let me...
00:06:23.840 All right, second question.
00:06:25.520 Second question.
00:06:26.280 So, I'll let some of your answers, you know, run through here so they're not overlapping.
00:06:33.680 Because there's a little lag on the comments.
00:06:36.160 All right, second question.
00:06:37.000 Are you aware that the current government is almost entirely designed to pick winners and losers?
00:06:47.800 Like, it's like its main function.
00:06:51.340 Its primary function is to pick winners and losers.
00:06:54.140 Are you aware that that is what the government does?
00:06:57.720 Okay.
00:06:58.200 Most of you are.
00:06:59.060 Okay.
00:06:59.180 So, if you were to do away with that, what would be left?
00:07:10.400 There wouldn't be anything left.
00:07:12.560 That's all governments do.
00:07:14.800 That's literally all they do.
00:07:16.260 So, you know, they do it in a variety of ways; some of it involves taxes.
00:07:22.360 Some of it involves the legal system, which literally picks winners and losers.
00:07:27.380 And, you know, the IRS literally decides who gets benefits and what types of things they want to promote, what they want to discourage, literally.
00:07:37.720 About the military.
00:07:40.280 Well, if you don't join, if there are not enough people who join the military, the government will just decide to enlist you without your permission.
00:07:50.820 So, they literally get to pick who wins and loses.
00:07:53.060 And then you go on the battlefield and, you know, somebody who works for the government, you know, your commanding officer, says, all right, you know, you lot, you're going to run after that tank, literally picking winners and losers.
00:08:06.540 That's all they do.
00:08:08.420 Like, the laws, like, every law makes somebody a winner and somebody a loser, with the exception of maybe, you know, some that are just so obvious that they should be a law.
00:08:19.420 But largely, some industry loses, but maybe they should have.
00:08:25.900 Maybe they should have lost.
00:08:27.440 I'm not saying that they're picking the wrong winners or losers.
00:08:30.880 I'm just saying that if your objection to the Disney Florida thing is on principle, and the principle is the government should not be picking, like, a specific company to win,
00:08:43.260 I would say, well, that's all they do.
00:08:46.940 That's all they do.
00:08:48.140 That's everything they do and everything we want them to do.
00:08:51.060 Like, that's peak performance for a government, is picking winners and losers.
00:08:55.080 They just have to do it well.
00:08:57.440 Now, the secret is, did Florida make a deal which was also good for Florida?
00:09:03.020 If they did, then everybody won.
00:09:10.020 And, of course, people say, but they have the power to change that deal.
00:09:14.160 Yeah, they do.
00:09:15.300 They do have the power to change that deal.
00:09:17.160 Looks like they did.
00:09:19.020 So, but I just want to make that minor point that that's all governments do is pick winners and losers.
00:09:24.520 All right, let's talk about Twitter.
00:09:27.660 As you know, Twitter has accepted Elon Musk's bid.
00:09:31.760 So, Elon will, I don't know how long it takes this deal to close, but he will be owning Twitter.
00:09:42.240 And so, you might ask yourself, what did Jack Dorsey say about all of this?
00:09:45.820 And Jack tweeted, I think it was yesterday, after the news.
00:09:50.720 Jack Dorsey said, I love Twitter.
00:09:53.040 Twitter is the closest thing we have to a global consciousness.
00:09:59.040 Now, I'm not sure you saw that coming.
00:10:00.660 A global consciousness.
00:10:03.100 And then he followed up with that.
00:10:04.800 He said, in principle, I don't believe anyone should own or run Twitter.
00:10:09.960 It wants to be a public good at a protocol level, not a company.
00:10:14.980 Solving for the problem of it being a company, however, Elon is the singular solution I trust.
00:10:21.400 I trust his mission to extend the light of consciousness.
00:10:24.260 Now, was that the take you were expecting?
00:10:32.600 It's not exactly the take you were expecting, is it?
00:10:35.040 But it's exactly my take.
00:10:42.500 So, I would say that my take is exactly like this.
00:10:46.200 And if you would like to see more about what global consciousness means in this context,
00:10:53.120 I would recommend you to my book, written in 2001.
00:10:57.680 Now, I'm not going to say anything else about it, but I'm going to let the people in the comments who have read it just say in the comments, if you've read it,
00:11:10.800 would you think that this is the right book at the right time because of the news?
00:11:15.760 Just in the comments.
00:11:18.320 Just tell the other people if you think, oh my God, the timing of this is exactly right.
00:11:24.320 So, watch as the comments go by.
00:11:26.660 On locals, it's solid yeses.
00:11:28.420 They've all read the book.
00:11:29.840 And there are fewer people on YouTube as a percentage who have read it, but they all say yes.
00:11:34.740 So, I'm not going to tell you anything else because I don't want to be a spoiler.
00:11:38.360 But just know that if you're following the story and you wonder what it means if someone refers to Twitter as forming a global consciousness
00:11:50.000 and what it means to extend the light of consciousness, that those are not crazy ideas.
00:11:56.780 They're not even slightly crazy.
00:11:59.520 Because Twitter is indeed that.
00:12:03.880 But it gets lost in the fact that it's a bunch of other things.
00:12:07.200 It's whiteboard time.
00:12:10.240 Whiteboard.
00:12:11.080 We've got the whiteboard.
00:12:12.840 That needs some kind of a drum intro, doesn't it?
00:12:16.240 I'm going to have to record a drum intro for whiteboard.
00:12:21.240 All right.
00:12:22.320 So, when we're looking at Twitter, it's all of these different things.
00:12:25.860 It's like a cesspool of hate and fake news.
00:12:28.760 True.
00:12:30.420 It's a job for some of us, such as myself.
00:12:34.640 You know, for me, Twitter is a career.
00:12:37.200 You know, it's how you reach people.
00:12:39.440 It's how you build an audience.
00:12:41.280 It's basically just how you connect.
00:12:44.860 So, if you have a certain kind of job, it's part of your job.
00:12:51.060 It's also an addiction for other people.
00:12:53.860 You get a little dopamine hit every time you get one on the other team.
00:12:57.100 And, as Jack has noted, you can think of it as a global consciousness.
00:13:04.640 Now, the shortcut of that is that Twitter is like a meta-brain.
00:13:10.660 Or, let's say, let's say that Twitter has a mind of its own that looks like something like the majority opinion.
00:13:19.020 But, there's always a minority opinion.
00:13:22.140 If you were to say to yourself, but, Scott, how can it be like a global consciousness?
00:13:28.180 At the same time, it's like a cesspool of hate and fake news.
00:13:31.680 Well, let me ask you this.
00:13:33.800 What do you think your brain is?
00:13:36.080 Your brain is a cesspool of hate and fake news.
00:13:39.420 It just has other stuff, too.
00:13:41.480 It's got some love in there, I hope, some empathy.
00:13:44.420 You know, it's got some charity and some industry and some good intentions.
00:13:48.980 It's got a lot of stuff in there.
00:13:50.120 But, your brain is a hot mess.
00:13:51.980 Do you have one clean thought every day?
00:13:56.360 No.
00:13:57.540 It's like this boiling cesspool soup of awesomeness sometimes and such dark thoughts that you can't even express them to another human because you think you'll be burned as a monster.
00:14:11.220 That's where your brain is.
00:14:12.760 That's where your consciousness is.
00:14:14.380 So, if your consciousness is a royal cesspool of badness, but, weirdly, you put all of that awfulness together and it can produce a human who does some good stuff, who contributes to the economy, takes care of a family, does all kinds of good stuff.
00:14:41.680 So, Twitter's like that.
00:14:44.380 It's like this insane extreme of everything that you can barely tolerate sometimes.
00:14:54.900 Do you know what else is like that?
00:14:56.860 Do you know what else is like an extreme, like, you know, thoughts in every direction and you can barely tolerate it?
00:15:03.320 Your own mind.
00:15:06.300 Most of us can barely tolerate our own minds half the time.
00:15:11.120 Right?
00:15:11.300 Because it's just a mess in there.
00:15:15.040 Now, and by the way, this is part of what I call the basket case theory.
00:15:19.540 If you're not old enough to have learned this yet, I can save you some time.
00:15:24.920 Everybody's a basket case once you get to know them.
00:15:32.060 Until you do, and you don't know much about their inner thoughts or their actual life.
00:15:36.560 You think, well, there's somebody who's got it together.
00:15:38.820 If only I could have the clarity of thought and the happiness that I'm observing in this complete stranger.
00:15:45.960 If only I could have that.
00:15:47.460 And then you live a little bit, and you start noticing this pattern, you're thinking, I've met a lot of people.
00:15:57.420 Why are all the people that I've met really messed up on the inside?
00:16:04.000 Because once they get to know you, they really, you know, they'll vomit out their inner thoughts.
00:16:10.140 And you're like, whoa, that's as bad as my inner thoughts.
00:16:12.880 I thought I was the only one who had thoughts like that.
00:16:16.860 So anyway, I wouldn't worry about the fact that there's a lot of awfulness within the Twitter global consciousness.
00:16:26.900 Because that's just like you.
00:16:29.420 And you have some good points, too.
00:16:33.460 And if you want to know more about that on a concept level, God's Debris is the book you want to read.
00:16:42.900 It only takes about an hour and a half.
00:16:44.940 And it's written with hypnosis technique, which I disclose in the beginning of the book.
00:16:52.040 So if that makes you uncomfortable, you should not read it.
00:16:55.800 But the technique is used to give you a, let's say, an experience when you read the book that goes beyond the story.
00:17:05.040 So it's how to make it an experience and not just a story.
00:17:10.000 All right.
00:17:11.660 So here are all the angles that people are taking on Elon Musk buying Twitter.
00:17:18.660 One angle is that Elon has, or Tesla, has too many connections to China.
00:17:23.460 So they've got a China factory and want to sell cars in China, and it's the biggest market.
00:17:30.300 And would that not, would that not cause Elon Musk to be biased?
00:17:36.300 And would it not cause him to do things that are a little bit pro-China?
00:17:42.200 Because he might have a trillion dollars riding on it.
00:17:45.500 What do you think?
00:17:46.920 Remember, I said follow the money.
00:17:48.400 So if you follow the money, and he might have a trillion dollars, you know, riding on making China happy,
00:17:57.400 would you expect a trillion dollars would influence him?
00:18:00.400 Well, if anybody were not to be influenced by a trillion dollars, it might be him.
00:18:13.940 But you have to start with the baseline assumption that all humans are influenced by a trillion dollars.
00:18:19.580 Like, even if you ranked who could resist it the most, you know, and even if you said,
00:18:28.080 all right, of all the people in the world, he could resist it the most, because he's already the richest person.
00:18:33.820 I mean, you'd imagine that would be the person who could resist it best.
00:18:38.200 But even if he were the best at it, it's a trillion dollars, right?
00:18:43.220 Nobody's going to be, no one is immune to a trillion dollars.
00:18:46.520 I can't even, it's not imaginable.
00:18:50.000 Even if you justified it as, you know, I'll use it to feed the poor or something.
00:18:56.240 Like, you would always have a reason why that trillion dollars should actually make a difference to you.
00:19:01.860 But, that said, what has both Jack Dorsey and Elon Musk said about what should be done with Twitter?
00:19:11.600 Well, it turns out that it looks like they're on exactly the same page.
00:19:16.120 That as long as the algorithm is transparent, what can China do?
00:19:22.500 To which I think, oh yeah, what can they do?
00:19:26.440 The entire point is to make it transparent.
00:19:29.680 That's the entire point.
00:19:31.940 If Elon Musk doesn't make it transparent, well, then the whole thing's a waste of money.
00:19:36.900 And, you know, it's a step backwards, probably.
00:19:42.320 But, it's everything he says he's going to do.
00:19:45.320 You know, when was the last time he said he was going to do something, and then it turned out that he was kidding?
00:19:51.780 Like, not something on this scale, right?
00:19:54.320 He's very clear what he's going to do.
00:19:57.720 Jack Dorsey is very clear that it should be done.
00:20:00.200 And, I think he even mentioned that the current CEO of Twitter is also on that same page.
00:20:07.600 So, one wonders who was not on that page.
00:20:12.460 Well, the hints that we have are possibly the board and possibly other owners of Twitter who had corporate interests.
00:20:23.980 So, if you read what Jack Dorsey is tweeting, it would seem that there's some kind of Wall Street corporate influence that was on Twitter that might have even been more than the influence of Jack Dorsey.
00:20:37.600 Like, that's the only way I could read it, is that if Jack Dorsey wanted things to be one way, he couldn't get it done.
00:20:46.080 So, he couldn't get the very thing he kept saying in public a lot, right?
00:20:50.400 Because it's not like this is a new thing.
00:20:52.100 Like, Jack Dorsey's been saying this for a long time.
00:20:57.340 You know, ever since it became an issue, he's been saying it should be, you know, you should have a choice of algorithms,
00:21:01.940 it should be, you know, transparent and stuff, and shouldn't have corporate ownership.
00:21:06.360 He said for a long time that he sort of regrets, not sort of, that he regrets the business model that Twitter became.
00:21:14.940 He says it directly.
00:21:15.780 So, all the people who seem to be having the biggest influence on where Elon Musk would go with this seem to think that he could get to transparency and that it's doable.
00:21:31.160 What do you think?
00:21:32.740 Doable or not?
00:21:33.780 Because otherwise, the China thing is completely valid.
00:21:38.300 If it's not completely transparent, that trillion dollars is certainly going to make a difference, I would say.
00:21:45.840 Here's what Elon Musk is good at.
00:21:56.280 He's a product guy.
00:21:58.660 Now, when you look at Twitter, isn't it the thing that just screams, it needs a feature, right?
00:22:05.840 Like, why can't I have the feature of the edits, or why can't I have the feature that, you know, maybe gives me more information about if I'm shadow banned,
00:22:16.160 or why can't I have a feature that gives me everything unfiltered, you know?
00:22:22.500 So, you think of Twitter, you say, give me a bunch of features, and then I'll be happy.
00:22:27.560 And who's better at that?
00:22:31.540 Like, Elon Musk is like sort of the ultimate figure out what features make sense and build it and give it to you kind of a guy.
00:22:38.500 So, that all looks good.
00:22:42.240 Who should be the most worried people in the universe right now?
00:22:48.560 Crystal Ball.
00:22:49.660 I'm going to go with the marketing department at Twitter.
00:22:52.500 I feel like the marketing department at Twitter, probably putting their resumes together,
00:23:02.600 because famously, Elon Musk doesn't need marketing for Tesla, because he just tweets.
00:23:12.740 What do you think?
00:23:13.820 Do you think he needs a marketing department for Twitter?
00:23:17.800 He's literally doing that while he does his other job.
00:23:20.900 You know, I joked, and I think he liked that tweet, that I joked that he does the entire marketing department for Tesla while he's on the toilet.
00:23:30.480 You know, because I think there might have been a joke about him tweeting from the toilet or something.
00:23:36.480 And now he could add the running Twitter's marketing to it, too.
00:23:44.800 I mean, he could just do both of them before he's done with his business there.
00:23:48.960 Now, literally, I'm not joking, what possible good does the Twitter marketing department do to anybody?
00:23:58.580 You know, if you know that Elon Musk is just going to be doing what they do times a thousand with just a few tweets.
00:24:07.440 Can Elon make the business model profitable?
00:24:10.740 Well, it's already profitable, right?
00:24:12.380 Wait, wait, are you suggesting that Twitter's not profitable?
00:24:19.060 I think the question is how profitable.
00:24:23.340 Right?
00:24:27.280 All right.
00:24:29.540 Now, here's a question.
00:24:31.580 Have you seen a bunch of people say that their number of followers is way up?
00:24:36.760 And that it's people who are, let's say, conservative-leaning, and their number is up?
00:24:43.860 Well, I will tell you that my number of people who follow me on Twitter is typically, on average,
00:24:51.420 and this is just really gross numbers, on average, 300 a day.
00:24:56.880 So I would get, and you could look back pretty far, and you'll see I get 300 a day.
00:25:01.760 Yesterday, when Twitter was sold to Elon Musk, I got 10 times that.
00:25:10.000 I got over 3,000.
00:25:12.640 So I went from 300 to 3,000.
00:25:16.240 But, but, if you think that's because, you know, Twitter is, you know, burning the old algorithms
00:25:25.640 and, you know, hiding the shadow banning so they can't get caught or anything like that,
00:25:30.440 maybe, maybe, I don't know what's going on.
00:25:33.920 But I think it's equally likely that I hear a lot of people are just coming back to Twitter.
00:25:39.980 So a lot of conservatives who might have followed me in the past,
00:25:43.760 but had stopped following me, may just be coming back to see what's going on, something like that.
00:25:49.080 So, so, so, um, I would say it's too early to know that this is because Twitter changed anything,
00:25:57.880 as opposed to Musk changing how people thought of Twitter,
00:26:03.120 which might have just brought people back.
00:26:04.840 Or maybe just to see what's going on, because it's the biggest story.
00:26:09.900 But some people are, like, I think Mike Cernovich just went through the roof, his numbers.
00:26:14.980 Like, just a crazy number of people followed him in one day.
00:26:19.080 Uh, and the funny thing is that Ivermectin is trending now.
00:26:24.200 So everybody's tried to say all the things that got them banned before.
00:26:28.060 And indeed, they're not getting banned.
00:26:29.880 But it's so obvious that they're saying it for, for humor purposes
00:26:34.420 that I'm not sure that they would get banned for that anyway.
00:26:37.880 Um, so, I don't know.
00:26:44.400 I'm not going to talk about Ivermectin.
00:26:46.340 It's just funny that it's trending.
00:26:47.920 Um, there's a clip of, uh, Ari Melber on MSNBC
00:26:53.540 who is talking about Musk buying Twitter.
00:26:57.980 And, uh, he's concerned about it.
00:27:01.340 And he's concerned that, that Musk could have too much influence
00:27:04.200 on something like an election.
00:27:05.600 So here's what he said.
00:27:06.460 He said, quote,
00:27:07.680 You could secretly ban one party's candidate,
00:27:11.880 secretly turn down the reach of their stuff,
00:27:14.640 and turn up the reach of something else,
00:27:16.640 and the rest of us might not even find out about it
00:27:19.620 until after the election.
00:27:21.020 Uh, well, yes, Ari Melber.
00:27:26.060 That is a risk.
00:27:31.080 It's only the thing we've been talking about for five fucking years.
00:27:35.120 Like, obsessing about it.
00:27:37.120 Like, the number one thing that, that, uh,
00:27:39.680 something like 50% of the country has been,
00:27:42.400 you know, just frantically, uh, you know, flailing about.
00:27:45.840 And suddenly he's like, hey, hey,
00:27:50.240 what if, what if somebody used Twitter to, like, affect an election?
00:27:55.620 What, what would happen then?
00:27:58.080 Oh, my God.
00:28:01.000 Uh, I, I think Tucker Carlson is already back on Twitter.
00:28:05.360 I don't know if that was because he was unbanned
00:28:07.380 or he just wanted to get back on.
00:28:09.980 Um, I swear,
00:28:14.160 I think Brian Stelter only exists for our entertainment
00:28:19.000 and not in the way that he's hoping.
00:28:21.380 Uh, you know, I don't have any,
00:28:24.060 I don't have a bad feeling about him personally,
00:28:26.580 but from the perspective of someone who watches not just CNN,
00:28:34.180 he, he is a, he's a wonderfully, uh,
00:28:40.000 I don't know.
00:28:41.520 What is it that makes him so interesting?
00:28:46.760 Is it because you can't tell if he believes what he's saying?
00:28:50.080 Is that what it is?
00:28:51.500 It might be that.
00:28:52.800 That you're not entirely sure
00:28:54.140 if he believes it
00:28:55.820 or he knows he's supposed to be saying it, right?
00:28:57.980 Is that it?
00:28:59.780 You know, you're,
00:29:00.480 well, some of you are making unkind
00:29:02.480 comments about his physicality,
00:29:05.380 but I don't think it's that.
00:29:06.420 Because he, he could look like anything
00:29:08.920 if you agreed with him
00:29:10.020 and suddenly he'd be,
00:29:11.440 he'd be handsome to you.
00:29:14.480 I don't know.
00:29:15.620 But, uh, here's what he said,
00:29:17.800 uh, talking about Elon Musk buying Twitter.
00:29:20.280 He said, uh,
00:29:21.300 if you get invited to something
00:29:22.960 where there are no rules,
00:29:25.100 where there is total freedom for everybody,
00:29:27.340 do you actually want to go to that party?
00:29:29.500 Or are you going to decide to stay home?
00:29:31.740 Uh, to which many people,
00:29:36.260 uh, said before I could cleverly chime in
00:29:39.580 and they'd taken all the good jokes
00:29:41.220 and I believe I didn't.
00:29:43.260 But, uh,
00:29:45.580 yeah, yeah, that's the party
00:29:47.760 we do want to go to.
00:29:50.600 Um, um, yeah.
00:29:53.500 Yeah.
00:29:54.740 You mean the good party?
00:29:57.240 Yeah, we want to go to the good party.
00:30:00.820 Definitely.
00:30:01.740 Definitely.
00:30:02.380 Take me to the good party.
00:30:04.260 So, uh,
00:30:05.660 but it's just hilarious
00:30:06.560 to have him use this, uh, explanation.
00:30:09.920 Especially when you talk about freedom of speech.
00:30:13.000 You know,
00:30:13.980 yeah, I do want to go to the place
00:30:15.620 where they have freedom of speech.
00:30:17.700 As a matter of fact, I do.
00:30:19.900 I wouldn't mind that at all.
00:30:21.660 Now, I wouldn't mind also
00:30:22.840 having better tools
00:30:24.620 for filtering out people
00:30:25.860 whose opinions aren't adding value to me
00:30:28.860 for whatever reason.
00:30:29.620 So, um,
00:30:32.340 but it is hilarious
00:30:34.320 to see people,
00:30:35.560 let's say,
00:30:36.060 associated with the, uh,
00:30:38.200 the,
00:30:38.680 the propaganda on the left,
00:30:42.100 how concerned they are
00:30:43.960 that their main propaganda lever
00:30:45.780 just disappeared.
00:30:48.120 Can you imagine?
00:30:49.520 Because, you know,
00:30:50.760 I've been saying for a while
00:30:51.940 that Twitter isn't like
00:30:53.340 other media properties.
00:30:54.840 It's the lever that moves
00:30:56.620 the other properties.
00:30:58.360 So, whoever controls Twitter,
00:31:01.420 uh,
00:31:01.620 and,
00:31:02.100 and ideally,
00:31:03.320 having a transparent system
00:31:05.840 means that the public controls it,
00:31:07.560 um,
00:31:08.560 just by having,
00:31:09.820 you know,
00:31:09.980 a lot of sunlight on it,
00:31:11.040 ideally.
00:31:12.560 Um,
00:31:13.200 I don't know how you can get away
00:31:16.000 with fake news as easily.
00:31:18.260 I mean,
00:31:18.840 it seems like there'd be
00:31:19.660 much better checks and balances
00:31:20.800 on that stuff.
00:31:21.620 Both ways.
00:31:22.280 So, like,
00:31:22.660 not just the left,
00:31:23.720 but the right,
00:31:24.420 of course.
00:31:26.100 All right.
00:31:28.240 You know,
00:31:28.820 I was, uh,
00:31:30.320 I was actually thinking
00:31:32.240 I should be more of an activist
00:31:33.580 on this whole thing.
00:31:35.320 And,
00:31:35.880 I wanted to protest
00:31:38.100 all the billionaire ownership
00:31:40.280 of social media.
00:31:42.020 You know,
00:31:42.200 you got Facebook,
00:31:43.440 billionaires,
00:31:44.660 Washington Post,
00:31:45.720 New York Times,
00:31:46.720 billionaires,
00:31:47.460 you know,
00:31:47.580 all the media.
00:31:48.740 Seems owned by billionaires,
00:31:50.220 and then,
00:31:50.980 you got another billionaire
00:31:51.940 buying Twitter.
00:31:53.520 So,
00:31:53.820 all these billionaires.
00:31:55.620 And,
00:31:56.140 uh,
00:31:56.940 I thought,
00:31:57.460 I'm gonna quit,
00:31:58.360 I'm just gonna quit
00:31:59.040 all of social media.
00:32:01.420 Um,
00:32:03.360 you know,
00:32:04.000 I'm just gonna quit
00:32:04.780 all social media.
00:32:07.100 Sort of as a protest
00:32:08.300 against the billionaire ownership.
00:32:10.800 But then,
00:32:11.760 I didn't know how to tell anybody.
00:32:16.980 So,
00:32:17.520 I got kind of stymied there.
00:32:18.780 It's like,
00:32:19.980 well,
00:32:20.300 I want to protest.
00:32:22.400 I want to,
00:32:22.760 I want to quit all that stuff
00:32:24.060 and kind of make a statement.
00:32:27.160 Who,
00:32:27.520 who am I gonna tell?
00:32:29.360 How would I tell them?
00:32:31.360 So,
00:32:31.820 I was thinking
00:32:32.180 I might write letters.
00:32:34.940 Um,
00:32:35.500 possibly,
00:32:37.100 sort of a letter writing campaign.
00:32:40.100 Something like that.
00:32:42.020 Um,
00:32:42.860 now let's,
00:32:44.280 uh,
00:32:44.420 do my new segment
00:32:45.300 called
00:32:45.820 people who are
00:32:47.420 bad at economics.
00:32:49.680 People who are
00:32:50.260 bad at economics.
00:32:52.040 Bad at economics.
00:32:54.780 And,
00:32:55.320 today's candidate
00:32:55.940 will be
00:32:56.420 Representative
00:32:57.180 Pramila
00:32:58.380 Jayapal.
00:33:00.280 I believe
00:33:00.920 one of the squad.
00:33:02.580 Is she one of the squad?
00:33:05.160 Or,
00:33:06.440 she is, right?
00:33:07.820 No or yes?
00:33:08.480 I can't remember
00:33:10.100 if she's in the squad
00:33:11.000 or she's like
00:33:11.740 squad adjacent.
00:33:15.340 Okay.
00:33:16.320 She's close enough.
00:33:17.620 She's squad adjacent.
00:33:19.680 Anyway,
00:33:20.440 so,
00:33:20.820 uh,
00:33:21.500 she tweeted this
00:33:22.340 because of the news.
00:33:23.160 She said,
00:33:23.820 just a reminder
00:33:24.480 that from 2014 to 2018,
00:33:26.820 Elon Musk
00:33:27.720 paid an effective tax rate
00:33:29.760 of 3.27%,
00:33:31.820 whereas the average
00:33:32.880 working family
00:33:33.840 pays an average tax rate
00:33:35.020 of 13%.
00:33:35.480 And she says,
00:33:38.020 it's time for
00:33:38.680 a wealth tax
00:33:39.600 in this country.
00:33:41.280 No,
00:33:41.980 representative
00:33:42.600 Jayapal.
00:33:44.340 It's time for
00:33:45.020 an economics lesson
00:33:46.160 for representatives.
00:33:48.540 People who are
00:33:49.120 elected to represent
00:33:50.100 this country
00:33:50.700 should know
00:33:51.940 a little bit more
00:33:52.640 about economics.
00:33:54.200 Because,
00:33:54.900 do you know why
00:33:55.740 tax rules
00:33:58.840 are the way
00:33:59.420 that they are?
00:34:00.920 It's because
00:34:01.880 the government
00:34:02.580 picks winners
00:34:04.360 and losers.
00:34:05.480 And the government
00:34:06.860 wants people
00:34:08.220 who invest
00:34:08.980 in things
00:34:09.560 that create jobs
00:34:10.600 and make the climate
00:34:12.020 better in their view.
00:34:13.960 They want those people
00:34:15.340 to get big tax breaks
00:34:16.360 to allow
00:34:18.640 an Elon Musk
00:34:20.920 to survive
00:34:23.140 and thrive.
00:34:24.760 It's exactly
00:34:25.780 what the government
00:34:26.480 wants.
00:34:27.740 Do you know
00:34:28.040 why the government
00:34:28.740 wants people like,
00:34:31.380 well,
00:34:31.740 entrepreneurs in general,
00:34:32.900 but do you know
00:34:33.360 why the government
00:34:33.780 wants entrepreneurs
00:34:34.720 to pay low taxes?
00:34:38.660 It's so that
00:34:39.640 they'll build
00:34:40.360 trillion dollar companies.
00:34:44.560 Did I have to
00:34:46.600 explain that?
00:34:48.020 Like,
00:34:48.480 it's not random.
00:34:51.160 Does she think
00:34:52.080 that these things
00:34:52.740 are randomly
00:34:53.360 handed out?
00:34:55.000 They're not random.
00:34:56.480 It's because
00:34:57.100 people do things
00:34:58.740 for their own benefit.
00:34:59.780 If the government
00:35:01.360 decided that
00:35:02.320 somebody should have
00:35:03.020 a tax benefit,
00:35:03.900 it's because the government
00:35:04.760 said that's probably
00:35:06.000 good for the government
00:35:06.820 too.
00:35:07.600 It's good for everybody.
00:35:09.580 There's a reason
00:35:10.540 for every tax rate,
00:35:12.420 everybody who gets
00:35:14.460 a break.
00:35:14.920 it's not random.
00:35:18.540 It's based
00:35:19.240 on economics.
00:35:20.640 And if you found
00:35:22.100 something
00:35:22.580 that was clearly
00:35:24.020 just bad economics,
00:35:26.840 like it didn't
00:35:27.560 create the right
00:35:28.620 incentives,
00:35:29.680 well then,
00:35:30.000 yeah,
00:35:30.200 you should change
00:35:30.760 that.
00:35:31.480 And we do.
00:35:32.740 You know,
00:35:33.040 when we discover
00:35:34.100 that that's the case.
00:35:35.460 But to imagine
00:35:36.540 there's something
00:35:37.080 that is designed
00:35:38.440 and working
00:35:39.360 exactly like
00:35:40.300 you wanted it to,
00:35:41.900 creating incentives
00:35:42.820 for exactly
00:35:43.480 the right people,
00:35:45.800 I don't know.
00:35:47.000 I feel as if
00:35:48.040 we deserve
00:35:48.800 better representatives.
00:35:50.940 And,
00:35:51.600 you know,
00:35:51.820 we always talk
00:35:52.340 about how
00:35:52.920 the low-income
00:35:54.720 people in this country
00:35:55.560 should get
00:35:56.120 something like
00:35:57.460 financial investment
00:35:59.220 advice,
00:36:00.120 you know,
00:36:00.240 how to manage money,
00:36:01.880 basic lessons
00:36:02.720 on money management,
00:36:04.100 which I think
00:36:04.680 would be an amazing
00:36:05.480 addition to school.
00:36:07.220 but seriously,
00:36:10.380 we should not
00:36:11.500 have people
00:36:12.060 in Congress
00:36:12.720 whose understanding
00:36:14.140 of economics
00:36:15.920 is so low
00:36:16.920 that they don't
00:36:18.100 understand
00:36:18.620 that,
00:36:21.260 you know,
00:36:21.580 the tax system
00:36:22.680 is set up
00:36:23.220 for a reason.
00:36:25.340 It's not random.
00:36:27.260 All right.
00:36:28.460 I saw a little
00:36:29.400 thing going around
00:36:30.160 the Internet
00:36:30.720 that said
00:36:32.620 there was some study,
00:36:33.900 who knows how
00:36:34.620 reliable studies are,
00:36:36.140 but it was funny.
00:36:38.000 So it said that
00:36:38.560 men who help
00:36:39.200 the most
00:36:39.740 with housework,
00:36:42.000 so that
00:36:42.360 wherever the man
00:36:43.760 is most close
00:36:45.060 to doing
00:36:46.220 an equal amount
00:36:46.900 of housework
00:36:47.540 to the woman,
00:36:49.200 they also have
00:36:50.440 the highest likelihood
00:36:51.300 of divorce.
00:36:53.820 And,
00:36:54.340 and people
00:36:56.560 were expressing
00:36:57.400 some confusion
00:36:58.900 and surprise
00:36:59.800 that that would
00:37:00.900 be the case.
00:37:03.440 So,
00:37:04.000 I tweeted
00:37:06.880 that,
00:37:07.260 of course,
00:37:07.580 I know the answer
00:37:08.500 to this,
00:37:08.920 but I'm certainly
00:37:09.580 not going to say
00:37:10.180 it in public.
00:37:11.880 But,
00:37:12.900 I'll give you
00:37:13.760 some things
00:37:15.760 that people
00:37:16.120 are saying
00:37:16.520 about it.
00:37:17.640 Number one,
00:37:20.340 there are lots
00:37:21.740 of other correlations,
00:37:22.960 are there not?
00:37:24.360 That where you have
00:37:25.000 a situation
00:37:25.580 where two people
00:37:26.560 are doing nearly
00:37:27.220 the same amount
00:37:28.020 of housework,
00:37:29.520 there are probably
00:37:30.500 other things
00:37:31.300 going on,
00:37:31.980 are there not?
00:37:32.580 like,
00:37:36.380 one of them
00:37:38.340 is not
00:37:38.900 a billionaire
00:37:40.160 working 14 hours
00:37:41.960 a day,
00:37:43.460 and,
00:37:45.180 well,
00:37:45.420 that's a bad
00:37:45.880 example,
00:37:46.400 because in that
00:37:46.820 case,
00:37:47.120 they'd have
00:37:47.400 housekeepers.
00:37:48.560 But,
00:37:49.900 clearly,
00:37:50.920 there are a lot
00:37:51.420 of other things
00:37:52.060 going on,
00:37:53.320 and I would
00:37:55.500 like to offer
00:37:56.620 absolutely no
00:37:57.960 opinion whatsoever
00:37:59.040 about why
00:38:00.780 this is sort
00:38:02.080 of obvious
00:38:02.640 what's going
00:38:03.240 on here.
00:38:05.120 But,
00:38:05.720 I'm not going
00:38:06.060 to say it out
00:38:06.540 loud.
00:38:07.740 You can.
00:38:09.700 You all
00:38:10.480 risk-takers,
00:38:11.340 you.
00:38:12.480 Just,
00:38:12.900 just,
00:38:13.380 I'm not even
00:38:15.960 going to read
00:38:16.440 your comments,
00:38:17.540 because they're
00:38:18.460 so inappropriate.
00:38:20.000 The anti-wokeness,
00:38:21.300 or the wokeness,
00:38:22.300 or the anti-wokeness
00:38:23.460 that I'm seeing,
00:38:23.960 is disgusting.
00:38:30.280 All right.
00:38:31.800 So,
00:38:32.300 you can fill in
00:38:32.900 your own jokes
00:38:33.480 on that one.
00:38:35.200 Former DNI
00:38:36.340 John Ratcliffe,
00:38:38.260 I guess he was
00:38:38.840 on
00:38:39.560 Charlie Kirk's
00:38:42.820 show,
00:38:43.420 and he said
00:38:44.240 to expect
00:38:44.940 more indictments
00:38:45.760 coming from
00:38:46.740 the Durham thing,
00:38:47.880 and the whole
00:38:48.820 fake
00:38:50.540 Russian collusion
00:38:52.400 thing.
00:38:52.700 Now,
00:38:54.180 apparently,
00:38:54.960 Ratcliffe must
00:38:55.800 have seen
00:38:56.200 documents that
00:38:56.940 the rest of us
00:38:57.600 have not,
00:38:58.580 because he says
00:38:59.220 that when more
00:38:59.880 classified documents
00:39:00.920 come out,
00:39:02.220 we're going to
00:39:03.200 be kind of
00:39:05.120 shocked.
00:39:06.820 He said it
00:39:07.720 will appall
00:39:08.480 the public,
00:39:09.500 if they're
00:39:10.260 declassified.
00:39:12.300 So,
00:39:12.940 how much do
00:39:13.700 you want to
00:39:14.020 see the things
00:39:14.560 that would
00:39:14.860 appall us?
00:39:16.820 I'd really
00:39:17.480 like to see
00:39:18.040 those.
00:39:19.240 And I don't
00:39:20.040 imagine there's
00:39:20.640 any reason
00:39:21.020 they couldn't
00:39:21.580 be
00:39:21.800 declassified.
00:39:23.980 So,
00:39:24.480 I feel like
00:39:25.880 one of the
00:39:26.320 strategies that
00:39:27.280 bad people
00:39:27.980 use in
00:39:28.860 politics is
00:39:29.920 to just
00:39:30.280 make sure
00:39:30.640 that the
00:39:31.120 investigation
00:39:31.680 lasts long
00:39:32.660 enough that
00:39:33.800 we don't
00:39:34.140 care about
00:39:34.540 the issue
00:39:35.000 anymore.
00:39:36.560 Because
00:39:36.760 imagine how
00:39:37.320 hot the
00:39:37.780 issue was
00:39:38.480 at one
00:39:38.900 point.
00:39:39.280 But every
00:39:39.560 year that
00:39:39.860 goes by,
00:39:41.040 even I,
00:39:42.380 who talked
00:39:43.060 about it a
00:39:43.560 lot,
00:39:44.360 I start to
00:39:45.180 lose interest
00:39:45.740 in it over
00:39:46.200 time.
00:39:46.700 It's like,
00:39:47.040 yeah,
00:39:47.220 Russia collusion,
00:39:48.220 how many more
00:39:48.740 times are we
00:39:49.080 going to mention
00:39:49.440 that?
00:39:49.720 So by
00:39:50.680 the time
00:39:50.960 you find
00:39:51.340 out exactly
00:39:52.020 what the
00:39:52.400 plot was
00:39:53.040 and exactly
00:39:53.660 who was
00:39:54.300 behind it,
00:39:55.160 you've already
00:39:55.900 moved on to
00:39:56.500 a new
00:39:57.080 outrage,
00:39:57.840 so your
00:39:58.620 attention
00:39:58.980 doesn't get
00:39:59.500 the same
00:39:59.920 hit.
00:40:05.600 Here is the
00:40:06.600 weirdest thing
00:40:07.920 that's happening.
00:40:09.340 Now,
00:40:10.000 if you study
00:40:11.520 cognitive dissonance
00:40:12.700 and confirmation
00:40:13.460 bias and all
00:40:14.220 that,
00:40:14.440 you know
00:40:17.180 what they
00:40:17.480 mean,
00:40:18.580 but when
00:40:19.020 you see
00:40:19.340 an example
00:40:19.880 that's
00:40:20.180 really clear,
00:40:21.620 it's still
00:40:22.100 shocking,
00:40:23.580 because
00:40:24.160 cognitive
00:40:24.800 dissonance
00:40:25.300 is one
00:40:25.640 of those
00:40:25.880 things that
00:40:26.420 you can
00:40:27.720 see if
00:40:28.120 you don't
00:40:28.420 have it,
00:40:29.480 but the
00:40:29.840 person who's
00:40:30.400 in it
00:40:30.720 can't see
00:40:31.240 it.
00:40:32.280 And you
00:40:32.760 think,
00:40:33.140 my God,
00:40:33.620 how do you
00:40:34.200 not see
00:40:34.560 this?
00:40:34.880 So here's
00:40:35.200 an example.
00:40:36.740 I tweeted
00:40:37.380 this if you
00:40:37.880 want to go
00:40:38.180 look for
00:40:38.440 yourself,
00:40:38.820 but there's
00:40:39.120 a Yahoo.com
00:40:40.120 article,
00:40:41.660 and it's a
00:40:42.180 recent one,
00:40:43.200 which is
00:40:43.600 interesting,
00:40:44.140 which they're
00:40:44.660 talking to
00:40:45.120 Dr.
00:40:45.560 Birx,
00:40:46.720 one of
00:40:47.220 Trump's
00:40:47.960 pandemic
00:40:49.080 experts.
00:40:50.700 And they
00:40:51.640 talked about
00:40:52.200 how she
00:40:53.140 felt when
00:40:54.080 Trump
00:40:54.700 allegedly
00:40:55.320 suggested
00:40:56.560 drinking
00:40:57.740 bleach.
00:41:00.280 Now,
00:41:01.100 here's the
00:41:01.640 weird part.
00:41:02.720 You've seen
00:41:03.220 lots and
00:41:04.100 lots of
00:41:04.480 reports about
00:41:05.640 this story
00:41:06.480 that's not
00:41:06.940 true.
00:41:08.060 Trump
00:41:08.360 never suggested
00:41:09.380 drinking or
00:41:10.800 injecting
00:41:11.340 bleach or
00:41:12.500 any kind
00:41:13.020 of liquid
00:41:13.620 disinfectant.
00:41:15.660 He did
00:41:16.040 talk about
00:41:16.700 shooting
00:41:17.320 UV light
00:41:18.520 into the
00:41:18.980 lungs,
00:41:19.580 which was
00:41:20.360 being tested
00:41:21.380 at that
00:41:21.880 time at
00:41:22.380 Cedars-Sinai.
00:41:24.020 And he
00:41:25.300 was very
00:41:25.640 careful to
00:41:26.180 say it
00:41:26.460 was UV
00:41:26.800 light,
00:41:27.580 both before
00:41:28.440 he talked
00:41:28.840 about it,
00:41:29.760 and then
00:41:30.100 after he
00:41:30.600 was done
00:41:30.880 talking about
00:41:31.460 it,
00:41:31.660 he bookended
00:41:33.020 it by saying,
00:41:33.860 yeah,
00:41:34.060 but UV
00:41:34.500 light.
00:41:35.480 So he
00:41:35.780 made sure
00:41:36.140 that UV
00:41:37.000 light is
00:41:37.540 what he
00:41:37.760 was talking
00:41:38.120 about as
00:41:38.740 a disinfectant
00:41:39.660 injected into
00:41:41.220 the body.
00:41:42.780 So here's
00:41:44.320 what's weird
00:41:44.680 about the
00:41:45.040 Yahoo article.
00:41:46.340 They showed
00:41:46.960 his actual
00:41:47.800 quote,
00:41:48.560 including the
00:41:49.280 part about
00:41:49.700 the light,
00:41:50.300 which completely
00:41:52.340 debunks the
00:41:53.440 hoax.
00:41:54.840 I mean,
00:41:55.080 just clearly.
00:41:56.260 There's the
00:41:56.720 quote.
00:41:57.680 There's him
00:41:58.140 talking about
00:41:58.760 light as the
00:41:59.500 disinfectant,
00:42:00.700 so you can
00:42:01.320 see exactly
00:42:01.820 what he
00:42:02.120 meant.
00:42:03.240 And then
00:42:03.740 the rest of
00:42:04.420 the article
00:42:05.000 acts as if
00:42:06.720 that didn't
00:42:07.260 happen.
00:42:09.340 It's so
00:42:10.180 weird.
00:42:11.460 It just
00:42:11.960 goes on as
00:42:12.920 if they
00:42:13.220 hadn't just
00:42:13.700 debunked one
00:42:15.000 of the biggest
00:42:15.700 hoaxes in
00:42:18.180 the nation.
00:42:19.480 Just acted
00:42:20.440 like it
00:42:20.840 didn't happen.
00:42:22.560 They just
00:42:23.020 printed it
00:42:23.840 and then
00:42:24.260 just went
00:42:24.980 on like,
00:42:26.640 sorry,
00:42:27.300 this lady
00:42:27.700 had to
00:42:28.100 endure this
00:42:28.680 horrible hoax
00:42:29.620 or this
00:42:30.340 horrible thing
00:42:30.920 that they
00:42:31.760 just said
00:42:32.160 didn't happen.
00:42:32.740 I don't
00:42:34.860 know what
00:42:35.640 to think
00:42:35.900 about it.
00:42:36.360 It's just
00:42:36.840 like so
00:42:37.200 mind-blowing.
00:42:38.100 Now,
00:42:38.400 I can
00:42:38.720 explain it.
00:42:40.160 It's easily
00:42:41.080 explained.
00:42:42.680 It's
00:42:42.920 cognitive
00:42:43.300 dissonance.
00:42:46.820 I mean,
00:42:47.620 it's easy.
00:42:48.580 But to
00:42:49.100 watch an
00:42:49.660 example that
00:42:50.280 is that
00:42:51.400 clean is
00:42:52.980 just weird.
00:42:55.380 Now,
00:42:56.480 I think
00:42:58.240 I've
00:42:58.500 delivered on
00:43:00.320 the best
00:43:01.480 live stream
00:43:02.200 you've
00:43:03.140 ever seen
00:43:03.560 in your
00:43:03.820 life.
00:43:05.020 We are,
00:43:06.200 in fact,
00:43:06.620 forming a
00:43:07.300 global
00:43:07.660 consciousness
00:43:08.220 with Twitter
00:43:09.840 and with
00:43:10.960 what we're
00:43:11.300 doing here,
00:43:12.020 which is
00:43:12.320 sort of
00:43:12.840 an adjunct
00:43:13.520 to all
00:43:13.900 that stuff.
00:43:15.900 As I've
00:43:16.660 said before,
00:43:18.040 I believe
00:43:18.600 that there's
00:43:19.280 an emerging
00:43:20.280 thing called
00:43:21.080 the Internet
00:43:21.760 Dads,
00:43:23.920 meaning
00:43:24.400 people who
00:43:25.800 generally
00:43:27.520 have
00:43:27.920 figured out
00:43:29.940 how to
00:43:30.240 meet their
00:43:30.660 own needs
00:43:31.320 and are
00:43:32.540 just trying
00:43:33.000 to be
00:43:33.280 useful.
00:43:35.080 And I
00:43:35.740 try to be
00:43:36.180 one of
00:43:36.500 those.
00:43:37.720 I'm very
00:43:38.460 intentionally
00:43:39.020 trying to
00:43:39.620 be useful
00:43:40.160 because I'm
00:43:41.200 at that
00:43:41.500 stage of
00:43:42.040 life.
00:43:42.820 A number
00:43:43.520 of other
00:43:43.860 people just
00:43:44.700 like it.
00:43:45.600 Mike Cernovich,
00:43:47.080 probably the
00:43:47.900 best example,
00:43:49.020 of someone who
00:43:49.600 literally just
00:43:50.340 wakes up every
00:43:50.960 day and is
00:43:51.600 just trying to
00:43:52.300 be useful
00:43:52.760 on Twitter,
00:43:54.760 you know,
00:43:55.220 25 tweets a
00:43:56.240 day that are
00:43:56.860 pretty much
00:43:57.480 all great.
00:43:59.180 And the
00:44:00.580 number of
00:44:00.980 other people.
00:44:02.020 One of
00:44:02.440 those people
00:44:03.040 I think is
00:44:04.880 Elon Musk.
00:44:06.700 I see
00:44:07.460 Elon Musk
00:44:08.080 as, you
00:44:09.640 know, he's
00:44:09.900 sort of the
00:44:10.340 super dad,
00:44:11.280 but I see
00:44:11.720 him as an
00:44:12.140 Internet dad.
00:44:13.700 Like, I
00:44:14.040 don't even
00:44:14.400 see him,
00:44:15.440 it doesn't
00:44:15.940 feel exactly
00:44:16.660 like, you
00:44:17.320 know, a
00:44:17.600 patriot.
00:44:18.880 It doesn't
00:44:19.280 sound exactly
00:44:20.020 like politics.
00:44:21.580 It doesn't.
00:44:22.540 It's not like
00:44:23.120 politics.
00:44:24.580 It's not
00:44:25.020 exactly just,
00:44:26.020 you know,
00:44:26.280 rah, rah,
00:44:26.860 America first.
00:44:28.320 It's literally
00:44:29.220 just dad.
00:44:30.840 Am I
00:44:31.420 wrong?
00:44:32.380 Now, again,
00:44:33.220 of course, I'm
00:44:33.820 being sexist
00:44:34.920 here, so,
00:44:36.000 you know, let
00:44:36.320 me acknowledge
00:44:36.860 that in
00:44:38.060 this context,
00:44:39.080 dad means
00:44:39.700 men or
00:44:40.700 women who
00:44:41.220 have a
00:44:41.500 certain
00:44:41.780 sensibility
00:44:42.900 at a
00:44:43.620 certain place
00:44:44.080 in life.
00:44:45.000 You know, I
00:44:45.280 could have
00:44:45.540 called it
00:44:45.860 mom, you
00:44:47.860 know, in
00:44:48.060 a different
00:44:48.380 context.
00:44:49.280 But it
00:44:49.820 just feels
00:44:50.340 like that
00:44:50.960 dad vibe.
00:44:51.880 It's like,
00:44:52.240 okay, okay,
00:44:53.560 I'm just
00:44:54.220 going to buy
00:44:54.640 the company.
00:44:55.160 If you
00:44:56.800 kids won't
00:44:57.700 stop arguing,
00:44:58.760 I'm just
00:44:59.700 going to
00:44:59.960 buy the
00:45:00.360 company.
00:45:01.580 Doesn't that
00:45:02.180 sound like
00:45:02.600 dad?
00:45:03.900 It doesn't
00:45:04.720 sound like
00:45:05.140 mom, does
00:45:06.600 it?
00:45:07.200 Because mom
00:45:08.040 doesn't buy
00:45:08.560 the company
00:45:09.020 to make you
00:45:09.520 stop arguing.
00:45:11.140 That's just
00:45:11.800 dad.
00:45:12.740 Just dad.
00:45:14.480 And who
00:45:15.480 makes a bunch
00:45:16.340 of off-color
00:45:17.100 jokes while
00:45:18.260 he's doing
00:45:18.720 it?
00:45:19.680 Mom?
00:45:20.260 Does mom
00:45:21.580 make a
00:45:21.900 bunch of
00:45:22.260 off-color
00:45:22.900 sexual
00:45:23.580 scatological
00:45:25.580 jokes when
00:45:28.180 he's buying
00:45:29.100 a major
00:45:29.540 company that's
00:45:30.420 the lever
00:45:30.860 of civilization?
00:45:33.640 No.
00:45:35.220 Dad does.
00:45:36.540 Oh, yeah,
00:45:36.980 dad does.
00:45:38.220 That's why
00:45:38.700 they're called
00:45:39.380 dad jokes.
00:45:44.000 Exactly.
00:45:44.320 So, now,
00:45:48.220 how many,
00:45:48.920 you could
00:45:49.460 mention
00:45:49.800 probably another
00:45:50.780 dozen or
00:45:51.460 so dads,
00:45:52.280 could you?
00:45:53.200 But I'm
00:45:55.560 quite serious
00:45:56.280 in saying
00:45:56.900 that this
00:45:58.920 phenomenon
00:45:59.640 of
00:46:00.440 internet
00:46:02.360 dads who
00:46:03.760 only get
00:46:04.520 involved when
00:46:05.180 dad has to
00:46:05.840 get involved
00:46:06.420 is probably
00:46:07.780 a really
00:46:08.220 good thing.
00:46:09.600 Like, it
00:46:09.820 just creates
00:46:10.360 this one
00:46:10.860 extra positive
00:46:12.600 force.
00:46:13.180 Because I
00:46:13.460 don't think
00:46:13.820 that the
00:46:14.140 dads get
00:46:14.660 involved unless
00:46:16.160 there's sort
00:46:17.500 of a log
00:46:17.940 jam.
00:46:18.620 You know
00:46:18.820 what I
00:46:19.000 mean?
00:46:19.780 Because dad's
00:46:20.520 going to let
00:46:20.840 you fight.
00:46:22.320 That's like
00:46:22.880 the cool
00:46:23.220 thing about
00:46:23.620 the dad
00:46:24.080 vibe.
00:46:24.880 Mom is
00:46:25.420 going to
00:46:25.620 try to
00:46:25.900 stop you
00:46:26.500 from fighting
00:46:26.920 right away.
00:46:28.240 Dad might
00:46:28.840 let you
00:46:29.180 duke it
00:46:29.520 out a
00:46:29.780 little bit.
00:46:30.860 Let you
00:46:31.180 fight a
00:46:31.500 little bit.
00:46:32.600 But, you
00:46:33.000 know, when
00:46:33.500 it's time,
00:46:34.940 when it's
00:46:35.320 dad time,
00:46:36.800 well, then
00:46:37.560 you get
00:46:37.760 involved.
00:46:38.140 And, again,
00:46:39.440 not trying
00:46:40.000 to be sexist,
00:46:40.800 so you can
00:46:41.200 switch the
00:46:42.040 genders.
00:46:43.180 Any gender
00:46:43.900 preference you
00:46:44.500 want to
00:46:44.680 put on
00:46:44.940 that would
00:46:45.420 be LGBTQ
00:46:47.080 fine with
00:46:48.960 me.
00:46:51.020 It's just
00:46:51.660 an easy way
00:46:52.160 to explain
00:46:52.620 it in
00:46:53.140 classic
00:46:54.260 sexist
00:46:54.800 terms.
00:47:00.420 All right.
00:47:01.080 Ashley says,
00:47:11.400 can hypnosis
00:47:12.080 increase
00:47:12.800 sex drive?
00:47:14.600 Would you
00:47:15.280 like me to
00:47:15.800 answer that
00:47:16.160 question to
00:47:16.560 anybody else?
00:47:18.600 Can hypnosis
00:47:19.660 increase
00:47:20.420 sex drive?
00:47:21.860 Well, let me
00:47:22.420 ask you this.
00:47:24.100 Have you
00:47:24.500 noticed that
00:47:25.040 your sex drive
00:47:25.940 seems to be a
00:47:26.800 mental process?
00:47:27.600 And
00:47:29.200 that if you,
00:47:30.800 let's say,
00:47:32.260 spend some
00:47:34.020 time in the
00:47:34.480 same room
00:47:34.960 with, or
00:47:35.660 you're exposed
00:47:36.820 to someone
00:47:37.600 who's unusually
00:47:38.700 sexy, that
00:47:42.380 your brain
00:47:42.940 says, oh,
00:47:43.680 there's
00:47:44.020 something unusually
00:47:44.740 sexy, and
00:47:45.260 then your body
00:47:45.780 just responds.
00:47:47.840 So, can you
00:47:49.180 imagine that
00:47:49.800 hypnosis would
00:47:50.620 not help?
00:47:51.340 it seems
00:47:53.820 almost impossible
00:47:54.580 it wouldn't.
00:47:56.260 Am I right?
00:47:57.160 And the
00:47:57.940 answer is
00:47:58.320 yes.
00:47:59.700 You know,
00:48:00.280 yes, definitely.
00:48:02.120 Now, I'm not talking about somebody who's got, like, an actual medical problem, right?
00:48:06.860 So, if you have an actual medical problem, no, it's not going to help with that.
00:48:12.420 But if the problem is entirely about how you think about yourself, or it's entirely how you think about the topic, let's say it's because you're shy, let's say it's because you haven't found that one thought that really, you know, lights up your brain, there are a whole bunch of ways that simply re-engineering your thought process, which hypnosis is good at doing, can help, if you want it to, right?
00:48:40.680 So, it's the willingness to participate in it that makes it work; short of that, no.
00:48:47.100 But if you're willing to participate, a hypnotist who knew what they were doing could do it.
00:48:54.660 And be careful about unscrupulous hypnotists, because I'm not even sure how you'd protect against that, honestly.
00:49:02.560 So, it's a risky business.
00:49:04.360 But, in theory, it's not only doable, it's actually among the easiest things to do.
00:49:12.300 So, it's one thing that you could be quite reliably sure you could do.
00:49:16.360 Whereas, quitting cigarettes, for example, you wouldn't bet that hypnosis could do it.
00:49:22.660 It works about one time out of three.
00:49:25.240 Same with losing weight.
00:49:26.880 If you use hypnosis to lose weight or quit smoking, it works about one time out of three.
00:49:33.260 So does every other method.
00:49:37.200 In other words, hypnosis doesn't add something.
00:49:40.640 What really happens is people decide to quit those things.
00:49:44.580 So, somebody decides to quit smoking, and then they pick a method.
00:49:49.240 And it almost doesn't matter what method they pick, because they decide it.
00:49:53.260 They don't want to quit, they decide it.
00:49:57.380 So, about one in three people literally just decide, and then it doesn't matter what method they use, including hypnosis.
00:50:03.940 Same with eating.
00:50:06.280 About one in three people are just going to have that ability to just stop and eat differently for the rest of their lives.
00:50:15.000 The rest can't.
00:50:17.340 So, did the hypnosis help?
00:50:19.060 Well, it didn't matter what method they used.
00:50:21.320 As long as they had decided, the method became somewhat irrelevant.
00:50:26.580 It probably helped to have a method, because it gave them something that they thought was working, and made them feel more confident that they could get through it.
00:50:34.880 So, maybe it helped them psychologically in some way, but so would anything else.
00:50:39.120 Can hypnosis cure disease, such as mental illness?
00:50:53.380 Oh, here's something that could get me kicked off of YouTube.
00:50:58.900 That's a dangerous question.
00:51:00.700 That's a dangerous question, so naturally I'm going to answer it.
00:51:04.580 So, could hypnosis cure mental illness?
00:51:07.060 Number one, it depends how you define mental illness, because there might be some categories of things that are sort of a gray area that hypnosis would help in, but maybe I would say it's mental illness, and you might say, well, that's more of a personality or a situational thing, you know, that doesn't count.
00:51:28.560 It definitely can't cure schizophrenia.
00:51:32.380 It definitely can't cure anything that's like physical brain damage.
00:51:35.520 So, there are lots of things you can be sure it doesn't help with.
00:51:40.680 So, I would say if you took the basket of all mental illness, the odds that it would cure any one thing in there would be low.
00:51:48.040 But is there anything in the basket that it could cure?
00:51:53.060 And the answer is, yes, with a caveat.
00:51:59.280 Yes, with a caveat.
00:52:01.140 That reframing itself can be enough to help some people.
00:52:06.260 Like, simply thinking of something in a new way, and then, you know, practicing the thinking in a new way.
00:52:12.580 The practice is important, too.
00:52:14.300 That feels like hypnosis, and can work without hypnosis.
00:52:19.480 I mean, it's literally just something somebody can suggest to you, and then you just practice it, and it can make a big difference in your mood, for example.
00:52:26.520 I mention it too often, but it just fits every situation, it seems like.
00:52:34.220 The Dale Carnegie course teaches you to have less social anxiety.
00:52:40.760 And I would argue that it's a form of persuasion, brainwashing, hypnosis, that they don't brand that way, and they don't use any of those specific tools.
00:52:50.640 But their method is so persuasive that the design of the class ends up being almost like you've been hypnotized or rewired.
00:53:01.880 It's such a strong effect.
00:53:04.320 And most of the technique is just giving you compliments instead of any criticism.
00:53:10.040 They just make it safe to basically rewire your own brain.
00:53:14.260 So there's nothing that looks even slightly dangerous about it, and yet your brain is literally rewired.
00:53:20.640 But in such a positive way that you end up just recommending it.
00:53:25.340 Like, you don't worry about it.
00:53:26.600 You're like, oh, that was good.
00:53:28.160 You should rewire your brain, too.
00:53:30.380 So the answer is, if the hypnotist knew how to reframe, and then used the hypnosis to, let's say, reinforce the reframe, to cause you to think about the reframe more than the other thing, that could be done.
00:53:46.820 But it would require a certain level of skill, a pretty high level of skill.
00:53:52.460 But in theory, yes.
00:53:55.380 In theory, for some of the most, I'd say, for the most treatable forms of mental illness, you know, lower on the severity scale, you almost certainly could get the same benefit that, say, therapy would give you.
00:54:11.820 And for some people, well, nothing is universal.
00:54:16.280 For some people, it might be almost instant, like a day or two.
00:54:21.220 But also, there are people who can, with therapy, fix you in a day or two.
00:54:26.740 So, you know, there isn't really any science that I can rely on for this stuff.
00:54:32.160 So, the way therapy would fix you in a day or two is with a reframe that just made you say, whoa.
00:54:39.620 Let me give you an example.
00:54:41.560 Here's a reframe.
00:54:44.040 Somebody thinks that they don't have any worth.
00:54:47.680 They just have low self-esteem.
00:54:50.160 Here's a reframe.
00:54:51.860 They say, do you feel other people are worthless?
00:54:55.480 No.
00:54:57.000 Like, why not?
00:54:58.540 Because, you know, some have all kinds of flaws and stuff.
00:55:01.300 And people will say, well, everybody's got flaws.
00:55:03.640 Like, I'm not going to judge them for, you know, their anything.
00:55:07.600 Like, you know, they're good at some things.
00:55:09.440 I'm good at some things.
00:55:11.380 And then you say, well, do you think they're judging you?
00:55:16.340 Why do you think they're thinking anything different?
00:55:18.660 And then you say, how much do you actually think about people other than your immediate group that you deal with?
00:55:25.280 Do you ever think about them?
00:55:27.400 And they'll say, not really.
00:55:29.260 Do you care that much?
00:55:31.140 Not strangers.
00:55:32.960 And that's just a couple of reframes to get you to understand that your feeling of low self-esteem doesn't fit any mental model.
00:55:42.900 So it doesn't immediately make you feel better.
00:55:46.420 But if you just keep telling yourself, wait a minute, other people really don't judge me, and I'm not judging other people, so what does it even mean to feel like you have low value?
00:55:56.280 It actually starts losing its meaning.
00:56:00.220 And if you just say, well, isn't everybody good at something and bad at other things?
00:56:05.480 You think, yeah, I can't even think of an exception.
00:56:08.220 Like, everybody's better at some things and worse at others.
00:56:11.540 If you were to judge people by their mistakes, you'd hate everybody, so that doesn't work.
00:56:16.180 If you judge people by what they're good at and bad at, again, you'd hate everybody, except some small number of people, I guess.
00:56:23.160 So there's no standard by which you can measure anybody's worth.
00:56:27.240 So these are just reframes.
00:56:28.980 I'm just talking.
00:56:30.380 So this isn't hypnosis, but imagine that these same ideas could be reinforced through a hypnotic process so that somebody is just more likely to think in the positive frames than the less positive ones.
00:56:46.660 Totally doable.
00:56:48.240 But, you know, would it work for everybody now?
00:56:50.580 Because hypnosis doesn't work for everybody, not every hypnotist is good, etc.
00:56:58.600 All right.
00:57:01.580 And that, ladies and gentlemen, is your amazing, amazing live stream for today.
00:57:09.120 Best that's ever happened?
00:57:11.700 Can you confirm it's the best thing you've ever listened to in your whole life?
00:57:16.000 I think so.
00:57:17.400 I think so.
00:57:19.420 We all agree.
00:57:20.880 All right.
00:57:21.400 YouTube.
00:57:21.800 See you tomorrow.
00:57:23.040 See you tomorrow.
00:57:31.620 See you tomorrow.