Real Coffee with Scott Adams - November 06, 2022


Episode 1919 Scott Adams: Two Days Before Elections And The News Is Delicious & Stimulating. Join Us


Episode Stats

Length: 1 hour and 32 minutes

Words per Minute: 146.00072

Word Count: 13,575

Sentence Count: 1,126

Misogynist Sentences: 10

Hate Speech Sentences: 19


Transcript

00:00:00.880 Good morning, everybody, and welcome to the highlight of civilization, the day before
00:00:06.120 Election Day, and our minds are, I would say, focused.
00:00:12.620 I would say that things are starting to move in the right direction, maybe.
00:00:17.860 Maybe.
00:00:19.000 We'll see.
00:00:20.200 How would you like to take it up to a new level of awareness, to a higher dimension,
00:00:25.140 a higher level of performance than you have ever experienced before?
00:00:29.240 Well, all you need is a cup or a mug or a glass, a tank or a chalice, a stein, a canteen jug
00:00:33.500 or a flask, a vessel of any kind.
00:00:36.260 Fill it with your favorite liquid.
00:00:37.780 I like coffee.
00:00:39.800 And join me now for the unparalleled pleasure.
00:00:44.100 The dopamine hit of the day, the pre-election sip that you've been waiting for, it's all
00:00:48.920 going to happen now.
00:00:49.620 It's called the simultaneous sip.
00:00:52.000 Go.
00:00:52.340 Yes, it is laundry day.
00:00:59.720 I'm sorry I'm not wearing my usual shirt.
00:01:02.660 Sometimes you've got to do the laundry.
00:01:04.100 Well, I had been working pretty hard, but I managed to take some of my workload off by
00:01:17.140 hiring a part-time assistant.
00:01:20.060 So thankfully, a lot of the chores that one normally does, I managed to put off on my part-time
00:01:29.180 assistant, who immediately had a sporting accident and is on vacation right now.
00:01:35.840 So that didn't work out as well as I'd hoped, but it will.
00:01:41.460 It will.
00:01:42.540 It's going to get on track any moment now.
00:01:44.280 Have you seen the video of the Iranian women knocking the headgear off of clerics in Iran?
00:01:52.940 It's really fun to watch.
00:01:55.360 So, you know, you're watching the, I guess the government is cracking down on the women
00:02:01.460 who are trying to not wear the traditional, you know, face coverings and stuff.
00:02:07.820 And it's becoming a really big deal.
00:02:10.880 The protests continue.
00:02:12.020 I thought they would be stamped out by now, but they continue.
00:02:15.280 So maybe this has some legs.
00:02:17.700 But apparently there's a number of videos.
00:02:19.980 You can see a compilation where the young women, it looks like, will run up behind a cleric who's
00:02:25.560 got on the big hat.
00:02:27.500 What's the name of the headgear that a cleric wears?
00:02:32.760 It's not a hat, right?
00:02:35.220 Turban.
00:02:36.220 Turban?
00:02:37.860 Is it really a turban?
00:02:39.020 A headpiece.
00:02:43.740 All right.
00:02:44.520 Don't get all racist on me.
00:02:46.360 Let's call it a hat.
00:02:50.440 And here's the fun part.
00:02:52.340 The clerics tend to be not very athletic.
00:02:55.800 But the young women are, you know, they can just knock the hats off the clerics and just
00:03:00.320 run away.
00:03:00.820 Because there's nothing the cleric could do about it.
00:03:03.900 And I thought to myself, the mullahs or the clerics?
00:03:09.520 What would be the correct word?
00:03:11.900 Isn't a mullah a higher level cleric?
00:03:14.960 Is a cleric lower than a mullah?
00:03:18.040 I'm not sure how all that works.
00:03:20.580 But anyway, here's my suggestion.
00:03:23.120 I think they should do more of that.
00:03:26.940 But here's what I would do.
00:03:28.320 If I were the women in Iran, I would start to all dress alike everywhere.
00:03:34.840 All the same.
00:03:36.240 I would wear all black bottoms, all black tops of the same kind.
00:03:42.100 And then I would have black headgear.
00:03:46.500 And I would cover myself completely in public.
00:03:50.060 Do you know why I'd do that?
00:03:51.100 Because I'd be knocking the hat off of every fucking cleric within miles.
00:03:57.060 And they wouldn't be able to catch anybody because they'd all look the same.
00:04:00.740 Right?
00:04:01.220 So I think that the Iranian women should embrace and amplify.
00:04:05.760 They should over-cover themselves and all look the same so they could be knocking hats
00:04:11.560 off of clerics all day long and nobody could get caught.
00:04:15.040 Be like, oh, shit, there's like 600 of them.
00:04:17.100 I have no idea which one did that.
00:04:19.200 Am I right?
00:04:19.720 Just give them what they want.
00:04:22.760 Just all dress the same.
00:04:23.880 Wear masks.
00:04:25.100 And once you cannot be identified, go nuts.
00:04:28.860 Once you can't be identified, go nuts.
00:04:30.940 That's what they're asking for.
00:04:32.440 They're asking for you to be completely unidentified.
00:04:36.860 Give them what they want.
00:04:38.880 Show them what happens when you're unidentified.
00:04:40.880 Give them exactly what they're asking for.
00:04:47.360 I've told you this before, but man, is it true. Remember when we were waiting for AI artwork to someday be as good as a human?
00:04:57.640 That day passed a while ago.
00:05:00.680 AI art is unambiguously better than human art.
00:05:04.120 I've seen enough examples now where one after another AI art I want to put on my wall.
00:05:12.820 Like, I'll see it and I'll go, I would put that on my wall.
00:05:15.460 That's actually beautiful.
00:05:16.840 I don't really see human art that makes me do that.
00:05:19.260 I don't see human art that makes me want to put it on a wall.
00:05:24.900 I can't remember the last time.
00:05:26.640 Like, if I put up artwork, it's at protest.
00:05:28.820 I'd rather have nothing on my wall.
00:05:32.620 But Brian Machiavelli did another piece of artwork in which he just asked it to do a Dilbert comic.
00:05:41.860 I'll show you which AI he was using here in a moment.
00:05:44.140 And the artwork, in my opinion, as a professional cartoonist, is equal to the best human cartoonist.
00:05:54.760 Just the art.
00:05:56.600 But there's an interesting quality about it that blows my frickin' mind.
00:06:00.840 Let me show you here in a minute.
00:06:03.080 So you won't be able to see it as clearly as I want it.
00:06:06.340 But the request from Machiavelli's Underbelly was to make a Dilbert cartoon about a zebra.
00:06:15.700 And interestingly, it made Dilbert part zebra.
00:06:20.000 Not zebra, giraffe, I'm sorry.
00:06:22.680 Part giraffe.
00:06:24.280 And look at the quality of the cartoon at the bottom.
00:06:30.120 That is 100% as good as the best human cartoonist right there.
00:06:34.700 There's nothing missing in that.
00:06:37.920 It's beautiful, really.
00:06:39.860 And like this art, like that bottom cartoon, that is so well designed.
00:06:47.400 Like I would put that on my wall.
00:06:49.080 It's just cool to look at.
00:06:51.260 Now, but here's the weird thing.
00:06:54.520 Do you notice that the humans have weird characteristics?
00:06:59.660 Like they don't quite look like they have the right features and stuff.
00:07:03.440 They look all interesting, but even from one panel to the next, the AI is making the person look different.
00:07:10.940 Like one has, you know, three lenses and blah, blah, blah.
00:07:14.260 Now here, you want a freaky idea?
00:07:17.680 Do you want a really freaky idea?
00:07:21.080 All right.
00:07:21.660 I'm going to build into this.
00:07:22.760 This will just freak you the fuck out.
00:07:24.940 You ready for it?
00:07:25.600 You know how the human brain gives you a perfect image of things that you don't see perfectly?
00:07:34.800 You know that, right?
00:07:36.020 So, for example, you will have a memory of something that didn't even happen.
00:07:40.360 A false memory is pretty common.
00:07:42.240 We all have.
00:07:42.640 You also, if you're watching, let's say, a tennis ball being hit hard and it bounces near a line,
00:07:51.080 your brain says, I saw that tennis ball.
00:07:55.420 I saw it hit the ground.
00:07:57.200 And I know if it's in or out.
00:07:59.260 But did you know you don't see that tennis ball?
00:08:02.120 Your brain or your eyes can only pick up the tennis ball, depending on the speed it's going,
00:08:06.980 only every five feet or something.
00:08:09.960 But all the stuff in between, your brain filled in and it wasn't there.
00:08:15.260 But your memory is you saw the ball the whole way.
00:08:17.780 You did not see the ball the whole way.
00:08:19.920 Your brain allowed you to imagine you saw it the whole way.
00:08:23.160 Now, do you accept that your brain is, in real time, translating things into things that are not true?
00:08:34.300 Do you agree with that?
00:08:35.440 Your brain is always translating your approximate environment into a specific picture, which is not true.
00:08:44.240 Now, what happens when you take AI, high intelligence, and you train it to look at human faces,
00:08:53.240 and then you tell them to reproduce a human face in some artwork?
00:08:56.600 And then the AI makes the human face look different in each panel.
00:09:03.180 All the time.
00:09:04.980 All the time.
00:09:05.900 It doesn't seem to matter which AI you're using.
00:09:09.980 And you can train the AI to really know what a face looks like.
00:09:14.600 Because faces are approximately, you know, the same, they're approximately symmetrical, right?
00:09:22.460 It knows what a nose is.
00:09:24.200 It knows what eyes are.
00:09:25.280 It knows where they appear on the face.
00:09:27.300 Are you telling me that AI can't make a human face?
00:09:32.020 What should be the simplest thing, right?
00:09:34.500 Now I'm going to blow your fucking mind.
00:09:36.560 You ready?
00:09:38.460 You ready for this?
00:09:40.380 Get ready.
00:09:41.100 It could be that human faces don't look the same from minute to minute, and that we can't tell.
00:09:54.260 Because our human brain is making your face look the same as it looked a minute ago to me.
00:10:00.760 What if our faces don't look the same from moment to moment?
00:10:05.040 And that it's actually like a, you know, it's more like a wave function.
00:10:12.860 That your face is only a potential face.
00:10:16.520 And when somebody looks at it, they're picking up the potential.
00:10:19.560 They're forming in their mind their own face, and then they lock it in.
00:10:24.180 And so every time they look at you, they see that face that they've locked in.
00:10:27.420 But your actual face is never the same.
00:10:31.220 Okay.
00:10:31.880 Did I freak anybody out?
00:10:33.040 Because give me one other reason that AI can't make a face the same twice in a row.
00:10:39.920 There's no other reason.
00:10:41.580 Because the technology could easily do it.
00:10:44.160 The technology is telling you what is actually there.
00:10:48.420 The AI is telling you that our faces are not the same from moment to moment.
00:10:52.160 That's what it sees.
00:10:53.720 But we don't see it.
00:10:56.360 Now, I'm not saying that's true.
00:10:58.240 I'm just saying it might be.
00:10:59.600 That is within the realm of something that could be true, which is freaky by itself, even if it's not true.
00:11:06.540 But I can't think of another reason that AI can't make a face.
00:11:10.260 I mean, there's just no other reason.
00:11:14.100 All right.
00:11:15.840 Real Clear Politics had a poll asking people if they thought the country was heading in the right direction or the wrong direction.
00:11:22.480 And I want to test your intelligence.
00:11:24.800 Many of you have not seen this news.
00:11:26.280 But I want to see if you can use your powers of deduction.
00:11:32.300 What percentage of the population thinks that things are moving in the right direction?
00:11:37.540 Go.
00:11:39.400 How are you doing that?
00:11:42.100 How are you doing that?
00:11:43.540 25.6%.
00:11:46.280 You guys, you guys and gals, you impress me.
00:11:52.960 I've never seen a group that could guess things better than you.
00:11:56.160 But once again, once again, with no help whatsoever, you got the right answer.
00:12:00.780 Almost all of you.
00:12:02.760 So congratulations to you.
00:12:05.100 Yeah.
00:12:05.360 About a quarter of the country thinks things are fine and in the right direction.
00:12:09.000 Now, I happen to be in the 25%.
00:12:14.280 Surprise.
00:12:19.360 I had a realization the other day that two things are true.
00:12:23.940 I tweeted this.
00:12:25.240 Two things are true at the same time.
00:12:27.260 We'll see if you agree with the first one.
00:12:29.240 I think you'll agree with the first one, but not the second one.
00:12:31.780 The first one, everything appears to be broken.
00:12:36.220 Agree?
00:12:36.580 Like, everything in the world.
00:12:39.660 All of our systems, all of the way we think about things, education, government, supply chains, military.
00:12:48.680 Correct?
00:12:49.440 Right?
00:12:49.900 Would you all agree?
00:12:51.060 Now, I say seems broken.
00:12:52.880 Right?
00:12:53.140 It appears broken.
00:12:54.980 Now, that could be an impression, right?
00:12:57.380 When everything looks broken, you could be wrong.
00:13:01.200 You know, it could be like the face thing.
00:13:02.640 You see it as broken, but it's not so broken.
00:13:04.680 Because, remember, the central fact of our human cognition is that we're looking for problems so we can fix them, so we can survive.
00:13:17.060 Your brain is a problem identifier.
00:13:20.760 It's not built to keep you happy and stupid.
00:13:23.700 Because if it keeps you happy and stupid, you're also going to be dead, right?
00:13:27.380 Because you won't see a problem.
00:13:28.580 You need to see problems to stay alive.
00:13:32.340 So you're a problem identifier.
00:13:34.780 Now, add to the world news.
00:13:38.420 A news model that gets more clicks for bad news, right?
00:13:42.340 There's a bad thing coming.
00:13:44.120 You'd better get afraid.
00:13:46.340 So now, people are automatic problem identifiers and solvers.
00:13:50.860 And then the news model gives you nothing but problems.
00:13:54.040 What in the world would you think about your world?
00:13:58.960 You would think your world is falling apart.
00:14:01.980 Because that's all you see, right?
00:14:04.140 But when was the last time you saw a big news story about good news?
00:14:08.460 Hey, here's a news story of something that's working great.
00:14:11.540 I try to give you those stories, but even when you see them in the press, they don't really frame them that way.
00:14:16.860 It's more like, well, a fact happened.
00:14:18.380 But, you know, I'll tell you, hey, things are moving in the right direction.
00:14:21.740 All right, here's my take.
00:14:23.440 So number one, I think we all agree everything looks broken.
00:14:27.660 And I would say that it is.
00:14:29.960 I would say it's not just that it looks broken.
00:14:32.440 It's actually all broken.
00:14:34.300 All of it.
00:14:36.300 Would you agree with that?
00:14:37.960 Would you take it further and say it doesn't just look broken.
00:14:41.600 It's fucking broken.
00:14:43.560 Right?
00:14:44.040 In the sense of it's not where we want it to be.
00:14:46.300 No, maybe not 100% broken, but everything's not where we want it to be.
00:14:51.340 Which is a perfect situation for humans.
00:14:54.960 Humans love it when things are not where we want them to be.
00:14:57.760 That's like our perfect situation.
00:15:00.220 You're continually like, ah, got to build this thing, got to fix this thing.
00:15:04.440 That's where we're at our best.
00:15:06.640 When everything's a little bit broken.
00:15:08.700 And we're like working on fixing it.
00:15:10.780 It's the ideal human situation.
00:15:13.040 And we do it well.
00:15:14.380 The Adams Law of Slow-Moving Disasters.
00:15:17.240 We do fix things pretty well.
00:15:18.720 All right, here's my uber-optimistic take.
00:15:22.280 Are you ready?
00:15:24.960 Everything's broken.
00:15:26.080 And that's true.
00:15:26.760 And we live in a self-correcting system.
00:15:32.420 That our politics and our free markets and our, you know, flow of information, which isn't as free as it could be, but it's getting better.
00:15:42.240 100% of things are self-correcting.
00:15:46.100 All of it.
00:15:47.500 Everything is self-correcting.
00:15:48.860 Now, I told you about, you know, Germany was going to freeze over the winter because they didn't have Russian gas.
00:15:55.420 And then the latest report was, oh, somehow they figured out how to store enough gas so they could make it through the winter.
00:16:03.940 They solved it.
00:16:04.880 Now, here's what I see happening.
00:16:09.340 Because everything's broken, that gives you license to improve it.
00:16:14.360 Did you ever work in a company where something's working fine?
00:16:17.960 Do you think you could improve it?
00:16:20.460 Everybody would say, why are you focusing on that?
00:16:22.920 It's fine.
00:16:23.600 But everything looks broken, which causes us to try to fix everything.
00:16:30.260 We're in this frenzy, you know, in society, in the world in general.
00:16:35.880 We're in a frenzy of redesign.
00:16:39.660 Everything's getting redesigned, usually by the free market, right?
00:16:43.920 Absolutely everything's being redesigned from almost the bottom up.
00:16:47.640 We're in a reinvention phase that you won't recognize until it's, you know, toward the end.
00:16:55.800 But you don't recognize we're in a creative, inventive phase that will be one of the great ones of all human history.
00:17:06.700 Right?
00:17:07.040 Now, I would argue that after World War II, we had a creative, you know, explosion.
00:17:13.800 Wouldn't you say?
00:17:14.440 You know, and then there was the industrial revolution, creative explosion, et cetera.
00:17:19.680 But I feel like we're in one, but you don't feel it.
00:17:23.620 When you're in it, it just feels like everything's broken because you're just trying to fix everything.
00:17:27.940 But when it gets fixed, it will be better.
00:17:31.600 It'll be fixed better than it was.
00:17:33.380 And I think that applies to everything from our energy policy to the way we run elections to Twitter to you name it.
00:17:42.540 Literally everything is trending positive if you look at it as a system that's, you know, in the phase of correcting.
00:17:50.300 So that's my optimism.
00:17:51.960 It does not say we should not be vigilant about all of our problems.
00:17:55.620 We'll do that automatically because we're good at being vigilant.
00:17:58.740 How many of you saw the video of John Fetterman giving his speech outdoors?
00:18:04.780 He had a bunch of American flags behind him and the wind blew them all down while he was talking.
00:18:13.920 And Fox News had an article about how social media was mocking it and, you know, had some tweets there from some notable people saying funny things.
00:18:24.480 And I could not be more angry at Fox News today.
00:18:29.000 Fox News was saying that people were saying clever things about that Fetterman flag thing and they did not include my tweet.
00:18:36.640 What?
00:18:37.900 That oversight cannot be forgiven.
00:18:41.280 And so, since they only mentioned some lesser clever tweets, which were pretty good, but they were lesser clever than the one I did.
00:18:49.280 And my tweet was this.
00:18:53.340 Hypothetically, if God had been trying to warn us for months not to vote for a particular candidate, what would that look like?
00:19:02.320 Come on.
00:19:03.700 I'm a professional joke writer and I declare that was the best joke written on this topic.
00:19:11.120 Now, let me tell you why.
00:19:13.720 This is a joke writing lesson for you, so you can take this way.
00:19:17.840 I've told you this before, but every time you see an example, it helps.
00:19:21.440 If you can do a punchline or a joke in which the audience has to fill in the details, that's your best joke.
00:19:33.420 Like, that's the one that really works.
00:19:35.100 If you over-specify, you know, what you want somebody to imagine, well, that can be a good joke, too.
00:19:41.060 But the great jokes are where it takes you a moment to fill it in.
00:19:45.480 And when you fill it in, you're filling it in with your own stuff.
00:19:48.700 So when I said God had been trying to warn you for months, you can fill it in with everything that happened to John Fetterman, which was a tragedy.
00:19:57.620 I don't mean to make fun of a medical problem.
00:19:59.720 But if you tried to fill it in, you're like, okay.
00:20:04.540 Yeah, that sounds very much like that old joke.
00:20:07.300 You've all heard the joke.
00:20:09.000 The floodwaters are rising and a neighbor knocks on the elderly person's door and says, you know, hey, the water's rising.
00:20:16.420 Come with us.
00:20:17.120 We're leaving right now.
00:20:18.060 And the woman says, no, no, you know, God will provide.
00:20:21.160 I'll be fine.
00:20:22.080 And the water rises and, you know, the woman has to go up to the second floor of her house.
00:20:26.140 And then a boat comes up to the second floor window and says, get in the boat.
00:20:30.860 Get in the boat.
00:20:31.960 And she's like, no, God will provide.
00:20:33.840 I'm fine.
00:20:34.780 And she ends up on the roof of her house because the water has risen.
00:20:38.600 And she's on the roof of the house and a helicopter comes by and says, get on the helicopter.
00:20:43.040 You know, puts a little rope down.
00:20:44.660 She goes, no, no, God will provide.
00:20:46.320 The helicopter goes away.
00:20:48.960 Well, the water rises.
00:20:50.340 She drowns and she goes to heaven.
00:20:52.740 And she's talking to God in heaven.
00:20:55.600 And she goes, God, what's up?
00:20:58.620 I've been believing in you my whole life.
00:21:01.060 You know, I was sure you would provide.
00:21:03.260 What happened?
00:21:04.560 And God says, I sent you a car.
00:21:07.240 I sent you a boat.
00:21:09.100 I sent you a helicopter.
00:21:11.960 And, you know, that's the joke.
00:21:13.060 So, sometimes the signs are there.
00:21:18.580 Maybe you should just recognize them.
00:21:21.260 Anyway, it sounded so much like the joke, I couldn't resist.
00:21:25.140 There is a video on Getter.
00:21:30.200 It's sort of a Twitter competitor, which I'm not going to tell you is credible.
00:21:37.400 So, don't assume this is credible.
00:21:40.220 Because if I told you it's credible, I would be kicked off of social media.
00:21:44.860 Probably not Twitter at the moment.
00:21:47.260 But I'll just describe it to you.
00:21:51.640 So, there's a woman who claims that she was doing construction on a home that she and her husband have owned for a couple of years.
00:21:59.380 So, it's a place that has been abandoned.
00:22:02.140 But you can reach the mailbox, even though it's behind a chain link fence.
00:22:06.560 Right?
00:22:06.760 So, you can sort of reach in and get to the mailbox, I guess.
00:22:09.340 And, she found it was stuffed with address changes.
00:22:15.360 So, a whole bunch of people, allegedly, had changed their address to her address, which is an abandoned building.
00:22:23.700 Now, obviously, the implication is that people are doing things to be able to vote by mail fraudulently.
00:22:33.220 Right?
00:22:33.700 Now, no squatters.
00:22:34.940 There were no people in there.
00:22:35.900 They were just using the mailbox as a change of address place.
00:22:38.680 Now, anecdotally, I heard other reports of this being a ballot fraud thing that people do, etc.
00:22:47.440 Now, question number one.
00:22:50.460 You have seen one real person, apparently, describing what looks like a pretty credible claim.
00:22:59.140 Use your critical thinking to answer this question.
00:23:04.240 What are the odds it's true?
00:23:07.560 Go.
00:23:07.820 Put the odds on it.
00:23:09.240 Odds, it's true.
00:23:13.000 I'm seeing 25, 25, low, 1%, low, 0, 10%.
00:23:20.320 Very good.
00:23:21.400 Very good.
00:23:22.880 Very good.
00:23:24.020 So, do you remember what I predicted would be true after the 2020 election?
00:23:29.560 If you remember, I said a whole bunch of times, I said this over and over again.
00:23:34.640 I said, every claim you hear about election fraud is at least 95% chance likely to be false.
00:23:44.720 95%.
00:23:45.200 Now, what was the final outcome of all the 2020 claims?
00:23:51.460 How close was I?
00:23:53.520 I told you that even if something is true, and I didn't have an opinion at that time, I didn't know if there was some true stuff.
00:24:00.300 But I told you that 95% of what people surface will definitely not be true.
00:24:05.480 So, when you see this new claim, just keep the 95% thing in your mind.
00:24:12.440 Now, it looked very credible to me.
00:24:15.580 Right?
00:24:15.760 If you put a gun to my head and said, all right, you're going to have to bet whether this particular story was true, I want to think it is.
00:24:26.000 But then I have to retreat to my rational mind and say, okay, the fact that it looks true means what?
00:24:33.680 How much credibility do you put on, it's on video, it's a real person, and it looks true.
00:24:39.800 What's that mean to you?
00:24:41.880 Nothing.
00:24:42.960 Nothing.
00:24:43.240 In 2022, it doesn't have any value.
00:24:46.120 Absolutely no value.
00:24:47.940 And the first thing you ask yourself is, why is it only on Getter?
00:24:52.280 Right?
00:24:53.520 Is there some reason that would be not on other platforms?
00:24:57.340 So, it could be, you know, from a different year.
00:25:01.200 It could be an operative who's just, you know, doing an op.
00:25:04.980 It could be anything.
00:25:06.840 It could be absolutely just anything.
00:25:09.020 You don't know.
00:25:09.620 All right.
00:25:12.860 CNN.
00:25:13.300 I'm going to give yet another compliment.
00:25:17.720 Compliment to CNN.
00:25:19.560 You know?
00:25:20.200 Being one of its biggest critics for years, I feel like I have some kind of social obligation
00:25:27.880 to tell you when they do something right.
00:25:30.120 And not only today do they have Daniel Dale doing a hard fact check on Biden, and when
00:25:38.520 I say a hard fact check, he just goes right at him.
00:25:42.860 Yeah.
00:25:42.980 Daniel Dale went at Biden the way you saw him go after Trump.
00:25:48.160 I never thought I'd say it.
00:25:50.140 Never thought I'd say it.
00:25:51.380 But he just went right at it.
00:25:53.020 He said, lie, lie, lie.
00:25:54.580 Not true.
00:25:55.020 Lie.
00:25:55.860 Big stuff.
00:25:57.200 I'm not talking little stuff.
00:25:58.380 I'm talking about the major claims of the president.
00:26:00.860 Lie.
00:26:01.920 Just lie.
00:26:03.920 Amazing.
00:26:04.360 So congratulations to CNN for what looks like a successful business pivot.
00:26:11.220 You want more?
00:26:12.720 This will blow your mind.
00:26:15.200 The day before elections, CNN's website had the following headline.
00:26:21.440 Opinion.
00:26:22.580 Democrats are out of touch with American voters.
00:26:24.920 That's a headline on CNN the day before the election.
00:26:30.040 Democrats are out of touch with American voters.
00:26:33.000 And then it goes on in the article to say that Republicans are running for office on the
00:26:38.020 actual issues that people care about.
00:26:41.660 Can you believe that?
00:26:44.400 Oh, two days.
00:26:45.360 I'm sorry.
00:26:45.740 Two days before election.
00:26:47.540 Forgot what day it was.
00:26:49.280 Today feels like a Monday to me for some reason.
00:26:52.160 Must be that time change.
00:26:53.280 But does that just blow your mind?
00:26:57.080 My mind is completely blown by the fact that something's going right.
00:27:02.360 Now, remember I told you everything's broken.
00:27:05.020 Everything's broken.
00:27:06.640 But it's all being redesigned right.
00:27:09.900 You tell me.
00:27:11.380 Was CNN broken a year ago?
00:27:14.260 And do you see that they're redesigning in a way that's useful to the country?
00:27:18.740 Yes or no?
00:27:19.440 Would you agree that you see it?
00:27:21.680 It's not just me, right?
00:27:22.900 You can see it.
00:27:24.580 Is Twitter better off than it was a year ago?
00:27:27.960 I think so.
00:27:29.160 I mean, it's a big mystery where things will go.
00:27:31.920 So you've got CNN being a responsible citizen.
00:27:37.200 Congratulations, CNN.
00:27:38.580 Completely successful.
00:27:40.260 I don't know about their ratings.
00:27:43.280 I actually hope their ratings improve.
00:27:45.520 So I'm going to root for CNN's ratings to improve because I think they're acting like patriots.
00:27:52.940 Is that too far?
00:27:54.800 In my opinion, CNN is taking a patriot perspective on their business, which is to try to give you the actual news, which is different.
00:28:04.720 All right.
00:28:06.100 Too far, maybe.
00:28:08.180 But that's another example of something that's going right.
00:28:11.740 Biden, of course, gaffed again and said he wanted to close all the coal plants.
00:28:17.620 And then somebody from his own party, two days before elections, Joe Manchin, is asking the president to apologize.
00:28:24.360 You know, if members of your own party are asking you to apologize two days before the midterms, that's not a good look.
00:28:32.920 That's not good.
00:28:34.980 Now, a year ago, do you think that Democrats were willing to admit that Biden is brain damaged and useless?
00:28:43.440 Not so much, right?
00:28:45.320 A year ago, even the Democrats would have said, oh, stop making fun of his speech defect.
00:28:51.680 He's fine.
00:28:52.380 They don't say that now, do they?
00:28:55.360 Am I right?
00:28:56.580 There is a completely different tone from the Democrats about their own president.
00:29:01.440 And that tone is, oh, look what we did.
00:29:07.900 Maybe we should undo this.
00:29:10.460 Now, you're going to get people who are still supportive because it's a political world.
00:29:14.240 But let me give you a little story of something that happened to me.
00:29:20.600 I want to see if any of you have had a similar experience.
00:29:25.400 So last night, I went to a large-ish social gathering of people in my town.
00:29:34.400 So I don't want to be any more specific than that.
00:29:36.980 So it was a large social gathering in my town.
00:29:39.480 And I live in Northern California, and now we're outside of San Francisco.
00:29:46.560 So what do you assume is the political nature of my town?
00:29:50.820 It's just what you think, right?
00:29:52.820 It's exactly what you think.
00:29:55.100 Or is it?
00:29:55.880 I spent an entire night at an event and heard a number of pro-Republican statements from people privately.
00:30:06.940 I heard zero Democrat supportive statements.
00:30:13.280 Zero.
00:30:14.820 There wasn't one.
00:30:15.820 I don't think there was anybody there who was favorable to the Democrat platform at the moment.
00:30:23.420 Which is not to say they weren't Democrats.
00:30:26.560 If you could put on goggles and see everybody's voting party, I'll bet you they're mostly Democrats.
00:30:33.780 I'll bet they were at least 60% Democrats.
00:30:36.920 But you don't see anybody talking up Biden.
00:30:40.520 It's just not a thing.
00:30:42.240 It's just not a thing.
00:30:43.260 And you don't even see people agreeing with wokeness.
00:30:47.840 At least within, let's say, the suburban family kind of people.
00:30:51.600 Just nobody.
00:30:53.200 And I used to feel like I was in a little bit of danger when I just went out in public.
00:31:02.680 You know what I mean?
00:31:03.240 I mean, if you're notable as a Trump supporter, there was a time when it would have been dangerous. I was a little wary even just going to the grocery store.
00:31:14.760 Because I thought I'd be confronted with somebody.
00:31:17.280 Not necessarily physically.
00:31:19.380 But I thought somebody would get in my face, you know, throw a milkshake on me or some bullshit like that.
00:31:24.940 But I don't get any pushback anywhere where I live.
00:31:31.760 Just think about that.
00:31:33.540 I mean, people recognize me at this point.
00:31:35.860 And they've got a good idea what I'm up to.
00:31:38.240 I get no pushback.
00:31:40.600 Zero.
00:31:41.700 There is not a single instance of somebody in my own community who's told me I've, you know, gone too far or I'm a Nazi or I'm on the wrong side.
00:31:52.320 Nothing.
00:31:52.580 So are you seeing this?
00:31:56.400 Are you seeing that in mixed gatherings there is no Biden support or even Democrats support, really?
00:32:04.260 People will talk about abortion.
00:32:07.660 But that's sort of all they have.
00:32:09.840 Am I right?
00:32:10.980 It's all they have.
00:32:12.740 And a lot of people don't care about that as much as you imagine they do.
00:32:16.600 Yeah, there was, you know, a little talk of Trump, but even he's not the important thing these days.
00:32:24.900 We'll talk about Trump.
00:32:31.660 MSNBC.
00:32:32.100 There's a clip on Twitter of MSNBC interviewing a Democrat governor candidate or governor, Kathy Hochul.
00:32:42.900 Now, imagine MSNBC interviewing a Democrat governor two days before an election.
00:32:54.160 Now, how do you think that went?
00:32:58.660 You think that would be really just a friendly, right?
00:33:01.360 They'd be trying to get Hochul over the finish line?
00:33:06.600 Nope.
00:33:07.120 That didn't happen.
00:33:08.380 MSNBC just slapped the shit out of her.
00:33:10.160 I didn't know what I was looking at there for a moment.
00:33:16.040 Yeah, I didn't catch the name of the host, but credit to the host.
00:33:21.800 The host said, you know, I guess the context was crime.
00:33:28.420 And the host says, here's the problem.
00:33:30.340 We don't feel safe.
00:33:32.020 I walk into my pharmacy and everything is on lockdown because of shoplifters.
00:33:36.260 I'm not going in the subway.
00:33:38.040 People don't feel safe in this town.
00:33:40.160 That is the end of her, isn't it?
00:33:45.480 Did MSNBC just drive a stake through her fucking heart?
00:33:49.640 It looked like it.
00:33:51.820 It looked like even MSNBC is like, we can't even do this anymore.
00:33:55.920 We just can't do this anymore.
00:33:57.520 We have to kill you now.
00:33:58.940 I mean, politically, not actually.
00:34:01.600 I think they just drove a stake right through her heart on live TV.
00:34:06.460 Because, and again, if somebody knows, if you saw the clip, I'd like to give a shout out to the host.
00:34:13.320 Because that was good work.
00:34:14.800 That was good work.
00:34:15.660 And not just on a team play kind of basis.
00:34:19.280 But it was good journalism.
00:34:22.620 This is somebody who really got to it.
00:34:24.340 Really got under the hood there.
00:34:26.940 I wish somebody would say her name just so we could give her credit.
00:34:32.340 All right.
00:34:32.640 Well, maybe you'll see it in the comments.
00:34:34.400 Jack Dorsey has apologized for growing Twitter too quickly.
00:34:42.900 And therefore, you know, being a participant, I guess he would say, in how many people had to get laid off.
00:34:49.640 Because he thinks it grew a little too quickly.
00:34:53.980 And, you know, I appreciate that.
00:34:57.180 That's sort of a, it's kind of the context you wanted to hear, right?
00:35:01.360 Didn't you wonder why there were so many people?
00:35:05.220 And could you actually get rid of a lot of people and still run the company?
00:35:09.500 And probably yes.
00:35:11.700 If grew too fast is a true statement, and it certainly looks that way,
00:35:16.840 then probably there's a little bit of, or maybe a lot of, you know, cushion, stuff they can cut.
00:35:25.160 So that's happening.
00:35:26.040 You know, I was trying to imagine what it would be like to be Jack Dorsey and know that all these layoffs are happening.
00:35:37.440 Like, that's got to be, it's just got to be crushing.
00:35:41.160 I once had to close a restaurant.
00:35:45.780 Two restaurants, actually.
00:35:46.920 But I had to close a restaurant that wasn't working out.
00:35:50.500 And, you know, I had to tell the staff that they were all fired, you know, with severance and stuff like that.
00:35:56.640 So they were treated well.
00:35:59.080 But that's a really bad day.
00:36:02.700 It's twice a bad day because as I was telling them that I would, you know, treat them right and everything,
00:36:09.060 some others were actually robbing the storeroom of all the electronics.
00:36:13.600 As I was talking to the staff, they were robbing, they were actually robbing the restaurant.
00:36:19.960 Well, that showed me.
00:36:22.020 Anyway, that was pretty expensive.
00:36:27.660 Musk met with some civil rights leaders, the Anti-Defamation League and NAACP.
00:36:35.620 And I saw an interview by the head of the NAACP, Derek Johnson.
00:36:41.360 Now, interestingly, the head of the NAACP appears to be, and I don't want to make an assumption here,
00:36:49.120 but appears to be a black man.
00:36:51.640 I don't know.
00:36:52.400 It's a weird thing about 2022 when I find out that the head of the NAACP is at least apparently black.
00:36:59.980 So good job there.
00:37:03.620 Seems like a good fit.
00:37:04.820 But I would like to give him a shout-out because you know how you get an immediate reaction to some public figures?
00:37:14.420 Do you remember when, maybe you don't remember it, but when Barack Obama burst on the scene,
00:37:20.500 the first time you saw him give a speech, even if you were a Republican,
00:37:25.620 you probably said to yourself, ooh, there's something special there, right?
00:37:31.260 Didn't you see that?
00:37:32.540 Like, you see it immediately.
00:37:33.440 And the special thing was, he wasn't doing the thing you expect.
00:37:39.560 Obama did not run as a black guy.
00:37:42.560 And every time he didn't run as a black guy, which he did better than anybody ever did, right?
00:37:49.080 Obama was just the master of using race without using it.
00:37:53.620 Oh, my God, he did that well.
00:37:54.820 Because it was there, you could see it, you could make your own decision.
00:37:58.440 But if he'd even said once, you should vote for me because I'm black, he would be done.
00:38:04.060 It was like, that's, no, no.
00:38:05.920 That's absolutely, you're off the list if you say that.
00:38:10.580 But he never did, because he's very good at this stuff.
00:38:13.700 So I want to give, like, a similar shout-out to this Derek Johnson.
00:38:17.800 I'd never seen him before, but he was talking about the meeting,
00:38:22.340 and he first said that he thought Musk agreed with the people who were there.
00:38:28.120 That's good.
00:38:28.820 So no confrontational anything, just it looks like he agreed.
00:38:32.400 But there was a question of whether Musk would be able to execute,
00:38:35.580 which is exactly the best question you could ever have.
00:38:41.400 Do you think Musk can execute?
00:38:44.920 That's like what he does better than anybody who's ever done anything.
00:38:48.400 Musk can execute, yeah, better than anybody,
00:38:51.380 maybe in the whole world who's ever done anything.
00:38:54.120 That's the one thing he can do.
00:38:56.020 But here's why I want to give Derek a shout-out.
00:38:59.000 He went through the whole interview,
00:39:01.060 and he talked about how he wanted to make sure that, you know,
00:39:03.180 Twitter was protecting communities that, you know, were marginalized.
00:39:08.880 And I watched the whole interview,
00:39:11.660 and not once did he frame things racially.
00:39:16.520 Now, I think it's completely fair to say
00:39:18.140 don't have groups targeted for hate speech.
00:39:20.920 That's not really a racial statement.
00:39:22.780 That's something everybody would say.
00:39:24.500 But somehow he got through the whole thing
00:39:26.500 without me hating him for saying I'm bad or something.
00:39:31.340 Like, I didn't get blamed for anything.
00:39:34.680 And he didn't even talk about it like in the normal team play.
00:39:41.080 Somehow he did an Obama on this thing.
00:39:44.020 And I've got to say, I want to see more from him, Derek Johnson.
00:39:48.660 Because one of the things I've been saying forever
00:39:50.440 is that the black American population doesn't have a good leader.
00:39:54.800 You know, somebody who would appeal to whoever they're trying to persuade
00:39:59.400 at the same time as appeal to the black community.
00:40:04.500 And he may be the guy.
00:40:07.560 He's got a strong, strong game.
00:40:10.180 Yeah.
00:40:10.840 So just keep an eye on him.
00:40:12.660 This is another thing that, in my opinion, is going right.
00:40:15.080 In my opinion, and this will be really controversial,
00:40:20.340 racial, what would you say, harmony, is actually improving.
00:40:29.680 What do you think of that?
00:40:31.280 In my opinion, racial harmony is improving.
00:40:34.760 And I'll tell you why.
00:40:36.160 Because we're wising up to why it happened in the first place.
00:40:39.260 We're wising up that it was the media that was causing it.
00:40:42.020 And I think people are wising up that everybody wants everybody to do well.
00:40:49.540 If there's one thing that you can say about Republicans
00:40:52.200 that you could just take to the bank,
00:40:55.340 they want everybody to do well.
00:40:58.680 Right?
00:41:00.280 There are no Republicans who want black people not to do well.
00:41:04.000 They all want them to do well.
00:41:05.340 As well as everybody who's struggling.
00:41:07.720 They don't want to necessarily be the ones that pay for it.
00:41:10.720 That's a different question.
00:41:11.800 But they want everybody to do well
00:41:13.060 and would even jump in and help quite a bit
00:41:17.100 if there was something that made sense to help on.
00:41:20.340 And I think that the education question
00:41:23.460 is where everybody's going to meet.
00:41:26.720 Because, you know, a black parent and a white parent,
00:41:30.680 when they're talking about educating their kids, same page.
00:41:34.420 I mean, small differences, but same page.
00:41:37.060 Right?
00:41:37.220 Everybody's got individual differences.
00:41:38.640 Somebody says,
00:41:42.560 I don't think about race unless I'm in social media.
00:41:45.820 Isn't that true?
00:41:47.500 When was the last time you had a real-world racial anything?
00:41:53.540 I can't even think of one.
00:41:57.100 Now, I live in a pretty harmonious place in the country.
00:42:01.560 But when was the last time you had a racial confrontation?
00:42:04.640 Or even a racial issue?
00:42:08.560 I don't even remember the last time.
00:42:12.380 It's literally not on my consciousness any time except social media.
00:42:17.060 You're right.
00:42:17.740 Yeah, there's a social media thing.
00:42:18.860 So this is weird and wonderful and mind-blowing
00:42:26.900 that there was an accusation on Twitter
00:42:30.300 that Twitter had sold somebody a blue check for $15,000
00:42:36.280 and that there was a system, sort of an underground system,
00:42:40.340 where people could get to some Twitter employee
00:42:43.300 who would illegally, I don't know if it's illegal,
00:42:46.980 but would offer to sell them a blue check, you know,
00:42:50.880 outside of the normal system.
00:42:52.980 And apparently a number of people took advantage of it.
00:42:55.500 I don't know what that number is.
00:42:56.820 But here's the funny part.
00:42:58.840 Elon Musk confirmed it.
00:43:01.020 He goes, yup.
00:43:02.000 He said, yup.
00:43:04.820 Absolutely confirmed that Twitter had been selling blue checks.
00:43:09.640 Does that just blow your mind?
00:43:12.800 Because, and it blows my mind on several levels.
00:43:17.660 Watching Elon Musk fact-check the world in real time
00:43:21.800 is something we've never seen before.
00:43:24.600 Like, even if a politician gets fact-checked,
00:43:26.820 you have to wait until they give a speech, right?
00:43:29.800 Or wait until the statement comes out or something.
00:43:32.320 But Musk will be on Twitter,
00:43:34.580 there'll be something that's just like a complete lie,
00:43:37.140 or in this case, something that's weird but true,
00:43:39.840 and he just fact-checks it.
00:43:42.140 You got like a five-minute fact-check
00:43:44.620 that you could actually believe, right?
00:43:47.480 If Musk says that happened, I believe it.
00:43:50.700 I mean, why would he say that?
00:43:52.860 Of all things you could say,
00:43:54.140 you know, nobody's going to lie about that.
00:43:56.200 For $15,000, will you cook dinner?
00:44:02.640 Well, okay.
00:44:03.640 It would be a good dinner.
00:44:05.760 Let's talk about some more Twitter stuff,
00:44:07.520 the most important thing that's happening lately.
00:44:09.100 Mark Ruffalo, one of the smartest Democrats.
00:44:15.280 No, I'm just joking.
00:44:16.360 He's one of the dumbest.
00:44:18.020 If you tried to list the dumbest celebrities
00:44:22.160 who are political,
00:44:25.160 you know, you'd put at least politically dumb.
00:44:28.280 I'm not saying they're dumb, like low IQ.
00:44:30.140 But Mark Ruffalo,
00:44:32.540 while I love his passion,
00:44:35.780 and I love that he apparently cares about the country
00:44:38.520 and people and wants good things for all of us,
00:44:41.820 so character-wise, I'm a big fan.
00:44:46.160 Right?
00:44:46.540 Yeah, I think you have to appreciate
00:44:48.500 when somebody's on the other side from you,
00:44:51.820 if they're trying to make things better,
00:44:54.260 but they have a different opinion on it,
00:44:55.640 you can't really hate them for that.
00:44:57.260 Right?
00:44:57.760 You just want a different thing.
00:44:59.220 But Mark Ruffalo tweeted at Musk,
00:45:03.380 and he says,
00:45:04.000 Elon, period.
00:45:05.820 Please, for the love of decency,
00:45:08.380 get off Twitter.
00:45:13.200 Get off Twitter.
00:45:14.600 Hand the keys over to someone
00:45:15.980 who does this as an actual job,
00:45:18.400 and get on with running Tesla and SpaceX.
00:45:21.340 You are destroying your credibility.
00:45:23.620 It's just not a good look.
00:45:25.000 Is that the worst take you've ever seen in your life?
00:45:32.800 Have you ever seen a worse take?
00:45:37.060 This is an actor
00:45:38.360 who's giving the most successful entrepreneur
00:45:42.840 of all time business advice.
00:45:47.240 And he does it in public,
00:45:51.860 like, that should look good.
00:45:54.120 And he's worried that Elon,
00:45:56.300 he's saying that it's not a good look for Elon.
00:45:59.480 Do you know it's not a good look?
00:46:02.460 Giving business advice to Elon Musk.
00:46:05.620 My God, man.
00:46:09.340 My God.
00:46:11.220 Let me give you the Hollywood analogy
00:46:14.180 that even Mark Ruffalo would,
00:46:17.920 this would hit him hard.
00:46:19.820 All right, you might not know this,
00:46:21.020 but in the acting world,
00:46:23.360 I made this mistake once
00:46:24.700 when I was working on the Dilbert TV show.
00:46:27.220 Well, I'll tell you what mistake I made.
00:46:29.280 One of the voice actors
00:46:30.600 was running through some lines
00:46:32.060 for the Dilbert animated TV show
00:46:34.140 that ran years ago.
00:46:36.100 And I didn't like what one of the actors,
00:46:39.500 how they played a line.
00:46:41.380 I'd written the line,
00:46:42.360 so I knew how I wanted it read.
00:46:44.080 So I stopped and said,
00:46:46.160 oh, you know, I want it done this way.
00:46:48.040 And I did an impression
00:46:49.080 of the actor or actress doing the line.
00:46:54.980 And I said, do it like this.
00:46:56.360 And it gave an impression.
00:46:57.280 My co-executive producer
00:47:00.620 turned off the mic
00:47:02.440 and invited me into the hallway
00:47:05.740 where I learned
00:47:09.040 that I had just committed
00:47:10.380 the greatest sin.
00:47:12.540 The greatest sin.
00:47:14.400 Let me tell you,
00:47:15.320 if you're ever in this situation,
00:47:17.200 don't tell an actor how to act.
00:47:19.800 Don't do that.
00:47:21.560 Here's what you do do.
00:47:22.920 Can you give me a different look?
00:47:24.680 Can you give me a different take?
00:47:26.120 You could say,
00:47:27.460 can you combine a little,
00:47:29.420 like, let's say, more concern?
00:47:31.820 You could say, do that,
00:47:32.880 but like you're more concerned as well.
00:47:35.700 You can describe it,
00:47:37.900 but do not do the imitation.
00:47:40.700 Do not do that.
00:47:42.020 That's like the big thing
00:47:44.740 you don't fucking do.
00:47:46.500 Now, to the credit of the actor
00:47:49.500 or actress who will remain nameless,
00:47:51.860 they didn't give me a hard time about it.
00:47:53.640 But it's because
00:47:55.300 they also saw the co-executive producer
00:47:57.880 drag me into the hallway
00:47:58.900 so that they knew exactly what happened.
00:48:02.780 And I'm sure that they were fine with it.
00:48:05.000 Which, by the way,
00:48:06.200 was good executive,
00:48:07.800 co-executive producing.
00:48:09.600 So let me give my compliments
00:48:11.020 to my, you know,
00:48:13.060 co-executive producer, Larry Charles.
00:48:15.420 Yeah, Larry just dragged me
00:48:17.120 into the hallway and said,
00:48:18.120 mm-mm, nope, nope,
00:48:19.420 don't do that again.
00:48:20.100 So that was a good lesson.
00:48:22.380 Now, that feels to me
00:48:25.180 exactly like an actor
00:48:27.020 giving Elon Musk business advice.
00:48:30.740 Am I right?
00:48:32.240 That's the same thing.
00:48:34.600 You don't do that.
00:48:36.360 You don't do that.
00:48:37.640 Well, if you do it,
00:48:39.040 at least add a little bit of humility.
00:48:41.680 Right?
00:48:42.320 Because I think we're all giving
00:48:43.600 Elon Musk some business advice,
00:48:45.760 but usually it's in the form
00:48:46.680 of our own preferences.
00:48:47.640 You know, it's not in the form
00:48:49.140 of what's good business.
00:48:50.620 Now, what about the advice
00:48:51.600 that he's got these two other CEO jobs,
00:48:55.960 or three or four,
00:48:56.920 however many businesses,
00:48:58.340 and he should stick to those.
00:49:01.980 If you tell Elon Musk,
00:49:04.200 of all people in the world,
00:49:06.300 you know, seven point
00:49:07.780 whatever billion people in the world,
00:49:09.620 there is exactly one person
00:49:11.480 who's the wrong person
00:49:12.480 to stay in your lane.
00:49:13.940 He's the most wrong person
00:49:16.680 out of all seven billion of us
00:49:18.540 that you should advise
00:49:19.880 not to leave his lane.
00:49:22.360 Leaving his lane
00:49:23.400 is what he does best.
00:49:25.380 It's what he does best.
00:49:27.380 Better than anybody's ever done it.
00:49:29.240 Probably.
00:49:30.200 More consistently.
00:49:32.360 So, everything about that take
00:49:34.680 was bad.
00:49:36.120 What did Musk...
00:49:38.220 And Ruffalo's comment
00:49:41.080 was because Elon was responding to AOC.
00:49:44.240 Then AOC had made some claims
00:49:45.600 about Musk messing with her Twitter feed,
00:49:49.240 which didn't happen, of course.
00:49:50.920 And then Musk replies,
00:49:52.980 hot take.
00:49:53.860 Not everything AOC says
00:49:55.420 is 100% accurate.
00:49:57.820 I love the understatement.
00:49:59.760 Hot take.
00:50:00.860 Not everything AOC says
00:50:02.180 is 100% accurate.
00:50:05.480 All right.
00:50:06.280 Here's the most exciting part,
00:50:07.820 or dangerous part,
00:50:08.740 or something.
00:50:09.160 Musk is continuing
00:50:11.240 to design Twitter,
00:50:13.500 or redesign it,
00:50:14.800 in public.
00:50:16.600 This is great.
00:50:18.100 You're getting
00:50:18.560 a total business education
00:50:20.940 in watching him
00:50:21.760 do this in public.
00:50:23.660 You see it all.
00:50:24.980 So, here's a tweet today.
00:50:27.060 He was talking about how
00:50:28.160 he wants Twitter
00:50:30.800 to be able to monetize
00:50:32.340 all creator content.
00:50:35.300 All creator content.
00:50:36.720 So, Twitter would monetize
00:50:39.000 much the way
00:50:40.040 YouTube would,
00:50:41.800 I imagine.
00:50:43.140 That any content,
00:50:44.740 you know,
00:50:45.020 you could share the revenue
00:50:46.340 with Twitter.
00:50:47.000 So, I think that's where
00:50:47.740 he's heading.
00:50:49.660 And that includes video.
00:50:51.680 Now, Twitter has some
00:50:52.600 video limitations,
00:50:54.440 you know,
00:50:54.620 resource limitations.
00:50:56.200 But apparently,
00:50:57.140 according to Musk,
00:50:58.040 we can do,
00:50:59.060 he says we,
00:50:59.740 Twitter,
00:51:00.240 can do 42-minute chunks
00:51:01.860 at 1080 resolution.
00:51:03.080 Now,
00:51:04.020 if you're in the blue,
00:51:05.480 the blue program
00:51:06.600 within Twitter.
00:51:07.620 So, you could break that up
00:51:08.980 into longer videos.
00:51:10.160 The 42-minute limit
00:51:11.100 could be fixed
00:51:11.960 by next month.
00:51:13.140 So, he's telling you
00:51:13.940 that they're removing
00:51:14.780 the 42-minute limit
00:51:16.220 within a month,
00:51:18.880 which is super fast
00:51:20.320 for something
00:51:20.980 of that magnitude.
00:51:23.580 And then he finishes
00:51:24.760 with this question.
00:51:26.380 He says,
00:51:26.980 to the public,
00:51:27.980 how does YouTube
00:51:29.100 monetization work,
00:51:30.780 and what could Twitter
00:51:31.580 do better?
00:51:32.080 Talk about
00:51:34.060 the transparency.
00:51:35.880 The transparency
00:51:37.040 there is insane.
00:51:39.120 So, he's telling you
00:51:40.120 that he doesn't have
00:51:40.940 a sufficient,
00:51:42.200 like, user-level
00:51:43.760 understanding
00:51:44.400 of what YouTube
00:51:45.740 does for creators,
00:51:47.400 which is quite an
00:51:48.540 admission in public,
00:51:50.120 especially for somebody
00:51:51.080 who's running
00:51:51.580 this company.
00:51:52.600 And then he's asking
00:51:53.640 for your input,
00:51:55.100 and it's legitimate.
00:51:58.080 He's actually asking
00:51:59.240 for your input.
00:51:59.960 It's real.
00:52:00.380 Like, it's not
00:52:01.580 for appearance.
00:52:03.500 Your actual input
00:52:04.640 will determine,
00:52:06.020 at least it could,
00:52:07.320 what Twitter becomes,
00:52:08.600 and Twitter could be
00:52:09.240 one of the most
00:52:09.720 important parts
00:52:10.560 of society.
00:52:11.720 And you get
00:52:12.680 to be part of it.
00:52:14.220 So, look at all
00:52:15.300 the things you're learning.
00:52:16.700 Number one,
00:52:17.860 did he prove
00:52:18.460 that you don't need
00:52:19.120 a marketing department
00:52:20.020 all the time?
00:52:21.420 Yes, he did.
00:52:22.840 Yes, he did.
00:52:23.640 So, that's like
00:52:25.020 a whole lesson
00:52:25.580 in itself.
00:52:26.620 You just replaced
00:52:27.440 the marketing department.
00:52:30.140 He's also showing you
00:52:31.400 the new CEO move
00:52:32.940 I talk about all the time,
00:52:34.400 which is immediately
00:52:35.600 establishing
00:52:36.380 what you're about,
00:52:38.460 like, on day one.
00:52:40.060 Because that first impression
00:52:41.440 becomes really sticky.
00:52:42.620 That's who you are forever.
00:52:44.220 He did that perfectly.
00:52:45.060 And now you're seeing him
00:52:47.820 collecting input
00:52:49.680 from the customers
00:52:51.180 in a very direct way.
00:52:54.200 And he's doing
00:52:55.620 two things here.
00:52:56.480 You might see one,
00:52:57.320 but he's doing two.
00:52:59.300 The first thing he's doing
00:53:00.480 is making sure
00:53:01.060 he gets input.
00:53:03.720 So he's got input.
00:53:05.840 And the second thing
00:53:08.480 he's doing
00:53:08.920 is making you feel buy-in.
00:53:11.360 How much do you feel
00:53:12.580 like you're legitimately
00:53:13.980 part of the process?
00:53:15.060 Or better,
00:53:16.300 people that you trust
00:53:17.540 on Twitter,
00:53:18.500 you see them
00:53:19.140 making suggestions,
00:53:20.740 right?
00:53:21.180 And they're smart people
00:53:22.100 in many cases.
00:53:23.080 When you see smart people
00:53:24.060 on Twitter
00:53:25.020 making smart suggestions
00:53:26.340 to Musk,
00:53:28.460 even though it's
00:53:29.180 not your suggestion,
00:53:30.780 you feel like you're
00:53:31.480 part of the process,
00:53:32.240 right?
00:53:32.520 Because you could have.
00:53:33.340 You could have made
00:53:33.960 that suggestion.
00:53:35.640 So,
00:53:37.880 knowing that he has
00:53:38.760 this huge public
00:53:39.640 relations problem,
00:53:40.880 because half of the world
00:53:41.640 is going to hate him
00:53:42.260 no matter what he does,
00:53:43.180 now the news
00:53:45.060 is reporting
00:53:45.600 he's meeting
00:53:46.560 with the NAACP
00:53:48.200 and the ADL,
00:53:50.920 and that apparently
00:53:51.740 they like him.
00:53:53.700 Imagine that.
00:53:55.020 That's the news.
00:53:56.180 He managed the news.
00:53:58.080 The news is telling you
00:53:59.280 that the groups
00:54:00.300 that would be
00:54:00.800 most concerned
00:54:01.600 got a good response
00:54:04.900 from him.
00:54:06.700 He's doing everything right.
00:54:08.320 So you're learning
00:54:09.320 publicity,
00:54:10.280 marketing,
00:54:11.240 you're learning
00:54:11.840 how to take
00:54:12.400 user requirements,
00:54:13.600 you're learning
00:54:14.060 how to get buy-in,
00:54:15.780 you're learning
00:54:16.340 how to communicate,
00:54:17.520 you're learning
00:54:17.820 how to tweet,
00:54:18.540 you're learning
00:54:18.860 social media
00:54:19.600 as it relates
00:54:20.420 to a company.
00:54:21.660 You're seeing
00:54:22.540 his business model
00:54:23.600 being created
00:54:24.780 in real time.
00:54:26.020 So he's struggling
00:54:26.800 between subscription
00:54:28.040 and advertiser
00:54:29.240 and what balance
00:54:30.320 and what features.
00:54:32.420 Oh my God.
00:54:34.240 Oh my God,
00:54:35.400 is this useful?
00:54:37.580 You don't realize
00:54:38.520 how much smarter
00:54:39.280 we're all getting
00:54:39.980 just by getting
00:54:41.300 to see the transparency
00:54:42.520 of this process.
00:54:44.720 So the brilliance
00:54:45.800 that is coming out
00:54:46.620 of all of this
00:54:47.400 is just stunning.
00:54:50.160 Mike Cernovich
00:54:51.000 had an interesting
00:54:51.920 note
00:54:53.140 on a tweet.
00:54:55.800 He said that the fact that it's not being treated as newsworthy that Twitter had entire teams coordinating messaging with the UN says it all.
00:55:05.400 And he goes on
00:55:08.460 and he says
00:55:09.000 how is that
00:55:09.420 not a big deal?
00:55:10.660 And he speculates
00:55:11.680 and he goes
00:55:12.060 exactly.
00:55:13.040 New York Times,
00:55:13.720 Washington Post
00:55:14.260 probably have the same internal teams doing propaganda.
00:55:18.420 So Cernovich
00:55:19.020 is speculating
00:55:19.740 that the major media
00:55:21.120 might not be
00:55:21.980 hitting this story hard
00:55:23.200 because they might
00:55:24.780 also have
00:55:25.320 deep connections
00:55:26.180 at Twitter
00:55:26.720 which was
00:55:28.080 managing the news
00:55:29.460 if you will.
00:55:31.280 And I don't know
00:55:32.040 if that's true.
00:55:32.920 It's reasonable
00:55:33.780 speculation.
00:55:35.400 But
00:55:36.900 I'm more interested
00:55:38.440 in the fact
00:55:38.980 of why we decide
00:55:40.620 some things are news
00:55:41.720 and some are not.
00:55:42.680 Have I told you
00:55:43.320 before that
00:55:44.020 the New York Times
00:55:45.220 and Washington Post
00:55:48.540 and really
00:55:50.340 not much else
00:55:51.060 a few others
00:55:52.280 they make news.
00:55:55.020 And by make news
00:55:56.040 I mean they look
00:55:56.900 into all the things
00:55:57.720 that are happening
00:55:58.240 and then they tell you
00:55:59.220 what things are important.
00:56:01.240 But is it because
00:56:02.220 they are important?
00:56:03.000 That's what I used
00:56:04.600 to think.
00:56:05.700 I used to think
00:56:06.380 that if it's on the news
00:56:07.420 it's because it's important
00:56:08.600 and everybody notices it
00:56:10.580 and then everybody
00:56:11.060 puts it on the news.
00:56:12.480 Nothing like that
00:56:13.380 is happening.
00:56:14.340 When it comes to politics
00:56:15.580 they decide
00:56:17.720 what the news is.
00:56:18.660 It's literally
00:56:19.100 just a decision
00:56:19.980 by people who
00:56:21.740 want to manage
00:56:23.280 the public.
00:56:24.700 And they've decided
00:56:26.120 that this is not
00:56:27.100 a big story:
00:56:27.760 that Twitter
00:56:29.860 was
00:56:30.620 you know
00:56:31.660 being influenced
00:56:32.540 directly
00:56:33.280 and personally
00:56:34.040 and in an ongoing way
00:56:35.780 by the United Nations
00:56:37.500 to make sure
00:56:38.840 their messaging
00:56:39.360 was compatible
00:56:40.100 with what the UN wanted.
00:56:42.120 Which is the scariest
00:56:43.220 thing you could ever hear.
00:56:45.040 Now why is that
00:56:45.800 not a big story?
00:56:48.700 Because the media
00:56:49.740 decided it wasn't.
00:56:51.640 Now I suppose
00:56:52.240 Fox News could decide
00:56:53.640 and they're big enough
00:56:54.800 that they could make
00:56:55.460 that a story,
00:56:55.980 but it would never
00:56:57.000 cross over
00:56:57.760 from Fox News
00:56:58.620 would it?
00:56:59.760 Except that
00:57:00.480 other people
00:57:01.300 would mock them
00:57:02.320 for having
00:57:02.760 the only stories
00:57:03.580 about it or something.
00:57:05.180 So this is one
00:57:06.560 of those cases
00:57:07.140 where you can tell
00:57:08.080 yourself
00:57:08.540 you really see
00:57:10.080 that the media
00:57:11.580 assigns your opinion.
00:57:13.740 Because whatever
00:57:14.720 you're worked up
00:57:15.720 about today
00:57:16.740 I guarantee
00:57:17.860 it was on the news.
00:57:19.680 And the things
00:57:20.560 you're not worked up
00:57:21.380 about
00:57:21.680 they were not
00:57:22.380 on the news.
00:57:23.960 The news decided
00:57:25.220 what to run
00:57:25.900 and they could
00:57:27.220 pick or ignore
00:57:28.080 whatever they wanted.
00:57:29.720 And then
00:57:30.100 you get worked up
00:57:31.020 over it
00:57:31.340 and you think
00:57:31.640 oh I'm worked up
00:57:32.440 over that important
00:57:33.240 thing.
00:57:34.060 No.
00:57:34.720 You're not worked up
00:57:35.700 over the important
00:57:36.480 thing.
00:57:37.320 You're worked up
00:57:38.040 over what the news
00:57:39.060 wanted you to be
00:57:40.120 worked up over.
00:57:41.640 You're completely
00:57:42.280 manipulated.
00:57:43.900 As am I.
00:57:44.980 Right?
00:57:45.220 I'm not
00:57:45.780 outside that system.
00:57:48.380 I'm just as
00:57:48.900 manipulated as you are.
00:57:50.420 Just be aware
00:57:51.020 of it.
00:57:51.280 So that was a
00:57:54.280 two-level
00:57:54.940 excellent tweet
00:57:56.260 by Cernovich,
00:57:56.940 because it really
00:57:57.460 gets to something
00:57:58.020 deeper than just
00:57:59.240 the news.
00:58:02.340 All right.
00:58:04.660 Tim Ryan
00:58:05.420 running for
00:58:05.980 Ohio Senate
00:58:06.680 had this statement
00:58:08.880 to say in public.
00:58:09.880 He said
00:58:10.200 imagine
00:58:10.760 this is his
00:58:12.100 competitor
00:58:12.500 he says
00:58:13.420 imagine if
00:58:14.000 J.D. Vance
00:58:14.840 was your
00:58:15.720 teenage kid
00:58:16.460 and you caught
00:58:17.100 him hanging around
00:58:17.820 with people like
00:58:18.600 Marjorie Taylor Greene
00:58:20.180 Ron DeSantis
00:58:21.100 and Alex Jones.
00:58:25.320 To which I tweeted: imagine being a Democrat candidate one day before Election Day (I said one day because I didn't know what day it was) and the best argument you have is that your opponent values diversity.
00:58:39.880 That's his
00:58:40.520 argument.
00:58:41.460 My opponent
00:58:42.140 values
00:58:43.860 diverse
00:58:44.440 people
00:58:45.320 because here's
00:58:46.300 what he didn't
00:58:46.820 say.
00:58:47.760 He didn't say
00:58:48.760 J.D. Vance
00:58:50.040 refuses to
00:58:50.860 meet with
00:58:51.180 black people.
00:58:52.860 He didn't
00:58:53.180 say that
00:58:53.740 because I
00:58:55.360 assume that
00:58:55.800 J.D. Vance
00:58:56.440 has met with
00:58:57.280 all of his
00:58:58.140 constituents.
00:59:00.380 But he's also meeting with
00:59:03.220 some people
00:59:03.620 who have
00:59:04.000 opinions that
00:59:04.880 Tim Ryan
00:59:05.740 does not
00:59:06.160 share.
00:59:07.900 So he's actually criticizing him for that. I mean, this is how empty the Democrats' entire approach is.
00:59:15.740 They are so
00:59:16.260 empty
00:59:16.620 that this
00:59:18.320 close to
00:59:18.780 election
00:59:19.160 a strong
00:59:19.940 approach
00:59:20.720 is that
00:59:21.800 J.D. Vance
00:59:22.860 values diverse
00:59:23.880 opinions
00:59:24.400 because that's
00:59:26.660 what I heard.
00:59:28.440 I didn't hear
00:59:29.340 J.D. Vance
00:59:31.380 agrees with
00:59:32.420 Marjorie Taylor
00:59:33.940 Greene on all
00:59:34.640 things.
00:59:35.620 Did you hear
00:59:36.080 that?
00:59:36.960 Nobody's even
00:59:37.680 claiming that.
00:59:38.940 What I didn't
00:59:39.460 hear is
00:59:40.100 J.D. Vance
00:59:42.160 agrees with all
00:59:43.100 the things that
00:59:43.620 Alex Jones has
00:59:44.400 said.
00:59:45.640 I never heard
00:59:46.140 that because
00:59:47.440 I'm sure
00:59:47.800 he doesn't.
00:59:49.660 If you're
00:59:50.380 interacting with
00:59:51.180 people who
00:59:51.520 have different
00:59:52.000 opinions but
00:59:52.760 not adopting
00:59:53.540 them, you're
00:59:55.220 doing what we
00:59:56.020 want you to
00:59:56.640 do, aren't
00:59:57.900 you?
00:59:58.880 Do you want a
01:00:00.220 politician who
01:00:01.000 doesn't interact
01:00:01.900 with people who
01:00:02.440 have different
01:00:02.800 opinions?
01:00:04.720 I mean this
01:00:05.440 is the craziest,
01:00:06.640 weakest,
01:00:07.680 emptiest attack
01:00:08.660 I've ever seen.
01:00:09.880 But even more
01:00:10.820 than that, I'm
01:00:11.440 going a little
01:00:11.800 hard on it
01:00:12.360 because about
01:00:15.220 once a year
01:00:15.980 or maybe twice
01:00:17.020 a year, I
01:00:17.940 have to say
01:00:18.480 this in public
01:00:19.200 and so I'm
01:00:20.600 going to say
01:00:20.900 it again.
01:00:22.400 I assert
01:00:23.360 my complete
01:00:25.560 right to
01:00:26.660 associate with
01:00:27.720 anybody I
01:00:28.300 want.
01:00:29.580 And when you
01:00:30.240 don't like it,
01:00:31.020 fuck you.
01:00:32.380 And that's the
01:00:32.880 end of the
01:00:33.200 conversation.
01:00:34.520 I'm not going
01:00:35.280 to adopt the
01:00:36.040 views of any
01:00:36.880 bad people
01:00:37.580 necessarily.
01:00:39.100 I might.
01:00:40.320 But that's not
01:00:40.980 why that happens.
01:00:42.880 I interact
01:00:44.520 with people
01:00:45.240 you don't
01:00:45.980 like.
01:00:47.560 And if you
01:00:48.360 don't like
01:00:48.800 that, fuck
01:00:50.500 you.
01:00:51.760 And there's
01:00:52.200 nothing else
01:00:52.620 to say.
01:00:53.340 Would you
01:00:53.620 agree there's
01:00:54.060 nothing else
01:00:54.500 to the
01:00:54.760 conversation?
01:00:56.960 This is one
01:00:57.600 of those
01:00:57.800 conversations
01:00:58.340 where fuck
01:00:59.020 you, that's
01:01:00.840 the whole
01:01:01.460 argument.
01:01:02.300 That's the
01:01:02.580 whole argument.
01:01:03.420 There's nothing
01:01:03.840 else.
01:01:04.860 I have no
01:01:05.580 nuance to add
01:01:06.240 to that.
01:01:06.520 Just fuck
01:01:06.900 you.
01:01:07.820 That's it.
01:01:08.840 Fuck you.
01:01:09.780 I'm going to
01:01:10.220 associate with
01:01:10.900 whoever I
01:01:11.420 want, whenever
01:01:12.600 I want, right
01:01:13.840 in front of
01:01:14.240 you.
01:01:15.360 Don't care.
01:01:19.460 To make
01:01:20.240 my point, watch
01:01:22.280 me prove my
01:01:23.020 point with a
01:01:24.280 segue that was
01:01:25.860 probably created
01:01:26.940 in heaven, it's
01:01:27.800 so amazing.
01:01:30.320 Adam Kinzinger
01:01:31.280 is right about
01:01:32.060 this.
01:01:33.780 So there you
01:01:34.540 go.
01:01:35.360 Adam Kinzinger,
01:01:36.240 somebody that I
01:01:37.020 would not agree
01:01:37.820 with normally, but
01:01:38.540 I associate myself with
01:01:39.380 his following
01:01:40.000 comments, because
01:01:41.040 if you don't
01:01:41.600 like me associating
01:01:42.560 and agreeing with
01:01:44.340 Adam Kinzinger on
01:01:45.420 a specific policy,
01:01:47.200 fuck you.
01:01:48.820 Fuck you.
01:01:50.160 But I know you
01:01:50.840 don't.
01:01:51.400 None of you have
01:01:52.260 that problem, I'm
01:01:52.940 sure.
01:01:53.760 But here he is
01:01:55.160 being right about
01:01:55.840 something.
01:01:56.320 Apparently there's
01:01:56.820 some aircraft that
01:01:57.880 the National Guard
01:01:58.760 uses to track the
01:02:00.780 drug shipments, and
01:02:01.980 fentanyl in
01:02:02.760 particular, and it's
01:02:04.160 going to be retired,
01:02:05.680 and I guess Kinzinger is
01:02:06.700 one of the few
01:02:07.120 people who actually
01:02:07.780 can fly that
01:02:08.420 plane.
01:02:09.940 I didn't realize he
01:02:10.860 was in the National
01:02:11.460 Guard.
01:02:12.540 Kinzinger is still
01:02:13.400 in the National
01:02:13.860 Guard?
01:02:16.020 He is?
01:02:16.760 All right, credit
01:02:17.380 him.
01:02:18.600 So can we say that, can we call him a full-throated patriot?
01:02:25.400 Now I hate what he
01:02:26.660 did with January 6th,
01:02:28.040 right?
01:02:29.040 But he's a member
01:02:29.760 of the National
01:02:30.420 Guard, and he's
01:02:32.420 calling out a huge
01:02:33.860 problem, that this
01:02:35.260 aircraft will be
01:02:36.420 retired, and then
01:02:37.340 they won't be able
01:02:37.880 to easily, you
01:02:39.480 know, interdict
01:02:40.780 fentanyl.
01:02:43.160 So good job, Adam
01:02:45.000 Kinzinger, on this
01:02:46.100 particular issue, which
01:02:48.480 also suggests that the
01:02:49.860 administration is not
01:02:50.800 that serious about
01:02:51.760 fentanyl, because this
01:02:54.380 plane is critical to
01:02:55.680 their success at
01:02:57.080 capturing stuff, and
01:02:59.460 Kinzinger is fighting
01:03:00.780 to keep this funded.
01:03:02.080 So good for you.
01:03:02.700 Hope that works.
01:03:05.560 Let's talk about
01:03:06.480 Trump.
01:03:07.860 Trump labeled Governor
01:03:09.760 DeSantis with a
01:03:10.760 nickname.
01:03:12.200 He called him Ron
01:03:13.280 DeSanctimonious.
01:03:15.640 How many of you had
01:03:17.800 to look up the word
01:03:20.600 Sanctimonious?
01:03:24.000 I did.
01:03:25.560 I did.
01:03:26.920 Yeah, I did.
01:03:28.440 So if it makes it
01:03:29.560 easier for you to
01:03:30.540 admit it, I am a
01:03:32.080 professional writer.
01:03:33.740 I do it for a living.
01:03:35.500 I never use that
01:03:36.460 word.
01:03:38.240 Yeah.
01:03:38.860 So I had to look it
01:03:39.880 up.
01:03:40.420 So in case you don't
01:03:41.140 know what it means,
01:03:42.020 like me, basically
01:03:44.680 somebody who's showing
01:03:46.640 a big show of religion
01:03:49.420 or piety, you know,
01:03:51.040 being the right person,
01:03:52.240 but being somewhat
01:03:53.820 hypocritical.
01:03:54.740 So somebody who's
01:03:56.460 trying to be holier than
01:03:58.400 you, but they're sort
01:03:59.420 of a hypocrite.
01:04:01.120 Does any of that seem
01:04:02.340 to fit Ron DeSantis?
01:04:05.180 Does that feel like a
01:04:07.320 kill shot?
01:04:08.920 It does not.
01:04:10.720 Now, how many of you
01:04:11.900 were happy that he
01:04:12.940 said that?
01:04:14.080 How many of you said,
01:04:15.120 oh, I'm glad he did
01:04:16.240 that?
01:04:17.680 None, right?
01:04:18.680 Zero.
01:04:18.980 So, given that none
01:04:21.660 of you were happy
01:04:22.320 with that, almost none
01:04:23.920 of you, let's say 95%,
01:04:25.400 if you were not happy
01:04:27.380 with it, was it a
01:04:28.180 mistake?
01:04:29.720 Eh?
01:04:30.200 You weren't happy
01:04:31.120 with it, but was it
01:04:32.380 a mistake?
01:04:33.500 Go.
01:04:36.080 Yeah, it gets a little
01:04:37.300 more complicated now.
01:04:40.060 Let me hark back to my
01:04:41.980 Mark Ruffalo comment.
01:04:45.000 Mark Ruffalo is not the
01:04:47.060 one who should be
01:04:47.620 commenting on business
01:04:48.860 stuff, right?
01:04:49.800 That's not his domain.
01:04:50.680 And so, we should all be
01:04:52.580 humble when we're
01:04:54.540 commenting on somebody
01:04:55.700 else's expertise, which
01:04:57.020 we don't share.
01:04:58.680 Now, I always have a
01:05:01.160 little concern when I
01:05:02.440 criticize Trump for any
01:05:04.140 of his persuasion-related
01:05:05.740 stuff, because he became
01:05:08.040 president.
01:05:09.340 I didn't.
01:05:11.100 Whatever he does seems to
01:05:12.640 work, even when you think
01:05:13.740 it won't, right?
01:05:14.960 That's probably the most
01:05:17.620 remarkable thing about his
01:05:20.140 method, is that even people
01:05:22.140 who support him think
01:05:23.040 that's not going to work.
01:05:24.000 Well, that's not going to
01:05:24.680 work.
01:05:25.320 And then, weirdly, it
01:05:26.180 works, for reasons that you
01:05:27.960 don't imagine.
01:05:28.580 He sucks all the energy out,
01:05:30.080 you know, whatever.
01:05:31.260 Here's what I thought.
01:05:32.360 My first impression was, oh,
01:05:34.340 that's a mistake.
01:05:35.980 That's a mistake.
01:05:36.840 He just went after one of the
01:05:38.660 most popular Republicans who's
01:05:41.440 100% successful, according to
01:05:43.780 Republicans.
01:05:45.480 So, on the surface, it was a
01:05:47.160 mistake, wasn't it?
01:05:49.200 But, did you notice it was the
01:05:51.800 worst kill shot he's ever
01:05:53.460 issued?
01:05:55.220 Is that an accident?
01:05:57.120 Is it an accident that the
01:05:58.740 kill shot he used against
01:06:01.240 DeSantis seemed to be designed
01:06:03.720 not to wound him?
01:06:06.000 Right?
01:06:06.480 It looked like a warning shot.
01:06:09.920 Shot over the bow, exactly.
01:06:11.720 That was a brushback pitch.
01:06:14.880 The brushback pitch, he just
01:06:17.160 told the world, you can run
01:06:19.620 against me in the primary, but
01:06:21.660 it's not going to be pretty.
01:06:24.600 Once you think about it a little
01:06:26.640 bit, you realize it was just a
01:06:28.300 brushback pitch.
01:06:29.700 It wasn't really about DeSantis
01:06:31.120 at all.
01:06:32.960 You know, I'd be surprised if he
01:06:34.140 thinks DeSantis, you know, does a
01:06:36.540 poor job.
01:06:37.200 I think he probably thinks DeSantis
01:06:38.520 does a good job.
01:06:39.640 He's also quite aware that
01:06:42.600 people are saying, hey, why
01:06:44.340 shouldn't DeSantis have the top
01:06:45.680 job?
01:06:46.880 Right?
01:06:47.460 So, Trump knows he has to knock
01:06:50.280 DeSantis down a notch, but I
01:06:52.640 think he was just telling the
01:06:53.760 world he's willing to do it.
01:06:57.760 Which is actually useful.
01:06:59.620 He's basically saying, yeah, I'm
01:07:01.040 not going to leave him alone.
01:07:02.100 I'm going to go after him.
01:07:03.180 But he went after him so softly
01:07:05.420 that it's easy to pull back.
01:07:08.700 It was such a soft brushback that it looks like it was intentional, just a warning shot,
01:07:15.880 just to let you know that, you
01:07:17.420 know, he's the big shark, and if
01:07:19.440 you go swimming with the big shark,
01:07:21.160 you're going to get bit.
01:07:24.000 So I started hating it, and then
01:07:26.980 the more I thought about it, it's
01:07:28.720 like, oh, God, he did it again.
01:07:31.200 He did it again.
01:07:32.000 He did something that I
01:07:32.940 immediately disliked, but the
01:07:34.360 more I thought about it, it's
01:07:35.220 like, shoot, that does move the
01:07:36.620 energy in his direction, in just
01:07:38.480 a slight way.
01:07:40.940 Anyway, so always be humble when
01:07:43.220 you're looking at Trump's
01:07:44.580 persuasion, because you never know
01:07:45.980 exactly what he's up to.
01:07:47.400 Sometimes it's smarter than you
01:07:48.480 think.
01:07:50.760 All right.
01:07:52.360 As we head into the midterms,
01:07:55.460 how's everybody feeling?
01:07:59.520 How's everybody feeling?
01:08:00.720 What do you think about the
01:08:01.720 economy?
01:08:04.580 Think the economy will be all right?
01:08:10.620 Yeah, a lot of you are feeling
01:08:12.160 good, but also nervous.
01:08:15.600 I don't know what's going to
01:08:16.900 happen, but I'll tell you one
01:08:19.400 thing that might happen.
01:08:22.840 One thing that might happen is
01:08:24.440 there's tons of scrutiny, and
01:08:27.580 maybe we got way better at
01:08:28.980 watching things.
01:08:30.520 So we might have the cleanest
01:08:31.980 election we've ever had.
01:08:33.860 Like, that's possible, right?
01:08:35.740 Could be the cleanest one we've
01:08:36.860 ever had, just because of so much
01:08:38.240 extra energy watching things.
01:08:41.260 And there have been some changes.
01:08:43.360 There have been some changes that
01:08:44.880 should reduce the amount of
01:08:46.820 cheating, potentially.
01:08:48.020 But the other possibility, that I
01:08:52.800 don't discount at all, is that the
01:08:54.860 extra attention will end up finding
01:08:56.620 some real serious fraud.
01:08:59.840 It could be on either side.
01:09:01.220 I'm not going to say it's going to
01:09:02.060 be all Democrat.
01:09:03.840 Because, here's my opinion.
01:09:06.000 The reason that we don't have
01:09:07.940 elections that can be finished the
01:09:11.920 same day is because we don't want
01:09:14.920 it.
01:09:15.120 There's no other reason.
01:09:17.780 We obviously, we meaning both
01:09:19.840 Democrats and Republicans, you
01:09:22.180 know, each of them having control
01:09:23.660 over individual states, it's very
01:09:25.840 obvious that they want the option
01:09:27.180 of cheating.
01:09:28.860 It's very obvious.
01:09:30.200 Because if either side didn't want
01:09:32.440 the option of cheating, they would
01:09:34.340 just make the option go away in
01:09:36.200 their state.
01:09:37.520 Because there are places where the
01:09:38.840 Republicans can design the system
01:09:40.700 any way they want.
01:09:42.400 Right?
01:09:43.240 Right?
01:09:43.900 Give me a fact check.
01:09:44.760 There are states where the
01:09:46.000 Republicans could have an Israeli
01:09:48.680 type of system where it's all on
01:09:50.360 paper, and it all happens in one
01:09:52.820 room, the ballots never leave the
01:09:54.300 room, you count them in the room
01:09:55.620 with lots of witnesses, and then
01:09:57.660 there's basically close to zero
01:09:59.500 chance of fraud.
01:10:01.420 You don't think that one Republican
01:10:03.220 state would have implemented
01:10:04.600 something that's just fraud-free?
01:10:08.220 No.
01:10:09.360 It's because everybody who controls
01:10:11.420 elections in their own state wants the
01:10:13.340 option of gaming the system.
01:10:15.900 What else could it be?
01:10:17.660 What else could it be?
01:10:19.120 If either side wanted the elections
01:10:21.240 to be transparent and done on the
01:10:25.440 first day, we have models to follow
01:10:28.640 that are simple.
01:10:30.800 They're simple.
01:10:32.120 Just what does Israel do?
01:10:33.980 And what does some other countries do?
01:10:35.400 Just do that.
01:10:36.280 It works.
01:10:36.740 So, you can eliminate the possibility
01:10:40.620 that either side wants to fix this.
01:10:45.280 Does anybody disagree with that?
01:10:47.560 I'm not saying Republicans do anything
01:10:50.160 except, you know, dick around the
01:10:52.200 edges, you know, with voter ID and
01:10:54.400 stuff, which is important.
01:10:55.620 But that's just dicking around the
01:10:56.980 edges.
01:10:57.200 All right.
01:11:04.800 Let me make a case that everything's
01:11:06.540 getting better, even though we're low
01:11:08.900 on diesel fuel.
01:11:11.960 Let me make a diesel fuel prediction.
01:11:15.120 Are you ready?
01:11:15.580 So, apparently, if diesel fuel is
01:11:19.020 limited, trucking will fall off and, you
01:11:23.040 know, the supply chain will die.
01:11:26.240 And we don't have an obvious way to
01:11:28.520 make enough to fix that.
01:11:31.800 I think we know about this problem far
01:11:33.980 enough in advance that we'll get
01:11:39.120 through it, like our other problems.
01:11:40.860 So, if we know far enough in advance,
01:11:43.100 we'll use trains and everything else we're going to need to, and we'll probably make the adjustments.
01:11:51.720 Now, imagine, if you will, that the
01:11:55.720 country just couldn't get products where
01:11:57.620 it wanted because there's not enough
01:11:58.820 diesel.
01:11:59.540 What would be the fallback?
01:12:02.980 Somebody builds an app where you can
01:12:05.680 all be Uber for packages.
01:12:08.440 Right?
01:12:08.920 All it would take is an app that says,
01:12:12.220 well, just, you know, take your van down
01:12:14.200 to this dock, we'll load you up with
01:12:16.740 stuff, and you, you know, deliver it to
01:12:18.980 the air, and you'll be covered with some
01:12:20.640 temporary insurance or something like
01:12:21.980 that.
01:12:22.520 You know, if it were a national emergency
01:12:24.520 where people were starving and stuff, we
01:12:27.400 would just use regular cars and regular
01:12:31.120 trucks.
01:12:31.700 Just people would put their, bring their
01:12:33.700 pickup truck and say, well, I can
01:12:35.280 deliver some of this.
01:12:36.020 So, it's one of those emergencies where
01:12:40.780 if you had to, like you're actually
01:12:42.980 starving and shit, we have other
01:12:46.060 vehicles, right?
01:12:47.180 We're not, we don't have a shortage of
01:12:48.800 vehicles that can transport stuff.
01:12:50.800 It would be an organizational, you know,
01:12:53.000 Dunkirk-sized thing, but we can do it.
01:12:56.040 We'll get through it.
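For what it's worth, the matching that an "Uber for packages" fallback like that would need is easy to sketch. The snippet below is a minimal, hypothetical Python illustration only; the names (Load, Driver, match_loads) and the greedy capacity-based assignment are assumptions made up for this example, not anything described in the episode or any real app.

```python
# Hypothetical sketch only: toy matching of volunteer vehicles to freight loads.
from dataclasses import dataclass, field

@dataclass
class Load:
    load_id: str
    weight_lbs: int
    destination: str

@dataclass
class Driver:
    name: str
    capacity_lbs: int                      # what their van or pickup can carry
    assigned: list = field(default_factory=list)

    @property
    def remaining_lbs(self) -> int:
        return self.capacity_lbs - sum(l.weight_lbs for l in self.assigned)

def match_loads(loads, drivers):
    """Greedy first-fit: heaviest loads first, each into whichever driver
    has the most remaining capacity. Returns the loads left unassigned."""
    unassigned = []
    for load in sorted(loads, key=lambda l: l.weight_lbs, reverse=True):
        best = max(drivers, key=lambda d: d.remaining_lbs, default=None)
        if best is not None and best.remaining_lbs >= load.weight_lbs:
            best.assigned.append(load)
        else:
            unassigned.append(load)
    return unassigned

if __name__ == "__main__":
    loads = [Load("L1", 800, "grocery store"),
             Load("L2", 300, "pharmacy"),
             Load("L3", 1200, "food bank")]
    drivers = [Driver("van owner", 1500), Driver("pickup owner", 900)]
    leftover = match_loads(loads, drivers)
    for d in drivers:
        print(d.name, "carries", [l.load_id for l in d.assigned])
    print("unassigned:", [l.load_id for l in leftover])
```

A real version would mostly be about insurance, verification, and routing rather than the matching itself, which is the point: the software side of the fallback is not the hard part.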
01:12:56.700 All right.
01:13:00.580 Is there anything that's going wrong?
01:13:03.280 I'm not even entirely sure that Ukraine
01:13:05.660 is going wrong.
01:13:08.820 Are you?
01:13:09.480 And I don't even know what wrong looks
01:13:12.360 like.
01:13:13.740 Because we don't know how it ends, but
01:13:15.720 one potential ending is that Putin is
01:13:18.880 weakened and Russia is no longer a thorn in
01:13:22.420 our side and Ukraine is creating energy
01:13:25.420 and food for everybody.
01:13:26.740 It's possible that, you know, we're
01:13:29.660 going to have a huge bill to pay in
01:13:33.240 terms of our borrowing to fund that war.
01:13:37.060 But, yeah, if you're
01:13:40.220 going to guess, it does look like things
01:13:43.280 are pretty bad over there and will stay
01:13:44.980 that way and could turn into a nuclear war.
01:13:48.240 But I don't see it happening.
01:13:49.200 I feel like the signals of a potential
01:13:52.380 nuclear war would be much stronger than
01:13:54.520 they are.
01:13:55.500 So now it just looks like people bluffing
01:13:57.160 each other.
01:13:58.740 So I don't spend really any time worrying
01:14:01.600 about nuclear war.
01:14:04.600 I mean, that could change, but at the
01:14:06.340 moment, it just looks like bluffing.
01:14:10.200 All right.
01:14:10.840 And then I hear other people say the real
01:14:13.640 problem is that Russia and China and
01:14:18.440 other countries will have their own
01:14:21.400 currency and then America's, you know,
01:14:24.380 currency dominance will fall away.
01:14:26.700 The dollar won't be worth something.
01:14:28.260 But I feel like we've been talking
01:14:29.820 about that forever, haven't we?
01:14:33.600 Yeah.
01:14:34.160 It just feels like we've been talking
01:14:35.520 about that forever.
01:14:36.280 I don't know if that means it's about
01:14:37.400 ready to happen, but I feel like that's
01:14:40.140 another one of those Adam's Law of
01:14:41.720 slow-moving disasters.
01:14:43.520 There's probably a way around it.
01:14:45.620 Can we get reparations from China for
01:14:51.600 COVID?
01:14:52.940 Well, have you heard this story that
01:14:55.660 Russia has put together a case that the
01:14:59.680 virus escaped from a Ukrainian lab that
01:15:03.580 America backed?
01:15:05.620 And they've asked the UN to investigate it
01:15:07.760 and they've given them a bunch of
01:15:08.780 information.
01:15:10.060 Have you heard of that?
01:15:11.060 Now, how many of you would say there's
01:15:14.840 no chance that America was behind the
01:15:17.180 virus?
01:15:18.580 In the comments, how many of you think
01:15:21.080 there's no chance?
01:15:21.980 There's no chance that America was
01:15:23.360 behind the virus?
01:15:28.120 Yeah, you're a little uncertain, aren't
01:15:30.100 you?
01:15:31.580 Yep.
01:15:32.760 In my opinion, it's entirely possible.
01:15:37.740 Entirely possible.
01:15:38.500 I'm not even sure how I would put the
01:15:40.380 odds on it.
01:15:41.480 I would say the odds that we have
01:15:43.120 accurate information about the source
01:15:45.820 of the virus are probably low.
01:15:48.120 Probably low.
01:15:49.400 Now, obviously, the Wuhan lab looks like
01:15:52.120 the obvious place, but I'd like to hear
01:15:55.560 what the Russians say.
01:15:57.300 Because if America released that virus
01:15:59.480 and knew it, what would they do?
01:16:01.760 If America knew it released the virus
01:16:05.200 and it was already out there, and it
01:16:07.720 knew, you know, nobody had noticed it yet,
01:16:09.420 but it was percolating and they knew it was
01:16:10.900 going to percolate up, the first thing
01:16:13.080 you do is you go infect some people next
01:16:15.360 to the Wuhan lab.
01:16:19.380 That's what you do.
01:16:20.520 You go infect some people in Wuhan.
01:16:23.100 Because then when it breaks out in Wuhan,
01:16:25.040 everybody's going to say, oh, Wuhan.
01:16:26.460 So if you tell me that the odds are
01:16:30.820 like all pointing toward Wuhan being
01:16:33.140 the source, I would say not so much.
01:16:35.800 There are definitely strong signals.
01:16:38.260 But we live in a world in which that's
01:16:40.340 the sort of thing that gets faked.
01:16:42.480 Like it's in the category of things that
01:16:44.240 people do lie about every time.
01:16:46.520 Like every time.
01:16:48.900 Whichever country it was that released
01:16:50.780 that virus, or whoever was responsible,
01:16:53.380 don't you think you could rely on them
01:16:55.540 lying? So you know somebody's lying.
01:16:58.880 You just don't know who.
01:17:02.440 Yeah, Russia's been saying it since
01:17:03.920 February, right?
01:17:06.880 Do prices return to previous levels
01:17:08.780 when inflation comes back down?
01:17:11.780 That's not exactly how that works.
01:17:14.980 No. The answer is no.
01:17:17.520 What should happen is your paycheck
01:17:21.040 should rise at some point if you're
01:17:23.980 working for the right company.
01:17:24.920 Your paycheck should rise to meet that.
01:17:28.960 But I don't know if that's going to happen.
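To put made-up numbers on that (purely illustrative, not figures from the episode): if prices rose 8% last year, the price level is 1.08 times what it was. If inflation then falls to 2%, next year's level is about 1.08 x 1.02 ≈ 1.10, still roughly 10% above the starting point. Falling inflation only slows the climb; it doesn't roll prices back, which is why the paycheck has to do the catching up.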
01:17:32.840 Yeah, pangolins have been maligned.
01:17:38.480 Do you remember when we were hearing
01:17:39.960 the early theories about the virus
01:17:42.280 and people were so confident about it?
01:17:46.240 The theories that turned out to be wrong.
01:17:48.800 Oh, definitely came from that wet market.
01:17:50.840 Oh, definitely.
01:17:51.780 Oh, wet market.
01:17:52.520 Nope.
01:17:53.480 Nope.
01:17:56.360 Your coffee cup.
01:17:57.780 Somebody's got their Coffee with Scott Adams
01:18:00.520 coffee cup while I'm live streaming.
01:18:04.120 Amazing.
01:18:04.600 Somebody asked me at an event yesterday.
01:18:12.780 Well, actually, lots of people asked me this.
01:18:15.020 They asked me if I'm retired.
01:18:17.980 Because I guess I'm at that age where
01:18:19.820 people just look at you and say,
01:18:21.640 how's retirement?
01:18:22.680 Like, shit, do I look that old?
01:18:25.020 Well, that's the first question you're going to ask me
01:18:27.200 is if I'm retired.
01:18:28.040 I don't like that.
01:18:30.700 But I decided that I am retired.
01:18:34.060 But I'm modern retired.
01:18:36.620 By modern retired, I mean I've simply stopped doing
01:18:40.300 the things I found unpleasant that I did for money.
01:18:43.960 And I do the things that I find pleasant.
01:18:47.640 It still works.
01:18:49.220 I was adding up how many hours per week I work now
01:18:52.420 in my retirement.
01:18:53.880 It's about 40 hours a week.
01:18:57.300 So my retirement is 40 hours a week of work.
01:19:00.980 Which feels like retirement,
01:19:02.580 because, you know, 60 to 70 would be more normal.
01:19:06.120 But 40 feels like vacation.
01:19:08.260 Like, I'm usually done by, you know,
01:19:10.680 if I start at 4 a.m.
01:19:13.220 and I work seven days a week,
01:19:16.160 right,
01:19:16.540 I'm done by 10 a.m.
01:19:19.580 and I've done 40 hours a week,
01:19:21.840 42.
01:19:23.880 And that's just on the live stream,
01:19:26.140 right,
01:19:26.880 because I do a lot of prep for that.
01:19:28.940 So I'm still doing the cartoon,
01:19:30.920 but my assistant
01:19:31.760 is doing all of the drawing now.
01:19:35.180 So if you see any differences in the drawing,
01:19:37.860 it's because a better artist is doing it.
01:19:42.060 So my assistant,
01:19:45.020 who does the art now,
01:19:46.460 is authorized to
01:19:48.380 change the look of it a little bit.
01:19:51.700 Like the background,
01:19:53.120 add a little background,
01:19:54.100 and maybe some angles that I don't do.
01:19:55.840 Just because she's a better artist than I am.
01:20:01.320 All right.
01:20:03.120 I need a time clock,
01:20:05.000 because I've run too long.
01:20:07.960 Got an interesting message here.
01:20:09.400 All right.
01:20:15.180 I got a funny message,
01:20:16.280 but I can't share it with you.
01:20:20.220 Sharon says,
01:20:21.940 Scott,
01:20:22.360 with all caps,
01:20:23.380 do you feel guilty
01:20:24.500 for falling for the fear hysteria of COVID?
01:20:27.480 Would anybody like to answer Sharon for me?
01:20:29.880 This is an all caps.
01:20:31.900 Scott,
01:20:32.380 do you feel guilty
01:20:33.160 for falling for the fear hysteria of COVID?
01:20:35.780 I wonder what's going to happen
01:20:41.400 to poor Sharon.
01:20:45.120 This is the sort of situation
01:20:46.620 where in the past,
01:20:49.880 people have seen me
01:20:51.760 have a reaction.
01:20:55.060 Except,
01:20:55.860 I don't believe she's real.
01:20:59.000 I think she's just goading me.
01:21:01.560 So I'm not going to respond to that.
01:21:03.840 But Sharon,
01:21:05.820 I'm sure there's somebody
01:21:07.860 in the comments
01:21:08.580 who will share with you
01:21:10.100 the word that
01:21:11.060 you have identified yourself as,
01:21:13.900 and I don't need to use it.
01:21:17.600 So Sharon,
01:21:18.980 you dug your own
01:21:20.220 situation.
01:21:23.540 I don't want to say grave.
01:21:24.860 That sounded violent.
01:21:28.380 Yeah,
01:21:28.800 I think she's just kidding around.
01:21:30.640 I don't think that's real.
01:21:32.360 I've blocked her.
01:21:33.840 I didn't block her.
01:21:36.800 All right.
01:21:37.520 Is there anything I missed?
01:21:39.120 Any stories I missed
01:21:40.200 that you need a reaction to?
01:21:42.260 Wake up,
01:21:42.840 Scott.
01:21:46.200 Check out this jingle
01:21:47.340 for the simultaneous sip.
01:21:49.140 Send that to me
01:21:49.880 somewhere else
01:21:50.580 so I can see it.
01:21:51.860 I won't be able
01:21:52.760 to go back
01:21:53.160 in the comments easily.
01:22:00.300 Baby troll,
01:22:01.280 what's that mean?
01:22:01.820 You still feeling better?
01:22:05.240 Yes.
01:22:05.880 Yeah.
01:22:06.140 So let me continue
01:22:07.420 with my update.
01:22:10.100 I have had nothing
01:22:11.460 but excellent days
01:22:13.020 all day long
01:22:14.500 from the day
01:22:15.820 that I stopped
01:22:16.320 taking my blood pressure meds.
01:22:18.380 Nothing but excellent days.
01:22:20.280 Mentally,
01:22:20.920 100%.
01:22:21.600 100%.
01:22:22.240 Physically,
01:22:24.560 my knees are hurting
01:22:25.800 a little bit
01:22:26.220 from overuse
01:22:27.160 but otherwise
01:22:28.140 my body is
01:22:28.980 like 20 years younger.
01:22:32.960 I mean,
01:22:33.620 I could exercise
01:22:36.100 for hours.
01:22:38.340 Everything's fixed
01:22:39.220 and it was just that.
01:22:40.660 Yeah,
01:22:44.500 I forget what kind.
01:22:45.260 I don't think
01:22:45.700 I had a beta blocker.
01:22:46.940 I think I said
01:22:47.500 it was a beta blocker
01:22:48.480 and then somebody
01:22:49.760 corrected me.
01:22:50.880 So I'm not going
01:22:51.440 to name the specific
01:22:52.620 meds because I don't
01:22:53.820 want to get sued
01:22:54.460 for making claims
01:22:56.120 that they can debunk
01:22:57.060 or something.
01:22:59.060 All right.
01:23:00.980 It was a steroid?
01:23:02.120 I don't think it was.
01:23:02.960 It wasn't a steroid.
01:23:03.820 Your coffee mugs
01:23:07.060 will be arriving soon.
01:23:11.400 Thank you, Henry.
01:23:16.960 High BP will jack
01:23:18.620 your cognition.
01:23:20.040 Jack your cognition
01:23:21.300 as in improve it?
01:23:23.880 So did I tell you
01:23:25.080 my reframe
01:23:25.960 for blood pressure?
01:23:27.880 Now,
01:23:28.620 I don't think
01:23:29.600 this has scientific backing.
01:23:32.160 So this is like
01:23:32.860 just a fun
01:23:33.660 speculative thing,
01:23:34.640 right?
01:23:38.720 Is it a coincidence
01:23:40.400 that all the things
01:23:41.680 that are good
01:23:42.260 for blood pressure
01:23:43.140 are things
01:23:46.020 that manage
01:23:46.560 your energy?
01:23:48.500 Right?
01:23:49.100 But basically you can lower your energy or expend it: you can expend it by exercise, or you can lower it by meditation, for example.
01:23:58.400 So it seems to be,
01:24:00.520 and this is just
01:24:01.200 a hypothesis,
01:24:01.800 that as you age
01:24:03.640 your body
01:24:04.860 does not naturally
01:24:05.940 off-gas your energy.
01:24:08.080 Meaning that
01:24:08.800 if you're a child
01:24:09.560 and you have
01:24:10.520 high energy,
01:24:11.880 nothing will stop
01:24:12.800 you from moving.
01:24:14.600 You'll just get
01:24:15.460 real active
01:24:16.040 until you burn
01:24:16.940 off your energy
01:24:17.580 and like fall asleep
01:24:19.520 into a dead sleep.
01:24:20.960 If you're an adult,
01:24:22.760 just because
01:24:23.520 a normal adult lifestyle
01:24:25.700 doesn't involve
01:24:26.340 moving much,
01:24:27.240 depending on your job,
01:24:28.780 but you sit
01:24:29.520 in a chair a lot
01:24:30.220 and say, I missed my workout,
01:24:33.320 or something like that.
01:24:34.980 So I feel as if
01:24:36.800 high blood pressure
01:24:37.700 is really energy
01:24:40.960 that doesn't have
01:24:43.200 a way to get out.
01:24:46.040 Does that reframe
01:24:47.140 make sense to you?
01:24:48.260 Because it also
01:24:48.880 tells you what to do.
01:24:50.960 It tells you
01:24:51.940 if you keep your body
01:24:53.420 healthy enough
01:24:54.060 that you can exercise,
01:24:55.260 then you've got
01:24:56.880 a good chance
01:24:57.620 of managing
01:24:58.120 your blood pressure.
01:25:00.100 Yeah, follow the energy.
01:25:01.180 Exactly.
01:25:02.260 Follow the energy.
01:25:04.400 Dr. Oz has entered
01:25:05.520 the room.
01:25:06.980 Yeah, so I'm not
01:25:08.000 going to say that
01:25:08.620 you should take
01:25:09.940 that reframe
01:25:10.640 and ignore your doctors.
01:25:12.000 I'm just saying
01:25:12.620 that it so clearly
01:25:13.940 explains everything
01:25:15.460 that you need to do
01:25:16.440 and not do
01:25:17.200 just by energy.
01:25:19.600 And there are
01:25:20.900 a whole bunch
01:25:21.360 of things in which
01:25:22.280 if you reframe it
01:25:23.560 as energy,
01:25:24.720 you start
01:25:25.720 understanding it.
01:25:27.100 Remember when I
01:25:27.940 first reframed
01:25:29.280 Trump as energy?
01:25:32.140 Like you were thinking
01:25:33.060 in terms of
01:25:33.740 fact-checking
01:25:34.980 and policies.
01:25:36.160 But when I reframed it
01:25:37.300 in terms of
01:25:37.780 he's an energy monster
01:25:39.080 and he moves the energy
01:25:40.140 where you want it,
01:25:40.920 then suddenly
01:25:41.400 everything makes sense,
01:25:42.320 right?
01:25:42.940 All the things
01:25:43.520 are a little confusing.
01:25:45.280 It suddenly comes
01:25:45.940 into focus
01:25:46.740 when energy
01:25:47.680 is the frame.
01:25:49.140 I feel like this
01:25:50.080 might be another
01:25:50.600 one of those.
01:25:52.120 We'll see.
01:25:53.560 Oh, I just realized
01:25:56.400 I did something clever.
01:25:59.600 If I have a stroke,
01:26:00.980 because everybody
01:26:01.480 is teasing me
01:26:02.300 because, you know,
01:26:03.240 I got vaccinated
01:26:04.040 originally,
01:26:05.260 people saying
01:26:06.080 I'm going to get a stroke.
01:26:07.460 But now that I've
01:26:08.520 publicly gone off
01:26:09.460 my blood pressure meds,
01:26:11.200 if I have a stroke,
01:26:13.560 you're not going
01:26:14.600 to know why.
01:26:17.280 I've ruined your fun.
01:26:19.220 I'm sorry.
01:26:19.980 Now if I have a stroke,
01:26:22.520 it's because I'm off
01:26:24.040 my blood pressure meds.
01:26:26.640 So I've ruined
01:26:27.760 your anecdotal happiness
01:26:29.500 should I have
01:26:31.180 a tragic event.
01:26:33.120 Your anecdotal happiness
01:26:34.220 will be way down.
01:26:35.920 Clot atoms
01:26:36.520 will not be as funny.
01:26:37.740 It won't be funny.
01:26:38.680 People won't laugh at it.
01:26:41.060 Yeah.
01:26:42.740 So that was accidental,
01:26:44.200 but it was a stroke
01:26:44.980 of genius.
01:26:45.820 A stroke of genius,
01:26:46.820 exactly.
01:26:47.160 Exactly.
01:26:52.120 Okay.
01:26:53.880 No one here
01:26:54.700 wants you to be harmed?
01:26:55.760 Probably no one here.
01:26:57.200 But there are definitely
01:26:58.080 some people
01:26:58.640 who would have
01:26:59.460 mocked me,
01:27:00.380 my dead bones,
01:27:01.240 if I died.
01:27:02.740 Definitely.
01:27:04.100 And I don't
01:27:04.980 begrudge them.
01:27:05.860 They're fun.
01:27:07.480 If I'm dead,
01:27:08.240 what's the difference?
01:27:11.440 All right.
01:27:13.420 Could you have
01:27:14.120 lowered the dose?
01:27:15.360 Yes.
01:27:15.640 Yes.
01:27:17.160 And the lower dose
01:27:18.280 did not affect
01:27:19.220 my blood pressure.
01:27:20.680 So it didn't help.
01:27:22.300 It was actually
01:27:23.240 raised recently
01:27:24.360 because it wasn't
01:27:26.000 changing the blood pressure.
01:27:32.520 And by the way,
01:27:33.560 I'm not the person
01:27:34.940 telling you
01:27:35.440 you shouldn't take
01:27:36.040 blood pressure meds
01:27:36.880 because apparently
01:27:38.000 there are a bunch of them.
01:27:39.460 I tried one of them,
01:27:40.680 two of them actually,
01:27:41.660 and the two of them
01:27:42.560 I tried
01:27:42.940 didn't give me
01:27:43.620 a good result.
01:27:44.840 That's all I know.
01:27:45.580 I don't know
01:27:46.260 about the other
01:27:46.860 five or six
01:27:47.860 or whatever.
01:27:49.120 Do I eat grapefruit?
01:27:50.700 No.
01:27:51.440 It's weird.
01:27:52.900 What is your BP?
01:27:54.040 It ranges now
01:27:56.240 without blood pressure meds,
01:27:57.700 in the week after stopping them.
01:27:57.700 It ranges between
01:27:58.700 perfect,
01:28:00.380 120 over 80,
01:28:03.340 and 140 over 90 something
01:28:06.920 in the afternoon usually.
01:28:09.100 And as I said
01:28:11.800 to my doctor,
01:28:12.940 I'm not aware
01:28:13.680 of any studies
01:28:14.480 that say somebody
01:28:15.460 with even the high end
01:28:16.840 of my blood pressure
01:28:17.680 should be on
01:28:18.300 blood pressure meds.
01:28:19.580 Which is not to say
01:28:20.660 I shouldn't be.
01:28:22.100 I was simply challenging
01:28:23.560 whether there's
01:28:24.260 any medical
01:28:25.020 scientific
01:28:26.580 backing for it,
01:28:28.860 and my doctor
01:28:29.580 was not willing
01:28:30.380 to say there was.
01:28:32.180 Because I did enough
01:28:33.240 research to know
01:28:34.080 there isn't.
01:28:35.380 There isn't.
01:28:37.140 It's another one
01:28:38.100 of those big pharma
01:28:39.200 things, right?
01:28:41.380 Now,
01:28:42.660 I have to be
01:28:43.940 really careful.
01:28:44.640 I'm not saying
01:28:45.320 that you shouldn't
01:28:45.980 take blood pressure meds.
01:28:47.140 I'm saying
01:28:47.440 let me be very careful
01:28:48.860 about that.
01:28:49.320 That's between you
01:28:49.960 and your doctor.
01:28:53.960 All right.
01:28:57.160 What time do you suggest
01:28:58.560 I take my first
01:28:59.560 bong rip today?
01:29:01.000 Well, I think you've
01:29:02.080 wasted a few minutes
01:29:02.940 asking the question.
01:29:04.540 So,
01:29:04.820 no time like now.
01:29:13.640 140 isn't
01:29:14.560 necessarily high.
01:29:18.160 Yeah,
01:29:18.480 I think that,
01:29:20.060 so here's the question.
01:29:23.100 If you have a stroke,
01:29:24.460 it's not just because
01:29:25.860 of your blood pressure,
01:29:27.520 correct?
01:29:28.960 There are, you know, a number of factors that have to also be true at the same time. Yeah, you need some kind of a thing going on.
01:29:38.280 Now,
01:29:38.640 in my case,
01:29:39.860 I'm a vegetarian
01:29:41.260 for, you know,
01:29:42.320 40 years,
01:29:43.760 more recently
01:29:44.480 a pescatarian.
01:29:47.080 My body mass index
01:29:49.740 is exactly
01:29:50.540 where it should be.
01:29:51.480 I'm like right in the middle
01:29:52.380 of the good range.
01:29:53.500 I exercise daily.
01:29:57.240 I don't drink alcohol
01:29:58.760 or smoke tobacco.
01:30:02.860 And there are no
01:30:04.560 strokes in my family
01:30:06.720 background.
01:30:08.320 No strokes in my
01:30:09.420 family background.
01:30:10.700 Now,
01:30:11.160 can somebody tell me
01:30:12.400 that my blood pressure
01:30:14.240 at 140
01:30:14.940 is dangerous?
01:30:16.440 It might be.
01:30:18.900 I'm not saying it isn't.
01:30:20.060 Just being real clear.
01:30:21.440 I just don't think
01:30:22.180 there's any evidence
01:30:23.160 to show that me
01:30:24.620 as a specific individual
01:30:26.460 with a specific lifestyle,
01:30:28.640 I just don't know.
01:30:30.600 Here's what I think.
01:30:32.020 I've always believed
01:30:32.940 that my body
01:30:33.780 runs hot.
01:30:35.740 Runs hot.
01:30:37.180 And what I mean by that
01:30:38.300 is that when I go to sleep,
01:30:39.880 I wake up sweating.
01:30:41.740 Right?
01:30:42.620 Like my natural body
01:30:43.940 just runs hot.
01:30:44.700 I'd rather be in a cool room
01:30:46.480 than a warm room.
01:30:47.740 I can't do work
01:30:48.680 if the room is over 76 degrees.
01:30:52.820 You know,
01:30:53.100 my body just runs hot.
01:30:55.020 Now,
01:30:55.280 you've also noticed
01:30:55.980 that when you do
01:30:56.460 a lot of thinking,
01:30:58.340 you notice you can
01:30:58.980 break a sweat
01:30:59.560 just thinking hard.
01:31:01.020 Has anybody ever done that?
01:31:02.560 Yeah.
01:31:02.860 Your brain actually
01:31:03.700 uses a lot of energy.
01:31:05.720 So,
01:31:06.220 I have a theory
01:31:07.120 that
01:31:08.900 my brain
01:31:11.040 draws more
01:31:12.280 from my body
01:31:13.180 than other people.
01:31:14.700 It could have to do
01:31:15.840 with something
01:31:16.860 about my physical size
01:31:18.160 compared to my brain size.
01:31:20.020 Because I think,
01:31:20.880 I'm not sure about this,
01:31:21.940 but I think my brain size
01:31:23.320 is about the same
01:31:25.360 as any adult male,
01:31:26.380 right?
01:31:27.120 Even if I'm
01:31:27.960 5'8",
01:31:29.620 isn't my brain
01:31:30.600 about the same
01:31:31.320 as a 6'4 guy?
01:31:33.100 Can somebody,
01:31:33.640 is that true?
01:31:35.060 Somebody says no.
01:31:37.540 And maybe it's the wrinkles,
01:31:39.340 not the weight,
01:31:39.980 right?
01:31:40.660 All right.
01:31:41.120 Well,
01:31:41.320 whatever.
01:31:42.180 So,
01:31:42.540 forget about that point
01:31:43.440 because I don't have
01:31:44.000 any backing
01:31:45.000 for that point.
01:31:46.320 But,
01:31:46.880 I feel as though
01:31:48.060 my brain draws
01:31:49.280 more from my body
01:31:50.500 than most people.
01:31:51.900 I just feel like
01:31:52.640 I run hot.
01:31:53.980 And I don't know
01:31:54.920 that that's bad
01:31:55.620 for me.
01:31:58.100 Right?
01:32:00.260 I don't think
01:32:01.040 it's bad for me
01:32:02.120 that I run hot
01:32:03.840 as long as my body
01:32:04.940 is,
01:32:05.460 you know,
01:32:05.780 I'm taking care
01:32:06.460 of all the other
01:32:07.120 lifestyle things.
01:32:08.100 I feel like
01:32:09.180 I could run hot
01:32:10.080 and be optimized.
01:32:13.640 I don't know.
01:32:14.780 Just a,
01:32:15.320 just a theory.
01:32:18.460 Somebody says
01:32:19.120 that's not how it works
01:32:20.140 and I agree.
01:32:20.800 That,
01:32:21.020 that is definitely
01:32:21.600 not how it works.
01:32:22.860 But it's more in a metaphorical sense
01:32:27.380 that I say
01:32:27.860 I run hot.
01:32:29.140 The safer way
01:32:31.440 to say it is,
01:32:32.160 do I have
01:32:36.060 the same risk
01:32:36.920 at 140 blood
01:32:39.740 pressure
01:32:40.100 as somebody else?
01:32:41.540 And I don't think so.
01:32:43.060 I just don't know.
01:32:44.720 My suspicion is
01:32:46.160 I don't,
01:32:46.780 but that could get
01:32:47.560 me killed,
01:32:48.020 of course.
01:32:50.180 All right.
01:32:51.340 That's all for now,
01:32:52.540 YouTube.
01:32:53.440 I will talk to you
01:32:54.320 tomorrow and,
01:32:55.420 of course,
01:32:55.760 on Election Day.