Real Coffee with Scott Adams - February 09, 2023


Episode 2014 Scott Adams: Project Veritas Intrigue, Nord Stream Pipeline Intrigue, And More Fun


Episode Stats

Length

52 minutes

Words per Minute

140.19

Word Count

7,388

Sentence Count

526

Misogynist Sentences

4

Hate Speech Sentences

12


Summary

In this episode of the show, Scott Adams talks about his troubles with his HP printer, and why he doesn't care if it doesn't have Wi-Fi. Plus, a debate about whether Bill Gates is a hypocrite or a hero when it comes to fighting climate change.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of civilization.
00:00:06.840 It's called Coffee with Scott Adams. There's never been a finer moment in all the world.
00:00:12.760 Looks like I forgot to tweet that we were going live, so we might have a low crowd here today on YouTube.
00:00:21.100 But that's okay.
00:00:23.620 Yes, did you notice the printer sarcophagus?
00:00:27.200 Yes. So let me give you an update on my printer, and then we'll do the simultaneous zip.
00:00:35.060 So it turns out that if you destroy your printer, your HP printer, in front of a live stream audience, HP will call you.
00:00:48.740 So I got a call from HP, and here's the good news.
00:00:55.040 So the good news is that HP has a really effective, quick response for stuff like this.
00:01:04.540 So big companies often have like a quick response complaint group for any high-level complaints.
00:01:10.420 So they were on it right away, and they offered to switch out my printer, which the problem was, I bought a printer, assuming that all printers, all high-end printers, I assumed all high-end printers had Wi-Fi.
00:01:27.540 That turned out to be a dumb assumption.
00:01:31.380 So I didn't really look very closely at the specs, because who would make a printer in 2022 or 2023 that wouldn't have Wi-Fi?
00:01:43.380 But it didn't.
00:01:44.260 So while I would say that was a user error, when I looked through the documentation, and it had all kinds of instructions on how to activate your Wi-Fi, and I used the software and it has instructions on how to activate the Wi-Fi,
00:01:59.040 in the tidiest little letters, in the tidiest little letters, it says, asterisk, tiny little letters, asterisk, only if your model has Wi-Fi.
00:02:12.780 And it did not.
00:02:15.240 So they offered to switch it out.
00:02:18.520 I bought it through Amazon, so I'm just going to return it and get one that has Wi-Fi on it.
00:02:23.720 And then we're set.
00:02:25.820 Have you ever heard this?
00:02:28.500 Now, the question you're going to ask me is this.
00:02:31.740 Why do I keep buying HP products if I keep having trouble?
00:02:37.800 Well, first of all, I believe that all printers have mechanical troubles.
00:02:41.840 I don't believe there's any such thing as a printer that doesn't break after a while.
00:02:46.820 Yeah, I know you've got your favorites, but I don't believe it.
00:02:48.900 Secondly, have you ever heard that when a customer complaint is satisfied, they become more loyal customers than if they'd never had a complaint in the first place?
00:03:01.580 Have you ever heard that?
00:03:03.020 It's a pretty well-established thing, especially in restaurants.
00:03:06.680 You know, that's where I heard it first.
00:03:08.600 When somebody takes care of your problem, you feel like there's a reciprocity thing going on.
00:03:15.300 And so they were so nice to me and so accommodating to fix the problem that I don't mind buying their new product.
00:03:24.500 Because like I said, I think it's a good product.
00:03:26.700 It just, all printers are going to have issues after a point.
00:03:31.060 But now, the simultaneous sip.
00:03:33.600 And if you'd like to enjoy this and take it up a notch, a level, a dimension heretofore unknown,
00:03:41.940 all you need is a cup or a mug, a glass, a tankard, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:03:47.520 Fill it with your favorite liquid.
00:03:49.220 I like coffee.
00:03:50.900 Join us now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:03:56.540 It's called the simultaneous sip.
00:03:59.260 It happens now.
00:04:00.520 Go.
00:04:00.760 Go.
00:04:03.600 So here's to you, HP, for having an excellent customer response group.
00:04:10.880 Very well done.
00:04:12.360 Nicely done.
00:04:17.080 But since I don't have a printer at the moment, allow me to go to my notes on my phone here.
00:04:26.540 So, no, I don't want Dilbert ideas.
00:04:32.100 You don't need those.
00:04:34.520 How many of you think that Bill Gates is a hypocrite for flying in a private jet while complaining about climate change?
00:04:41.180 Go.
00:04:42.700 Bill Gates.
00:04:43.600 Hypocrite or not hypocrite?
00:04:45.980 For flying his private jet.
00:04:49.760 Don't care.
00:04:50.420 Well, he was asked by the BBC, and you would not be surprised to hear that he had an answer to this.
00:04:57.020 Now, you can decide if this is a good answer or not.
00:05:00.400 His answer is that he also invests billions of dollars into a company called Climeworks, which sucks carbon out of the air at scale.
00:05:13.080 And he's also invested in a number of other green technology businesses.
00:05:19.000 And so his argument is this.
00:05:21.840 If he can fly around on his jet efficiently and allocate his capital toward saving the planet, that he personally is very, very much, very much net positive.
00:05:35.340 Now, he would acknowledge that the jet uses more fuel than some other mode of transportation.
00:05:41.660 But overall, he's saying, you know, I'm saving the world, but it costs a little to get there.
00:05:47.780 How's his argument?
00:05:48.600 What do you think of that argument?
00:05:52.860 What's wrong with it?
00:05:54.520 You don't like it because it doesn't agree with you, but what's wrong with it?
00:05:58.320 Is that a bad argument?
00:05:59.900 Really.
00:06:01.060 If every person did what he did, we'd be in bad shape.
00:06:05.720 If everybody subtracted more CO2 than they added, we wouldn't be in bad shape.
00:06:14.140 I see you saying it's a weak argument, but what's wrong with it?
00:06:16.840 I'm looking at your comments.
00:06:18.600 Yeah.
00:06:19.760 It's one of those arguments that you don't want to accept.
00:06:24.920 It's like painful to imagine that he's helping in any way.
00:06:31.980 I know, he's so...
00:06:33.240 I think he's got the Gollum problem.
00:06:36.540 Do you know what I mean?
00:06:39.120 There's something about him, his physicality, because he doesn't seem to take care of himself in terms of his health.
00:06:45.940 He's got sort of a Gollum, oh, my precious, kind of a vibe to him.
00:06:53.840 It makes me wonder, if he looked like some healthy, good-looking young person, would people have the same opinion about him?
00:07:02.300 I just have a theory that his physicality is 75% of why people have an opinion about him, and that his wealth would be the rest.
00:07:11.160 And then, of course, your wild allegations, and I don't know if they're true or false, but they're certainly not confirmed.
00:07:23.560 All right.
00:07:25.960 Here's a good example of how the news tells you what to be mad about.
00:07:31.440 Do you believe that you wake up every day?
00:07:33.320 Sorry, I just saw a comment that was funny.
00:07:39.440 Do you believe you wake up every day, and you look at the news, and then you decide on your own, just all by yourself, what things are important and what things you're going to get all worked up about?
00:07:49.820 Is that your, let's say, your worldview?
00:07:53.560 That you look at the landscape of all the possible stories, and then you, on your own, with no help, decide which one's going to be important to you?
00:08:04.940 Nothing like that is happening.
00:08:07.060 No, the news tells you which ones to worry about, and it's almost random if you looked at, you know, if you were to look at it objectively.
00:08:14.520 And here's a perfect example of this.
00:08:16.040 And this was, I stole this from somebody else's tweet, whose name I forgot to write down, so I apologize for that.
00:08:25.260 But it was, how does Joe Biden get away with saying we're getting rid of oil and gas in a decade?
00:08:33.080 So at the State of the Union, Biden said they wanted to get rid of fossil fuels in, you know, 10 years.
00:08:40.800 Now, shouldn't that be the biggest story in the world?
00:08:43.920 Because if he actually accomplished that, it would, like, destroy civilization.
00:08:49.660 And it looks like he's actually trying to do it.
00:08:52.660 Now, how is that not the biggest story in the world, which is the point of the tweet?
00:08:56.760 The point of the tweet is, we've decided it's not.
00:09:01.480 And I don't mean we, I mean the media.
00:09:03.660 So if I had gone to the Fox News website today, and that was the main story, I'll bet a lot of you would be pretty worked up about it.
00:09:15.820 Because, you know, Fox News could have made that the big story.
00:09:19.020 They talked about it a lot.
00:09:20.520 But it wasn't like the big story.
00:09:22.400 If they just decided, that's the big story, a whole lot of people would be worked up about it.
00:09:29.340 But as it is, we just think, oh, old man said strange thing, doesn't make, you know, won't make much difference.
00:09:35.240 But instead, the main story that I saw on the Fox News, it may have changed by now, but the main story I saw on Fox News was that Lia Thomas, the trans athlete swimmer,
00:09:51.240 a few years ago, or not, I don't know when it was, changed in a dressing room, the girls' or women's dressing room, and her penis was showing, and the other women are upset about that.
00:10:08.320 So she showed her penis at the women's locker room, and that was the big story.
00:10:15.520 That was like the main story at Fox News.
00:10:21.240 I know some of you are racist, not racist, some of you are so close-minded that when I say her penis, you're like, wait, wait, you mean his penis.
00:10:33.000 But no, that would be wrong, so don't say that.
00:10:38.660 All right, but do I make my case?
00:10:42.160 Did I make my case?
00:10:44.220 You see it, right?
00:10:45.800 The things we decide to get worked up about, they're completely artificial.
00:10:49.300 They're chosen from this large menu of things we could get worked up about, but I guess this week we didn't.
00:10:58.720 Now, I think whatever is most clickable will always be where our attention goes, right?
00:11:05.640 So the news, obviously the news knows what you're going to click on.
00:11:09.060 Do you think I clicked on the story about Lia Thomas showing her penis to the other swimmers?
00:11:17.620 Do you think I clicked on that story?
00:11:20.120 It has, you know, almost no importance to my life.
00:11:23.500 Yes, I did.
00:11:24.760 Of course I did.
00:11:26.640 Have you met me?
00:11:28.440 Of course I did.
00:11:29.480 Yes, I spent some of my, you know, few remaining minutes on this earth, well, I hope there's more than a few, to look at that.
00:11:39.860 Because it was very clickable.
00:11:42.220 You kind of needed to know.
00:11:44.580 If it had been another story about inflation, possibly the biggest problem, would I have clicked on it?
00:11:52.260 No, probably not.
00:11:53.220 Because, you know, I know what the inflation story says already.
00:11:57.820 So, we're pretty much driven by clickability.
00:12:02.040 Here's a controversial little idea from, I saw this on a Tracy Follows tweet, which is a great name for a Twitter account.
00:12:13.540 Her actual last name is Follows.
00:12:16.820 Tracy Follows.
00:12:17.920 All right, anyway, Tracy tweets that a 2013 study on social conformity found that extroverts are more willing to go along with the opinion of the majority, even if it's wrong.
00:12:30.900 Even if it's wrong.
00:12:32.000 I don't know if they know it's wrong when they go along with it.
00:12:34.380 That's a little unclear.
00:12:36.280 And then the introverts are more likely to buck the system.
00:12:39.860 Now, consider the mandates and vaccinations and masks and all that stuff, which are not vaccinations, by the way.
00:12:50.540 Let me clarify.
00:12:54.080 Do you think that the fact that the people you hear the most from are all extroverts?
00:13:01.800 Because what do you need to be to run for government?
00:13:06.700 Members of Congress, are they not extroverts?
00:13:09.860 So we have a government of extroverts.
00:13:12.960 So we basically have a government of people who are selected for a trait which makes them go along with the majority.
00:13:21.580 Now, what about the pundits and the TV personalities?
00:13:25.700 Extroverts or introverts?
00:13:27.840 Well, you could have some introverts who are on TV, but probably extroverts, a lot of extroverts.
00:13:32.960 So what happens when you've got a situation where there's social conformity pressure and then you as a member of the public are trying to get some insight from all the experts and the government and the pundits?
00:13:49.680 But if they're heavily extroverted, because there's a selection thing, right?
00:13:57.020 The people who are famous usually try to be famous.
00:14:00.600 So there's something extroverted about them.
00:14:02.580 It kind of makes you understand how people, you know, how the top, anyway, the people in charge, it really makes you understand how they could so quickly conform, if it's true that extroverts conform.
00:14:18.940 Now, it does make sense.
00:14:20.960 Here's why this makes sense.
00:14:23.000 The reason extroverts are extroverts is they like putting themselves in front of people and getting a good reward for that.
00:14:31.200 Like, they get a good feeling literally in their body, in their mind, from exposing themselves to people.
00:14:39.700 But they wouldn't want to...
00:14:42.940 Yeah, they like getting attention.
00:14:45.760 So, you'd have to think that that's part of what happened.
00:14:53.140 And I've also speculated that maybe 10% of the people who were against vaccinations, which are not vaccinations.
00:15:02.860 How many times do I have to tell you?
00:15:04.780 They're just COVID shots.
00:15:06.020 But how many of those do you think were mostly because they were afraid of needles, and then they reasoned backwards from their fear of needles?
00:15:20.760 Everything that I know about psychology assumes that there are some people in that group.
00:15:26.080 Because that's the way the brain works.
00:15:27.780 The brain works, you know, you start with your irrational fear, and then you work backwards to your rationalization of why it all made sense in the first place.
00:15:36.440 Yeah.
00:15:36.820 You know, it's not...
00:15:37.720 I don't think it's 90%.
00:15:39.140 But it might be 1%.
00:15:40.920 It might be 10%.
00:15:42.240 Yeah.
00:15:43.220 But I wouldn't ignore it.
00:15:45.200 All right.
00:15:45.580 The big fun show on the news is the Twitter hearings in Congress.
00:15:52.200 And, oh, my God, these are brutal.
00:15:56.520 Have you watched any of the clips of Yoel Roth sitting there and just having his life destroyed by the members of Congress?
00:16:06.360 It is brutal.
00:16:08.000 Oh, my God.
00:16:10.000 He's being accused of being, let's say, pedophile-friendly.
00:16:14.640 And I'm not going to say that, right?
00:16:17.720 This is not my interpretation of anything he did.
00:16:21.240 My interpretation is that he did some academic stuff on some very controversial stuff, but it's being mischaracterized.
00:16:31.680 So in my personal opinion, he's being mischaracterized.
00:16:37.840 Now, how I feel about it, I'm having trouble sorting it out.
00:16:44.460 Because I also felt like I was being shadow banned.
00:16:48.480 I don't know if I was, but I felt like I was.
00:16:51.240 And I felt that people who at least agreed with me in some ways were being shadow banned, and I hated it.
00:16:58.100 And therefore, I had, you know, bad feeling about anybody who would have been involved in doing it.
00:17:02.240 So the animal part of me just wants to, you know, hate this Yoel Roth guy and, you know, grind him into the dirt for being on the other side from me and people I like.
00:17:17.600 On the other hand, I'm a big fan of innocent until proven guilty, and there's a lot of accusations thrown around about this guy that I'm not comfortable with coming from government officials.
00:17:32.000 You know what I mean?
00:17:35.740 Like, it's one thing for, you know, some idiot to say, oh, I think you're pedo-friendly or something.
00:17:41.820 It's another thing to have a member of Congress say it in public about you while you're sitting there on camera.
00:17:48.500 I thought it was too far.
00:17:49.480 I thought it was too far.
00:17:54.000 But, you know, the animal in me wants revenge.
00:18:07.820 So I'm like... the socialized human in me says that's too far, because that's a little bit too much guilty until proven innocent.
00:18:07.820 When it comes from a sitting member of Congress, it just feels like the powerful hitting the less powerful.
00:18:16.440 But just like most of you, I have that revenge lust as well, and I'm a little uncomfortable with my own feelings about it.
00:18:27.080 I want the revenge, but I don't think that I wasn't comfortable with it, just wasn't comfortable with it, which is not defending him.
00:18:36.520 All right, so I'm not defending anything he did.
00:18:40.460 I'm just saying I wasn't comfortable with it, not my best feeling when I watched it.
00:18:46.560 But on the other hand, my lust for revenge was somewhat satisfied, and I'm not happy about it.
00:18:54.160 Like, you can't be proud of that, but it's true.
00:19:00.340 So Yoel had to answer for having called the Trump administration Nazis in a tweet.
00:19:05.720 He said he regretted the language.
00:19:11.580 But here's the fascinating part.
00:19:15.620 I read an opinion piece from, who is it?
00:19:20.200 What's his name?
00:19:22.420 Oliver Darcy on CNN.
00:19:24.160 So I watched those Twitter files and the Matt Taibbi Twitter files, Twitter threads and all that,
00:19:32.000 and I came away with, you know, a certain set of beliefs about how bad the Twitter files were.
00:19:38.760 In other words, you know, what bad behavior they showed.
00:19:42.240 But then you read Oliver Darcy on CNN, and he says there's a Republican distortion field,
00:19:47.820 and he said there's no evidence that FBI or government influenced Twitter on speech.
00:19:55.420 How many would agree with that characterization?
00:19:58.920 That of all the Twitter files, there's no evidence of FBI or government influencing Twitter on speech.
00:20:05.280 That feels like exactly the opposite of what we saw, isn't it?
00:20:11.120 So apparently the CNN, or at least some members of the left, are just going to say nothing happened.
00:20:19.840 So we're in this weird situation where the thing in the news is irrelevant.
00:20:29.300 Because the people on one side will say, this thing in the news is super important, and look at all this evidence.
00:20:36.780 And then the other side will say, I didn't see anything.
00:20:40.420 I'm looking right at it.
00:20:42.060 So you take January 6th.
00:20:43.840 January 6th, the Democrats say, look at all this evidence of horrible things.
00:20:49.080 And everybody on the right says, I'm looking at the same stuff.
00:20:51.680 I don't see anything.
00:20:52.860 Where's your insurrection?
00:20:55.000 They go, right here, looking at all the...
00:20:56.940 I'm looking at it, too.
00:20:58.440 I don't see any.
00:21:00.080 And then it's reversed.
00:21:01.760 The situation is reversed.
00:21:03.220 Now the one side is saying, hey, look at all these FBI, Twitter files, Democrat operatives,
00:21:09.460 like Jim Baker and all this, influencing.
00:21:11.840 And the Democrats are literally saying, I'm looking at the same stuff.
00:21:15.280 I don't see it.
00:21:16.600 I don't know what you're talking about.
00:21:18.520 No, look, look, here and here and here on the laptop.
00:21:21.640 Here's Hunter's own words, and it's all connected.
00:21:24.720 Uh-huh.
00:21:25.160 I don't see it.
00:21:26.580 No, it's right here.
00:21:28.320 It's right here, right here.
00:21:31.640 I don't see it.
00:21:33.780 That's the world we live in.
00:21:35.740 And by the way, it does work both ways.
00:21:39.420 It works both ways.
00:21:40.820 The other side, whatever the other side is, literally can't see the evidence.
00:21:48.040 Or they see evidence that doesn't exist.
00:21:51.560 So I don't know how much is lying anymore.
00:21:55.460 Do you?
00:21:56.900 Because there's definitely a cognitive phenomenon, which is convincing people that their side is right.
00:22:05.860 Both sides.
00:22:06.540 But, you know, we're at the point where the actual facts of the story don't have any role whatsoever on anything.
00:22:18.340 You know, again this morning, I'm arguing with somebody on Twitter whether long COVID should have been part of any decision-making.
00:22:31.260 And I'm thinking, really?
00:22:36.140 There's actually somebody who says that long COVID definitely doesn't exist.
00:22:40.280 Now, maybe it doesn't.
00:22:46.660 Maybe I'm hallucinating.
00:22:48.740 Maybe the people who say the anecdotal reports and the VAERS report are telling you something important,
00:22:56.760 but all of the anecdotal reports of long COVID are telling you nothing.
00:23:00.160 And I don't know how you can hold both of those things in your head.
00:23:05.620 I hold both of them in my head equally.
00:23:07.580 I say the VAERS report is not proof, but, man, there's so much on it, that's a pretty strong signal you better take that seriously.
00:23:18.400 But I also say all the personal individual reports from credible people who had long COVID for months anyway,
00:23:26.460 I take those seriously, but also it's not a randomized controlled test.
00:23:31.840 So why would you say that, you know, VAERS you can consider, but long COVID, all of those anecdotal reports you don't consider?
00:23:42.860 I don't know.
00:23:43.440 To me, the whole world is going crazy, and we're just seeing what we want to see at this point.
00:23:48.500 Like always, I guess.
00:23:50.180 Like always.
00:23:52.540 Well, I didn't get canceled this week, and I'm a little surprised.
00:23:58.900 And I'm feeling a shift in the woke stuff.
00:24:05.800 Does anybody else feel it?
00:24:07.520 Does anybody feel that we have reached peak wokeness, and it's either plateauing or starting to decrease?
00:24:17.460 Because let me tell you what I said that I feel like I would have been canceled four or two years ago.
00:24:23.080 And I got basically no pushback at all.
00:24:25.700 It was this tweet.
00:24:28.560 I already told you about this tweet, but the update is that I'm not canceled.
00:24:33.480 So I tweeted,
00:24:34.780 I wonder if any sober person with an IQ over 110 has ever been killed by police after resisting arrest during a traffic stop.
00:24:45.080 Now, I feel like I would have been canceled two years ago for that.
00:24:48.080 But today, I don't think anybody wants to touch it, because I'm perfectly willing to have that conversation.
00:24:58.580 I think stupid people and inebriated people get killed by police for resisting arrest.
00:25:04.480 And if you think that has anything to do with race, I don't know what, you show me a stupid, inebriated person, male, almost always male, and I'll show you somebody who's got a real risk with police.
00:25:19.180 Show me somebody who's smart and sober, and I'll show you somebody who could get into trouble, but they're not going to be murdered by police.
00:25:31.560 I mean, unless it was just the weirdest mistaken identity situation, which is a whole different story.
00:25:38.420 But let's stop pretending that people being killed by police while resisting arrest is anything about anything except stupid people.
00:25:49.220 It's just stupid people.
00:25:51.200 And if you want to tell me, no, no, Scott, it's a problem with the black community, then I say to you, why are you infantilizing black Americans?
00:26:00.060 Do you know when I was smart enough not to pick a fight with a guy with a gun?
00:26:04.220 Three years old.
00:26:06.780 Seriously.
00:26:07.180 At three years old, I already knew everything I would know for the rest of my life on the topic of whether I should pick a fight with somebody who had a gun.
00:26:17.160 Now, pick a fight means resisting arrest in this case, right?
00:26:21.580 Because I'm not a moron.
00:26:23.640 So why do we think that the black community can't figure that out?
00:26:29.560 Do you think that black people somehow have, like, can't figure out that you don't pick a fight with somebody with a gun?
00:26:35.240 Let's just treat it what it is.
00:26:37.220 It's not about race.
00:26:38.060 It's about stupid people and inebriated people, or usually both.
00:26:43.100 Yeah, stupid and inebriated.
00:26:44.940 And we should just call it what it is.
00:26:46.900 Stupid and inebriated people of all types.
00:26:50.220 White, black, Hispanic, doesn't matter.
00:26:53.020 If you're stupid and inebriated, you're at great risk.
00:26:56.080 Everybody else?
00:26:56.900 Not so much.
00:26:57.480 Not so much.
00:27:00.420 But back to my original point.
00:27:04.880 Do you think I could have said what I said just now, two years ago, with no blowback?
00:27:10.640 I've got no response.
00:27:12.560 No blowback to that.
00:27:16.060 It does feel different.
00:27:20.100 Yeah.
00:27:20.920 It does feel different.
00:27:21.840 And I know some of you want to go racial about it, go, oh, blah, blah, blah, percentages, this.
00:27:30.680 What does that have to do with anything?
00:27:33.600 Like, to me, it's just a useless statistic if you find any, you know, well, anyway, that's enough on that.
00:27:42.440 So there's some kind of intrigue with Project Veritas.
00:27:45.340 And we don't know the details of it, but the reporting is that James O'Keefe, the founder and head of Project Veritas,
00:27:54.380 apparently he has a board, board of directors of his organization,
00:27:57.860 and they have removed him of authority and put him on some kind of administrative leave.
00:28:03.680 Now, here's the first thing you should know about this story.
00:28:07.360 We don't know anything about this story.
00:28:09.220 It's disturbing, so we want to keep an eye on it, because the timing of it is suspicious.
00:28:18.300 So right after Project Veritas gets what looks like its biggest scoop, you know, the Pfizer employee who said some stuff on hidden camera,
00:28:28.760 that right after that, the board of directors moves the guy who has, you know, got the big win on that story.
00:28:34.960 Maybe, maybe, but until you hear why he was removed, don't you think you ought to hold your opinion on this a little bit?
00:28:46.580 I mean, you should be very curious.
00:28:48.880 So your curiosity should be as a citizen who's concerned that another citizen may be mistreated.
00:28:56.560 We don't know.
00:28:57.840 Don't know.
00:28:58.480 Well, but let's keep an eye on this one, because there's certainly cause for concern.
00:29:05.600 But it's way too soon to say that something that shouldn't have happened happened.
00:29:11.420 We don't know what he's accused of.
00:29:13.500 The reporting is that he's accused of being hard to work for.
00:29:19.400 What does that mean?
00:29:22.360 Being hard to work for?
00:29:23.580 I don't know what the story is, but I have a feeling that hard to work for is not going to be the reason that we finally figure out in the end.
00:29:37.480 Did I see the unarmed guy on his knees in the hotel hallway get murdered by cops?
00:29:42.940 Yeah.
00:29:45.180 But in every case, there's a special situation.
00:29:48.580 There was something going on there, and it might have been the cops themselves.
00:29:52.240 I don't know what the situation was.
00:29:54.000 But those are special situations.
00:29:56.840 It's not like there's some...
00:29:59.620 And I'm not sure that that was...
00:30:01.580 Actually, that wasn't the case of resisting arrest, right?
00:30:04.680 That was the opposite of resisting arrest.
00:30:07.900 So he was a case of someone who got killed not resisting arrest, right?
00:30:14.760 So my entire point was resisting arrest.
00:30:17.980 He was the opposite.
00:30:18.880 And he was drunk, somebody says.
00:30:21.260 So my point was, if you're smart and not inebriated, how often do you get killed?
00:30:28.880 Not often.
00:30:30.740 All right, so let's keep our mind open on this Project Veritas.
00:30:35.240 I don't know what's going on there.
00:30:37.020 All right.
00:30:37.640 Remember when I told you that when we heard that there was a story that the Nord Stream pipeline was blown up by Americans?
00:30:45.260 Do you remember I told you, I'm not so sure about this story?
00:30:50.960 Do you remember my skepticism at the initial reporting?
00:30:54.880 Well, so apparently there's this one American journalist, Seymour Hersh, who's the one who believes he has this inside information that U.S. divers planted their explosives during a NATO exercise, blah, blah, blah.
00:31:11.080 And then I see Mike Lee, senator from Utah, say, he tweeted, I'm troubled that I can't immediately rule out the suggestion that the U.S. blew up Nord Stream.
00:31:25.580 I checked with a bunch of Senate colleagues.
00:31:28.800 Among those I've asked, none were ever briefed on this.
00:31:32.220 If it turns out to be true, we've got a huge problem.
00:31:35.440 So now we've got a problem.
00:31:38.340 If it's true, Congress should have been part of the decision-making.
00:31:45.440 Or at least they should know about it now.
00:31:48.120 And apparently they don't.
00:31:52.400 Apparently they don't.
00:31:53.420 So here's, I saw in the comments, is Seymour Hersh reliable?
00:31:59.860 Is there any history of any big stories he's gotten wrong?
00:32:09.120 He's the My Lai Massacre guy?
00:32:12.180 He goes back that far, huh?
00:32:16.420 Long history of good stories.
00:32:19.200 So maybe.
00:32:20.080 But what do you suppose his source was, or sources?
00:32:24.860 Because don't you think his sources were military spooks?
00:32:31.460 See, here's my problem.
00:32:34.100 If there was a real way that the thing was blown up,
00:32:38.180 and our military didn't want anybody to know the real way,
00:32:43.040 wouldn't they leak a story about sort of a traditional way?
00:32:46.420 So here's what I'm suspecting.
00:32:50.060 I'm wondering if maybe we blew it up with some,
00:32:54.700 let's say we had a submarine that could do that,
00:32:57.580 or some other kind of asset that was not divers with bombs.
00:33:02.140 We might want to put out the story that, yes, we did it.
00:33:05.820 It was us.
00:33:06.600 But we'd fool them about what technology we use,
00:33:10.500 because maybe that's a secret.
00:33:12.160 That's a possibility.
00:33:13.580 So that would be one reason to plant a false story,
00:33:15.980 so that the other side doesn't know what kind of assets you have.
00:33:20.860 Maybe.
00:33:21.700 You know, underwater drone.
00:33:23.280 Yeah, maybe we have submarine drones,
00:33:25.320 and we don't want anybody to know about it.
00:33:26.700 The other possibility is that we're sending some kind of a message.
00:33:34.120 Let's say it was an intentional leak or something like that.
00:33:38.640 Do you think it would be a message to Putin
00:33:41.460 to make sure that he knows how serious NATO is?
00:33:48.340 Because if you're Putin,
00:33:50.760 and you imagined it blew up, I don't know,
00:33:55.060 on its own or something,
00:33:56.560 that doesn't really show you
00:33:58.560 that the other side is serious about this war.
00:34:01.560 But at some point, maybe you want to tell Russia,
00:34:04.020 oh, yeah, we did this,
00:34:05.320 and we would do things like this again.
00:34:08.060 I don't know.
00:34:08.780 Maybe it's a way to show how serious we are.
00:34:14.560 Message to the Germans?
00:34:15.980 Yeah.
00:34:16.740 Now, do you remember when the Nord Stream was,
00:34:20.420 first blew up,
00:34:21.800 and the U.S. said it was Russia blowing up its own pipeline?
00:34:27.280 Do you remember my reaction to that?
00:34:30.280 About Russia blowing up its own pipeline?
00:34:33.460 I called bullshit to that on the first minute.
00:34:37.060 I think Tucker Carlson did as well.
00:34:39.640 I think other people did as well.
00:34:41.240 But as soon as I heard that Russia blew up its own pipeline,
00:34:43.740 that wasn't even slightly believable.
00:34:46.920 There wasn't a single thing about that
00:34:49.040 that was in the same zip code
00:34:51.680 of something you should believe.
00:34:53.640 It was just ridiculously unbelievable.
00:34:56.680 It was the opposite of all common sense.
00:34:59.420 And sure enough,
00:35:00.740 exactly like you thought, it was a lie.
00:35:03.900 All right.
00:35:04.780 I'm very curious about Matt Gaetz
00:35:07.480 and his take on Ukraine.
00:35:09.800 So he's pretty much stopped the funding for Ukraine
00:35:14.060 and let's figure out how to end this thing.
00:35:17.880 And many of his colleagues are on the opposite side.
00:35:22.820 And I look at this as a long-term Matt Gaetz political play.
00:35:27.860 And I think he nailed it.
00:35:31.320 And here's why.
00:35:33.020 Is there much chance that at the end of the,
00:35:35.660 if there is an end,
00:35:37.260 of the Ukraine-Russia situation,
00:35:39.640 is there any chance at the end of it
00:35:41.340 citizens of the United States will be glad we did it?
00:35:46.180 What do you think?
00:35:47.640 Because it feels like this is just Iraq 2.0
00:35:50.700 or Vietnam 2.0.
00:35:54.420 It feels like yet another war that was optional.
00:35:59.040 And it feels like,
00:36:00.620 here's my big insight on this.
00:36:04.360 It feels like the only person who's playing it right
00:36:06.880 politically is Matt Gaetz.
00:36:08.740 Now, he might be also playing it right
00:36:10.560 for what's good for the country.
00:36:12.500 But just talking about politics for a moment,
00:36:15.320 there might be a time in our not-too-distant future
00:36:18.920 where Matt Gaetz is running for president.
00:36:22.440 That's not much of a stretch, right?
00:36:24.640 You know, you kind of assume
00:36:25.760 you're sort of angling toward the presidency at one point.
00:36:28.860 And he might be running against people
00:36:30.800 who voted for funding the Ukrainian war
00:36:35.260 over and over again.
00:36:37.380 He might be running against people
00:36:39.260 who are all unqualified
00:36:41.500 because by then it's very likely
00:36:44.040 the public will have turned on all this funding.
00:36:46.240 Even if we get something like a good,
00:36:49.440 that feels like maybe a good-ish outcome,
00:36:52.260 you know, of blunting Russia or whatever,
00:36:54.460 I don't think the public's going to see it that way.
00:36:56.840 I think the public's going to see all the money we spent
00:36:58.980 and that we didn't get anything from it.
00:37:01.540 And I think that Gaetz is going to be
00:37:03.600 one of the only purebloods.
00:37:07.220 So I'm using that term ironically and jokingly.
00:37:11.240 But he might be the only one who said
00:37:13.760 I was against the Ukraine war from the beginning.
00:37:17.080 And then I think people are going to say,
00:37:19.020 oh my God, my inflation is going crazy.
00:37:22.080 What could we have done to avoid this?
00:37:24.460 Well, one thing you could have done
00:37:26.900 is what Matt Gaetz said,
00:37:28.100 don't spend all your money in Ukraine
00:37:29.620 when there's nothing we could win.
00:37:30.980 And it does seem more and more like
00:37:35.020 the Ukraine war is an optional war for profit.
00:37:40.060 That's what it looks like.
00:37:42.060 From my perspective,
00:37:43.800 it looks like the military-industrial complex
00:37:48.240 is primarily behind it.
00:37:50.920 Some of the, you know,
00:37:52.020 maybe the Biden crime family has something to hide,
00:37:54.440 don't know.
00:37:55.540 But it looks like a financial war.
00:37:58.300 And certainly there might be some energy companies
00:38:02.180 who want to take Russia out of the energy market.
00:38:06.560 So doesn't it just feel like money?
00:38:09.600 Yeah.
00:38:10.280 I've got a feeling that in the end,
00:38:13.100 so this is very much like anybody
00:38:14.600 who just bets against the government.
00:38:16.900 If the only thing you did is
00:38:18.280 I'm going to bet against the government,
00:38:20.220 you're going to have a pretty good track record.
00:38:23.000 So he's betting against the government
00:38:24.880 and I feel like it's going to pay off.
00:38:26.800 You know, it feels like foreshadowing of,
00:38:30.120 I don't know, eight years from now
00:38:31.380 or whenever he's running for president.
00:38:34.860 He's going to be tough.
00:38:38.360 All right.
00:38:41.440 What?
00:38:43.840 So here's some of the things that
00:38:45.360 Oliver Darcy argues on CNN,
00:38:49.240 an opinion piece.
00:38:49.980 He's got a quote from James Baker,
00:38:56.320 who is the,
00:38:57.680 was he the head of legal at Twitter?
00:39:01.600 And he says,
00:39:02.580 this is what Baker said,
00:39:04.020 quote,
00:39:04.260 I am aware of no unlawful collusion with
00:39:08.040 or direction from
00:39:09.660 any government agency or political campaign
00:39:12.660 on how Twitter should have handled
00:39:14.400 the Hunter Biden laptop situation.
00:39:17.260 James Baker, Twitter's former deputy counsel.
00:39:19.980 Oh, he was a deputy counsel, okay.
00:39:22.500 Told the committee while under oath.
00:39:25.240 Now, here's what Oliver Darcy did not mention.
00:39:29.820 Who James Baker is.
00:39:33.000 Can you imagine quoting James Baker
00:39:35.800 about what the FBI did or did not do
00:39:38.860 without mentioning
00:39:40.140 James Baker's role with the FBI
00:39:45.420 and his prior role with the FBI
00:39:47.580 and his connection with the Democrats?
00:39:50.680 He just leaves that out.
00:39:53.060 Don't you think that's, like,
00:39:54.840 the most important thing to mention
00:39:56.680 when you have a quote from James Baker
00:39:59.620 about the Twitter files?
00:40:01.580 How do you leave that out?
00:40:02.720 So it's pretty clear that
00:40:05.860 the left wants to make the Twitter files
00:40:09.180 all go away.
00:40:12.020 Oh, and then we hear that Trump,
00:40:15.000 so let me not do what
00:40:17.280 I accuse others of doing,
00:40:20.000 showing one side of the story.
00:40:21.300 So for the first time,
00:40:24.280 I was hearing that
00:40:25.000 the Trump White House
00:40:26.200 attempted to censor
00:40:28.040 Chrissy Teigen's tweets
00:40:32.740 that were insulting about Trump.
00:40:35.260 But he was ignored.
00:40:37.040 Now, the fact that he was ignored,
00:40:39.500 that probably should be
00:40:41.500 the end of the story.
00:40:42.860 Because it's not much of a story
00:40:44.260 if the government asks Twitter
00:40:45.640 to do something
00:40:46.340 and Twitter ignores it.
00:40:47.400 It's only when they act
00:40:49.420 that it's a story.
00:40:51.560 Ignoring it would be
00:40:52.380 the right thing to do.
00:40:54.220 And I don't have a problem
00:40:55.560 with anybody asking.
00:40:57.500 I don't have a problem
00:40:58.520 that the Trump White House
00:40:59.820 asked for an insulting tweet
00:41:02.420 to be taken down.
00:41:03.520 Because the tweet that was,
00:41:04.720 I don't know what
00:41:05.320 Chrissy Teigen's tweet was,
00:41:07.120 but clearly whatever it was
00:41:08.660 had nothing to do with
00:41:09.900 informing the public.
00:41:12.480 Would you agree?
00:41:14.360 Like, whatever Chrissy Teigen's tweet was,
00:41:16.660 it wasn't important.
00:41:18.900 So, you know,
00:41:20.560 I don't see anything there.
00:41:22.640 But, it is part of the story,
00:41:25.000 so we won't ignore it.
00:41:26.740 Because that would be unfair.
00:41:29.280 All right.
00:41:34.120 That, ladies and gentlemen,
00:41:36.200 is about all that's going on.
00:41:37.560 Is there anything else going on?
00:41:38.700 It's kind of a slow news day.
00:41:40.720 You know, once that balloon
00:41:41.920 was shot down,
00:41:42.780 we didn't have much to talk about.
00:41:43.980 You saw the pictures of
00:41:46.640 our divers picking that balloon
00:41:49.140 out of the ocean?
00:41:51.820 Makes you wonder
00:41:52.480 how they found it.
00:41:55.640 How did they find that
00:41:56.920 in the ocean?
00:42:00.160 Like, was that easy?
00:42:02.360 Maybe, oh, oh,
00:42:04.000 satellites,
00:42:04.480 they probably tracked it
00:42:05.240 all the way down.
00:42:06.520 Yeah, they probably tracked it
00:42:07.700 all the way down.
00:42:10.300 Whoa, big spender.
00:42:11.500 Somebody had two eggs
00:42:12.560 this morning.
00:42:13.800 Look at you.
00:42:17.360 Good for you.
00:42:18.880 Distraction balloon.
00:41:22.100 It was a balloon payment.
00:42:22.100 Yeah.
00:42:26.060 So, Fetterman went in
00:42:27.540 to the hospital,
00:42:29.160 but we don't know anything
00:42:29.840 about that, right?
00:42:30.560 He was lightheaded.
00:42:32.920 Is there any news
00:42:33.820 on Senator Fetterman?
00:42:37.880 I haven't seen anything yet.
00:42:41.500 All right.
00:42:45.520 He was lightheaded.
00:42:48.320 Digital IDs?
00:42:50.820 Thoughts on digital?
00:42:51.800 Is there something
00:42:52.340 about digital IDs
00:42:53.380 that's happening now?
00:42:55.400 So,
00:42:57.460 oh, the Turkey earthquakes?
00:43:00.980 Yeah.
00:43:02.440 Yeah, there's not much
00:43:03.320 to say about earthquakes
00:43:04.240 in other countries
00:43:05.000 except that they're tragic
00:43:06.220 and we wish them the best.
00:43:07.700 The only thing
00:43:08.800 I would add to that
00:43:09.460 is natural disasters,
00:43:12.620 the earthquakes
00:43:13.100 and the floods and stuff,
00:43:14.420 these are the times
00:43:15.600 when we can most impact
00:43:18.760 world events.
00:43:21.760 Because I feel like
00:43:22.780 when you give somebody
00:43:23.720 aid in a disaster,
00:43:25.820 I feel like they don't forget it.
00:43:28.200 Am I wrong?
00:43:29.740 Like, one of the reasons
00:43:30.600 that France and the U.S.
00:43:32.880 will probably, you know,
00:43:33.980 be allies forever
00:43:35.420 is that they each
00:43:36.800 helped each other out
00:43:37.700 in wars.
00:43:39.640 That goes a long way.
00:43:41.760 And if we wanted,
00:43:43.140 you know,
00:43:43.960 Turkey or Syria
00:43:45.140 to be,
00:43:45.760 Syria's a little more
00:43:47.420 complicated,
00:43:48.500 but if we wanted Turkey
00:43:49.700 to be, you know,
00:43:50.620 on our side,
00:43:52.500 helping them
00:43:53.080 during earthquakes
00:43:54.360 is a good way
00:43:55.020 to do it.
00:43:56.100 And right now,
00:43:56.700 the Russians are doing it
00:43:58.200 and doing what they can.
00:44:03.140 How about Boebert?
00:44:05.200 So Boebert
00:44:06.080 at the Twitter files
00:44:07.560 was going hard
00:44:09.520 at the Twitter people.
00:44:12.980 Yeah, she was,
00:44:13.740 she got shadow banned
00:44:15.360 for a joke.
00:44:18.020 Literally a joke.
00:44:19.600 She got shadow banned
00:44:20.520 and I guess that's well proven
00:44:21.800 at this point.
00:44:22.380 Yeah, it's hard
00:44:28.460 to help Syria
00:44:29.100 because
00:44:29.620 the Russian influence
00:44:33.720 there plus
00:44:34.460 it's sort of
00:44:35.380 a messy situation.
00:44:37.020 Oh yeah,
00:44:37.420 so I saw that
00:44:38.500 it looked like
00:44:39.100 a DJ used AI
00:44:40.480 to create
00:44:42.220 a new album
00:44:44.220 by Eminem
00:44:45.120 or a new song
00:44:45.820 by Eminem
00:44:46.460 that Eminem
00:44:47.220 had nothing to do with.
00:44:48.320 It was just AI
00:44:48.940 created it.
00:44:50.160 And he played it live
00:44:51.360 at an event
00:44:51.980 and people liked it.
00:44:54.280 So,
00:44:55.440 here we are.
00:44:57.360 Here we are.
00:44:58.640 Now,
00:44:59.320 I'm still holding on.
00:45:01.820 I'm still holding on
00:45:03.020 by my fingernails
00:45:04.000 because AI
00:45:05.420 still can't do humor.
00:45:08.140 But I think
00:45:08.640 that's all that's left.
00:45:09.920 I think AI
00:45:10.540 can do a hit song.
00:45:12.360 I think AI
00:45:12.980 can do music.
00:45:14.260 I think it can
00:45:14.900 write a book.
00:45:18.040 I think it can
00:45:18.920 do fiction.
00:45:20.240 I think it can
00:45:20.940 make a movie.
00:45:22.660 But it can't
00:45:23.600 do humor yet.
00:45:25.280 Can't do humor yet.
00:45:27.560 Poetry,
00:45:28.140 nobody cares about.
00:45:30.980 Yeah.
00:45:31.780 But what happens
00:45:32.400 if,
00:45:33.040 let me ask this,
00:45:34.460 what happens
00:45:35.040 if AI
00:45:35.860 learned how
00:45:37.360 to make humor
00:45:38.060 and got really good
00:45:39.180 at it,
00:45:39.600 like better
00:45:40.040 than humans?
00:45:41.720 Would it be,
00:45:42.360 would it make us
00:45:42.960 laugh all day long?
00:45:46.320 It can't write
00:45:47.160 Kashmir,
00:45:47.720 you're right.
00:45:48.720 It can't.
00:45:49.360 The one thing
00:45:52.060 that humans
00:45:53.740 still have
00:45:54.520 is we have
00:45:55.280 our physical
00:45:57.120 sensors
00:45:57.820 for when
00:45:59.480 something works
00:46:00.220 or doesn't
00:46:00.660 work for a human.
00:46:02.680 And so far
00:46:03.260 the AI
00:46:03.780 can't reproduce
00:46:05.140 that because
00:46:05.580 it has no
00:46:06.040 sensors in the
00:46:06.840 world that are
00:46:08.120 analogous to
00:46:08.820 our heart
00:46:09.820 and circulation
00:46:10.680 and that
00:46:13.300 stuff.
00:46:13.940 Couldn't get
00:46:16.160 any worse
00:46:16.580 for movies?
00:46:18.320 Yeah.
00:46:26.620 Neuralink
00:46:27.200 will give
00:46:27.500 the bots
00:46:27.920 an interface?
00:46:30.260 Yeah.
00:46:30.780 Yeah.
00:46:36.060 And I don't
00:46:36.820 know exactly
00:46:37.380 why AI
00:46:38.100 can't do humor.
00:46:39.180 It could be
00:46:39.640 because I haven't
00:46:40.140 taught it yet.
00:46:43.940 Tied to a chair
00:46:49.120 needs to be
00:46:49.620 a Netflix
00:46:50.020 series.
00:46:50.640 Yeah,
00:46:50.820 I agree.
00:46:51.780 All right,
00:46:52.120 I'm just
00:46:52.400 looking at your
00:46:52.860 comments here
00:46:53.440 for a moment
00:46:53.900 because we've
00:46:54.340 got not much
00:46:56.040 else going
00:46:56.680 on.
00:46:58.940 Would AI
00:46:59.620 cancel itself?
00:47:02.200 Oh,
00:47:02.700 Burt Bacharach
00:47:03.320 died?
00:47:13.400 Give us a
00:47:13.400 new reframe.
00:47:14.500 What would
00:47:15.060 you like a
00:47:15.480 reframe for?
00:47:20.040 Does anybody
00:47:20.680 have any
00:47:21.120 particular
00:47:21.600 problems you
00:47:22.180 want me to
00:47:22.840 fix?
00:47:26.640 How people
00:47:27.300 see things
00:47:28.000 in the morning?
00:47:29.420 Well,
00:47:30.020 that's not a
00:47:30.520 reframe issue.
00:47:31.160 How to
00:47:34.440 study harder
00:47:35.080 or more
00:47:35.540 effectively?
00:47:36.680 I don't
00:47:37.160 know if
00:47:37.380 that's a
00:47:37.700 reframe.
00:47:42.520 Yeah,
00:47:43.120 hoarding.
00:47:46.360 Well,
00:47:46.980 so here's
00:47:47.440 your
00:47:47.640 reframe
00:47:48.060 for hoarding,
00:47:48.940 but this
00:47:49.840 is just
00:47:50.220 one to
00:47:50.700 try.
00:47:51.920 So I
00:47:52.240 think
00:47:52.460 hoarding
00:47:53.020 and OCD
00:47:53.620 and the
00:47:54.400 behaviors
00:47:54.820 that you
00:47:55.100 want to
00:47:55.300 stop,
00:47:56.100 you can
00:47:56.760 do it
00:47:57.020 with a
00:47:57.400 fake
00:47:57.780 because.
00:47:59.660 So the
00:48:00.300 reason that
00:48:00.820 you're doing
00:48:01.200 your behavior,
00:48:01.960 whether it's
00:48:02.340 an OCD,
00:48:03.080 repetitive
00:48:03.400 behavior,
00:48:03.980 or hoarding,
00:48:05.000 these are
00:48:05.840 irrational
00:48:06.340 reasons which
00:48:07.100 you understand
00:48:07.720 to be
00:48:08.060 irrational.
00:48:09.020 If you
00:48:09.520 try to
00:48:09.820 use
00:48:10.060 rational
00:48:10.660 reasons to
00:48:12.060 talk yourself
00:48:12.700 out of doing
00:48:13.160 something
00:48:13.520 irrational,
00:48:14.340 like an
00:48:14.720 OCD
00:48:15.280 habit,
00:48:15.960 or hoarding,
00:48:16.860 that doesn't
00:48:17.740 make sense.
00:48:19.220 Instead,
00:48:19.800 use an
00:48:20.100 irrational
00:48:20.580 reason.
00:48:22.200 And one
00:48:22.840 that I've
00:48:23.220 suggested,
00:48:23.860 but you
00:48:24.120 could make
00:48:24.520 your own,
00:48:25.100 right?
00:48:25.340 So you'd
00:48:25.740 find the
00:48:26.040 one that
00:48:26.260 works for
00:48:26.660 you.
00:48:27.260 But the
00:48:27.780 one I've
00:48:28.100 suggested is,
00:48:29.100 I don't
00:48:29.680 need to do
00:48:30.120 this because
00:48:30.520 I already
00:48:31.780 have too
00:48:32.140 much.
00:48:33.700 Now,
00:48:34.300 that doesn't
00:48:34.760 really quite
00:48:35.280 mean anything,
00:48:36.380 because there's
00:48:36.800 no standard
00:48:37.420 by which to
00:48:38.040 measure what
00:48:38.540 is too
00:48:38.880 much.
00:48:40.520 But you
00:48:41.140 just say
00:48:41.540 it.
00:48:42.080 It's just
00:48:42.480 an irrational,
00:48:44.180 repetitive
00:48:44.600 statement where
00:48:45.440 you just tell
00:48:45.880 yourself,
00:48:46.380 oh, I have
00:48:47.180 too much.
00:48:48.280 I have too
00:48:48.780 much.
00:48:49.540 And you
00:48:50.080 can reprogram
00:48:50.760 your brain
00:48:51.360 through repetition.
00:48:52.180 So if you
00:48:54.080 think that
00:48:54.540 your brain
00:48:55.080 is programmed
00:48:56.680 by good
00:48:57.500 reasons and
00:48:58.280 logic and
00:48:58.860 facts,
00:49:00.080 well, good
00:49:00.560 luck with
00:49:00.920 that, because
00:49:01.500 nothing like
00:49:02.080 that happens.
00:49:03.300 We are
00:49:04.100 reprogrammed
00:49:04.940 just by
00:49:05.840 repetition and
00:49:06.680 focus.
00:49:07.920 So if you
00:49:08.640 put your
00:49:08.960 focus on
00:49:09.540 something, and
00:49:10.240 then you go
00:49:10.940 back to it
00:49:11.480 over and
00:49:11.780 over, it
00:49:12.800 will reprogram
00:49:13.600 your brain.
00:49:14.860 And it
00:49:15.200 doesn't need
00:49:15.640 to make
00:49:15.920 sense.
00:49:17.080 It's a
00:49:17.960 totally
00:49:18.580 irrational
00:49:19.220 process.
00:49:19.800 Whatever you
00:49:20.900 focused on
00:49:21.620 and obsessed
00:49:22.140 on, that
00:49:23.200 becomes your
00:49:23.700 programming.
00:49:24.980 So if you
00:49:25.640 have an OCD
00:49:26.240 thing, when
00:49:28.320 you're doing
00:49:28.780 it, you're
00:49:29.140 focusing on
00:49:29.820 it, and
00:49:30.140 you're
00:49:30.420 reinforcing
00:49:30.940 it every
00:49:31.440 time you
00:49:31.740 do it.
00:49:33.000 Because it
00:49:33.600 was never
00:49:33.940 based on
00:49:34.520 anything
00:49:35.100 logical.
00:49:36.320 It was
00:49:36.620 just focus
00:49:37.220 and repetition.
00:49:38.400 And OCD
00:49:39.200 is nothing
00:49:39.740 but focus
00:49:40.300 and repetition.
00:49:41.700 So the
00:49:42.060 longer you
00:49:42.460 do that
00:49:42.860 behavior, the
00:49:43.620 more it
00:49:43.900 gets drilled
00:49:44.360 in and
00:49:44.700 becomes a
00:49:45.140 part of
00:49:45.420 you.
00:49:46.020 So just
00:49:46.560 do the
00:49:47.200 opposite.
00:49:48.840 Just repeat
00:49:49.600 more often.
00:49:51.100 Oh, I
00:49:51.400 don't need
00:49:51.700 to do
00:49:51.940 that.
00:49:52.300 That's
00:49:52.500 enough.
00:49:53.480 Or three
00:49:54.200 times is
00:49:54.700 enough.
00:49:55.880 Or once
00:49:57.180 is too
00:49:57.500 many.
00:49:58.880 Whatever the
00:49:59.560 reason is, it
00:50:00.160 doesn't have to
00:50:00.620 make actual
00:50:01.320 sense.
00:50:01.880 It has to
00:50:02.220 sound like
00:50:02.740 it makes
00:50:03.060 sense, but
00:50:04.300 not really.
00:50:04.920 It's not
00:50:05.240 important.
00:50:06.060 So you
00:50:06.360 just reprogram
00:50:07.180 your irrational
00:50:08.120 parts with
00:50:09.780 other irrational
00:50:10.660 parts and
00:50:11.840 see what
00:50:12.120 happens.
00:50:13.060 And I have
00:50:13.600 heard that it
00:50:14.000 works, by the
00:50:14.540 way, in a
00:50:15.380 different context.
00:50:19.600 Oh, how
00:50:22.000 to tolerate a
00:50:22.860 long medical
00:50:23.420 test like an
00:50:24.160 MRI?
00:50:25.960 Hmm.
00:50:27.220 That I
00:50:28.400 would use
00:50:28.840 visualization
00:50:29.420 for.
00:50:31.320 Try to
00:50:32.040 visualize
00:50:32.440 yourself in
00:50:33.380 the MRI,
00:50:34.280 you know,
00:50:35.460 being still.
00:50:38.420 And visualize
00:50:40.020 as much as
00:50:40.840 possible, you
00:50:41.540 know, as
00:50:42.180 clearly as
00:50:43.220 you can.
00:50:44.180 And then
00:50:44.500 when it
00:50:44.820 frightens you,
00:50:46.280 back out.
00:50:47.540 Right?
00:50:47.720 Stop thinking
00:50:48.700 about it for
00:50:49.160 a while, but
00:50:50.300 then rest a
00:50:51.660 little bit, wait
00:50:53.100 a little while,
00:50:54.560 then just
00:50:55.000 imagine it
00:50:55.560 again.
00:50:57.160 And the
00:50:58.040 more you
00:50:58.400 imagine it,
00:50:59.060 in theory,
00:51:00.120 you'll just
00:51:00.580 sort of get
00:51:01.000 used to it.
00:51:02.640 Because you
00:51:03.180 can get used
00:51:03.660 to anything if
00:51:04.220 you do it
00:51:04.520 long enough.
00:51:05.260 But since
00:51:05.660 you're not
00:51:05.980 practicing the
00:51:06.720 actual MRI,
00:51:07.860 sometimes just
00:51:09.220 visualizing it
00:51:09.960 can get you
00:51:10.420 there.
00:51:12.000 Now, I
00:51:12.620 would also
00:51:12.960 try that
00:51:13.480 Andrew
00:51:14.040 Huberman
00:51:16.000 breathing
00:51:16.380 exercise.
00:51:17.840 The two
00:51:18.340 inhales in
00:51:19.000 through your
00:51:19.320 nose and
00:51:19.840 one exhale
00:51:20.540 out through
00:51:21.100 your mouth.
00:51:22.000 It really
00:51:22.660 works.
00:51:23.620 I'm so
00:51:24.120 addicted to
00:51:24.620 that.
00:51:25.580 Two breaths
00:51:26.420 in through
00:51:26.760 your nose.
00:51:31.660 I mean,
00:51:32.400 I just
00:51:33.020 did it.
00:51:34.400 And, like,
00:51:34.980 my entire
00:51:35.420 body feels
00:51:36.020 different.
00:51:36.720 Like, that
00:51:37.060 literally just
00:51:37.900 two inhales
00:51:39.020 and an exhale
00:51:39.500 and I feel
00:51:40.160 completely
00:51:40.560 different.
00:51:41.300 My torso
00:51:41.820 just relaxed
00:51:42.620 totally.
00:51:44.640 All right.
00:51:44.960 What about
00:51:46.500 Matt
00:51:46.860 Gaetz's
00:51:47.420 ex-girlfriend?
00:51:49.000 Is there
00:51:49.240 a story in
00:51:49.740 the news?
00:51:50.160 I don't
00:51:50.360 know about
00:51:50.680 that.
00:51:55.900 Too much
00:51:56.520 oxygen?
00:52:03.120 How did
00:52:03.680 this split
00:52:04.120 into NPCs
00:52:05.060 and non-NPCs?
00:52:06.000 Well, that
00:52:06.600 has more to
00:52:07.080 do with
00:52:07.380 imagining our
00:52:09.660 reality as
00:52:10.460 a simulation,
00:52:11.780 which is
00:52:13.360 by far
00:52:14.300 my favorite
00:52:15.180 frame
00:52:16.820 in the
00:52:17.100 world.
00:52:23.260 All right.
00:52:27.000 And that's
00:52:27.780 all I have
00:52:28.140 now.
00:52:31.680 What comes
00:52:32.520 to mind?
00:52:33.260 Okay.
00:52:34.440 That's all
00:52:35.180 we have now.
00:52:36.620 I'm going to
00:52:36.980 talk to you
00:52:37.300 later.
00:52:38.100 And bye
00:52:39.460 for now on
00:52:40.100 YouTube.
00:52:40.420 See you
00:52:41.760 tomorrow.