Real Coffee with Scott Adams - October 27, 2021


Episode 1543 Scott Adams: Today I Will Talk About IQ and the Correlation With Health. Because I Like Causing Trouble


Episode Stats

Length: 54 minutes
Words per Minute: 145
Word Count: 7,907
Sentence Count: 659
Misogynist Sentences: 11
Hate Speech Sentences: 7


Summary

The dopamine hit of the day, the one that makes your antibodies sing and makes you feel smarter and more capable, happens when you drink a lot of coffee and relax a lot. It's called the Simultaneous Sip, and it's happening now.


Transcript

00:00:00.000 La-da-da, da-da-da-da-da-da, bum-bum-bum-bum, bum-bum-bum.
00:00:10.000 Well, hello, everybody.
00:00:12.620 Guess what?
00:00:14.100 It's time.
00:00:16.000 It's time for the best part of your whole day.
00:00:20.860 Now, I'm not saying the rest of your day will be not good.
00:00:24.720 It might be awesome.
00:00:25.660 But no matter how awesome it is, no matter how awesome, yes, this will be the highlight.
00:00:33.400 And are you wondering if this sort of thing can improve your antibodies?
00:00:38.440 Well, according to a thing called science, yes, a lot.
00:00:43.360 You've heard of a thing called placebos, right?
00:00:46.880 That's right.
00:00:47.620 A placebo works quite often.
00:00:49.420 Not as well as the drug it's compared to if you do things right.
00:00:53.060 But, but, placebos work.
00:00:56.680 So what does that tell you about the ability of your mind to influence your health outcomes?
00:01:02.420 Pretty good.
00:01:03.460 Pretty good.
00:01:04.600 Unless it's just a statistical data problem.
00:01:07.180 But I think there's something happening there.
00:01:09.720 Secondly, we know for sure that if you're more relaxed and less stressed, you'll produce less cortisol, the stress chemical.
00:01:21.320 And the stress chemical makes your immunity to everything go down.
00:01:26.440 So yes, 100% guaranteed, the people watching this live stream will be healthier than other people.
00:01:36.000 Because you have the benefit of all this relaxing content.
00:01:40.980 Things that take the edge off.
00:01:42.760 Things that make you feel better.
00:01:44.440 Things that make you feel smarter, more capable, more confident.
00:01:47.520 You put all that together, what's that do to your antibodies?
00:01:50.560 I think you know.
00:01:52.140 I think you know.
00:01:55.120 Well, there was a recent study that showed that zero people watching my live stream have died of COVID.
00:02:02.840 I don't think that's a coincidence.
00:02:06.160 Do you?
00:02:06.960 No.
00:02:07.900 How would you like to take it up a notch?
00:02:09.460 I know you would.
00:02:10.440 How about all you need is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind.
00:02:19.300 Fill it with your favorite liquid.
00:02:21.380 I like coffee.
00:02:23.020 And join me now for the unparalleled pleasure.
00:02:26.580 The dopamine hit of the day, the thing that makes your antibodies line up and sing.
00:02:31.820 It's called The Simultaneous Sip, and it's happening now.
00:02:36.040 Go.
00:02:41.140 You know, you can never tell when I'm completely serious, can you?
00:02:47.760 But I would bet a large amount of money that the people who watch this live stream have better outcomes than people who don't.
00:02:55.320 I would bet on that.
00:02:58.540 Do you remember during the height of the pandemic when the public was scared to death?
00:03:06.200 But many of you joined me then and were watching my content.
00:03:10.140 And when people were saying, the wheels are coming off, everything's falling apart.
00:03:14.520 And I said, no, not so much.
00:03:16.500 You'll be fine.
00:03:17.880 You'll be fine, and here's why.
00:03:20.120 Don't you think you had less cortisol than other people?
00:03:24.540 I think so, because actually I don't even know anybody else who was making chill-out content at the time.
00:03:32.120 Probably just me.
00:03:33.200 I mean, if you could think of somebody, let me know.
00:03:35.420 But it's real.
00:03:36.900 It's real, people.
00:03:37.840 You get healthier when you watch this.
00:03:39.960 Well, let's talk about all the things that are happening, shall we?
00:03:44.660 Rasmussen has a poll about Dr. Fauci.
00:03:48.660 It found that 46% of the likely voters polled want Fauci to be forced to resign.
00:03:55.140 46%.
00:03:55.620 Is that good or is that bad?
00:03:57.200 Anybody?
00:03:57.960 Anybody?
00:03:58.900 Is that good or bad for Dr. Fauci?
00:04:01.760 46% of them want him to resign.
00:04:04.120 46% is the percent that we want everybody to...
00:04:13.820 Basically, that's a universal number.
00:04:17.680 46% of the country wants every famous person who works for the government to resign.
00:04:24.000 How many people want AOC to resign or be forced to resign?
00:04:27.960 About 46%.
00:04:29.220 How many people think the president should be forced to resign?
00:04:32.620 I mean, there are more people who dislike him, maybe.
00:04:36.040 But how many think he should be forced to resign?
00:04:38.000 About 46%.
00:04:39.500 You know, I read this and I think to myself, after all the trouble that Fauci has gotten
00:04:47.380 into and gotten out of and gotten into again, all the criticism, the number of people who
00:04:53.360 want him to be fired is about the same as everybody else.
00:04:56.140 Apparently, we are completely immune to any kind of data or analysis, but boy, do we like
00:05:04.040 being on a team.
00:05:05.320 We like being on a team.
00:05:07.260 So 46% is pretty much like the universal number of people who want somebody else to resign.
00:05:13.960 Somebody famous.
00:05:16.200 All right.
00:05:16.560 I don't know why that's funny, but I'm going to read that comment.
00:05:21.780 In the comments on YouTube, 47% of people polled think Alec Baldwin is actually a cowboy.
00:05:27.900 That's not far off.
00:05:30.340 Let me give you my current final answer on Alec Baldwin.
00:05:35.460 And I like to say this because I would consider him a political opponent, adversary.
00:05:43.500 You know, I don't like to use those words because we're all on the same team, America.
00:05:47.060 But he would be on the other side politically from...
00:05:50.860 Whoops.
00:05:53.780 What the hell is happening over here?
00:05:58.020 I've got some kind of unstable internet connection messing up my Locals feed.
00:06:03.340 That's interesting, but only one of them.
00:06:05.040 I've got two feeds.
00:06:11.520 Huh.
00:06:12.780 I've got two feeds.
00:06:13.720 I'm going to turn another one on.
00:06:15.300 Let's see if I can cause this to work.
00:06:19.080 Technical problem being worked out.
00:06:21.240 Oh, permission denied.
00:06:22.980 Oh, that's interesting.
00:06:29.060 Well, let's see if I can reconnect this.
00:06:33.280 Might be a way to do it.
00:06:34.340 Hold on.
00:06:34.880 Hold on, YouTube people.
00:06:35.840 Hold on.
00:06:37.880 Oh.
00:06:38.840 For some reason, it looks like it's working now.
00:06:41.080 Interesting.
00:06:42.800 Let's try this.
00:06:44.200 That damn Rumble.
00:06:45.980 Somebody's blaming Rumble.
00:06:47.280 All right.
00:06:47.560 There we're back.
00:06:48.980 Wasn't such a problem after all.
00:06:51.200 All right.
00:06:51.740 But I was in the middle of saying something so fascinating that I can't even remember it.
00:06:57.520 Does anybody else remember what I was saying before I got sidetracked?
00:07:02.020 Anything?
00:07:02.340 Anything?
00:07:02.920 All right.
00:07:03.420 All right.
00:07:08.300 But anyway, hypnosis and antibodies.
00:07:10.600 It's real.
00:07:11.140 What was I talking about?
00:07:12.140 What was I talking about?
00:07:18.600 Tell me.
00:07:19.220 Oh, Eric.
00:07:20.460 Alec Baldwin.
00:07:21.100 I guess that didn't matter.
00:07:22.820 All right.
00:07:24.880 I think Locals is back, right?
00:07:26.840 I think Locals is back working.
00:07:28.280 It looks like it is.
00:07:28.880 I've decided that Joe Biden is what I call the stem cell president.
00:07:35.400 The stem cell president.
00:07:37.300 You're going to like this.
00:07:38.820 If you didn't see me say it on Twitter, you're going to like it.
00:07:41.660 You ready?
00:07:42.120 What is the process in the United States for replacing a vice president?
00:07:48.820 Not a president, but a vice president.
00:07:51.200 What's the process if, let's say, a vice president were to resign or to be, I don't know, impeached
00:07:58.380 or maybe have a medical problem, have to leave?
00:08:01.960 What would be the process?
00:08:04.100 Well, it turns out that Biden would appoint somebody, and then both houses of Congress would confirm by a simple majority vote.
00:08:11.340 No, no, we're not talking about line of succession.
00:08:14.500 So it has nothing to do with Pelosi.
00:08:17.140 Pelosi is who would take over if the first and second in command were incapacitated at the
00:08:23.080 same time.
00:08:24.080 We're not talking about that.
00:08:25.940 So forget about line of succession.
00:08:27.940 That's not the conversation.
00:08:30.240 We're talking about how to replace a vice president, just a vice president.
00:08:35.040 So a vice president quits, let's say, and the president simply nominates another Democrat
00:08:41.720 in this case.
00:08:43.160 And then because Democrats have a majority, and also because historically the country likes
00:08:49.560 to have a vice president.
00:08:51.020 You know, we don't like to go without a vice president.
00:08:53.180 So as long as the president nominates somebody who's not horrible, you know, somebody who's
00:08:59.940 an ordinary person within that party, generally you can expect to get a quick agreement.
00:09:06.360 You know, maybe even from some Republicans who just are in favor of the idea of, you know,
00:09:11.320 good continuity and keeping tradition.
00:09:13.980 So, here's our situation.
00:09:18.200 It looks like maybe even the people on Biden's team didn't really ever expect him to go four
00:09:23.700 years.
00:09:24.800 Can we all agree on that?
00:09:26.740 Maybe they didn't know one way or the other, but they certainly weren't confident he could
00:09:31.860 make all four years.
00:09:33.000 Would you give me that?
00:09:34.040 That they weren't confident about it?
00:09:35.780 They might have been optimistic, but you couldn't be confident about it, which means that Kamala
00:09:42.660 Harris was always the backup plan.
00:09:45.440 That's what the vice president is.
00:09:47.060 Not really a shocking statement there.
00:09:50.280 But I think they were surprised at how poorly she performs.
00:09:55.560 I was.
00:09:56.820 I picked her to be the nominee.
00:09:58.740 And I was just shocked that as soon as you took her out of a, let's say, a well-defined
00:10:04.980 situation where she's, you know, grilling people in Congress like a lawyer, where she's
00:10:09.800 quite good, actually, or a debate.
00:10:12.940 She wasn't too bad in the debates.
00:10:14.680 But you put her in any kind of a free-form situation, it's just, she just falls apart.
00:10:19.760 And I think the Democrats can see it, too.
00:10:23.320 She seems to be de-emphasized at the moment.
00:10:25.960 So, yes, I was wrong about her being chosen as the top of the ticket.
00:10:34.080 That is correct.
00:10:35.180 I might not be wrong about the plan to put her in power.
00:10:38.060 We'll see about that.
00:10:39.260 But let's say, this is just speculative, let's say that the Democrats who are really pulling
00:10:44.240 the power strings behind the scenes, they say to themselves, we can put anybody in the
00:10:49.840 presidency we want now.
00:10:51.980 Because they can.
00:10:52.940 All they have to do is find a reason for Kamala Harris to resign before Biden does.
00:11:01.260 You see this play?
00:11:03.160 Biden just has to pick anybody who can get through Congress, and that's a pretty big,
00:11:08.000 probably a pretty big menu of people he could get approved.
00:11:11.800 You know, I don't think he could get, you know, AOC or, you know, one of the squad approved.
00:11:18.620 Maybe not that.
00:11:19.480 But he could get an ordinary Democrat approved pretty easily.
00:11:23.860 And effectively, the Democrats can pick anybody they want for president for now.
00:11:32.240 Right?
00:11:33.140 Now, I'm not wrong about any of this, am I?
00:11:35.700 Give me a fact check.
00:11:37.780 Can a vice president resign?
00:11:40.380 Yes.
00:11:41.520 Is the process to replace him that the president picks somebody new and both houses of Congress approve by a simple majority?
00:11:47.340 That's important.
00:11:47.900 They don't need two-thirds.
00:11:49.400 Just a simple majority, and they have that.
00:11:53.800 And is Biden likely to make it four years?
00:11:56.720 Probably not.
00:11:57.460 I don't even think his own team assumes that.
00:12:01.300 So we've created a situation where he's the stem cell president.
00:12:05.940 He can be turned into any president they want who's also a normal Democrat.
00:12:10.520 Or, yeah, normal Democrat.
00:12:11.520 How many people have ever thought of that before?
00:12:16.200 This play is right here.
00:12:17.560 It's right in front of you.
00:12:19.240 If you see any moves for Kamala Harris to leave the job, you don't know who your next president's
00:12:29.840 going to be.
00:12:30.140 But it won't be based on democracy, that's for sure.
00:12:33.320 All right.
00:12:35.920 So that's interesting.
00:12:37.000 There's a story about Reid Hoffman, billionaire, founder of LinkedIn, and he was one of the
00:12:44.460 PayPal mafia people, meaning one of the startup people on PayPal, before LinkedIn.
00:12:50.780 And Reid Hoffman teamed up with George Soros.
00:12:53.760 You've heard of him, right?
00:12:55.480 Anybody?
00:12:56.140 Anybody?
00:12:56.640 George Soros.
00:12:57.800 Another billionaire.
00:12:58.420 And they're creating a media firm to, quote, combat disinformation.
00:13:09.560 That's right.
00:13:11.240 Two billionaires are teaming up to tell us what's true and what isn't.
00:13:18.840 Do you see any potential problems there?
00:13:22.260 Does anybody see any potential problem?
00:13:26.600 That looks good, right?
00:13:27.500 Hey, billionaires trying to help us with information.
00:13:30.720 Yes.
00:13:32.480 It's nice that they're going to help us understand what's true.
00:13:36.360 Because I don't know if you knew this, but you don't know.
00:13:40.400 And they do.
00:13:41.600 Or the people they hire will know.
00:13:43.640 And interestingly, although apparently nobody in the world can figure out what's true, these
00:13:49.400 two billionaires have figured out how to get some people who do know.
00:13:52.260 Apparently, there are people who can tell what's true, and they haven't been telling us.
00:13:58.820 I hate those people.
00:14:01.700 Think about it.
00:14:03.260 There are people who know what's true, and they haven't been telling us.
00:14:06.740 Why don't the people who know what's true tell us what's true instead of all the liars?
00:14:11.200 Well, hopefully, Reid Hoffman and George Soros will solve that problem by getting some totally
00:14:18.340 credible people that I'm sure you will believe to tell you what's true and what isn't.
00:14:23.360 All right?
00:14:23.620 It's called Good Information, Inc., and they're going to fund and scale businesses that, quote,
00:14:30.340 cut through the echo chambers with fact-based information.
00:14:34.200 Fact-based.
00:14:35.360 No more of this opinion crap.
00:14:37.580 No more of this conspiracy theory crap.
00:14:39.920 We're going for fact-based information now.
00:14:43.080 It even plans to invest in local news companies.
00:14:50.260 Do you see any red flags there?
00:14:52.180 Billionaires owning the news.
00:14:55.940 Huh.
00:14:56.920 Well, there's a billionaire that owns the New York Times, and it's not like they're producing
00:15:01.560 any fake news.
00:15:02.440 What?
00:15:03.140 They are?
00:15:04.600 The New York Times produces fake news, and it's owned by a billionaire?
00:15:08.180 Well, that's weird.
00:15:09.460 But at least we have the Washington Post, because they...
00:15:12.000 What?
00:15:13.640 The Washington Post prints fake news, and it's owned by a billionaire?
00:15:16.820 How could that be?
00:15:18.320 How could that be?
00:15:19.120 Because billionaires don't give you bad information.
00:15:23.440 That's not what they do.
00:15:25.400 Do they?
00:15:27.080 So, here's an example of why you should trust this entity.
00:15:34.880 Yeah, at first I was worried that maybe they would be biased themselves.
00:15:39.360 Please check out Rogan's podcast with Jewel.
00:15:43.980 She doesn't use these words specifically.
00:15:45.300 It's all about talent stacks.
00:15:47.260 Oh, interesting.
00:15:49.120 So, there's a Joe Rogan interview with Jewel that looks interesting in terms of understanding
00:15:55.440 her systems for success.
00:15:56.740 Anyway, back to my point.
00:16:00.400 So, one of the people who's going to run this new Good Information Inc. points out that they're
00:16:05.400 not going to be all just left-wing stuff.
00:16:07.840 It's not going to be a bunch of just left-wing stuff.
00:16:10.580 Don't worry.
00:16:11.200 Don't worry.
00:16:12.200 And one of the examples given of their fact-based information process is that she points to The Bulwark.
00:16:19.200 Have you heard of it?
00:16:19.960 It's a publication online.
00:16:22.240 The Bulwark.
00:16:23.400 As a center-right news site, founded in opposition to Trumpism, as an example of the type of center-right
00:16:30.340 news outlet it could fund.
00:16:32.600 So, just an example.
00:16:34.280 They haven't funded it.
00:16:35.120 But they could fund something like The Bulwark.
00:16:38.320 Have you ever heard of The Bulwark?
00:16:41.360 Have any of you seen an article from The Bulwark?
00:16:45.060 Because I have.
00:16:45.960 Do you know which articles I've seen from The Bulwark?
00:16:49.960 Not a true one yet.
00:16:53.880 Not a true one.
00:16:55.820 Because the only time anybody sends me to read an article is when they have fake news.
00:17:00.320 Which has been often enough that the only news I've ever read in The Bulwark was obviously
00:17:06.040 untrue.
00:17:07.320 Like, I'm talking about the obvious hoaxes.
00:17:10.040 You know, the Trump-called-neo-Nazis-fine-people hoax.
00:17:14.400 You know, that sort of hoax.
00:17:16.260 The ones that you know are fake just by looking at it.
00:17:18.880 I was like, that's not true.
00:17:20.800 That's what The Bulwark is.
00:17:22.740 If you had said to me, name a bunch of fake news entities from this list, I would have
00:17:29.040 picked it out immediately as one of the fake news sources.
00:17:31.560 So literally, a famous fake news source that's anti-right is their example of how they won't
00:17:39.720 be biased because they'll pick a fake news source that says they're on the right but also
00:17:45.180 criticizes the right with fake news.
00:17:47.560 Why should you be worried that Reid Hoffman is part of this?
00:18:03.140 Anybody?
00:18:03.620 Let's see how informed you are about your billionaires.
00:18:08.360 So this would be sort of a trivia question, but useful.
00:18:12.700 Right, yeah.
00:18:13.600 So Reid Hoffman founded LinkedIn.
00:18:15.820 That's a fact.
00:18:16.680 But that's not directly related to my point that I'm going to make.
00:18:21.020 Why should you be worried specifically that Reid Hoffman is involved?
00:18:26.900 What do we know about him besides the billionaire founder of LinkedIn?
00:18:30.200 He knows how to get things done.
00:18:36.340 That's good.
00:18:36.740 Yeah, he's very effective.
00:18:37.900 We know that.
00:18:39.160 Connected, effective, very smart.
00:18:42.120 I don't know if you know how smart Reid Hoffman is.
00:18:44.800 But whatever you think is the smartest person you know, he's in that category.
00:18:52.460 Somebody says his ethnicity.
00:18:54.380 No, that's not what we're going for.
00:18:58.700 That's not what we're going for.
00:19:00.860 Here's what I know about Reid Hoffman.
00:19:03.920 He understands persuasion.
00:19:08.480 A lot of what you see in the stickiness of social media was invented by Reid Hoffman.
00:19:15.860 I forget what process it was specifically, but I think he was behind the idea that made networks sticky, you know, recommending your friends.
00:19:27.140 I think he was behind the technology that seems obvious now, right?
00:19:31.960 Isn't this the most obvious thing?
00:19:33.600 If you're using an application that says, you know, why don't you invite your friends?
00:19:36.900 I think he invented that.
00:19:40.920 And I would offer that he probably knows more about psychology and persuasion than any of the
00:19:49.480 billionaires.
00:19:50.600 I'm not positive about that.
00:19:51.920 And, you know, Elon Musk is up there, too.
00:19:55.460 Elon Musk was on the PayPal mafia with Reid Hoffman.
00:19:58.920 So, by the way, Reid Hoffman and Elon Musk were together on one of their first successful startups, I guess the big one, PayPal.
00:20:11.140 Reid Hoffman isn't like other people.
00:20:14.440 He has a deeper understanding of...
00:20:15.920 By the way, I've met him and talked with him briefly.
00:20:21.440 So I have a tiny amount of interaction with him personally.
00:20:26.000 But even in a tiny interaction, you talk to him for five minutes and you know you're not
00:20:30.360 talking to a regular person.
00:20:32.460 You're talking to somebody who's operating at a really high level.
00:20:36.200 Sort of, yeah, Peter Thiel level.
00:20:37.780 Peter Thiel, also, from the PayPal mafia.
00:20:40.780 So we're talking about people who are extra, extra smart.
00:20:48.200 And he's fairly anti-Trump, right?
00:20:51.480 He's extra smart.
00:20:52.580 But he's smart in a persuasion brainwashing kind of way.
00:20:57.180 The difference between persuasion and brainwashing, of course, is just your intention.
00:21:00.460 If your intention is positive, then it's just called school.
00:21:04.660 And if your intentions are bad, it's called brainwashing.
00:21:07.300 So we don't know what his intentions are, but he is anti-Trump.
00:21:11.980 So one would have to, you know, make an assumption, which we can't verify, but make an assumption
00:21:17.820 that it's sort of a tool to work against Trumpism, so to speak.
00:21:24.680 And he would be one of the best brainwashers of all time.
00:21:27.740 Are you worried?
00:21:31.060 This would be the first time I've seen somebody who would have my kind of skill for persuasion
00:21:39.300 on the other team totally, totally powered up.
00:21:44.780 Because I don't think he was really as active before.
00:21:48.260 It looks like he was active, but he's taken it to another level.
00:21:52.600 He is a master persuader.
00:21:54.120 He has Trump-like skills, but he puts them to use behind the curtain instead of in front
00:22:00.760 of the curtain.
00:22:01.640 You know, Trump's an in front of the curtain persuader.
00:22:05.660 People like Thiel and Musk and Hoffman are behind the scenes persuaders.
00:22:10.340 Some in front, but mostly behind.
00:22:14.220 I'm really afraid of this one, honestly.
00:22:17.400 I'm afraid of how powerful this could become, just because of the people involved.
00:22:22.560 All right.
00:22:23.000 Here's a better model for figuring out what's true.
00:22:25.480 I'm going to put that out there.
00:22:26.460 I'm going to put that out there.
00:22:27.060 Internet dads.
00:22:29.300 Internet dads.
00:22:31.640 Like me.
00:22:33.140 So this is not a name I came up with myself.
00:22:35.660 You know, people have been calling me and some other people on the independents, basically.
00:22:40.480 The independent political voices, have been calling us the Internet dads, because we sort
00:22:45.920 of act like, hey, I'll take care of you.
00:22:47.940 You know, don't worry, I got this.
00:22:50.660 Dad will take care of it.
00:22:52.320 And I feel as if we Internet dads team up with, let's say, an informal group of fact
00:23:01.560 checkers, it's really powerful.
00:23:04.500 I don't know if you've watched the model that I've been sort of A-B testing here for a
00:23:08.500 while, which is that whenever I see a sketchy claim, which you see every day on Twitter,
00:23:15.620 I'll take the sketchy claim and I'll send it over to my two most productive, smartest
00:23:21.780 analysts, who can look at pretty much any claim and tell you where the BS is.
00:23:27.440 They can spot it pretty quickly.
00:23:29.240 So one is Andreas Backhaus, and one is Anatoly Lubarsky.
00:23:35.080 And their responses are just insanely productive, right?
00:23:41.840 I don't know if it's all right.
00:23:43.540 Like, you know, I'm not claiming there's anybody who gets everything right.
00:23:46.620 I'm just saying that when they weigh in, you just see it differently.
00:23:50.400 The moment they weigh in, it's like, oh, this study is crap.
00:23:54.180 Here's the reason.
00:23:55.360 Here's where they got the data.
00:23:56.520 It's already been debunked.
00:23:57.560 That sort of thing.
00:23:58.180 Now, where I'm a little weak, I realize today, is the legal stuff.
00:24:04.380 But I do also have a whole bunch of lawyers who follow me on Twitter, et cetera.
00:24:10.380 So although I haven't formally activated that kind of a network, I could.
00:24:15.040 I could just ask a legal question.
00:24:17.520 You know, I think your Robert Barnes types, et cetera, would weigh in and answer the question.
00:24:21.820 So the Internet Dads have these informal, just if they have a big enough platform, they have
00:24:28.080 this informal group of fact checkers, all right, who can just come in and say, yeah, that's
00:24:34.240 true or not true.
00:24:35.680 And I also, you may have noticed, I have quite a few doctors who follow me.
00:24:41.960 Now, the doctors follow me for an interesting reason, which is some number of doctors have
00:24:46.680 figured out that they missed some lessons in medical school, persuasion for one, communication
00:24:55.160 for another, and maybe even some rationality in terms of, you know, comparing the right
00:25:02.920 things and statistics, et cetera.
00:25:05.300 So I get a lot of doctors who are just trying to add a little to their talent and skill that's
00:25:09.660 directly related to what they do.
00:25:11.420 So at this point, if you send me a, you know, a rumor or a claim, I can fairly easily on Twitter
00:25:21.060 just put it open to comments, you read the comments, and you're really going to know more
00:25:25.840 about it.
00:25:26.920 I'm not going to claim we can always get the right answer.
00:25:29.280 I'm just going to claim that if you look at my Twitter questions and then the answers
00:25:35.940 given, it's way better than any other source of understanding the world that I've seen.
00:25:43.400 Yeah, let me make that, yeah, Ron Coleman's another attorney.
00:25:47.600 There are a bunch of them who retweet my stuff fairly often.
00:25:53.260 So I know I have legal advice, I know I've got medical, and I've got economics and data
00:25:59.520 analysis.
00:26:00.540 So any of those fields, while not an expert myself, I can find somebody who will give
00:26:07.620 you a pretty good answer.
00:26:09.280 Don't you think that's a better model than whatever this is going to be, the Reid Hoffman
00:26:15.560 thing?
00:26:16.560 Now, I would also say that those people that you're calling your internet dads, they might
00:26:21.560 lean a certain way, but we're not wed to any philosophy, I guess.
00:26:28.520 Well, maybe we're wed to philosophies, personal philosophies, but we're not wed to any party.
00:26:34.460 Right?
00:26:34.680 Name anybody you think is, you'd call an internet dad, whatever your definition of that is,
00:26:41.100 and whoever you pick, and ask yourself if they're a slave to a party.
00:26:46.860 The answer is usually not.
00:26:48.400 Somebody's mentioning Michael Shellenberger.
00:26:51.680 Now, there's a great example.
00:26:52.960 So I've been interacting with and tweeting and following Michael Shellenberger's stuff,
00:26:59.120 read his books.
00:27:01.480 I don't know what his political bias is.
00:27:05.280 Think about that.
00:27:06.800 I've had extensive contact with him, and I couldn't tell you what his bias is.
00:27:11.940 I know he used to lean left, but he found out he'd been lied to on a lot of the green technology
00:27:17.520 stuff, so he's, you know, got a little turned off by that.
00:27:20.980 But I have no idea.
00:27:22.600 I don't know who he'd vote for.
00:27:24.820 Nothing.
00:27:25.860 I interviewed Bjorn Lomborg, who also takes a business approach to stuff.
00:27:30.840 Now he's from another country.
00:27:32.160 But I talked to him for a long time, and I can't tell you if he leans left or right.
00:27:38.860 Not a clue.
00:27:40.180 Nothing.
00:27:41.120 I just don't know.
00:27:43.560 Yeah, Tim Pool.
00:27:45.760 Who does he vote for?
00:27:47.880 I don't know.
00:27:49.440 But I have a feeling he could vote for whoever made sense.
00:27:52.940 Like, I don't think he's wed to any party, right?
00:27:56.020 How about Mike Cernovich, my universal example for every example I use.
00:28:00.060 Do you think he could vote for somebody who wasn't in a particular party?
00:28:03.680 Yeah.
00:28:04.020 Yeah, easily.
00:28:05.220 Easily.
00:28:05.620 He could go wherever it makes sense, and you see him do it all the time.
00:28:09.560 So look for people who you're not entirely sure what their political leaning is, and also
00:28:15.600 have good advisors that they can, you know, employ.
00:28:19.300 And maybe this is a better model.
00:28:21.300 Maybe, you know, it's often true that things get invented by the public accidentally.
00:28:27.520 Now somebody says Cernovich is anti-Trump.
00:28:30.060 Absolutely untrue.
00:28:31.860 Absolutely untrue.
00:28:33.220 He's not anti-Trump.
00:28:35.300 I mean, that's the most simplistic reading of it all.
00:28:38.440 He's anti some things that Trump has done.
00:28:42.540 You could certainly be anti some things that Trump has done.
00:28:45.580 And you might even think that he's not, you know, your first choice to be president again.
00:28:50.900 I don't know where Cernovich stands on any of that.
00:28:53.200 But as soon as you say somebody's anti-Trump, that's not the person I'm talking about.
00:29:00.200 Because if you're anti a person, you're not really deep into the thinking part of things.
00:29:06.680 All right.
00:29:10.400 So let me cause some trouble here.
00:29:16.480 Here's some things we know.
00:29:18.540 Let's go to the whiteboard.
00:29:19.540 And yes, this is going to be big trouble.
00:29:24.260 I'll probably be cancelled by the end of today, if I've done it right.
00:29:27.600 Here's some things we know.
00:29:32.120 If you want to see the sources, I've tweeted some of these sources this morning, so you
00:29:36.440 can see them on my Twitter feed.
00:29:38.080 But here are some of the things we know.
00:29:39.240 There's this correlation between IQ and health.
00:29:41.380 And we know that smart people tend to make smart lifestyle choices.
00:29:48.240 For example, the higher your IQ and education, the less likely you'll smoke cigarettes.
00:29:55.340 Because, you know, you use the information and data and make decisions, etc.
00:30:00.020 So just making the right choices and not doing dumbass things gives you a healthier outcome.
00:30:05.620 Now, of course, I'm not talking about every smart person.
00:30:08.700 We're just talking about the average, right?
00:30:11.380 Smart people tend to be richer, and that allows them to live in lower crime neighborhoods
00:30:16.460 and associate with fewer bad influences and maybe get better health care.
00:30:21.340 And maybe even if they're rich, they can get a prettier or more handsome mate,
00:30:28.420 which is a signal for health.
00:30:30.500 You know, the more attractive people are, on average, the healthier they are.
00:30:34.720 So you can see lots of mechanisms where smart people would end up healthier.
00:30:38.200 There's also the notion that intelligence is probably a general indicator of health
00:30:45.060 because your nervous system, or your neural network, I guess, has to support high intelligence.
00:30:54.160 Apparently, if you have a fast neural network, you're smart,
00:30:56.780 but it's also a good indicator of health in general.
00:30:58.940 So there's all this correlation.
00:31:02.720 Now, here's where it gets dicey.
00:31:09.520 What is your knowledge about vaccination rates: as a percentage, are more educated people or less educated people more likely to be vaccinated?
00:31:22.080 Do you know in the comments?
00:31:24.760 Who is the most vaccinated category?
00:31:28.100 The highly educated?
00:31:31.080 Or the less educated?
00:31:33.880 Don't know, don't know.
00:31:37.100 The answer is the educated, yeah.
00:31:39.120 So at the moment, and by the way, this changed over time.
00:31:44.480 So I think early in the pandemic, when the vaccinations were brand new,
00:31:48.600 there wasn't a lot of difference between the educated and the less educated
00:31:53.340 in terms of whether they wanted it because nobody had any information.
00:31:57.660 But the higher educated people have been watching for a year,
00:32:03.060 whatever it's been, and now they say,
00:32:05.220 OK, we have more information now,
00:32:07.040 and we're persuaded by that more information.
00:32:09.080 So the educated people are far more likely to get vaccinated.
00:32:13.600 Now, it could be also access and information
00:32:15.920 and other things that are happening with poverty,
00:32:17.980 so it's not just because people are smarter or more educated, right?
00:32:21.840 Somebody says PhDs are lower.
00:32:23.760 I question that.
00:32:24.980 All right, I just saw a statistic go by
00:32:27.540 that says PhDs have a lower rate of vaccination.
00:32:31.400 I will make you a pretty big bet that that's not true.
00:32:34.360 I have seen that.
00:32:35.220 I've seen that data, by the way, but I don't think it's true.
00:32:38.720 I'd make a really big bet against that.
00:32:41.680 Would anybody take that bet?
00:32:44.620 Anybody want to take the bet?
00:32:46.760 And this is without research.
00:32:48.900 It's just an assumption.
00:32:50.900 Yeah, I don't think you want to take that bet.
00:32:52.340 But anyway, there was a new study, reported in Scientific American, that found that people who got vaccinated with the COVID vaccination,
00:33:07.620 and by the way, don't assume that this is leading toward trying to get you vaccinated.
00:33:11.840 That's not where this is going.
00:33:13.680 All right.
00:33:14.260 I'm just causing trouble here.
00:33:16.280 That's my only objective.
00:33:19.140 And by trouble, I mean a fun little mental exercise, right?
00:33:23.660 That's the trouble I'm talking about.
00:33:26.380 So there's a study that showed that people who got vaccinated had better health outcomes in general,
00:33:34.660 even unrelated to COVID.
00:33:35.980 They had fewer of all kinds of health problems.
00:33:39.840 How do you explain that?
00:33:41.260 How do you explain the fact that people who got vaccinated had fewer health problems in general?
00:33:49.120 Is it because the implication was that maybe the vaccination solves unrelated problems?
00:33:54.260 Well, it wasn't because the vaccination solved a bunch of unrelated problems.
00:34:01.160 Probably not.
00:34:02.780 It's probably because the higher IQ educated people are getting vaccinated at higher rates,
00:34:08.880 and they're just healthier in general.
00:34:11.320 Now, Scientific American did a funny thing to try to launder the news that high IQ people are having better outcomes.
00:34:25.040 Because you don't really want to tell a story about high IQ people doing things in society better than low IQ people.
00:34:32.540 Why?
00:34:32.740 Why do you never want to write a scientific story that says high IQ people are doing better than low IQ people in any context,
00:34:42.080 whether it's health or economics or anything else?
00:34:44.940 Why can't you do that?
00:34:47.260 Because it immediately turns into a racial discussion, right?
00:34:51.140 It turns into eugenics and all kinds of stuff.
00:34:53.160 So, Scientific American found a clever way to deal with that problem.
00:35:04.200 So they wrote an article in which they had to talk about an outcome that was different for high IQ and low IQ.
00:35:12.240 But they needed to make it go away also.
00:35:15.840 So they did this little trick.
00:35:17.580 This is the most clever thing that you'll see in a publication.
00:35:19.920 They, let's say, laundered the correlation.
00:35:25.040 At the end of the article, they said, wait, yes, high IQ is related to good outcomes.
00:35:31.360 But we also tested reaction time.
00:35:35.220 And once we tested reaction time, the IQ correlation just disappeared.
00:35:40.800 Because the people with good reaction times were the ones who had the best outcomes
00:35:45.300 and the best health outcomes in general.
00:35:46.700 And so it's really about reaction time.
00:35:49.820 It's not about IQ at all.
00:35:52.860 What do you suppose is related to reaction time?
00:35:59.440 What would be something you kind of expect would be somewhat correlated with reaction time,
00:36:06.540 like a quick reflex?
00:36:08.100 Maybe intelligence, maybe general health, maybe people who are generally healthy,
00:36:16.440 who are also generally intelligent, generally have better reaction times.
00:36:21.340 Wouldn't you expect that to be true?
00:36:24.120 So here's what Scientific American did.
00:36:26.600 They found a random, really random, I think, correlation with good health.
00:36:35.160 And they tried to confuse things.
00:36:37.880 No, it's not the IQ.
00:36:39.820 It's the reaction time.
00:36:42.040 Stop talking about IQ.
00:36:43.620 Come on, come on, come on.
00:36:45.200 You racist.
00:36:46.300 It's reaction time.
00:36:48.300 They just laundered the IQ.
00:36:50.320 They just took the correlation and tried to confuse you by throwing in this specious other correlation, which, I'm sure, is correlated, but there's some causation upstream of it.
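To see the move concretely, here is a minimal sketch in Python with entirely made-up numbers (nothing here is from the actual study): one shared underlying factor drives health, IQ, and reaction time, and statistically controlling for reaction time makes the IQ-health correlation all but vanish, which is exactly the laundering described above.

# Hypothetical simulation of the laundering move: a shared latent factor
# drives IQ, reaction time (RT), and health. Controlling for RT makes
# the IQ-health correlation nearly disappear, even though nothing about
# the IQ-health relationship actually changed.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

latent = rng.normal(size=n)                        # underlying general health/fitness
iq = 0.6 * latent + rng.normal(size=n)             # IQ partly reflects the latent factor
rt = 0.9 * latent + rng.normal(scale=0.3, size=n)  # RT tracks the latent factor more tightly
health = latent + rng.normal(size=n)               # health outcome driven by the latent factor

def residualize(y, x):
    # Strip out the part of y that is linearly predictable from x.
    slope = np.cov(x, y)[0, 1] / np.var(x)
    return y - slope * x

print("corr(IQ, health):     ", np.corrcoef(iq, health)[0, 1])
print("corr(IQ, health | RT):", np.corrcoef(residualize(iq, rt),
                                            residualize(health, rt))[0, 1])

Run as written, the raw correlation comes out around 0.36 and the partial correlation drops to roughly 0.06, so "it's reaction time, not IQ" just reflects which variable happens to track the shared factor most tightly.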
00:37:03.040 So here's a better way to do it.
00:37:07.180 A better way to approach this is to focus on individuals instead of groups.
00:37:12.920 Does it really help you that much to know that, I don't know, left-handed Elbonians have lower IQ than right-handed Elbonians?
00:37:23.320 What are you going to do with that?
00:37:26.240 What are you going to do with it?
00:37:27.880 How would you act differently if you knew it were true?
00:37:30.440 If you would act differently, maybe you're racist.
00:37:35.480 Because you shouldn't.
00:37:38.360 The smarter way to approach it is just to say, why are we looking at averages?
00:37:43.760 Like, did you ever hire an average person?
00:37:46.780 There's no such thing as an average person.
00:37:49.580 The real world, we hire actual people.
00:37:52.500 And actual people are all over the place.
00:37:54.760 There's no such thing as an average actual person, except by weird coincidence.
00:38:00.340 So if we just stop making a racist assumption, you don't have to worry about where it comes out.
00:38:08.740 The racist assumption is that looking at it by race is useful.
00:38:13.680 It's not.
00:38:15.060 It's not useful at all.
00:38:18.360 Like, what would you do differently?
00:38:20.820 Nothing.
00:38:21.120 All right.
00:38:27.200 So, look out for all these weird correlations that are really not causation.
00:38:35.760 BlackRock Investment Company apparently is invested in China, according to some hit piece I saw on YouTube.
00:38:42.660 And I don't even know who was behind the video, but it was sort of an anti-BlackRock.
00:38:46.520 BlackRock is an enormous, what do you call them, hedge fund or investment entity?
00:38:51.880 I'm not sure what you call them.
00:38:53.400 But they're enormous, and they're investing in China, and somebody is making them pay for it by calling them out.
00:38:59.420 I think you're going to see more of this.
00:39:01.860 I think it will get increasingly hard for big entities that have to respond to shareholders, especially, to do business with China.
00:39:09.740 So, the Kyle Rittenhouse situation is happening, and I don't see any chance that he is going to get convicted.
00:39:22.280 Do you?
00:39:24.260 Let me ask you.
00:39:26.600 I would say if you put me on the jury for the Rittenhouse thing, there's no way that he's getting convicted if you put me on the jury,
00:39:34.880 because I would be the one person who held out, even if the rest of them wanted conviction.
00:39:39.680 Don't you think there's a 100% chance that the defense attorney can get one of me on a jury?
00:39:47.440 Because I'm an absolutist on this.
00:39:50.860 I'm a complete absolutist.
00:39:52.500 I don't want to hear any nuance or anything.
00:39:54.580 I saw him getting attacked, and I watched him shoot,
00:39:58.380 and there doesn't seem to be any question about the video being accurate in this case.
00:40:02.620 If there was some question about the video being accurate,
00:40:06.300 you know, but that hasn't even been alleged.
00:40:08.900 It looks like everybody says what we saw is what happened.
00:40:12.820 In that case, I don't see any case.
00:40:16.800 I don't see any way he could possibly get convicted.
00:40:21.740 It just doesn't seem possible.
00:40:23.900 So, what's going to happen then?
00:40:25.400 There will be rioting, and people will get shot.
00:40:27.440 They didn't need to get shot.
00:40:28.440 So, all the anti-gun people will do exactly what they don't want to happen,
00:40:33.520 which is create situations where people will get shot.
00:40:37.140 So, I think Kyle Rittenhouse gets off, but let's give some balance to the story.
00:40:42.600 I can't be a good internet dad unless you know I can give some balance to the story.
00:40:46.940 I also would oppose, knowing what we know now, but it could change,
00:40:51.940 Alec Baldwin going to jail or, you know, having legal problems for what he did.
00:41:00.140 Now, let me be clear.
00:41:01.900 From a gun ownership perspective, he's 100% responsible.
00:41:06.060 Not 99.9, you know, not 98, 9.99, 100%.
00:41:11.680 That's just the standard we can't be flexible with that, right?
00:41:16.520 All gun owners, I think, are on the same page.
00:41:18.440 You just can't be flexible about that.
00:41:20.460 The person in their hand is responsible, but that's not a legal standard.
00:41:25.980 That's a gun owner absolute and needs to stay that way.
00:41:30.180 But from a legal perspective, he did reasonable things that reasonable people can do.
00:41:35.960 Not reasonable for a gun owner, but reasonable for a citizen.
00:41:40.060 You know, and I think that the courts probably should lean toward the reasonable citizen standard
00:41:48.580 and not the gun owner standard.
00:41:51.680 Because I don't think the gun owner standard is...
00:41:54.880 It's not designed for court purposes.
00:41:59.120 It's designed to create a behavior that's an absolute behavior about safety.
00:42:03.400 It's not really quite compatible with legal standards, in my opinion.
00:42:07.560 So, I don't think that Alec Baldwin should be faulted for the actual act of pointing the gun
00:42:17.240 and pulling the trigger.
00:42:19.580 Bad judgment? Yes.
00:42:22.220 Bad judgment? Yes.
00:42:24.320 From a gun owner perspective, very bad judgment.
00:42:26.880 But not beyond a legal standard, I don't think.
00:42:30.920 Because he did have processes in place to make that safe, he thought.
00:42:34.240 Now, as a producer who hired the crew, does he have some responsibility there?
00:42:41.120 I'd say yes.
00:42:42.660 And it looks like it's going to be his...
00:42:44.160 Somebody's insurance company is probably going to pay a lot of money, I guess.
00:42:48.060 I assume there's some insurance that covers this, either the production or his own.
00:42:52.740 So, well, see, manslaughter would require a reckless act.
00:43:00.380 And I don't think there's evidence of a reckless act except by the gun owner's standard,
00:43:07.040 which is way, way tighter than a legal standard should be.
00:43:11.640 So, can everybody deal with the nuance?
00:43:14.920 The nuance is that I agree with you, he's 100% responsible from a gun owner's perspective,
00:43:20.520 and it has to be that way.
00:43:22.360 But at the same time, from a legal perspective, it's got to be a little bit more commonsense-y.
00:43:27.400 That's the way you'd want to be judged, I think.
00:43:29.180 So, that's my opinion.
00:43:31.500 Neither Rittenhouse nor Baldwin should go to jail for anything they did.
00:43:37.120 We got an update on the...
00:43:38.840 Somebody told me I'm pronouncing the name of this place wrong.
00:43:42.340 Loudoun?
00:43:43.060 Ludon?
00:43:43.680 Ludon County?
00:43:45.640 Loudoun?
00:43:47.240 Can anybody give me the pronunciation?
00:43:50.220 But it was a story about the boy, or actually the non-binary student, I guess,
00:43:56.480 gender-fluid student who wore a skirt, allegedly, and attacked a girl in a restroom in school,
00:44:04.200 a girl's restroom, and had been accused credibly of a prior assault.
00:44:10.520 Now, when I heard the story, that everybody's concerned about some boy putting on a skirt
00:44:20.840 as an excuse to, like, go into the ladies' restroom and rape somebody,
00:44:27.500 I had said, really? Really?
00:44:28.400 And that happened exactly the way people were worried about it.
00:44:31.100 Well, and I said, what did I predict?
00:44:35.820 What I predict is, I don't think there's a skirt involved.
00:44:39.160 I think there's probably an assault.
00:44:41.040 I definitely...
00:44:41.560 I never doubted the assault.
00:44:43.380 But I wasn't so sure about the skirt problem.
00:44:47.400 The current reporting is skirt confirmed.
00:44:52.100 Skirt confirmed.
00:44:53.040 Andrew says, in all caps, Andrew, let me explain something to Andrew Richardson.
00:45:02.600 I'm telling you I was wrong.
00:45:05.540 You shouting, in all caps, doesn't make me more wrong, but it does make you an asshole.
00:45:12.600 So, if you want to just stick with you being cool, compared to me being wrong and, you know, and admitting it in public,
00:45:22.980 you know, keep your higher status.
00:45:25.800 Because the moment you yelled at me in caps, you were sort of worse than me.
00:45:33.380 That's right.
00:45:34.240 The whole point is you're supposed to be celebrating your victory, your intellectual victory over me.
00:45:40.840 Don't ruin it with all caps.
00:45:43.000 That's like surrendering.
00:45:45.060 Enjoy your victory.
00:45:46.520 Come on.
00:45:47.880 All right?
00:45:48.380 Now, do you remember what happened when I said Kamala Harris would get the nomination for the Dems,
00:45:55.380 and she didn't, and she dropped out?
00:45:57.760 Do you remember what I did?
00:45:59.460 I doubled down.
00:46:01.240 I said, I still think she's going to be the Dem nominee.
00:46:05.840 I can't say that she is yet, so I'm not going to say that that isn't accurate.
00:46:09.440 But it's close to accurate.
00:46:11.900 Pretty close, isn't it?
00:46:13.740 Do you remember the Vegas shooter, when all the smart people said it's ISIS, and I said, it's definitely not ISIS.
00:46:19.900 And then ISIS claimed responsibility.
00:46:23.000 Do you remember what I did?
00:46:23.860 I said, it's not ISIS.
00:46:26.780 And I was right.
00:46:28.760 So here's another one where I have been confirmed to you wrong.
00:46:32.020 That student was wearing a skirt and was in the ladies' restroom.
00:46:36.620 I'm going to double down and say that this story is going to fall apart in an important way.
00:46:46.080 Here's the way that I think it'll fall apart.
00:46:48.900 I think the skirt may or may not be true.
00:46:52.280 I'm willing to accept that there was a skirt involved.
00:46:55.300 But I don't think it had anything to do with the rape.
00:46:57.460 So I don't think that anybody's going to connect the two, which was more to the point, that connecting the two was the problem.
00:47:07.300 The idea that somebody wore a skirt, maybe with the purpose of getting into a situation where they could do some raping.
00:47:15.400 I don't think we're going to find out that it was ever a master scheme to get at the girls in the girls' restroom.
00:47:22.280 I feel like we're going to find out there's a disturbed person who's disturbed in a number of ways.
00:47:29.660 And that's it.
00:47:30.900 I think it's just going to be a disturbed person who's disturbed in a variety of ways.
00:47:35.480 And something bad happened.
00:47:38.280 But I feel as though the skirt is irrelevant to the story.
00:47:47.840 And that's what I'm going to stick with.
00:47:49.340 So I'll accept complete wrongness about the existence of a skirt.
00:47:55.620 Anybody like to yell at me in all caps?
00:47:57.540 This would be the time.
00:47:59.560 Actually, I'll give you an exemption.
00:48:01.860 I'll give you a 60-second exemption in which you may, in fact, you're encouraged, to gloat at my wrongness in all caps.
00:48:12.920 Enjoy yourself, please.
00:48:14.840 Please enjoy yourself.
00:48:17.140 Skirt lives matter.
00:48:20.060 Wrong.
00:48:20.600 All in caps.
00:48:21.320 Well done.
00:48:21.980 Thank you.
00:48:23.560 Freedom.
00:48:25.300 I like lamp.
00:48:26.660 Okay.
00:48:28.100 Amen.
00:48:28.900 All right.
00:48:29.440 So we're good on that.
00:48:32.220 Huma Abedin is coming out with a book.
00:48:34.560 And apparently one of the blockbuster things we're learning in this book is that she once was invited by a senator.
00:48:42.740 This was quite a while ago, years ago.
00:48:45.480 Invited by a senator that she'd been in some meeting with to come up to his place for some coffee.
00:48:51.180 She was a single woman.
00:48:53.440 It sounds like the senator was single.
00:48:55.540 It's not mentioned, but let's say he was.
00:48:58.780 And, well, actually, we don't know anything about the senator's status.
00:49:01.400 But she went up to the senator's place to have coffee.
00:49:04.660 The senator rolled up his sleeves, took off his blazer, made her some coffee.
00:49:10.180 And then the senator sat down on the couch, put his hands around her shoulder, and leaned in to kiss her and was pretty kind of aggressive about it.
00:49:19.100 And she said, no, no, no, sorry, I'm not here for that.
00:49:22.940 At which point the senator did what?
00:49:25.980 He apologized.
00:49:27.680 He was like, whoa, whoa, whoa, I read the signals wrong.
00:49:30.820 I'm very sorry.
00:49:33.100 And then she said, and she regrets it, that she apologized to him.
00:49:38.380 Because she was 24, I guess, and she was like, okay, you know, in retrospect, I should not have apologized to him.
00:49:44.700 It was he who did something.
00:49:46.280 What is your take on that story?
00:49:52.960 For those of you who might be female watching this now, let me, can I say something with one swear word?
00:50:03.320 Can you send people away for just one, just one?
00:50:08.060 Anybody?
00:50:08.740 I need permission.
00:50:09.880 Can somebody give me permission for one well-placed F word?
00:50:17.280 Anybody?
00:50:18.860 Okay, I got one.
00:50:20.920 I've got a little pushback here.
00:50:22.560 Somebody doesn't want it.
00:50:24.320 Okay, but the locals people are saying yes, so I think we're going to have to do it.
00:50:28.080 Let me give you some advice for the ladies.
00:50:30.280 Ladies, if a man invites you to his place, and it's just the two of you, for a beverage, be it coffee, be it alcohol, be it a very delicious vegetarian kind of a beverage.
00:50:49.080 Whatever that beverage is, if a man invites you, especially an attractive single woman, to his private place for that beverage, he has asked you if you'd like to fuck.
00:51:03.880 And you said yes, probably.
00:51:08.440 Not yes, definitely.
00:51:10.500 But when you said yes, I would like to go to your private place to have this beverage you've offered.
00:51:15.000 You, female listening to this, have told that man, I would absolutely be interested in fucking you, pretty much tonight.
00:51:25.980 Now, if you did not mean to send that message, here's why I'm informing you.
00:51:33.160 Because if you send the message, I pretty much am up to fuck, and the guy leans in to kiss you, he may be operating on bad information.
00:51:44.080 Our system doesn't make him, you know, off the hook.
00:51:51.300 You know, the man is still required to, you know, respond to the signals as soon as they are clarified, right?
00:51:58.300 We all agree to that.
00:52:01.740 Yeah, if it's a hotel room, it's pretty much guaranteed.
00:52:05.300 So, I guess this is a cautionary tale.
00:52:09.860 I suppose there could be such a thing as a 24-year-old who can be in a meeting with a senator
00:52:16.200 and is still too dumb to know what an invitation to come up to his place really means.
00:52:23.600 And the senator even said, oops, I read the signals wrong.
00:52:26.780 No, the senator didn't read the signals wrong.
00:52:30.020 The senator read the signals exactly as they were sent.
00:52:33.820 They were the wrong signals.
00:52:35.140 I think it was rather generous, very generous, for the senator to say he read the signals wrong.
00:52:44.020 He didn't read anything wrong.
00:52:46.120 The signals were crystal clear.
00:52:49.000 They were sent wrong.
00:52:51.060 So, he read it right, it was just sent wrong.
00:52:53.380 That's my opinion.
00:52:55.640 But I don't think anybody got hurt, which is good.
00:53:00.920 All right.
00:53:01.620 Here's another one.
00:53:05.400 I think we talked about that already.
00:53:07.940 All right.
00:53:08.260 I think I've got to go and do something else.
00:53:11.520 This probably wasn't my best show, but I think it relaxed you just the same.
00:53:15.960 I can feel your antibodies getting stronger even now.
00:53:20.660 Wow.
00:53:21.560 Yes, and apparently she only remembered her traumatic experience recently.
00:53:28.240 How about come up to my place to listen to records?
00:53:30.480 Yep, it all means the same thing.
00:53:35.980 Oh, here we go.
00:53:36.800 Here's some data on the PhD thing.
00:53:39.060 People with a master's degree had the least hesitancy, and the highest hesitancy was among those holding a PhD.
00:53:45.440 Hesitancy.
00:53:47.480 Hesitancy is fine, but I'm only going to doubt that the rate of vaccination is lower than other people.
00:53:56.020 Now, I don't think the PhD rate of vaccination would necessarily be higher than the master's degree is,
00:54:01.820 but I guarantee that the PhD vaccination rate is higher than non-college educated people.
00:54:10.300 Anybody?
00:54:11.100 Anybody?
00:54:11.820 All right.
00:54:12.560 That's all for now.
00:54:13.900 Talk to you later.
00:54:14.480 Here we go.