Real Coffee with Scott Adams - February 06, 2023


Episode 2011 Scott Adams: Killer Eggs, Romantic Robots, My Third Act, Madonna's Face, FBI vs America


Episode Stats

Length

1 hour and 13 minutes

Words per Minute

147.5

Word Count

10,820

Sentence Count

868

Misogynist Sentences

12

Hate Speech Sentences

27


Summary

What's wrong with Madonna? Is she on drugs? Is there something wrong with the Grammys? Or is it something else going on in the world? Find out on this morning's HAPPY BIRTHDAY WHO CARES!


Transcript

00:00:00.680 Good morning, everybody, and welcome to the highlight of civilization.
00:00:05.860 It's called Coffee with Scott Adams, and there's never been a better time.
00:00:10.200 If you'd like to upgrade your situation, all you need to make this moment one of the greatest
00:00:17.320 moments of your entire life, all you need is a cup or mug or a glass, a tank or a chalice
00:00:22.980 or a stein, a canteen jug or a flask, a vessel of any kind.
00:00:26.440 Fill it with your favorite liquid, I like coffee, and join me now for the unparalleled pleasure
00:00:34.120 of the dopamine of the day, the thing that makes everything better.
00:00:37.120 It's called the simultaneous sip, and it happens now.
00:00:39.840 Go.
00:00:44.960 Now, for you members of YouTube and anybody on the Locals platform who is just joining us,
00:00:52.740 listen, if you had been a Locals subscriber this morning, and if you had been early onto
00:01:01.000 the live stream, you would have had the pleasure of watching me destroy my printer on my hard
00:01:10.620 floor.
00:01:11.880 There's not much left to it, but I think you'd like to, well, I won't show you what's left,
00:01:16.060 but there's, I'm sort of sitting within a debris field here.
00:01:19.120 I'm trying to figure out, because I'm not wearing shoes, I'm trying to figure out how
00:01:26.280 to get out of the debris field when the live stream is over.
00:01:32.260 I think I'm going to have to climb over my desk to go get some shoes to go clean up the
00:01:37.020 debris.
00:01:38.100 Now, a lot of you are saying to yourself, Scott, do you have some kind of like terrible, I
00:01:44.760 don't know, some kind of anger problem?
00:01:48.440 Not anymore.
00:01:50.820 I did have an anger problem.
00:01:53.140 About 10 minutes ago, I had a hell of an anger problem.
00:01:56.580 Now I just have a broken printer problem.
00:02:00.660 But anger?
00:02:01.760 Totally gone.
00:02:03.360 Totally gone.
00:02:04.540 Now, a lot of people waste time going to anger management programs.
00:02:08.220 No, just destroy the thing that's bothering you.
00:02:12.820 Not humans.
00:02:14.560 Not humans.
00:02:16.120 Not dogs.
00:02:17.400 Not animals.
00:02:18.460 But if it's an inanimate object, at some point, the correct answer is to destroy it on your
00:02:24.520 hard floor.
00:02:26.240 I don't make the rules, that's just the only thing you can do that works.
00:02:30.680 So today I'll be working from my notes on my phone, which won't be nearly as dynamic,
00:02:35.640 but it's the best we can do for today.
00:02:38.640 I'll see if I can get myself a new printer.
00:02:41.600 The biggest story of the day is, what the hell's wrong with Madonna?
00:02:46.180 What the hell's wrong with Madonna?
00:02:49.960 What was the awards program that was on last night?
00:02:53.100 Was that the Grammys?
00:02:55.640 Was that the Grammys?
00:02:58.860 All right.
00:02:59.500 Am I the first one to make this joke?
00:03:01.480 If I'm the first one to make this joke, there's something wrong with the world.
00:03:08.440 And it goes like this.
00:03:11.120 Wind up, wind up, wind up.
00:03:16.120 It's called the Grammys.
00:03:19.340 Why else would Madonna be there?
00:03:22.120 It's called the Grammys.
00:03:25.100 All the grandmothers are there.
00:03:26.680 Now, did nobody else make that connection?
00:03:33.880 I can't be the first.
00:03:35.800 I can't be the first.
00:03:38.220 Yes, it's the grannies.
00:03:39.760 From the Grammys to the Grandmys award.
00:03:43.800 Some people said that the entire thing was satanic.
00:03:47.760 Did I miss something?
00:03:48.840 I saw on social media today people saying that artist Sam Smith did some kind of satanic presentation.
00:04:00.800 And then I saw their outfits.
00:04:02.920 And the outfits did look like they came from the basement of some pizza parlor, if you know what I mean.
00:04:11.640 And here's the thing. And then also, one of the big applause scenes was apparently for the first trans woman winner.
00:04:26.920 I saw it only on a clip.
00:04:28.460 That's what happened, right?
00:04:29.780 Was it the first trans woman who won a Grammy?
00:04:33.880 And there was lots of clapping and people were quite happy about that.
00:04:38.160 And I've got another story that's going to tie into that pretty well.
00:04:42.600 But here's my take.
00:04:44.920 To me it looks like Madonna's on drugs.
00:04:47.120 What do you think?
00:04:48.960 Is that what's going on?
00:04:53.140 Because it doesn't look like insanity.
00:04:55.800 It doesn't look like any kind of dementia.
00:04:58.480 And it doesn't look like somebody who's not on drugs.
00:05:01.820 So I think, I think the people who are making fun of her physical appearance are sort of missing the plot.
00:05:09.060 But the physical appearance is really an outcome of whatever else is going on.
00:05:16.940 And I feel just, what I see is someone who can't be helped.
00:05:22.180 Because she's a free person and she's Madonna.
00:05:27.660 So nobody's going to embarrass her, right?
00:05:30.040 She's, it's impossible to embarrass her.
00:05:32.520 Nobody's going to shame her.
00:05:33.640 Nobody's going to pressure her.
00:05:35.160 Because she's Madonna.
00:05:38.940 But somebody needs to.
00:05:40.700 This is one of those, this is one of those situations where it looks like, you know, we can't read minds.
00:05:46.600 And so we're speculating quite a bit here.
00:05:48.900 But it looks like she needs some help.
00:05:52.040 But she's in this extraordinarily unique situation where she can't be helped.
00:05:58.220 Because she's Madonna.
00:06:00.240 And I feel like that's the fatal flaw of a lot of the celebrities who end up dead from overdoses and whatnot.
00:06:07.260 That nobody can help them.
00:06:08.360 At some point, they're just unhelpable.
00:06:11.720 And one of the reasons that I'm tuned into this is that I'm the, let's say, I'm the poor man's version of this.
00:06:20.460 Who the hell is going to tell me what I should do when I'm 90 and I'm out of my freaking mind?
00:06:27.760 Who?
00:06:29.700 That's right.
00:06:30.500 Erica.
00:06:31.080 Erica will tell me.
00:06:32.580 But will I believe Erica?
00:06:33.980 Because I'll be out of my mind.
00:06:35.180 So it doesn't matter how well-meaning or smart or helpful somebody is if you're not going to listen to them.
00:06:42.700 What are the odds that I'm going to listen to somebody's advice when I'm 90?
00:06:47.340 Assuming I still have assets and I can still live independently, I'm not going to take anybody's advice.
00:06:54.600 I don't think.
00:06:56.300 And, you know, I think I would at the moment.
00:06:58.980 So at the moment, I still have enough of my faculties that I know, oh, that sounds like good advice, then I'll incorporate that.
00:07:06.560 But I don't think I'll be able to do that forever.
00:07:09.260 So what the hell is going to happen to people like me?
00:07:11.660 I'm going to have too much power relative to my sanity if I live long enough.
00:07:18.580 So it is something I literally wonder how that's going to work out.
00:07:23.220 All right, the other important issue of the day, there's nothing but important issues today.
00:07:28.460 These are the big ones.
00:07:30.760 Eggs.
00:07:32.280 Eggs, are they the healthiest food you can eat, a great source of protein, or are they serial killers?
00:07:39.680 And if you want to spell serial C-E-R, that would be sufficient pun for this morning.
00:07:47.020 That would be okay.
00:07:47.960 I'm not going to do that, but you can.
00:07:51.700 And so I did a Google search, because whatever Google decides to highlight feels like that's the official, what the government wants you to know.
00:08:04.040 Doesn't it feel like that?
00:08:05.980 Because we've seen how much the FBI and Congress can influence all the social medias.
00:08:12.400 So if Google summarizes one of the answers, and you know how Google summarizes some answers?
00:08:19.120 If it's something that people ask a lot, they'll put a little category with their own official summary of the answer.
00:08:26.420 Well, Google's official summary of the answer of whether eggs are dangerous for your health is that half an egg a day is associated with more deaths from heart disease, cancer, and all causes.
00:08:39.520 All causes. All causes.
00:08:44.200 If you eat half an egg a day, your odds of falling off a ladder apparently go through the roof.
00:08:50.560 If you eat half an egg a day, the odds of being a victim of a drive-by shooting, way up.
00:08:56.980 All causes. It does say all causes.
00:09:00.000 Your odds of dying on a trip to Mars, probably up as much as 24%.
00:09:06.160 That's just a guess. But it's way up.
00:09:08.120 So that's what Google says, and you can certainly believe Google, because they're going to use science and stuff.
00:09:15.240 So half an egg a day, it'll probably, it'll frickin' kill you.
00:09:19.380 Or, or, as Ivor Cummins sent me in response to this tweet,
00:09:26.860 the story of an 88-year-old man who ate 25 eggs a day for many, many years,
00:09:33.140 and they studied him, and they studied him, and he had normal cholesterol.
00:09:38.100 He had normal cholesterol.
00:09:40.340 He ate 25 eggs a day.
00:09:41.700 Now, I don't want to tell you that studying one 88-year-old man who eats 25 eggs a day is science.
00:09:52.040 It's not exactly science.
00:09:53.360 There could be individual differences, and, you know, maybe some people are just natural egg eaters or something.
00:09:58.540 But are you blown away by the fact that we don't know if eggs are good for you or bad for you?
00:10:07.820 Everything is just COVID shots.
00:10:12.560 Once you have that frame in your head of the pandemic and all the COVID mandates and stuff and the vaccines,
00:10:18.680 even the egg looks like a COVID vaccination to me.
00:10:22.900 Like, my head just can't get out of that model.
00:10:25.360 I'm stuck in that model.
00:10:26.960 So, I don't believe anything about eggs.
00:10:31.040 I don't know that they're good or bad.
00:10:33.380 I have no way to know.
00:10:34.960 Do you?
00:10:36.840 So what do you do?
00:10:38.000 Do you eat eggs and make yourself healthier,
00:10:40.660 or do you eat eggs and have a 24% greater chance of dying?
00:10:45.480 Or whatever it is.
00:10:46.300 How do you make that decision?
00:10:50.640 Well, I have a suggested possibility, but we'll get to that.
00:10:57.400 Here's what I think.
00:10:58.880 I think almost nobody eats an egg by itself.
00:11:02.680 Am I right?
00:11:04.760 The only person I know who eats an egg by itself is this 88-year-old guy who eats 25 of them a day.
00:11:11.120 Because I think that's probably all he eats.
00:11:12.860 But don't you eat it with toast?
00:11:19.520 Right?
00:11:20.360 I feel like the people who eat an egg a day are eating other stuff that has cholesterol in it.
00:11:26.540 So, it might be a little hard to tease out just the egg part.
00:11:31.720 You wonder if they did that right.
00:11:33.820 Anyway, I have nothing to say about eggs except we don't know.
00:11:36.460 The most important story of the day is that there's a company that's trying to roll out a chain of sex robot brothels.
00:11:46.240 That bears repeating.
00:11:48.560 Sex robot brothels.
00:11:51.760 Now, if you think, well, that'll never work, it already did.
00:11:55.840 And the reason that they want to do a chain of them is that the first one must have been a big hit.
00:12:02.280 If you're surprised by that, if you're surprised by that, you don't really understand men.
00:12:10.860 Let me just say that.
00:12:12.560 If there are any women who don't understand why some number of men are, you know, happily going to sex robots,
00:12:19.880 you really don't understand men.
00:12:21.260 Because it's very, very understandable.
00:12:25.700 If you take a heterosexual man and put him in prison, how fast does he become flexible?
00:12:32.800 Really fast.
00:12:34.640 When that person gets out of prison, are they heterosexual again?
00:12:38.620 Yeah, usually, immediately.
00:12:40.440 Yeah, right back to heterosexual.
00:12:42.580 So, it turns out that men are very flexible when they have that one need that needs to get met.
00:12:47.640 And so, they're trying to put one in Houston, which I think is the funniest thing.
00:12:54.440 Of all the places you would put your sex robot brothel, I think Houston is the funniest.
00:13:00.920 Now, what I love about this story is when people have to think too much.
00:13:06.240 That always gets funny.
00:13:07.180 And one of the things that people have to think about is that our laws do not make it, well, in most cases,
00:13:14.760 I think most states it's legal for a human to get busy with, let's say, a sex toy.
00:13:22.300 It just happens to look like a person and act like a person a little bit.
00:13:26.240 So, I think this is going to be a big hit if it's legal, but why, let me ask you,
00:13:36.800 do you think it should be illegal for a human to have sex with a robot?
00:13:42.560 How many think that should be illegal?
00:13:49.660 Mostly say no, it should not be.
00:13:51.740 What happens if it eliminates reproductive processes?
00:13:59.440 Would you change your mind?
00:14:01.800 See, I think we're going to get pushed into a situation where our reproduction rate is so low
00:14:06.200 that some politician is going to say,
00:14:09.100 huh, maybe part of the reason the reproduction rate is too low is because of the alternatives.
00:14:14.620 These sex robots are too good.
00:14:16.640 We're going to have to get rid of the sex robots so that people want to reproduce.
00:14:21.740 That's going to happen.
00:14:23.380 That's definitely going to happen.
00:14:25.600 All right.
00:14:26.640 But here's my prediction.
00:14:29.780 I was reading, even today, the article that was about the sex robot brothels.
00:14:35.060 You would not be surprised to hear that the journalist, and I won't even name him,
00:14:39.360 the journalist seemed to be sneering in his attitude about the people who would be robosexuals.
00:14:48.620 Can we call them robosexuals?
00:14:51.740 People who like robots as their sexual preference?
00:14:54.840 Robosexuals.
00:14:55.680 I'm going to coin the term if nobody's already done it.
00:14:59.300 Robosexuals.
00:15:00.600 So here's my prediction.
00:15:04.560 Robosexuals are now the one sexual preference category that you can,
00:15:09.180 well, I think maybe the furries, people still make fun of.
00:15:11.960 But there are some groups you can still make fun of.
00:15:16.140 And so articles about the sex robot brothels, definitely you're going to throw some shade on the customers, don't you think?
00:15:24.420 Don't you think the customers are going to get a little shade every time they talk about it?
00:15:28.100 Yes, they will.
00:15:28.580 And do you think there will be a point where those customers organize and insist that they add the R for robosexual to the LGBTQ-plus blockchain?
00:15:40.640 Because now the whole LGBTQ thing has become like a blockchain, where you never destroy the history, so it'll always be at least LGBTQ, but now we're adding letters to it as we find more discriminated classes.
00:15:56.980 And I think the robosexuals are going to have to add an R.
00:16:00.920 So someday, God willing, the Grammys will not just be about your grandmother, but will also be a robosexual winning an award, and people will cheer.
00:16:14.980 And they'll say, you are so brave.
00:16:17.320 You're so brave to have sex with robots and be out of the closet.
00:16:23.980 Would that be the right term?
00:16:26.980 Yeah.
00:16:29.060 So that's coming.
00:16:32.000 The GOP apparently has decided, in the gun control debate they're having right now, that many of them will wear an AR-15 pin instead of the American flag pin, which they often wear.
00:16:47.480 The Republicans.
00:16:49.160 Now, what do you think of that?
00:16:50.840 Is that a good play?
00:16:52.240 Because the AR-15 pin looks like the gun itself.
00:16:55.580 It's a little pin they have on the lapel.
00:16:58.700 So a number of them are wearing it.
00:17:00.000 I don't know what percentage are wearing it.
00:17:01.840 Good idea or a bad idea?
00:17:05.660 All right.
00:17:06.340 Here's how you analyze it from a persuasion perspective.
00:17:10.340 You ask yourself, what's the upside?
00:17:12.600 You ask yourself, what's the downside?
00:17:13.940 How many extra votes will the Republicans get because they wore a gun on their lapel?
00:17:21.200 Extra votes?
00:17:22.820 Zero.
00:17:23.520 Right.
00:17:23.980 Right.
00:17:24.360 And we can be pretty confident of that.
00:17:27.280 You know, no matter how low your opinion of people in Congress, I believe, because I have faith in them,
00:17:34.920 almost nobody has ever based a vote on somebody's lapel pin.
00:17:38.000 So would you agree there's no upside?
00:17:41.880 Would you agree?
00:17:42.540 No upside.
00:17:44.040 Let's talk about the downside.
00:17:46.160 It created a whole news cycle which makes it look like the Republicans' real issue is love of guns as opposed to love of the Constitution and America.
00:17:58.280 The flag pin says, I like everything America stands for, and that includes the Second Amendment prominently.
00:18:06.920 Right.
00:18:07.300 The American flag is the exact logo for the Second Amendment.
00:18:14.420 It's not even off by, like, 1%.
00:18:17.260 It's right on target.
00:18:20.560 It's the exact message.
00:18:22.680 It's America.
00:18:23.860 We do this.
00:18:24.800 As soon as you make it about the gun, you lost the argument.
00:18:31.720 You come to me and say, I want guns to be legal because I love my gun so much I'm wearing it as clothing.
00:18:38.360 I will say, nope, nope.
00:18:40.680 You brought me no argument.
00:18:42.980 My argument wins.
00:18:44.520 But if you bring me the American flag, you high-grounded me.
00:18:48.960 Right?
00:18:49.140 That's the high ground.
00:18:49.900 You can't get higher than, you know, nationalism within the country.
00:18:54.800 Okay, so here's my prediction, which can be, I guess, falsified.
00:19:03.160 I don't know the answer to this, and you might already.
00:19:07.100 So I promise you I don't already know the answer.
00:19:10.120 It might be a public record, but I haven't seen it.
00:19:12.280 I promise you I don't know the answer.
00:19:13.900 I predict that Matt Gaetz will not wear that lapel pin.
00:19:21.580 And the prediction is based on the fact that he's playing at a higher level of understanding of persuasion.
00:19:28.120 I cannot wrap my head around the possibility that Matt Gaetz would be dumb enough to wear the lapel pin instead of the American flag pin, which is obviously the right answer.
00:19:41.160 Now, somebody go check that, because probably we'll know.
00:19:45.480 You know, the news will probably cover who wore it and who didn't.
00:19:48.920 But my prediction is he's operating at a higher level and would not fall for that ridiculous, bad imagery.
00:19:56.860 Let's see.
00:19:57.920 So now you have something you can test and call me wrong about.
00:20:03.140 Let's talk about Jonathan Turley, who you should be following.
00:20:08.820 He's talking about how the House Select Committee is holding its first hearing.
00:20:12.900 So he's got an article in The Hill and also on his own website.
00:20:15.700 So the Republicans now, with their new majority in the House, are going to be looking into the weaponization of government, as they like to say.
00:20:27.060 And it's a variety of things, but one of the things we're looking into is the FBI influencing social media.
00:20:34.300 And we know at this point, we know that the FBI really, really have their finger all over Twitter, possibly legally.
00:20:46.680 In fact, probably.
00:20:48.420 Probably legally.
00:20:49.940 In the sense that they do have free speech.
00:20:52.360 They can talk to anybody they want.
00:20:54.280 They can express any opinion they want.
00:20:57.400 But we don't like it.
00:20:59.380 You know, if all of that was legal, and I think it might have been.
00:21:05.320 It might have all been legal.
00:21:06.300 Well, the question is, should it be?
00:21:09.840 And should we allow it?
00:21:11.460 Right?
00:21:11.720 And that's a different question.
00:21:15.720 But one of the things that Turley said in his article, I had been struggling to communicate, and I'd failed completely.
00:21:26.280 And he nails it, which is why you should follow him.
00:21:28.460 Right?
00:21:28.840 He has a uniquely good ability to write and just put things into simple terms.
00:21:35.460 All right.
00:21:35.660 So here's his point.
00:21:37.380 He said, for years, many politicians and pundits have dismissed free speech concerns.
00:21:42.380 I was thinking of this FBI-talking-to-Twitter thing.
00:21:46.920 They've dismissed free speech concerns by noting that the First Amendment only applies to the government.
00:21:52.480 So long as corporations do the censoring they contend, it is not a free speech problem.
00:21:57.600 And then Turley says, this obviously is wrong on several fronts.
00:22:04.200 Now, here he does a better job of making this point than I have.
00:22:08.920 He said, the First Amendment is not the exclusive measure of free speech.
00:22:13.980 Corporate censorship of political commentaries or news stories are denials of free speech that harm our democratic system.
00:22:20.740 And I'm thinking, why did I struggle so much to say this simple thing?
00:22:25.740 Like, this is exactly my point, which I've never said clearly, which is that the First Amendment is not the only measure of free speech.
00:22:39.020 Because free speech is not just a thing we put in the Constitution.
00:22:44.640 Free speech is why we're here.
00:22:47.700 It's not just the Constitution.
00:22:50.400 It's the whole game.
00:22:53.600 The whole game is free speech.
00:22:57.340 Everything we do that's right comes from free speech.
00:23:01.340 Like, you couldn't have the free market.
00:23:03.400 You couldn't have anything.
00:23:05.760 So, I don't know why it took Turley to explain my own opinion to me.
00:23:11.220 But when you see it, this one sentence is just perfect.
00:23:14.760 I keep reading it over and over to myself.
00:23:17.140 The First Amendment is not the exclusive measure of free speech.
00:23:20.760 Just remember that sentence.
00:23:24.140 How well constructed that one sentence is.
00:23:26.760 It's not the...
00:23:28.220 Because if he said it almost any other way, it wouldn't have hit as well.
00:23:33.680 So, that's a really high level of writing skill.
00:23:36.880 All right.
00:23:37.080 That's why you should follow him.
00:23:38.340 Even if you didn't like his opinions, you should follow him as a writer.
00:23:41.440 All right.
00:23:43.360 Turkey had a...
00:23:44.360 Just...
00:23:45.040 We're going to move...
00:23:47.040 The topic is going from Turley to Turkey.
00:23:50.420 Turley to Turkey.
00:23:52.340 Turkey, the country, has had an enormous earthquake.
00:23:56.940 7.8 magnitude.
00:23:58.760 And it's pretty bad.
00:24:02.980 The pictures from over there are just insanely bad.
00:24:05.720 But one picture really caught my attention, which is somehow there are a bunch of smartphone videos of a 10-story building that was...
00:24:18.920 It looked like it was completely intact and then just dissolved into the, you know, into the basement.
00:24:24.840 It just went...
00:24:25.540 And they got that from the moment before it happened until it was gone.
00:24:30.420 It was all on video.
00:24:31.400 And I looked at that and I said to myself, now, somehow they probably knew it was weakened.
00:24:37.280 They might have known it was weakened by the earthquake.
00:24:40.200 But I didn't see any obvious signs of weakness.
00:24:42.860 It looked intact until it just disintegrated.
00:24:46.580 And I'll tell you, I have never been more thankful for government building codes.
00:24:55.900 Am I right?
00:24:57.620 Like, looking for the, you know, the positive in the tragedy here.
00:25:00.540 That building would not have collapsed in America.
00:25:05.020 There are probably almost no buildings left in an earthquake zone in America that would have collapsed like that.
00:25:13.600 And that wasn't the oldest building.
00:25:15.760 It's not like it was an ancient building.
00:25:17.980 It looks like it was, you know, it might have been 40, 50 years old, but it was, you know, in modern times.
00:25:23.480 And Haiti was the same thing, yeah.
00:25:26.320 Yeah, every once in a while you just have to say, yeah, your government may be doing a lot of terrible things, and they are.
00:25:35.060 But U.S. building codes have kept us safer.
00:25:38.640 You know, I think, I believe that we should have a federal building code.
00:25:43.260 And it should be optional.
00:25:44.640 I think states should have their own building code, but there should be a higher level one, a federal one.
00:25:50.800 And the federal one should allow for more experimental stuff.
00:25:54.880 You know, people want to experiment a little bit, but the local zoning codes are a little restrictive.
00:25:59.140 If you had one good, tight federal one, so somebody could say, look, I'm either going to follow all the local ones, or I have an exception, so the exception will allow me to do just the federal one.
00:26:12.640 Now, the federal one would still have all the good stuff in it.
00:26:15.880 It just wouldn't be stupid, right?
00:26:18.380 It'd just allow you a little more flexibility.
00:26:20.420 But you'd still have to, you know, earthquake-proof it and, you know, do your best to make it safe.
00:26:25.500 I think that's what we need.
00:26:26.640 But it's hard to sell because, you know, it just looks like it's regulations on top of regulations, when in fact it would be the opposite.
00:26:35.660 All right, so I saw a tweet this morning from Senator John Cornyn about the health impacts of weed.
00:26:43.060 And his tweet said, pot is making people sick.
00:26:46.240 Congress is playing catch-up.
00:26:47.680 So there's, you know, there continue to be more stories about the good and bad of weed, legalization and use.
00:26:59.760 And here's the problem.
00:27:02.200 We can't really measure the risk of either the benefits or the costs.
00:27:08.200 Right?
00:27:09.120 We actually don't know.
00:27:10.400 So, and the reason we don't know is that the tests seem to be all over the place.
00:27:18.200 You know, there'll be a test showing that people are getting more sickness, but then somebody will check the mortality and they find it's the same.
00:27:27.320 You know, could it be true that more people who smoke weed get emphysema, and yet the death rate is the same as everybody else?
00:27:35.260 Could be.
00:27:35.680 It could be because maybe weed is giving some benefits to some types of people while being worse for other types of people.
00:27:42.940 Maybe it sort of balances them.
00:27:44.700 Now, here's the biggest thing.
00:27:47.400 We, we as a human species, we like to treat mental health and physical health like they're two separate things.
00:27:56.440 But they're not.
00:27:58.100 You know, if you have bad mental health, you're just as unhappy, just as crippled as if you had some physical problem.
00:28:04.240 Or it could be.
00:28:06.140 So, if you, if you look at mental health and physical health as one ball, how do you measure the people who are benefiting from weed?
00:28:17.380 Because surely there's, you know, surely some people are worse off smoking marijuana.
00:28:23.440 I think that's just a given.
00:28:25.320 Because you can just observe it.
00:28:27.000 I mean, you don't need it.
00:28:27.720 You don't need any science for that, do you?
00:28:29.680 Would everybody agree with that statement?
00:28:31.640 Some people are worse off with weed.
00:28:36.520 I think, I think that's just guaranteed.
00:28:39.860 Would you also accept the possibility that some people, and it might be a very small set of people, some people are better off.
00:28:47.640 Let's say some people, you know, they don't have to worry about their career for whatever reason.
00:28:53.200 Maybe they're independently wealthy or they're an artist like me.
00:28:56.760 And they, you know, don't operate heavy equipment and their family doesn't care and it's legal in their state.
00:29:04.500 And so you can imagine, you can imagine a special case where somebody's better off, you know, mentally especially.
00:29:13.820 And you can imagine a case where people are worse off.
00:29:17.620 But how, how would you, so how do you make a law?
00:29:21.820 This is the same problem that I note with guns.
00:29:24.780 When we, when we have the gun control debate, it's all stupid.
00:29:30.420 It's just all stupid.
00:29:32.920 Here's the only thing you need to know about guns.
00:29:35.760 They make some people safer and they make some people less safe.
00:29:42.000 Does anybody disagree with that?
00:29:44.540 Yeah.
00:29:45.240 And, and we're all lying when we talk about guns, unless you say that.
00:29:50.740 Unless you say what I just said, you're just lying.
00:29:52.760 Here's the lie. There are two conditions: either guns are legal or they're restricted.
00:30:00.600 The lie is that one of those is the better situation.
00:30:03.940 It's not.
00:30:05.140 It's better for some people.
00:30:06.680 It's worse for other people.
00:30:08.480 You can't, there's no way around that.
00:30:11.080 No way around it.
00:30:12.280 Now, I understand the arguments that even if it's bad for people, you know, you want that right because it protects the country and whatnot.
00:30:19.960 But you need to be at least honest about it.
00:30:23.640 If you can't say directly, you know, gun, gun freedom is good for some people and it's definitely bad for other people.
00:30:32.860 But there might be one argument where it's good for everybody.
00:30:37.260 And this is one where we all go stupid or, or we go Democrat.
00:30:41.400 It's the same thing in this case.
00:30:43.660 The Democrats are just stupid about guns because they don't understand how they would have any impact in keeping America free.
00:30:50.400 With my audience, I don't even need to explain that, do I?
00:30:56.580 Because, because they always say, well, what's your AR-15 going to do against my, you know, my jets and my nuclear weapons?
00:31:03.900 And, you know, I'm not going to go through the whole explanation again, how those guns would not be used against your nuclear weapons.
00:31:10.080 They would be used against your police force.
00:31:13.680 Ask the police how well they do in a fight against their own citizens.
00:31:17.480 How well do you think they do?
00:31:20.520 Because, because you can't have a, you know, big old fascist organization unless you own the police.
00:31:27.260 Owning the military doesn't get it done.
00:31:29.620 You have to own the police.
00:31:31.440 And the police are outgunned.
00:31:33.660 We outgun the police, I don't know, 10 to 1.
00:31:36.840 If the police decided to join the fascists,
00:31:42.000 the citizens would surround the police department and light it up.
00:31:47.480 It would last an hour.
00:31:51.020 And you would have local control because you would have wiped out the police.
00:31:55.140 Assuming the police became fascist somewhat instantly.
00:31:58.420 And I don't think the police would become the fascists.
00:32:00.800 I think the police would join the citizens.
00:32:03.740 That's what I think.
00:32:05.080 You know, I don't, I don't think our police are, are like, you know, this close to becoming fascists.
00:32:09.660 I think they're this close to becoming on the side of Americans.
00:32:15.720 Because that's, you know, largely that's how they were trained.
00:32:19.360 So anyway, the gun control debate, I consider it a phony debate because the things we talk about are just pure bullshit.
00:32:26.140 You know, it's just people not understanding how guns would protect the country.
00:32:31.900 And people not understanding that the guns are good for some people, definitely, on average, right?
00:32:37.920 And bad for some people, definitely, definitely.
00:32:41.380 So if you know it's good for some people and bad for some people, how do you make the decision?
00:32:47.660 The way I would make the decision in this context, because you can't weigh those two things and it's different groups, etc.,
00:32:54.040 is I would, I would go to the higher level of liberty, exactly.
00:32:58.580 Yeah, the tiebreaker is liberty.
00:33:01.140 And I think, you know, I wouldn't say that that's based on logic or science,
00:33:05.940 but it's certainly based on a preference for how civilization should be organized.
00:33:14.440 My preference is if you can't agree on the details, you default to liberty.
00:33:20.180 Does that make sense?
00:33:21.580 If you can't agree on the cost-benefit, default to freedom.
00:33:27.260 That feels like the process, it just makes more sense.
00:33:32.340 Yeah.
00:33:32.460 But you won't see that argument, because it makes sense.
00:33:37.080 So when I talked about the documentary effect, where all documentaries are persuasive, even if they shouldn't be,
00:33:43.660 Elon Musk commented on it and agreed with, you know, that being a risk in other things as well.
00:33:50.860 And I got two million views on that tweet.
00:33:55.040 So if you wondered what does an Elon Musk comment on your tweet do for you, two million views.
00:34:06.660 Now, I just have to say something about this whole Internet Dad thing.
00:34:13.420 You know, I've talked before, and again, Internet Dad is, you know, is inclusive of all genders and non-binaries and everything.
00:34:20.820 So the dad thing is just sort of a shorthand for an adult who is taking some responsibility, I guess.
00:34:28.640 Now, I think that this whole dad effect is the biggest unreported power shift in the country.
00:34:38.240 The number of kids who were influenced by Andrew Tate is huge.
00:34:47.000 It's huge.
00:34:47.680 Whatever you want to say about him, his influence was huge.
00:34:50.980 You know, Jordan Peterson, his influence, huge.
00:34:55.080 Huge.
00:34:55.840 Elon Musk, his influence, huge.
00:34:59.180 My influence, you can decide for yourself.
00:35:01.820 But in this case, because one Internet Dad agreed with the point, two million people saw it, and it's two million people who would follow Elon Musk.
00:35:14.980 So it's, you know, a certain type of audience, let's say more informed, the kind of people who run the country.
00:35:23.200 So every time you see this effect, just watch how it's getting bigger.
00:35:29.120 And I would also argue that we're either at or near the point where if the people I call the Internet Dads, and you know who they are, right?
00:35:39.000 You can make your own list of the people who are somewhat independent, and they talk about political stuff and social stuff.
00:35:46.180 And they're not going to just come down on one side all the time.
00:35:50.960 They're going to independently think about it.
00:35:52.940 And I think that group is just getting more and more powerful because other groups are harder to trust.
00:36:01.000 The Internet Dads, I honestly think, are mostly in it for the right reason.
00:36:07.360 Would you agree?
00:36:09.360 Would you agree that, you know, make your own list, mental list of who the Internet Dads are.
00:36:13.900 And again, you could add women and LGBTQ and robosexuals and non-binaries.
00:36:21.020 By the way, why do the non-binaries not get an N?
00:36:24.600 Shouldn't LGBTQ have an N for non-binary?
00:36:30.380 It's got a plus.
00:36:32.320 Or maybe they already added it.
00:36:33.560 I don't know.
00:36:33.940 It's hard to keep up.
00:36:35.000 Yeah, Joe Rogan would be one of the Internet Dads.
00:36:39.460 So keep an eye on those Internet Dads.
00:36:41.760 All right, we have reached Act 3 in Scott's public performance.
00:36:48.840 Do you know what I'm talking about?
00:36:51.400 Does everybody know what public performance I've been putting on for the last, I don't know, few weeks?
00:36:57.400 So it's about the pandemic and the so-called COVID shots.
00:37:02.660 And we've boiled it down to the third act.
00:37:07.760 So in the third act, Scott is wildly criticized by all of his critics.
00:37:15.880 The critics have been proven completely right.
00:37:19.680 Scott has been proven completely wrong.
00:37:22.900 And now he must pay.
00:37:25.840 He must pay.
00:37:26.940 So that's the third act.
00:37:29.560 Now, in a movie, the third act is where the protagonist, the subject of the movie,
00:37:35.180 is in such a problem that that person cannot possibly get out of that problem.
00:37:41.240 The viewer of this movie says, ah, there's no way to get out of that.
00:37:44.920 You're dead.
00:37:46.280 But then, as if by magic, the protagonist escapes.
00:37:50.740 We have reached Act 3.
00:37:54.760 I promised you it was coming.
00:37:57.020 I told you there was something coming.
00:37:59.020 See, you were just getting all mad because you were watching the first part of the movie.
00:38:02.500 And you didn't know it was a movie.
00:38:04.700 If it's just the first part, that would just make you mad.
00:38:08.200 There's no relief.
00:38:09.820 But we're reaching the part where I try to escape from the third act trap.
00:38:15.520 And comic Dave Smith, who may or may not have some disagreements with me on long COVID,
00:38:26.160 I asked him how he calculated the risk of long COVID.
00:38:30.620 And by the way, that is the solution to the third act.
00:38:36.020 It's one question.
00:38:38.220 How did you calculate the risk of long COVID?
00:38:43.480 And the question that we're trying to settle is,
00:38:49.180 did the people who got the right answer about the COVID shots,
00:38:53.240 did they use a better process, a rational process that is superior to mine?
00:38:59.260 If they did, then we should learn from them, especially I should learn from them.
00:39:03.880 If they did not, then it was guessing.
00:39:07.580 And I characterized it as guessing,
00:39:09.580 and then allowed people to defend themselves.
00:39:14.240 And after they defended themselves, I followed up with one question.
00:39:19.540 How did you calculate the odds of long COVID?
00:39:23.000 So, comic Dave Smith had a podcast, which I tweeted, and you can search for it.
00:39:29.400 Just look for Dave Smith Podcast,
00:39:31.440 and you could add my name if you want to see the episode.
00:39:33.720 I don't know why people still say, here's the URL.
00:39:40.980 Isn't it easier just to say, just Google these terms, if it works every time?
00:39:46.020 I mean, if you Google comic Dave Smith and Scott Adams,
00:39:51.500 the video pops right up.
00:39:54.240 So do that.
00:39:54.820 But nine minutes in, he and his co-host talk about how they handled the long COVID calculation.
00:40:05.620 Now, here's the setup.
00:40:07.780 If they used a rational process to look at all the risks and all the upside and downside,
00:40:14.300 even if they calculated it wrong, I would say that's a rational process.
00:40:21.060 Right?
00:40:21.360 Because you could make a, you know, you could have bad data or something.
00:40:25.000 But is it a rational process?
00:40:26.600 And my line for rational is if you simply considered all of the costs and all the benefits.
00:40:33.960 If there was a big cost or a big benefit that you just ignored, I would call that guessing.
00:40:42.400 In other words, if you thought, if you pretended you were doing a rational process,
00:40:46.200 but one of the biggest parts of the decision you just ignored, that's just guessing.
00:40:52.400 It's just you think it's not, but it is.
00:40:54.780 Because you're guessing that the biggest variable doesn't matter.
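To make that distinction concrete, here is a minimal Python sketch of the point being made: a cost-benefit tally that silently drops one large term still produces a number, but the number is a laundered guess. Every probability below is a hypothetical placeholder, not a real estimate.

```python
# A minimal sketch of "ignoring the biggest variable." All probabilities
# here are invented placeholders -- the point in the transcript is that
# the real values were unknown at decision time.

risks_of_declining = {
    "severe COVID": 1 / 1_000,   # hypothetical
    "long COVID":   1 / 500,     # hypothetical; the term that gets dropped
}

def expected_harm(risks):
    """Stand-in for a cost-benefit tally: sum the per-outcome probabilities."""
    return sum(risks.values())

full_tally = expected_harm(risks_of_declining)
dropped = expected_harm(
    {k: v for k, v in risks_of_declining.items() if k != "long COVID"}
)

print(f"all terms counted:  {full_tally:.4f}")  # 0.0030
print(f"long COVID ignored: {dropped:.4f}")     # 0.0010 -- looks 3x safer, by construction
```

The sketch only shows the shape of the error: whatever the true numbers are, leaving out a term of unknown size makes the remaining total look like an answer when it isn't one.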
00:40:59.280 So, comic Dave Smith and his co-host actually argued that the only thing you needed to know
00:41:05.380 was that the government was completely, let's say, not trustworthy.
00:41:12.660 To which I said, wait a minute.
00:41:15.600 If you're only looking at if the government is not trustworthy,
00:41:19.260 you would have to say the government has never made a correct decision
00:41:22.980 and there's never been a pharmaceutical drug they approved that was good for you.
00:41:28.960 Would anybody say that?
00:41:30.940 Would anybody say that every drug approved by the government is bad for you?
00:41:35.480 There's some chance that the government approves something that's good for you, right?
00:41:39.760 Is it zero?
00:41:40.400 I completely buy into the, don't trust the government when they're rushed.
00:41:46.280 Don't trust the government when lots of money is involved.
00:41:49.160 Don't trust them when they show you their work and they say, it looks sketchy.
00:41:53.080 Don't trust the government when they say, we'll let big pharma hide their data for 75 years.
00:41:59.080 Don't trust the government when they say, it usually takes 5 to 10 years to know,
00:42:04.100 but this time we did it in 6 months or a year or whatever, right?
00:42:07.440 So, Dave goes through all those reasons why you should not trust the government.
00:42:11.720 Do I agree with him?
00:42:13.700 Yes.
00:42:14.660 Yes.
00:42:15.860 Totally on the same page.
00:42:17.180 And here's how you know that people are hallucinating about me.
00:42:25.580 Who in the world would believe that I trusted scientists or the government?
00:42:31.020 Nobody did.
00:42:32.280 Did they?
00:42:33.000 I don't think anybody did.
00:42:34.840 They just took their best shot based on what they knew and what they didn't know.
00:42:38.680 So, comic Dave Smith admits that he looked at the distrust of government as his primary decision-making thing.
00:42:47.860 And when it came to long COVID, he said he believes it's a very low risk and impossible to calculate.
00:42:55.920 So, what do you do when you believe something's a very low risk?
00:43:00.280 That was based on analysis?
00:43:03.980 No, that's a guess.
00:43:05.280 He basically argued in public that he couldn't know one of the biggest factors, so he acted like it doesn't exist.
00:43:16.200 And he said he thought the risk was similar to getting hit by lightning.
00:43:21.160 And if something is a very low risk, if something's a very low risk, it's reasonable to ignore it.
00:43:28.320 Do you know what else is a very low risk?
00:43:32.540 Vaccination injury.
00:43:33.460 Vaccination injury.
00:43:34.460 Vaccination injury.
00:43:36.480 Long COVID.
00:43:38.260 All of it.
00:43:40.000 The entire domain is low risk stuff.
00:43:43.960 So, we're trying to compare a low risk of vaccine injury to a low risk of COVID injury to a low risk of COVID death.
00:43:53.560 Right?
00:43:54.320 So, his argument is that one of them is low risk, so you can't count it.
00:43:58.840 Is that analysis or is that guessing?
00:44:04.700 To me, that's just a laundered guess.
00:44:08.260 Right?
00:44:08.460 That's just a guess, because you don't know, so you just guessed.
00:44:12.140 And I think he says it directly.
00:44:16.060 That's the third act.
00:44:17.100 Now, do you think he would be the only person to take the same tack to explain that the government should not have been trusted, and somehow, and here's the logic gap, somehow, I'm an educated, well-informed person.
00:44:33.060 And I'm the only person in the world who didn't understand that the government shouldn't be completely trusted in this context.
00:44:41.120 Who would believe that?
00:44:44.500 Who would believe that I'm an educated, well-informed person, but I'm the only person in the country who missed the fact when every day I talk about the government lying to us?
00:44:56.420 Every day.
00:44:59.040 Every day.
00:45:00.720 For years.
00:45:02.180 For about an hour a day, I spend some of that hour talking about how the government can't be trusted.
00:45:08.080 Do I not?
00:45:11.900 You tell me.
00:45:12.980 Do I not spend every day in public telling you the government can't be trusted?
00:45:18.620 And do I say that data on studies can totally be trusted?
00:45:24.020 No.
00:45:24.880 Every day.
00:45:26.060 For years.
00:45:27.580 I've told you that you can't trust any of the data about the pandemic.
00:45:30.960 Or much of anything else.
00:45:32.180 Right?
00:45:33.380 But being the most skeptical person you've ever met,
00:45:36.500 comic Dave Smith's best theory about why my opinion differs from him is that I didn't pay attention to anything that happened in the last two years.
00:45:48.420 That somehow I'd missed it all.
00:45:49.800 That's his working assumption, is that I missed all of it, uniquely.
00:45:57.600 And that if I'd paid more attention to one side of the argument, just one side of it, I could have gotten the right answer like he did.
00:46:06.100 But what I did was make the terrible mistake, apparently, of looking at all the risks and realizing that they were incalculable.
00:46:13.660 And therefore, if you can't calculate the risks, you're guessing.
00:46:22.540 Comic Dave Smith says, if you can't calculate the risks, you're making a good decision by ignoring them.
00:46:32.660 Now, here's something that I should say more often.
00:46:41.220 If you're an expert, you know, let's say you have some experience in decision making,
00:46:45.680 you would say that looks crazy and irrational, right?
00:46:50.520 And I would guess that 100% of the engineers are saying that right now.
00:46:54.660 But here's something you should keep in mind.
00:46:59.000 Rational thinking is not a natural ability.
00:47:01.160 It's a learned skill.
00:47:05.340 Scientists are not born scientists.
00:47:07.440 They learn technique.
00:47:10.280 I, too, have learned how to compare things without leaving stuff out.
00:47:14.720 I've spent years, you know, of both educational attainment and corporate experience, learning what not to leave out.
00:47:23.880 So there's nothing wrong with comic Dave Smith's brain.
00:47:28.100 Let me be as clear as possible.
00:47:29.600 He's a real smart guy.
00:47:31.160 He's a very smart guy who apparently cares a great deal about his country and the people in it.
00:47:37.740 That's all good.
00:47:39.300 That's all good.
00:47:41.080 You want more Dave Smiths.
00:47:43.720 Can we be clear about that?
00:47:45.740 The more Dave Smiths you have, the better the country is.
00:47:48.980 I said the same thing about, you know, Brett and Heather.
00:47:51.520 The more of them, the better.
00:47:52.580 So I don't have to agree with them to say that they're adding value.
00:47:56.800 They add a lot of value.
00:47:58.580 They add value by disagreeing with me.
00:48:01.280 I think their disagreement with me enhances all of your knowledge.
00:48:06.620 Because it, you know, inspires me to make my argument better, right?
00:48:11.420 So this is all good.
00:48:13.660 You're actually seeing the best part of America right now.
00:48:17.680 You're seeing free speech.
00:48:19.920 Nobody stopped, you know, Adam from saying, I'm sorry, Dave Smith from saying what he was saying.
00:48:27.740 Nobody stopped me from saying what I was saying.
00:48:30.340 And then you got to see what we both said.
00:48:32.960 And then you can decide.
00:48:34.760 That's as good as it gets.
00:48:36.920 You're seeing the best.
00:48:39.360 All right.
00:48:39.720 And there was another article that made exactly the same point, that you can ignore the biggest risk if you can't calculate it.
00:48:48.340 People are saying that out loud.
00:48:51.140 But you won't see any engineers say it.
00:48:53.100 You will see journalists, comedian, and entertainers say it.
00:48:59.240 But you won't see any engineers say it.
00:49:01.420 Here's what an engineer would say.
00:49:04.000 Specifically, a data engineer named Joe Moore, who is also a real good follow on Twitter.
00:49:12.600 And he often comes into my Twitter feed to teach people how to think.
00:49:20.780 And it's wonderful to watch.
00:49:23.100 He is simply somebody who has learned how to compare things.
00:49:27.620 And when people don't do it right, he goes into the comments and teaches them.
00:49:32.180 So here's what he says.
00:49:34.900 He says, Scott's position is, now remember, he's a data engineer.
00:49:39.800 So he's trained to know how to make these decisions.
00:49:43.300 Scott's position is it's impossible to know the right decision.
00:49:46.100 And the podcast dudes, that was Dave Smith and his co-host, mock him and say it's impossible to know the risk of long COVID.
00:49:55.060 So they ignored it.
00:49:56.960 Proves Scott's point.
00:49:59.120 Exactly.
00:50:00.060 By ignoring it, they proved my point.
00:50:02.000 Now, again, these are smart people who are, you know, high-functioning people in society, well-meaning.
00:50:12.140 And there's nothing wrong.
00:50:14.240 There's nothing wrong with the fact that you don't know how to do stand-up comedy.
00:50:18.980 And he's not trained to be a data engineer.
00:50:23.900 There's nothing wrong with that.
00:50:25.600 We're just figuring it out best we can, right?
00:50:29.760 And then Joe Moore went on.
00:50:31.660 Second tweet, he says, Scott isn't calling anyone wrong.
00:50:34.920 Never, in terms of whether they got vaccinated or not.
00:50:37.860 That's correct.
00:50:39.080 I've never said, I've never criticized any individual decision.
00:50:42.660 So Joe watches me so he knows that.
00:50:45.260 So he says, Scott isn't calling anyone wrong, specifically about the vaccination.
00:50:49.700 I am calling people wrong about the decision-making.
00:50:53.160 He never has.
00:50:54.380 Dave admitted he couldn't calculate long COVID, and so he ignored it.
00:50:58.420 And then Joe asked this question.
00:50:59.800 I like this context.
00:51:02.400 He says, all the bad stuff, speaking about the pandemic, has low odds.
00:51:08.320 And then he lists these odds.
00:51:10.100 He goes, 1%, 1 in 1,000, 1 in 100,000, 1 in 1 million.
00:51:15.700 Which number is the vax risk, and which is the long COVID risk?
00:51:20.640 I don't know.
00:51:22.320 Do you?
00:51:23.340 I don't know.
00:51:26.160 That's a pretty good point.
00:51:27.840 Like, the difference between 1%, or 1 in 1,000, or 1 in 1 million,
00:51:33.180 that's a really big difference.
00:51:35.360 But I don't know which one of those is which.
00:51:37.440 I can't even tell.
00:51:38.120 Would comic Dave Smith know?
00:51:41.660 If he saw three statistics, would he know which one's the long COVID,
00:51:45.540 which one's the dying from the vaccination?
00:51:48.880 I don't know.
00:51:51.900 All right.
00:51:56.020 I'm going to do something that I thought long and hard about,
00:51:59.680 and I think it might be a huge mistake.
00:52:02.080 But I'm going to do it here in public, because there's a benefit to it.
00:52:08.800 I'm going to teach our politicians and our corporate leaders how to identify a fang fang,
00:52:18.720 in other words, a Chinese spy, or any spy,
00:52:21.400 who is trying to befriend you the way fang fang befriended Swalwell.
00:52:31.800 Now, there is a pattern that you can identify them.
00:52:37.640 Now, I don't know anything about how fang fang infiltrated the Swalwell organization,
00:52:43.540 but he was a councilman in a smallish town, mid-sized town.
00:52:51.380 So he was a council person.
00:52:53.920 Now, the first thing you need to know is that they try to influence people early on in their career.
00:53:01.580 So I got a phone call.
00:53:05.300 I'm sorry, not a phone call.
00:53:06.300 I got a text the other day that is either a scam.
00:53:13.840 This is definitely not a real person.
00:53:15.700 It's either a scam, or it's a Fang Fang-style Chinese spy
00:53:21.140 who is trying to get to me while I'm still in my, let's say, not too influential stage.
00:53:27.220 Now, I'm not going to claim this as an actual spy contact, right?
00:53:33.420 If the FBI wants the phone number that it came from, I'd be happy to give it to them.
00:53:37.540 But here's the problem.
00:53:39.940 What if they already have it?
00:53:42.860 What if the number that texted me is already known as a Chinese spy?
00:53:50.100 Because I feel like we probably know some of them, don't we?
00:53:54.320 You know, don't we have ways to identify who the spies are?
00:53:58.920 If, in fact, this is a known spy,
00:54:02.960 then there's a good chance that the FBI is already looking at all of my private conversations.
00:54:08.700 Because I believe that's legal.
00:54:12.180 Give me a fact check on that.
00:54:15.780 If they knew for sure that a spy was contacting me,
00:54:19.120 and I'm telling you that maybe that happened,
00:54:21.180 wouldn't they have access to all of my private communications?
00:54:27.600 I think so.
00:54:29.320 And if they don't already, they're going to have it now.
00:54:32.540 Because I'm telling you that I had a contact
00:54:35.260 that, in my opinion, looks suspicious.
00:54:39.700 Now, they don't even need to know for sure that it's a spy, right?
00:54:43.400 They would only have to suspect it's a spy,
00:54:45.640 and then they could look at all of the spies' communications,
00:54:49.200 but also the communications of anybody they contacted.
00:54:53.120 All of it.
00:54:54.320 I believe.
00:54:56.040 I believe they'd have all of it.
00:54:57.980 Now, here's a little rule that I would like to teach you.
00:55:01.880 You never had privacy.
00:55:04.920 You never had privacy.
00:55:07.560 It's not attainable.
00:55:09.380 We don't live in a system where anybody has privacy.
00:55:13.900 The only way that you can stay safe is to be uninteresting,
00:55:17.860 so that nobody cares to look.
00:55:20.180 But the government always had the ability to, you know,
00:55:23.540 do a court order and look at all your stuff.
00:55:26.980 They just needed a reason.
00:55:28.840 And they didn't need a good reason.
00:55:30.840 That's one thing we learned from the FISA stuff.
00:55:33.140 They don't need a good reason.
00:55:34.360 They need anything that a judge would look at on a piece of paper
00:55:38.300 and go, all right, well, whatever.
00:55:40.800 I don't have time to look into it, but if you say.
00:55:45.080 So I'm going to read you the contact as it happened,
00:55:48.040 and then I'm going to teach you how to identify
00:55:50.000 how a spy would operate.
00:55:53.600 Now, I'm going to make the following claim
00:55:55.180 that I know how a spy would operate.
00:55:58.700 Don't ask me how.
00:55:59.780 I'm not a spy.
00:56:01.280 I'm not a spy, but don't ask me how I know that.
00:56:04.360 I just know it, all right?
00:56:06.880 So let me find the communication here,
00:56:11.920 and I'll read it to you as it happened.
00:56:16.540 All right.
00:56:17.480 The first text came in, and it was only a phone number,
00:56:22.780 and it was a 617 area code,
00:56:27.300 which I think is Massachusetts area.
00:56:31.520 All right, so the first message says,
00:56:32.780 I'll be in California next Saturday.
00:56:34.640 Would you like to eat Korean barbecue together?
00:56:38.600 Now, I often get spammy things, and I get scammers.
00:56:44.740 So I thought, oh, it's just a scammer,
00:56:47.680 but it could be somebody had a wrong number.
00:56:50.780 So what did I do?
00:56:53.560 I ignored it,
00:56:54.620 because I ignore all the things that look like they're not for me,
00:56:57.900 or it's a scammer.
00:56:58.880 So I just ignore it.
00:57:00.900 A few days go by.
00:57:02.160 I get a follow-up message.
00:57:04.000 Are you really busy that you don't have time to reply to my message?
00:57:09.660 Now, that's interesting.
00:57:11.620 That's interesting.
00:57:12.740 Now, the first thing you need to know is that a spy
00:57:15.020 would try to meet you accidentally.
00:57:19.380 It has to look accidental.
00:57:20.860 So here's somebody meeting me accidentally.
00:57:25.320 And the first thing they asked about was Korean barbecue.
00:57:28.840 Now, nobody who knows me well would invite me to a Korean barbecue,
00:57:32.920 because I don't eat red meat.
00:57:35.680 So there's nobody who would casually ask me to lunch for a Korean barbecue.
00:57:42.440 So I know they don't know me, right?
00:57:44.060 So I replied, thinking that there was some possibility
00:57:49.020 it's somebody who just had a wrong number.
00:57:51.220 And I didn't want to ruin her day.
00:57:53.280 I didn't want to ruin her day in case it was just someone trying to catch up with a friend.
00:57:58.320 So I said, I don't know who this is from.
00:58:01.140 Obviously, no one who knows me, based on the message.
00:58:05.280 And then it comes back, and the message says, it's me.
00:58:07.800 And I won't say the name, just on the off chance that it's a real person.
00:58:12.640 But it's an Asian first name.
00:58:15.140 Asian as in somebody not born in this country, maybe,
00:58:18.780 or maybe a second-generation person with an Asian-sounding name.
00:58:25.380 And then I replied, I said, I don't know you.
00:58:28.580 You must have a wrong number.
00:58:30.600 So my alert is already up
00:58:34.900 that this is a scam or something.
00:58:37.100 But there's some possibility, some small possibility,
00:58:41.040 it's like a real person who has a wrong phone number.
00:58:44.280 So I want to help him out.
00:58:45.760 I said, I don't know you.
00:58:46.680 You must have a wrong number.
00:58:47.840 And then I said, who do you think I am?
00:58:50.960 And then the answer came back, are you Jesse?
00:58:54.480 J-E-S-S-E.
00:58:59.020 Now, I said, no, not Jesse.
00:59:01.820 That's the problem.
00:59:03.160 Have a good trip.
00:59:04.120 I hope you two connect.
00:59:05.080 So now I've closed the circuit.
00:59:10.260 Everything this person needs to know is now known.
00:59:13.400 I'm not the person they're looking for.
00:59:15.720 And I said, have a nice day.
00:59:17.700 Now, if it's a real woman, what would that real woman do next?
00:59:24.700 What would a real woman do?
00:59:28.980 Sorry I bothered you?
00:59:30.920 Boop.
00:59:31.760 Or nothing.
00:59:33.620 But nobody keeps talking after that.
00:59:36.820 Am I right?
00:59:38.640 Nobody keeps talking after that.
00:59:44.740 Next message was, oh, I input the wrong number.
00:59:50.280 Sorry.
00:59:51.100 Oh, okay.
00:59:51.940 So now we're done.
00:59:53.520 That's the end of it, right?
00:59:55.540 So I think, well, that's the end of it.
00:59:58.660 Then another message comes in.
01:00:00.480 Thanks for being nice.
01:00:02.860 Oh, okay.
01:00:04.040 It still could be nice.
01:00:05.140 This still could be real.
01:00:07.080 Then the third one comes in.
01:00:08.500 I haven't answered the first two.
01:00:09.680 Are you in California?
01:00:11.820 There it is.
01:00:16.780 There it is.
01:00:18.800 There's the tell.
01:00:20.460 All right.
01:00:21.040 As soon as this question was asked, we can now eliminate real person.
01:00:26.320 Correct?
01:00:27.880 We've eliminated the possibility it's a real person.
01:00:30.500 Because no real woman follows up with a man whose name she doesn't even know.
01:00:37.660 Keep in mind, she doesn't even know my name.
01:00:41.500 Allegedly.
01:00:42.300 Like, allegedly she doesn't know my name.
01:00:43.980 But I think she does.
01:00:47.200 All right.
01:00:47.780 So I wanted to tease this out a little bit.
01:00:51.280 So I said, I am in California.
01:00:54.660 So probably same area code as your friend.
01:00:57.580 So I'm playing along at this point.
01:01:00.060 So I'm giving her a reason why she would have asked the question.
01:01:03.480 Oh, because you're both the same area code.
01:01:05.240 Maybe that's how you got the wrong phone number.
01:01:08.040 And then she answers, yeah, he is my childhood friend.
01:01:13.000 But we used to study together when we were in Hong Kong.
01:01:17.780 Used to study together when they were in Hong Kong.
01:01:21.300 Now, what information do I have about this person so far?
01:01:27.800 Here's what I know.
01:01:29.860 I know that they have some kind of connection to Asia.
01:01:34.120 I know that they're young enough that they're talking about studying together in Hong Kong.
01:01:41.460 So the age of the person is someone not too long out of college.
01:01:47.780 So she's identifying herself as probably in her twenties.
01:01:52.960 And she's saying that she lived in Hong Kong.
01:01:56.160 Now, why would you say you studied in Hong Kong?
01:02:00.660 Do a lot of people travel internationally to Hong Kong just to study, or did they?
01:02:05.200 Or is it the safest way to say you're a Chinese resident?
01:02:11.880 Because think about what she said: yeah, he was my childhood friend, but we used to study together when we were in Hong Kong.
01:02:24.080 Now, nobody needs to tell that to a stranger.
01:02:26.620 I'm a stranger who did not need to know anything about Jesse or anything about their history.
01:02:33.200 This is clearly a tell that something scammy is going on, right?
01:02:38.280 So I try to close it down, but really I'm testing to see if she stops.
01:02:43.620 It looks like I'm trying to close the conversation, but I wasn't.
01:02:46.740 I was closing it in a way that any normal person would take as final, but I wanted to see if there was more.
01:02:55.060 So I said, good luck finding him, exclamation mark.
01:02:58.780 All right, that's just done, right?
01:03:00.180 Good luck finding him.
01:03:02.400 Next message.
01:03:03.540 I live in Florida.
01:03:04.560 I didn't ask.
01:03:08.100 I never asked.
01:03:09.820 I'd already checked the area code, and it was Massachusetts, but, you know, that doesn't mean anything.
01:03:17.240 So when the text says, I live in Florida,
01:03:20.280 I don't even answer.
01:03:21.580 I ignore it.
01:03:22.640 Next text is, have you ever been here?
01:03:28.260 Have I ever been to Florida?
01:03:29.660 Like, why would she ask a stranger, whose name she doesn't even know, whether he's been to Florida?
01:03:38.200 All right.
01:03:38.940 So now I'm going to draw this out a little bit.
01:03:41.120 So I go, many times.
01:03:42.800 And then I text, are you Chinese by citizenship?
01:03:47.820 So I said, are you Chinese by citizenship?
01:03:52.060 And she answered, yeah, I am.
01:03:55.320 Do you have Asian friend?
01:03:58.160 Do you have Asian friend?
01:04:00.180 So the Asian friend thing suggests it could be just a scam, you know, not necessarily a spy.
01:04:06.720 So my next message was, is your name Fang Fang, LOL?
01:04:12.560 And she responds, nope, haha, I'm, and then she tells me her name, a first and last name that are Chinese-sounding.
01:04:20.780 And then you can call me, blah, blah, blah, my first name.
01:04:24.960 Now, I didn't answer after that.
01:04:26.520 And that was the end of the exchange.
01:04:28.820 So here's what you look for.
01:04:31.800 You look for a meeting that looks accidental but might not be.
01:04:37.300 So it happens in the normal course of life.
01:04:40.240 In Swalwell's case, it was probably somebody who joined his campaign, and it looked like somebody who just liked his politics.
01:04:48.320 You also want to look out for the flattery and the flirting that's out of context.
01:04:55.940 Nobody flirts with a stranger.
01:04:58.500 No woman flirts with a man she knows nothing about that she called by accident.
01:05:03.660 That doesn't happen.
01:05:05.280 So the reason that that would work with anyone is that men will believe anything if it's a compliment to them.
01:05:15.640 If you tell a man that the most beautiful woman in the world is maybe flirting with him, the first thing he's going to think is maybe she is.
01:05:24.440 So we're very easy to fool with flattery.
01:05:28.200 So flattery, accidental meeting, and then following up too much, way too much interest, giving too much information.
01:05:37.840 In this case, she started early by mentioning Hong Kong and studying.
01:05:42.200 The Hong Kong thing, I believe, was to prime me so that once I found out, if I did, that she was a Chinese citizen, she would have a reason to explain that she's one of the good ones.
01:05:56.640 Because if you were a Hong Kong resident before China took over, well, yeah, maybe you're a Chinese resident now, but you were one of the freedom ones.
01:06:05.960 You weren't necessarily one of the communist ones.
01:06:08.420 I don't think it's a coincidence that she mentioned Hong Kong to, you know, right off the bat, to be able to frame herself as one of the good Chinese citizens.
01:06:19.800 By the way, all Chinese citizens are good.
01:06:21.760 I'm not saying anything except the government of China has issues.
01:06:28.800 You don't like my absolute?
01:06:30.240 So, look for the person who's willing to admit they're a Chinese citizen. I imagine that Fang Fang was very open about her connections to China, just guessing,
01:06:44.860 and that she probably had a whole backstory that would make her look safe.
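(To make that checklist concrete, here is a toy sketch that scores an exchange against the red flags above; the flag names and weights are invented for illustration, not any real screening tool.)

    # Toy scorer for the red-flag checklist above. Flags and weights are
    # invented for illustration; this is not a real counterintelligence tool.
    RED_FLAGS = {
        "accidental_contact": 1,           # "wrong number" or chance meeting
        "out_of_context_flirting": 2,      # flattery from a total stranger
        "keeps_talking_after_closure": 3,  # the big tell in this exchange
        "volunteers_backstory": 2,         # unprompted "studied in Hong Kong"
        "asks_your_location": 2,           # "are you in California?"
    }

    def suspicion_score(observed_flags):
        """Sum the weights of the red flags observed in an exchange."""
        return sum(RED_FLAGS[flag] for flag in observed_flags)

    # The exchange described above trips most of the flags:
    observed = ["accidental_contact", "keeps_talking_after_closure",
                "volunteers_backstory", "asks_your_location"]
    print(suspicion_score(observed))  # 8 -- high enough to stop replying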
01:06:52.560 All right.
01:06:53.200 Now, how many of you would have recognized this as a potential Chinese spy?
01:07:00.940 How many of you would have thought potential spy?
01:07:07.180 You would?
01:07:08.140 Good.
01:07:10.040 Good, good.
01:07:11.980 Now, how many of you think that somebody at my level of, I don't know, public exposure would be approached by a Chinese spy?
01:07:23.200 I think so.
01:07:25.620 Yeah, I would imagine that a lot of people who talk about politics on both sides have had maybe a casual contact with somebody who acted a little friendly.
01:07:35.880 So, anyway.
01:07:41.620 That's your lesson on how to spot a spy.
01:07:44.780 Now, I don't know if it was a spy.
01:07:46.200 You know, you can't be 100% sure.
01:07:47.560 It could have been just a romance scam kind of thing.
01:07:55.080 But everything about it suggested Chinese spy.
01:08:00.360 Am I flattered?
01:08:01.940 No.
01:08:02.740 It's just creepy.
01:08:04.520 Like, all I thought about it was, oh, this is creepy.
01:08:07.380 But it also tells you to be careful.
01:08:11.120 You think it was a bot scam?
01:08:15.280 You think I was talking to a bot?
01:08:18.280 There were typos.
01:08:19.380 I don't think so.
01:08:23.660 All right.
01:08:25.020 What top secret stuff do I know?
01:08:28.420 I don't know.
01:08:29.880 Do I know any?
01:08:31.320 It depends what top secret means, I guess.
01:08:33.300 I don't know anything that nobody knows.
01:08:36.140 I know things that not many people know.
01:08:39.160 But I don't know what a top secret would be in this context.
01:08:44.120 Yeah.
01:08:44.700 So, if I had taken it further, I could have found out for sure.
01:08:50.040 Here's how I would find out for sure.
01:08:52.340 If it was a money scam, the way it would work out is, oh, you know, I can't get a hold of Jesse, but I'd love to visit California.
01:09:01.500 And then see if she could get me to say, you know, I'm single.
01:09:06.640 I could show you around.
01:09:08.600 Something like that.
01:09:10.140 And then, and if it's a scam, then the next thing would happen is she'd say, okay, I'm booking my flight, but my credit card got stolen.
01:09:20.660 So, I can't book the flight, and there's only an hour left to get this price.
01:09:24.220 Oh, my goodness,
01:09:24.900 I don't know what to do, I guess I can't come.
01:09:26.940 And that's when I'm supposed to offer, oh, but you could temporarily use my credit card and then pay me back.
01:09:34.500 So, it'd be something like that.
01:09:36.140 So, if I had made plans to see her because I believed she was, against all odds, just interested in me,
01:09:44.300 at some point, there would be an emergency on her end that required immediate funding to make the trip.
01:09:52.380 If that didn't happen, and she actually flew all the way out here, funded it herself, and said, oh, I have a job.
01:10:00.600 I don't need any money.
01:10:01.960 That would be a Chinese spy.
01:10:04.840 For sure.
01:10:06.420 Yeah.
01:10:06.680 Because it's not a real person.
01:10:07.840 No real person does that.
01:10:13.600 And if she paid for everything, yeah.
01:10:15.280 Yeah.
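(Boiled down, that test is a tiny decision rule; here is a minimal sketch, with categories and signals invented purely to illustrate the reasoning above, not any real screening method.)

    # Minimal sketch of the scam-versus-spy test described above.
    # The labels and signals are invented for illustration only.
    def classify_contact(asks_for_emergency_money, funds_own_trip):
        if asks_for_emergency_money:
            # The "stolen credit card, one hour left at this price" move.
            return "romance/money scam"
        if funds_own_trip:
            # A stranger who pays her own way out to meet you is not
            # behaving like a real stranger.
            return "likely spy"
        return "possibly a real person"

    # The two hypothetical endings described above:
    print(classify_contact(asks_for_emergency_money=True, funds_own_trip=False))  # romance/money scam
    print(classify_contact(asks_for_emergency_money=False, funds_own_trip=True))  # likely spy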
01:10:22.380 Oh, wow.
01:10:23.300 You had a fake person calling saying they knew you.
01:10:25.700 Yeah, that's.
01:10:29.480 I don't know why I'm thinking about this.
01:10:31.220 This is the most random thing.
01:10:33.040 I saw a pickup line that somebody on Instagram was sharing.
01:10:40.400 Tell me if I've told you this one before.
01:10:42.740 As soon as I heard it, I thought, that is the most persuasive pickup line I have ever heard.
01:10:47.080 All right.
01:10:48.480 You walk up to somebody at, let's say, a social event,
01:10:55.220 but you're definitely in flirting mode.
01:10:58.560 Actually, it doesn't even have to be a social event.
01:11:06.120 And a man says to the woman, there's no reason for you to look this good today.
01:11:11.940 That is the best pickup line.
01:11:19.640 Because what is a woman going to say to that?
01:11:23.480 No woman ignores that.
01:11:25.940 Every woman will engage on that question.
01:11:30.360 There's no reason for you to look this good today.
01:11:33.180 And it's completely different than just giving a compliment.
01:11:36.000 If you walked up and said, you look beautiful, I just had to tell you.
01:11:39.180 Well, that's kind of creepy.
01:11:40.300 But if you walk up and you criticize her for looking better than she needs to, that's totally different.
01:11:47.280 That's like negging; the so-called insult compliment is built into the first sentence.
01:11:53.680 You don't have a reason to look that good, meaning your brain isn't working right.
01:11:59.500 Yeah, you look good.
01:12:00.340 Big deal.
01:12:01.120 You didn't have a reason to do it.
01:12:02.440 If you do it with a smile, I think the person would get that you're just being clever.
01:12:11.540 But they would also engage.
01:12:13.140 It's like, what do you mean?
01:12:15.600 What do you mean?
01:12:16.860 And then get another compliment, blah, blah, blah, blah.
01:12:21.480 Goodbye, Sharon.
01:12:22.180 Sharon?
01:12:26.960 Sharon, you don't seem like a good person, so we'll get rid of you.
01:12:30.100 What about Ukraine?
01:12:38.900 You're sending prayers my way.
01:12:40.400 Good.
01:12:41.380 Thank you.
01:12:43.400 Clipped?
01:12:44.000 Oh, you clipped that?
01:12:46.420 All right.
01:12:47.200 There's not much happening in Ukraine, is there?
01:12:49.200 Is there anything happening in Ukraine today?
01:12:51.020 Except people lying about what's happening in Ukraine.
01:12:53.380 I think that's the only lesson about Ukraine that I'll give you.
01:13:03.820 The only lesson is that I don't believe anything that comes out of Ukraine.
01:13:08.380 If you do, that's on you.
01:13:11.520 All right.
01:13:12.020 That's all for now.
01:13:12.680 YouTube, I'll say bye.
01:13:14.220 I'm going to talk to the Locals people a little bit more.
01:13:16.780 Bye for now.
01:13:17.320 Bye.