Real Coffee with Scott Adams - November 24, 2022


Episode 1937 Scott Adams: Today I Will Bend Your Brains Like Never Before. Happy Thanksgiving!


Episode Stats

Length

1 hour and 7 minutes

Words per Minute

142.9

Word Count

9,701

Sentence Count

776

Misogynist Sentences

8

Hate Speech Sentences

6


Summary

Happy Thanksgiving to all Americans and Canadians, and non-Americans, too! Today's episode is all about being thankful for all of you, the people who helped keep us alive this year, and all of the things we're thankful for.


Transcript

00:00:00.000 Good morning, everybody, and a special Happy Thanksgiving to all Americans and Canadians and non-Americans, too.
00:00:14.480 We'll be thankful for all of you.
00:00:17.320 And today, you have managed to blunder upon, or possibly cleverly planned to be here,
00:00:25.440 the highlight of civilization itself, and I'm going to bend your brain so hard today.
00:00:31.640 Oh, yeah, you better put a seatbelt on your brain, because I'm going to be tweaking it a little bit today.
00:00:37.900 And if you'd like to get ready for this brain-bending experience,
00:00:42.480 the most awesome thing, commercial-free, that you've ever seen in your entire life,
00:00:47.400 oh, yeah, it includes a whiteboard.
00:00:49.500 I wasn't even going to tell you, because I didn't want you to get too excited too fast.
00:00:53.800 We've got to build to it, build up to it.
00:00:56.980 And the first thing you need is a cup or a mug or a glass, a tank or a chalice or a stein,
00:01:01.520 a canteen, jug, or flask, a vessel of any kind.
00:01:04.880 Fill it with your favorite liquid.
00:01:06.840 I like coffee.
00:01:08.880 And join me now for the unparalleled pleasure, the dopamine of the day,
00:01:12.640 the special Thanksgiving sip.
00:01:15.400 It's called the Simultaneous Thanksgiving Sip today, and it happens now.
00:01:19.800 Go.
00:01:23.800 Ah, yeah, holiday good.
00:01:30.820 Well, I almost don't know where to start, because today was filled with so much wonder
00:01:37.360 and goodness in the news that I can barely contain myself.
00:01:42.540 Yeah, good times.
00:01:44.060 Well, let's just start with a big thank you to Locals for saving my life this year.
00:01:55.160 A lot closer to a factual statement than you think.
00:01:58.460 You think there's a little hyperbole in there, but literally you kept me alive.
00:02:02.720 I don't think I could have gotten through the pandemic without this experience.
00:02:08.760 I tried as hard as I could to give back as much value as I could so that you would come out ahead.
00:02:15.880 That's one of my rules, by the way.
00:02:18.100 You want a good rule for life?
00:02:20.680 Here's probably the best.
00:02:23.400 Yeah, maybe.
00:02:23.880 Maybe the single best advice anybody would ever give you.
00:02:29.220 Give people more than they expect.
00:02:32.400 That's it.
00:02:33.280 You've heard that advice before, right?
00:02:35.700 The only people who ever give you that advice?
00:02:38.620 Successful people.
00:02:41.320 How much do you want to work with somebody who gives you more than you expected?
00:02:45.340 It just changes your whole day.
00:02:47.000 It's like, really?
00:02:48.220 That's more than I expected.
00:02:50.320 Yeah.
00:02:52.000 Plan to over-deliver.
00:02:53.100 Plan to over-deliver.
00:02:55.900 All right.
00:02:56.480 So, that's a sincere thank you, and let's have another good year.
00:03:03.480 Joy Reid apparently said some things about Thanksgiving and called it out for being fake news and propaganda.
00:03:13.000 Thank you, MSNBC, for taking our wonderful holiday and turning it into garbage.
00:03:18.700 But here's the funny thing.
00:03:19.880 Today's Thanksgiving, and today we will not be saying only negative things about people.
00:03:27.840 I'll save that for the rest of the year.
00:03:30.160 Today we're going to be good to people.
00:03:31.540 And I would like to call out that Joy Reid, no matter what you think of her political opinions, she's very smart.
00:03:40.320 She's very smart.
00:03:42.040 And she's also totally right.
00:03:44.300 She's totally right.
00:03:45.380 But Thanksgiving was designed as a propaganda brainwashing operation to, in part, sanitize our ugly past of how we treated the Native Americans.
00:03:57.760 And she told her viewers on MSNBC, quote, Thanksgiving is built on this myth that the indigenous welcomed their colonizers with open arms and ears of corn.
00:04:11.180 And that's pretty funny.
00:04:12.160 Because that's almost exactly what I learned, that the Native Americans are like, hey, hey, we got company.
00:04:20.500 Somebody get the corn.
00:04:22.460 They're going to need corn.
00:04:24.780 That's sort of how I learned it.
00:04:26.820 Not exactly historically accurate.
00:04:30.420 So I have two feelings.
00:04:33.360 Number one, Joy Reid, you're totally correct.
00:04:36.800 Thanksgiving is the ultimate fake news.
00:04:39.400 It's built entirely upon this myth of awesome people getting together.
00:04:44.220 And maybe there was a little bit of it.
00:04:46.120 But the intention of it is to bring the country together and create a narrative that we can rally around and make us better people.
00:04:54.240 So it is fake news.
00:04:56.560 It is propaganda.
00:04:57.520 But it's the good kind.
00:05:00.120 It's the good kind.
00:05:01.380 I'll put it in this category, the Pledge of Allegiance.
00:05:05.720 The Pledge of Allegiance is just brainwashing.
00:05:09.360 It's literally just propaganda brainwashing of American citizens, for which I am totally in favor.
00:05:16.900 That's the good kind of brainwashing, right?
00:05:19.140 You send your kids to school.
00:05:20.840 If the school does a good job, we won't talk about the bad public school system.
00:05:25.200 But if they do a good job, they're brainwashing your kids, but in a good way.
00:05:31.040 Hey, you should work hard.
00:05:33.040 You know, they don't learn that on their own. It's like I just tried to brainwash you by saying you should deliver more than people expect.
00:05:40.040 I'm definitely trying to influence you.
00:05:43.320 So, you know, influence and brainwashing is everywhere all the time.
00:05:46.640 But some of it's good.
00:05:48.360 And I think Thanksgiving is a great example of turning some horrible history into maybe some positive.
00:05:58.120 Give people an excuse to just feel thankful for a while.
00:06:03.020 It's good for us.
00:06:03.680 All right, here's a robot logic test.
00:06:08.520 Are you ready?
00:06:09.880 Here's your test to see if you can objectively look at this story without your robot bigotry that you will bring to it.
00:06:20.740 Okay?
00:06:21.920 The story is the San Francisco Police Department has proposed a new policy that would give robots the license to kill humans.
00:06:29.940 Now, you might say to yourself, I would need more information about that story.
00:06:36.360 It feels like there might be some important details that are left out of the initial reporting.
00:06:41.760 The important detail being it's not an autonomous robot.
00:06:47.580 It's a robot that somebody's like steering and controlling.
00:06:51.720 The robot's not deciding to kill people.
00:06:54.960 Come on.
00:06:56.440 We don't have those robots.
00:06:57.820 And if a person is controlling the robot and telling it to shoot and where to shoot, isn't that just a gun?
00:07:09.640 Basically, we just defined a fancy gun as a robot.
00:07:14.860 It's just, if a person is telling it to shoot and then it shoots, it's just a gun.
00:07:20.880 That's all it is.
00:07:23.140 It's just a really fancy one that has, you know, a remote sight and it can move around without you touching it.
00:07:29.760 But it's just a gun.
00:07:32.760 Now, how many of you immediately realized that it wasn't really a story about robots?
00:07:38.900 Not really.
00:07:39.820 It's just a better form of a gun or a way to deliver a gun, I guess.
00:07:44.180 All right.
00:07:47.420 Here's a challenge I did on Twitter today.
00:07:50.140 I want you to see if you could summarize the year 2022 in the snarkiest, most cynical way.
00:07:58.660 What's the snarkiest one sentence you could say about the whole year?
00:08:02.840 Here was my initial primer for this.
00:08:05.940 Mine is, everything you suspected about everything turned out to be true.
00:08:09.460 Doesn't it feel like that?
00:08:15.140 Doesn't it feel as if everything you sort of suspected was exactly what was going on?
00:08:23.640 Now, to be clear, many of you are in a different reality than I am.
00:08:29.840 But in both cases, we followed a reality in which the thing we expected or suspected turned out to be true.
00:08:37.680 Your reality and mine might be different, but we both found ourselves to be right, but in different realities.
00:08:44.320 They don't match, like your reality and mine might not match.
00:08:47.540 But we all got to feel like we were right in the end, which is weird.
00:08:52.680 All right.
00:08:53.600 Here's some fake news coming out, I think, about Foxconn and China.
00:08:58.340 Have you seen all the video of what appeared to be the Chinese government cracking down on the workers who seem to be trying to escape from their COVID quarantine?
00:09:10.780 Have you all seen those videos or that news story?
00:09:13.860 It looks terrible, right?
00:09:15.540 Looks like the government is cracking down on them.
00:09:18.760 They're trying to keep them in like a little jail to keep working and stuff.
00:09:22.080 Well, that might not be the story.
00:09:25.840 Turns out that might be an entirely fake story.
00:09:28.880 Don't know.
00:09:30.040 So I'm not going to say that the story that calls BS on the story is true, but I'll let you know what it is.
00:09:39.080 Apparently, the real problem is they were promised bonuses and then the company reneged.
00:09:45.160 On top of that, they're also mad about the lockdowns, but it was about the bonuses.
00:09:52.780 And apparently, they solved it by working out some accommodation with the bonuses.
00:09:59.360 So it was also about the lockdowns.
00:10:02.500 That was part of the story.
00:10:03.420 But it wasn't really a lockdown story.
00:10:05.360 It was pretty much an ordinary story of a company having an employee problem.
00:10:12.460 Now, I saw also a theory that Foxconn always has to relocate and has over time because they can only operate where you can abuse your employees the most.
00:10:27.240 So allegedly, they were in a place where as soon as it got more industrialized, you couldn't have this horrible company in the middle of your nice place.
00:10:37.200 So it had to move to where there are even poorer people who have fewer health protections.
00:10:44.960 So the thinking is that China eventually will be too modernized to actually accommodate Foxconn.
00:10:52.220 And that Foxconn will actually have to leave China to find an even worse country to operate in because their business model requires that you not have essentially safety controls.
00:11:05.200 So they can't have that, I guess.
00:11:08.000 Now, that's somebody's version.
00:11:09.640 I'm not saying that's true.
00:11:11.180 I'm saying that, you know, that's being promoted as one interpretation of what's happening there.
00:11:17.420 I have no idea what's happening in China.
00:11:19.520 It's just as a general statement.
00:11:22.620 All right.
00:11:23.080 I did a poll on Twitter, which is, of course, highly unscientific.
00:11:30.140 And I asked how many people have rejected advice from their own doctors since the pandemic?
00:11:40.580 How many people have rejected advice from their own doctors since the pandemic?
00:11:46.120 Now, famously, you know I have.
00:11:50.280 And each time I rejected my doctor's advice, I did well.
00:11:55.800 I am not recommending you reject your doctor's advice.
00:11:59.940 Can I be really clear about that?
00:12:02.040 I am not recommending that anybody ignore their doctor's advice.
00:12:06.980 But we can talk about it, right?
00:12:09.600 Is that fair?
00:12:10.620 If I talk about it, it doesn't mean I'm recommending it.
00:12:13.460 Just be clear about that.
00:12:16.820 Two-thirds of the people who answered, of course, it's a biased, unscientific poll.
00:12:22.360 But two-thirds of the people said yes, that they had ignored their doctor.
00:12:27.500 I even ignored my veterinarian.
00:12:31.160 I ignored my veterinarian and my doctor on several very important things.
00:12:36.800 I am currently ignoring my doctor who ordered a full blood panel for me.
00:12:42.020 Do you know why I am ignoring my doctor's full blood panel?
00:12:48.700 Because I don't know why I need it.
00:12:51.320 I don't have a specific complaint.
00:12:54.880 If I do the blood panel, do you think it will find something that looks suspicious?
00:13:00.700 Probably.
00:13:01.660 You know, I'm at that age.
00:13:03.180 Do you think I should treat it?
00:13:05.860 That's where it gets dicey.
00:13:08.000 What if it finds something?
00:13:09.240 And it probably will, right?
00:13:11.980 Oh, your kidney function is a little this or that, so you better take this pill.
00:13:17.400 Do you think I'm going to take that pill?
00:13:19.960 Probably not.
00:13:21.360 Probably not.
00:13:23.040 Right?
00:13:23.980 Yeah.
00:13:24.520 Maybe, you know, two years ago, probably yes.
00:13:27.620 Probably yes.
00:13:28.360 But today, I mean, I'm really, really going to have to have a strong, strong argument to take another pharmaceutical product.
00:13:38.380 It's going to have to be a strong argument.
00:13:41.300 Here's the other one.
00:13:42.220 I've been on, I had been on asthma meds for 20 years, 25, something like that.
00:13:47.640 And I have two kinds.
00:13:50.480 I don't use the emergency inhaler, because I just don't need it.
00:13:54.480 But there's a kind that you use, you know, sort of a baseline one to keep you healthy.
00:13:59.080 My doctor looked at that and said, you know, do you have any symptoms of asthma?
00:14:04.120 I said, no.
00:14:05.420 Because, probably because I do this baseline thing every day.
00:14:09.060 And she said, you don't need that, unless you have symptoms.
00:14:12.440 And I said, what?
00:14:15.180 I've been taking this for 25 years.
00:14:17.860 Twice a day for 25 years.
00:14:19.800 She goes, no, that's not really even indicated for what you have, if you don't have any symptoms.
00:14:25.900 And I said, but if I stop taking this, won't I immediately have trouble breathing?
00:14:32.640 And she said, well, if you do, then use your emergency inhaler and get back on it.
00:14:38.260 So I thought, I'll try it.
00:14:39.860 First time, first day that I didn't do my regular asthma meds, I couldn't breathe.
00:14:49.460 I couldn't get to sleep.
00:14:51.000 I was, like, gasping for air.
00:14:52.420 And I was like, oh, shit, I really do need this stuff.
00:14:55.980 And then I thought about it a little bit more.
00:14:58.080 And I thought, I wonder if that was psychological.
00:15:02.420 And it was.
00:15:03.260 I just sucked it up and tried it again.
00:15:07.400 Put a little more effort into relaxing.
00:15:10.460 And now it's been a week without the meds, and I breathe perfectly.
00:15:14.880 I can run up and down stairs.
00:15:17.420 My exercise routine is actually the best it's ever been.
00:15:22.240 You know, I have the best muscle definition of my entire life at age 65.
00:15:25.820 I mean, I'm just not sure that, I don't know what's going on with health care, but it's pretty sketchy.
00:15:35.880 Now, here's the payoff for this.
00:15:37.680 Remember, we were all confused about why the baseline deaths are so high.
00:15:44.020 And some people say, it must be the long COVID.
00:15:47.820 And other people say, no, no, no.
00:15:49.440 It's the treatment for the long COVID.
00:15:51.300 It's the vaccination.
00:15:52.940 That's what's hurting people.
00:15:54.660 But allow me to add a hypothesis to the mix.
00:15:59.280 The hypothesis is that people stop taking medical advice.
00:16:04.360 And maybe some of it was essential for them to stay alive.
00:16:08.900 Because I don't know what I'm going to do when I reach a situation where the doctor's right.
00:16:16.620 Because it's going to happen eventually, right?
00:16:18.420 Like, I can't keep getting lucky and overruling, you know, a trained medical professional with decades of experience.
00:16:27.280 I'm not going to be right every time.
00:16:30.000 You get that, right?
00:16:31.760 Even though it appears that I have had extraordinarily good success, finally, of, you know, making decisions that are, you know, opposite of, let's say, medical science.
00:16:44.380 Because even though I believe I've done it totally successfully up to this point, I feel like the next one could be the one that kills me.
00:16:53.460 Right?
00:16:54.640 The very next thing I refuse to do might be the one that kills me.
00:16:58.440 So I can't make a recommendation that you ignore your doctor.
00:17:01.840 But I will say this.
00:17:03.440 We should at least be open to the hypothesis, subject to testing, that maybe people stop listening to good advice.
00:17:13.160 Is that a thing?
00:17:14.820 What do you think?
00:17:15.840 I think the baseline death thing is going to be a combination of several variables.
00:17:20.280 It's not going to be one thing.
00:17:21.720 But that might be one of them.
00:17:23.300 Might be.
00:17:24.200 I don't know.
00:17:25.800 All right.
00:17:28.400 Elon Musk continues to entertain.
00:17:30.860 Remember he found that big closet full of T-shirts for Twitter employees that said, stay woke?
00:17:37.780 And we all had a good laugh that, you know, their closet was full of them.
00:17:43.000 Well, apparently Musk has taken it to the next level.
00:17:45.500 And he's created shirts that he's selling, I don't know, on the Twitter store.
00:17:52.820 I don't know where he's selling it.
00:17:53.840 But instead of stay woke, it says, stay at work.
00:17:59.420 Stay at work.
00:18:00.440 Now, could that be funnier?
00:18:07.080 It would be impossible for that to be funnier.
00:18:10.040 That's just perfect.
00:18:11.040 All right.
00:18:17.000 Remind me on the Locals platform.
00:18:20.200 Then I'm going to tell you some behind-the-scenes stuff that I can't tell the YouTube people.
00:18:24.320 So, when we get to the end of the live stream, I'm going to go private on Locals.
00:18:30.280 But remind me that the topic is cybernetic intelligence.
00:18:35.660 That's the topic.
00:18:37.280 All right.
00:18:38.040 There will be a little behind-the-scenes stuff for you.
00:18:43.660 All right.
00:18:46.140 What else is going on?
00:18:49.600 So, here's the most mind-blowing.
00:18:51.820 You thought it was true, and then it turns out it is.
00:18:56.220 So, Musk asked, should Twitter offer a general amnesty to suspended accounts,
00:19:04.180 provided that they have not broken law or engaged in egregious spam?
00:19:09.580 And I answered.
00:19:11.400 A lot of people were giving him advice, which I love.
00:19:13.880 I mean, I just love watching people give Musk advice how to fix Twitter.
00:19:19.180 Because, Musk has determined that Twitter itself operates like a brain.
00:19:27.760 Do you see how he's using it?
00:19:29.100 He uses it the same way I use my live stream, especially the Locals platform.
00:19:34.020 I use the Locals platform as my auxiliary brain.
00:19:37.940 Like, any question I have, I just put it out there, and there's somebody who knows the answer.
00:19:41.820 It's way better than Google.
00:19:43.240 It's way better than the experts sometimes.
00:19:44.900 And Musk knows that, too.
00:19:48.060 He said it directly, that Twitter's operating like a big brain.
00:19:52.260 And now he asks the big brain, what should I do?
00:19:56.240 And then the big Twitter brain tells him.
00:19:58.660 Now, he still makes the final decision.
00:20:00.440 But my God, my God, watching, you know, the richest guy in the world, arguably the most successful engineer without even being a trained engineer,
00:20:13.100 watching him create or actually take advantage of Twitter as a brain, and then using that brain as part of his business process?
00:20:23.940 We've never seen anything like this.
00:20:27.660 This is all next level, kind of like management.
00:20:30.780 It's like management from the future or some damn thing.
00:20:34.020 I mean, this is just insane to watch this happening.
00:20:37.060 But my advice for him was this, that if you're going to do a general amnesty for these suspended accounts,
00:20:45.300 the single best time in the world to do it would be under new management.
00:20:51.900 There will never be a time that's more perfect than that.
00:20:54.880 Why?
00:20:56.020 Because you don't want to set a precedent.
00:20:58.260 The biggest problem with bringing somebody back is it creates this ugly precedent of,
00:21:04.360 well, I guess I can do anything I want because I'm coming back.
00:21:07.800 But if you say once only, you know, because we're redoing Twitter, it's under new management, once only,
00:21:16.760 we're going to give a general amnesty, this is never going to happen again.
00:21:22.000 People would believe that.
00:21:24.180 I think if you said it's because of new ownership and also the product is being revamped,
00:21:28.900 people would say, oh, that's a once only.
00:21:31.300 I'll take advantage of this one only time.
00:21:33.300 So whether or not he lets everybody come back, I don't know.
00:21:40.040 I don't have a prediction about that.
00:21:42.060 But I'll tell you that if he decides to do it, it's now.
00:21:46.480 Like, would you agree?
00:21:48.600 That if you're going to do it, this is the time to do it, period.
00:21:52.240 So that's all I could add to the conversation.
00:21:54.780 Yeah, this is the time.
00:21:56.480 But I don't know.
00:21:57.380 There might be some people that are just way over the line.
00:21:59.880 Now, let's talk about two movies on one screen.
00:22:09.760 Back in 2016 or so, I coined that phrase because I saw that Trump was creating a situation where we'd bifurcated reality into two completely different realities.
00:22:22.040 And I saw that coming early and I said, we're living in multiple realities at the same time.
00:22:26.900 We're looking at the same stuff, the same screen, but what I see on that screen isn't what you see.
00:22:33.640 We're just in our own realities, even though we're looking at the same stuff.
00:22:37.180 Now, the way most of you have traditionally interpreted that is that if people disagree what's true, it could be that both of them are wrong.
00:22:47.640 But usually you're trying to figure out which one is right and which one is wrong.
00:22:52.120 That's the general view of the world.
00:22:54.080 That is not, however, my view of the world.
00:22:57.460 I don't live in your world.
00:22:59.840 I'm going to see how many I can bring to my world.
00:23:02.620 And by my world, I mean my interpretation of reality itself.
00:23:09.540 All right.
00:23:10.180 Do you remember when the other day we watched reality bifurcate in real time?
00:23:15.720 It's when I did the simultaneous sip that I do before every live stream and many people said, hey, you forgot to do it.
00:23:24.480 And then other people who were here at exactly the same time watching the same thing said, yes, you did.
00:23:29.820 And then you could see people were quite sure that their reality was right and the other reality was wrong.
00:23:35.360 Now, your interpretation of the situation is one of those groups was correct and one of those groups was incorrect.
00:23:44.980 Can I confirm that?
00:23:47.560 Is it your view that maybe you don't know who is correct, but you would say somebody was correct and somebody was incorrect?
00:23:54.640 Because it's binary.
00:23:55.900 It's a yes, no.
00:23:57.440 It either happened or it didn't happen.
00:23:59.780 That's it.
00:24:00.740 Right?
00:24:01.120 And now that's your normal view of the reality, is that it either literally happened or it didn't.
00:24:06.120 That's not my reality.
00:24:08.400 I do not live in your reality where it was either true or false.
00:24:13.260 Which will take a little explaining.
00:24:16.740 All right.
00:24:17.920 Here's my reality.
00:24:19.020 In my reality, there was this event that triggered two different worlds.
00:24:23.820 So we split into people who were sure it happened and people who were sure it did not happen.
00:24:29.940 As long as there's never a requirement that we agree, we can live in those two different worlds forever.
00:24:39.700 And there's not a preferred correct one.
00:24:42.680 Here's where I'll make you crazy.
00:24:44.940 You might be aware that Einstein once said there's no preferred observer.
00:24:50.400 So if you're trying to understand reality, Einstein says, well, if you were traveling at the speed of light, reality would look different to you and would actually be different to you.
00:25:09.640 Right?
00:25:10.360 There's no preferred reality.
00:25:13.220 That's Einstein.
00:25:14.260 That's not me.
00:25:14.840 Einstein said, you can live in your own reality.
00:25:20.900 Depending on the speed, the angle, and how you're observing.
00:25:24.940 There's no preferred reality.
00:25:27.540 And I'm going to tell you the same thing.
00:25:29.920 That this thing you think is a truth, that one of these happened and one didn't, probably nothing happened.
00:25:38.180 That's the world I live in.
00:25:39.460 The world I live in is like a user interface.
00:25:43.740 All right, so here's the analogy, and then I'll take it back to the real world.
00:25:46.840 If you're using a computer and you click on an icon, you don't actually know what the icon's doing.
00:25:52.760 It's moving zeros and ones in a way that you trust will get you something you predict, but you don't know what's happening.
00:25:59.480 You only know you get a predictable result.
00:26:02.840 My view of reality is that there are two things that you can know, and that's about all you can know.
00:26:10.940 One, you can know you exist, because you're asking the question.
00:26:16.260 Two, you can know that some things appear to have consistent predictive results to you.
00:26:23.880 And I say appear, because you can't be sure.
00:26:26.360 The only thing you're sure of is that it appears that way.
00:26:29.280 But you're not sure it's true, you're just sure that that's your impression.
00:26:33.980 So the two things you know is that you exist, and that you have impressions of things.
00:26:38.680 That's it.
00:26:39.880 Everything else is mysterious.
00:26:42.600 So, here's the world I live in.
00:26:45.760 And this will sound so weird that I'm going to lose most of you.
00:26:51.000 In the world I live in, there was no simultaneous sip.
00:26:54.820 The simultaneous sip was like an icon on the screen, and you saw the icon, but you didn't know the reality of it.
00:27:04.560 The reality of it could be that I'm a three-headed lizard from the planet Flurbapond,
00:27:11.720 and I was sticking a metal spike in my ear.
00:27:16.580 And then, maybe for other people, they were watching an empty screen.
00:27:25.360 But they were imagining I was there.
00:27:27.340 Who knows?
00:27:28.280 So what I'm saying is that something happened, but the people who say it happened didn't see anything real.
00:27:36.360 That's just the icon.
00:27:37.900 And the people who say it didn't happen also did not see anything real.
00:27:42.520 They only saw the icon.
00:27:43.740 So this can all work within our reality, as long as there's never a requirement that the two realities resolve.
00:27:55.200 And there isn't.
00:27:56.740 Those two realities never need to resolve.
00:27:59.020 Now, you say to yourself, Scott, you just have to go back and look at the video, and then resolve it.
00:28:05.020 So what happens if somebody goes back and looks at the video?
00:28:07.620 Well, the most that they'll find out is that their icon-level user interface reality part, maybe they'll say, oh, I guess I just missed it.
00:28:18.820 So you would rapidly explain to yourself why you were in the wrong reality, but then you would think, oh, I went from being wrong to being right.
00:28:27.320 But probably not.
00:28:28.280 But the way I see reality is you went from one thing you didn't understand to another thing you didn't understand.
00:28:34.320 But maybe one of them was more predictive.
00:28:37.340 That's it.
00:28:38.540 Or at least it feels more predictive.
00:28:40.880 That's all you know, is that you exist, and it looks like some things are predictive.
00:28:47.000 That's it.
00:28:47.980 There's nothing else you know.
00:28:48.980 So you live in a world where it was either true or false that there was a simultaneous sip.
00:28:55.460 I permanently live in a world where the question doesn't make sense.
00:29:00.320 All I know is I was pushing icons.
00:29:02.980 That's it.
00:29:05.200 All right.
00:29:10.220 So let's talk about this Balenciaga story.
00:29:13.180 I so wanted not to talk about this story.
00:29:17.460 Like, I'm really, really trying to not talk about mass murders or anything with children.
00:29:24.640 Right?
00:29:24.920 I just try and, like, I just try to avoid it.
00:29:28.880 But the story became too fantastical.
00:29:33.760 I mean, I can't avoid it anymore.
00:29:35.760 And here's what we know.
00:29:37.880 But there's probably more we don't know than what we know.
00:29:39.880 So Balenciaga, a big fashion brand that worked with Kanye and dropped him because of his recent comments.
00:29:46.340 They did an advertisement campaign in which children were shown with teddy bears that looked like they were in, you know, bondage attire.
00:29:55.060 And then other photos, not the same ones with the bears, I guess, but other photos that Balenciaga had, had documents sort of on a desk that seemed to indicate some kind of message about, you know, illegal underage sex or something.
00:30:13.680 So there's a court case.
00:30:16.660 Now, there were some other, like, little indications that Balenciaga has got some badness going on.
00:30:22.820 Now, the question I asked is, well, why hasn't somebody just talked to the people who did the photo?
00:30:29.420 Like, why are we listening to the executives of Balenciaga, who, by the way, apologized?
00:30:35.140 And they said we never should have paired children with these products.
00:30:38.400 But there's still a lot missing, right?
00:30:42.580 Now, so I said, why can't we hear from the people who did the photo?
00:30:47.780 And then people on Twitter, because Twitter is a big cybernetic brain, immediately corrected me and said, we have, we have.
00:30:54.180 He's already issued a statement.
00:30:55.880 So here's what the photographer said.
00:30:58.420 The photographer said, hey, hey, hey, I'm just the guy who makes sure the lighting and the, you know, the photo is good.
00:31:06.200 It's the brand that tells me what to photograph.
00:31:09.060 I'm not the person who chose the teddy bears.
00:31:11.980 So he's in the clear, right?
00:31:15.540 Photographer's in the clear.
00:31:17.260 Because the photographer was not in charge of what the photo was.
00:31:21.080 He was just pushing the button to make sure the lighting was good, right?
00:31:23.520 So I had this conversation with another photographer who agreed with that photographer. This other photographer said, well, I too have done many photo shoots, including brands.
00:31:38.900 And when it's a brand, you actually don't get to choose what you shoot.
00:31:43.440 It's the brand.
00:31:44.420 So the brand hires you.
00:31:45.620 They say, here's my product.
00:31:47.640 Take a picture of this thing.
00:31:48.700 And then, you know, you can control the lighting and stuff.
00:31:51.860 But basically, you're just doing what the brand tells you to do.
00:31:57.240 And now, ladies and gentlemen, I would like to tell you about something I've never seen before.
00:32:03.640 This is the greatest thing that's ever happened in the history of social media.
00:32:07.460 You may never see it again.
00:32:09.240 I don't want you to miss it.
00:32:10.500 And maybe it's a special Thanksgiving story.
00:32:13.140 The greatest thing anybody's ever seen on social media.
00:32:16.000 I changed somebody's mind.
00:32:21.260 Have you ever seen that?
00:32:23.160 Actually changed somebody's mind.
00:32:25.720 So the photographer who said that, did I actually not get a photo of that?
00:32:35.280 And so I tweeted back and said, now, I've been in lots of photo shoots, maybe hundreds.
00:32:47.700 So being the Dilbert guy, for years and years, every two days, somebody would say, can we do a photo shoot and put you in some publication?
00:32:58.200 And I would say yes, because it was all good publicity.
00:33:00.380 So I know a lot about photo shoots, right?
00:33:04.380 And some of you have been in one or two, but I know a lot about photo shoots.
00:33:11.460 And here's what I said.
00:33:14.980 I said that the photographer is the captain of the ship.
00:33:20.320 Yes, the brand says we'd like to hire you to do this.
00:33:23.500 But once you're in the room, I said the photographer is the captain of the ship.
00:33:30.060 Now, that's been my experience.
00:33:32.420 But this photographer argued and said, no, the brand tells you what to shoot.
00:33:36.740 You just push the buttons and make sure the lighting is right.
00:33:39.640 And so I responded.
00:33:46.720 And thank you.
00:33:48.080 Somebody at Locals posted it for me so I could read it for you.
00:33:51.920 So I responded.
00:33:52.900 I said, you'd shoot child porn for a paycheck?
00:33:56.580 And then I said, let me defend you before you answer.
00:33:59.680 No, you would not.
00:34:02.480 You're a citizen before you're a photographer.
00:34:05.380 And here's the funny part.
00:34:07.640 Have you ever seen anybody just agree with you on Twitter?
00:34:10.700 So after I said that, you'd shoot child porn for a paycheck?
00:34:13.800 Let me defend you before you answer.
00:34:15.320 No, you would not.
00:34:16.000 You're a citizen before you're a photographer.
00:34:17.460 And then he responded with, his name is MJ Reincarnate, I guess.
00:34:24.600 Or Reincarnate.
00:34:25.500 No, that's not his name.
00:34:26.360 That's his Twitter name.
00:34:27.860 But he said, I would not, meaning he would not shoot child porn for a paycheck.
00:34:33.640 He said, I would not, of course.
00:34:35.540 Okay, you're right.
00:34:37.000 I think the photographer should share more of the blame here.
00:34:40.280 You're such a good persuader.
00:34:41.600 Holy fuck.
00:34:42.240 I never thought I'd change my mind on this.
00:34:46.260 And David Boxenhorn caught the exchange.
00:34:50.160 And he notes that he thinks the persuasive part of what I said was when I said,
00:34:56.060 let me defend you before you answer.
00:34:58.580 Now, do you recognize the technique?
00:35:00.260 Let me defend you before you answer.
00:35:03.900 What was the technique?
00:35:07.900 The technique was basically, I'm on your side.
00:35:12.160 Yeah.
00:35:12.700 It's basically, you and I are on the same side.
00:35:15.380 Let me defend you here.
00:35:16.240 Because you basically just said you wouldn't say no to pedos.
00:35:23.740 Yes, you would.
00:35:25.060 Yes, you would.
00:35:25.680 Let me help you out here.
00:35:27.300 So, but also the argument was pretty solid.
00:35:30.520 Right?
00:35:30.800 So he immediately caved to it.
00:35:33.020 So, here's what you need to know about the photographer.
00:35:36.740 The photographer said, I'm not the one who set up the scene.
00:35:40.040 I just took the picture.
00:35:40.920 But you're still a citizen.
00:35:45.000 You know, maybe a parent.
00:35:46.660 I don't know.
00:35:47.980 I think there's an obligation that's a little bit bigger than a photographer.
00:35:52.100 You know what I mean?
00:35:53.460 So, I will take it as truth that the photographer was not behind some pedophile ring.
00:36:00.300 Probably not.
00:36:01.520 Probably not.
00:36:02.900 But should the photographer have raised an objection?
00:36:05.920 Probably yes.
00:36:06.560 And I think the photographer, you know, maybe should say that directly.
00:36:11.180 Because the person who hired the photographer says it directly.
00:36:16.280 The person who hired the photographer says we should not have done this.
00:36:19.820 So, it's easy for the photographer to say, yeah, you know, like I got caught up in it.
00:36:25.120 Totally wrong on me.
00:36:27.340 My bad.
00:36:28.440 I would accept that.
00:36:30.100 Would you?
00:36:31.500 I would actually accept that.
00:36:32.860 Because people do kind of get rolled into doing things they don't want to do.
00:36:37.500 You know, it's just not my day to complain sort of thing.
00:36:41.560 You should have complained, but I could see a normal person making that mistake.
00:36:45.320 That's not the worst thing in the world.
00:36:47.600 Now, he also points out that he had nothing to do with what was on the set.
00:36:51.540 That's probably true.
00:36:53.180 Right?
00:36:53.440 So, the photographer doesn't necessarily bring the props.
00:36:56.800 That would be the brand people more often than not.
00:36:59.680 And so, it might not have been him, you know, if some other photographer took the picture that had those sketchy documents on it.
00:37:06.600 Now, here's the second thing.
00:37:09.400 Why would somebody leave these obvious clues in a photo shoot?
00:37:15.240 Let's say it wasn't the photographer who did it, but somebody did.
00:37:19.100 Somebody involved in the setup.
00:37:20.640 Why would somebody do that?
00:37:23.320 I can see two possibilities.
00:37:25.060 One, they're part of a large grooming ring, and they're sending out the signal, and, you know, making it safe for everybody.
00:37:33.820 Something like that.
00:37:35.080 But here's the other possibility.
00:37:37.480 It was a whistleblower.
00:37:39.500 It was a whistleblower.
00:37:41.620 It could have been a cowardly whistleblower who said, you know what?
00:37:46.760 I can't blow the whistle on this.
00:37:48.940 It's too dangerous.
00:37:49.980 But I'll make sure somebody knows.
00:37:52.720 I'm just going to put this right here.
00:37:54.040 If you don't see this, that's on you.
00:37:57.080 Now, if somebody says it could be a practical joke, maybe.
00:38:01.080 Maybe.
00:38:01.920 I mean, that would be a horrible practical joke, but it could happen.
00:38:07.840 I mean, it's not impossible.
00:38:10.460 Could be somebody who liked Kanye more than they liked this company.
00:38:15.400 Right?
00:38:16.080 It could be employee sabotage.
00:38:19.260 Employee sabotage.
00:38:20.860 Right?
00:38:21.000 Somebody who simply liked Ye and didn't like what the company did and said, all right, well, see if you like this.
00:38:29.440 Yeah.
00:38:30.400 So I wouldn't assume that Balenciaga is guilty of any, you know, major thing.
00:38:37.780 That's not in evidence.
00:38:41.880 But I wouldn't clear them either.
00:38:44.440 Because I don't think the response goes to why there were so many indications.
00:38:51.820 It's one thing to say we shouldn't put kids in an ad, but they dealt with one variable when the conversation is multiple variables.
00:38:59.500 So, yeah, there's more to know here.
00:39:04.080 Now, I have never been on the team, and I think you can confirm this, I have never been on the team that says the world is run by a cabal of elite pedos, pedophiles.
00:39:18.940 You can confirm I've never said, I've never suggested that that's the case, right?
00:39:26.700 However, in the interest of fairness, however, let me point out that, number one, yes, there are a lot of signals that are starting to look very suspicious.
00:39:42.980 A lot of signals.
00:39:45.000 Does that mean it's true?
00:39:46.820 Nope.
00:39:47.180 No matter how many pieces of confirmation you see, confirmation bias is still more likely.
00:39:56.720 Still more likely.
00:39:59.060 However, if you would like to have a little bit of comfort in your view, which might differ from mine, allow me to help you out.
00:40:07.620 Here's the best argument for the existence of a pedo elite running the world.
00:40:15.500 Follow the money.
00:40:17.500 Follow the money.
00:40:18.900 That's it.
00:40:20.940 Unless follow the money doesn't work in this and only this situation, because it works everywhere else.
00:40:27.420 Follow the money should give you the following situation, and I'll explain why.
00:40:31.160 If you assume that Epstein Island was a blackmail operation, you mostly believe that, right?
00:40:39.900 Don't you all understand that is probably a blackmail operation?
00:40:45.020 Most people believe that.
00:40:46.340 Now, I don't know that it's true, but let's assume it is.
00:40:48.960 Now, let's assume also that billionaires like to fund politicians they can control.
00:40:58.740 Would you agree with that?
00:40:59.940 Billionaires who want to fund politicians have a strong preference for funding ones they can control.
00:41:10.400 What would be the most valuable politician for a billionaire?
00:41:16.020 One that has some blackmail, you know, working against them.
00:41:19.400 So, in theory, follow the money as long as blackmail of political people is a real thing.
00:41:27.400 I think Epstein shows it is.
00:41:30.160 And as long as billionaires fund the people they want to control,
00:41:34.080 you should eventually get to the point where only pedophiles and sexual criminals are running things.
00:41:41.100 Because the billionaires would make sure they only had people they could control
00:41:45.800 in the highest rings of government, because...
00:41:52.100 Let's get rid of this cunt, Calvin Johnson.
00:41:57.460 Doesn't even sound like a real person.
00:42:00.580 Goodbye, Calvin.
00:42:02.040 You are worthless and a piece of shit.
00:42:04.520 Happy Thanksgiving.
00:42:10.280 Yeah.
00:42:11.100 So, just to be clear, if I get taken out of context,
00:42:15.700 I'm not saying that there is an elite pedophile government entity.
00:42:22.940 I'm just saying that if you believe following the money gives you good predictions,
00:42:28.360 it does predict that we'll have that eventually.
00:42:31.280 If we don't have it now, we should be heading in that direction.
00:42:36.500 And I don't know what the argument against it would be.
00:42:39.540 Can anybody think of a counter-argument?
00:42:41.100 Either follow the money works, or it doesn't.
00:42:46.560 And if it works, we're heading toward a blackmail-only government.
00:42:52.240 And maybe we're already there.
00:42:54.560 Maybe we're already there.
00:42:55.820 Their live fashion show is all the same theme.
00:43:01.880 Yeah.
00:43:03.280 Well, I don't know.
00:43:08.240 You know, there's something that happens with fashion that doesn't happen other places.
00:43:14.420 Do you remember when heroin chic was the look that they were all trying to get on the runway?
00:43:22.480 Like, women who looked like they literally were heroin addicts?
00:43:26.440 That was a thing for a while.
00:43:28.400 And I don't think that that meant that the fashion industry was promoting heroin.
00:43:33.440 Like, I think they actually treated it as just a fashion statement.
00:43:37.740 So, I would not rule out that as upsetting and, you know, let's say, as big as the signal looks,
00:43:47.560 I wouldn't rule out that the fashion industry doesn't see it.
00:43:50.220 Because the fashion industry has always been a little suspicious about how they treat underage people, right?
00:43:58.320 Wouldn't you say that's sort of a general theme that runs across the fashion industry?
00:44:03.680 The, say, the exploitation of the young.
00:44:07.760 So, I feel like it's just so much in the DNA of the fashion industry that it's not shocking that one of them would get,
00:44:16.140 would, you know, cross the line and not even know it.
00:44:19.540 Like, just be completely unaware.
00:44:21.200 Oh, to normal people this is too far?
00:44:25.220 Don't you think that the fashion people don't interact with normal people too much?
00:44:30.580 They probably interacted with people who said, oh, yeah, that's edgy and, you know, that works.
00:44:35.500 So, there are still other ways to explain what we see that do not require an elite pedophile ring of people coordinating.
00:44:47.060 You just can't rule it out.
00:44:48.780 Can't rule it out.
00:44:54.320 I live in a world where it's true and not true at the same time.
00:44:59.680 I feel like somebody could run for president under the slogan of making health care follow science.
00:45:11.280 And you wouldn't even have to explain what that meant.
00:45:13.580 But imagine a presidential candidate saying, look, this has never really been a topic of politics before, but I guess it needs to be.
00:45:24.600 Let's make health care subject to science.
00:45:28.520 Now, let me do a little test for you.
00:45:35.240 Let me promise you that I'm not going to talk about the effectiveness of masks or vaccines.
00:45:42.180 Can we agree not to do that?
00:45:44.280 So, I'm going to be in that domain, but I'm not going to say whether they work or don't work.
00:45:47.960 Because I believe that neither is true.
00:45:52.040 I don't think it's true that they work.
00:45:54.440 I don't think it's true that they don't work.
00:45:56.720 I think we're operating at a user interface level.
00:46:00.480 And that's it.
00:46:01.060 Maybe masks don't even exist.
00:46:06.020 I don't know.
00:46:06.800 So, but I'm just making the narrow claim that, let's say Trump or somebody else, DeSantis or somebody, if they ran and said, look, we're going to have to fix the public's trust of health care science or science in general, because there's climate science as well.
00:46:22.240 So, if you just said, I'm going to do what we can to make science more credible.
00:46:28.980 Now, I don't know what you would do.
00:46:31.080 Maybe it's information.
00:46:33.040 And then let me ask you this.
00:46:34.140 This would be a good test.
00:46:36.740 How many of you believe that the, and again, I'm not going to argue what's true and not true.
00:46:43.260 I'm only going to ask you what you've seen, right?
00:46:45.560 It's not an argument about the science.
00:46:47.860 How many of you have seen a collection of studies saying face masks do not work?
00:46:56.460 Go.
00:46:57.640 How many have seen at least one study or maybe a collection of studies that say face masks do not work?
00:47:05.280 Lots of yeses.
00:47:06.760 Yes, yes, yes.
00:47:07.580 Yes, yes, yes.
00:47:09.100 All right.
00:47:09.960 Tons of them.
00:47:10.800 And even the ones who have not seen them, you're aware that other people have, right?
00:47:15.320 You know that lots of people have seen those studies, okay?
00:47:18.540 All right.
00:47:18.840 Now, I'm going to give you a second and stop answering this question.
00:47:22.600 I need you to stop answering this question so that I can ask you the next one and they don't get confused, okay?
00:47:28.460 So, stop answering that question for a moment.
00:47:31.500 I'll give you a little time lag here.
00:47:33.540 Now, thank you.
00:47:35.940 Now, how many of you have seen the collection of studies that prove face masks work against COVID?
00:47:46.620 Not just one, but like a collection of them.
00:47:49.080 How many of you have seen those?
00:47:52.740 A lot of yeses.
00:47:54.320 Okay, now you're surprising me on locals.
00:47:56.720 Locals people have seen more.
00:47:58.540 There are more no's.
00:47:59.480 Now, those of you who have never seen that, you've never seen an entire collection of studies that all indicate masks work.
00:48:10.900 You've never even seen it.
00:48:12.660 Do you believe it exists?
00:48:14.860 If you've never seen it, do you think the problem is you haven't seen it or that it doesn't exist?
00:48:24.760 This is going to be hurting some of your brains.
00:48:26.840 Let me tell you something that I know that some of you have not experienced.
00:48:34.320 Since I've always been open to the question, I've had some opinions, but we won't get into them.
00:48:40.480 I've always been open to the question of, well, I could be totally right or I could be totally wrong.
00:48:45.120 So, I wasn't really too married to a view on face masks.
00:48:49.580 And so, I've seen, because I've asked for it, I've seen vast lists of proof that masks don't work, and I've seen just as persuasive, large lists of studies proving it totally does.
00:49:04.580 If you haven't seen both, then you're not informed.
00:49:12.000 Is that fair?
00:49:13.560 Now, I'm not saying which of the studies are dependable, because I don't think any of them are.
00:49:17.740 My opinion of all of the studies is that they're all bad, 100% bad.
00:49:21.800 I don't believe any of the studies, none of them, not either direction, don't believe them at all.
00:49:28.540 But if you've never seen that the other side has as much credibility in their studies as your side, whichever side you're on, the other side is just as armed.
00:49:40.140 Did you know that?
00:49:41.780 How many of you knew, and again, I'm not saying who's right or wrong, did you know that the other side from you has just as good evidence?
00:49:50.280 Now, you may say, oh, those are bad studies and mine are good.
00:49:54.800 You don't know.
00:49:56.920 We don't know what a good study is.
00:49:59.460 Sometimes you can find a bad one, but you can't tell if the good one is really good.
00:50:03.820 Yeah.
00:50:05.480 Is anybody's mind blown by learning for the first time right now, because I think you believe me, right?
00:50:11.760 I think everybody believes me.
00:50:13.080 I have seen it, and there are other people who have seen them, too.
00:50:15.480 But is anybody's mind blown that the other side from you has just as good scientific information?
00:50:26.820 I guess nobody's mind is blown by that.
00:50:28.980 You all expect it, I guess.
00:50:30.040 In a related story, the shooter who shot up the, I guess, the Club Q, which is an LGBTQ kind of a place,
00:50:48.840 and it turns out that the shooter's father was a porn actor named Dick Delaware,
00:50:57.840 Dick Delaware, which is weird, because I thought that was Hunter Biden's porn name.
00:51:03.900 Wasn't Hunter Biden Dick Delaware?
00:51:06.880 I mean, if he wasn't, he should have been.
00:51:09.140 He should have been.
00:51:09.780 Well, that's all I had to say about that.
00:51:13.660 You really need to see the video of the shooter's father talking about his anti-gay opinion
00:51:22.140 and how he raised his son to favor violence for solving problems.
00:51:28.680 I think we found the problem.
00:51:31.940 He was actually raised to be this kind of guy.
00:51:35.700 Okay.
00:51:37.580 That's pretty bad.
00:51:39.780 All right.
00:51:43.760 Oh, I forgot to mention this.
00:51:46.180 That poor photographer, who is innocent until proven guilty,
00:51:51.700 can we agree on that, that the photographer for Balenciaga
00:51:55.060 is innocent until proven guilty,
00:51:58.320 which is separate from the question of whether he should have stopped it?
00:52:03.460 Yeah, he's innocent until proven guilty.
00:52:05.920 So I'm going to treat him as innocent, because I actually think he probably is.
00:52:08.960 Made a bad call about whether to take the picture, but, you know, that's just a bad call.
00:52:13.920 But he had an earlier tweet, I guess in July of this year, that is being interpreted in two opposite ways.
00:52:23.080 So we'll see which way you interpret this.
00:52:26.340 His tweet was, it was something about gun control.
00:52:28.500 And he said, in a tweet, he said, why restrict child porn, but not guns?
00:52:37.060 That's what he said.
00:52:38.160 Why restrict child porn, but not guns?
00:52:41.720 Is he saying child porn should be restricted, or child porn should not be restricted?
00:52:46.520 It's a little unclear, isn't it?
00:52:51.780 So confirmation bias is driving people to believe that he tweeted in favor of pedos.
00:53:03.480 Nobody does that.
00:53:05.260 He did not tweet in favor of pedophilia.
00:53:08.360 That did not happen.
00:53:09.660 I'm sure somebody's done it somewhere.
00:53:11.420 But he didn't do that.
00:53:12.360 No, it was against gun ownership.
00:53:17.480 He was just saying that they should both be banned.
00:53:22.100 If you think he was pro-gun, then it looks like maybe both should be legal or something.
00:53:28.180 But if you read that tweet as being pro-pedo, I think that's on you.
00:53:36.000 All right.
00:53:39.420 There's a story I don't have a confirmation on.
00:53:43.240 Was it true that the Club Q that got shot up had advertised it was going to hold an all-ages drag show the next day?
00:53:50.500 I saw that on Twitter.
00:53:52.080 Is that confirmed?
00:53:54.160 Because that feels a little too on the nose, doesn't it?
00:53:57.520 Is that a little too convenient?
00:53:59.460 Or was it actually the reason?
00:54:04.260 Because it could have been the reason.
00:54:05.980 Because here's the thing.
00:54:06.800 Why wouldn't he shoot it up?
00:54:08.720 Oh, the kids would be there the next day.
00:54:10.400 Does that make sense?
00:54:11.600 I was going to say he should attack the event, not the place that advertises the event.
00:54:19.000 I don't know.
00:54:20.100 That doesn't sound...
00:54:21.780 I'm going to still say I don't believe it.
00:54:23.520 It could well be true, but I feel like I'd need more reporting on that to believe that one.
00:54:35.800 And then Tim Pool is getting in trouble.
00:54:40.100 He tweeted this.
00:54:41.260 We shouldn't tolerate pedophiles grooming kids.
00:54:44.600 And then he says Club Q had a grooming event, referring to what I just mentioned.
00:54:48.820 And then Tim says, how do you prevent the violence and stop the grooming?
00:54:53.520 Like, how do you talk against the grooming without causing a crazy person to do something like this?
00:55:03.000 And then people are trying to cancel Tim Pool for saying that.
00:55:06.740 To which I say, isn't that...
00:55:08.700 That's the question we're all dealing with, isn't it?
00:55:12.260 Aren't we all grappling with the exact question that Tim Pool just asked?
00:55:16.920 Why in the world is he getting canceled?
00:55:19.300 It's literally exactly what you're thinking.
00:55:21.420 You just said it.
00:55:22.440 Out loud.
00:55:23.520 How do you get in trouble for that?
00:55:26.040 Yeah, the whole point is it's difficult to separate your criticism from the fact that it might activate a crazy person.
00:55:33.160 And that's true of a lot of different topics.
00:55:35.900 If you say those darn Republicans are a bunch of fascists, does that cause some Democrat to hunt a Republican?
00:55:46.100 Yes.
00:55:47.260 Yes, it does.
00:55:48.160 Should you do a fair criticism of something that deserves to be criticized, somebody, somewhere, can see that as an excuse for danger.
00:56:09.540 You know, you know, danger.
00:56:10.900 So I think, unfortunately, it's just the cost of a free system.
00:56:15.840 You know, I don't know what you could do about that.
00:56:20.060 All right.
00:56:20.640 Ladies and gentlemen, that brings us to the conclusion of the planned part of my thing.
00:56:33.720 I'm going to tell the Locals people some extra stuff.
00:56:36.500 Is there anything I forgot that's in the news?
00:56:39.420 Anything I should mention?
00:56:44.320 It's a 10 out of 10.
00:56:45.440 I think so.
00:56:46.560 I think so.
00:56:50.600 Have you checked it?
00:56:51.420 Well, I'm not a turkey eater.
00:56:52.620 Remember, when you don't eat turkey, did Dan Crenshaw really threaten the cartels?
00:57:03.180 I didn't see any specific threat.
00:57:05.660 Did he do that?
00:57:09.680 I hope he did.
00:57:11.640 You know, when you see Dan Crenshaw threaten the cartels, that should be telling you something.
00:57:18.180 Because remember, whatever you think of Crenshaw, and I know you've got different opinions about him as a politician,
00:57:24.900 the one thing we don't doubt is that he's a brave military guy who seems to have a different relationship with risk and danger than the rest of us do.
00:57:37.180 And you might need somebody as brave as a Crenshaw to take on the cartels, because they would definitely go after him.
00:57:46.500 Right?
00:57:46.940 So you need somebody with gigantic balls to even talk out loud about going after the cartel.
00:57:58.420 Oh, by the way, on locals, remind me to tell you something else I can't say out loud here.
00:58:04.800 About the cartels.
00:58:07.160 So you're going to remind me about the cartels.
00:58:09.160 I'll tell you about that, too.
00:58:10.940 All right.
00:58:11.220 All right.
00:58:11.280 Crenshaw is as tough as an old boot.
00:58:17.240 That's what Peterson says.
00:58:24.820 All right.
00:58:25.440 I don't know if I'm going to do a live stream today just to say hi for Thanksgiving.
00:58:32.120 Whoa.
00:58:32.680 Hello.
00:58:32.980 Hello.
00:58:33.080 Crenshaw introduces Declaring War on Cartels Act.
00:58:37.460 God, I love the Locals platform, because they can paste an image in the comments.
00:58:42.380 So every time I ask a question, here's the exact answer.
00:58:46.320 So let me read this.
00:58:47.260 This is from Crenshaw, November 16th.
00:58:49.380 Today, Representative Dan Crenshaw announced the introduction of the Declaring War on the Cartels Act.
00:58:57.140 The bill is designed to combat transnational criminal cartels, illegal activities, with increased criminal penalties and the targeting of their finances.
00:59:05.660 As Mexico, Mexico, blah, blah, blah, blah, blah, developed into a national security threat.
00:59:09.160 OK.
00:59:09.800 Well, it's that's a different definition of war.
00:59:12.660 War against their finances is good.
00:59:21.260 I'm all for it.
00:59:24.060 But it's short of what we need to do.
00:59:28.120 Now, here's how I would do it.
00:59:30.740 I've said this before.
00:59:32.020 I would not just bomb a cartel operation.
00:59:36.520 I would make sure they had a few days to know it was going to be bombed.
00:59:41.860 Right.
00:59:42.020 So I think you do it the Israel way, where if there's, let's say, a Palestinian terrorist, Israel will destroy the family's home.
00:59:53.060 But they tell the family to get out, and then they bulldoze it.
00:59:56.180 Right.
00:59:56.300 They don't leave the family in there when they're destroying the home.
00:59:59.140 So I think giving the cartel notice so that you can try to avoid any collateral damage is the best you can do.
01:00:09.340 They probably would leave human shields there.
01:00:12.020 And then it's war.
01:00:13.680 You're going to have to do what you've got to do.
01:00:15.560 But I think you make them, you tell them that you know where their operations are, and you make them keep moving the operation or else get bombed.
01:00:23.100 It would at least disrupt it.
01:00:24.580 Now, I saw something from a Twitter user that I do not know is true, but maybe some of you do.
01:00:32.580 Is there anybody in a position who can confirm the following thing?
01:00:35.660 There's a heroin shortage in the country, and so fentanyl is all you can get.
01:00:44.880 Have you heard that?
01:00:47.080 Does anybody know?
01:00:48.320 I saw a big yes in capitals.
01:00:52.600 So I know there's somebody around here who's on heroin.
01:00:57.100 Unfortunately, I have a big enough audience that one of you is on heroin, and somebody knows.
01:01:04.320 And I got an emphatic yes on the heroin shortage.
01:01:10.220 So I've heard it now from two sources.
01:01:12.360 So if people are getting only fentanyl, there are two possibilities.
01:01:22.120 So there's a non-common sense thing happening with fentanyl that you need to understand.
01:01:29.000 If you don't know you're getting fentanyl, your risk of dying is a little higher because you didn't know to take care of yourself.
01:01:35.780 If you know you're buying fentanyl and you're an experienced addict, and unfortunately most are experienced,
01:01:43.240 you can actually take precautions to make sure you don't take too much or that you don't OD or that you've got Narcan nearby or something like that.
01:01:54.400 So people who are experienced might actually have the least risk.
01:02:00.060 So there's a possibility that the heroin shortage would make all addicts know they're getting fentanyl if they're injecting.
01:02:10.720 And if you told all of the injectables, hey, it's only fentanyl now, would there be more death or less?
01:02:20.280 I don't know.
01:02:21.780 There might be less because they would know what they're dealing with and they'd take the right precautions.
01:02:25.580 If that's a thing.
01:02:27.940 I mean, I've been told by addicts that's a thing.
01:02:31.220 But I suppose there's got to be some huge risk in getting the right dose, no matter what.
01:02:43.600 Yeah.
01:02:45.920 All right.
01:02:46.620 I'm being told that's not how drugs work.
01:02:57.280 Speed lovers want speed, downers lovers want downers.
01:02:59.880 Well, that's, I don't know if you're talking to me, but if you can't get heroin, you'll take fentanyl.
01:03:08.260 You're not going to switch to weed.
01:03:12.520 So I'm not going to argue with that point.
01:03:14.120 That's just true.
01:03:16.620 Yeah, and here's my final opinion on ivermectin.
01:03:21.160 You ready?
01:03:22.460 My final opinion on ivermectin is...
01:03:26.260 It doesn't exist.
01:03:31.980 That ivermectin is just a user interface icon.
01:03:36.360 And that below it, it neither works nor doesn't work.
01:03:39.900 That there's just some reality that's mysterious to us.
01:03:42.660 But at the user interface level, we can't decide.
01:03:48.480 But below that level, I think it's neither true nor untrue.
01:03:51.560 It just doesn't exist.
01:03:54.660 Now, I'm not expecting anybody to adopt my point of view.
01:03:58.180 But here's what I mean by it doesn't exist.
01:04:01.400 I believe that our two worlds that we bifurcated, the people who are sure ivermectin has been proven to work beyond a shadow of a doubt,
01:04:10.040 and the people who are sure it's been proven not to work beyond a shadow of a doubt,
01:04:14.140 they both will live complete lives.
01:04:16.420 And they'll never need to decide who was right.
01:04:20.280 Those two realities will just run forever.
01:04:23.220 And they never need to coordinate.
01:04:26.020 Now, by the way, this is my best evidence that we live in a simulation.
01:04:31.480 Let me tie together two concepts.
01:04:33.840 The biggest reason I believe we're in a simulation is this stuff.
01:04:38.080 The fact that we can bifurcate worlds and then live in them, and it doesn't matter.
01:04:42.060 Why?
01:04:43.440 If you were trying to program the Earth and write code, it would be impossibly complex to make sure that everything that changed anywhere in that world
01:04:55.320 sort of broadcasted its effect to the rest so that everything worked in a coordinated way.
01:05:02.580 It would be too difficult to keep your life and my life separate but also never conflicting.
01:05:10.780 Right?
01:05:12.060 You would need to program it so the conflict is explained away.
01:05:17.280 And the way we explain conflict when we have two different worlds and views of what's true,
01:05:22.320 how does the simulation let us explain it away without having to program them to be compatible?
01:05:30.420 It makes you believe the other people are wrong.
01:05:34.280 Right?
01:05:34.640 As long as you think the people who disagree with you have bad information or bad thinking,
01:05:41.140 you can just keep your own view of reality.
01:05:44.860 So you just make up a story that says, oh, I guess those other people are wrong about everything.
01:05:49.820 That's what they say about you.
01:05:51.000 So it would be easy to program a world where things are not compatible, but all the people in it explain the lack of compatibility away
01:06:02.340 by saying, well, the other people are just dumb or stupid or uninformed.
01:06:06.720 But if they knew what I knew, they would know what's true.
01:06:09.460 Once you see that the world is designed to make us think it makes sense, when in fact it never can,
01:06:18.800 then you realize you're in a simulation.
01:06:20.800 What is a simulation simulating?
01:06:31.040 Probably the creator's world.
01:06:34.180 You know, if you take the Elon Musk view, that it's very unlikely we're the original species,
01:06:39.420 more likely we're one of the simulations that any species will create at some point in their development.
01:06:47.020 We're probably some reflection of what they look like.
01:06:49.820 You know the whole, we're designed in God's image?
01:06:55.760 That might be true.
01:06:57.860 You know, our God might be some human-like programmers who made us to look like them,
01:07:04.320 because when they went into our, you know, when they inhabit us or use us to solve problems,
01:07:09.220 they want it to be like them.
01:07:11.600 That would make the most sense.
01:07:14.100 All right, I'm going to say goodbye to YouTube for now.
01:07:17.140 Happy Thanksgiving.
01:07:19.820 And I'll see you soon.
01:07:21.640 I'll see you soon.
01:07:22.700 Thank you.