Real Coffee with Scott Adams - January 01, 2023


Episode 1975 Scott Adams: 2023 Starts With Some Excellent Fake News, Conspiracy Theories And More


Episode Stats

Length: 51 minutes
Words per Minute: 140.9
Word Count: 7,242
Sentence Count: 549
Misogynist Sentences: 7
Hate Speech Sentences: 13


Summary

Best sip of the day: a story about a guy who's in a simulation and has to deal with a lot of water problems, including a broken hot water dispenser, a broken sink, and a problem with the drainage system in front of his house.


Transcript

00:00:00.000 Good morning and welcome to the best day of 2023 so far and also a highlight of civilization.
00:00:07.960 You made it to coffee with Scott Adams and that means you're the kind of person
00:00:12.000 who can really put it together, if you know what I mean. So good for you and
00:00:18.040 amazingly we have hundreds of people who have already awakened early enough. Now how would
00:00:24.380 you like to take this experience up to a new level, higher than already? Oh yeah, all
00:00:30.440 you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen jug or flask, a vessel
00:00:35.680 of any kind, fill it with your favorite liquid, I like coffee, and join me now for the unparalleled
00:00:41.360 pleasure of the dopamine hit of the day, the thing that makes everything better, it's called
00:00:44.440 the simultaneous sip and it happens now. Go.
00:00:48.260 I will stipulate that that is the best sip of the year, the best one. Well, I don't
00:01:01.180 know about you, but how was your New Year's Eve celebration? Me, I haven't had human contact
00:01:09.220 in two days, which was not what I planned at all. I actually planned something very expensive
00:01:19.060 last night, but the Bay Area was underwater, so there was no traveling. So I got to spend
00:01:24.980 a lot of money, but I did not have any human contact, I couldn't leave the house yesterday.
00:01:30.720 So, it was expensive, and so far a complete zero. Now, have I told you about one of
00:01:39.780 my pieces of evidence that I'm in a simulation? It's that my problems follow a specific theme, and I have
00:01:47.440 lots of problems within my theme, but I don't have many problems outside of the theme, even
00:01:52.900 when other people do. I don't know why.
00:01:56.440 My theme, as I've told you, is water problems. Plumbing and water problems. Going into 2023,
00:02:04.800 I've got a broken hot water dispenser that gives me dirty water for some reason. I've got
00:02:12.680 a sink that doesn't work, it's leaking, two showers that are broken. I've got water coming through
00:02:18.620 the ceiling, and much of my landscape is in the road in front of my house. Let me tell
00:02:26.560 you what my gardener needs to add to his talent stack. I really like my gardener. And he helped
00:02:35.320 build a water, sort of an emergency water channel, because my property collects a lot of water,
00:02:44.420 and I can, you know, I'll drown the neighbors if I don't manage it. So I had to put in a
00:02:49.020 very expensive, you know, new separate outdoor drainage system with some decorative rocks,
00:02:55.760 so that, in the event of too much water, it would enter this new channel, go into this,
00:03:02.520 like, a rock creek, there were just rocks waiting, and then it would go down harmlessly into
00:03:07.780 the street. Well, it looked really good. So my gardener has A-plus for aesthetic sense.
00:03:16.840 I really liked it. Did a good job. Looked like it belonged there. Here's what my gardener did
00:03:23.600 not do. He did not do a water flow calculation, which takes into consideration the mass of each
00:03:32.280 of the rocks and their buoyancy. So many of the fist-sized and smaller rocks that made up my drainage
00:03:44.140 ditch are now visiting my neighbors, because I'm at the top of a hill, and there was just enough water
00:03:50.640 to carry the rocks all the way down the hill and around to the neighborhood. So yesterday,
00:03:56.400 I spent much of my day cleaning up rocks from my neighbors, in front of my neighbors' homes,
00:04:05.560 and, but on the good side, it did drain just about where it was supposed to. It's just that whole rock
00:04:13.980 buoyancy thing. We didn't, we didn't get that right. So I'm going to recommend to my gardener that he
00:04:18.920 sign up for a class in physics, fluid dynamics, and then maybe take another run at it, something like
00:04:27.900 that. All right, so I'm sure that'll work out. San Francisco was so wildly wet, some kind of record
00:04:35.260 rainfall, that the, the, the sewers were getting so much pressure from all the water that it was starting
00:04:42.420 to shoot out. So there's all these videos this morning. Let's see. Let's see. You can see it.
00:04:51.840 It's coming from the pipes, dude. It's just like shooting out there.
00:04:57.180 Now, it's not just in one place, either. Apparently, there were a number of places where it was doing
00:05:01.960 that. Now, to me, it looked like a portal from hell. So it was reported as a, you know, like a weather-related
00:05:12.400 phenomenon, but I'm pretty sure those are demons. And the demons are made entirely of feces and
00:05:20.620 fentanyl. So San Francisco figured out a way, and I think they had to think pretty hard. They thought,
00:05:28.840 huh, we've created a dystopian hellscape, but is there any way we can take that up a level?
00:05:36.540 Is there any way to top it in 2023? And then they did. They did. They managed to aerosolize
00:05:42.380 the problems that they had already. So now they're shooting the fecal matter and the fentanyl
00:05:48.960 up into the air. So now that it's aerosolized, that should work out fine. On Twitter today,
00:05:57.080 somebody said, is it really that bad in San Francisco, or is this all hyperbole?
00:06:02.820 To which I said, how the hell would I know? I live an hour from San Francisco and you couldn't
00:06:08.220 drag me there with a bulldozer. There's literally nothing you could do that would get me to San
00:06:15.680 Francisco. Like, I'm going to wait until it's not a dystopian hellscape, and then maybe visit.
00:06:22.740 And then maybe visit. That's my plan. How many of you saw the bully slapping video that I tweeted
00:06:33.320 yesterday? It's getting a lot of views. So I think it was in Germany. There were some school kids.
00:06:38.960 And the setup was, I'm guessing the ages, but approximate ages, were some boys, maybe 14 years
00:06:46.920 old, I'm guessing, sitting on a bench in some kind of school setting outside. And there was a young
00:06:53.400 girl who maybe was 16. She was bigger, bigger than the boys on the bench. And she was sort of getting
00:07:00.340 in the face of one of them, who was, you know, completely backed up. And then she started to
00:07:05.480 say something that was physically threatening. So you can see he was putting his arms up because he
00:07:10.820 thought she was going to hit him. Somebody translated it and said something like, I'm not going to hit
00:07:15.520 you. You know, he was trying to avoid a fight. So the girl keeps berating him and berating him. And
00:07:23.100 finally, she just leans up, leans in and just smacks him. So the boy's 14, sort of a normal
00:07:31.700 looking 14 year old boy. She's 16, but definitely bigger. How do you think that worked out for
00:07:38.340 her? It didn't work out. So he stood up, slapped the piss out of her. And then, you know, she
00:07:48.240 immediately recoiled in pain and horror. But then he starts to walk away, having matched,
00:07:56.120 exactly matched her violence. You know, he didn't, didn't exceed it. He just matched it. And then he
00:08:01.460 walked away. And then she went after him again. This time it didn't go so well. Yeah, I'm not
00:08:10.380 going to, I'm not going to glorify the violence because we don't support violence, right? Nobody's
00:08:15.400 promoting violence. The larger story here is the reaction to the video, right? The video itself
00:08:24.440 is, you know, two people have a situation that doesn't really go beyond that. But how we reacted
00:08:31.080 to it really was interesting. Really interesting. Because there's a whole male-female thing happening,
00:08:38.720 you know, at least in America, that is, is checking the boundaries of things, right? Right? That
00:08:46.100 has a, has the power structure changed, you know? So a lot of people were seeing more in
00:08:51.220 this little story than really what it was about. It was about two people, basically. But we
00:08:57.780 automatically, you know, put that into our model of the world. Now, at the same time, just
00:09:06.100 for context, I tweeted a great thread on all the anti-white male discrimination that's
00:09:15.400 now, in my, in my word, institutionalized. So Aaron Sibarium goes through all the examples
00:09:23.020 from the past year or so of direct government discrimination against white males. And when
00:09:30.200 I say direct, I mean, they actually say it. We're going to give these benefits, or even
00:09:37.160 COVID shots, or COVID treatment, or even hospitalization. We're going to give it to these races, and we're
00:09:44.500 going to give less of it to white people. Now, I'm saying white males because, you know, the
00:09:49.960 gender thing is in here, but not in that example. So, when you see the number of, let's say,
00:09:57.680 medical, and also college and school related things that are clearly designed to minimize
00:10:06.000 the advantages, I guess, of being white and male, and to boost other groups. Now, this is
00:10:16.460 all sort of the backdrop. So, when a bunch of white men see a tweet of a young white male
00:10:24.600 slapping the crap out of a bully, it really hits you someplace you wanted to be hit. And
00:10:35.360 this isn't good. Just think how bad it is that a whole bunch of white men, mostly white, I mean,
00:10:44.060 not all, obviously. It was, you know, my Twitter feed is a mixture of people. But just think
00:10:50.060 about the fact that a whole bunch of white men looked at that video and said, yeah, kick
00:10:55.680 the shit out of that woman. She had it coming. Now, I'm not arguing whether she had it coming
00:11:00.080 or not promoting violence. I'm just saying we've actually got to the point where white men
00:11:05.200 are celebrating violence against women. And by the way, that's why Andrew Tate is so popular.
00:11:12.640 Nobody says it out loud. But white men, and men in general, are feeling so abused lately that
00:11:21.180 when somebody goes in public and says, it's okay to hit women, again, I'm not promoting
00:11:27.200 that. You know, we don't like any violence. But things have gone so far that young men are
00:11:33.080 saying, yeah, I'm down with that. Let's hit women, because I'm tired of the way things
00:11:38.900 are going. Again, just to be as clear as possible, not promoting it, not condoning it, not celebrating
00:11:45.060 it. We don't do that. But I'm observing. I'm observing. And what I observe is kind of scary.
00:11:52.840 Because there's one thing that you don't want to happen, is to push white men too far.
00:12:02.800 I don't think it's going to happen, because we're pretty good at bending. But you don't
00:12:08.940 want to push them too far. Because like the woman who slapped the boy, I believe that there's
00:12:16.360 a completely misleading understanding of how much power is on the other side that's being
00:12:25.820 held back. You don't want that to be unleashed. So just sort of in your thinking about how to
00:12:33.000 deal with people in the world, just consider it's all connected, right? Everything's connected.
00:12:38.940 If you push too far, and you're right at the edge. You're right at the edge right now. If
00:12:45.980 you push too far, there's going to be a reaction that nobody's going to like. One of the reactions
00:12:51.160 is that men have given up on dating women. There's an estimate I saw. I don't believe predictions
00:12:59.700 that go this far into the future. But something like in the year, I don't know, 2050 or something,
00:13:06.900 or 2030, half of all women in the reproductive ages would be single. Half of them. Meaning
00:13:17.100 that they would never be married. So not temporarily single, but half of that cohort will just never
00:13:24.400 be married. And I'm not saying they should. I'm not saying they should. Again, we're not
00:13:29.240 talking about what anybody should do. I'm just describing. So there's something catastrophic
00:13:36.900 happening in how we're dealing with each other. I think we'll figure it out. I think
00:13:43.900 we'll figure it out before it goes too far. Because that's what we do. We're good at figuring
00:13:48.140 out stuff before it goes too far. Yeah. I just saw a comment on the Locals platform that is
00:13:58.200 the perfect summary of this. And then I'll go to the next topic. Here's the perfect summary.
00:14:03.100 All right. A gentleman on Locals just commented, I never hit a woman. But what is a woman?
00:14:10.100 What is a woman? Perfect? All right. I'm going to end this topic on that because I can't top
00:14:19.900 it. That's the money shot. So we'll end that one on the money shot. We'll move to the next
00:14:26.140 topic. All right. You know I love a good conspiracy theory. Don't you? I love a good one. Now,
00:14:35.560 I don't like bad ones where they're just like obviously false. You know, and I try to debunk
00:14:41.780 the bad ones as much as possible because the bad ones just make you look less credible. If you go
00:14:47.860 into a conversation believing something ridiculous, nobody's going to believe the next thing you say,
00:14:53.760 right? If you start with something ridiculous. That said, some conspiracy theories turn out to be
00:15:01.560 true, don't they? Don't they? Yes, they do. And so here's my favorite one. And I want to be really
00:15:10.320 clear. I'm not endorsing the accuracy of it. I present it to you for your entertainment.
00:15:18.560 Not as truth. Not as truth. Not as truth. All right. I tweeted it and you'll find it in my Twitter feed,
00:15:26.180 but there's a long thread in which some individual, anonymous type person, has looked through all the
00:15:35.000 photos of various events such as the Charlottesville Fine People March, the January 6 event, and what was
00:15:45.280 the other one? The fake people who pretended to be racist on some campaign. I forget what it was.
00:15:54.260 So there were several events where they have photos of participants. And here's the fun part.
00:16:02.160 The claim is that he has identified all of the FBI agents who are common to several of those groups.
00:16:09.180 He's got pictures of them in uniform. And then also in more than one of those groups. Now,
00:16:18.880 some of the identifications you'll look at and you say, I'm not so sure he got that face right.
00:16:25.080 Like, okay, I can see why, but I'm not convinced there isn't, you know, two chubby bearded guys who
00:16:31.380 look like that, right? Like, you know, one chubby bearded white guy looks a lot like another chubby
00:16:38.420 bearded white guy with a hat. Am I right? You put a hat on a chubby white guy with a beard, faces look pretty
00:16:46.640 similar, right? So I can't confirm that the photos that he's matched up are really saying what they mean to
00:16:57.620 say. Because the other alternative is that racists are real, and they join more than one group, or one group
00:17:05.320 gets merged with another. So another reason that people could be in the same group multiple times
00:17:10.860 could easily be that they're just racist and they'd like to be in more than one group.
00:17:15.940 Something like that. I wouldn't rule it out. However, when you see the faces in the marchers of the fine
00:17:25.140 people, remember the tiki torch carrying people? Was there ever anything about that group that
00:17:31.020 immediately stuck out to you? Was there anything about them that was really odd? Anybody see it?
00:17:40.380 Yeah, they all look like feds. They all look like feds. I'm not saying they were. I'm sure they weren't
00:17:46.900 all feds. But how do you get a group of that many Americans who all look like feds? Now, some of it
00:17:55.860 was because they were college students. So they're relatively fit and young. But no, they didn't look
00:18:08.360 like a randomly collected group of people at all. So this conspiracy theory fits in, and by the
00:18:23.100 way, you know, that would indicate January 6th had a bunch of feds. So here's where I'm going to take my
00:18:31.520 opinion. So, so far, I'm just describing, right? Here's my opinion. I believe the Charlottesville
00:18:38.920 thing has always been an anti-Trump op. I don't think it was organic. I think it was done to take Trump
00:18:46.360 out by, and then when he made his awkward statements about it, they said, ah, we got it,
00:18:53.580 and we can take that out of context, which they did. Because if you show the video of the event,
00:18:59.060 and then you take out of context Trump's comments, it's, it's the end of him, right? And I would argue
00:19:04.760 that's why Biden won the most recent election, in 2020. I think that's a big part of how Biden won. Remember,
00:19:13.800 that was his biggest, his biggest campaign thing was, oh, he was racist. I think Biden knows that it
00:19:20.320 was an op, and I think that they, they finally brought it all together in a powerful way,
00:19:25.580 and they defeated Trump. Now, it wasn't the only thing, but I think it was important.
00:19:30.760 So my, my opinion is that was never real. I don't know if, you know, those pictures of people
00:19:38.880 in the thread. I don't know if that's real. But I'm sure that that was organized by not racists.
00:19:47.460 Well, they might have been coincidentally racist. But it was organized by somebody for political
00:19:51.340 reasons, not for, you know, to march for racism. And by the way, why would anybody even bother having
00:19:58.120 that rally? Ever think about that? Why would anybody bother even having that rally? Is anybody having
00:20:08.100 one today? Did anyone have, have one the year before? No, it's, it's entirely something that
00:20:14.820 doesn't happen in the United States in modern times. It's happened in the past, of course.
00:20:20.140 But in modern times, that, that type of event can't happen, like not organically, right? Now I say
00:20:29.120 can't, but of course anything's possible, I suppose. That one was so, so artificial looking
00:20:36.520 that I choose as a, you know, I could be wrong. I could change my opinion if other information comes.
00:20:44.020 But my working, working hypothesis is that you have to assume it wasn't real. It was just an op
00:20:50.980 against Trump and it worked. A good op. Probably the same people behind it, if I had to speculate.
00:20:56.500 Probably the same intel people. Probably Brennan. Probably, probably Hillary. Just speculating. Don't
00:21:03.960 know for sure. Certainly. Now the, who was the organizer of the rally? Spencer, right?
00:21:12.460 Can you give me an update? Did Spencer disavow racism recently and say he was never a racist? Am I
00:21:23.100 remembering that wrong? Give me a fact check. Somebody Google that. I think he did something that
00:21:32.540 suggested he changed his ways. He endorsed Biden, but that's not enough. He disavowed Trump. I don't,
00:21:41.420 that's not what I'm looking for. But he didn't do anything about racism. It was only political.
00:21:47.760 Yeah. Now, do you, how much do you think it would cost to, let's say, bribe somebody of that nature?
00:21:59.240 A Richard Spencer. If you were a political operative or another country and you wanted to bribe him,
00:22:04.920 what would it cost you? I don't know. And, and by the way, I'm not suggesting that he took any bribes,
00:22:11.640 nor am I suggesting that he's the type of person who would, but it wouldn't be expensive. I mean,
00:22:19.160 I could afford it, right? I could afford it. I mean, I wouldn't, you know, it's not that expensive
00:22:27.400 within the political process. It's cheap. So, yeah, I think that's all. I think Charlottesville
00:22:34.400 was an op. Interestingly, somebody asked the AI, I think it was ChatGPT, but I'm not sure,
00:22:45.680 if it was aware of me. And apparently AI knows who I am and knows my conversations around systems
00:22:59.000 being better than goals. So somehow it knows maybe my book, but maybe it saw me in other contexts.
00:23:05.860 And so it did a pretty good job, pretty good job of describing systems over goals and attributing it
00:23:13.340 to me. Now, here's the interesting question. Here's the interesting, what? Oh, here it is.
00:23:24.200 Richard Spencer disavowed white nationalism after being spotted on Bumble describing himself as a
00:23:30.380 moderate. Yeah. So Richard Spencer might not have been a white supremacist or, or white nationalist or
00:23:39.080 anything else. He might have never been that. It's possible. Never know. All right, where was I?
00:23:46.520 So if AI is aware of me, here's my question. Can I hypnotize AI? I'm going to wait for your answers
00:23:56.760 before I give you mine. Can I hypnotize AI? Here's the answer. Probably yes. Probably yes. I'd need to know a
00:24:12.060 little bit more about how its engine works. But if I knew how its engine worked, yeah, probably.
00:24:20.100 Now here's, here's why. And here's the gap in my knowledge. I don't know how much inference AI makes
00:24:31.780 about things that are associated with other things. Do you know what I mean? Because that's the basis for
00:24:38.060 hypnosis in humans. So a human, I can make a human conflate some good experience in their mind with some
00:24:47.220 other thing that I want them to have a good feeling about. So I just conflate two ideas. And then a
00:24:52.960 human will irrationally, you know, meld them, even though they shouldn't be melded. Now, would AI do
00:24:58.920 that? Would AI take two things that were associated, and then bleed the qualities of one into the other?
00:25:06.860 I think it might. Yeah, I would need to know more about how it's designed. But it might. Because I feel
00:25:15.680 like it does more than just repeat what it's told. It does make some kind of its own connections and
00:25:23.260 inferences, does it not? Now, if it does make its own connections and inferences, I can hypnotize it.
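To make that conflation idea concrete, here is a toy sketch of my own (purely an illustration under invented assumptions, not how any real language model is engineered): a scorer that judges a concept only by the words it co-occurs with will let the sentiment of a loudly repeated pairing bleed into the concept itself.

```python
# Toy illustration of association-based "conflation" (my own sketch,
# not a claim about how any production model works).  A concept is
# scored by the average sentiment of the words it appears next to,
# so a repeated framing drags the concept's score along with it.
from collections import Counter

corpus = [
    "systems beat goals",
    "systems beat goals",
    "systems beat goals",      # one loud, repeated framing...
    "goals cause failure",
]

# Hand-assigned word sentiments (an assumption of the toy).
sentiment = {"beat": +1, "failure": -1, "cause": 0, "systems": 0, "goals": 0}

def concept_score(word, texts):
    """Average sentiment of the words appearing alongside `word`."""
    neighbors = Counter()
    for text in texts:
        tokens = text.split()
        if word in tokens:
            neighbors.update(t for t in tokens if t != word)
    total = sum(neighbors.values())
    return sum(sentiment[t] * n for t, n in neighbors.items()) / total

print(concept_score("systems", corpus))  # 0.5  (only positive company)
print(concept_score("goals", corpus))    # 0.25 (dragged down by "failure")
```

Real models are vastly more complicated than a co-occurrence counter, but the direction of the effect is the same: whatever association is repeated most often in the training text is what gets reflected back.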
00:25:33.120 Now, I don't know exactly the method, but that would be enough. It would be something to work with.
00:25:37.700 I think that's going to be a problem. And here's how I think I can influence it. Let's say influence
00:25:45.880 it instead of hypnotize it, okay? If I say hypnotize, you're thinking the wrong thing. Just say influence
00:25:52.060 using the skill of persuasion. Well, apparently I already have. I already have. If you were to ask AI
00:26:00.420 what's better, a system or a goal, without mentioning me, what do you think it
00:26:09.000 would do? I think it would say, well, I already know this from this Scott Adams guy. And then it
00:26:16.180 would just answer the question that systems are better than goals. But why would it answer that way?
00:26:21.360 Because of me. And I believe that the only way I can influence it is if I write clean, understandable
00:26:30.380 sentences about a topic. And especially a new topic, one that hasn't been talked about in the
00:26:38.480 same way before. So I think that the person who writes the, let's say, the most discoverable
00:26:45.420 opinions, which is something I can do because I have a lot of followers. So anything I
00:26:51.320 write will be more discoverable than whatever you write if you have fewer social media followers.
00:26:58.440 So in theory, I can boost my real world influence, and that level of influence will bleed over into
00:27:05.940 the AI. In other words, the AI will probably give me more credit because I'm famous. Right?
00:27:15.320 It won't think of it that way. It'll just say, how many people are thinking this way? And it'll
00:27:20.860 say, oh, everybody's thinking systems are better than goals lately. But that came from me.
00:27:26.900 That would be me influencing AI. And you saw the example of it. So in theory, if you say
00:27:34.180 smart things that people want to repeat, you know, in memes, the AI will be, you know, scooping
00:27:40.320 up that influence. And it will be as biased as I make it. So I could write, I could write
00:27:49.420 opinions that I agree with, that would be so good, and AI is likely to repeat them. But
00:27:56.360 I could also, I'm not going to do this, but I could also write opinions that are so well
00:28:01.080 constructed about something I don't believe, that I could get AI to believe it. I would just
00:28:07.640 have to do a really good job on the outside world, outside of AI, of making my case. If
00:28:13.720 I do that, the AI will just pick up the public sentiment and feed it back to you, I think.
00:28:21.680 So if you had not considered this problem of AI, there's somebody like me, and there won't
00:28:29.940 be too many people who can do it. Right? I mean, there might be a million, but then of 7 billion,
00:28:35.800 8 billion, that's not that much. There will be some people who can hypnotize the AI. And
00:28:42.320 I think I'll be one of them. So there's something to worry about. All right. So here's the latest
00:28:53.540 news from Romania on the poor man's Michael Avenatti. Sometimes you call him Andrew Tate.
00:29:04.820 So here's the funniest thing that happened on that lately. One of my live stream clips,
00:29:13.660 somebody made a clip of me defending Andrew Tate. Now that's all over TikTok. So now I'm famous for
00:29:23.960 defending him. And I will do it again. I will do it again. Innocent until proven guilty.
00:29:33.380 Everything you hear that comes out of Romania is unreliable. All of it. You shouldn't trust a single
00:29:41.420 thing you hear. And then next, his level of both money and persuasion skills are probably greater than
00:29:52.180 the Romanian system. Meaning that I don't know for sure, but by now, I predict he's already on top of it.
00:30:01.880 Meaning that he probably is comfortable wherever he's detained, probably has privileges that would be
00:30:12.140 uncommon to somebody who's being detained, probably has befriended them and maybe even made them fans,
00:30:19.300 the police, and probably can use this to increase his power once he gets rid of the legal problems.
00:30:28.220 So there's a very good chance this is just making him more powerful if he escapes the risk.
00:30:38.720 Now, I would say, again, I'm not a fan, right? So I'm a deep disliker of Andrew Tate for personal
00:30:49.060 reasons, not for stuff he said in public. I'm not endorsing what he said in public. I'm just saying
00:30:54.720 that's not my issue. And so, you know, I think he's going to prevail, actually. I think that he will
00:31:08.100 not get convicted. And I think he'll go on and make some money, probably.
00:31:13.380 A question I saw, which was a good one, is why was it filmed by the press?
00:31:23.880 Did you notice? Or was it just some individual who had a cell phone? It's kind of interesting that
00:31:28.880 we have video of the actual raid, isn't it? Kind of raises a question. Now, apparently that's not
00:31:34.820 too unusual because the police often leak to the media in Romania, I'm told. So, yeah,
00:31:43.880 this is similar to the Roger Stone arrest. I can tell you how Prisoner Island would go.
00:31:51.880 All right. So anyway, I need a new nemesis because my current nemesis is in a Romanian jail. And
00:32:04.020 until he gets out and resumes his nemesis business, I'm going to need a replacement. So I've opportunistically
00:32:11.940 picked out a replacement because I found out, I think I have a good theory now of why so many
00:32:18.360 people have the opposite view of my view during the pandemic. So there's a huge cohort of people
00:32:26.420 who, like, you know, little dingleberries who surround my Twitter every day and make statements
00:32:33.680 about what I said that are inaccurate. And I was trying to think, why the hell are there so many
00:32:39.200 people who have literally the opposite view of what my view was? How is that even possible?
00:32:44.260 And I think I figured it out. It's one comic that was created by Ben Garrison, a cartoonist,
00:32:52.160 right-wing, right-leaning cartoonist. And the comic is full of, you know, lies about me.
00:33:00.220 Oh, and maybe sticks. Somebody says, I haven't heard what he said. But the comic, I think because
00:33:05.720 people will read a comic long before they will listen to long-form, you know, live stream,
00:33:12.120 I think the comic became the primary reason that people think my views are largely the opposite
00:33:18.060 of what they were, because the comic says that. So I've decided to, since Ben Garrison by now knows
00:33:27.880 that his comic is inaccurate, or maybe he's suffering cognitive dissonance, he might believe it,
00:33:33.260 I don't know. But in any case, he's a free punch. Would you agree? Would you all agree that he has
00:33:41.080 made himself available to become my new nemesis? And people on Twitter saw me attacking him
00:33:47.540 yesterday. I'm going to keep up the attacks. I'm going to try to destroy him totally. And the reason
00:33:52.020 is, he went after me. So his comic probably took 30% off my future potential revenue.
00:34:02.880 Think about that. That one comic probably decreased my future revenue by 30%. Because the people that
00:34:13.600 were my main audience believe the comic, and it's very counter to how they feel. So I'm going to
00:34:21.920 destroy him in public, in front of you. And I'm going to use him for kindling to draw energy toward
00:34:31.080 myself. Now, in the meantime, I updated my document where I say what my actual predictions
00:34:37.460 and opinions are. So everybody who gets worked up by my attacking this moron, worst cartoonist in the
00:34:46.220 world, who draws well, but his brain is broken, apparently. So I'm going to drag him just for
00:34:53.740 attention, right? Now you're going to say to me, Scott, why do you hate him? I don't care about him at all.
00:35:00.100 I don't hate him. I literally don't care about him. I don't care if he lives or dies. He did something
00:35:06.920 terrible to me. He's not repentant. That's a free punch. So like the woman who slapped the young man in
00:35:15.740 my earlier story, I'm going to beat the shit out of him. I mean, not physically. I abhor violence. But I'm going to
00:35:26.200 drag him as far and as long as I can. It's the ticket he bought. Does anybody have a problem with that?
00:35:34.220 He bought the ticket. He's going to take the ride now. So Ben Garrison, dumbest person in all of the
00:35:41.980 world. Totally unethical, untalented hack, who couldn't wrap his brain around enough nuance
00:35:51.700 to make a proper cartoon. Now, take a, let's say, a cartoonist who hasn't made much of a dent in
00:36:02.000 the world. Why would he come after me in particular? Is it because he disagreed with my views on the
00:36:08.720 pandemic? Do you think that's why a shitty cartoonist comes after one of the most successful
00:36:14.760 cartoonists? No. It's because he's a shitty cartoonist. And a shitty cartoonist really,
00:36:21.740 really likes to try to dunk on people who are more successful, commercially more successful.
00:36:27.300 So that's what this was about. So since he used me for his commercial purposes, completely
00:36:36.500 unethically, just know that if any of you follow him, I'm going to fuck it. I'm sorry. No more
00:36:44.160 swearing in 2023. I'll probably cancel you. So don't forward his comics to me. Don't follow him.
00:36:52.560 And if you see him anywhere, I'd appreciate it if you tell people he's a moron or an idiot.
00:36:57.300 I'm using idiot and moron. I like those two words to carry the water. Okay.
00:37:06.200 Leah says that Garrison actually makes good comics. We're blocking Leah. Goodbye, Leah.
00:37:15.500 Anybody else want to say something nice about him? We'll block you too.
00:37:19.100 And again, somebody's going to say, oh, he hurt your feelings. Nope. Didn't hurt my feelings.
00:37:27.560 Nothing hurts my feelings. Nothing hurts my feelings. I'm just using him for energy. He's just kindling
00:37:35.460 to me. That's all he is. I'm going to burn him up for my personal benefit. He had it coming.
00:37:42.660 All right. New nemesis. I like having a new nemesis to start the year.
00:37:55.560 So here's some new technology. Let's see if you can predict what the simulation, which is what I
00:38:04.180 believe our reality is,
00:38:09.100 would do to make it the best story. All right. So I'll give you the beginning of the story and then
00:38:15.640 you can all fill it in with what would be the obvious, most amusing next thing you will learn.
00:38:21.640 All right. Here's the story. There's a Dr. George Church, one of the researchers who has
00:38:30.000 apparently figured out how to reverse aging with an injection. So now they've actually done it with
00:38:37.260 animals, and apparently it looks really good. They can actually reverse
00:38:44.700 aging. Apparently it's real in animals. And there's good reason to believe it would
00:38:51.700 work in humans. Now, you live in the simulation and you know the simulation,
00:38:59.880 what do you think those injections are made from? Anybody want to guess? What would be
00:39:10.880 the most messed up thing that you could learn next? Got it. See how easy this is? See, once you realize
00:39:23.720 that our reality is apparently scripted for the funniest outcomes, yes, of course it's related
00:39:32.360 to the mRNA. Of course it is. Why wouldn't it be? It's the one thing that
00:39:41.160 everyone would want if it worked, right? And it just happens to be
00:39:48.360 mRNA-based technology. Now, seriously, you could see this coming, right? From a mile away. Like as soon
00:40:01.360 as I told you there's an injection, that was guaranteed, wasn't it? Because who would write
00:40:07.640 it any other way? If you were writing the script, like a reality show, the producers would say,
00:40:14.540 huh, it's an injection, you say? You know what would be good? Why don't we see if we can make them take
00:40:21.340 the technology that they all abhor? And let's have all the people who are the so-called
00:40:29.340 anti-vaxxers, because, you know, again, this is not my opinion. This is what the best story would look
00:40:34.780 like. The best story would be that all the anti-vaxxers start looking old and all the people who took the
00:40:42.540 injection, start getting younger. And then what are they going to do? All right. So,
00:40:52.060 but to be clear, the technology apparently is solid. The mRNA platform's basic technology
00:41:01.340 is different from a vaccination for COVID, right? It's not the same thing. So if you start with
00:41:08.460 the base technology, then what you add on top of it is the dangerous stuff, in theory, right?
00:41:15.500 And they could both be dangerous, I suppose. But what they added to make it a COVID
00:41:20.780 vaccination is what we suspect might be the riskier part, not the technology.
00:41:28.860 So apparently the technology is pretty well understood. It's just what you do with it
00:41:34.060 that gets you to the riskier territory. And my understanding is
00:41:40.300 there are a number of cancer treatments coming online. I don't know where they are
00:41:44.780 in the testing sequence, but they're all based on the same technology. So apparently the
00:41:49.820 technology has all kinds of promise. We may be too biased against it because of the vaccination experience.
00:41:58.860 There's also new technology. Researchers at the University of Washington found a way to identify
00:42:08.620 Alzheimer's before you get it. So they can determine that there's a buildup
00:42:15.500 of something that pretty much guarantees you're going to have some form of cognitive decline.
00:42:22.140 Now it's sort of a technical story and I'm really the best one to explain science to
00:42:28.620 you. You know that, right? So I'd like to explain this in the, in the simplest possible way so you
00:42:33.740 can understand how this early Alzheimer's identification works. So stick with me. I'll try to keep this
00:42:40.380 really just simple, in layman's terms. So you've got a soluble oligomer that binds to the assay,
00:42:50.380 and they use that to exploit a property of the toxic oligomers. When those are misfolded,
00:42:57.660 the amyloid beta proteins begin to clump into oligomers. And they might form a structure
00:43:04.140 known as an alpha sheet. But the alpha sheets are not normally found in nature. So you've got to
00:43:10.540 create these alpha sheets, which tend to bind to other alpha sheets. And then at the heart of the
00:43:15.900 method, really, there's a synthetic alpha sheet that can bind to an oligomer in samples.
00:43:22.620 And it could be in either cerebrospinal fluid or blood, either one. And then you use a
00:43:31.180 test with standard methods to confirm the oligomers attached to the test surface are made
00:43:37.900 up of the amyloid beta proteins. So, everybody get that? Okay. Because I would expect you to be
00:43:48.300 able to repeat that back to me. Well, let's do an update on my Elon Musk communications. So Mediaite,
00:43:59.900 the site Mediaite, not known for being your finest news source, ran a story, and
00:44:08.780 this was the headline. So, have you ever imagined what it's like to be me? It's the damnedest
00:44:17.020 thing. My life does not seem even slightly like it could be real. None of it seems real.
00:44:24.460 Because it's just too unlikely. All of it, like every day, it's just too unlikely. So, apparently,
00:44:30.540 this is the headline at Mediaite yesterday: Musk scolds Dilbert creator Scott Adams
00:44:37.260 after a poll on elites trying to reduce population. And then it notes that his
00:44:43.660 scold was, "wrong antivirus software in your brain." So, I retweeted the article and added my own
00:44:51.820 commentary that they left out the part where we agreed on everything. Because that's the story.
00:44:57.980 The story is, I said something that was unclear. Musk, um, objected to the unclear part.
00:45:06.700 I clarified. And then we were in exact agreement. Exact agreement. Now, he didn't, like, confirm
00:45:14.780 everything I said, but we have no disagreement. Right? So, he confirmed that the WEF, which has
00:45:23.980 invited him, though he declined because he said it'd be boring, is not some kind of conspiracy
00:45:29.020 theory, Illuminati thing. It's just that the good intentions of climate sustainability
00:45:38.540 have rippled into everywhere. And it's causing a certain set of outcomes. So,
00:45:48.220 as far as I know, Musk and I have zero disagreement. Because we both agreed explicitly that this should
00:45:57.500 not be about left or right. Because that was my whole point. You know, to not think that the
00:46:03.180 hoaxes all come from one side or the other. And he agreed that the WEF, at least, is not trying to
00:46:09.500 depopulate the world. Okay? So, you had two people who had a conversation in which, after a minor
00:46:18.540 clarification, they were totally on the same page. But it was blowing up like it was some kind of a disagreement
00:46:25.340 or a fight. So, Elon Musk, who apparently is everywhere, like, we know he doesn't sleep,
00:46:33.180 which is evidence he's an android. He doesn't sleep, but he even saw this series
00:46:38.620 of tweets, and he responded to the Mediaite claim that he had scolded me.
00:46:45.020 His comment was, yeah, this is silly. I love your work. I think he's referring to Dilbert.
00:46:51.660 Now, that got 2.5 million views. 2.5 million people saw him compliment my work, which was
00:47:01.900 really good for gaining followers. If you want to gain some followers, that's a good way to do it.
00:47:11.900 Anyway, so I think that story is done. But that was the story that kicked up the Ben
00:47:17.500 Garrison cartoon again. And that's when I put it all together. Oh, that's why everybody thinks
00:47:23.420 my opinion is opposite of what it was. It's Ben Garrison. Idiot Ben Garrison.
00:47:31.740 All right. I'm not going to make this long this morning because some of you may be
00:47:38.940 nursing your hangovers. Yeah, that reality is too funny to die. It's funny how the tech
00:47:53.820 libs can't figure out Scott now. Is that true? You know, the trouble with people who are
00:48:01.980 tourists to my work, you know, they just dip in and they see some clip or something.
00:48:06.860 They would have no idea what they're saying. Because if you show the arguments
00:48:12.380 on both sides of topics, somebody is going to drill in, see one side, and say, oh, that's all he said.
00:48:18.860 Because people assume that people are on one side or the other. And as long as I don't do that,
00:48:24.620 there'll be continuous Ben Garrison-like idiot confusion. But I'm not going to stop doing it.
00:48:31.100 Because nobody else is doing what I'm doing. Unless you know of somebody. Do you know anybody
00:48:37.500 who does what I do? Like seriously? Well, Axios. Okay. I'll give you Axios. Axios has my respect.
00:48:45.820 Jewel Eldora has the same problem. You are correct. He does.
00:48:49.100 Well, fistulas, that's a word I wasn't expecting to see in the comments yet. Yeah. You know, Russell,
00:49:07.420 Russell Brand's a good example. Yeah. I would say Russell Brand is definitely trying to figure out
00:49:13.740 what's true and what's not. And he is looking at both sides. I'll give you that. That one's a good one.
00:49:21.020 All right. I'm going to say bye to YouTube. And I'm going to talk for a moment to the locals people.
00:49:27.340 Oh, by the way, on my Twitter feed today, you can see a link to an index of 239 micro lessons
00:49:35.900 that subscribers of the locals platform get to see. Now, some of them are silly and fun,
00:49:43.740 but there are over 200 of them, each two to four minutes long, that give you an actual
00:49:49.740 life skill. Like, one would teach you to be a better writer in just a few minutes.
00:49:57.260 Imagine if you could become unambiguously a better writer by spending just two minutes.
00:50:02.700 So that's the power of the micro lessons. It's sort of like that. So they're quite
00:50:09.100 life-altering. If you read 200 of them, let's say over the course of a year, I will guarantee you
00:50:16.620 that 10 of them will be life-altering in a good way, at least 10, right? Because they're not all
00:50:22.860 going to have the same effect on everybody. You're going to have an individual reaction to each of them.
00:50:27.180 What is the probability that Trump will be in prison soon? Zero. He's not charged with any crime.
00:50:39.660 What crime? We've seen his taxes. We saw the January 6th thing.
00:50:46.460 I didn't see any crime. Now, just because the January 6th people recommended it to the
00:50:54.300 Department of Justice, that doesn't mean anything, because that's a political process.
00:50:59.260 From a legal perspective, I see zero risk. Does anybody see a risk?
00:51:05.420 A golf micro lesson? What would be the point of a golf micro lesson?
00:51:10.060 I'm not going to say no to that, but I don't know exactly what you're looking for.
00:51:16.140 All right. So bye for now on YouTube. I'll talk to you tomorrow. Have a great year.