Real Coffee with Scott Adams - August 17, 2022


Episode 1838 Scott Adams: Trump Flushes Lizard Cheney Down The Dynasty Toilet, HOAX List Is Up To 14


Episode Stats

Length

39 minutes

Words per Minute

149.2

Word Count

5,953

Sentence Count

432

Misogynist Sentences

9

Hate Speech Sentences

14


Summary

Arizona opens a program letting public school funding follow students to charter, private, or home schooling; Marvel begins streaming She-Hulk; Scott falls for a fake version of Alinsky's Rules for Radicals and concludes the real ones are generic; Liz Cheney loses her primary by about two to one; CNN covers hackers finding vulnerabilities in voting machines; and court evidence shows one FBI agent drove the Gretchen Whitmer kidnap plot.


Transcript

00:00:01.000 Good morning, everybody. Welcome to another greatest moment of your life. It's just one
00:00:08.080 after another now. And today will be special. It'll be incredible. Lots of stuff happening.
00:00:14.440 It's all fun. And if you'd like to enjoy this experience at the maximum rate, well, all
00:00:22.800 you need is a cup or a mug or a glass, a tank or a chalice, a canteen, a jug, a flask, a vessel
00:00:29.160 of any kind. Fill it with your favorite liquid. I like coffee. And join me now for the unparalleled
00:00:36.960 pleasure. It's the dopamine hit of the day. The thing that makes everything better. It's
00:00:43.300 called the simultaneous sip. It happens now. Go. Somebody emailed me the other day that
00:00:57.080 they have a coworker whose actual natural name is Brandon Fell. It's probably not the only
00:01:06.120 one. It sounds like it might be a common name. Brandon Fell. Well, that's his name. All right.
00:01:12.080 I didn't put this in my notes, so I'm going to say this first. I just saw a tweet by Corey
00:01:17.700 DeAngelis saying that Arizona, and I think they're the first state, now has opened up
00:01:24.700 their program that money can follow students so they can do homeschooling and charter schools
00:01:30.520 and whatever. And it's live. Apparently the program is live now. So parents can take the
00:01:38.660 funding that would have gone to the public school and take it with them to a private school.
00:01:42.080 That's amazing. Now, this is the best of America, in my opinion. This is a system that's working
00:01:50.560 exactly the way it should be. The states are the laboratory. Arizona wanted to go first.
00:01:57.460 And I don't know, maybe in a year or two years, we'll have a pretty good idea how it worked out.
00:02:03.440 Maybe two years is short. I don't know. But we'll have some indications.
00:02:06.460 Now, is this the best idea in the world or the worst idea? We don't know. That's the beauty of it.
00:02:14.140 I think it's a good idea. Seems like it would be. Competition and all that. But, you know,
00:02:19.820 unintended consequences, so we'll see. But that is a really, really big moment in the, let's say,
00:02:30.360 the integrity of the republic. Because as you know, our school systems are our biggest weakness right
00:02:37.300 now. If we're miseducating the youth, we're not going to have a good outcome. And fixing that's the
00:02:43.480 biggest, I think, the biggest challenge we have. And not that it's hard to do. It's just the most
00:02:50.060 important thing to do. Because if you get everybody on the same page and they're educated and they're
00:02:54.280 doing a good job, well, everything else seems to work out a little bit better. So that's all good
00:02:59.620 news. More good news. Oh, my God, it's just all good news today. The Marvel has now started to stream
00:03:08.580 the She-Hulk. So this is your Marvel Hulk movie in which the new Hulk is a woman. You still have the
00:03:16.200 old Hulk. You know, the obsolete, patriarchal male Hulk. But we have a new, improved, woke Hulk
00:03:24.720 called She-Hulk. And I think this is starting off a good, sort of a good pattern. I'd like to see
00:03:32.740 more of this. Because here they've got the pronoun for the Hulk, like, right in the title of the movie,
00:03:38.480 so you can't get it wrong. Don't you think they should do that with all movies going forward?
00:03:44.120 Like, Captain America? You know, it could be, like, He-Captain?
00:03:52.880 Yeah, I feel like we should just, you know, what about, what are some other movies?
00:04:00.280 Like that, yeah, we should just put the pronoun in front of all of them. He-Iron Man? Or maybe
00:04:05.880 they? Yeah, so instead of Iron Man, it would be Iron They? Or could it be Iron Woman?
00:04:14.120 But my problem with this whole thing is, is that it's not really woke enough. Is, this
00:04:22.300 is 2022. Am I, am I wrong? Did I get the, did I get the year wrong? It's 2022. Now, I realize
00:04:30.720 it takes years to actually get a movie made. But I feel like this is already somewhat out
00:04:36.700 of step with the times. It's not nearly woke enough, is it? So I'm going to skip She-Hulk.
00:04:43.620 I'm not going to, this will be actually the first Marvel movie that I'm not watching. Because
00:04:47.900 just out of principle, I'm going to wait for LGBTQ plus Hulk. I want to, I want a Hulk that's
00:04:54.180 either disabled or has something, a little extra flavor. Like just being a woman, a little bit
00:05:03.400 boring in 2022. If somebody said, oh, I invited somebody to the party. You say, who'd you invite?
00:05:10.120 A woman. You'd be like, oh, that's a little boring. But also she's bringing a man. You're
00:05:17.400 like, God, that's boring. A man and a woman. Give me a little LGBTQ action, a little flavor,
00:05:25.320 please. I like a little spice in my movies. And She-Hulk is just not getting it done in 2022.
00:05:31.220 You got to ramp it up, ramp it up a little bit. Well, I have a category of humor that I love,
00:05:38.220 which I'm going to share with you. It's called the shortest dismissive insult. For some reason,
00:05:46.440 I love it when somebody can get the crispest, you know, the most cutting insult in the smallest
00:05:52.480 number of words. I'll give you an example. For example, if you were to say that Brian Stelter
00:06:00.720 of CNN is the poor man's Jeffrey Toobin. That's good, right? Yeah. Brian Stelter is the poor man's
00:06:09.460 Jeffrey Toobin. Come on. You love that. I know you do. I know you love that. Right? It's a good
00:06:15.200 category of humor. But here's one from somebody on Twitter named Het Puniza Paradij, which I think I
00:06:24.560 nailed it. I think I nailed that last name. But he tweets about the She-Hulk movie. And I just love
00:06:32.200 this. So this is one, two, three, four, five words. Are you ready for five words that sum up the
00:06:40.340 She-Hulk movie without ever having watched it? This is what Het gives us. This is Green Karen.
00:06:47.940 Try to forget you ever heard that. Good luck. You'll never be able to see She-Hulk again without
00:07:01.700 thinking Green Karen. Is that the funniest thing you've seen in a while? Green Karen. Oh, God.
00:07:12.860 We'll get to Liz Cheney and all the fun stuff. So yesterday I fell for a hoax, which is not the
00:07:22.300 first time. Because you see stuff, when you see stuff on Twitter that looks like what you think
00:07:30.200 you expect, it's real easy to fall for it. So I fell for one yesterday for, I don't know, about a
00:07:36.020 minute or two before somebody said, hey, that's a hoax. So I took it down and confessed that I'd
00:07:41.500 fallen for a hoax. So it was somebody had published the so-called Rules for Radicals.
00:07:47.840 How many of you heard of the Rules for Radicals? So this is from the 60s or something. And
00:07:54.500 allegedly these Rules for Radicals tell the left, yeah, it's Alinsky's work, tell the left how to
00:08:02.800 manipulate and, I guess, gaslight the public until they can get what they want. So the eight rules
00:08:10.720 that I saw that I retweeted were fake ones. And the fake ones had stuff like, you know,
00:08:18.420 run up the debt and bankrupt the country. And I thought to myself, now let me make a confession
00:08:25.280 here. This would be my second confession of error in five minutes. I got a lot of them
00:08:31.080 today. So my second confession is that although people talk continuously about the rules for
00:08:41.340 radicals, I believed without doing research, so here's my confession part, and you can judge
00:08:48.140 me, you can judge me unfairly for this, if you like, or even fairly, I never bothered to
00:08:55.460 look into them. So it's one of these things you hear all the time. Oh, these Rules for Radicals,
00:09:01.320 they're using those rules again. And for whatever reason, I was so sure that those rules were complete
00:09:08.920 bullshit that I never bothered to look at them. I just dismissed them as, you know, more just
00:09:15.080 political talk. But I finally had to look at them because I had accidentally tweeted the fake
00:09:20.480 ones, because I thought the fake ones were proving my point. The fake ones looked like
00:09:25.180 bullshit. And sure enough, they were. They were actually hoax. They were fake ones. So
00:09:30.220 I looked at the real ones. What do you think? Are the real ones, are the real ones like really
00:09:36.580 powerful work? Or are they just bullshit? What do you think? Are the real Rules for Radicals
00:09:44.260 most powerful plans for taking over countries? Or just a bunch of bullshit? Well, I looked
00:09:50.960 at them, and as far as I can tell, it's just generic stuff. Let me run through them quickly.
00:09:57.280 I'll do this just quickly. But this is the most generic stuff I've ever seen. There's nothing
00:10:01.760 here. To me, it's completely empty. But it could be that these were, maybe these were groundbreaking
00:10:09.160 ideas when they came out in the 60s. Is that possible? But in 2022, there's nothing on here
00:10:14.960 that's even interesting. These are just completely normal things. Let me give you the idea. Number
00:10:20.440 one, power is not only what you have, but what the enemy thinks you have. That's called bluffing.
00:10:28.380 Did you need the Rules for Radicals to know that pretending to have more power than you do
00:10:33.160 is a good thing? Well, what did that add? Literally every human knows that bluffing can
00:10:40.100 be a good thing. Number two, never go outside the expertise of your people. Well, isn't that
00:10:47.060 a little bit commonsensical? You should stay with what you know instead of talking about
00:10:51.760 things you don't know, because that would embarrass you. Or it's not going to be as effective.
00:10:56.840 Isn't that the most basic thing you would ever do in any domain, is to stay within your expertise
00:11:02.320 of your people, not necessarily your own. But then three is, whenever possible, go outside
00:11:08.380 the expertise of the enemy. In other words, if you have some knowledge or expertise they
00:11:14.420 don't have, use it against them. Did you need to be told that? Who wouldn't do that automatically?
00:11:20.940 If you have some expertise and your opponent does not, wouldn't you take advantage of that?
00:11:26.560 Did you need to read rules to find out to do that? These are the most ordinary, basic
00:11:32.100 things anybody would do. How about make the enemy live up to its own book of rules? Well,
00:11:37.480 what's the most common thing that politicians do? They accuse the other of being a hypocrite,
00:11:42.360 making the other one live up to their own rules. The claim of hypocrisy is the most empty,
00:11:49.800 useless thing that anybody ever used. There's no power in any of this stuff. This stuff is all
00:11:54.700 inert. So far. Then it says, ridicule is man's most potent weapon. You know, so you should use
00:12:00.780 ridicule and mock your opponents. That's what everybody does. That's what everybody does and
00:12:07.140 always has. Both sides mock their opponents. What insight did we get by having this on the list?
00:12:15.340 How about a good tactic is one your people enjoy? Did you need to be told that? That people will do
00:12:22.060 things they like to do more than things they don't like to do. Was that like an innovation in the
00:12:29.100 60s? How about seven? A tactic that drags on too long becomes a drag. Well, everybody knows you need
00:12:36.400 to keep your energy high. Duh. Keep the pressure on. Duh. That's everybody on every topic all the time.
00:12:46.860 The threat is usually more terrifying than the thing itself. Yes, fear works as persuasion.
00:12:54.120 Did you need to know that scaring people works? Who in the world needed to tell you that?
00:13:00.620 How about the major premise for tactics is the development of operations that will maintain a
00:13:06.700 constant pressure upon the opposition. So you should have an organization that keeps the pressure on.
00:13:11.740 Again, did you need to know that if you're organizing, you need an organization?
00:13:19.660 There's nothing here. How about the last one?
00:13:25.140 If you push a negative hard and deep enough, it will break through
00:13:29.920 into its counterside, which is based on the principle that every positive has its negative.
00:13:36.220 All right. So first of all, it's just a bunch of word salad, obvious stuff about if you push a
00:13:42.740 negative hard enough, it'll make a difference. Well, everybody knows that. If you, if you accuse
00:13:48.820 your opponent of something long enough and hard enough, this starts, starts working. That's both
00:13:55.540 sides all the time. Number 12, wait, is there two 12s? No. Number 12, the price of a successful
00:14:01.780 attack is a constructive alternative. All right. Well, that doesn't even mean anything.
00:14:08.000 The price of a successful attack is a constructive alternative. Literally means nothing.
00:14:14.740 Number 13, pick the target, freeze it, personalize it, and polarize it. So here he's saying, you know,
00:14:21.560 if you have a, if you have a political policy topic, you should put a person on it so you can attack the
00:14:29.320 person, which is what everybody does. Every one of these rules for radicals is what every politician
00:14:36.220 does in every situation. There isn't a single thing here that's even the least bit interesting.
00:14:42.460 Am I wrong? Did you hear anything on this list that you said, oh, now we're in trouble?
00:14:49.840 They've seen these rules. This is complete nonsense. It's completely useless twaddle.
00:14:57.340 And I can't believe anybody would need to read these to know to do these things.
00:15:03.380 Demonize your opponent? Oh, it's very good that somebody told me to do that. I never would have
00:15:09.100 thought of that. This is literally NPC programming. Because every NPC, in other words, somebody who
00:15:18.340 didn't even have a complete brain, would know to do this stuff. All right. So I'm going to stick
00:15:24.460 with my, it was bullshit. I'm seeing people compare Kamala Harris to the boss from Dilbert.
00:15:32.740 I told you eventually every story comes back to me, even if I don't want it to. I want to see if I
00:15:37.940 can find her latest word salad. It's a beauty if you haven't heard it. Pretty sure I can find it.
00:15:47.300 We know that we really are quite behind in terms of maximizing our collective understanding about how
00:15:56.020 we will engage on the technology of today. And what we can quickly and easily predict will be
00:16:02.740 the technology over the next decades. So to maintain our position as the United States of America on this
00:16:09.480 issue, it is critical that we work together to understand where we are, to recognize and have
00:16:16.920 the courage to speak truth about what is obsolete, and then to partner to ensure that we are speaking
00:16:23.920 the same language with the same motivation, inspired by the opportunity of it all, but then doing the work
00:16:32.880 of updating how we've been talking and thinking about our exploration in space. We know that we really
00:16:41.280 are quite... Now, do you think it's fair to compare her to the pointy-haired boss in the Dilbert comic strip?
00:16:51.840 Do you feel that maybe that's not too far off, is it?
00:16:57.480 Wow. You know, it's funny because every time I see one of these clips, I think to myself,
00:17:07.160 well, she's going to tighten up her speaking after that, you know, because she's not going to want
00:17:11.860 to do this word salad again. But didn't you hear that her speechwriter quit? Can you do a fact check?
00:17:18.220 Didn't her, or was it her communications head or a speechwriter? Like whoever was in charge of
00:17:23.800 helping her speak better, quit? And I'm thinking, how could you ever get another job after that?
00:17:30.960 Imagine putting on your resume, Kamala Harris's speechwriter.
00:17:36.940 It sounds like a punchline, doesn't it?
00:17:41.440 I mean, seriously. Imagine looking at, you know, you've got a job applicant.
00:17:45.800 Yeah, let's see, your last job was, let me see, you were Kamala Harris's speechwriter.
00:17:50.840 I think we're done here. The door's over there.
00:17:57.980 All right. Germany has decided to postpone closing its last three nuclear plants. Now,
00:18:05.120 this is an update on a story. They had already decided to keep open some other ones, but now
00:18:11.420 I think three more. So Germany is really getting a lot of religion on nuclear power, as they should.
00:18:19.000 And Michael Shellenberger and his team did a lot of persuasion on this one. So they get the win.
00:18:27.440 They get the win.
00:18:30.360 All right. So Liz Cheney lost her primary to a Trump-endorsed candidate. I guess she won about,
00:18:38.580 she lost about two to one. It wasn't even close.
00:18:40.700 So she had won her prior elections by overwhelming majorities, but she just got slaughtered.
00:18:48.580 Now, how is the news covering it? Well, CNN says that it's more a sign that it's Trump's party
00:18:56.280 and that eight of the ten Republicans who voted to impeach Trump ended up losing or having to retire
00:19:05.040 or something. Okay. So that's CNN's thing. And I'm thinking, is it 100% because Liz Cheney
00:19:15.240 wanted to impeach Trump? Is that the only problem with Liz Cheney, is that she went after Trump?
00:19:25.260 Because to me, it looks like there's some Liz Cheney problems here that have to be addressed.
00:19:30.660 I'm not sure that she did anything but some kind of weird personal vendetta to self-aggrandize.
00:19:43.080 I mean, there's nothing about what she did that even seems laudable at all.
00:19:49.480 She was trying to do the, I'm above it all, and I'm better than you Republicans.
00:19:53.660 It completely failed because she ended up being a puppet for one of the biggest hoaxes.
00:20:00.660 So the January 6th thing, I'm just going to call a hoax, because I think that's fair enough,
00:20:06.820 at least in terms of it being an organized insurrection. That part's a hoax.
00:20:12.200 And so she basically got taken down by the Democrats.
00:20:17.680 The Democrats basically attached her to their giant hoax, and they guaranteed that she was
00:20:22.680 going to lose her job, which was, I suppose, maybe kind of clever of them in some weird way.
00:20:30.240 But to me, Liz Cheney, of course, she will keep on the fight. They always say that.
00:20:35.320 But I feel like she's Captain Ahab in Moby Dick.
00:20:39.420 I feel like she's just chasing her own personal white whale that happens to be an orange whale in this case,
00:20:45.400 great orange whale, and God, I want to say a word I don't want to say, but I'm not going to do it.
00:20:56.140 Let's just say that Liz Cheney may have come across as not a person that you want to support.
00:21:03.820 That's the best I can do.
00:21:05.000 Well, CNN did this great gaslighting piece as part of their ongoing hypnotizing of the public.
00:21:16.360 This seems to only work on their base.
00:21:18.580 So listen to this.
00:21:19.420 So CNN attended a hackers convention in which part of the hacker convention was
00:21:25.740 they had access to a bunch of voting machine hardware and software.
00:21:30.120 And then the hackers were going to look for vulnerabilities in the voting machine software.
00:21:36.500 Now, how do you think CNN would cover a story about hackers looking for vulnerabilities in voting machines?
00:21:45.800 How would they handle that?
00:21:47.060 Because it would be counter to their narrative to say that the voting machines have vulnerabilities.
00:21:53.240 But on the other hand, that is the main context of the story
00:21:57.320 because the hackers did, in fact, find vulnerabilities.
00:22:01.500 Now, there wasn't much detail about those vulnerabilities,
00:22:04.340 but even CNN reported, yes, we found vulnerabilities.
00:22:09.360 Here's how CNN reported it.
00:22:11.960 Hey, hacker guy whose only expertise is hacking,
00:22:16.780 were there any vulnerabilities?
00:22:18.520 Yes.
00:22:19.740 Did you find any evidence that those vulnerabilities were exploited to change the election?
00:22:24.640 Hacker guy says, no, no, absolutely not.
00:22:29.000 No evidence that anything bad happened in the election.
00:22:31.900 Absolutely not.
00:22:34.160 What's wrong with that?
00:22:37.720 Here's what's wrong with it.
00:22:40.580 How are they going to find evidence that the election had been hacked
00:22:44.220 by looking at unconnected machines sitting on a table?
00:22:49.660 How is he going to look at a clean voting machine with no data on it
00:22:54.500 and determine that the election had been rigged
00:22:57.300 by looking at a machine that may not have ever been part of the voting process?
00:23:02.840 Ridiculous.
00:23:04.100 But the way they presented it was that he was talking with some authority,
00:23:07.700 as if they could determine through their hacking skills
00:23:11.940 that these machines had not been hacked.
00:23:14.940 Now, they could determine for sure that the machines were hackable,
00:23:19.060 and that seems to be, you know, something nobody's arguing about.
00:23:25.260 But then they went to the point where they fooled you into thinking
00:23:28.440 this guy could tell that nothing had happened in the actual election
00:23:33.140 by looking at a piece of equipment
00:23:35.280 which in all likelihood had never been near an election.
00:23:39.080 It probably was an extra one that the voting company gave them.
00:23:43.180 Well, let me ask you that.
00:23:45.440 Where do you think they got them?
00:23:47.200 Do you think they got the spare voting machines from, let's say, counties that used them?
00:23:53.280 Did they go to a county and say,
00:23:54.640 do you have an extra that we can just use and take apart?
00:23:58.520 Maybe. I don't know.
00:23:59.760 Or do you think it's more likely they went to the voting machine companies themselves
00:24:03.300 and said, could you give us a blank that we can play with?
00:24:08.040 And if it came from the voting machine companies themselves,
00:24:12.400 would it necessarily look exactly like the ones that were in the election?
00:24:17.600 And let me ask you this.
00:24:19.780 Do you think there's one version of voting machine out there?
00:24:23.620 Isn't there all kinds of different software versions?
00:24:28.300 What software version got tested?
00:24:30.500 Because do you imagine that every voting machine had exactly one version of software on it?
00:24:34.500 Don't you think that there are dozens of software patches and versions all along the way?
00:24:42.120 They probably saw the most recent software, wouldn't you say?
00:24:46.920 I assume that these had software on them, of course.
00:24:50.780 So did the hackers see the software version that was in the election,
00:24:56.380 or did they see the most modern software version,
00:24:59.700 which one assumes might have some patches of it, who knows, updates?
00:25:04.300 There is absolutely nothing about this that tells you whether or not the election was rigged.
00:25:10.840 It just tells you that it was totally possible.
00:25:14.440 And here's the story that makes me crazy.
00:25:17.920 What is the most likely way an election would be rigged electronically?
00:25:22.940 What is the most likely way?
00:25:24.820 Is the most likely way an outside hacker gets into the system?
00:25:30.400 Is that the most likely way?
00:25:32.940 It's not.
00:25:34.100 But we keep talking about it like it is.
00:25:36.520 The most likely way, by a factor of, I don't know, 100 to 1?
00:25:42.260 20 to 1?
00:25:43.660 I mean, it's not even close.
00:25:45.560 The most likely way it would happen was an insider.
00:25:49.280 Now, what can they tell us?
00:25:50.800 Did the hackers test any insiders?
00:25:52.580 Did the hackers put any devices into the skull of all the employees who have access to the data
00:26:00.900 and say, we checked your brain, and we don't see that you did anything?
00:26:04.600 There's no way to check that.
00:26:06.600 So the biggest risk, we completely ignore, because we don't have a way to check.
00:26:11.960 Right?
00:26:13.000 So we look at the shiny object.
00:26:15.360 Oh, hackers.
00:26:17.200 Hackers are trying to break in, but...
00:26:19.580 Meanwhile, the biggest risk is always just some guy or some woman.
00:26:26.980 Always.
00:26:28.000 It's always the biggest.
00:26:29.000 It doesn't mean it happened.
00:26:30.040 It just means that's always the biggest risk.
00:26:33.820 All right.
00:26:36.040 So that...
00:26:36.900 It's amazing that that happens right in front of us.
00:26:39.580 So I had some vague understanding of that whole Gretchen Whitmer, the governor, might-be-kidnapped plot by the extremists.
00:26:52.220 And I didn't know the whole story, but apparently the story is that it was a complete setup by one FBI agent,
00:26:59.300 who probably coerced some others.
00:27:01.100 But primarily one FBI agent who had some outside security business interests seemed to have been trying to create the impression
00:27:10.160 that the extremist risk was higher than it was, probably for his own income.
00:27:15.660 Meaning it wasn't some kind of an FBI plot.
00:27:19.120 It was one guy in the FBI who looked like he could make some money by, you know, making it look like there's a threat,
00:27:24.960 and then he could presumably pay to take care of the threat that he had created.
00:27:30.760 So they did everything they could to convince some regular people who did not want to kidnap anybody to get involved in this plot.
00:27:39.160 And now, because it's in a court, we have really good evidence, proof, you could say, because it's gone through the court system,
00:27:48.660 that an FBI agent was behind a whole fake hoax kidnap plot.
00:27:56.560 That really happened.
00:27:59.180 That's not something made up.
00:28:01.880 And part of the story, which is interesting, is that one FBI agent did talk to the local police when the protests happened
00:28:10.500 and asked them to stand aside and let the protesters enter the Capitol building.
00:28:15.640 So at the same time that we're wondering if the January 6th thing had anything sketchy about it,
00:28:25.400 we see an exact model in the real world of what people suspected was happening with January 6th.
00:28:34.260 Now, I'm not going to allege that January 6th was an FBI operation.
00:28:38.780 So I don't have that information.
00:28:41.040 But when you look at one that was an FBI operation and you look at the parallels,
00:28:47.100 you have to start asking some questions.
00:28:51.140 But there is one part that you shouldn't lose sight of,
00:28:54.380 that this Lansing, Michigan, kidnap plot,
00:29:03.220 the kidnap plot really seems to be run by one person as opposed to the FBI.
00:29:11.040 So there's no evidence that I've seen that the FBI as an organization was trying to do this.
00:29:17.620 It seemed more like one person.
00:29:19.660 So you'd have to take that to the January 6th protest and say to yourself,
00:29:24.480 could it be that one or more persons was all it took to make it look like an FBI plot?
00:29:32.840 It wouldn't take much if it only takes one or two people to motivate other people to look like a plot.
00:29:40.960 So given that that happened, it's hard for me to imagine that you could completely dismiss the possibility that the FBI was involved.
00:29:50.380 Again, I don't have any information that would suggest they were.
00:29:55.460 Well, suggest is the wrong word.
00:29:57.420 I don't have any proof that they were.
00:30:00.060 But man, it goes right to the top of your possibility list, doesn't it?
00:30:05.320 It goes right to the top of the possibility list.
00:30:09.000 Rasmussen did a poll on asking people if they trusted lawyers.
00:30:12.180 And only 35% of American adults trust lawyers.
00:30:17.300 But here's the fun part.
00:30:19.300 Of the people who had actually hired lawyers and had experience with lawyers,
00:30:23.780 the more experience you had with a lawyer, the less you trusted lawyers.
00:30:28.880 So if you had never hired a lawyer, you were more likely to trust them.
00:30:32.400 So that tells you something.
00:30:39.240 But I feel like people trust their own lawyer more than they trust other people's lawyers.
00:30:45.780 I generally trust my own lawyer.
00:30:48.400 But I always think the other lawyers are, I can't trust them.
00:30:51.400 But, of course, it's a system where you're not supposed to trust the other side.
00:30:56.380 All right.
00:30:58.380 Elon Musk made some news by saying that he was basically in the middle.
00:31:02.840 Politically.
00:31:03.800 So he was on the right side of the Democrats and the left side of Republicans.
00:31:09.340 Meaning he's in the middle.
00:31:11.860 And then he also announced he's buying Manchester United,
00:31:15.200 the most famous soccer team in the world.
00:31:19.300 Soccer club.
00:31:22.320 But here's a...
00:31:23.900 If somebody would like to take over America, let me tell you how to do it.
00:31:28.760 All right.
00:31:28.980 Here's...
00:31:29.820 If you want to actually take over America,
00:31:32.400 you want to do it Joe Manchin style.
00:31:35.140 You want to have a situation where the Democrats and the Republicans are so close
00:31:39.880 that a third party would always determine who won.
00:31:44.300 So you create the third party.
00:31:46.600 And here's what you could...
00:31:47.460 Now, third parties have existed.
00:31:49.500 But the problem with third parties so far
00:31:51.580 is that they're obviously left or obviously right.
00:31:55.780 Am I right?
00:31:56.500 So you could always tell who they're taking votes away from.
00:32:00.840 So if you always know who the third party is going to take votes from,
00:32:04.700 let's say the Green Party takes from the left,
00:32:07.500 then you don't really have power except to be a spoiler.
00:32:10.620 You want to be like Joe Manchin where nobody knows which way you're going to go.
00:32:16.980 So here's what I would call a center party if I were to create one to run the world.
00:32:22.360 I would call it the common center party.
00:32:25.320 The common center party.
00:32:28.940 What does common center remind you of?
00:32:32.060 Common sense.
00:32:33.700 Your brain just goes to common sense.
00:32:35.460 And what does the center generally represent?
00:32:39.520 Common sense.
00:32:41.380 And the center is where most people believe they are.
00:32:44.700 And they believe it's common.
00:32:46.720 And they believe that they'd like you to know it's common.
00:32:49.800 If you belong to the biggest group in America
00:32:52.780 and you're being ignored
00:32:55.460 because the far left and the far right get all the attention,
00:32:58.640 you probably want people to know,
00:33:00.320 hey, we're here, we're common, we're in the middle.
00:33:04.140 So if you called yourself the common center party
00:33:07.720 and started a political party
00:33:09.580 and you made sure that sometimes you leaned left
00:33:12.520 and sometimes you leaned a little right,
00:33:14.700 you would have a Joe Manchin situation
00:33:16.640 where you could control politics.
00:33:18.620 All you need to do
00:33:19.960 is cannibalize one side a little bit harder than the other
00:33:24.160 and you could determine who the president would be.
00:33:27.360 So let's say you wanted Trump to lose.
00:33:31.820 You wanted Trump to lose.
00:33:32.960 If you're in the center, then you say,
00:33:34.840 okay, the center wants to have stronger immigration.
00:33:40.440 So suddenly, all the people who think Trump is too extreme,
00:33:44.380 but they like strong immigration policy,
00:33:47.820 they go, oh, the center party, you know, that's for me.
00:33:51.540 And then that center party could make the left win
00:33:53.960 because they'd take away too many of the votes from the right.
00:33:57.660 But it could go the other way.
00:33:58.740 So you could have somebody who's on the left,
00:34:02.140 but I'm going to do it with nuclear energy.
00:34:06.100 Like, ah, huh.
00:34:10.700 Presumably.
00:34:11.340 No, these are just examples.
00:34:13.920 But presumably, you could manage your campaign
00:34:17.260 intentionally to take a little more from the left
00:34:20.480 or a little more from the right.
00:34:21.960 And in so doing, you would become Joe Manchin
00:34:24.600 and you would run the country effectively
00:34:26.960 because you'd get to decide who the president was.
00:34:30.700 Am I wrong?
00:34:33.240 Now, it would split the party
00:34:34.780 if you were exactly identical to both sides.
00:34:37.900 But the whole thing here is to be Joe Manchin
00:34:40.560 and to clearly push one way
00:34:42.900 or clearly push the other way
00:34:44.180 when you want to influence things.
00:34:45.460 Yeah, we'll see.
00:34:50.700 My last point, I have to save for the locals' platform
00:34:54.420 because it's an idea
00:34:56.560 which would be too dangerous in the public.
00:35:00.280 Too dangerous.
00:35:03.180 True story.
00:35:05.060 So after I turn off YouTube this morning,
00:35:08.020 I'll give the locals' people a little extra.
00:35:13.060 Oh, no, I love you, too.
00:35:14.040 I love you, too, on YouTube.
00:35:17.460 How many of you saw the clip
00:35:19.160 that was taken out of my recent cast?
00:35:24.460 It was just a clip about the hoax pattern.
00:35:27.740 Seems to be getting a lot of uptick.
00:35:29.320 I saw Jack Posobiec tweeted it,
00:35:32.300 a number of other people.
00:35:33.600 I feel like it's making a difference.
00:35:35.340 Do you?
00:35:38.000 Does anybody think that by calling out the hoax pattern,
00:35:41.980 it could make a difference?
00:35:45.460 Yeah, there's something powerful about writing it down
00:35:47.660 because I've been, well, lots of people
00:35:50.480 have been pointing out that the Democrats
00:35:52.120 use the same pattern over and over,
00:35:54.200 but it's not until you write it down
00:35:55.760 or make a picture of it
00:35:57.000 that it becomes potent.
00:36:00.060 So maybe that's the part I did.
00:36:02.200 I just made a picture, and that made it potent.
00:36:06.200 All right.
00:36:11.980 You wish I still blogged.
00:36:14.020 Yeah.
00:36:15.700 I kind of wish I did, too.
00:36:17.480 I do think that maybe my blogging
00:36:19.480 has more impact because it's more portable.
00:36:23.100 If I say something in a blog post,
00:36:25.480 it gets passed around more easily.
00:36:28.820 All right.
00:36:29.620 That's all for now.
00:36:32.320 Oh, the two hoaxes that I added to the...
00:36:35.220 Oh, and let me ask you this.
00:36:37.180 Do you think my list of...
00:36:38.360 Now it's 14.
00:36:39.980 My list of 14 hoaxes,
00:36:42.360 do you think that that's making a dent?
00:36:46.600 I think it might be.
00:36:48.600 Now, the trouble is it's not being fully exposed
00:36:51.240 to the other side.
00:36:56.140 And, yeah.
00:36:59.760 So my real problem is
00:37:01.260 I don't really break through to the other side.
00:37:03.440 But on Twitter, you can see that
00:37:05.040 when I paste the hoax,
00:37:07.460 that people immediately just sort of collapse.
00:37:11.200 And you'll see a number of people say,
00:37:13.200 every one of those things is true.
00:37:15.760 And I think to myself, really?
00:37:18.520 Really?
00:37:18.960 Really?
00:37:20.180 There's even one person
00:37:21.440 who thinks everything on the list
00:37:22.700 actually really happened?
00:37:24.020 And they'll say that in public.
00:37:26.240 And I think, well, okay.
00:37:28.220 You can't find one thing on that list,
00:37:30.400 even the things that the court has shown as hoaxes.
00:37:33.560 Okay.
00:37:36.420 All right.
00:37:38.280 YouTube,
00:37:39.340 I'm sure this is the finest thing
00:37:40.660 you've seen this morning.
00:37:41.800 It will stay with you
00:37:42.940 for the rest of your days.
00:37:44.040 But I got a little talking to do
00:37:47.320 to the locals people.
00:37:49.120 I'm going to go do that.
00:37:51.180 And let me just,
00:37:54.180 one technical note.
00:37:56.060 If you listen to the podcast
00:37:57.640 and you saw it was all chopped up,
00:38:01.100 or you listen to the,
00:38:03.500 I guess it's the podcast that was chopped up.
00:38:06.020 For reasons that I don't understand yet,
00:38:08.220 some of my content can't be downloaded
00:38:11.560 from YouTube.
00:38:14.160 On most days,
00:38:15.960 it can be downloaded
00:38:17.520 and then turned into podcasts,
00:38:19.680 you know, the audio part.
00:38:21.140 But for reasons I don't yet understand,
00:38:23.900 YouTube will block some all day long.
00:38:26.360 There'll be technical glitches
00:38:27.520 and you can't download it.
00:38:28.460 I don't know if this affects everybody
00:38:31.460 or just my account.
00:38:34.660 But those that seem the most,
00:38:36.380 let's say, impactful,
00:38:39.480 and this is anecdotal,
00:38:41.000 but the ones that seem most impactful,
00:38:43.160 like, you know,
00:38:44.180 the hoax pattern,
00:38:46.320 I feel like those are highly correlated
00:38:48.160 with when I get a technical glitch
00:38:49.920 so that I can't transmit it.
00:38:53.700 I also had two tweets yesterday
00:38:56.180 that under normal days
00:38:58.440 would have been close to 1,000 retweets,
00:39:00.920 but capped out at, like, 100 or 200.
00:39:04.100 And they were both pretty dangerous.
00:39:06.780 Dangerous in the sense that
00:39:08.060 they questioned the narrative.
00:39:10.520 And to me,
00:39:12.720 it looks like they were just obviously throttled.
00:39:15.760 But, of course,
00:39:16.680 we're deep into confirmation bias territory, right?
00:39:19.920 My confirmation bias says
00:39:21.720 if I do a tweet that I think is good
00:39:24.180 but other people don't,
00:39:25.340 I'm going to think it was throttled
00:39:27.180 instead of thinking it was a bad tweet.
00:39:29.660 So I have to, you know,
00:39:31.020 I have to keep a little bit of honesty
00:39:33.480 on myself.
00:39:36.520 So I don't know.
00:39:37.840 I'll tell you it looks exactly
00:39:39.560 like it's being throttled.
00:39:41.440 But I don't know.
00:39:42.540 There's no way to know that.
00:39:44.980 All right.
00:39:47.000 And that, ladies and gentlemen,
00:39:48.680 is all I have for YouTube.
00:39:51.220 And I'll be talking to locals in a minute.
00:39:53.180 Bye for now.