Real Coffee with Scott Adams - January 06, 2022


Episode 1615 Scott Adams: How January 6 Reversed the Presumption of Innocence and More


Episode Stats

Length: 48 minutes
Words per Minute: 145.1
Word Count: 6,967
Sentence Count: 453
Misogynist Sentences: 7
Hate Speech Sentences: 10


Summary


Transcript

00:00:00.000 Well, welcome to the best thing that ever happened to you.
00:00:07.360 Sometimes I call this Coffee with Scott Adams, but today I'm going to call it an armed insurrection.
00:00:14.620 Yeah, because I've got arms and sometimes I protest.
00:00:20.960 And if you have arms and sometimes you protest, you, my friends, are an armed insurrection.
00:00:28.180 I'll tell you who else is an armed insurrection pretty soon when we get to the news.
00:00:34.080 Now, how would you like to do something that, oh, many of you have heard of it.
00:00:37.600 It's called the Simultaneous Sip, and it'll definitely be the best one you've ever had today.
00:00:46.140 And join me now by grabbing your cup or mug or glass, a tankard, chalice, or stein,
00:00:53.520 a canteen, jug, or flask, a vessel of any kind.
00:00:57.540 Fill it with your favorite liquid.
00:01:00.420 I like coffee.
00:01:02.500 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:01:08.840 the first simultaneous sip of the day.
00:01:11.120 As far as you know, it's going to happen now.
00:01:14.420 Go.
00:01:14.700 I don't know about you, but I feel my insurrection getting stiffened.
00:01:27.600 Stiffened.
00:01:28.080 I have stiffened my insurrection.
00:01:32.560 Well, how many of you have seen the movie Don't Look Up?
00:01:37.960 Have you all seen that?
00:01:39.000 A number of people were saying,
00:01:40.160 Scott, you should see that movie Don't Look Up.
00:01:42.960 And I didn't know why, and I didn't realize it was political in nature.
00:01:48.920 Now, here's the thing you need to know about it.
00:01:53.120 Give me your opinion as those who have seen it.
00:01:55.360 Do you love it or hate it?
00:02:01.420 Loved it, hated it, loved it, hated it, loved it, hated it.
00:02:05.720 All right, here's my take on it.
00:02:07.380 I can definitely see why many of you would hate it.
00:02:11.180 I can definitely see that.
00:02:12.960 Because it targets, yeah, I would say it targets Trump supporters.
00:02:17.060 Would you say that's a fair statement?
00:02:20.020 Did you feel that it targeted Trump supporters or Republicans?
00:02:23.680 I felt like it did.
00:02:26.600 But here's my take on it.
00:02:29.560 I really enjoyed it.
00:02:31.600 In fact, it's one of my favorite movies that I've watched in a long time.
00:02:36.840 In fact, it's the only one I've enjoyed in a long time.
00:02:39.900 And here's why.
00:02:41.680 Unlike you, I don't exactly view movies the same way.
00:02:48.140 Meaning that because I write for a living, if the writing is really good, and, you know, the acting and everything, I'm totally sold.
00:02:56.440 And if they make fun of me or, you know, some group I'm associated with, as long as they do it really well, I love it.
00:03:04.940 It's only when it's not done well.
00:03:06.320 For example, when some of you started calling me Vax Adams, my only objection to it was it wasn't quite clever enough.
00:03:18.560 It was like, all right.
00:03:19.500 But then somebody came up with Clot Adams, and I was all over that, because that's actually genuinely funny, because a clot is way worse than the vaccination.
00:03:31.300 And it's funny because it's dark.
00:03:33.680 So if somebody does a really good job of skewering me, I love it, because I just appreciate the art of it.
00:03:42.240 I don't take it too personally.
00:03:44.000 But I can totally get how you would be offended by it and hate every moment of it.
00:03:50.580 But I'm just going to have to say, in all fairness, it was really well executed, in my opinion.
00:03:58.200 I thought the writing was sensational.
00:04:00.960 And I thought the acting was really, really good.
00:04:04.360 And the casting and the directing, the pace.
00:04:08.520 I mean, I liked everything about it.
00:04:10.000 So I like the artistry of it a lot, a lot.
00:04:16.380 But I get it, you know, if it's not your thing.
00:04:20.840 All right.
00:04:21.660 I don't know if you noticed that Elon Musk replied to a tweet of mine. I will paraphrase my tweet.
00:04:29.760 But I said, I'm not saying that people who work technical jobs are smarter than people who work in marketing.
00:04:35.880 I just think it should be noted that Elon Musk performs the job of an entire marketing department while tweeting as he shits.
00:04:46.700 Now, that actually is close to literally true.
00:04:50.600 Because he has said that he tweets when he's on the toilet, which I believe.
00:04:54.680 I mean, that doesn't mean every tweet.
00:04:56.940 And it is true that he doesn't have a marketing department, and that he does a lot of his marketing by tweeting.
00:05:04.160 And you want to get really meta?
00:05:06.560 The fact that he responded to my tweet about it is marketing.
00:05:12.080 Because he just keeps a high profile and, you know, keeps it light and keeps it funny.
00:05:17.460 And it reminded me, I keep going back to this.
00:05:22.620 You saw the interview of Elon Musk with the Babylon Bee people.
00:05:26.260 And the thing that sticks in my mind is that before they interviewed him, he started interviewing them.
00:05:31.740 And the questions he was asking about how they got started seemed very much like somebody who was looking to start a media empire.
00:05:39.960 Now, I don't know if he's ever talked about it, but doesn't it make sense for everybody in his position to have some media asset?
00:05:54.680 And we also heard, well, I did.
00:05:57.220 I was hearing an interview in which Elon Musk was saying at one point they actually considered, or he did, starting a candy company.
00:06:05.080 But they evaluated a bunch of samples, and the reason he didn't is that he couldn't come up with a candy that was just way better than another candy.
00:06:16.320 You know, he made some that were good candies and maybe even a little bit better.
00:06:20.100 But unless it could be a way better candy, he didn't really want to get involved, so he let that go.
00:06:24.820 And I thought to myself, what would an Elon Musk putting together a media asset, what would that look like if it had to be substantially better than other media properties?
00:06:43.720 Do you think he could do that?
00:06:45.100 Because you've got all these media news entities, and they're all kind of the same, aren't they?
00:06:50.680 In a way, I mean, some lean left, some lean right.
00:06:54.420 But they're kind of the same product.
00:06:57.380 And then you've got the Babylon Bee and the Onion and the satire people, and they're really just pure humor.
00:07:06.600 Do you see where the opportunity is?
00:07:08.520 And I'm wondering if Trump is going to fill this space.
00:07:11.440 The opportunity is something that's genuinely interesting, meaning you learn something about politics and the world and it makes you a better citizen.
00:07:21.580 At the same time, it's hilarious.
00:07:24.860 Now, in a way, Jon Stewart pioneered that because we learned after he became a big phenomenon that a lot of people in a certain age group, the young, were actually getting their news from Jon Stewart.
00:07:37.340 But he would deliver it in the humorous way.
00:07:40.640 But before he would do his joke, he would actually have to tell you what the news was before he did the joke.
00:07:48.120 And then it was John Oliver, now, etc.
00:07:51.260 But you don't have anybody who's actually trying to inform you, like a major platform, that is just funny all the time.
00:07:59.980 And when you see it work, such as Gutfeld, both on The Five and on his own show, Gutfeld!, you see that when real news is combined with humor, the ratings are great.
00:08:16.220 Ratings are great.
00:08:17.840 Yeah, Bill Maher, real news and humor.
00:08:20.200 So if I were an Elon Musk and if I were, and this is all speculative, there's no indication of this, but if I were thinking of doing it and if it had to be better than, you know, substantially better than other things, there is really an opening for news that's just not serious ever, but still gives you the news.
00:08:42.580 And maybe even shows you both sides.
00:08:47.000 Ian Bremmer tweeted on our anniversary today, January 6th.
00:08:52.660 Does it seem like it's only a year since January 6th?
00:08:56.340 I'm not the first person to note this, but I feel like it was three years ago.
00:09:02.640 It's sort of mind-blowing that it was one year ago that January 6th happened.
00:09:06.460 So Ian Bremmer was tweeting today.
00:09:11.420 He said, one year later, most Americans think another 1-6 event is likely.
00:09:17.060 I agree.
00:09:18.440 Americans increasingly believe political opponents at home are their principal enemy, that we're more targeted at each other.
00:09:29.260 And he said, democracies don't persist for long under those conditions.
00:09:32.960 Now, I weighed in and said, the odds of an imaginary insurrection, what would you say are the odds of another imaginary insurrection?
00:09:48.340 It's 100%.
00:09:49.980 Because we're in a world where both sides are just accusing each other of insurrections all day long.
00:09:56.280 So the odds of another imaginary one, 100%.
00:09:59.960 You know, in a year or whatever.
00:10:03.540 But how many real ones are going to happen?
00:10:06.820 How many real insurrections are going to happen in, say, the next year?
00:10:11.440 Probably one or two, because that's the run rate.
00:10:15.400 Our run rate is one or two insurrections a year.
00:10:17.960 We just use different language to describe it.
00:10:22.020 What was the Russia collusion hoax?
00:10:25.000 The Russia collusion hoax was a government overthrow, or, you know, preemptively, depending on what timing you look at.
00:10:33.260 But the whole point of it was to change our form of government through an organized, intelligence-led effort.
00:10:43.460 How's that not a coup?
00:10:45.280 Or an insurrection or something?
00:10:48.340 Right?
00:10:48.560 I would argue that both the fine people hoax and the drinking bleach hoax were organized.
00:10:55.620 Probably intelligence was involved in it.
00:10:59.360 And it was a coup attempt to make the public so convinced that something ridiculous had happened that you had to get rid of the president right away.
00:11:09.020 I mean, to me, I think we're in a continuous coup situation.
00:11:11.560 Now, am I willing to call the January 6th protests an insurrection?
00:11:19.620 If you want to put a small "i" on it and apply it to maybe 20 deranged people who were there, maybe.
00:11:30.120 But if you're applying it to the whole crowd, well, we're in dangerous territory now.
00:11:35.480 Because as I tweeted, something terrible has happened, which is the presumption of innocence, which is the bedrock of, really, civilization, at least the civilization we want to live in.
00:11:54.980 The presumption of innocence got reversed on January 6th.
00:12:00.700 The president of the United States is right now calling them armed insurrectionists.
00:12:07.720 Have the courts decided that there were any armed insurrectionists there?
00:12:13.020 Nope.
00:12:14.060 There were armed people there, and anybody who broke a law should be punished.
00:12:18.760 I don't think there's anybody listening to this who would disagree with the statement that people breaking laws need to be punished, blah, blah.
00:12:28.040 But there were a ton of people who didn't even know they were breaking a law because the fences were down before they got there.
00:12:34.540 They were just protesting.
00:12:35.820 So if you've got a president who is presuming, without the benefit of a trial, that these people were there for insurrection,
00:12:47.000 you have put the assumption of guilt on citizens from the highest office in the land.
00:12:53.640 The highest office in the land, the president, just put the assumption of guilt on a bunch of people.
00:13:01.620 Some of them, some of them, some, but most of them, not so much.
00:13:09.680 Most of them were trying to do what they thought was preserving democracy by postponing the certification until some audits could get done.
00:13:22.240 Now, of course, when the fake news reports about it, they act like the idea is that they would take over and just change the government.
00:13:29.180 No demand like that ever happened.
00:13:32.660 The only demand was, can you give us a few days to audit some suspicious stuff?
00:13:38.700 That is protecting the republic, or at least in the minds of the people there.
00:13:45.260 You know, we're not mind readers, but the asserted, often publicly stated purpose was to delay things until there was an audit.
00:13:55.300 I don't believe anybody there, anybody, actually, I think literally, there might not have been, it's weird to say an absolute, right?
00:14:05.680 When you talk an absolute, you're pretty much always, pretty much always wrong.
00:14:11.260 But I think it would be an accurate absolute to say nobody was there to destroy democracy.
00:14:17.500 Would you buy that?
00:14:20.140 Would you buy that every person who was there, everyone, 100%, no matter whether they were breaking laws or not,
00:14:26.040 wouldn't you say that 100% of them thought, many incorrectly, that this was a good way to protect the democratic process?
00:14:37.480 Because they believed that the vote had been rigged.
00:14:40.880 Now, were they right?
00:14:42.720 I haven't seen evidence of it.
00:14:45.340 I haven't seen evidence.
00:14:46.180 I also haven't seen evidence that they're wrong.
00:14:49.940 Because, once again, January 6th reversed the presumption of innocence and guilt.
00:14:58.880 It used to be that if there was something about the government that was opaque, that we couldn't see what was going on,
00:15:06.120 you just assumed it was corruption.
00:15:08.580 Am I wrong?
00:15:09.080 Well, if any part of the government was, trust us, you know, just trust us, except for, you know, military secrets and stuff.
00:15:16.700 But if any part of the government was, well, we've got this budget, and we don't want to give you the details,
00:15:22.520 but, you know, we're spending this thing.
00:15:25.560 Right?
00:15:25.680 The assumption is corruption.
00:15:28.500 And now we've gone to, the courts couldn't find the thing we didn't look for in the election,
00:15:33.940 so now we must presume the governments, of the states anyway, are innocent.
00:15:39.460 We're giving the states the presumption of innocence, when they're not showing us an auditable election.
00:15:46.640 That's backwards.
00:15:47.320 The presumption has to be that the election is corrupt if we can't audit it.
00:15:53.980 That doesn't mean it is.
00:15:56.120 But like the standard in the law, where you assume innocence until proven guilty,
00:16:01.700 that doesn't mean you're innocent.
00:16:03.860 That just means the system has to assume innocence.
00:16:08.180 It can only work that way.
00:16:10.400 But that's not the government.
00:16:12.680 The government has to prove they're not guilty.
00:16:15.460 They've got to open the books.
00:16:18.640 They've got to be transparent.
00:16:20.320 They've got to show you who voted for what.
00:16:22.780 They've got to show you where their donations came from.
00:16:27.420 Right?
00:16:27.860 We do all of that because of the presumption of guilt,
00:16:31.960 meaning that if the government can hide anything, we do presume they will.
00:16:37.800 Am I wrong?
00:16:39.000 Is there anybody who would disagree with the statement
00:16:40.800 that we require our government to be transparent
00:16:43.900 because we presume, every one of us,
00:16:47.860 that if it's not transparent, there will be problems?
00:16:51.600 Right?
00:16:52.000 100% of us.
00:16:54.460 There's so few things you can say are 100% true.
00:16:58.760 So few.
00:17:00.300 This is one of them.
00:17:02.740 Anyway.
00:17:03.040 I think that should be the lesson of January 6th,
00:17:07.160 that we lost the most valuable thing that our republic has ever created,
00:17:11.940 which is the presumption of innocence for citizens.
00:17:16.840 And I still have many questions about the people who are being held
00:17:20.260 on charges. We're very skeptical
00:17:24.500 that these are not just political situations.
00:17:29.280 Very skeptical.
00:17:30.040 Can we prove it?
00:17:32.560 Nope.
00:17:33.420 Nope.
00:17:34.240 Do you know why we can't prove it?
00:17:36.020 Because the government is not transparent
00:17:39.120 about why these people are being held
00:17:41.380 and what their deal is.
00:17:44.340 I presume corruption
00:17:45.960 because it's not transparent.
00:17:49.220 That is a fair assumption.
00:17:50.880 Doesn't mean it's true.
00:17:52.640 Doesn't mean it's true.
00:17:54.320 But it's a fair assumption,
00:17:55.460 and it's the only one that makes sense.
00:17:56.680 Rasmussen did a poll
00:18:00.100 and found out that Trump is,
00:18:02.460 on the anniversary of January 6th,
00:18:05.560 Trump is almost exactly as popular
00:18:07.560 as he was at the end of his presidency.
00:18:09.780 I mean, before the trouble.
00:18:12.140 52% very or somewhat favorable.
00:18:15.120 That's Trump today.
00:18:17.360 By contrast,
00:18:18.560 Black Lives Matter is 46%
00:18:20.780 very or somewhat favorable.
00:18:22.780 So after everything that's happened,
00:18:26.140 Trump is more popular than Black Lives Matter.
00:18:29.220 I'm not sure that the news reflects that all the time.
00:18:37.780 Here's a question that's self-serving,
00:18:41.980 but I think you're going to learn something.
00:18:43.360 One of the biggest questions
00:18:46.480 that I'm getting on social media
00:18:48.020 lately
00:18:49.300 is
00:18:50.800 why I'm
00:18:52.220 intentionally destroying my reputation.
00:18:56.140 Any of you wonder about that?
00:18:58.760 Or have any of you observed that to be true?
00:19:02.440 That I'm destroying my reputation
00:19:04.360 by, I guess,
00:19:06.000 having different opinions than some of you
00:19:07.820 about whatever, pandemics.
00:19:10.020 Now, on locals,
00:19:12.760 it doesn't seem that way
00:19:13.600 because the subscription service
00:19:15.460 attracts people
00:19:16.360 who like to be challenged
00:19:18.320 by different ideas.
00:19:22.200 So you're not curious?
00:19:25.760 Well, you're going to ruin
00:19:26.660 my whole premise here.
00:19:28.840 I thought you'd at least be curious
00:19:30.520 why I do it intentionally.
00:19:33.380 All right, well, I'm going to tell you anyway
00:19:34.600 in case you had an idea
00:19:36.060 but it wasn't exactly what I was thinking.
00:19:37.840 They're like, no,
00:19:40.220 we're not curious.
00:19:43.500 If you are a creative person,
00:19:46.700 success becomes a prison.
00:19:49.240 For example,
00:19:50.360 when I created Dilbert,
00:19:52.340 I got this great creative,
00:19:54.540 let's say,
00:19:55.340 benefit.
00:19:57.220 You know,
00:19:57.420 it just feels good to create
00:19:58.800 something that people like.
00:20:00.600 But once Dilbert was created,
00:20:02.460 I couldn't really turn him
00:20:03.640 into something else.
00:20:04.660 So every day for 33 years
00:20:07.860 or whatever I've been doing it,
00:20:09.680 I create a Dilbert
00:20:10.680 that's not that different
00:20:12.200 from the first one.
00:20:14.880 I try to make new jokes,
00:20:16.580 of course.
00:20:18.360 But it's a trap.
00:20:20.500 If I tried to make a second comic,
00:20:22.480 what would happen?
00:20:23.140 They'd say,
00:20:23.820 that's not as good as Dilbert
00:20:25.320 and it wouldn't be
00:20:26.140 because it would be new.
00:20:27.740 Even Dilbert wasn't as good as Dilbert
00:20:29.400 until I did it for 10 years
00:20:31.060 and I figured out how to do it.
00:20:32.940 So I'm trapped
00:20:34.940 that I can't change Dilbert much
00:20:37.000 and I'm trapped
00:20:37.940 that I can't introduce
00:20:39.620 something new
00:20:40.300 because it would just be
00:20:41.220 compared to Dilbert.
00:20:42.820 So what do you do?
00:20:44.580 Now the problem is that
00:20:45.580 the Dilbert comic
00:20:46.620 was largely,
00:20:48.280 you know,
00:20:49.060 liked by a wide group of people.
00:20:52.280 Now one thing I did,
00:20:53.340 and some people on locals
00:20:54.440 know this,
00:20:55.540 on the subscription service,
00:20:57.760 I did an obscene comic.
00:20:59.640 So you've seen some of them
00:21:01.660 on Twitter.
00:21:03.320 I don't put them all there.
00:21:05.080 But I have some robots
00:21:06.540 just reading the news
00:21:07.700 and they say horrible things
00:21:10.520 that I couldn't say
00:21:11.420 in a normal place.
00:21:14.020 So it's about the horribleness.
00:21:16.020 So the robots read news
and on the subscription service
00:21:18.660 in particular
00:21:19.260 because I can put things
00:21:20.620 where people are not
00:21:21.480 going to get upset.
00:21:22.700 That gave me freedom.
00:21:25.320 That gave me a lot of freedom.
00:21:26.600 But when I do this stuff,
00:21:29.140 and especially when I talked
00:21:30.160 about some pandemic stuff,
00:21:32.660 well, let me go back.
00:21:34.300 When I first started
00:21:35.140 talking about politics,
00:21:36.960 I knew that my,
00:21:38.960 whatever popularity I have
00:21:40.440 would reduce by about half.
00:21:44.120 Because anybody who didn't like
00:21:45.240 what I was saying about politics
00:21:46.420 would like me less.
00:21:48.060 So whatever Dilbert popularity I had
00:21:50.180 would decrease by half.
00:21:52.340 I did that intentionally.
00:21:53.840 Because as a creator,
00:21:56.360 and here's the payoff,
00:21:59.780 creation and destruction
00:22:01.400 are basically the same thing.
00:22:04.760 Not exactly.
00:22:06.080 But you can't separate them.
00:22:07.880 In order to create,
00:22:09.360 you have to destroy.
00:22:10.860 You know, you've got to
00:22:11.360 take down a house
00:22:12.520 to build a new one.
00:22:13.760 You've got to destroy
00:22:14.600 who you are to be a new you.
00:22:16.560 You've got to destroy
00:22:17.900 what you were doing
00:22:18.720 to do something new.
00:22:20.120 Almost, yeah,
00:22:21.280 almost invariably,
00:22:23.400 you have to destroy
00:22:24.360 something to be free.
00:22:26.460 And so part of what
00:22:27.300 you see me doing
00:22:28.140 is fighting for my freedom.
00:22:30.880 Creative freedom.
00:22:32.200 And the creative freedom
00:22:33.200 requires many of you
00:22:34.640 to hate me.
00:22:36.300 And I'm going to do it anyway.
00:22:38.420 Because
00:22:38.860 I'm not going to stay
00:22:40.940 in a creative box
00:22:42.080 at this point in my life.
00:22:43.420 You know, I just,
00:22:44.500 it's intolerable.
00:22:46.180 I want to go
00:22:46.920 where I think I can help,
00:22:50.680 where I can make a difference,
00:22:51.900 where I can do something
00:22:52.800 maybe better
00:22:53.540 than other people
00:22:54.140 are doing it.
00:22:54.900 But I don't want to be locked
00:22:55.960 in some little world.
00:22:58.180 And so I'm going to describe
00:22:59.060 this better
00:22:59.580 by talking about three things.
00:23:01.140 The three-act movie structure.
00:23:03.500 I talked about creative prisons.
00:23:05.280 And then I'm going to give you
00:23:06.320 a new filter
00:23:06.920 that I call
00:23:07.540 the really filter.
00:23:09.040 And the really filter
00:23:11.460 goes like this.
00:23:12.380 You describe something
00:23:15.100 to find out
00:23:16.220 if it's real.
00:23:17.340 And you do it sarcastically
00:23:19.020 with the word really.
00:23:21.020 I'll talk about that in a moment.
00:23:23.760 All right.
00:23:25.700 So,
00:23:27.060 so here's how
00:23:30.920 the really filter works.
00:23:33.680 Let me give you some examples.
00:23:35.120 Let's say if I said to you
00:23:36.240 that Bob went to the store.
00:23:39.460 Totally normal thing.
00:23:40.600 Hey, Bob went to the store.
00:23:42.380 Could you turn that
00:23:43.420 into something sarcastic
00:23:44.740 by adding really?
00:23:46.640 You couldn't.
00:23:47.640 I'll try.
00:23:48.180 Watch.
00:23:48.940 Oh, really?
00:23:50.120 Really?
00:23:51.460 Bob went to the store.
00:23:54.140 Seriously.
00:23:55.380 Of all the things
00:23:56.200 that Bob can do,
00:23:58.380 you're telling me
00:23:58.960 Bob went to the store.
00:24:00.620 You see how that doesn't work?
00:24:02.780 Because you're thinking
00:24:03.760 that's just normal.
00:24:04.920 Bob went to the store.
00:24:05.780 Why are you adding
00:24:06.520 that attitude to it?
00:24:09.280 Right?
00:24:09.380 So that's an example
00:24:10.580 of the filter
00:24:11.320 detecting something
00:24:13.040 that's probably true.
00:24:14.300 Probably Bob just went
00:24:15.340 to the store.
00:24:17.180 But if you find something
00:24:18.900 that's absolutely not true,
00:24:21.500 watch what happens.
00:24:23.660 Okay?
00:24:24.140 Here's another one.
00:24:26.200 The president suggested
00:24:27.760 that you should drink bleach
00:24:29.120 maybe to help with your COVID.
00:24:32.740 Really?
00:24:34.080 Really?
00:24:35.860 A person who built an empire,
00:24:39.640 ran for president,
00:24:41.580 destroyed the field,
00:24:44.260 accomplished many things
00:24:46.060 according to his supporters anyway,
00:24:48.320 did all of these things,
00:24:49.420 speaks in public continually,
00:24:52.320 has a college education,
00:24:55.960 and really, really,
00:24:58.080 you think that he stood
00:24:59.740 in front of the public,
00:25:01.000 you really think this,
00:25:02.180 and suggested
00:25:03.620 for real
00:25:05.660 that he said
00:25:08.100 to drink bleach.
00:25:09.680 Really?
00:25:11.000 Really?
00:25:13.840 See that?
00:25:15.420 There's somebody on YouTube
00:25:16.480 who still thinks
00:25:17.200 he really did that.
00:25:19.040 We're way past
00:25:20.100 did he really do it.
00:25:22.100 That's been fact-checked to death.
00:25:23.800 No, he didn't really do that.
00:25:25.560 There's,
00:25:26.180 if you believe it,
00:25:27.360 that's part of the
00:25:28.100 mass formation psychosis
00:25:30.100 or some damn thing.
00:25:31.180 Let me do another one.
00:25:32.180 The president of the United States
00:25:35.060 stood in front of the world
00:25:36.860 and called neo-Nazis
00:25:38.280 fine people.
00:25:40.280 Really?
00:25:41.960 Really?
00:25:43.440 Every single person
00:25:44.620 in the world
00:25:45.260 knows that that would be
00:25:46.920 a bad idea
00:25:47.680 if you were a politician.
00:25:49.800 But the only person,
00:25:51.620 the only person in the world,
00:25:53.500 really,
00:25:54.800 that couldn't figure out
00:25:55.880 not to do that
00:25:56.820 was the one
00:25:58.080 who is president.
00:25:59.020 Really?
00:26:00.240 Really?
00:26:00.600 Really?
00:26:00.700 That happened.
00:26:01.920 And of course,
00:26:02.400 it didn't happen.
00:26:03.660 Some of you on YouTube
00:26:04.500 don't know it
00:26:05.260 because you think
00:26:06.080 it really did happen.
00:26:07.380 No, it didn't happen.
00:26:08.100 It was a fake edit
00:26:09.120 just like the
00:26:10.760 bleach hoax thing
00:26:11.520 is a fake edit.
00:26:13.120 Yeah, yeah,
00:26:13.660 you really saw it.
00:26:14.840 No, you didn't.
00:26:15.860 You all saw the fake edits
00:26:17.200 and you saw it
00:26:17.700 a lot of times.
00:26:18.360 So if you find
00:26:20.120 the real edits,
00:26:20.760 you'll realize
00:26:21.200 that both of those are real.
00:26:22.220 But here's the shortcut.
00:26:24.460 Did you need to do
00:26:25.480 all the work
00:26:26.020 of finding the fake edit
00:26:27.240 and find out
00:26:27.920 what they took out of context?
00:26:29.820 No.
00:26:30.600 You could have used
00:26:31.380 the really filter.
00:26:33.480 Here's another one.
00:26:36.360 The events of January 6
00:26:38.520 were an insurrection.
00:26:43.140 Try the really filter on it.
00:26:45.360 The events of January 6
00:26:47.060 were an insurrection.
00:26:48.980 Go ahead.
00:26:50.420 How does that work?
00:26:52.400 Really?
00:26:53.360 Really?
00:26:54.260 The guy with the buffalo horns
00:26:56.020 and the dude
00:26:57.360 with the zip ties
00:26:58.660 were going to conquer
00:27:00.320 and hold
00:27:01.160 the government
00:27:02.200 of the United States.
00:27:03.880 Really?
00:27:04.700 That was their actual plan.
00:27:06.680 They looked around
00:27:07.400 and they said,
00:27:07.980 hey,
00:27:08.480 I think several of us
00:27:09.760 are armed.
00:27:11.080 Let's go into the Capitol,
00:27:13.100 maybe take some hostages.
00:27:15.500 I do think that part
00:27:16.340 was probably real.
00:27:17.380 And we have
00:27:19.420 a pretty good idea
00:27:20.300 that this will
00:27:21.700 create something
00:27:22.580 that will let us
00:27:23.420 hold the country
00:27:24.560 and rule over it
00:27:26.880 with our,
00:27:28.660 whatever,
00:27:30.200 our God King Trump
00:27:31.240 or something.
00:27:32.120 Really?
00:27:32.940 You really think
00:27:34.060 that there's some people
00:27:35.460 thought that they could
00:27:36.320 conquer the government
00:27:37.820 of the United States
00:27:38.800 with their flagpoles
00:27:40.300 and their bear spray?
00:27:43.000 Really?
00:27:43.660 You really think that?
00:27:44.640 See?
00:27:45.860 Now, did that work?
00:27:48.280 That worked, didn't it?
00:27:51.940 So,
00:27:52.600 I'm not going to say
00:27:53.500 this filter works
00:27:54.340 every time,
00:27:55.840 but it works pretty well.
00:27:57.480 You want to do another one?
00:27:58.980 There was a story
00:28:02.980 about me years ago
00:28:04.280 that you'll still see
00:28:06.040 if you Google me.
00:28:06.960 It comes up all the time.
00:28:08.820 And it was long
00:28:09.540 before we learned
00:28:10.460 that all the news
00:28:11.520 is fake,
00:28:12.580 like we know now.
00:28:14.740 This was back
00:28:15.560 when you thought
00:28:16.060 some of the news
00:28:16.940 was probably real.
00:28:18.760 Now, I'm going to tell you
00:28:19.800 what the story is
00:28:21.140 about me,
00:28:22.320 and I want you to apply
00:28:23.660 the really filter, okay?
00:28:25.440 You've probably even
00:28:26.180 heard this about me.
00:28:27.320 I bet you even thought
00:28:28.080 this was true.
00:28:29.580 I bet a lot of you
00:28:30.240 thought this was true.
00:28:31.160 So, here's what was said
00:28:32.040 about me.
00:28:33.260 That I wrote a blog post
00:28:34.780 in which I compared women
00:28:36.900 to children
00:28:38.420 and the mentally disabled.
00:28:41.420 So, that's the story
00:28:42.760 about me.
00:28:43.660 That I wrote a blog post
00:28:45.000 in which I compared women
00:28:46.540 to children
00:28:47.760 and the mentally disabled.
00:28:52.060 Really?
00:28:54.180 Really?
00:28:54.620 I'm somebody
00:28:58.560 who lives
00:28:59.160 in the media
00:28:59.720 for a job.
00:29:01.460 I'm media trained.
00:29:03.220 I have a college education.
00:29:05.200 I do this
00:29:05.940 for a living.
00:29:07.240 And you really think
00:29:08.360 that I wrote that?
00:29:10.360 Really?
00:29:11.500 You really think
00:29:12.500 that I thought
00:29:13.040 it would be a good idea
00:29:14.080 to compare women
00:29:16.080 to children
00:29:16.860 and the mentally disabled?
00:29:18.840 And that I didn't see
00:29:19.940 that maybe that
00:29:20.580 would cause trouble?
00:29:22.280 Really?
00:29:22.600 Do you really think
00:29:24.500 I did that?
00:29:26.340 And the answer is,
00:29:27.520 of course I didn't do that.
00:29:29.440 Of course not.
00:29:30.760 Now, if you don't know
00:29:31.460 the story,
00:29:31.900 I'll tell you what I did do.
00:29:33.680 In the context
00:29:34.980 of mocking men,
00:29:36.940 that's the part
00:29:38.000 they leave out.
00:29:39.780 It was mocking men.
00:29:41.120 That was the whole context.
00:29:42.880 I said that men
00:29:43.620 were cowardly,
00:29:44.800 essentially,
00:29:45.380 I'm paraphrasing,
00:29:46.600 that men were cowardly
00:29:47.800 and that they don't
00:29:51.060 tell women the truth.
00:29:53.160 So that was the point.
00:29:55.140 Now,
00:29:55.660 is it offensive
00:29:56.420 to say that men
00:29:57.420 are afraid
00:29:58.640 of telling women
00:29:59.440 the truth?
00:30:01.160 No,
00:30:01.700 that's just
00:30:02.140 an observation.
00:30:03.620 It doesn't even say
00:30:04.420 anything about women.
00:30:05.840 Well,
00:30:06.140 I guess it does.
00:30:06.980 But it's more about men,
00:30:08.080 right?
00:30:09.700 And in order to make
00:30:11.400 what I thought
00:30:12.360 was a hilarious point
00:30:14.360 that was that point,
00:30:16.300 I thought to myself,
00:30:17.420 wouldn't it be funny
00:30:18.380 if I got everybody
00:30:21.520 riled up
00:30:22.280 by putting three things
00:30:24.160 in a list
00:30:24.680 that don't belong?
00:30:27.080 And so I consciously
00:30:28.620 formed this sentence
00:30:29.760 in which I said,
00:30:31.940 I forget the exact sentence,
00:30:33.360 but it was something like,
00:30:34.540 there are three groups
00:30:35.440 that men don't argue with.
00:30:38.120 And I said,
00:30:39.360 children,
00:30:40.860 mentally handicapped,
00:30:42.120 and women.
00:30:43.420 Now,
00:30:43.820 the reason
00:30:44.440 that I put them
00:30:45.320 in the list
00:30:46.020 is that you're supposed
00:30:47.700 to say,
00:30:48.380 wait,
00:30:48.640 what?
00:30:49.480 Did you just put women
00:30:50.500 in a list
00:30:50.960 with children
00:30:52.140 and the mentally disabled?
00:30:54.280 And you were supposed
00:30:54.980 to say,
00:30:56.060 my God,
00:30:56.580 that feels wrong
00:30:57.980 because it was designed
00:31:00.040 to feel wrong.
00:31:01.340 That's why I wrote it.
00:31:02.840 You were supposed
00:31:03.360 to say,
00:31:04.020 wait a minute,
00:31:04.540 my brain is breaking.
00:31:06.000 Okay,
00:31:06.180 I get your point.
00:31:07.820 Your point is,
00:31:08.680 these are three things
00:31:09.600 that men
00:31:10.080 don't argue with,
00:31:11.660 but for different reasons.
00:31:13.900 The reason you don't argue
00:31:15.180 with those three groups
00:31:16.180 isn't the same reason.
00:31:18.740 It's not because women
00:31:19.920 are mentally disabled.
00:31:21.600 It's not because women
00:31:22.660 have undeveloped brains.
00:31:25.360 There are three different reasons.
00:31:27.620 But by putting them
00:31:28.720 in a list,
00:31:29.860 I was aware
00:31:30.660 that it would make
00:31:31.460 people's heads explode
00:31:32.580 and they would think
00:31:33.920 momentarily,
00:31:34.680 wait a minute,
00:31:35.200 did he just compare them?
00:31:37.020 But then they would
00:31:37.640 read it again
00:31:38.300 and they're like,
00:31:38.780 okay,
00:31:39.080 he's just being a jerk
00:31:40.260 and he thinks that's funny,
00:31:42.320 which is what I was doing.
00:31:43.540 It was just being a jerk
00:31:44.360 and I thought it was funny.
00:31:45.880 Now,
00:31:47.440 when you hear that
00:31:49.600 in the context
00:31:50.300 of 2022,
00:31:52.320 do you believe me?
00:31:56.740 If you heard this
00:31:57.680 in the context
00:31:58.360 of when it was said
00:31:59.900 or when this happened,
00:32:01.680 everybody still thought
00:32:02.900 that what they read
00:32:03.740 was true.
00:32:05.300 Like,
00:32:05.520 people widely believed
00:32:06.820 that I had made this
00:32:08.220 like a serious comment.
00:32:11.680 Thank you,
00:32:12.260 Eric.
00:32:14.280 Anyway,
00:32:15.200 so use the really,
00:32:17.100 and by the way,
00:32:17.960 the really filter
00:32:18.760 is a variant
00:32:20.060 on the Scott-Alexander theory
00:32:22.620 that if something
00:32:24.020 sounds too incredible
00:32:25.320 to be true,
00:32:26.720 it's because it always is.
00:32:28.880 Not always,
00:32:29.660 but like 98% of the time.
00:32:31.540 If you hear something
00:32:32.380 that's just a head shaker,
00:32:33.780 oh,
00:32:34.320 I can't believe that,
00:32:35.360 there's a good chance
00:32:37.100 it didn't happen.
00:32:38.420 That's all you need to know.
00:32:39.540 Just go,
00:32:40.140 really?
00:32:40.660 Did that happen?
00:32:42.480 All right,
00:32:42.920 here's the other thing
00:32:43.600 you need to know.
00:32:44.480 I'm going to pull this
00:32:45.100 all together in a minute.
00:32:46.420 I've talked about this,
00:32:47.500 how our minds
00:32:48.240 are oriented
00:32:49.260 towards stories.
00:32:50.340 You all know that,
00:32:51.140 right?
00:32:51.880 The human brain
00:32:52.700 can be influenced
00:32:53.480 by stories
00:32:54.200 and programmed
00:32:54.820 by stories
00:32:55.500 and narratives,
00:32:56.640 if you like.
00:32:58.040 But we're also influenced
00:32:59.180 by the three-act movie structure.
00:33:03.820 Now the three-act movie structure
00:33:05.780 is the first act,
00:33:07.080 something happens
00:33:07.860 that changes people's lives,
00:33:09.460 you know,
00:33:09.640 somebody dies
00:33:10.440 or takes over
00:33:12.120 or there's a war
00:33:13.200 or whatever.
00:33:14.280 Act two
00:33:14.940 is the fun and games
00:33:16.600 area
00:33:17.820 where everybody's
00:33:18.980 just acting out
00:33:19.840 the thing that's happened,
00:33:21.620 you know,
00:33:21.800 they're struggling
00:33:22.400 through something.
00:33:23.540 But generally,
00:33:24.500 there's not that much problem
00:33:26.280 until you get
00:33:27.340 to the third act,
00:33:28.500 in which case
00:33:29.460 the hero
00:33:30.000 is doomed,
00:33:31.100 facing
00:33:32.180 an unsolvable problem,
00:33:34.760 and then at the end
00:33:35.560 the problem gets solved.
00:33:37.260 So our brains
00:33:38.040 are just built
00:33:38.740 to imagine
00:33:39.720 that's going to happen.
00:33:41.200 Let me give you
00:33:41.660 an example.
00:33:43.240 When I left
00:33:45.140 my cartooning
00:33:46.480 sort of field
00:33:47.700 and started talking
00:33:48.620 about Trump,
00:33:50.280 Trump was the first act
00:33:51.800 when he first came
00:33:52.900 on the scene
00:33:53.480 and then I wrote
00:33:54.820 a blog post
00:33:55.620 called Clown Genius
00:33:57.060 in which I reframed him
00:33:58.560 as more clever
00:34:00.120 than you think
00:34:00.920 because he has
00:34:01.620 this skill set
00:34:02.320 in persuasion.
00:34:04.180 I think that actually
00:34:05.180 made a difference
00:34:05.780 in getting him elected,
00:34:06.700 by the way,
00:34:07.380 because until then
00:34:08.300 he was widely thought
00:34:09.400 to be a clown
00:34:10.380 and I explained
00:34:11.960 that the clowning
00:34:12.780 has utility,
00:34:13.780 which he proved,
00:34:14.500 by the way.
00:34:15.160 And he actually said
00:34:16.040 I'm doing this
00:34:16.780 for a purpose
00:34:17.440 and he absorbed
00:34:18.740 all the energy
00:34:19.400 and won
00:34:19.860 just like I predicted.
00:34:22.060 But act one
00:34:22.800 was that Trump
00:34:23.580 gets into the race
00:34:25.120 and then in terms
00:34:26.320 of my movie,
00:34:27.640 my reputation
00:34:28.500 got destroyed
00:34:29.360 because I was
00:34:31.520 the person saying
00:34:32.240 he was going to
00:34:32.780 get all the way
00:34:33.260 through and get elected.
00:34:35.180 And so I was
00:34:36.060 just destroyed.
00:34:37.420 I used to be
00:34:37.980 the good Dilbert
00:34:39.520 cartoonist
00:34:40.280 and then half
00:34:41.820 of the world
00:34:42.240 said you were
00:34:43.500 really dumb
00:34:44.160 and then another
00:34:46.060 half of the half,
00:34:47.500 the Republican half,
00:34:48.280 said okay,
00:34:49.720 I see what you're
00:34:50.260 saying but there's
00:34:51.060 no way he's going
00:34:51.760 to become president.
00:34:52.520 So I made 75%
00:34:55.180 of the world
00:34:55.680 think I was stupid.
00:34:59.340 My reputation
00:35:00.240 was in the toilet.
00:35:02.220 And the third act
00:35:03.200 for me
00:35:04.840 and for the Trump
00:35:07.040 story as well
00:35:07.860 was before the election
00:35:09.720 when the news
00:35:10.740 came out
00:35:11.240 or the recording
00:35:12.700 of him saying
00:35:13.380 that he sometimes
00:35:14.820 grabbed women
00:35:15.620 by the you know
00:35:17.560 what.
00:35:18.900 So here was
00:35:19.920 the setup.
00:35:20.560 First act,
00:35:21.160 Trump changes
00:35:22.340 things.
00:35:22.960 I become part
00:35:23.600 of that by
00:35:24.160 writing about him.
00:35:25.540 Second act is
00:35:26.440 all this fun
00:35:27.060 and games.
00:35:27.640 I write about
00:35:28.140 what he's doing
00:35:28.740 is linguistic
00:35:29.480 kill shots.
00:35:30.760 It's fun and games.
00:35:31.640 It's the campaign.
00:35:33.020 And then just
00:35:33.580 before the election,
00:35:35.800 third act,
00:35:37.140 it's over.
00:35:39.000 Nobody gets
00:35:39.860 elected with
00:35:41.680 that big of a
00:35:42.420 thing hanging
00:35:42.940 over them right
00:35:43.580 before the election.
00:35:45.680 And then
00:35:46.260 he got elected
00:35:48.260 anyway.
00:35:50.000 He got elected.
00:35:51.160 And that was
00:35:52.620 like the surprise
00:35:53.840 result at the end
00:35:54.880 of the movie.
00:35:56.520 Now,
00:35:58.240 is this the first
00:36:00.600 time I've been
00:36:01.560 involved in a
00:36:02.380 three act play?
00:36:03.280 Well, no.
00:36:04.400 No.
00:36:08.440 For example,
00:36:10.160 back in the 90s
00:36:11.280 I predicted that
00:36:12.140 evolution would be
00:36:13.280 debunked in my
00:36:15.200 lifetime.
00:36:16.060 By scientific means,
00:36:17.540 not by religious
00:36:18.300 means.
00:36:20.060 Do you know what
00:36:20.500 happened in the
00:36:21.380 90s when I
00:36:22.300 predicted that
00:36:23.520 evolution would be
00:36:24.580 debunked?
00:36:26.180 I destroyed my
00:36:27.620 reputation.
00:36:28.880 Because most of
00:36:29.700 my customer base
00:36:32.100 were technical
00:36:32.980 or at least
00:36:33.800 educated people
00:36:35.300 who would say,
00:36:37.380 no, Scott,
00:36:38.040 if you're selling
00:36:38.680 me this
00:36:39.300 intelligent design,
00:36:44.060 which I
00:36:45.000 wasn't,
00:36:46.140 at the time,
00:36:47.540 I just said,
00:36:48.020 I don't know
00:36:48.320 what the answer
00:36:48.800 will be.
00:36:49.980 I don't think
00:36:50.820 it will be a
00:36:51.300 God answer,
00:36:52.660 but I think
00:36:53.500 that evolution
00:36:54.060 will be
00:36:54.460 debunked.
00:36:55.340 I got
00:36:55.840 savaged for
00:36:56.800 that.
00:36:57.120 And anywhere I
00:36:57.920 went to talk
00:36:58.860 about anything
00:36:59.480 for about 20
00:37:00.680 years,
00:37:02.400 everywhere I
00:37:02.980 went,
00:37:03.300 somebody said,
00:37:03.960 don't listen to
00:37:04.540 him,
00:37:04.680 he's the guy
00:37:05.060 who thinks
00:37:05.440 evolution isn't
00:37:06.240 real.
00:37:06.460 And then
00:37:09.840 the simulation
00:37:10.480 theory came
00:37:11.200 out,
00:37:12.120 as I
00:37:13.980 predicted.
00:37:14.940 A completely
00:37:15.780 different way
00:37:16.540 of looking at
00:37:17.080 reality,
00:37:17.660 which is
00:37:17.900 specifically
00:37:18.400 what I
00:37:18.800 predicted.
00:37:19.460 I predicted
00:37:20.000 that evolution
00:37:20.620 would be
00:37:21.220 replaced because
00:37:23.500 we would see
00:37:24.200 reality itself
00:37:25.220 differently,
00:37:25.840 and that would
00:37:26.200 just be a
00:37:26.620 subset of
00:37:27.100 reality.
00:37:28.700 And now
00:37:29.420 the simulation
00:37:30.040 theory,
00:37:30.980 which has
00:37:32.180 lots of
00:37:32.600 backing from
00:37:33.460 smart people,
00:37:34.700 Elon Musk
00:37:35.240 again,
00:37:36.460 actually
00:37:37.260 rewrites the
00:37:38.040 entire idea
00:37:38.800 of what
00:37:39.200 evolution would
00:37:39.900 even mean
00:37:40.460 if we're a
00:37:41.420 simulation.
00:37:42.760 And so
00:37:43.240 we've come
00:37:44.740 full circle.
00:37:47.540 Writing my
00:37:48.380 book and
00:37:48.960 calling evolution
00:37:50.120 a mistake,
00:37:51.400 20 years of
00:37:52.500 eating shit
00:37:53.320 for it,
00:37:54.560 followed by,
00:37:56.320 can you
00:37:56.700 believe it?
00:37:57.460 There is a
00:37:57.940 competing theory
00:37:58.860 called the
00:37:59.600 simulation.
00:38:00.540 It's probably
00:38:01.020 a trillion to
00:38:01.700 one more likely
00:38:02.380 than evolution.
00:38:03.640 We don't
00:38:04.000 know,
00:38:05.060 but it's
00:38:05.440 probably a
00:38:05.880 trillion to
00:38:06.340 one more
00:38:06.680 likely,
00:38:07.260 just because
00:38:07.900 there will
00:38:08.300 be lots
00:38:08.740 of simulations
00:38:09.440 and we
00:38:10.500 know that.
00:38:11.320 So the
00:38:11.820 odds that
00:38:12.280 you're in
00:38:12.560 the original
00:38:13.060 one are
00:38:13.380 pretty low.
00:38:15.600 So that
00:38:16.020 was another
00:38:16.360 one.
00:38:16.580 That was
00:38:16.940 sort of a
00:38:17.320 three-act
00:38:17.760 play.
00:38:19.860 And we're
00:38:20.500 right in
00:38:20.800 the middle
00:38:21.040 of another
00:38:21.940 one,
00:38:22.620 in which I
00:38:23.880 made some
00:38:25.060 statements about
00:38:25.820 rogue doctors
00:38:26.860 usually being
00:38:27.640 wrong,
00:38:28.600 and made a
00:38:29.100 lot of other
00:38:29.700 statements about
00:38:31.460 the pandemic
00:38:32.860 and the
00:38:33.280 virus and
00:38:33.900 vaccinations
00:38:34.580 that caused the
00:38:37.840 few people
00:38:38.380 who still
00:38:38.940 had any
00:38:40.500 respect for
00:38:41.180 me to
00:38:42.320 lose it.
00:38:43.240 Mostly they
00:38:43.920 imagined I
00:38:44.580 had opinions
00:38:45.080 that I didn't
00:38:45.520 have, but
00:38:46.540 we'll get to
00:38:46.940 that.
00:38:47.840 But the
00:38:48.640 net of it
00:38:49.280 was that
00:38:50.000 something
00:38:50.280 happened,
00:38:50.820 the pandemic.
00:38:52.080 That was the
00:38:52.840 thing that
00:38:53.140 happened.
00:38:53.460 That's the
00:38:53.760 first act.
00:38:54.600 The second
00:38:55.200 act was all
00:38:56.080 of us dealing
00:38:56.920 with it and
00:38:57.480 making predictions
00:38:58.240 and, you
00:38:59.300 know, etc.
00:39:00.620 And then the
00:39:01.200 third act is
00:39:01.900 whatever is the
00:39:02.640 conclusion, in
00:39:03.960 which we figure
00:39:04.660 out who was
00:39:05.440 right about
00:39:06.160 everything and
00:39:06.800 who was
00:39:07.080 wrong.
00:39:08.200 My third
00:39:11.980 act, my
00:39:12.760 impossible feat,
00:39:14.620 is to
00:39:15.080 recover from
00:39:16.380 not only
00:39:17.060 everybody on
00:39:18.140 the left
00:39:18.640 hating me,
00:39:19.680 because I
00:39:20.680 talk about
00:39:21.140 things on
00:39:21.600 the right,
00:39:22.540 but also a
00:39:23.660 huge portion
00:39:24.400 of the right
00:39:25.320 hating me
00:39:26.460 because they
00:39:27.020 believe I
00:39:27.520 disagree with
00:39:28.160 them on
00:39:28.540 vaccinations
00:39:29.120 and stuff,
00:39:30.160 and I
00:39:30.500 probably
00:39:30.780 don't.
00:39:32.260 So how
00:39:32.660 do you
00:39:32.860 get out
00:39:33.180 of that?
00:39:35.780 This,
00:39:36.520 ladies and
00:39:37.100 gentlemen,
00:39:38.020 is where
00:39:39.020 we're at.
00:39:39.560 We're at
00:39:39.820 the third
00:39:40.200 act.
00:39:41.240 Can I
00:39:41.620 do it?
00:39:42.580 Is there
00:39:42.900 any way
00:39:43.360 that I
00:39:43.740 can defend
00:39:44.200 myself from
00:39:44.860 what people
00:39:45.880 have said?
00:39:46.600 Is Scott
00:39:47.140 losing it?
00:39:48.700 That, in
00:39:49.060 fact, my
00:39:49.820 flailing at
00:39:51.040 my critics
00:39:51.680 is proving
00:39:52.480 how crazy
00:39:53.820 I am,
00:39:54.440 how possibly
00:39:55.440 lost in a
00:39:56.320 mass formation
00:39:57.700 psychosis,
00:39:59.160 and that
00:39:59.940 there is no
00:40:01.280 way, having
00:40:02.340 predicted so
00:40:03.760 many things
00:40:04.260 wrong and
00:40:04.700 gotten so
00:40:05.280 many things
00:40:05.780 wrong in
00:40:06.260 the pandemic,
00:40:07.200 and everybody
00:40:07.900 knows it,
00:40:09.220 how wrong I've
00:40:09.980 been about
00:40:10.440 really everything
00:40:11.320 in the pandemic,
00:40:12.760 there is no
00:40:13.560 way to recover
00:40:14.280 from this
00:40:15.800 third act,
00:40:17.160 is there?
00:40:20.280 Well,
00:40:21.240 here's a little
00:40:22.320 update.
00:40:22.660 I put together
00:40:24.180 my list of
00:40:24.980 my predictions
00:40:25.880 that I could
00:40:26.600 remember,
00:40:27.320 which of course
00:40:27.840 includes none
00:40:28.920 of the ones I
00:40:29.500 got wrong,
00:40:30.740 because the
00:40:32.540 way memory
00:40:33.040 works is you
00:40:33.660 remember your
00:40:34.120 good ones and
00:40:34.700 you forget the
00:40:35.220 bad ones.
00:40:35.960 So I put it
00:40:36.460 on the
00:40:36.760 locals platform,
00:40:37.940 the subscription
00:40:38.560 platform, and I
00:40:39.460 asked the
00:40:40.400 people who
00:40:41.000 follow me the
00:40:41.980 closest to
00:40:42.880 fact check it
00:40:43.640 and to add
00:40:44.680 onto it in
00:40:45.880 the comments
00:40:46.420 anything I got
00:40:49.120 wrong that I
00:40:49.940 just didn't have
00:40:50.580 in the list.
00:40:51.880 So fairly
00:40:52.380 soon I'm
00:40:53.260 going to have
00:40:53.500 a list of
00:40:54.000 my predictions
00:40:54.660 along with
00:40:55.660 what the
00:40:56.560 people who
00:40:57.180 follow me
00:40:57.660 have added
00:40:58.740 to it or
00:40:59.260 corrected it,
00:41:00.460 and then I'm
00:41:01.180 going to make
00:41:01.640 the case that
00:41:02.500 I had the
00:41:03.000 best predictions
00:41:03.840 in the pandemic,
00:41:05.460 and then I'm
00:41:06.280 going to find
00:41:06.700 somebody to
00:41:07.300 publish it.
00:41:10.940 And then
00:41:11.820 you're going
00:41:12.740 to have to
00:41:13.100 argue whether
00:41:13.740 I had the
00:41:15.200 best predictions
00:41:17.160 of the whole
00:41:18.080 pandemic of
00:41:18.960 the entire
00:41:19.440 world,
00:41:20.580 or maybe
00:41:21.160 just top
00:41:21.660 ten.
00:41:23.880 And when
00:41:24.520 you're done,
00:41:25.260 I will have
00:41:25.740 created a
00:41:26.260 record of
00:41:27.300 predicting better
00:41:29.480 than anybody
00:41:30.160 predicted,
00:41:31.700 and that
00:41:32.300 is the end
00:41:34.340 of the movie.
00:41:35.820 And we'll
00:41:36.460 need a new
00:41:36.880 movie after
00:41:37.380 this, because
00:41:38.260 there's always
00:41:38.580 a new movie.
00:41:40.280 But do I
00:41:41.720 destroy my
00:41:42.620 reputation
00:41:43.200 intentionally?
00:41:45.020 Yes.
00:41:45.940 Do you think
00:41:46.600 that I act
00:41:47.200 exactly like
00:41:48.140 this in
00:41:48.640 person?
00:41:49.940 No.
00:41:50.580 How often
00:41:51.460 have you
00:41:51.740 seen me
00:41:52.160 go into
00:41:52.640 cursing
00:41:54.200 tirades?
00:41:56.060 I mean,
00:41:56.340 I don't do
00:41:56.840 that much
00:41:57.300 in person,
00:41:58.740 but here
00:41:59.320 it comes
00:42:00.080 easily.
00:42:01.220 And I
00:42:02.400 wouldn't say
00:42:02.780 that I'm
00:42:03.120 putting on
00:42:03.540 an act,
00:42:04.100 per se,
00:42:05.220 because my
00:42:06.660 argument is
00:42:07.360 that everybody
00:42:07.940 modifies their
00:42:08.800 communication for
00:42:09.820 the situation.
00:42:11.520 You don't
00:42:11.980 talk to a
00:42:12.820 two-year-old
00:42:13.500 the way you
00:42:13.920 talk to an
00:42:14.460 adult,
00:42:15.040 the way you
00:42:15.380 talk to your
00:42:15.820 boss,
00:42:16.660 the way you
00:42:17.000 talk to
00:42:17.480 anybody else.
00:42:18.180 And so
00:42:19.400 when I
00:42:19.840 talk to
00:42:20.400 you here,
00:42:22.140 this is
00:42:22.600 completely
00:42:23.060 genuine in
00:42:24.100 the sense
00:42:24.620 that this
00:42:25.620 is my
00:42:26.080 genuine
00:42:27.020 communication
00:42:28.000 style for
00:42:28.860 this
00:42:29.160 situation.
00:42:31.020 But of
00:42:31.680 course,
00:42:32.500 if I'm
00:42:32.840 negotiating with
00:42:33.540 a terrorist,
00:42:34.600 I'm going to
00:42:35.200 modify my
00:42:35.840 style, as
00:42:37.700 everybody does
00:42:38.360 in every
00:42:38.680 situation.
00:42:39.180 I don't
00:42:46.240 know if
00:42:46.480 any of
00:42:46.760 this was
00:42:47.040 interesting,
00:42:47.620 because the
00:42:48.040 problem is
00:42:48.540 it's too
00:42:49.240 self-referential.
00:42:51.180 But I
00:42:51.820 would like to
00:42:52.260 get back to
00:42:52.800 January 6th and
00:42:54.460 tell you that
00:42:55.020 you are in
00:42:55.840 the middle of
00:42:56.500 a massive
00:42:57.280 brainwashing
00:42:59.180 operation.
00:43:00.380 I don't
00:43:01.000 know the
00:43:01.380 degree to
00:43:02.160 which it is
00:43:02.640 organized,
00:43:03.320 but it
00:43:03.520 looks organized.
00:43:04.840 It looks
00:43:05.360 as if the
00:43:06.520 whole January
00:43:07.260 6th thing is
00:43:08.120 just to keep
00:43:08.700 Trump out
00:43:09.680 of office
00:43:10.200 and to
00:43:10.600 keep Trump
00:43:11.120 supporters
00:43:11.620 and Republicans
00:43:12.400 in general
00:43:12.980 demonized
00:43:14.320 by keeping
00:43:15.740 that story
00:43:16.280 in your
00:43:16.540 head.
00:43:18.200 Because it
00:43:18.820 creates a
00:43:19.240 situation where
00:43:19.880 Democrats can
00:43:20.640 do anything
00:43:21.360 to Republicans
00:43:22.380 because,
00:43:23.380 damn it,
00:43:23.700 those Republicans
00:43:24.420 deserve it.
00:43:25.780 Look at them
00:43:26.320 and their
00:43:26.660 insurrections.
00:43:32.000 On election
00:43:32.900 tampering,
00:43:33.640 I don't have
00:43:34.020 any updates
00:43:35.120 on anything
00:43:35.640 about that.
00:43:38.700 All right.
00:43:42.020 Somebody says
00:43:43.700 January 6th
00:43:44.380 was less
00:43:45.480 dangerous than
00:43:46.200 the Travis
00:43:46.760 Scott concert.
00:43:49.160 Literally
00:43:49.760 true.
00:43:54.680 All right.
00:43:56.480 What act is
00:43:57.160 Trump currently
00:43:58.120 in?
00:43:58.480 Good question.
00:44:00.880 If I had to
00:44:02.440 put money on
00:44:03.140 it, I would
00:44:05.000 say his best
00:44:05.840 play would be
00:44:06.540 to start a
00:44:07.200 media empire
00:44:07.940 and to use
00:44:09.780 his clout
00:44:10.360 there because
00:44:11.200 he could
00:44:11.460 practically run
00:44:12.260 the country
00:44:12.740 from a
00:44:13.460 media empire.
00:44:15.120 Right?
00:44:16.100 He doesn't
00:44:16.860 have to do
00:44:17.260 all the work
00:44:17.780 and go to
00:44:18.120 the meetings.
00:44:19.220 He can just
00:44:20.140 do his thing
00:44:20.900 and change
00:44:22.100 public opinion
00:44:22.860 and then
00:44:24.720 things start
00:44:25.380 going his
00:44:25.840 way.
00:44:27.940 Prevent
00:44:28.460 election
00:44:28.880 tampering
00:44:29.260 in 2022.
00:44:30.300 Can't be
00:44:30.640 done.
00:44:31.640 Can't be
00:44:31.980 done.
00:44:32.540 There's no
00:44:33.200 energy on the
00:44:34.060 left or the
00:44:34.680 right to make
00:44:35.520 our elections
00:44:36.180 transparent.
00:44:39.080 And when
00:44:39.400 elections are
00:44:39.980 not transparent,
00:44:41.120 meaning that
00:44:41.540 you can't
00:44:42.000 fully audit
00:44:42.600 them, including
00:44:43.460 the electronic
00:44:44.140 parts, if it's
00:44:45.500 not fully
00:44:46.100 transparent, the
00:44:48.020 assumption is
00:44:48.840 corruption.
00:44:50.460 Am I wrong?
00:44:52.220 But is there
00:44:52.820 anybody who
00:44:53.260 disagrees with
00:44:53.920 that, by the
00:44:54.500 way, that the
00:44:55.760 assumption should
00:44:56.680 be corruption?
00:44:57.780 Because not
00:44:58.240 only is it
00:44:59.040 not auditable,
00:45:01.220 there's no
00:45:01.940 energy to fix
00:45:03.040 it.
00:45:03.640 It's the
00:45:04.160 biggest problem
00:45:04.660 in the
00:45:04.880 country.
00:45:05.700 If the
00:45:06.040 biggest problem
00:45:07.020 in the
00:45:07.280 country is
00:45:07.740 not being
00:45:08.180 fixed or
00:45:08.720 even addressed,
00:45:11.680 assume
00:45:12.140 corruption.
00:45:13.480 When the
00:45:14.260 FDA was
00:45:15.240 tardy approving
00:45:17.340 rapid tests,
00:45:19.500 and we're not
00:45:20.240 quite sure why,
00:45:22.280 I mean, we've
00:45:22.960 heard stories, but
00:45:23.640 they don't quite
00:45:24.220 sell, assume
00:45:26.040 corruption.
00:45:26.860 Because you
00:45:27.460 can't tell what
00:45:28.100 happened, and
00:45:30.400 they won't tell
00:45:31.000 us exactly.
00:45:32.220 So, assume
00:45:34.000 corruption, yeah.
00:45:41.020 Twitter is
00:45:41.820 run by a
00:45:42.520 MAGA
00:45:42.920 hater, is
00:45:44.080 that what
00:45:44.280 you're saying?
00:45:45.340 Getter is?
00:45:48.880 Bitcoin
00:45:49.400 crashed?
00:45:51.120 Because
00:45:51.600 Kazakhstan had
00:45:52.580 an internet
00:45:52.980 shutdown?
00:45:53.560 What?
00:45:53.820 What?
00:45:53.980 Somebody's
00:46:01.840 saying that
00:46:02.200 Marjorie
00:46:02.640 Taylor
00:46:02.920 Green was
00:46:04.900 on Tim
00:46:05.720 Cast last
00:46:06.620 night, and
00:46:07.760 this user
00:46:08.420 says, I
00:46:08.840 never saw
00:46:09.360 her not
00:46:09.860 ugly filtered
00:46:10.760 by fake
00:46:11.400 news before.
00:46:12.620 So you're
00:46:12.920 saying that
00:46:13.380 in person
00:46:13.840 she's a
00:46:14.940 more attractive
00:46:15.640 person than
00:46:16.420 the pictures.
00:46:17.920 I love when
00:46:19.320 Fox
00:46:20.620 News does
00:46:21.140 this really
00:46:21.560 well, but
00:46:22.380 CNN does
00:46:22.900 it too.
00:46:23.600 When they're
00:46:24.220 doing a
00:46:24.840 negative story
00:46:25.520 about somebody
00:46:26.100 they don't
00:46:26.500 like,
00:46:27.460 the photo
00:46:28.240 that they
00:46:28.700 choose is
00:46:29.680 always just
00:46:30.140 the most
00:46:31.840 insulting
00:46:32.380 photo of
00:46:33.020 a person.
00:46:34.440 Whenever
00:46:34.760 people would
00:46:35.420 write stories
00:46:36.680 about me
00:46:37.140 that were
00:46:37.380 negative,
00:46:38.140 they would
00:46:38.400 go to
00:46:38.620 some
00:46:40.100 kind of
00:46:40.460 public
00:46:40.940 photograph,
00:46:43.160 somehow they
00:46:43.820 would get
00:46:44.040 the rights,
00:46:44.820 and they
00:46:45.220 would always
00:46:45.520 pick the
00:46:45.940 same one.
00:46:47.060 It was
00:46:47.320 one where I
00:46:47.760 was literally
00:46:48.240 acting like
00:46:49.140 I was
00:46:49.520 talking while
00:46:50.600 they took
00:46:50.920 the picture.
00:46:51.800 I was literally
00:46:52.680 going, blah,
00:46:53.720 blah, blah,
00:46:54.120 blah, blah,
00:46:54.780 blah, blah,
00:46:55.100 blah, you
00:46:55.540 know, not
00:46:55.840 even talking.
00:46:56.920 And one of
00:46:57.340 the pictures
00:46:57.680 just makes
00:46:58.180 me look like
00:46:58.680 a real jerk.
00:46:59.340 I'm like,
00:47:00.360 like this.
00:47:02.620 And whenever
00:47:04.040 there's a
00:47:04.460 negative, like
00:47:05.900 a hit piece
00:47:06.500 on me, they
00:47:07.620 always go right
00:47:08.220 to it, because
00:47:08.880 it's the one
00:47:09.300 that makes me
00:47:09.740 look the most
00:47:10.240 ridiculous.
00:47:11.440 And it's
00:47:12.640 actually so
00:47:13.180 funny that it
00:47:13.740 makes me laugh
00:47:14.500 when I see
00:47:16.120 it.
00:47:17.760 When can
00:47:21.380 we get
00:47:21.620 more micro
00:47:22.120 lessons?
00:47:23.240 You know, my
00:47:23.540 biggest problem
00:47:24.060 is that I've
00:47:24.660 got some noisy
00:47:25.460 home construction,
00:47:26.440 so I can't do
00:47:27.040 it in the
00:47:27.520 hours that I
00:47:28.100 want to do
00:47:28.440 it.
00:47:28.920 But I've
00:47:29.400 got a couple
00:47:30.340 lined up.
00:47:32.200 So, yeah,
00:47:33.240 this was one
00:47:33.840 today.
00:47:34.980 And we
00:47:36.540 will talk to
00:47:37.400 you tomorrow.
00:47:40.440 And I
00:47:41.960 think it's
00:47:42.320 fair to say
00:47:42.820 this is the
00:47:43.500 best one
00:47:45.220 ever in a
00:47:47.040 very small
00:47:47.520 way.
00:47:48.220 The only one
00:47:48.800 I think is
00:47:49.240 useful is
00:47:49.820 noting that
00:47:50.720 the presumption
00:47:52.380 of innocence
00:47:52.860 has been
00:47:53.380 reversed.
00:47:54.440 And that's a
00:47:54.900 really big
00:47:55.580 deal.
00:47:56.420 And I don't
00:47:56.780 see anybody
00:47:57.260 talking about
00:47:57.820 it.
00:47:58.380 So, I
00:47:59.280 will talk
00:47:59.600 to you
00:47:59.940 tomorrow.