Real Coffee with Scott Adams - November 13, 2022


Episode 1926 Scott Adams: Let's Talk About Election System Credibility, If That Is Still Legal, More


Episode Stats

Length

1 hour and 29 minutes

Words per Minute

141.9

Word Count

12,697

Sentence Count

955

Misogynist Sentences

10

Hate Speech Sentences

15


Summary

In this episode of the highlight of civilization, we discuss the Tesla robot that's coming soon, the Canadian government's plan to allow doctor-assisted suicide in the country, and why it's a good or bad thing that people with mental health problems can legally kill themselves.


Transcript

00:00:00.160 Good morning, everybody, and welcome to the highlight of civilization.
00:00:05.680 You made it again.
00:00:07.440 I know, I know, you're very consistent, and good for you.
00:00:11.980 Now, if you'd like to take it up a notch before we have the best livestream you've ever seen in your entire life,
00:00:17.260 what do you need to do that?
00:00:18.320 Well, all you need is a cup or mug or a glass, a tank or chalice,
00:00:20.900 a stein, a canteen, a jug or a flask, a vessel of any kind,
00:00:23.240 fill it with your favorite liquid. I like coffee.
00:00:26.060 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:29.740 the thing that makes everything better.
00:00:31.240 It's called the simultaneous sip, and it happens in this awesome mug,
00:00:35.240 and it happens now.
00:00:36.860 Go.
00:00:42.180 Oh, yeah.
00:00:43.660 Yeah.
00:00:44.740 Yeah.
00:00:45.740 Mm-hmm.
00:00:47.080 I feel my, well, hold on.
00:00:50.060 Yes, my mitochondria are being repaired, and it's just from one sip.
00:00:55.540 Wow.
00:00:55.800 See, what happens is the coffee by itself gives you some benefits,
00:01:01.900 but when you pair the coffee with this simultaneity, well, a whole different level.
00:01:10.800 All right, let's talk about some stuff.
00:01:12.600 I saw a tweet from Elon Musk with a little icon of a robot and then some other icon that
00:01:20.820 was too small for me to make out, and then he said, soon.
00:01:25.660 And I thought to myself, he's announcing that their new Tesla robots are going to be produced
00:01:30.900 soon.
00:01:32.500 But it turns out that the other little thing that I didn't recognize was a tombstone.
00:01:36.140 So what he was really tweeting is that the bots on Twitter will be destroyed soon.
00:01:43.400 Now, it feels to me like there's already some action there.
00:01:48.640 I don't know if you've noticed it, but I feel there are fewer trolls on my account.
00:01:54.460 Now, I think I would notice it earlier than some of you because I get more troll activity
00:01:58.540 than most.
00:01:59.040 But to me, it looks like a better experience.
00:02:02.460 I don't know if that's just, I don't know.
00:02:05.400 It could be some other reason, but it looks like they're doing something different.
00:02:09.560 Do you have the same experience?
00:02:13.360 It could be, you know, completely, you know, confirmation bias.
00:02:19.400 Yeah, we'll talk about FTX.
00:02:22.660 But it made me think of the Tesla robots that are coming in, and I was wondering this.
00:02:29.040 How fast do you think the Tesla robots will be able to run?
00:02:33.740 If you just said, you know, run and get something, how fast would they be?
00:02:38.560 Because if they're fast enough, can I just strap a saddle to my robot and get rid of my car?
00:02:46.500 Can I just, like, ride on the back of the robot and say, robot, take me to Safeway.
00:02:51.720 I need some groceries.
00:02:52.420 And they would just start running down the street with me on the back.
00:02:59.100 Why not?
00:03:02.620 Well, something to look forward to.
00:03:06.560 Twitter user Resist Communism suggested that Elon should program the robots so that they can count ballots.
00:03:14.980 It can't be any worse than the people.
00:03:26.440 Up in Canada, I saw a headline.
00:03:28.360 I didn't look into it.
00:03:29.500 But apparently Canada is looking to allow mental health as one of the, bad mental health,
00:03:37.600 as one of the conditions in which people could request and get doctor-assisted suicide.
00:03:46.160 What do you think about people with mental health problems being able to kill themselves legally in Canada?
00:03:53.140 Good or bad?
00:03:55.000 Good or bad?
00:03:55.860 Well, so first of all, I'm going to make an assumption.
00:04:01.840 I'm going to make an assumption that they're not talking about people who don't know what their situation is.
00:04:07.780 I haven't looked into it.
00:04:09.580 But it seems to me that they would not include people who were just so out of it that they didn't know anything.
00:04:17.560 I doubt they're talking about that.
00:04:19.020 I'm guessing what they're talking about is someone who is completely in their right mind
00:04:24.120 and they have a depression or some kind of trauma that they've worked for decades to get rid of and it's just a living hell.
00:04:34.240 If somebody is in pain, permanent pain, does it matter if it's psychological or physical?
00:04:40.980 You know, you could argue that either one of them might have a cure in the future.
00:04:48.060 So hold it down.
00:04:49.040 There might be a way to fix it in the future.
00:04:51.000 But that applies to both physical and mental problems, right?
00:04:56.260 There's always a reason to wait.
00:04:57.520 But what about, now, eugenics is different.
00:05:03.000 That's somebody else deciding to kill you.
00:05:04.900 Nobody's talking about that.
00:05:06.940 We're only talking about people who are in their right mind making a decision about their own bodily autonomy.
00:05:15.800 I don't know.
00:05:16.780 I think I'm open to listening to the argument.
00:05:19.480 But I'm completely on board with a person who's in their right mind, but they're basically locked in their own torture head.
00:05:29.180 I think that person needs an option.
00:05:31.800 And it's not up to me to tell them what they can and cannot do with their own body.
00:05:35.880 So I'm leaning toward it as long as they're really careful about not sliding into eugenics, obviously.
00:05:42.920 I saw Jordan Peterson suggest that Twitter should have effectively two parts.
00:05:53.720 One where you're seeing the comments and you can interact with the people who are verified, or at least their identity is known to Twitter.
00:06:01.580 And then a second part of Twitter that you could call, like, Twitter hell, where it's all the unverified bots and trolls and, you know, all the sadists and stuff.
00:06:13.980 And you could go there if you want, or you could avoid them easily.
00:06:18.440 And I thought to myself, well, I'm not sure you could, you know, cleanly create those two groups, but suppose you could.
00:06:24.420 If I had the option of only people who have used their real identity being sorted to the top of my comments, that's all I need.
00:06:35.200 I just want the real people to be at the top.
00:06:38.580 And then as I read down the list, I'll get to Twitter hell.
00:06:42.280 You know, it might be 20 comments down.
00:06:44.220 And then I'll just stop reading, because I don't want to go to Twitter hell.
00:06:47.800 But I might want to.
00:06:49.400 I might want to just see what they said.
00:06:51.780 So I like the option.
00:06:54.420 The other thing, it was an interview on Piers Morgan with Jordan Peterson.
00:06:59.680 Pretty interesting.
00:07:01.100 And Peterson was referring to a study that women who use Instagram are the worst kind of women.
00:07:09.920 I'm summarizing in a way he did not.
00:07:12.980 I think the traits were something like, you know, the people who use the most hours on Instagram were narcissists.
00:07:20.140 And narcissists and sadists and sociopaths and stuff like that.
00:07:25.860 And so the thinking is that the average person on social media, the average person who uses it a lot, right?
00:07:32.940 The more you use it, the more likely it's signaling a personality or a character defect that's pretty deep.
00:07:40.560 So I'm seeing the first comment here: shocking.
00:07:42.960 Have you ever noticed how many times Jordan Peterson says something that your first reaction is, my God, that is academically and scientifically interesting and new information.
00:07:54.480 And then you think about it for like a minute and you go, oh, wait, I think we already knew that.
00:07:59.900 Yeah, because it was kind of obvious.
00:08:02.560 Right?
00:08:02.960 But it's nice to know that the science backs up what you think is obvious.
00:08:10.120 And also Jordan Peterson said he would prefer a DeSantis-like candidate over Trump.
00:08:20.060 You know, he said good things about Trump, but said that Trump raises the temperature too high.
00:08:26.400 It was bad for the Republic.
00:08:28.620 What do you think of that?
00:08:29.320 But if you agree that, let's say, Trump and DeSantis might have similar Republican politics,
00:08:36.860 would you prefer the one who does not raise the temperature to the point where the temperature itself becomes a problem independent of the politics?
00:08:46.820 You know, it's a tough one because the reason that Trump raises the temperature is because that's what the Democrats do to him.
00:08:55.000 So should Republicans be denied the president of their choice because he is more susceptible to the other side's dirty tricks?
00:09:06.600 I don't know.
00:09:08.100 Yeah, you have to make a practical decision.
00:09:11.180 Do you want to be a moral and ethical loser or, you know, the opposite?
00:09:16.960 Well, correct me if I'm wrong, but would you say, let's say you imagine the subset of prior Trump supporters who were also in the public eye, right?
00:09:36.700 So if I were to narrow the conversation to just people in the public eye, people like me, you know, like prominent pundits,
00:09:45.240 what percentage of the people who were Trump supporters and are prominent pundits, what percentage are still backing him as of today?
00:09:54.500 Forget about the public.
00:09:56.580 I'm only talking about the public voices, you know, the people who are in public.
00:10:00.580 Because it's a lot easier to say you don't back him at the moment, isn't it?
00:10:06.440 It's kind of safe to say that because, you know, it feels like you're on the side of the angels.
00:10:14.680 Well, I don't know the percentage, but it's hard to argue with the claim that DeSantis gives you everything you want without the bad parts.
00:10:25.060 It's a pretty strong argument.
00:10:26.680 I don't know if it's true, which is a whole different topic.
00:10:30.580 Because you might say, well, DeSantis won't push as hard as Trump would, so Trump's still the better choice.
00:10:37.160 Maybe.
00:10:38.540 Maybe.
00:10:39.380 Don't know.
00:10:40.800 But I would note that a lot of people that you would consider reasonable have now decided that even if they like Trump,
00:10:49.160 and even if they liked his first term, that things have changed.
00:10:53.940 We're in a different environment.
00:10:55.420 He's older.
00:10:56.780 A lot has changed.
00:10:57.620 And so it's not the perfect fit anymore.
00:11:01.040 Now, how would you like to hear the opposite argument?
00:11:07.960 It goes like this.
00:11:10.360 Have you noticed that reality...
00:11:12.540 This is something that Musk tweeted once, but I think I'm the first person who noted it.
00:11:17.360 Have you noticed that reality follows the path of most entertainment?
00:11:20.460 Not for the victims, because there's always a victim, but for the casual observers, reality follows the pattern of most entertainment.
00:11:32.200 For example, Trump getting elected the first time, greatest entertainment.
00:11:37.660 What would be the most entertaining thing that could have happened with the midterm elections?
00:11:44.040 The most entertaining thing is a super close election that nobody expected.
00:11:50.000 I won't say nobody.
00:11:52.220 Most people didn't expect.
00:11:54.380 Followed by a delayed vote.
00:11:56.960 Followed by, unexpectedly, Democrats do better than ever before for a midterm with the president of their own party.
00:12:05.080 Now, if you would predict it in advance and say, okay, what's the most entertaining thing that could happen?
00:12:11.920 The most entertaining thing is neither a clean win for a Democrat or a Republican.
00:12:16.780 The most entertaining thing is that we're going to fight over who really won.
00:12:22.040 By far, that's the most entertaining.
00:12:23.840 In a bad way, but it will get you the most clicks and the most energy.
00:12:28.260 So that filter would have predicted it, totally.
00:12:35.880 So I had a point I was working toward that I've totally lost.
00:12:40.680 What was I talking about?
00:12:46.780 Expected cheating.
00:12:48.380 Oh, and it happened.
00:12:49.320 Okay, here we go.
00:12:51.320 What would be the most, let's say, the most entertaining story arc for Trump as of now?
00:13:00.160 The most entertaining story arc would be what?
00:13:05.460 Third act.
00:13:06.800 Exactly.
00:13:08.820 We just had Trump's third act.
00:13:11.300 Now, he might have more than one third act because he's the comeback, like, shocker.
00:13:16.780 But at this point, aren't all the smart people saying he's dead?
00:13:22.440 Right?
00:13:23.760 The most entertaining outcome would be a discovery of fraud in the 2022 election that changed the outcome.
00:13:34.760 Followed by a landslide victory by Trump for having been completely cleared of his accusations about 2020.
00:13:51.940 Now, I'm not...
00:13:53.420 Let me be very clear.
00:13:55.760 I am not saying that there's any evidence of election fraud.
00:14:01.540 I'm not saying that at all.
00:14:03.600 Apparently, I could say that on Twitter if I wanted to, but I don't see any.
00:14:08.620 So I'm not aware of any, like, obvious fraud.
00:14:12.640 Let me be really clear.
00:14:15.360 But wouldn't that make the best story?
00:14:19.240 Wouldn't it?
00:14:19.900 I mean, there's no contest.
00:14:22.920 That would be the best movie.
00:14:24.980 Right?
00:14:25.700 So here we get to test the best movie filter.
00:14:29.260 The best movie filter is that Trump is down and out, and there's no way he can come back from apparently being blamed for the midterms.
00:14:38.520 But if, and there's no evidence of this, no evidence whatsoever, we're only talking speculative, you know, movie imagination stuff here.
00:14:48.480 What if they found an actual smoking gun in Arizona?
00:14:54.680 It changes everything.
00:14:56.780 It changes everything.
00:14:58.740 Now, I don't think it'll happen.
00:15:00.740 You know, I'd bet against it.
00:15:02.160 But that filter does predict it.
00:15:04.760 Would you agree?
00:15:05.360 Would you agree that that's the most obviously entertaining outcome?
00:15:10.820 Not that it's likely, just that it's the most entertaining.
00:15:15.420 Okay?
00:15:16.740 So let's just keep an eye on that.
00:15:19.280 We'll talk about some other prediction filters in a bit.
00:15:24.460 Speaking of people who are bailing out on Trump, Candace Owens had a personal interaction with Trump that apparently she considered rude.
00:15:34.280 That he was rude to her.
00:15:36.120 And I guess it was based on some rumor in the Daily Beast that completely quoted her out of context and made it look like she was insulting Trump when literally she didn't say it.
00:15:48.340 And by the way, if Candace says, and she did, I didn't say those things, and the Daily Beast says you said those things, you don't have to wonder who to believe, do you?
00:15:59.160 Like, that one's a slam dunk, right?
00:16:03.440 Yeah.
00:16:04.580 It's definitely Candace.
00:16:06.060 There's no competition.
00:16:07.780 The Daily Beast is the least credible thing you could ever see.
00:16:11.480 So, do you think that Trump could win back Jordan Peterson, you know, it doesn't matter, he's Canadian, but Candace Owens, me?
00:16:25.580 Do you think he could win back his base?
00:16:29.120 And even the prominent people who've abandoned him?
00:16:32.440 Yeah.
00:16:33.160 Yeah, he could.
00:16:34.480 Totally he could.
00:16:35.340 All he has to do, all he has to do is be the best option.
00:16:39.840 That's it.
00:16:40.820 He just has to be the best option, and then everybody comes back.
00:16:44.060 There's no mystery to it at all.
00:16:45.920 Could he do that?
00:16:47.520 Totally.
00:16:48.700 Totally he could.
00:16:49.860 He could get me back just by having a reasonable fentanyl plan.
00:16:54.560 Because nobody else is going to have one.
00:16:56.360 Do you think anybody else will say anything serious about fentanyl?
00:17:00.000 Nope.
00:17:00.920 Nope.
00:17:01.240 But if Trump does, I'm back.
00:17:05.520 And I'm not just back.
00:17:06.760 I'm back full power.
00:17:09.640 Right?
00:17:10.020 If I come back, I'm going to forgive every other fucked up thing he ever did, because I'm a single issue voter.
00:17:18.100 Right?
00:17:18.960 Now, let me tell you why I'm a single issue voter.
00:17:21.260 It gives me more power.
00:17:22.320 If my vote, you know, or my support were, you know, spread over lots of issues, it would be diluted.
00:17:29.300 But if I just pick one thing, I can maybe, you know, have enough of a wedge to make something happen.
00:17:37.700 So that's why I'm a single issue voter.
00:17:39.780 You don't have to be.
00:17:40.800 I'm not suggesting that you be one.
00:17:43.800 I'm just saying I'm going to be one.
00:17:45.000 It gives me more influence.
00:17:47.180 All right.
00:17:47.460 You're all following the story of the crypto exchange called FTX that burst on the scene just in 2019.
00:17:59.440 So I'd missed this entire story, you know, until it blew up recently.
00:18:03.600 But I'd never heard of FTX.
00:18:05.360 I didn't know that the creator of it, the founder, was worth some tens of billions of dollars, he claimed, or on paper or something.
00:18:14.640 And he was giving massive amounts of money to Democrats.
00:18:19.160 But here's the, and he was like 30 years old, right?
00:18:21.380 And here's the interesting thing about it.
00:18:25.200 It seems like you always have this.
00:18:27.200 It didn't take long for somebody to find some smart people who know the crypto world, and they had some video podcast where the two smart people are saying, I don't understand this whole FTX thing.
00:18:40.540 Like, there's no way he could be producing as much cash to use outside of the exchange with what the exchange is doing.
00:18:48.800 Basically, people who understood it looked at it and said, this doesn't make sense.
00:18:54.340 This couldn't possibly, the math doesn't work.
00:18:57.200 And sure enough, we don't know the exact story of what was going on, but the math didn't work.
00:19:04.680 It may have been a total fraud.
00:19:07.680 Okay.
00:19:09.000 Do you know how many people have asked me to mention the last name of the founder?
00:19:14.620 The last name of the founder who created an exchange, which some people would say in a crypto kind of a way is sort of like a bank.
00:19:22.920 You know, where you put your money and then you can get it back.
00:19:26.160 It's like a bank.
00:19:27.740 It's an exchange.
00:19:30.660 And then in the end, that bank, if you could call it that, sort of got burned to the ground by, I don't know, fraud or something.
00:19:40.640 And the last name of the founder, I'm not making this up.
00:19:46.440 It's a hyphenated name.
00:19:48.780 Bankman. Bankman-Fried.
00:19:53.040 F-R-I-E-D, fried.
00:19:56.420 Bankman-Fried.
00:19:58.620 And sure enough, he's a bank man.
00:20:01.680 Bankman-Fried, and he fried.
00:20:07.060 If you tell me that the simulation is not sending us any winks, I call bullshit on that.
00:20:14.900 I think there's bloody winks.
00:20:17.640 Might be coincidences, but it's fun to think of them as winks.
00:20:20.560 So, one of the big investors in FTX was BlackRock.
00:20:32.920 BlackRock was one of the investors.
00:20:36.920 So, what else is BlackRock invested in?
00:20:42.000 BlackRock.
00:20:42.680 Where have I heard that name before?
00:20:44.080 Oh, yeah.
00:20:47.040 ESG.
00:20:49.320 So, BlackRock, the people behind what I would consider a scam, which is all the ESG stuff.
00:20:57.800 Now, it's a fairly transparent scam in the sense that I don't think BlackRock is hiding anything.
00:21:04.440 But, you know, everything looks legal.
00:21:06.400 But, convincing people that, you know, these little environmentally friendly companies that have been raided by somebody are positive, it's kind of scammy looking.
00:21:18.320 For example, guess what?
00:21:22.420 FTX, the company, was rated as an ESG company.
00:21:25.640 And I remind you that one of the people who invested in FTX, reportedly, is BlackRock.
00:21:34.240 And BlackRock is the promoter of ESG.
00:21:37.380 So, do you think that they gave a high score or a low score to the company that they invested in personally?
00:21:46.600 High score.
00:21:48.420 Surprise.
00:21:49.480 Really high score.
00:21:50.300 Now, it makes sense that they would have a high score, because they were a digital company; I don't think they were heavy on server farms or anything like that, right?
00:22:04.720 So, they weren't burning up a lot of electricity.
00:22:07.740 So, you know, according to the ESG rules, that would be a good ESG investment.
00:22:12.880 Not a good investment, but it would be strong on ESG.
00:22:16.640 Anyway, you see BlackRock as an investor, that should be a red flag.
00:22:20.300 There's an ex-BlackRock executive type who's writing in, I think it's in the Wall Street Journal, Terrence Keeley.
00:22:31.320 He was a former BlackRock guy.
00:22:33.520 He says in his new book that the ESG investment model is broken.
00:22:38.220 And what he means is that the ESG investments don't perform better than non-ESG, which is, like, the reason people invest.
00:22:47.180 But also, it isn't necessarily causing anybody to do things better.
00:22:53.720 In other words, it's not causing the low ESG score people to try harder and, you know, be compatible with ESG.
00:23:01.600 So, it seems like it's not happening.
00:23:03.400 There's probably a lot of greenwashing where they're pretending to be compatible.
00:23:07.380 But, so apparently the ESG thing just doesn't work.
00:23:09.920 Let's talk about TikTok.
00:23:16.080 I told you, and I was wrong, so I'm admitting I'm wrong now.
00:23:20.280 If you like that, you came to the right place.
00:23:23.540 You like it when I admit I'm wrong, don't you?
00:23:25.660 Especially when I'm cocky.
00:23:27.900 And then I'm wrong?
00:23:29.540 That's the good stuff.
00:23:30.500 So, if you like to see me be wrong when I was cocky, here it is.
00:23:38.480 I told you I could destroy ESG, and, I don't know, maybe I helped.
00:23:42.140 It's hard to know what impact I had.
00:23:44.340 That's not what I'm talking about.
00:23:46.500 I also told you that I was going to destroy TikTok.
00:23:50.060 Because TikTok, sort of their soft underbelly was revealed by the election.
00:23:56.260 Now, I'm not alleging that TikTok changed the election.
00:23:58.900 What I'm alleging is that we saw very clearly that it could have.
00:24:04.860 If China wanted to change the algorithm at TikTok, which they could do,
00:24:09.080 then the biggest group of voters that determined the election, say the experts,
00:24:13.660 which is young people and especially young single women, mostly the TikTok users.
00:24:18.580 If China wanted to influence them, they'd just tweak the algorithm, and there it is.
00:24:24.500 There it is.
00:24:25.300 So, here's what I mistakenly thought.
00:24:30.520 I thought, once we have this close, sketchy election, sketchy in the sense that the vote
00:24:36.580 count was delayed, if you've got a really close, sketchy election, and everybody's talking about
00:24:41.260 what are all the things that might have affected it, this was the perfect time for everybody
00:24:47.540 to say, whoa, whoa, whoa, whoa, we don't know that TikTok had any influence, but it's obvious they could.
00:24:54.620 It's obvious they could, right?
00:24:56.420 So, why would you want to keep that situation?
00:24:58.840 Because next time, they might actually influence it.
00:25:01.720 It's entirely possible that, yes, sketchy.
00:25:05.600 Why are you asking me about the word sketchy?
00:25:07.920 Sketchy, in my telling of it, just means we didn't get a quick, certain, reliable result that we find credible as voters.
00:25:19.900 Doesn't mean there's anything illegal.
00:25:22.000 I haven't seen any evidence of that.
00:25:23.460 But here's why I will not be able to have any impact on TikTok.
00:25:30.760 And it took me a while to figure this out, because I was noticing a pattern, but I wasn't sure it was a pattern.
00:25:37.620 And here's the pattern.
00:25:40.400 Nothing gets done unless there's a billionaire who wants it to be done.
00:25:46.020 General statement.
00:25:46.880 In politics, nothing happens until there's at least one billionaire who wants to get that done.
00:25:55.920 TikTok is the only situation where there are no billionaires who want to get it done.
00:26:03.200 Not one.
00:26:04.500 Because all billionaires have Chinese business.
00:26:08.360 It would be almost impossible to have vast resources and not have a China connection.
00:26:14.700 It's almost impossible.
00:26:15.600 So you're not going to get Murdoch to go anti-China, are you?
00:26:22.100 So there goes the Wall Street Journal, there goes, you know, Fox News, there goes the New York Post.
00:26:28.580 Am I wrong?
00:26:30.000 Well, let me ask you this question.
00:26:31.320 Do you think Rupert Murdoch has a business interest that he needs to make China happy?
00:26:39.120 I don't know anything about his business, but I assume so.
00:26:43.060 Wouldn't you assume so?
00:26:44.040 Media mogul?
00:26:46.160 It seems like it.
00:26:47.280 Yeah.
00:26:50.640 So how about Trump?
00:26:53.960 Now, Trump doesn't seem to have China business, but for whatever reason, he did try to ban TikTok, so that might be the one exception.
00:27:04.080 Who would be the other billionaire who doesn't have any business in China?
00:27:11.140 So here's what you need to know about the world.
00:27:14.980 Everybody that you see in public, you know, all the politicians and the pundits, they're not the real people.
00:27:21.040 Almost all of the important pundits and the important politicians have at least one billionaire who is the real power behind the throne.
00:27:31.860 You don't necessarily know who they are.
00:27:34.460 The people behind the curtain know all those billionaires.
00:27:36.640 So if you name anybody in Congress, the people behind the curtain can tell you who their billionaire is.
00:27:43.400 You go, oh, that's that guy's guy.
00:27:45.480 All of them.
00:27:48.300 Until you see it yourself, you don't believe it.
00:27:51.760 No, even Trump had his own billionaire, right?
00:27:54.180 Remember in Trump in 2016?
00:27:56.500 Mercers, right?
00:27:57.200 So everybody, and Adelson, right?
00:28:00.500 So if you don't get a billionaire or two, you can't get anything done.
00:28:04.840 So my, in order for my persuasion to be effective in general, I would first have to persuade other media types.
00:28:14.960 And then the other media types would, you know, boost it.
00:28:18.020 And if the media said it enough, then the politicians, you know, would feel, oh, that's a big click item.
00:28:24.740 I'll get on that.
00:28:25.640 So that's the way it would normally work.
00:28:27.820 But that process doesn't work because the major pundits, as soon as a billionaire who backs them finds out what they're saying, they're going to stop saying it.
00:28:40.320 So here's what I think should happen.
00:28:45.520 I think Musk can make Twitter insanely profitable with just two things.
00:28:51.260 One, and he plans to do at least one of these, probably both.
00:28:54.120 One, adding payment options so I can directly pay through Twitter, you know, some kind of digital payment system.
00:29:04.780 I want to pay through Twitter for anything advertised on Twitter.
00:29:08.420 I mean, you do that and you serve me up advertisements of things I want, I'll give Twitter all kinds of money because I'm going to be buying that same stuff anyway.
00:29:19.540 Might as well give them a little cut of the, you know, the transaction.
00:29:22.340 The second thing, and Musk asked about this, is bringing back Vine.
00:29:28.080 Now, Vine was the short video app that Twitter had.
00:29:31.440 And I don't know if it wasn't technically up to what it needed to be, but it didn't succeed.
00:29:37.080 But at this point, we know for sure that the most viral social media is video.
00:29:43.960 And it's short videos.
00:29:45.560 It's TikTok videos or it's Instagram Reels or it's Facebook videos.
00:29:50.600 And it wouldn't be hard to add a filter on Twitter so that you could filter out everything that isn't a short video.
00:30:00.860 Imagine that.
00:30:02.780 Imagine just hitting one button and suddenly instead of seeing things that are in your feed, you see all the short videos and they're ranked by popularity.
00:30:13.520 I'm already hooked, totally hooked on the Instagram videos.
00:30:19.520 They're so addictive.
00:30:21.100 It's just crazy.
00:30:22.740 And Twitter could be the same.
00:30:24.520 So at least there would be an American substitute.
00:30:28.060 There already are, you know, Snapchat and Instagram.
00:30:31.640 But I think we could maybe compete TikTok away.
00:30:35.620 Maybe.
00:30:36.600 Might be able to just have a better product and make it go away on its own.
00:30:40.080 But it would be hard to get kids to think that TikTok wasn't cool anymore and to get a kid to think that Twitter, which seems like your grandfather's service, it seems like an older male sort of thing.
00:30:55.740 It'd be hard to get teens to go onto that platform.
00:30:59.120 But not impossible if the product is better.
00:31:02.960 So maybe that's what will happen.
00:31:04.180 And, you know, I can't directly convince people that like TikTok not to like it.
00:31:13.140 That's a hard sell.
00:31:15.060 You can get people who are unaligned and don't have an opinion to come down your way.
00:31:20.480 But if somebody loves TikTok, talking them out of it, you can't do it.
00:31:25.240 You can't convince teens not to use it.
00:31:27.300 That's not even a thing.
00:31:28.060 Now, here's the biggest risk with TikTok.
00:31:33.640 And I've taught you about this before.
00:31:35.600 It's how magicians work.
00:31:38.120 If you have a big complaint about, let's say, topic X, and it's a real valid complaint, and you say, my complaint about topic X is this,
00:31:49.600 the best way to make that go away is to have a second complaint about that same topic that gets more attention.
00:31:59.100 But the other complaint isn't actually that important.
00:32:02.440 And that's happening with TikTok.
00:32:04.540 If I tell you TikTok is a risk, what's the top thing you say?
00:32:09.120 It's a risk because, go.
00:32:11.540 TikTok is a risk because, fill in the blank.
00:32:14.740 What is the risk?
00:32:15.780 Well, China, right.
00:32:19.660 But what is China doing, specifically?
00:32:25.440 Influence, brainwashing.
00:32:28.880 Interesting.
00:32:30.440 Most of you have been, I think, informed by me to know what the real risk is, so you got most of it right.
00:32:36.380 If you look in the media, the media will say the risk of TikTok is that they have access to data about Americans.
00:32:43.820 Access to data.
00:32:47.260 Is that the big problem?
00:32:51.080 No.
00:32:52.300 The problem is that they would use not just that information, but they would tweak the algorithm to persuade people.
00:33:02.460 And if you look at it, I was just reading a major article.
00:33:04.860 So it's a major article about the risk of TikTok that never mentions the influence.
00:33:12.160 It only mentions the data security element.
00:33:16.540 Now, if you were to rank those two risks, the risk of data security is important, especially when children are involved, right?
00:33:24.480 If you were to give that an importance, I'd say, that's at least an 8 out of 10.
00:33:29.400 Wouldn't you say?
00:33:30.740 Probably an 8 out of 10.
00:33:33.760 What is the level of risk for the algorithm and using AI to persuade you?
00:33:39.220 That's a weapon of mass destruction.
00:33:41.800 That could bring down an entire country.
00:33:44.800 That's not just even a 10.
00:33:46.280 That's a 25.
00:33:46.920 That's comparing a nuclear weapon to a really good tank or a HIMARS system.
00:33:55.840 They're not really in the same class.
00:33:58.840 Data security, really important.
00:34:00.740 Really important.
00:34:01.920 But it's not in the same class with what the algorithm could do.
00:34:06.460 Those are really different.
00:34:07.720 So every time you see an article about the data security part of TikTok, I don't know if it's intentional, but it is taking you away from the real problem.
00:34:17.320 It's taking your mind in the wrong direction to something small, when in fact, it's really big.
00:34:25.440 All right.
00:34:30.900 Let's see.
00:34:31.600 Well, it's not the most practical idea, but would it be hilarious if all of the American dads, mostly dads, decided to kill TikTok by making it uncool?
00:34:58.960 Like, we all just sign up and make dance videos?
00:35:01.600 Like, we're barbecuing and shit?
00:35:03.720 I'm barbecuing.
00:35:08.380 And just, like, fill TikTok with dad dancing?
00:35:14.280 Could you kill it?
00:35:17.220 Because, you know, it's not a joke that kids will run away from anything parents are using.
00:35:23.120 That's not a joke.
00:35:24.340 That's real.
00:35:25.800 I don't know.
00:35:26.200 There's probably some level of dad dancing that would make TikTok die.
00:35:32.620 But they would just use the algorithm to make sure you didn't see it, so that won't work.
00:35:37.860 Kyle Becker on Twitter asks this in a tweet.
00:35:41.320 Remember the nuclear secrets that Donald Trump stole and hid at Mar-a-Lago?
00:35:46.220 You know, whatever happened to those?
00:35:51.420 The Democrats played a very effective persuasion game.
00:35:57.280 So the January 6th thing was the setup.
00:36:00.380 So that was, like, the canvas.
00:36:03.360 So if they talked enough about January 6th, it looked like Trump was, you know, just a person who would do, like, any bad thing.
00:36:11.680 But the January 6th thing, the longer it went on, it just felt a little aged.
00:36:16.400 It felt like, you know, yesterday's news.
00:36:18.120 So they needed something new and fresh that would remind you of how bad he was.
00:36:23.380 So suddenly he's got nuclear secrets at Mar-a-Lago, which we'll never hear about again.
00:36:29.700 You'll probably never hear about it again.
00:36:32.420 Unless he runs for office, then you'll hear about it.
00:36:35.860 But do you think that was ever real?
00:36:38.420 I don't.
00:36:39.860 I don't think that was ever real.
00:36:41.320 I think there might be nuclear-related documents.
00:36:48.320 Would you doubt that?
00:36:50.180 I don't doubt that there are nuclear-related things.
00:36:54.040 They might even be marked as confidential or secret or whatever.
00:36:58.060 They might be.
00:36:59.600 But I'll bet they're not important.
00:37:02.520 I'll bet they're not important.
00:37:04.880 All right.
00:37:05.800 So here's the big question.
00:37:07.340 Should not all of the major media be reporting the following data to us today?
00:37:14.940 Hey, public, you should know that most of the time when an election is delayed more than, let's say, a day,
00:37:24.480 most of the time, one party usually wins.
00:37:28.900 Wouldn't you like to know that?
00:37:30.780 Feels like that would be important.
00:37:33.100 Do you think anybody will report that today?
00:37:35.500 It should be the front page of every paper, right?
00:37:39.480 Because if you don't know the context, how can you judge today's story?
00:37:44.980 If you tell me, hey, this story took days longer than we hoped, well, would you like to know the context?
00:37:54.000 So my understanding is people say that it usually goes to the Democrat when the vote is delayed.
00:38:03.980 But is that by itself evidence of fraud?
00:38:08.140 Go.
00:38:09.300 Suppose you knew that eight out of ten times when an election is delayed, the Democrat ends up winning.
00:38:15.940 Is that evidence of fraud?
00:38:18.500 It is not.
00:38:19.400 Because the Democrats prefer the ballot method, which in some cases is not necessary, but in some cases they like to give a little cushion for counting it.
00:38:34.200 So their system is actually just designed to make sure that they have time to verify all of the ballots coming in.
00:38:39.860 So it would make sense that where Democrats like ballots, ballots take longer, and they don't have to take longer.
00:38:48.680 There's a way to get around it.
00:38:51.340 But in some places they might want to have low lines and let you vote up to the last minute.
00:38:56.980 Just a difference in preference.
00:38:58.920 And that would explain everything.
00:39:01.260 It would explain everything.
00:39:02.580 So let me say that as clearly as possible.
00:39:05.540 That one difference, it would explain everything.
00:39:08.900 It would explain why some are late.
00:39:11.200 It would explain why the Democrats usually win, because they have more ballot voting.
00:39:16.680 And that gets counted last.
00:39:19.020 So we're all good, right?
00:39:21.240 All good.
00:39:22.860 No problem.
00:39:24.280 Because the only evidence of alleged fraud is easily explained by the obvious, really.
00:39:30.260 It's quite obvious.
00:39:33.960 Well, here's my spin on it.
00:39:36.940 Suppose you are an engineer.
00:39:38.300 Do we have any engineers here?
00:39:40.160 If you're an engineer, identify yourself in the comments.
00:39:42.460 I just want to see how many we have today.
00:39:44.420 Fuck you, Jenny, in all caps.
00:39:47.400 Stupid cunt.
00:39:51.440 Get the fuck out of here.
00:39:55.480 Oops.
00:39:57.760 Unhide.
00:39:58.160 Oh, there you are.
00:40:00.700 You're already hidden.
00:40:02.440 All right.
00:40:03.300 I see quite a number of engineers.
00:40:07.440 How do you keep doing that?
00:40:09.800 How come I keep hiding you?
00:40:12.760 You keep appearing.
00:40:15.040 All right.
00:40:15.380 Lots of engineers.
00:40:16.520 This question is for the engineers only.
00:40:18.960 Okay?
00:40:19.580 So just for a moment, I want only the engineers to comment.
00:40:26.080 Okay?
00:40:27.660 And I know you all want to comment.
00:40:29.200 I'll let you comment.
00:40:30.120 But just for a moment, only engineers.
00:40:32.920 And here's the question.
00:40:37.360 What specifications do you think the elections had that were executed in terms of the system they built?
00:40:46.080 What do you think the specifications were, the specs?
00:40:48.560 Do you think that the specs were reduce the lines?
00:40:54.900 Maybe.
00:40:55.800 Do you think the specs were make sure they have maximum time for checking ballots?
00:41:02.660 Here's what I think the specs should have been.
00:41:09.100 Maximum credibility.
00:41:11.900 Would you disagree with that as the specs?
00:41:15.200 No.
00:41:15.820 Not maximum accuracy.
00:41:17.380 Nope.
00:41:18.080 Nope.
00:41:18.800 Maximum credibility.
00:41:19.840 Not maximum accuracy.
00:41:23.480 Maximum accuracy gets you a slow election.
00:41:27.060 Right?
00:41:27.400 That's what they are.
00:41:28.680 They did design it for that.
00:41:30.680 But the slow election gets you what?
00:41:34.420 A delayed vote count gets you what?
00:41:37.540 Guaranteed every time.
00:41:40.000 It gets you doubt.
00:41:41.340 Right?
00:41:41.480 So if you were an engineer and you designed a system that by its design would give you a days-long wait for the outcome in a key election battleground,
00:41:54.820 if you designed it so that you knew, by design, that it would take days to count it properly,
00:42:02.480 what are you trying to maximize?
00:42:04.780 Are you trying to maximize the credibility or something else?
00:42:08.720 Would that maximize the credibility of the vote?
00:42:15.440 No.
00:42:17.260 No.
00:42:18.220 The longer people have to wait, remember, the public is not sophisticated.
00:42:23.060 If the public were sophisticated, you could say, well, these are just different preferences.
00:42:28.640 Some states like low lines.
00:42:32.280 They just have a different preference.
00:42:34.800 And it takes longer to count under some schemes than others.
00:42:38.100 That's all it is.
00:42:39.500 But the public is not sophisticated.
00:42:42.060 The public just knows they don't have an answer, and it looks suspicious.
00:42:46.860 How many of you engineers, this is just for the engineers,
00:42:49.980 how many of you engineers would have totally known that designing a system with a late outcome,
00:42:57.140 when other states are not late,
00:42:59.240 how many of you had known it would have reduced the credibility of the result?
00:43:03.860 All of you, right?
00:43:07.080 100% of engineers would know that the timing of when you're done is maybe the biggest factor in credibility.
00:43:16.060 So, do you think that they would intentionally design the system for credibility,
00:43:23.220 as their highest standard, would they design it that way?
00:43:26.240 Engineers only.
00:43:29.700 Engineers only.
00:43:30.700 If your top requirement was credibility, would you ever design it that way?
00:43:36.060 No.
00:43:37.060 No.
00:43:38.180 All right.
00:43:38.740 Suppose you say to me, Scott, Scott, Scott.
00:43:41.960 You know, credibility isn't one variable.
00:43:45.120 You know, the time it takes to do the election is just one variable.
00:43:48.640 But what about all the extra effort to make sure that the signatures match?
00:43:55.600 That gives you more credibility, right?
00:43:58.300 Because you're taking it really slow.
00:44:00.940 You know, people are watching.
00:44:02.460 You've got people watching.
00:44:04.440 Taking it real slow, and they're really, really going to be careful about curing all the signatures.
00:44:08.280 Make sure you get a really good vote.
00:44:09.660 Doesn't that give you more credibility?
00:44:13.380 No.
00:44:13.780 Not as much as you lose by being late.
00:44:19.100 Again, just for the engineers.
00:44:20.840 Would that not be obvious to you?
00:44:23.020 Just the engineers.
00:44:24.080 Wouldn't it be obvious that that little extra effort of checking signatures would be sort of lost to the public?
00:44:31.520 But they're definitely going to notice if you're late.
00:44:34.460 And especially when the other states are not that late.
00:44:38.020 So here's my ultimate take.
00:44:40.940 There are engineers in the world, and engineers built the system, right?
00:44:46.680 No matter whether politicians said, go build a system.
00:44:50.080 Whoever said, whoever paid for it, an engineer built it.
00:44:54.860 Engineers.
00:44:56.660 Technical people.
00:44:58.580 So they built a system that is not optimized for credibility.
00:45:03.980 Would you agree with me?
00:45:05.240 Any system that has a delayed count, even for all the best reasons, is not designed for credibility.
00:45:13.660 If it's not designed to optimize credibility, what is it optimizing?
00:45:20.360 Is it optimizing convenience?
00:45:23.340 Because that's one of the things they say.
00:45:24.840 You don't have to stand in line.
00:45:26.540 You could just drop off your ballot on the same day.
00:45:29.340 No standing in line.
00:45:30.200 Well, if you optimize for convenience, is that better than optimizing for credibility?
00:45:38.800 No.
00:45:39.680 Literally nobody would say that, right?
00:45:42.500 How about this?
00:45:43.960 Democrats would say, no, we're optimizing to get the most votes.
00:45:48.980 We want the most people to be not disenfranchised.
00:45:51.920 You know, the system is healthier the more participation we have.
00:45:56.480 So we're doing everything we can to make sure every vote counts and we get everybody to the election.
00:46:03.040 Would that give you more credibility, because more people were participating,
00:46:09.680 compared to being late with your result?
00:46:13.500 No, it's very clear again, that being on time with your result would trump convenience.
00:46:21.160 I won't use trump.
00:46:22.300 It would beat convenience, and it would beat having extra people vote.
00:46:27.220 Does having extra people vote, who are the marginal people who weren't that interested in voting?
00:46:33.460 Am I right?
00:46:34.940 The extra people that, you know, extra work brings into the system are the dumbest ones.
00:46:40.600 They have the least information, and they care the least.
00:46:46.620 They add absolutely nothing to the quality of the decision.
00:46:50.580 So do you think Democrats are trying to improve the quality of the decision by bringing in the dumbest people?
00:46:58.120 Obviously no.
00:46:59.580 Obviously no.
00:47:01.080 Nobody would design a system trying to get the best outcome
00:47:05.360 by bringing in extra of the dumbest group of people who are paying the least attention.
00:47:11.360 Nobody would do that.
00:47:13.560 So I give you this following conclusion.
00:47:16.780 While I cannot say there was any fraud intended or actual in this election,
00:47:21.920 I have no data that would suggest it.
00:47:25.220 But I can tell you,
00:47:26.200 no engineer would build this system for any intention other than hiding fraud.
00:47:34.720 It's designed to hide fraud.
00:47:37.760 It's designed that way.
00:47:40.400 And it's a battleground state.
00:47:43.220 And they didn't have to.
00:47:45.120 And everybody who says,
00:47:46.160 but Scott, you don't understand.
00:47:48.540 Their system does take longer because of the way they did it.
00:47:51.300 Nobody else had to do it that way.
00:47:55.640 And that's not an argument at all.
00:47:57.640 Because you've got all the other states who are not doing that.
00:48:00.860 So they had an option.
00:48:02.500 They had an option to be credible.
00:48:04.960 And they chose not.
00:48:08.440 Am I wrong?
00:48:09.720 They had the option to be credible.
00:48:12.180 And they had time.
00:48:13.080 I'm sure they had time to make adjustments.
00:48:15.820 They had an option.
00:48:17.240 They chose not to.
00:48:18.300 What are you supposed to make of that?
00:48:21.300 How else can you interpret that, really?
00:48:26.980 From an engineering perspective,
00:48:29.180 it's obvious it's designed for fraud.
00:48:31.840 It's obvious.
00:48:34.300 Now, let me be clear.
00:48:35.760 When I say designed for fraud,
00:48:37.720 it doesn't necessarily mean intentionally.
00:48:41.260 It doesn't necessarily mean intentionally.
00:48:44.300 But a design,
00:48:45.220 a design is a decision in a sense.
00:48:49.420 The design tells you what,
00:48:51.300 where your priorities were.
00:48:53.260 Whether they're conscious or unconscious,
00:48:55.880 it's still going to tell you where your priorities were.
00:48:58.100 And the priority was an election that's less transparent.
00:49:01.880 What are you supposed to make of that?
00:49:04.020 It's obviously not designed for credibility.
00:49:07.060 And they could.
00:49:08.200 And it's the highest requirement.
00:49:10.120 Now, the reason they could get away with it
00:49:15.300 is that there aren't that many engineers.
00:49:17.920 If everyone were an engineer,
00:49:20.240 this couldn't happen.
00:49:22.700 Am I right?
00:49:24.080 Give me 100 engineers,
00:49:26.280 and we'll walk in and talk to the politicians in Arizona
00:49:29.940 and say,
00:49:31.060 I got 100 engineers here.
00:49:33.000 Ask these fuckers how to do this right.
00:49:34.820 They'll tell you.
00:49:39.140 They'll have different opinions of what's right,
00:49:41.500 but they'll all agree,
00:49:43.240 100% of them,
00:49:44.580 that the way they're doing it
00:49:45.660 is not trying to meet the specification of credibility.
00:49:50.000 They will all agree on that.
00:49:51.420 So it's a lack of,
00:49:53.920 I guess,
00:49:55.780 talent stack that even allows this situation to happen.
00:50:01.260 So what ratio of late and disputed kinds of elections
00:50:06.560 go Democrats versus Republicans?
00:50:09.760 I don't know.
00:50:11.160 I've heard people say 80% go one way versus the other.
00:50:15.620 But there are some notable exceptions.
00:50:17.420 For example, Gore versus Bush in 2000.
00:50:23.900 So a very close election,
00:50:26.140 but went all the way to the Supreme Court,
00:50:29.080 and then Gore conceded.
00:50:31.760 So that's an example of things going the way of the Republican.
00:50:39.020 But what state was that that was the disputed one?
00:50:42.360 Oh, yeah.
00:50:42.840 So in a state where the election is managed by Republicans,
00:50:48.780 in a state managed by Republicans,
00:50:51.960 the close election went to a Republican.
00:50:54.380 How about that?
00:50:58.000 And do you know what happened
00:50:59.320 when the press audited the votes themselves
00:51:04.040 to see if the vote count was right?
00:51:05.980 Do you know what happened?
00:51:06.660 Well, it turns out that the only reason that Bush won
00:51:12.560 is because there had been a recent agreement
00:51:15.720 about what types of ballots do not count.
00:51:20.560 So at the time,
00:51:22.360 a ballot that was not perfect,
00:51:24.500 let's say it was missing a date or something,
00:51:26.560 would be tossed out.
00:51:28.740 By today's standard,
00:51:30.060 that probably wouldn't happen in a lot of states.
00:51:32.480 A lot of states would say,
00:51:33.520 no, it's more important
00:51:34.400 since it's obvious what the intention of the voter was,
00:51:38.100 like the fact that they wrote the date wrong,
00:51:40.980 that shouldn't take their vote away.
00:51:43.120 If you had used that standard
00:51:44.680 where it was obvious that it was a real person
00:51:47.400 trying to make a real vote,
00:51:48.760 they just filled out something wrong,
00:51:51.880 Gore would have won.
00:51:52.540 Do you think in 2022,
00:51:56.440 Gore would have just given up
00:51:58.060 because the Supreme Court had ruled
00:52:00.920 that they both agreed to those rules?
00:52:03.200 I doubt it.
00:52:05.340 Like, I don't know what you would do,
00:52:06.680 but I think there would be, you know,
00:52:08.820 infinite, clever, legal challenges
00:52:11.220 until they found a way to say the more important,
00:52:15.860 you know, they'd find some Democrat judge
00:52:17.940 who would say the more important standard
00:52:20.320 is that you don't disenfranchise these voters.
00:52:23.060 Yes, both sides agreed to these rules,
00:52:27.760 and yes, it's the law of the land in Florida,
00:52:30.580 but, you know,
00:52:32.600 you can imagine some judge overruling it
00:52:35.600 and say, yeah, but it, you know,
00:52:37.720 the higher standard
00:52:39.040 is to make sure that everybody's vote counts.
00:52:42.280 So it doesn't matter what you agree to,
00:52:44.180 it doesn't matter what the state says,
00:52:45.900 the higher standard is the vote's got to count
00:52:48.160 if it's obvious what the vote was for.
00:52:53.220 The 2000 election wasn't slowed down
00:52:55.460 to count new ballots.
00:52:57.640 Yeah, there was a different situation for sure.
00:53:00.580 So everything about Gore versus Bush
00:53:02.500 was a little unique.
00:53:04.160 I'm not sure if you could learn anything from that.
00:53:06.520 But it is interesting that Gore would have won
00:53:08.460 depending on which ones you accept.
00:53:11.160 I said the other day
00:53:16.620 that the midterm election
00:53:19.020 was the best case scenario
00:53:20.440 because the country wanted gridlock
00:53:23.380 and then they got it.
00:53:26.440 But when I see, you know,
00:53:28.480 as this drags out,
00:53:30.620 the difference between the best case scenario
00:53:32.520 and the worst case scenario
00:53:33.680 is really, it's really, really close
00:53:36.960 because the worst case scenario
00:53:38.620 is a close election
00:53:39.740 that looks exactly like it was fraudulent.
00:53:43.080 That's the worst case scenario.
00:53:45.360 And that's what we got.
00:53:47.600 Now, I don't see any fraud,
00:53:50.380 but it fits the pattern.
00:53:52.860 We'll talk about that.
00:53:56.420 So here are all the reasons
00:53:57.720 that people are sort of settling on
00:53:59.940 for why the midterm
00:54:01.780 didn't go the way the Republicans hoped.
00:54:06.340 So there's a whole bunch of filters.
00:54:07.820 One filter is, as we discussed,
00:54:11.480 that reality always takes
00:54:12.680 the most entertaining path.
00:54:14.500 That would have largely predicted
00:54:16.980 where we are.
00:54:18.560 So that filter worked.
00:54:20.280 Now, I'm not saying it works in every case,
00:54:21.880 but it worked in this case.
00:54:24.240 Here's some other filters.
00:54:28.020 That people like to be given things
00:54:32.260 and they don't like to have things
00:54:33.560 taken from them.
00:54:34.420 And I know what you're going to say
00:54:36.940 before I make this point,
00:54:38.840 so just wait for a moment
00:54:39.840 and I'll agree with your point
00:54:41.200 before you make it, okay?
00:54:43.280 So you're going to immediately disagree with me,
00:54:45.760 but I'm going to confirm
00:54:47.280 your disagreement in a moment.
00:54:51.080 Giving beats taking.
00:54:52.200 What did the Democrats offer?
00:54:56.540 We will give you free money
00:54:58.220 to cancel your student loans.
00:55:01.040 People like stuff.
00:55:02.020 Free money.
00:55:02.580 Yay.
00:55:03.520 When the Democrats offer to cut taxes,
00:55:06.480 that's the same thing.
00:55:07.860 We'll give you free money.
00:55:09.440 Oh, yay.
00:55:10.400 I like my free money.
00:55:13.360 Whereas the Republicans,
00:55:15.240 and I know you're going to disagree,
00:55:16.800 but I'll agree with you in a moment,
00:55:17.920 what the Republicans offered
00:55:20.900 was less freedom.
00:55:23.980 That's what they offered.
00:55:25.600 We're going to take your abortion rights,
00:55:28.160 as you believe you have.
00:55:29.500 We're going to take them from you.
00:55:32.540 And then the Democrats cleverly said
00:55:34.740 they're also going to take
00:55:35.560 your democracy away,
00:55:37.120 which I laughed at
00:55:38.700 because it was so ridiculous.
00:55:40.700 But apparently a lot of Democrats
00:55:42.100 thought they were going to lose
00:55:43.200 their democracy.
00:55:43.820 And so that should have predicted
00:55:49.280 the outcome.
00:55:50.940 Now what you're going to say is,
00:55:52.360 Scott, Scott, Scott,
00:55:53.120 you have the giving and taking
00:55:54.420 all backwards.
00:55:55.460 What the Republicans were doing,
00:55:57.020 they were trying to give life
00:55:58.720 to the unborn.
00:56:00.240 They were giving, right?
00:56:02.520 They were giving.
00:56:04.760 So really you say that's taking,
00:56:07.400 but that's giving.
00:56:07.960 That's giving life.
00:56:09.640 That's irrelevant.
00:56:10.480 If the voters felt like
00:56:14.020 they were losing something,
00:56:15.140 that's all that matters.
00:56:16.440 And obviously they felt like
00:56:17.820 they were losing something.
00:56:19.480 Yeah, young women thought
00:56:20.700 they were losing a right
00:56:21.700 that they wanted.
00:56:24.360 So, and then the democracy thing,
00:56:27.620 you know, I don't think that
00:56:28.500 the Republicans had a good
00:56:29.760 counter to that.
00:56:31.420 You know, the counter to
00:56:32.280 you're losing your democracy
00:56:33.420 is no, you're not.
00:56:35.140 Right?
00:56:35.920 And fear persuasion works really well.
00:56:38.640 So if the only thing you did
00:56:40.900 is look at it and said,
00:56:42.460 what feels like giving
00:56:43.840 and what feels like taking,
00:56:47.520 you know, forget your academic argument
00:56:49.200 about who's really doing the giving
00:56:51.140 and who's really doing the taking.
00:56:52.620 That doesn't matter.
00:56:54.020 What matters is how it feels.
00:56:56.040 And this totally felt like
00:56:57.440 the only things the Republican offered
00:56:59.200 was less freedom.
00:57:03.420 That's what it felt like.
00:57:05.600 Now, if we'd been in the middle
00:57:06.980 of the pandemic and the Republicans
00:57:09.740 have said, we will free you
00:57:11.660 from the lockdown.
00:57:13.000 Well, that's good.
00:57:14.760 Then the Republicans are giving you
00:57:16.360 some freedom, potentially.
00:57:19.060 But the Republicans offered nothing,
00:57:23.020 let's say nothing prominent.
00:57:25.480 They do have plans.
00:57:26.800 But nothing prominent about solutions.
00:57:29.140 They just said, we're going to take
00:57:30.360 some things you like.
00:57:31.360 We're going to take them away from you.
00:57:33.300 Am I wrong?
00:57:34.580 Now, remember, this is how people
00:57:35.860 received the message.
00:57:37.420 That's not the message they were sending.
00:57:39.280 But that's how it was received.
00:57:41.240 Right?
00:57:41.620 It was a contest between
00:57:43.220 I'll take your shit
00:57:44.200 versus I'll give you free shit.
00:57:46.420 That's not even a contest.
00:57:49.100 You could predict the winner
00:57:50.340 of that every time.
00:57:52.360 Right?
00:57:52.940 Now, you know,
00:57:54.140 you've got that midterms
00:57:55.620 usually go the opposite way
00:57:57.040 of the president thing.
00:57:58.480 But this is as close to a victory
00:58:01.660 as you could get in the House
00:58:03.040 with actually a victory.
00:58:04.180 All right.
00:58:06.380 Here's another filter
00:58:08.500 that as long as the ballot harvesting
00:58:12.060 is part of the process,
00:58:14.500 Democrats will win every time
00:58:16.220 because they have a built-in advantage.
00:58:19.560 No, not cheating.
00:58:21.100 The built-in advantage
00:58:22.080 is that they have urban areas.
00:58:24.180 So it's easier to collect
00:58:26.820 a bunch of ballots
00:58:28.440 from, let's say,
00:58:29.780 you know,
00:58:30.260 an apartment building
00:58:31.180 than it is to drive down
00:58:33.180 a country road
00:58:33.940 and say,
00:58:34.960 do you have a ballot?
00:58:35.680 Do you have a ballot?
00:58:36.560 So as long as Democrats
00:58:38.780 are concentrated in cities,
00:58:42.140 they should win every election
00:58:44.020 because you can ballot harvest
00:58:45.180 more easily.
00:58:47.360 And so there's a filter.
00:58:50.200 How about that predicted everything?
00:58:51.940 That could have predicted
00:58:54.400 literally everything.
00:58:56.020 So that's the filter
00:58:56.860 that would have worked.
00:58:58.860 Here's another filter
00:59:00.080 that would have worked.
00:59:00.940 Follow the money.
00:59:02.740 Follow the money works, right?
00:59:04.820 Blake Masters,
00:59:06.040 and I didn't know this at all,
00:59:07.340 but this is shocking.
00:59:09.560 So Blake Masters lost.
00:59:11.180 If he'd won,
00:59:11.940 you know,
00:59:12.180 it could have changed
00:59:12.760 the direction of the Senate.
00:59:15.880 He was super underfunded.
00:59:20.900 He had like $12 million
00:59:22.180 or something.
00:59:23.460 And his opposition
00:59:25.100 who won had $80 million,
00:59:26.920 something in that neighborhood.
00:59:28.260 It wasn't even close.
00:59:30.340 So if you followed the money,
00:59:32.360 it looks like the Democrats
00:59:33.720 put their money
00:59:34.380 in the right places.
00:59:36.140 And it looks like
00:59:36.840 the Republicans
00:59:38.180 maybe put their money
00:59:39.180 in the wrong places.
00:59:40.660 So follow the money
00:59:41.800 would have actually gotten you
00:59:44.320 to where you are.
00:59:46.140 Maybe not as cleanly,
00:59:47.620 but it looks like
00:59:48.580 it would be predictive.
00:59:54.100 Let's see.
00:59:58.320 What else?
01:00:01.620 And of course,
01:00:02.400 there's the Trump effect.
01:00:03.580 You know,
01:00:03.760 Trump will get blamed
01:00:04.600 because there were
01:00:05.300 Trumpy people running,
01:00:07.320 blah, blah, blah.
01:00:12.100 All right.
01:00:14.320 So I keep tweeting this
01:00:16.680 because I know
01:00:17.280 it makes people crazy.
01:00:19.080 Not people.
01:00:19.980 It makes my critics crazy.
01:00:22.360 This is just the best thing
01:00:23.640 for bothering Democrats.
01:00:25.940 All right.
01:00:26.260 So here's my tweet.
01:00:27.160 And I've tweeted,
01:00:27.820 I think I've tweeted
01:00:28.460 the same tweet
01:00:29.340 three or four times.
01:00:31.680 But every time I do it,
01:00:33.480 I get no feedback.
01:00:35.740 Because it's just a,
01:00:36.900 it's basically
01:00:37.600 a high ground tweet.
01:00:40.020 See,
01:00:40.300 the high ground
01:00:40.980 is the thing you say
01:00:42.020 that just shuts everybody up.
01:00:43.260 They're like,
01:00:43.800 okay,
01:00:44.200 there's nothing
01:00:44.620 I can say about that.
01:00:46.260 Right?
01:00:46.660 So you've got people
01:00:47.360 in the low ground
01:00:48.280 saying the election
01:00:49.360 is rigged
01:00:50.000 or it wasn't rigged.
01:00:52.180 That's the low ground.
01:00:53.240 That's in the weeds.
01:00:54.440 Hey,
01:00:54.920 is the fact true
01:00:55.780 or is the fact not true?
01:00:57.320 And then I come along
01:00:58.300 with this tweet
01:00:58.860 and I high ground
01:01:00.100 the shit out of everybody.
01:01:01.920 I go,
01:01:02.240 the FTX fraud
01:01:03.240 on top of the pandemic
01:01:04.940 fiasco
01:01:05.540 reminds us
01:01:06.740 to be grateful
01:01:07.500 that our election systems
01:01:09.180 are the only systems
01:01:10.460 in America
01:01:10.960 that are not fraudulent
01:01:12.800 despite the incentives
01:01:14.440 and opportunities
01:01:15.380 to be so.
01:01:17.420 And that's why
01:01:18.440 I accept the results
01:01:19.480 of the midterm.
01:01:22.780 Now,
01:01:23.580 do you think
01:01:24.660 that the critics
01:01:26.060 come in
01:01:26.600 and criticize me
01:01:27.640 for saying
01:01:28.740 that the election system
01:01:29.820 is the only one,
01:01:31.760 the only one
01:01:32.580 that worked perfectly?
01:01:34.420 No,
01:01:34.940 they don't.
01:01:35.340 They just shut
01:01:38.020 the fuck up.
01:01:39.500 If you want
01:01:40.020 to just shut
01:01:41.380 somebody down
01:01:42.220 like at Thanksgiving
01:01:43.540 or Christmas,
01:01:45.480 you do your own
01:01:47.060 version of that tweet
01:01:48.080 but verbally.
01:01:49.460 Just say,
01:01:49.880 you know,
01:01:50.500 I do accept
01:01:52.380 the outcome
01:01:53.340 and I'm just grateful
01:01:55.040 that I live
01:01:55.580 in a country
01:01:56.080 where although
01:01:57.160 every system
01:01:58.060 that we have
01:01:58.740 has been proven
01:01:59.420 to be fraudulent
01:02:00.320 from our finance
01:02:01.920 to our experts
01:02:03.040 to our medical system
01:02:04.280 to our CDC
01:02:05.500 to our government,
01:02:07.460 isn't it great
01:02:08.160 that all 50 election systems
01:02:10.820 worked perfectly?
01:02:16.420 If you want to see
01:02:18.140 somebody change
01:02:18.860 the subject,
01:02:20.360 that leads to
01:02:21.440 instant change
01:02:22.420 of subject.
01:02:23.520 I am not going
01:02:24.380 to talk to you anymore.
01:02:25.760 I'm going to go
01:02:26.440 eat some turkey.
01:02:28.420 So try that at home.
01:02:29.580 It works.
01:02:34.280 So Robby Starbuck
01:02:36.300 reports on Twitter.
01:02:39.160 He said that
01:02:39.900 there was a columnist
01:02:41.720 in Nevada
01:02:42.300 who wanted to test
01:02:44.220 the election system.
01:02:46.500 So this columnist
01:02:48.740 got 11 voters
01:02:50.740 to, you know,
01:02:52.340 work with him
01:02:52.820 on this little project
01:02:53.680 and 11 people
01:02:55.660 put his signature
01:02:56.820 on their ballots.
01:02:59.500 All right.
01:03:00.140 So very intentionally,
01:03:02.460 they didn't try
01:03:03.120 to forge the voters' signatures.
01:03:04.720 They put his signature
01:03:06.340 on their ballots.
01:03:09.000 And now those ballots
01:03:10.720 were sent in.
01:03:12.100 What percentage of those
01:03:13.580 were caught
01:03:14.660 as the signature
01:03:16.340 does not match?
01:03:17.200 They caught
01:03:21.780 six.
01:03:26.800 No, six were accepted.
01:03:28.980 They caught five.
01:03:30.640 They caught fewer
01:03:31.740 than they didn't catch.
01:03:35.520 Right.
01:03:36.100 So six out of 11
01:03:37.520 got through
01:03:38.120 with a completely
01:03:39.360 different signature.
01:03:41.180 And again,
01:03:42.040 I'm not talking about
01:03:42.740 somebody trying to
01:03:43.620 match, you know,
01:03:44.560 somebody else's signature.
01:03:45.520 I'm talking about
01:03:46.520 a different name.
01:03:48.440 And six out of 11
01:03:49.360 got through.
01:03:50.440 And now let me ask you this.
01:03:54.260 If you're doing
01:03:55.300 election work,
01:03:57.380 you're paid by the hour,
01:03:58.940 and your job is just
01:03:59.980 to painstakingly
01:04:01.460 verify signatures.
01:04:04.740 And you've looked
01:04:05.620 at a thousand in a row,
01:04:07.400 and every one
01:04:08.700 of them matched.
01:04:10.560 You're paid by the hour.
01:04:11.660 You can't really
01:04:13.260 get fired.
01:04:15.680 Not really.
01:04:17.200 What are you going
01:04:17.800 to do?
01:04:18.720 Are you going
01:04:19.180 to put as much
01:04:19.840 attention into
01:04:20.820 the next
01:04:21.460 hundred thousand
01:04:23.020 that you look at,
01:04:24.260 or are you going
01:04:24.700 to say to yourself,
01:04:26.700 I'm just going
01:04:29.000 to approve all of
01:04:29.700 these because it's
01:04:30.300 exactly the same pay?
01:04:32.120 What would you do?
01:04:34.160 The average person
01:04:35.600 is just going
01:04:36.160 to start approving
01:04:36.880 every fucking thing
01:04:37.680 that comes through
01:04:38.360 because they get paid
01:04:39.460 exactly the same.
01:04:40.420 That would be
01:04:42.640 the most obvious
01:04:43.760 human nature thing
01:04:44.720 to do.
01:04:45.800 Again,
01:04:46.940 if you were
01:04:48.360 an engineer,
01:04:50.780 would you trust
01:04:51.520 people who are
01:04:52.860 paid by the hour,
01:04:54.340 would you design
01:04:55.140 a system
01:04:56.280 where they are
01:04:57.340 incentivized
01:04:58.240 to lie?
01:05:01.520 Am I wrong
01:05:02.400 that that
01:05:02.800 incentivizes them
01:05:03.800 to lie?
01:05:04.580 Because there's
01:05:05.060 almost no penalty.
01:05:06.840 If somebody says,
01:05:07.620 hey, you missed
01:05:08.140 this one later on,
01:05:09.540 they'll say,
01:05:09.920 ah, damn it,
01:05:10.760 I missed that one.
01:05:12.040 Did you still pay me?
01:05:13.540 Yes, I did.
01:05:14.760 Is it illegal
01:05:15.680 to miss one?
01:05:16.820 No, it's not.
01:05:17.560 It's just a mistake.
01:05:18.820 All right,
01:05:19.300 I'll see you next year.
01:05:21.580 Right?
01:05:22.140 There's no downside.
01:05:24.260 So, of course,
01:05:25.460 people would just
01:05:26.260 approve them
01:05:26.800 without looking.
01:05:27.860 Of course they would.
01:05:29.720 It's the most
01:05:30.980 easily predictable thing.
01:05:33.900 Incentives work.
01:05:35.260 Right?
01:05:35.360 Any kind of incentive
01:05:37.220 you put into the
01:05:37.940 system
01:05:38.300 is going to have
01:05:39.520 some impact.
01:05:41.000 It might not
01:05:41.520 completely change it
01:05:42.560 around,
01:05:43.020 but it's always
01:05:43.460 going to have
01:05:43.780 some impact.
01:05:45.960 So, anyway,
01:05:47.160 I accept the
01:05:48.900 election
01:05:49.520 regardless of
01:05:51.360 these small things
01:05:52.060 because it was a test.
01:05:53.480 We don't know
01:05:53.820 if anybody actually
01:05:54.460 tried to do
01:05:55.140 anything like that.
01:05:57.500 All right,
01:05:58.000 how about this
01:05:58.460 filter for predicting?
01:06:00.060 History repeats.
01:06:01.060 A lot of you
01:06:03.100 believe that, right?
01:06:03.800 History repeats.
01:06:05.360 So, therefore,
01:06:06.760 when you see
01:06:07.280 something forming,
01:06:08.240 you can say,
01:06:08.580 oh, this is like
01:06:09.440 that other thing.
01:06:11.260 So, history would
01:06:12.240 have predicted
01:06:13.100 a red wave,
01:06:13.900 wouldn't it?
01:06:16.640 History would have
01:06:17.400 said,
01:06:17.740 you've got an
01:06:18.220 unpopular president,
01:06:20.420 everything's going
01:06:21.160 wrong,
01:06:22.100 and it's a
01:06:22.660 midterm election
01:06:23.380 with the president
01:06:24.240 of the other team.
01:06:25.900 It should.
01:06:27.420 It should have
01:06:28.180 been a red wave.
01:06:29.860 History is very
01:06:30.740 clear on this.
01:06:32.400 History could not
01:06:33.240 be more clear
01:06:33.940 on this.
01:06:34.540 This is like
01:06:35.000 one that just
01:06:35.500 guarantees.
01:06:37.080 In fact,
01:06:37.600 the experts
01:06:37.980 were only talking
01:06:38.640 about how big
01:06:39.360 the red wave
01:06:39.980 would be.
01:06:41.500 That was sort
01:06:42.020 of the only
01:06:42.360 decision.
01:06:42.720 How big is it?
01:06:43.940 Didn't happen
01:06:44.400 at all.
01:06:45.500 So, history
01:06:46.020 was useless.
01:06:49.580 But,
01:06:50.120 how about this?
01:06:55.260 How about
01:06:55.900 fraud?
01:06:57.780 Suppose
01:06:58.180 your filter
01:06:59.720 was fraud.
01:07:01.240 And you say,
01:07:01.660 all right,
01:07:01.880 everything's fraudulent.
01:07:03.020 Just everything's
01:07:03.700 fraudulent.
01:07:05.240 What would that
01:07:06.200 have predicted?
01:07:08.540 This.
01:07:09.940 Exactly what you
01:07:10.820 see.
01:07:11.940 If your filter
01:07:12.860 was everybody
01:07:13.520 is a crook
01:07:14.180 and everything's
01:07:14.780 fraudulent,
01:07:15.860 you would have
01:07:16.420 perfectly predicted
01:07:17.360 exactly where we
01:07:18.500 are.
01:07:19.220 Doesn't mean
01:07:19.780 there's fraud.
01:07:21.240 I'm just saying
01:07:22.060 that if you use
01:07:22.780 it as a predictor,
01:07:24.880 you would have
01:07:25.300 been right.
01:07:26.320 Maybe you're
01:07:26.820 right for the
01:07:27.220 wrong reason.
01:07:28.540 But the filter
01:07:29.180 would have
01:07:29.460 worked.
01:07:30.880 So, how
01:07:32.220 about the
01:07:32.760 persuasion
01:07:34.620 filter?
01:07:36.240 Do you think
01:07:37.020 that I could
01:07:37.720 have done a
01:07:38.820 better job of
01:07:39.640 predicting what
01:07:40.480 would happen
01:07:40.960 if I'd used
01:07:42.280 the persuasion
01:07:43.020 filter, which
01:07:43.920 I didn't really?
01:07:46.160 Well, here's
01:07:47.420 what the persuasion
01:07:48.160 filter would have
01:07:48.880 said.
01:07:51.360 You're only
01:07:52.020 talking to your
01:07:52.620 own team,
01:07:53.120 so whoever
01:07:55.280 scares their
01:07:56.120 team the most
01:07:56.820 effectively wins.
01:07:58.760 And that was
01:07:59.260 the Democrats.
01:08:00.380 So, the
01:08:00.900 persuasion
01:08:01.400 filter did
01:08:02.940 predict that
01:08:03.800 the Democrats
01:08:04.440 would outperform.
01:08:06.100 What did I
01:08:06.820 predict?
01:08:09.320 I predicted
01:08:10.200 there would not
01:08:10.640 be a red
01:08:11.040 wave, which
01:08:12.780 is the same
01:08:14.020 as Democrats
01:08:15.500 outperform.
01:08:16.900 So, the
01:08:17.400 persuasion
01:08:17.900 filter was
01:08:19.440 right again.
01:08:19.940 Again, several
01:08:22.180 of these
01:08:22.500 different kinds
01:08:23.080 of filters
01:08:23.740 for predicting
01:08:24.560 got things
01:08:25.800 right, but
01:08:27.180 not necessarily
01:08:27.780 because the
01:08:28.660 filter was
01:08:29.120 right.
01:08:29.520 You could get
01:08:30.500 a false
01:08:30.840 positive pretty
01:08:31.480 easily.
01:08:32.480 But you
01:08:33.120 should always
01:08:33.500 do a little
01:08:33.880 audit of
01:08:34.460 what worked
01:08:35.240 and what
01:08:35.700 didn't.
01:08:37.840 Here's another
01:08:38.560 filter.
01:08:39.740 Everything
01:08:40.060 that's complicated
01:08:41.140 does not
01:08:43.720 go the way
01:08:44.180 everybody
01:08:44.560 predicts.
01:08:46.900 So, this
01:08:47.280 is one, I
01:08:47.680 don't know if
01:08:47.940 I've ever talked
01:08:48.500 about this, but
01:08:49.260 whenever you
01:08:49.580 have a
01:08:49.860 complicated
01:08:50.340 situation,
01:08:51.180 elections have
01:08:51.920 lots of
01:08:52.280 moving parts,
01:08:53.460 very complicated,
01:08:54.320 like an
01:08:54.960 economy.
01:08:55.980 You know,
01:08:56.140 economy is
01:08:56.640 hard to
01:08:56.920 predict, lots
01:08:57.400 of moving
01:08:57.740 parts.
01:08:58.560 Whenever you've
01:08:59.140 got lots
01:08:59.720 of complexity
01:09:00.400 and moving
01:09:01.020 parts, you
01:09:02.780 could just
01:09:03.160 say that all
01:09:03.840 the experts
01:09:04.280 are going to
01:09:04.640 be wrong
01:09:05.060 without any
01:09:07.020 thinking at
01:09:07.560 all.
01:09:08.540 Just say,
01:09:08.900 what do all
01:09:09.180 the experts
01:09:09.620 say?
01:09:10.000 Oh, all
01:09:10.400 the experts
01:09:10.980 are looking
01:09:11.860 at this big
01:09:12.480 complicated
01:09:13.000 situation, and
01:09:14.480 they all say
01:09:15.080 it's going to
01:09:15.600 shift to the
01:09:16.280 right.
01:09:17.880 A really
01:09:18.560 good filter
01:09:19.120 is to say,
01:09:20.120 really?
01:09:20.780 It's a big
01:09:21.420 complicated
01:09:21.900 situation and
01:09:22.740 everybody thinks
01:09:23.320 it's going to
01:09:23.660 go right?
01:09:25.100 Well, I
01:09:25.540 will predict it
01:09:26.200 goes the
01:09:26.500 opposite.
01:09:27.280 How often
01:09:27.780 would you be
01:09:28.240 right?
01:09:29.480 If all you
01:09:30.380 did is just
01:09:30.880 predict that the
01:09:31.960 experts are
01:09:32.480 wrong, how
01:09:33.360 often would you
01:09:33.860 be right?
01:09:35.960 At least half
01:09:36.940 the time.
01:09:38.460 Yeah, at least
01:09:39.120 half the time.
01:09:40.260 And I'm basing
01:09:41.360 that on the fact
01:09:42.080 that scientific
01:09:43.260 studies that get
01:09:45.120 peer-reviewed and
01:09:45.980 actually published in
01:09:46.920 the literature,
01:09:47.900 only about half
01:09:48.940 of them are
01:09:49.320 actually reproducible.
01:09:52.400 So all the
01:09:53.360 experts agreeing
01:09:54.160 with something
01:09:54.640 means it's
01:09:55.340 about 50%
01:09:56.700 likely to be
01:09:57.400 true.
01:09:58.260 At best.
01:09:59.840 So if you
01:10:00.580 had done the
01:10:01.200 coin flip, I'll
01:10:02.580 just take the
01:10:03.140 contrarian view,
01:10:04.960 you would have
01:10:05.500 been right.
01:10:07.260 You would have
01:10:07.680 been right.
01:10:09.360 Yeah, a lot of
01:10:10.040 people who took
01:10:10.640 the contrarian view
01:10:11.660 on COVID will
01:10:13.400 tell you that
01:10:14.020 they got everything
01:10:14.660 right.
01:10:15.560 Maybe they did,
01:10:16.380 maybe they
01:10:16.680 didn't.
01:10:19.360 How about this
01:10:20.160 filter?
01:10:21.680 I've used this a
01:10:22.520 lot of times and
01:10:23.240 it never fails.
01:10:24.740 Whenever you have
01:10:25.340 a situation where
01:10:26.400 there's lots of
01:10:27.100 complexity, so you
01:10:29.240 can't see in all
01:10:29.860 the corners because
01:10:31.080 it's just too
01:10:31.820 confusing, so you
01:10:33.140 got lots of
01:10:33.580 complexity, you
01:10:34.720 got lots of
01:10:35.220 people involved,
01:10:36.980 and you've got a
01:10:39.020 lot to gain by
01:10:40.040 fraud, and you
01:10:41.920 have the
01:10:42.180 opportunity for the
01:10:43.820 fraud.
01:10:45.060 What happens in
01:10:46.280 100% of the
01:10:47.240 cases where
01:10:47.800 that's the
01:10:48.240 situation?
01:10:49.780 Fraud.
01:10:51.140 Not sometimes,
01:10:52.780 every time.
01:10:54.400 Why wouldn't
01:10:54.960 it?
01:10:55.680 It's a pure
01:10:56.560 incentive system.
01:10:58.220 If you can get
01:10:59.140 away with it,
01:11:00.560 because it's
01:11:01.420 hidden in the
01:11:01.960 complexity, and
01:11:04.360 there's a huge
01:11:05.240 upside, maybe
01:11:06.560 money, maybe
01:11:07.200 power, maybe
01:11:08.000 people, and
01:11:09.600 lots of people
01:11:10.200 involved, you
01:11:10.920 can guarantee
01:11:11.460 that some of
01:11:12.060 them are going
01:11:12.420 to try something.
01:11:13.820 Not all of
01:11:14.500 them, but some
01:11:15.100 of them are
01:11:15.420 going to try
01:11:15.760 it.
01:11:16.780 So that would
01:11:18.060 predict that our
01:11:18.720 elections are, if
01:11:20.420 they have not been
01:11:21.500 fraudulent in the
01:11:22.320 past, it kind of
01:11:24.240 predicts that they
01:11:24.800 will be.
01:11:26.580 But it doesn't
01:11:27.320 tell you anything
01:11:27.780 about the current
01:11:28.280 one.
01:11:29.560 All right, Mike
01:11:30.320 Pompeo said, he
01:11:37.740 tweeted, he said,
01:11:38.780 conservatives need to
01:11:39.940 make the case that
01:11:41.380 helping Ukraine defeat
01:11:42.700 Putin is in our
01:11:43.980 interest.
01:11:44.880 It will strengthen
01:11:45.660 our national
01:17:46.260 security, deter foes, and
01:11:48.720 lower costs for
01:11:49.660 Americans.
01:11:51.320 What do you think
01:11:51.860 of that?
01:16:52.340 Do conservatives
01:16:53.140 need to make the
01:11:55.180 case that helping
01:11:55.920 Ukraine defeat
01:11:56.780 Putin is in our
01:11:57.540 interest?
01:11:57.860 Let me tell you a
01:12:02.440 little story, a
01:12:04.300 little context, and
01:12:05.400 then we're going to
01:12:05.900 get back to Mike
01:12:06.600 Pompeo.
01:12:08.100 Years ago, my
01:12:09.060 first job in the
01:12:10.180 adult world was a
01:12:11.620 bank teller.
01:12:13.360 And my boss told me a
01:12:15.400 story about when he
01:12:16.240 was a bank teller,
01:12:17.260 years before that.
01:12:18.680 And one of his
01:12:19.440 first jobs was to be a
01:12:20.480 bank teller at the
01:12:21.480 drive-up window.
01:12:22.360 Now, if you remember
01:12:24.760 drive-up windows at
01:12:26.080 banks, I think they
01:12:27.940 still have them, there
01:12:29.240 would be like a glass
01:12:30.380 window, and there's
01:12:31.360 like a little thing
01:12:32.500 where you can put money
01:12:34.340 back and forth through
01:12:35.720 the little, what would
01:12:38.240 you call it, the little
01:12:38.960 channel there.
01:12:40.460 And so my boss's boss
01:12:43.880 said, all right, there's
01:12:45.820 a good chance that you're
01:12:46.660 going to get robbed.
01:12:48.400 Like, you know, a
01:12:49.440 criminal will drive up
01:12:50.540 and tell you to give
01:12:52.200 all the money.
01:12:53.220 And that criminal
01:12:53.960 might actually have a
01:12:54.800 gun, which would be
01:12:56.840 typical.
01:12:57.960 And so he was trained
01:12:59.200 that the glass window
01:13:01.080 was bulletproof.
01:13:03.160 And, you know, they
01:13:04.140 can't get to you.
01:13:05.480 So he was trained to
01:13:06.540 not give the robber the
01:13:07.880 money, because you're
01:13:09.420 behind a bulletproof
01:13:10.300 glass, and if the guy
01:13:12.400 has just a gun, you
01:13:14.880 know, what can he do?
01:13:16.560 So, sure enough, the
01:13:18.620 guy takes the job, and
01:13:19.620 sure enough, he gets
01:13:20.200 robbed.
01:13:20.520 And a car drives up, and
01:13:23.040 the guy pulls out a
01:13:23.780 gun.
01:13:24.380 But here's the funny
01:13:25.120 part.
01:13:26.880 Do you know the
01:13:28.440 Dirty Harry gun?
01:13:30.640 Is it a .44?
01:13:32.660 What is it called?
01:13:33.380 A .44?
01:13:35.300 Do you know what those
01:13:36.200 look like?
01:13:38.100 A .44 Magnum.
01:13:39.440 .44 Magnum, yeah.
01:13:40.980 A .44 Magnum is a
01:13:42.320 handgun that's like as
01:13:43.680 big as a fucking
01:13:44.360 rifle.
01:13:44.740 It's the most dangerous
01:13:47.360 looking weapon you've
01:13:48.880 ever seen in your
01:13:49.760 life, right?
01:13:51.020 And it can blow a hole
01:13:52.260 through an elephant.
01:13:53.480 I don't know if it
01:13:54.020 could do that, but it's
01:13:54.660 really powerful.
01:13:55.880 So this guy drives up to
01:13:57.940 my ex-boss who's in the
01:13:59.880 window, and he pulls out
01:14:01.540 this gun that's like this
01:14:04.240 long and puts it up to
01:14:05.760 the glass.
01:14:06.380 And the guy's looking at
01:14:09.340 this fucking cannon
01:14:10.860 pointed at his face.
01:14:13.180 And he just starts
01:14:13.980 shoveling the money
01:14:14.700 through the channel.
01:14:17.720 Take mine.
01:14:18.840 Everything you want.
01:14:20.020 So he just gives him
01:14:20.820 everything he has, right?
01:14:23.180 And so my boss's boss
01:14:25.380 comes to him after the
01:14:27.220 fact and says, what did
01:14:29.200 you do?
01:14:30.220 What did you do?
01:14:30.860 I told you that's a
01:14:31.740 bulletproof glass.
01:14:33.280 It's bulletproof.
01:14:34.300 Why did you give him the
01:14:35.140 money?
01:14:35.380 And then my boss said
01:14:38.180 to him, define bulletproof.
01:14:43.700 I've been laughing about
01:14:45.040 that for 30 years.
01:14:47.080 Define bulletproof.
01:14:51.020 All right, so let's get
01:14:52.400 back to Mike Pompeo.
01:14:54.280 Conservatives need to make
01:14:55.440 the case that defeating
01:14:57.380 Putin is in our best
01:14:59.240 interest.
01:15:01.200 Define defeating.
01:15:05.380 You kind of need that,
01:15:06.820 don't you?
01:15:08.540 Define defeating Putin.
01:15:11.720 You mean as in thwarting
01:15:13.600 his desire to control
01:15:15.100 territory in Ukraine?
01:15:17.040 Or do you mean as in he's
01:15:19.040 out of business?
01:15:21.180 Or dead?
01:15:23.380 What does that mean?
01:15:25.920 And why would you say in
01:15:27.560 public something so
01:15:29.600 infuriatingly vague and
01:15:32.960 that that should be like a
01:15:33.960 goal?
01:15:35.160 You should have an
01:15:35.880 annoyingly vague goal?
01:15:39.820 That seems like the worst
01:15:40.820 advice ever.
01:15:42.400 So I think conservatives,
01:15:45.180 as I tweeted in response,
01:15:46.800 I think conservatives could
01:15:47.800 start by defining what
01:15:48.960 defeat Putin means.
01:15:50.540 If you don't have an end
01:15:51.900 game, don't ask us to fund
01:15:54.240 it, which is what's
01:15:56.020 happening.
01:15:56.300 I think I'll give you
01:15:57.700 money for the thing,
01:15:59.680 whatever it is you're
01:16:00.440 doing over there,
01:16:01.500 whatever it is you're
01:16:02.380 trying to accomplish.
01:16:04.300 Now, Thomas Massie,
01:16:06.760 responding to the
01:16:07.900 Pompeo tweet,
01:16:10.200 had this take on what a
01:16:11.980 conservative should be
01:16:12.920 doing.
01:16:13.340 So Pompeo says a
01:16:14.720 conservative should be
01:16:15.720 getting on board to
01:16:17.360 defeat Putin.
01:16:18.580 But Thomas Massie says
01:16:20.660 that taxing Americans,
01:16:21.840 printing money, and
01:16:22.580 taking on more debt to
01:16:24.800 send billions of
01:16:25.600 dollars overseas with
01:16:26.580 very little oversight
01:16:27.380 is not conservative.
01:16:30.440 And he says conservatives
01:16:31.400 make the case that
01:16:32.200 Europe should pay for
01:16:33.060 its own defense, and
01:16:34.480 we should stop meddling
01:16:35.360 in the affairs of other
01:16:36.280 countries.
01:16:38.200 Now, compare those two
01:16:42.720 opinions.
01:16:44.100 And you don't even have
01:16:45.260 to take a side.
01:16:46.720 Just compare the
01:16:48.260 quality of the opinion.
01:16:51.280 Thomas Massie has a
01:16:52.780 perfectly clear standard.
01:16:54.260 Perfectly clear.
01:16:56.680 The people who are there
01:16:58.300 should take care of it,
01:16:59.260 and we shouldn't meddle in
01:17:00.160 other people's business.
01:17:03.200 Now, you could agree with
01:17:04.460 it or disagree with it,
01:17:05.680 but it's very clear,
01:17:07.020 isn't it?
01:17:08.520 You know, you should assume
01:17:09.680 there are some exceptions
01:17:10.800 where we would meddle,
01:17:12.000 but in general, in general,
01:17:14.060 we shouldn't meddle.
01:17:15.180 Now, I'm not saying that
01:17:16.460 the Massie approach is
01:17:18.180 right and that
01:17:19.000 Pompeo is wrong.
01:17:19.940 I'm saying that one of
01:17:21.180 them said something
01:17:21.840 perfectly clear and
01:17:23.420 clean and something you
01:17:25.420 could actually do,
01:17:26.920 whereas Pompeo said
01:17:27.980 something that would be
01:17:28.840 a blank check.
01:17:30.520 It's not even close.
01:17:32.840 If you're trying to look
01:17:33.960 for a leader, look for
01:17:36.200 the one who can tell you
01:17:37.120 what they're thinking,
01:17:38.240 and you can understand
01:17:39.080 it.
01:17:41.120 Like, start there.
01:17:42.880 Start with somebody who
01:17:43.840 says something clear and
01:17:45.300 clean and you understand
01:17:46.220 it.
01:17:46.480 That's a pretty good
01:17:47.840 starting place.
01:17:50.960 All right.
01:17:54.380 I have a feeling that
01:17:56.000 Thomas Massie's political
01:17:59.840 future is going to follow
01:18:02.040 a pattern a little bit
01:18:03.160 like Bernie Sanders,
01:18:06.200 but the non-liberal version
01:18:08.520 of that.
01:18:09.680 And here's what I mean by
01:18:10.500 that.
01:18:11.460 Bernie was, you know,
01:18:12.840 crazy clown with his
01:18:15.060 socialism and stuff for
01:18:16.460 decades, right?
01:18:17.600 But eventually the world
01:18:19.460 changed and it became
01:18:22.040 closer to what he'd been
01:18:23.240 saying for years.
01:18:24.180 So he went from crazy to,
01:18:26.080 you know, leadership of
01:18:27.180 the party or, you know,
01:18:28.960 one of the leading lights
01:18:29.780 of the party.
01:18:30.860 I think Thomas Massie is,
01:18:32.860 you know, different enough
01:18:34.140 from other conservatives
01:18:36.320 that he's going to be sort
01:18:38.600 of living on his little
01:18:39.420 island.
01:18:40.240 You know, his opinions
01:18:40.820 won't match the mainstream
01:18:42.220 for a while.
01:18:42.860 Well, but if we check back
01:18:44.800 in, I don't know, 15 years
01:18:46.780 if he's still in politics,
01:18:48.440 I feel like the world will be
01:18:50.300 closer to him than to other
01:18:53.040 people.
01:18:53.960 What would you say?
01:18:55.720 I feel like he's a little
01:18:57.200 ahead of the world, and that's
01:19:00.220 not good or bad, because I'm
01:19:02.100 comparing it to Bernie, who is
01:19:04.000 also ahead of the world, but in
01:19:05.480 a completely opposite way.
01:19:07.720 Right?
01:19:07.900 I feel like the world is going
01:19:09.700 to catch up with Thomas Massie.
01:19:12.220 So, and one of Bernie's
01:19:15.180 strengths that a lot of people
01:19:17.280 actually admire, and I would
01:19:18.860 be one of them.
01:19:20.060 I admire that Bernie's stuck
01:19:22.020 with, you know, he's stuck
01:19:24.400 with his guns.
01:19:25.660 I never agreed with him, but
01:19:27.120 at least he's consistent.
01:19:28.940 You know, there's not much,
01:19:30.700 there's very little hypocrisy
01:19:32.060 happening with Bernie, which
01:19:34.120 is one of the reasons people
01:19:35.840 appreciate it.
01:19:36.500 And I think Massie's going to
01:19:38.860 be in that same category.
01:19:40.260 You're going to say for
01:19:41.200 however many years, you're
01:19:42.680 going to say, I don't really
01:19:43.440 agree with that opinion, but
01:19:45.480 it's very consistent, it's very
01:19:47.460 clean, it makes sense, and
01:19:50.580 then you might find yourself
01:19:52.580 embracing it eventually.
01:19:57.320 Now, Bernie owning three
01:19:58.880 houses is not, you know,
01:20:01.240 that's not really a, that's
01:20:04.040 such a weak attack, because he
01:20:05.720 lives in a capitalist system
01:20:07.120 and, you know, he's okay with
01:20:09.180 people making money and
01:20:10.160 spending it.
01:20:11.400 He's not, like, against all
01:20:12.920 capitalism.
01:20:14.460 So, you know, he's been
01:20:16.180 around for a long time in a
01:20:17.860 high-end job, he has some
01:20:19.380 houses, it's not the biggest
01:20:21.500 surprise.
01:20:23.260 All right, are you happy to be
01:20:29.340 here again?
01:20:32.940 Wilma?
01:20:33.460 Wilma says I've made it happy to
01:20:35.300 be here again.
01:20:36.320 Are you happy?
01:20:37.540 I don't know.
01:20:39.060 I hope you are.
01:20:40.300 This was the best show you've
01:20:41.700 ever seen.
01:20:42.940 I think we could say that with
01:20:44.340 total confidence.
01:20:45.260 All right.
01:20:53.720 SNL's Chappelle monologue was
01:20:55.220 pulled off of YouTube.
01:20:56.520 Now, I haven't seen Chappelle's
01:20:57.880 monologue, so he was on SNL,
01:21:00.220 and I saw on social media that he
01:21:02.480 did something that might have
01:21:03.580 angered the Jewish community.
01:21:07.180 Did that happen?
01:21:09.460 Did he fly a little too close to
01:21:11.020 the sun there?
01:21:13.840 Oh, he says Kanye isn't crazy?
01:21:18.960 You know, one of the things that
01:21:20.740 Chappelle does and Bill Burr does,
01:21:24.640 and I guess some of the other
01:21:26.160 greats have done it as well, see
01:21:28.500 if you have the same experience.
01:21:30.600 If you're watching either Chappelle
01:21:31.720 or Bill Burr, they'll start saying
01:21:35.280 something that you say to
01:21:36.240 yourself, there's no way he's
01:21:38.540 going to survive this.
01:21:39.480 Oh, you've gone too far.
01:21:41.800 Oh, you can't say that.
01:21:44.120 And then he starts moving this
01:21:47.200 battleship in a way that you don't
01:21:50.020 think a battleship could be moved.
01:21:52.640 And then by the time he's done,
01:21:54.500 the whole battleship is turned
01:21:55.700 around, and you say to yourself,
01:21:58.040 oh, okay, that worked.
01:22:00.380 That wasn't horrible after all.
01:22:02.300 That was actually clever, and there
01:22:04.020 was a good point to be made there.
01:22:05.740 And there are very few people who
01:22:07.220 can write that well.
01:22:08.160 So the thing that both Chappelle
01:22:11.300 and Bill Burr don't get enough
01:22:14.380 credit for is they're not just
01:22:16.520 writing jokes, right?
01:22:19.080 Both Burr and Chappelle have a,
01:22:21.460 like a theme that, like an arc,
01:22:24.940 that they'll carry through their
01:22:26.220 routine.
01:22:27.080 That's a whole different level than
01:22:28.880 the joke people, right?
01:22:30.580 That's writing.
01:22:31.820 That's just good writing, because it
01:22:33.520 all has to, like, tie together into
01:22:35.180 some kind of comprehensive whole.
01:22:37.200 And Chappelle does that the best of
01:22:39.080 all.
01:22:39.800 Like, he'll take you on this path,
01:22:41.740 and you think, oh, I'm over in these
01:22:43.500 woods, and boom, you're over here.
01:22:45.740 And then that's the funny part.
01:22:47.020 Are the loud ums a persuasion technique?
01:22:57.140 The loud ums.
01:22:59.060 Who does the loud ums?
01:23:00.580 Do I do that?
01:23:01.740 Are you talking about me?
01:23:03.160 Or somebody else?
01:23:05.960 I don't know what the loud ums
01:23:08.020 refers to.
01:23:09.360 Do I do that?
01:23:10.060 Do I do that?
01:23:10.080 I didn't think I did.
01:23:16.340 All right.
01:23:20.100 Kanye has the word con in it.
01:23:22.040 That's funny.
01:23:25.320 Oh, you love the way I say but.
01:23:28.020 Yeah, I do that to you, too, don't I?
01:23:30.140 I take you down a path,
01:23:31.940 and then show you it's the wrong path
01:23:33.820 when I get to the end.
01:23:37.080 But it's fun, isn't it?
01:23:38.180 It's more fun to not know
01:23:40.660 where you're going to end up.
01:23:41.640 A little bit of surprise
01:23:42.640 every time I take you down a path.
01:23:48.280 You like the way I say taint?
01:23:51.000 All right.
01:23:51.660 So this just has to be discussed,
01:23:53.620 even though it's the last thing
01:23:54.600 I want to discuss.
01:23:58.220 Andrew Taint has beaten social media.
01:24:02.680 Am I wrong?
01:24:04.500 I think he went up against it
01:24:06.620 and beat it.
01:24:08.180 He pretty much owns social media
01:24:10.840 by getting banned.
01:24:12.820 I can't get away from his content.
01:24:15.500 His content is everywhere.
01:24:17.780 Are you seeing it,
01:24:18.460 or is it just being served up to me?
01:24:22.820 But not only is his content everywhere,
01:24:25.080 but he's right on the zeitgeist.
01:24:29.320 Because the things he's talking about
01:24:31.060 are like what other people are talking about
01:24:33.080 in their own ways.
01:24:33.960 So he's reading the room right.
01:24:38.600 So one of the things I try to model,
01:24:43.240 and I think you can confirm this,
01:24:45.120 is that when there's somebody you don't like
01:24:47.180 or a team that you're against,
01:24:49.680 can you make the argument for them?
01:24:52.320 Because if you can make their argument
01:24:53.980 like without any hesitation,
01:24:56.120 then maybe you're an objective thinker.
01:25:00.980 Otherwise, your bias is taking over.
01:25:03.640 Now, as you know,
01:25:04.400 I have a personal beef with Andrew Taint.
01:25:09.120 So I don't like him.
01:25:11.560 And if something happened to him tomorrow,
01:25:15.560 I wouldn't care.
01:25:16.500 So I totally don't like him.
01:25:19.900 But he is completely successful
01:25:23.460 in beating social media.
01:25:27.860 He's an energy monster,
01:25:29.620 and he's taking all the negative energy
01:25:31.860 and turning it into money.
01:25:33.660 And he's doing it really, really effectively.
01:25:37.020 So I have to say,
01:25:41.120 as much as I don't think I'll ever like him
01:25:43.300 as a human being,
01:25:45.980 he is very effective
01:25:47.060 at exactly what he's telling you he's doing.
01:25:50.260 So he's fun to watch,
01:25:52.440 even though it just bothers me.
01:25:55.440 And a lot of the things that he says,
01:25:58.280 and let me ask you this.
01:25:59.120 If you watched Andrew Taint,
01:26:01.260 how many of the things he says
01:26:02.480 are things that you've heard from me first?
01:26:06.260 Let me just test that.
01:26:08.060 Because when I listen to him,
01:26:09.360 it just sounds like he watches my live streams
01:26:11.560 and repackages them.
01:26:13.300 Do you get that sense?
01:26:20.540 I'm getting mixed answers.
01:26:24.880 Yeah, some people are saying a lot,
01:26:26.780 and yes,
01:26:27.360 and some people are saying no.
01:26:30.000 But the people who are saying yes
01:26:31.820 are more likely to have
01:26:34.020 probably seen enough of my content to know.
01:26:37.760 Because, you know,
01:26:38.840 I've been talking recently about
01:26:40.520 relationships and marriage and stuff.
01:26:43.300 And he's on the same page.
01:26:54.220 You're saying
01:26:55.140 it's anti-Semitism.
01:26:57.980 I don't know.
01:26:58.860 I haven't seen any anti-Semitism.
01:27:02.300 Have you?
01:27:03.000 And remember,
01:27:06.060 so I hope I've succeeded in modeling this.
01:27:10.280 So there's a good example,
01:27:11.500 somebody I don't like at all,
01:27:13.520 but I can unambiguously say
01:27:15.340 that he's really good at his job.
01:27:18.460 Fair?
01:27:20.260 Does that buy me any credibility?
01:27:22.180 Because it's hard for me to say.
01:27:24.960 Honestly, it hurts me.
01:27:27.340 But I'd rather be on the level about it
01:27:30.600 than to, you know,
01:27:32.060 mislead you.
01:27:33.580 All right.
01:27:36.600 I will do the same thing with Trump,
01:27:38.800 which is he's bugging me at the moment.
01:27:42.140 But if he does, you know,
01:27:44.840 effective things,
01:27:46.140 I'm going to say that's effective.
01:27:49.400 All right.
01:27:50.540 Anything I left out?
01:27:53.380 I believe we've covered everything,
01:27:55.140 and we've gone a little bit long,
01:27:57.560 which is just right.
01:27:59.780 Just right.
01:28:00.480 And for those of you
01:28:03.000 who watched my cooking show last night,
01:28:07.100 I did a live stream from the man cave,
01:28:09.120 which I took into the kitchen,
01:28:10.920 showed you how to cook in my new air fryer,
01:28:14.260 and I didn't complete the cooking
01:28:16.100 when I signed off,
01:28:17.460 and somebody asked me,
01:28:18.600 how did my salmon go
01:28:20.040 that I had marinated and cooked?
01:28:23.100 And the answer is completely inedible.
01:28:26.560 I took a few bites and threw it away.
01:28:28.380 I don't know why.
01:28:30.560 I may have undercooked it,
01:28:32.100 or I didn't marinate long enough,
01:28:33.460 or it wasn't a good enough filet or something.
01:28:37.120 It didn't taste good,
01:28:38.260 and it wasn't a good texture,
01:28:39.700 and I just tossed it away.
01:28:42.280 So I didn't eat any protein last night
01:28:45.860 except a protein bar,
01:28:47.480 in case you wondered.
01:28:48.500 All right.
01:28:49.020 But you cannot miss my cooking show.
01:28:52.780 Is there anybody who watched it?
01:28:55.180 Because I want to see if you recommend it.
01:28:58.180 If you watched it,
01:29:00.980 do you recommend it to the people
01:29:02.700 who didn't watch it?
01:29:04.200 I think the video clip got clipped off,
01:29:07.660 the one that I uploaded to Locals.
01:29:10.000 I think it got clipped off.
01:29:11.680 So I think the kitchen part isn't even there,
01:29:13.520 but it was there yesterday.
01:29:15.420 I checked it,
01:29:16.280 but somebody said it's not there today.
01:29:18.200 All right.
01:29:18.600 That's all for now.
01:29:19.400 Well,
01:29:19.520 this has been the best live stream
01:29:23.260 you've ever seen.
01:29:25.080 And goodbye to YouTube.
01:29:26.640 I'll see you tomorrow.
01:29:27.840 And Spotify too.