Real Coffee with Scott Adams - December 26, 2022


Episode 1969 Scott Adams: Nothing Happening In The News But That Will Not Stop Us From Sipping


Episode Stats

Length

59 minutes

Words per Minute

136.7

Word Count

8,102

Sentence Count

654

Misogynist Sentences

6

Hate Speech Sentences

20


Summary

Jimmy Kimmel's Trump jokes, the Kari Lake Maricopa County lawsuit decision, and why courts are reluctant to overturn elections. Plus, ChatGPT's statistics, Russian propaganda methods, and a 2023 economic forecast.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of civilization, and it's Boxing Day.
00:00:11.320 Is that right? Boxing Day. For those of you who like to, I don't know, box or live in a box or something.
00:00:21.360 But it's a day for you. We don't know why it's called that.
00:00:24.300 If you'd like to take your experience up to almost god-like levels, I can help you.
00:00:30.880 All you need is a cup or a mug, a glass, a tank, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:37.240 Fill it with your favorite liquid. I like coffee.
00:00:41.240 And join me now for the unparalleled pleasure.
00:00:43.860 It's the dopamine hit of the day, the thing that makes everything better.
00:00:46.920 It's called the simultaneous sip, and it's going to happen now.
00:00:50.340 Go.
00:00:54.300 Ah, yeah, that's good.
00:00:58.340 Yeah, that's good stuff.
00:01:00.420 Well, let's talk about stuff.
00:01:02.820 So, I saw an article about Jimmy Kimmel.
00:01:05.780 He's got that late-night show that's being destroyed by Gutfeld!, the show with the exclamation point.
00:01:12.780 But Kimmel was saying that his executives hinted that he should maybe do fewer Trump jokes
00:01:22.460 because Jimmy Kimmel lost half of his audience by making, you know, so many Trump jokes.
00:01:31.240 And he said that if they asked him to stop doing it, they sort of hinted around, they never asked.
00:01:36.320 But if they asked him to stop making Trump jokes, he would quit.
00:01:40.760 He would quit.
00:01:43.960 Do you see what's wrong with Hollywood people?
00:01:47.700 It's sort of everything's in that story, isn't it?
00:01:51.980 Does he understand, did Jimmy Kimmel understand that he was part of a business enterprise
00:01:57.240 that really likes to serve all of its customers
00:02:00.980 and that he was hired to do a specific job and he decided not to do it
00:02:05.700 and they still renewed him?
00:02:08.940 They should have fired his ass immediately for losing half of his audience.
00:02:12.780 His only job is to maintain his audience or grow it, right?
00:02:16.080 He has one job.
00:02:17.880 Maintain your audience.
00:02:19.140 He decides to get rid of half of it and he keeps his job.
00:02:21.980 What employer doesn't fire you for that shit?
00:02:27.100 Am I wrong?
00:02:28.780 That was a firing offense.
00:02:30.720 He tried not to even do the job for which he was hired,
00:02:33.840 which is entertain the general public.
00:02:36.200 He wasn't hired to entertain half of his audience.
00:02:40.940 But the funny thing is that the article is written
00:02:43.820 as though he's the hero of the story
00:02:46.400 because he stayed with his principles.
00:02:52.260 He said, I will quit if I can't make fun of Trump
00:02:54.580 and lose half of my audience.
00:02:57.820 He has every right in the world to say anything he wants privately,
00:03:02.620 which is exactly the point.
00:03:04.440 He can do it privately.
00:03:06.240 But why do we treat that like it was smart and good and holy?
00:03:11.340 And by the way, the same would be true if he reversed it, right?
00:03:14.560 If he'd lost his liberal audience, I'd say the same thing.
00:03:20.260 He was hired to do a job.
00:03:22.600 He wasn't hired to tell us his opinions.
00:03:25.900 That would be the opposite of what he was hired for.
00:03:29.280 Anyway, how about this.
00:03:32.400 Was it because it was Christmas?
00:03:33.980 How did I miss the fact that there was a Kari Lake Maricopa lawsuit decision?
00:03:40.280 Do you all know that there was even a decision?
00:03:42.060 It was like there was a lot of silence about it.
00:03:46.100 Yeah, she lost.
00:03:47.640 She lost.
00:03:48.960 Was there anybody who predicted that she would lose?
00:03:56.280 Yes.
00:03:57.260 Probably a lot of you.
00:03:58.720 I predicted she would lose because they did not show intent.
00:04:02.100 They showed that things went wrong.
00:04:05.340 And I think they did a good job of showing that enough went wrong between the chain of custody question and the things printed on the wrong ballots, etc.
00:04:16.800 I think there was good evidence that it could have changed the result, and there was good evidence that things went wrong.
00:04:27.900 But there was no evidence that it was intentional.
00:04:31.280 I think the closest they got was speculation that the administrator had to do it intentionally.
00:04:37.920 But that's not really evidence.
00:04:39.800 It's more speculation.
00:04:40.740 So, how many people were surprised at the result?
00:04:47.100 Nobody's surprised, right?
00:04:49.600 That's what I predicted, even after you'd seen how clear the evidence was that there was a problem, because that part seems to be undisputed.
00:04:58.100 I don't think there's a dispute that there were major problems that could have changed the outcome.
00:05:03.360 But I just don't see the courts willing to overturn any kind of election.
00:05:14.920 Broke a thousand viewers.
00:05:17.400 Yeah.
00:05:19.440 So, here's what the court said, in part.
00:05:23.160 A court setting such a margin aside, meaning the number of votes.
00:05:28.340 So, a court setting such a margin aside, as far as the court is able to determine, has never been done in the history of the United States.
00:05:37.160 Should that matter?
00:05:39.640 Should it matter that an election has never been set aside before?
00:05:46.680 I'm a little bit on the fence on this.
00:05:50.520 Because I can see it both ways.
00:05:54.400 I like the instinct of the court not to interfere.
00:05:59.860 You like that, don't you?
00:06:01.800 Don't you like the instinct of the court not to interfere?
00:06:05.380 Like they need a really, really good reason?
00:06:08.060 Like really, really, really good reason?
00:06:10.500 I think I'm okay with that.
00:06:12.720 Even though it, you know, maybe didn't go the way I would have preferred.
00:06:16.660 I feel like that's responsible and yet also not strictly a legal consideration.
00:06:25.120 You know, why does a judge take the non-law into consideration?
00:06:30.220 Because that's not the kind of precedent that they, it's not a legal precedent, is it?
00:06:34.680 I suppose it is.
00:06:36.640 A legal precedent of non-action?
00:06:40.380 Can you have a precedent of non-action?
00:06:42.840 Or does a precedent have to be an action?
00:06:48.980 Or is it the same thing?
00:06:50.700 Non-action is a decision.
00:06:53.420 I don't know if that's a detail or not.
00:06:56.880 Anyway.
00:06:58.220 Oh, so I'm not, I'm not surprised.
00:07:01.500 I also don't mind.
00:07:04.160 I don't mind that we move on.
00:07:05.920 I do mind that we don't fix it for next time.
00:07:08.060 Have you heard the stories about all the things they're doing to make sure these problems don't occur again?
00:07:14.920 No.
00:07:16.540 No.
00:07:17.920 It looks like they plan to do it again.
00:07:21.040 Oh no, that's right.
00:07:22.340 There's no evidence of intention.
00:07:24.640 No evidence of intent.
00:07:25.620 Let me remind you again: whenever there's an opportunity to do something sketchy that has a huge potential gain, such as winning an election or making money, and there are lots of people involved, and it's possible to get away with it.
00:07:46.420 How often do bad things happen?
00:07:49.420 All the time.
00:07:50.240 Every time.
00:07:51.420 Maybe not on day one.
00:07:53.000 That's the only question.
00:07:54.640 But eventually?
00:07:55.640 Yeah, 100%.
00:07:56.480 That situation is a breeder for bad things to happen.
00:08:02.960 So I don't know if any bad things happened, but could they have done bad things and hidden it easily in this context?
00:08:09.680 If somebody did do, let's say, the ballot printing, if they had done it intentionally, would they be able to hide their intention?
00:08:20.720 Easily.
00:08:21.900 It could have been a maximum of one personal conversation.
00:08:26.580 It might have been one person who talked to one other person who said, you know, if you just change the setting, all hell will break loose and conservatives won't get to vote on voting day.
00:08:38.500 It would only take one person whispering to one other person.
00:08:42.860 No documentation.
00:08:44.580 No, no, you don't even need to collude.
00:08:47.260 Like after the first person gets the idea in their head, there's no extra meetings, no conversations.
00:08:53.180 It just happens.
00:08:54.980 Now, let's do a, let's do a reality check.
00:09:00.300 Do you believe that anybody is smart enough to know that the election results would change if they changed the size of the printed ballot?
00:09:11.420 Do you think that that was, that is likely that that was an intentional plan?
00:09:16.100 So most of you are saying yes.
00:09:20.960 Where do you think that opinion came from?
00:09:23.960 Why do you think you have that opinion?
00:09:25.600 Was it assigned to you?
00:09:26.900 Or do you think you would have come up with that on your own?
00:09:32.860 It's a tough question.
00:09:35.540 To me, that looks like an assigned opinion.
00:09:38.580 Here's why.
00:09:39.240 If you tell me that somebody is smart enough to know that that plan would work, I say, I haven't met that person yet.
00:09:48.360 There's just like one degree of sort of cleverness and uncertainty that just feels a little bit more than what real people do.
00:10:00.280 Right?
00:10:01.140 I'm not saying it's impossible.
00:10:02.360 And I'm not saying that you're wrong if you think it was intentional.
00:10:06.260 I don't know.
00:10:07.340 I'm just saying that the nature of it requires a little bit more cleverness and thinking than I expect in politics.
00:10:18.580 Now, here would be a straightforward plot.
00:10:21.700 I have no evidence that it happened.
00:10:23.600 But suppose it was a straightforward plot of collecting ballots from various places that were sketchy and then voting.
00:10:31.260 Does that sound like something that real people would do?
00:10:35.300 Yes.
00:10:36.360 I'm not saying it happened or in big enough amounts to change anything.
00:10:41.520 I'm saying that would be like an ordinary, yeah, people would do that.
00:10:45.780 How about people filling out ballots for dead people?
00:10:49.060 Does that sound like something that would actually happen?
00:10:52.160 Yeah.
00:10:53.340 Yeah, because that's sort of easy and just straight ahead.
00:10:57.220 Yeah.
00:10:58.040 Yeah.
00:10:58.200 Now, maybe not enough to change the election, but yeah, you'd believe that.
00:11:02.860 But if somebody gets into the machine and they know enough to change the size of the printing because they know that the conservatives vote on voting day, they know it'll slow it down, maybe.
00:11:16.200 Maybe.
00:11:17.120 But I think you're in coin flip territory at the very least.
00:11:22.420 Like, if you could say that situation is 50% or more likely to be a conspiracy, I would say, I don't know.
00:11:31.940 I don't think it's 50%.
00:11:33.580 If you told me there's a 25% chance that it was intentional, I'd say, that sounds about right.
00:11:42.220 Now, again, we're all just using our totally subjective opinion.
00:11:49.360 My 25% has no basis in fact.
00:11:52.640 It's just that when you hear anything that's a little too clever, I always discount it.
00:11:59.140 Just too clever, boom, take down the odds that it's true.
00:12:03.640 All right, which doesn't mean I know it's true.
00:12:05.940 I just, that's my little rule.
00:12:07.720 So, what the heck is going on with Maggie Haberman?
00:12:12.380 I keep reading the stories and I'm all confused.
00:12:15.700 So, Maggie Haberman was with Politico and is now with the Washington Post, right?
00:12:22.500 And she is famous on the right for having Trump derangement syndrome and writing continuous anti-Trump stories.
00:12:33.900 But what we learned recently is that, I guess it was the January 6th committee, found out that one of Trump's lawyers, talking to somebody else, said that she was a friendly.
00:12:52.780 In other words, that she was friendly to the Trump organization.
00:12:55.620 Does that sound realistic?
00:13:01.660 Because the left now believes that Maggie Haberman is some kind of a, I don't know, Trojan horse or a trick or a plant or something, and that she's really working for Trump.
00:13:14.720 And their evidence includes the fact that Maggie Haberman's mother works with Jared Kushner?
00:13:23.460 Does that even sound true?
00:13:26.780 I mean, that's what they're saying, that there's some connection with the Trump world.
00:13:32.380 Now, and then I saw Cernovich tweet that people on the inside know that Trump talked to Haberman on the phone, you know, fairly often.
00:13:43.080 Which doesn't make sense for all the bad things she writes about him.
00:13:47.400 Now, what do you think is going on there?
00:13:50.060 Because it can't be true that she's anti-Trump and pro-Trump at the same time.
00:13:56.700 And we're looking at exactly the same stuff.
00:13:59.740 How could we not tell, how could we not tell if she's pro or anti?
00:14:04.720 All right.
00:14:12.760 Well, I'm confused by the story, but I'm going to give you one hypothesis that has not been mentioned.
00:14:19.240 Which is that the story that she was friendly to Trump was incorrect.
00:14:25.440 So, the entire evidence is one reported statement by one person once.
00:14:34.940 That's the whole evidence that she's friendly to Trump.
00:14:38.640 Now, whatever Mike Cernovich has is based on information he has.
00:14:42.760 But in terms of the public information, one statement by one lawyer that she's friendly.
00:14:51.680 What are the odds that that was misheard, heard out of context, or misremembered?
00:14:59.920 Don't you think that the most likely explanation is it never happened?
00:15:03.140 The most likely explanation is that conversation didn't happen.
00:15:09.360 Or that he said, she's not friendly and somebody missed the word not.
00:15:16.020 Or maybe he said, she acted friendly, so I'm hoping for the best.
00:15:21.440 Or maybe he said, I'm going to act friendly and maybe that will, you know, hope.
00:15:27.020 Or maybe he said, it was a friendly conversation, which is different from saying, she's a friendly.
00:15:34.260 Don't you think the most likely explanation for this clear opposite of reality situation we have
00:15:42.020 is that it was just reported wrong?
00:15:45.080 What do you think?
00:15:46.900 Now, I'm not saying I know that.
00:15:48.800 I'm saying it's the most likely explanation.
00:15:51.380 Because it seems weird and complicated that she would be some kind of weird double agent
00:15:57.260 working for Trump at the same time.
00:15:59.640 Now, do you think it's true that Trump would have multiple conversations with her,
00:16:03.620 even though she was anti-Trump?
00:16:05.800 I think so.
00:16:07.420 Because at the very least, he would want to get his version of events out there.
00:16:11.600 Right?
00:16:12.280 So he should definitely be talking to the enemy.
00:16:15.140 Because if there's anything he can do to, you know, add a fact or some context
00:16:19.220 to help him out.
00:16:21.380 If, you know, she's going to come after him anyway, he should try to influence her.
00:16:27.140 So, I don't know.
00:16:28.180 Maybe we'll find out what that was all about.
00:16:30.000 Maybe not.
00:16:31.980 I asked ChatGPT, the AI, what it thinks of marriage.
00:16:39.580 Whether marriage is just, statistically, whether it's a good idea.
00:16:46.020 What do you think AI said?
00:16:47.620 Do you think AI said that getting married is a good idea or a bad idea?
00:16:54.300 Well, I'll tell you.
00:16:56.000 It says the overall divorce rate in the United States has been declining in recent years.
00:17:00.800 Is that true?
00:17:02.360 Is it true that the divorce rate has been declining?
00:17:07.720 It might be.
00:17:08.840 Do you know why?
00:17:10.380 Fewer people getting married.
00:17:11.620 If fewer people get married, probably it's only the ones who are pretty sure that get married, right?
00:17:19.120 And the religious ones, they have maybe a better chance.
00:17:22.400 Could be that.
00:17:23.600 Anyway, I didn't know that was happening.
00:17:25.500 But the AI says the divorce rate was about 2.9 per thousand people.
00:17:31.480 This suggests that the odds of a marriage in the United States working out well may be relatively high.
00:17:39.020 What?
00:17:39.940 2.9 per thousand?
00:17:43.640 That means every year.
00:17:47.300 Right?
00:17:48.320 That's how many people out of all of them get divorced every year.
00:17:52.040 Am I wrong that AI doesn't know how to look at statistics?
00:17:58.680 I thought that would be one thing it would be good at.
00:18:01.660 Because the right statistic is you have a 40 or 50% chance of getting divorced.
00:18:08.600 Or you might not get divorced, but the statistics say you might not be happy.
00:18:13.380 But somehow it picked the annual number and compared it to some big number and concluded that it was a good risk.
00:18:23.620 Because it's bad at looking at statistics.
00:18:26.220 It didn't know what to compare.
00:18:27.900 It actually compared the wrong thing.
00:18:30.300 Am I wrong?
00:18:31.820 I mean, you see it, right?
00:18:32.720 They compared the wrong numbers.
00:18:35.060 They, the AI, compared the wrong numbers.
00:18:37.080 So that's just yet another thing that you can't trust the AI on.
00:18:45.800 Because the AI might know all of the facts and still compare the wrong things.
00:18:51.620 Do you know why AI would maybe compare the wrong things?
00:18:55.600 Because the people who program it don't know how to compare things.
00:18:59.740 I assume it'll learn.
00:19:02.180 But programmers aren't necessarily economists.
00:19:05.280 They don't know what to compare.
00:19:08.240 So AI wouldn't know what to compare.
00:19:10.980 It's a big problem.
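To make that concrete: a crude annual rate (divorces per 1,000 people per year) and a lifetime probability are different quantities, and comparing them directly is exactly the mistake described above. Here is a minimal back-of-the-envelope sketch, using the 2.9-per-1,000 figure quoted above; the married share and years of exposure are illustrative assumptions, not statistics.

```python
# Sketch: why an annual crude rate can't be read as a lifetime risk.
# The 2.9/1,000 figure is the one quoted above; the other numbers are
# illustrative assumptions.

crude_annual_rate = 2.9 / 1000   # divorces per person per year, whole population
married_share = 0.5              # ASSUMED: roughly half the population is married
years_exposed = 40               # ASSUMED: a 40-year window of marriage

# Approximate annual divorce hazard among married people only
annual_hazard = crude_annual_rate / married_share

# Probability a marriage survives the whole window at a constant hazard
p_survive = (1 - annual_hazard) ** years_exposed
print(f"Rough lifetime divorce probability: {1 - p_survive:.0%}")
# ~21% under these toy assumptions -- nowhere near "2.9 in 1,000", and the
# commonly cited lifetime figure is 40-50%.
```

Even this rough compounding shows why the annual figure dramatically understates the lifetime risk.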
00:19:13.640 All right.
00:19:16.360 There's a Rasmussen poll that says 63% of likely U.S. voters believe Congress should investigate whether the FBI was involved in censoring information on social media.
00:19:28.000 Is that even worth doing?
00:19:31.060 Because the answer is yes, right?
00:19:35.280 That, what are we going to ask?
00:19:39.020 Now, I suppose there might be some deeper question about whether the FBI had, you know, had orders to, you know, change the political landscape.
00:19:48.320 But there's no evidence of that, right?
00:19:50.340 Is there any evidence that the FBI was doing it with the intention of changing politics?
00:19:55.780 Except for the laptop thing.
00:19:57.400 The laptop thing is obviously, you need to look into that.
00:20:00.000 So maybe that's enough.
00:20:02.260 Maybe just because the laptop thing is so sketchy.
00:20:06.200 But I think that has more to do with the 50 Intel people who signed it.
00:20:10.380 Yeah, but the FBI was also involved.
00:20:12.540 Okay.
00:20:12.760 So they're also culpable, it looks like.
00:20:16.120 All right.
00:20:18.160 All right.
00:20:18.940 So a little pattern recognition.
00:20:22.600 As you know, my YouTube feed often gets demonetized.
00:20:31.600 Now, we've been watching, you know, what kind of content leads to demonetization.
00:20:37.080 And it turns out it's the swearing.
00:20:40.120 It's not the opinions.
00:20:41.420 You know, you'd think it'd be some kind of, you know, I think I'm also probably being, you know, throttled in some way.
00:20:50.720 But the thing that demonetizes me is the swearing.
00:20:54.260 So probably the F word, probably the C word.
00:20:57.080 So I've decided to make a new rule of no swearing going into the next year.
00:21:03.300 So 2023, no swearing.
00:21:04.940 But it's not really 2023 yet, is it?
00:21:13.640 And I saw something today that I just have to give you one more.
00:21:21.740 So there's going to be a little demonetization going on in a moment.
00:21:27.260 All right.
00:21:27.560 It goes like this.
00:21:31.040 You might be aware that I get a lot of pushback from people who say, why did you get vaccinated, Scott?
00:21:40.100 You know, getting vaccinated is like promoting vaccinations, say people.
00:21:45.140 Like if you're a public person and you got vaccinated, it's like you're promoting it.
00:21:49.420 Because you said it's okay for you.
00:21:50.980 So, you know, that influences people.
00:21:54.520 And then I usually say stuff like, but you know that nobody knew at the time whether the COVID was more dangerous or the vaccination.
00:22:06.800 They were both complete unknowns.
00:22:08.780 So that's my defense.
00:22:09.900 And then my critics say, Scott, listen to Dr. Robert Malone.
00:22:16.280 If you would listen to him, he would explain to you what these mRNA technologies are and all the dangers of them.
00:22:24.900 Right?
00:22:25.800 Now, what do you think of that?
00:22:28.380 Do you think if I'd paid more attention to Dr. Robert Malone, I would have known not to get vaccinated?
00:22:34.320 Yes or no?
00:22:34.760 If I had taken him more seriously, do you think I would have not gotten vaccinated?
00:22:40.900 Because he was very clear about the warnings, wasn't he?
00:22:44.460 Very clear about the dangers.
00:22:46.280 And he got a lot of attention.
00:22:48.300 Why are you saying no?
00:22:49.720 If I paid attention to the guy who says they're dangerous, you don't think I would have skipped it?
00:22:54.800 Why wouldn't I?
00:22:56.040 He's the expert.
00:22:57.560 Everybody tells me he's the expert.
00:22:58.860 He's like, he invented it.
00:23:00.580 That's what they say.
00:23:02.280 Why wouldn't I listen to him?
00:23:04.760 Okay.
00:23:05.960 Well, I'll give you one reason, maybe not.
00:23:10.620 Did you know he's vaccinated?
00:23:13.500 How many of you know that the biggest critic of vaccinations, Dr. Robert Malone, is himself vaccinated?
00:23:19.820 Did you know that?
00:23:20.480 So I would like to just say something to my critics, who are suggesting that if I'd listened to the people they listened to, such as Dr. Robert Malone, chief among them, that if I'd listened to him like they did, that I would have made a different decision.
00:23:38.480 But he's vaccinated.
00:23:41.620 And so I'd like to say to all my critics, who say that I should have listened to Dr. Robert Malone, I might have made a different decision.
00:23:50.900 Fuck you.
00:23:52.720 Fuck your fucking balls.
00:23:55.620 Fuck your stupid faces.
00:23:57.860 Fuck your tiny brains.
00:23:59.740 Fuck your opinions.
00:24:00.780 Fuck you until you disappear into the fuckness.
00:24:10.420 Now, I didn't need the C word there.
00:24:14.060 This is sort of a little test as well.
00:24:16.540 I'm seeing if the F word is enough.
00:24:19.600 So we're holding off on the C word.
00:24:20.880 Now, Dr. Robert Malone says it probably was a mistake.
00:24:28.560 That's correct.
00:24:29.880 Did he say that the day he got the vaccination?
00:24:33.540 Because that was probably about the time I did.
00:24:37.900 So all of those who said that I should have listened to the guy who made the same fucking decision I did, fuck all of you.
00:24:45.980 Which, by the way, doesn't mean you're wrong.
00:24:49.440 Doesn't mean you're wrong.
00:24:50.880 If I drop dead tomorrow with, you know, vaccine-related complications, well, you win.
00:24:58.220 You win.
00:24:59.360 But if I live a long, healthy life, then I took a risk so I could fly to Bora Bora, I did not get sick, and I had a nice vacation.
00:25:10.280 I win.
00:25:11.180 But we don't know yet.
00:25:12.700 So we'll see.
00:25:13.660 I'd say the jury is still out on whether I die or not.
00:25:16.360 The Rand group, R-A-N-D, Rand, is a think tank, sort of a think tank situation, I think.
00:25:26.320 But they had an article on Russian propaganda and how they're doing it.
00:25:31.800 Kind of interesting.
00:25:32.600 So apparently the Russian propaganda method is not to be surgical.
00:25:39.380 So they're not coming up with, like, the perfect meme or the perfect rumor.
00:25:45.460 They're just flooding the zone with bullshit so that you can't tell what's true.
00:25:50.620 And apparently it's working.
00:25:51.480 So the Russians are just, you know, it's just overwhelming, continuous, but it's also inconsistent.
00:26:00.480 So even the messages they give are not all the same because they don't care what you believe.
00:26:07.120 They want to overwhelm you.
00:26:09.260 This would be the Rand point of view.
00:26:11.720 Now, that's interesting, isn't it?
00:26:13.040 They're not trying to persuade you necessarily.
00:26:15.180 They're trying to wear you out so you can't tell what's true.
00:26:19.780 Because if you can't tell what's true, you can't act on it.
00:26:22.440 And that's as good as anything else.
00:26:24.980 So they're just trying to make America ineffective.
00:26:27.840 They don't need to make us believe a specific thing.
00:26:30.100 They need to make us not believe anything.
00:26:33.140 And then we're toast.
00:26:35.380 But here are some other persuasion-related things from Rand that I thought were worth mentioning.
00:26:42.000 Now, this is based on experimental psychology.
00:26:44.340 So this is tying the technique to actual scientific studies.
00:26:50.620 And Rand says, experimental psychology literature tells us that first impressions are very resilient.
00:26:57.340 So the first thing the Russians do is they flood the zone with their own rumors, you know, false stuff,
00:27:04.000 before you've seen the correct reporting on something.
00:27:07.440 Because your first impression is really sticky.
00:27:09.880 So that's the first thing they do right.
00:27:11.240 And I would agree that that's good form.
00:27:15.700 It says, an individual is more likely to accept the first information received on a topic,
00:27:22.080 just because it's first.
00:27:24.820 So that's the same thing.
00:27:25.800 When faced with conflicting messages.
00:27:30.260 So if you have more than one message, they're going to favor the first one they heard.
00:27:34.940 For no logical reason.
00:27:37.000 Just going first matters.
00:27:39.860 And furthermore, repetition leads to familiarity.
00:27:42.680 The more you repeat the fake news, the more people will believe it, because they say,
00:27:48.440 oh yeah, I've been hearing that.
00:27:50.000 Everybody's saying that.
00:27:51.000 That's probably true.
00:27:51.940 Couldn't be so many people saying it if there's nothing to it.
00:27:55.640 So, so, so, uh, be first, repeat it until people think it's familiar.
00:28:03.620 Then, um, the illusory truth effect is well documented.
00:28:09.700 This is where people rate statements as more truthful and believable when they have heard those statements previously than when the statements were new.
00:28:22.300 So, a new statement will be considered, you know, a little bit sketchy,
00:28:26.740 but if they keep hearing it over and over again, even if nothing else happens,
00:28:31.800 they'll start to believe it just because they keep hearing it.
00:28:35.680 But I guess you knew that.
00:28:37.340 Um, when people are less interested in a topic, they're more likely to accept familiar information as true.
00:28:44.960 So, if you don't know much about it, whatever you hear the most is going to be your truth,
00:28:50.520 because you haven't looked into it.
00:28:52.660 Um, consumers, uh, save processing, you know, resources.
00:28:59.220 In other words, their brains don't work so hard, uh, because they use a frequency heuristic,
00:29:06.460 a rule of thumb, that if they've been hearing the same thing over and over again,
00:29:11.700 uh, that it must be true.
00:29:13.820 And, of course, that is weird.
00:29:17.380 Um, even with preposterous stories and urban legends,
00:29:21.020 those who have heard them multiple times are more likely to believe that they are true.
00:29:25.620 So, no matter how ridiculous the story is,
00:29:28.880 if you hear it enough, you think it's true.
00:29:32.160 And that's all it takes.
00:29:33.960 And, by the way, this is all stuff that hypnotists all know, right?
00:29:37.460 This is one, hypnosis 101 stuff.
00:29:39.720 So, none of this is surprising to me, but maybe, maybe some of you.
00:29:44.400 Um, if an individual is already familiar with an argument, uh, they process it less carefully.
00:29:52.840 Right?
00:29:53.300 So, if you've heard the argument, you don't rethink it.
00:29:56.520 You just say, oh, yeah, yeah, yeah, that old argument.
00:29:58.820 So, your brain doesn't click in again once you've decided what's going on.
00:30:04.220 All right.
00:30:06.740 Um, and let's see.
00:30:10.760 So, it's high volume, continuous stuff.
00:30:14.540 Oh, they all, I guess the Russians also, uh, they use fake accounts that look like they're on your team.
00:30:20.620 So, if you saw a fake account that was all MAGA, pro-Trump, American flag,
00:30:26.780 and it says, oh, Bigfoot is real, and you also are in that same camp,
00:30:31.720 you're a MAGA-loving person, you're going to say, oh, this MAGA person, I trust them.
00:30:37.140 They say Bigfoot's real, so I think so, too.
00:30:39.940 So, that's part of how they do it.
00:30:42.700 All right.
00:30:44.020 Um, Mark Andreessen had an interesting, uh, tweet today,
00:30:48.180 a quote from Michael Crichton, famous author, some of you know.
00:30:52.800 And this was Crichton's quote.
00:30:54.860 Quote, often, talking about the news and how unreliable the news is,
00:30:59.340 often the article is so wrong, it actually presents the story backward,
00:31:03.460 reversing cause and effect.
00:31:05.340 I call these the wet streets cause rain stories.
00:31:08.500 Paper's full of them.
00:31:11.180 I might start using that, wet streets cause rain.
00:31:14.040 Because almost every vaccination mask-related story is one of these.
00:31:22.840 Where they did something and something happened,
00:31:24.920 so therefore that caused the other thing.
00:31:27.860 You know, or you can't tell the difference between cause and effect, basically.
00:31:31.980 And they act like they can.
00:31:33.720 It's everything.
00:31:34.820 It's like, probably 30%, well, not everything,
00:31:38.540 probably 30% of all study-related news stories
00:31:43.180 literally just get correlation and causation mixed up.
00:31:47.620 And readers can't tell the difference.
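As a minimal sketch of that wet-streets-cause-rain failure mode: in the toy simulation below, a hidden common cause (rain) drives two effects, so they correlate strongly even though neither causes the other. The "umbrella sales" variable and all the probabilities are hypothetical illustrations, not data.

```python
# Sketch: correlation without causation via a confounder.
# "Rain" drives both "wet streets" and "umbrella sales" (hypothetical
# variables and probabilities); neither effect causes the other.
import random

random.seed(0)
N = 10_000
rain = [random.random() < 0.30 for _ in range(N)]           # hidden common cause
wet_streets = [r and random.random() < 0.95 for r in rain]  # effect 1 of rain
umbrellas = [r and random.random() < 0.80 for r in rain]    # effect 2 of rain

both = sum(w and u for w, u in zip(wet_streets, umbrellas))
p_u_given_w = both / sum(wet_streets)
p_u = sum(umbrellas) / N
print(f"P(umbrellas | wet streets) = {p_u_given_w:.2f} vs P(umbrellas) = {p_u:.2f}")
# The conditional probability is far higher (~0.80 vs ~0.24), so the two
# variables correlate -- but drying the streets would not sell fewer umbrellas.
```

A story that reads that correlation in either causal direction is making the "wet streets cause rain" error.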
00:31:52.160 Uh, in case you're wondering, over there on YouTube,
00:31:54.600 the clank is a reference to Styxhexenhammer's show.
00:31:58.980 I think he clanks a spoon in a cup or something.
00:32:02.300 Uh, so they're, they're basically making it known
00:32:05.260 that they're in that club and they're coming over here
00:32:08.600 to, to watch the show.
00:32:12.480 L. Green says,
00:32:13.780 Scott took the jab because he was scared.
00:32:17.120 Now, do you think that's what it was?
00:32:19.060 L. Green.
00:32:20.260 Now, do you think, is that something you'd bet on?
00:32:22.980 That I took the jab because I was scared.
00:32:25.680 So, I took the jab
00:32:27.280 so I could get in a big metal tube
00:32:30.900 and fly across the ocean
00:32:32.240 to a small island.
00:32:33.900 Because I was scared.
00:32:36.820 So, I did, I did, uh, something that you didn't trust
00:32:40.360 and you were afraid of.
00:32:43.060 But, because you were afraid of it and I wasn't,
00:32:46.720 that made me the afraid one.
00:32:49.640 Now, my decision, I think,
00:32:51.920 by the way, if anybody, if anybody saw me at the time,
00:32:55.060 did I mention it out loud or was this just my private thinking
00:32:58.320 that I was going to take the hit
00:33:00.520 because, at the time, they lied to us
00:33:03.540 and said it would reduce transmission.
00:33:06.000 I think Ben Shapiro said the same thing.
00:33:08.720 That part of the decision is that you're protecting other people.
00:33:12.940 Now,
00:33:14.060 we were, of course,
00:33:15.080 lied to about the effectiveness
00:33:17.280 of the so-called vaccination.
00:33:19.560 But we were operating on what we were told
00:33:21.940 because that's all we had.
00:33:23.020 So, do you think it was
00:33:24.900 that I was afraid
00:33:27.060 that I took an experimental drug
00:33:30.440 with the hope that it would protect people
00:33:32.720 I don't even know
00:33:33.720 because there's no elderly people around me,
00:33:37.020 but that I was protecting people
00:33:38.420 I don't even know
00:33:39.320 at great personal risk to myself
00:33:41.920 and that's what you're calling afraid?
00:33:45.820 Is that your point of view?
00:33:46.900 And you were so afraid of the shot
00:33:51.260 with good reason.
00:33:53.420 With good reason.
00:33:54.420 I'm not saying that your fear was unfounded.
00:33:56.980 You were afraid of the shot
00:33:58.360 and so you didn't get it.
00:34:01.860 Now, are you telling me that you're brave
00:34:03.820 because you are not afraid of the COVID?
00:34:07.660 Is that your argument?
00:34:10.280 Anyway,
00:34:11.140 the person who accuses you
00:34:14.340 of making a decision out of fear?
00:34:16.560 They're just assholes.
00:34:18.320 You don't need to answer them.
00:34:20.680 Because everybody is looking at it
00:34:22.780 and saying,
00:34:23.320 what scares you the most?
00:34:24.860 What's the upside?
00:34:26.060 What are the costs and the benefits?
00:34:28.440 And what is the downside of any decision?
00:34:32.600 Everybody considers it.
00:34:34.680 Everybody considers the downside.
00:34:40.140 So...
00:34:40.620 Yeah, you know what?
00:34:42.140 I'm really sad that I didn't buy stock
00:34:44.520 in the vaccination companies.
00:34:47.520 I always thought I got into...
00:34:49.400 You know,
00:34:49.540 I would have been getting in too late.
00:34:52.440 But by the same token,
00:34:52.440 it would have bothered me.
00:34:54.660 Well, Ukraine says that...
00:34:56.780 Well, Ukraine doesn't say this.
00:34:58.340 But Russia says that they shot down
00:35:00.780 a Ukrainian drone
00:35:03.260 that was attacking deep into Russia.
00:35:06.780 And three Russians were killed
00:35:08.400 in Russian territory.
00:35:09.500 And how many of you thought
00:35:12.720 that Ukraine would attack
00:35:14.140 mainland Russia?
00:35:18.380 Because I've been predicting it
00:35:19.800 for a long time.
00:35:21.420 Yeah.
00:35:21.720 I don't see that there's any...
00:35:23.600 There's no way around it.
00:35:25.420 Yeah.
00:35:26.320 Ukraine has to put maximum pressure
00:35:28.320 on Russia.
00:35:29.660 It would be ridiculous
00:35:30.500 not to attack mainland Russia.
00:35:32.600 In fact,
00:35:34.020 I think they should attack Moscow.
00:35:36.820 That's what I think.
00:35:38.500 Because I think attacking Moscow
00:35:39.820 would reduce the...
00:35:41.740 Let me qualify what I said.
00:35:43.160 I only mean that
00:35:44.160 from military strategy.
00:35:46.060 I don't mean that
00:35:46.820 that would be moral
00:35:47.780 or ethical to do.
00:35:49.140 I don't want any civilians
00:35:50.560 in Moscow to die.
00:35:52.300 Right?
00:35:52.960 The Russian people are awesome.
00:35:55.280 This is only about Putin.
00:35:57.240 We like the Russian people.
00:35:59.340 We like them a lot.
00:36:00.520 And ideally,
00:36:02.760 I think we'll be allies
00:36:03.900 at some point.
00:36:05.280 Someday,
00:36:05.800 after Putin,
00:36:06.380 we'll be allies, I think.
00:36:07.820 So I don't want to kill
00:36:08.760 any Russians.
00:36:09.380 But if you're talking
00:36:09.920 about strategy,
00:36:11.760 I think that Russia
00:36:12.720 has now shown
00:36:13.420 they're not going to use nukes
00:36:14.660 because I feel like
00:36:16.920 it would have happened by now
00:36:17.920 or at least they'd be
00:36:18.840 threatening it more.
00:36:20.740 And Putin is weak
00:36:22.520 and a little bit more
00:36:24.800 domestic pressure on him
00:36:26.080 would probably be good.
00:36:26.960 And it seems to me
00:36:28.920 that attacking the mainland
00:36:30.280 of Russia
00:36:32.040 is probably just
00:36:32.640 good strategy.
00:36:34.200 But I don't know.
00:36:35.840 The other possibility
00:36:36.560 is that, you know,
00:36:38.840 ramps things up,
00:36:39.960 makes it worse.
00:36:40.980 But I think it makes sense
00:36:42.100 now when you can see
00:36:43.280 that Putin is weak,
00:36:45.280 the military is weak.
00:36:46.600 It wouldn't have made sense
00:36:47.600 if you thought
00:36:48.100 they were strong
00:36:48.760 because then they would say,
00:36:50.240 oh, we're going to
00:36:51.080 crush you twice as hard
00:36:52.820 as we were
00:36:53.460 because you did that.
00:36:54.460 You're charmingly delusional.
00:36:58.320 Well, fuck you.
00:36:59.460 How about telling me
00:37:00.260 what you disagree with,
00:37:01.860 you asshole?
00:37:04.120 Goodbye.
00:37:09.100 Scott has the MSM view
00:37:11.120 of Putin.
00:37:11.880 Asshole.
00:37:13.160 Good comment, asshole.
00:37:15.820 Any other?
00:37:16.380 I'll just get,
00:37:17.280 if you don't mind,
00:37:18.480 I'm going to get rid
00:37:19.020 of some assholes.
00:37:21.120 So I'm just
00:37:22.580 blocking the assholes.
00:37:23.900 Anybody else?
00:37:25.200 Anybody else want to say
00:37:26.180 like a ridiculous thing?
00:37:29.320 The two most
00:37:31.080 NPC comments
00:37:32.860 that I get
00:37:33.480 are that I did something
00:37:35.040 because I was afraid,
00:37:36.920 like I'm the only person
00:37:37.940 in the world
00:37:38.540 who figures out,
00:37:40.020 who looks at the costs
00:37:40.860 and the benefits,
00:37:42.520 or that you're
00:37:44.500 backing some team
00:37:46.200 because you said something,
00:37:48.420 or have some weird intention.
00:37:51.260 Those are all
00:37:51.740 the dumb comments.
00:37:53.900 No, I don't block
00:37:59.180 dissenters,
00:37:59.960 I block assholes.
00:38:01.620 Have you ever seen me
00:38:02.480 block somebody
00:38:03.060 just for disagreeing
00:38:04.060 with me on fact?
00:38:05.400 That's never happened.
00:38:06.860 Have you ever seen
00:38:07.540 anybody disagree
00:38:08.300 with me on opinion
00:38:09.360 and I blocked them?
00:38:11.580 Never happened.
00:38:12.660 No, I only block
00:38:13.280 the people who make
00:38:14.380 personal
00:38:15.040 and erroneous comments.
00:38:16.960 What's the tire pressure
00:38:19.900 on my BMW?
00:38:21.520 32 in the front?
00:38:23.740 Well, it depends
00:38:24.260 on the season.
00:38:33.340 Putin is not losing
00:38:34.500 in all caps.
00:38:36.940 Now, do you think
00:38:37.600 that's a Russian troll?
00:38:40.460 So somebody in all caps
00:38:42.340 is shouting,
00:38:43.660 Putin is not losing.
00:38:45.800 Does that sound like
00:38:46.540 a real person?
00:38:49.440 So all caps
00:38:50.560 shatter.
00:38:52.820 Oh, and here's
00:38:53.380 another one.
00:38:54.240 No, you were afraid.
00:38:56.540 Kalen.
00:38:57.840 Kalen.
00:38:59.020 No, you were afraid.
00:39:00.520 That's such a dumb comment
00:39:01.940 that I won't even comment
00:39:03.620 about the comments.
00:39:06.720 That's like the lowest
00:39:08.040 level of awareness.
00:39:11.400 Anybody else?
00:39:12.200 Anybody want to show
00:39:16.280 themselves for their
00:39:17.480 chew?
00:39:18.660 I'm having fun today
00:39:19.500 just getting rid of assholes.
00:39:29.620 Styx's name?
00:39:30.520 What?
00:39:31.940 Somebody says,
00:39:35.720 imagine listening to someone
00:39:36.860 who took gene therapy
00:39:38.100 for a cold.
00:39:40.980 Now, is that a real person?
00:39:43.080 Or is that just a troll?
00:39:47.180 Do you know who else
00:39:48.240 took gene therapy
00:39:49.240 for a cold?
00:39:50.220 The inventor of it.
00:39:54.180 Asshole.
00:39:55.580 Go on.
00:39:57.840 Does anybody have
00:39:58.760 any good comments?
00:40:01.940 There's an asshole.
00:40:11.780 Goodbye.
00:40:13.660 This is like,
00:40:14.300 you know,
00:40:14.700 this is weirdly satisfying.
00:40:16.920 You can't tell
00:40:17.600 on the locals platform.
00:40:20.420 But it's sort of satisfying
00:40:22.020 to block people
00:40:23.680 in real time.
00:40:24.500 It made everybody quiet.
00:40:27.580 All right.
00:40:35.660 Potential peace negotiations
00:40:37.360 at this point?
00:40:38.420 I don't think there will be
00:40:39.540 any peace negotiations
00:40:40.840 that are real.
00:40:44.400 So here's my prediction
00:40:45.620 on Ukraine and Russia.
00:40:47.600 2023,
00:40:48.840 the same.
00:40:50.340 Like,
00:40:50.620 there'll be some
00:40:51.080 back and forth.
00:40:52.420 There'll be more,
00:40:53.160 you know,
00:40:53.400 nuclear threats
00:40:54.300 and stuff like that.
00:40:55.320 But I think
00:40:55.740 at the end of 2023,
00:40:56.740 it's going to look
00:40:57.400 just like it looks now.
00:40:59.180 Two sides
00:41:00.040 shooting each other
00:41:01.500 and hoping the other
00:41:02.360 one gives up.
00:41:03.720 But I don't see it ending.
00:41:05.660 Does anybody see
00:41:06.580 a negotiated end?
00:41:08.140 Just because you and I
00:41:09.620 can imagine
00:41:10.440 what a deal
00:41:11.300 would look like,
00:41:12.540 I don't think
00:41:13.360 Putin can do it.
00:41:14.180 You need a birthday
00:41:30.420 shout-out?
00:41:33.580 Well,
00:41:34.060 you didn't put
00:41:34.480 your name there,
00:41:35.760 Mighty Mutt.
00:41:37.560 But happy birthday
00:41:38.300 anyway.
00:41:43.220 I blocked the
00:41:44.100 mind readers too.
00:41:46.320 Oh,
00:41:46.780 thank you.
00:41:47.700 Change your life.
00:41:49.580 That's good to know.
00:41:56.960 Negotiations start.
00:41:58.020 Yeah,
00:41:58.380 they'll start negotiating
00:41:59.440 but nothing will happen.
00:42:02.420 How do you really
00:42:03.380 negotiate with Putin
00:42:04.320 when you know
00:42:04.920 he's not going to
00:42:05.720 keep his...
00:42:07.380 Let's be...
00:42:10.320 I was going to have
00:42:11.300 like a trick ending
00:42:12.120 to that.
00:42:12.580 But I'll just say it straight out.
00:42:16.340 How in the world
00:42:17.000 can the United States
00:42:18.120 and Russia
00:42:18.800 negotiate anything
00:42:20.440 when both sides
00:42:22.460 cheat on everything?
00:42:24.160 Am I wrong?
00:42:25.420 Hasn't the United States
00:42:26.480 basically broken
00:42:27.280 all of its deals
00:42:28.220 and Russia has too?
00:42:31.520 Like,
00:42:31.760 I don't think
00:42:32.300 we're the good ones
00:42:33.000 on this,
00:42:33.400 are we?
00:42:34.300 I feel like
00:42:35.180 the same people
00:42:40.600 are doing something else.
00:42:43.920 Logan Paul
00:42:44.700 is the Scott Adams
00:42:45.700 of autonomous thought.
00:42:47.860 Okay.
00:42:55.240 I need to negotiate
00:42:56.280 with Biden,
00:42:56.980 yeah.
00:42:57.160 Yeah,
00:42:57.240 I don't see any way
00:43:04.320 that there's going to be
00:43:06.080 a negotiated settlement.
00:43:07.100 All right.
00:43:10.960 Here's my economic forecast
00:43:13.000 for 2023.
00:43:18.060 The press will call it
00:43:19.640 a mild,
00:43:20.660 mild recession.
00:43:23.900 Others will say
00:43:24.640 it's not mild.
00:43:26.040 All right,
00:43:26.220 so just remember
00:43:27.800 this exact thing,
00:43:29.620 that the press,
00:43:30.420 some of the press
00:43:31.120 will say it's mild
00:43:32.020 in the middle
00:43:32.620 of next year
00:43:33.240 and others will say
00:43:34.580 that's not mild,
00:43:35.480 look at this or that,
00:43:37.240 that's way worse
00:43:37.880 than mild.
00:43:39.240 But a lot of people
00:43:40.660 will call it mild
00:43:41.360 and I'm going
00:43:41.840 to call it mild.
00:43:43.660 Inflation will continue
00:43:44.740 but it'll start
00:43:46.760 to recede a little bit
00:43:48.280 so it'll be a little bit
00:43:49.320 better but not a lot.
00:43:53.060 We're going to be
00:43:53.840 disappointed
00:43:54.520 at how slowly
00:43:55.520 manufacturing
00:43:56.480 comes back to America.
00:43:58.500 Biden will
00:43:59.380 talk it up
00:44:01.300 but there won't
00:44:02.700 be much happening.
00:44:03.940 But a lot of people
00:44:04.420 will be making plans.
00:44:05.480 So I think we're moving
00:44:06.380 in the right direction
00:44:07.000 but there won't be
00:44:07.520 much happening.
00:44:09.860 And I think
00:44:10.460 the stock market
00:44:11.300 will be flat
00:44:13.540 until toward
00:44:15.700 the end of the year.
00:44:18.180 So I think
00:44:18.820 the stock market
00:44:19.780 will limp along
00:44:20.700 up and down
00:44:21.960 until the end
00:44:22.960 of next year
00:44:23.480 and then if nothing
00:44:24.980 new is on the horizon
00:44:26.420 it's going to look
00:44:27.800 like we got
00:44:28.360 through the recession.
00:44:30.080 So we'll be seeing
00:44:31.420 the end of the recession
00:44:32.340 by the end of this year,
00:44:33.580 the end of the coming year.
00:44:34.440 So that's my prediction.
00:44:40.880 Turn over my stream
00:44:42.140 to AI.
00:44:43.800 All right.
00:44:44.420 Oh, I need to make
00:44:45.280 another minor announcement.
00:44:47.340 So I had said
00:44:48.400 that I was going to
00:44:48.960 abandon YouTube
00:44:50.280 or at least
00:44:51.700 do a Rumble
00:44:52.660 in addition
00:44:53.160 for live streaming.
00:44:53.960 But I was looking
00:44:55.960 at the instructions
00:44:57.300 for live streaming
00:44:59.780 on Rumble
00:45:00.920 and there's 16 pages.
00:45:07.900 You know,
00:45:08.540 I mean,
00:45:08.760 it's big pictures.
00:45:09.520 That's why it's so many pages.
00:45:10.820 But there's many steps
00:45:12.120 involved
00:45:12.680 and this is only
00:45:14.560 for the Rumble software.
00:45:16.400 You have to use
00:45:17.060 a third-party software
00:45:18.180 that's much harder
00:45:19.680 than this.
00:45:20.980 Maybe 10 times
00:45:21.920 harder than this.
00:45:23.180 This just is a bunch
00:45:24.180 of steps
00:45:24.640 but it's not hard.
00:45:27.320 So it's not,
00:45:28.500 it's just not doable.
00:45:31.040 So I changed my plans.
00:45:33.920 I can't use Rumble
00:45:33.920 because you need
00:45:35.180 to use a third-party software,
00:45:36.980 OBS
00:45:37.460 or StreamYard
00:45:38.640 or something
00:45:39.080 and that adds a,
00:45:40.940 both of those
00:45:42.280 add a level
00:45:43.000 of complexity.
00:45:44.080 It's completely impractical.
00:45:48.000 So if Rumble
00:45:48.000 upgrades their
00:45:49.660 live stream option
00:45:50.560 so you can do it
00:45:51.720 without the third-party software
00:45:53.080 then I will
00:45:54.320 go there immediately.
00:45:56.620 But right now
00:45:57.560 it would,
00:45:58.260 it would add
00:45:59.560 probably an hour a day.
00:46:02.300 It would add
00:46:03.060 probably an hour a day
00:46:03.940 to my process.
00:46:05.380 It's a completely non-usable process.
00:46:09.660 Now it's still
00:46:10.500 on Rumble
00:46:11.000 recorded,
00:46:12.060 right?
00:46:12.280 So we've always
00:46:13.540 been uploading
00:46:14.740 to Rumble
00:46:15.300 the recorded show.
00:46:17.040 So you can still
00:46:17.880 see that.
00:46:19.960 An assistant
00:46:20.580 could configure it?
00:46:21.700 No.
00:46:22.420 No.
00:46:22.880 No, it's well beyond
00:46:24.020 practical.
00:46:27.660 Yeah,
00:46:27.960 it's not something
00:46:28.920 you could,
00:46:29.360 you could work with
00:46:30.200 because my assistant
00:46:31.680 would have to be here
00:46:32.420 in the room with me
00:46:33.280 at 4 a.m.
00:46:34.580 and, you know,
00:46:35.120 that's not gonna happen.
00:46:38.840 How long does it take?
00:46:40.680 So the two,
00:46:41.320 the two platforms
00:46:42.520 I'm streaming on now
00:46:43.560 are both
00:46:44.560 on an iPad.
00:46:46.880 So that's the first
00:46:47.720 problem with Rumble.
00:46:49.540 You need a laptop,
00:46:51.020 I think.
00:46:52.900 I might be wrong
00:46:53.840 about that,
00:46:54.260 but I think so.
00:46:57.260 And
00:46:57.700 all I have to do
00:47:00.580 is put it in the titles
00:47:01.600 and the description.
00:47:04.000 That's it.
00:47:05.200 And on YouTube
00:47:05.820 I post a picture, and that happens automatically.
00:47:08.980 That's it.
00:47:09.440 So it takes me
00:47:11.500 less than
00:47:13.600 one minute
00:47:14.480 to put up
00:47:16.240 two platforms.
00:47:17.220 But if I had Rumble
00:47:18.180 it would add an hour a day
00:47:19.460 because it would never work.
00:47:22.560 See,
00:47:23.020 when you go through
00:47:23.500 the third-party software
00:47:24.580 half the time
00:47:26.520 it doesn't work.
00:47:30.280 Scotty Appleseed,
00:47:31.640 ChiCom supporter.
00:47:33.120 There's somebody
00:47:33.740 out here
00:47:34.100 who's so clueless
00:47:35.200 they think that
00:47:36.740 I'm a China supporter.
00:47:40.860 How much of a troll
00:47:41.940 would you have to be
00:47:42.540 to think that?
00:47:48.320 Malone almost died
00:47:49.320 from the shot.
00:47:50.100 No, he didn't.
00:47:54.540 If Viva Frei
00:47:55.460 can do it,
00:47:56.100 why can't you?
00:47:57.640 I can do it too.
00:47:59.120 I just told you
00:47:59.760 I can do it.
00:48:00.900 It would take me
00:48:01.480 an hour a day.
00:48:02.120 Now, I don't know
00:48:04.140 what Viva Frei
00:48:04.960 has going on
00:48:06.500 besides live streaming
00:48:07.860 but you know
00:48:08.900 live streaming
00:48:09.400 isn't my main job.
00:48:11.400 It's just one thing
00:48:12.320 I do.
00:48:13.180 If it were my main job
00:48:14.540 oh, I'd do it.
00:48:16.080 If live streaming
00:48:17.280 is all I did
00:48:17.940 I'd do it.
00:48:20.040 Because then
00:48:20.440 an hour a day
00:48:21.020 wouldn't seem like so much.
00:48:22.360 But I don't have
00:48:22.800 an hour a day
00:48:23.320 I'm going to...
00:48:24.240 Now, the hour a day
00:48:25.040 is because it won't work.
00:48:27.200 Not because
00:48:28.060 that's how long
00:48:28.760 the process takes.
00:48:30.060 The hour is
00:48:30.860 because you try it
00:48:31.680 and it doesn't work
00:48:32.380 because of the
00:48:32.860 third-party software.
00:48:34.200 You don't know why.
00:48:35.560 You have to figure it out.
00:48:36.680 You have to reboot.
00:48:38.000 Then you've got to
00:48:38.440 reboot this one first
00:48:39.500 but maybe it was
00:48:40.460 the other one
00:48:40.880 that had to be up first.
00:48:42.380 Then you've got to
00:48:42.860 put in a new code
00:48:44.140 because you don't know
00:48:44.880 if the old code
00:48:45.540 wore out
00:48:46.220 or why isn't
00:48:47.140 this code working.
00:48:48.380 Is it my third-party
00:48:49.260 software
00:48:49.700 or is it Rumble?
00:48:51.520 It would be that
00:48:52.200 every day.
00:48:53.820 Whereas,
00:48:54.380 I just fire this up
00:48:55.360 in two minutes
00:48:55.900 and I'm done.
00:48:58.560 Yeah, reboot your router.
00:48:59.980 Yeah, all that stuff.
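For context on what that third-party step involves: tools like OBS or StreamYard ultimately push encoded video to an RTMP ingest URL using a per-stream key, which is the "code" that can stop working. Below is a minimal sketch of that step; the ingest URL and key are hypothetical placeholders, and ffmpeg stands in for the streaming software. This illustrates the general RTMP workflow, not Rumble's actual setup.

```python
# Sketch: what third-party streaming software does under the hood --
# push an encoded stream to an RTMP ingest endpoint with a stream key.
# INGEST_URL and STREAM_KEY are hypothetical placeholders, not real values.
import subprocess

INGEST_URL = "rtmp://ingest.example.com/live"  # hypothetical ingest endpoint
STREAM_KEY = "YOUR-STREAM-KEY"                 # per-stream secret; can expire

subprocess.run([
    "ffmpeg",
    "-re", "-i", "input.mp4",                  # read the source in real time
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "2500k",
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv",                               # RTMP carries an FLV container
    f"{INGEST_URL}/{STREAM_KEY}",
], check=True)
# Every extra moving part (key rotation, encoder settings, which app starts
# first) is a place this pipeline can fail before the show starts.
```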
00:49:01.680 What's going to happen
00:49:07.900 with Taiwan?
00:49:10.140 I'm going to say
00:49:11.100 that China
00:49:12.880 will not attack
00:49:14.140 Taiwan in 2023
00:49:15.560 but they certainly
00:49:18.420 do seem
00:49:19.100 bent on owning it.
00:49:20.500 it affects the quality
00:49:30.520 of your screen.
00:49:33.640 Does your doctor
00:49:34.600 recommend boosters?
00:49:36.220 No.
00:49:38.240 Well, let me be
00:49:39.140 more specific.
00:49:40.520 My doctor did ask me
00:49:42.320 if I had been boosted
00:49:43.420 and I said no
00:49:44.360 and she offered it, as sort of a thing that the HMO offers.
00:49:52.980 So it was made available.
00:49:54.780 My answer was
00:49:56.000 I don't believe
00:49:57.020 scientific studies
00:49:58.740 so no.
00:50:00.860 Zero pushback.
00:50:03.500 I just said
00:50:04.380 I don't believe
00:50:04.880 the studies
00:50:05.400 so there would be
00:50:07.540 no reason
00:50:07.900 to put it in my body
00:50:08.780 and she just said
00:50:09.960 okay
00:50:11.340 next topic
00:50:12.400 no pushback.
00:50:18.340 All right.
00:50:21.060 Would A-hole
00:50:21.960 get me demonetized?
00:50:23.740 Well, we'll find out.
00:50:24.780 Probably not.
00:50:25.320 I don't know
00:50:32.080 what that's about
00:50:32.720 but I'll hide you.
00:50:35.560 Was I vindicated?
00:50:38.080 You are vindicated?
00:50:39.480 Vindicated by what?
00:50:42.140 What vindicated me?
00:50:46.860 Why are you denying
00:50:47.900 the experts?
00:50:51.120 See,
00:50:52.000 all of my critics
00:50:53.600 are not actually
00:50:55.580 criticizing
00:50:56.320 anything that I did.
00:50:58.980 A hundred percent
00:50:59.960 of the vaccine critics
00:51:01.040 are just
00:51:01.440 they just have
00:51:02.000 wrong information
00:51:02.800 or they're
00:51:04.340 somehow illogical.
00:51:05.760 It's not even
00:51:06.740 about me.
00:51:07.980 It's like
00:51:08.320 I think I trigger
00:51:09.340 some feelings
00:51:10.700 in them
00:51:11.180 that they have
00:51:11.660 to work out.
00:51:12.380 It just doesn't
00:51:12.880 seem about me.
00:51:19.740 I think
00:51:20.500 Styx kind of
00:51:21.320 proves Scott's
00:51:22.100 point with that.
00:51:23.060 I don't know
00:51:23.340 what that is.
00:51:30.500 Right.
00:51:31.440 Yeah,
00:51:31.700 people believe
00:51:32.420 that because
00:51:33.000 I'm a public figure
00:51:34.100 that anything
00:51:35.700 I said
00:51:36.900 is interpreted
00:51:39.000 as supporting it
00:51:40.100 or not supporting it.
00:51:41.820 And you're right.
00:51:42.680 I was never
00:51:43.380 on the supporting it
00:51:44.340 or not supporting it.
00:51:47.040 Let me ask you this.
00:51:48.860 The people
00:51:49.360 who have been
00:51:49.620 watching me,
00:51:50.720 is there anything
00:51:51.520 I've said more
00:51:52.240 often than
00:51:52.960 don't get your
00:51:53.640 medical advice
00:51:55.760 from me?
00:51:56.680 Can you think
00:51:57.460 of anything
00:51:58.040 I've said more
00:51:58.660 often than
00:51:59.720 don't take any
00:52:00.640 advice from me
00:52:01.400 medically?
00:52:02.440 It's like a mantra.
00:52:05.080 And literally
00:52:05.780 on the,
00:52:06.520 I'll bet,
00:52:07.680 I'd be willing
00:52:08.220 to bet nobody
00:52:08.900 in the world
00:52:09.760 says that publicly
00:52:11.840 as often as I do.
00:52:13.160 I'll bet nobody
00:52:13.940 in the world
00:52:14.940 says don't take
00:52:16.380 my medical advice
00:52:17.360 more often
00:52:18.080 than I do.
00:52:18.560 And yet,
00:52:19.580 what is my
00:52:19.980 biggest criticism?
00:52:21.780 Why are you
00:52:22.240 giving medical advice?
00:52:25.020 That's the world
00:52:25.720 we live in.
00:52:29.220 When are we
00:52:29.820 going to accept
00:52:30.300 that there are
00:52:30.720 no experts?
00:52:31.360 Well,
00:52:31.640 no experts
00:52:32.140 you can trust,
00:52:32.940 that's for sure.
00:52:35.920 Scott,
00:52:36.360 would you have
00:52:36.840 gotten the jab
00:52:37.520 if you didn't
00:52:38.060 go to Greece?
00:52:38.840 It was Bora Bora
00:52:39.620 and,
00:52:40.760 oh,
00:52:41.580 it was Greece.
00:52:44.340 Oh,
00:52:44.700 you're right.
00:52:45.200 Yeah.
00:52:45.920 I think I conflated
00:52:47.160 the two trips.
00:52:48.200 Yeah,
00:52:48.360 I think Greece
00:52:48.820 was the first one.
00:52:49.520 You're right.
00:52:49.980 I've been saying
00:52:50.520 Bora Bora,
00:52:51.060 but that came after.
00:52:52.120 It was the same thing.
00:52:52.920 I had to be vaccinated
00:52:53.700 for both.
00:52:56.520 So the question was,
00:52:58.800 what was the question?
00:53:01.120 Oh,
00:53:01.560 if I didn't have to travel,
00:53:04.000 would I have gotten them?
00:53:06.360 Well, you probably know that I waited as long as possible, to get as much information about it as possible.
00:53:12.560 So I think, because I was doing a good job of socially distancing, I probably would have continued waiting, and I don't know how long I would have waited.
00:53:22.440 But at some point the information changed, right?
00:53:25.040 So if I had waited long enough, I would have seen the opinion about it change.
00:53:30.220 I don't know if I would have gone that far, because I feel like that would be complimenting myself too much, right?
00:53:36.360 I'd love to tell you that if I didn't have to take that trip, or want to take that trip, you know, I could have waited long enough to know that it was a bad idea, you know, gotten all the way to Omicron.
00:53:49.720 But I don't know.
00:53:50.600 I can't promise you that would have happened.
00:53:52.360 But maybe.
00:53:53.560 But it's well documented that I wanted to wait as long as I could before I did it.
00:54:00.600 So that's well documented, because I didn't trust it.
00:54:03.600 And why did I call it a pandemic of the unvaccinated?
00:54:12.120 How many times do I have to explain that?
00:54:15.600 You've seriously never heard me explain that in public, like, 15 times?
00:54:20.340 The rest of you have, right?
00:54:22.840 Do you need it again?
00:54:26.020 Well, I'll explain it to you.
00:54:28.760 I didn't say that.
00:54:30.740 So the first thing you have to do is quote me correctly, right?
00:54:34.760 If you quote me correctly, it's a little easier.
00:54:37.760 So what I said was "it feels like," right?
00:54:41.180 So once you got vaccinated, the day I got vaccinated, I could travel anywhere for the first time.
00:54:48.940 Right?
00:54:49.660 So for me, it was over, because the only thing that was bothering me was I couldn't travel.
00:54:56.060 Otherwise, you know, I mean, I didn't really have to wear a mask, because I didn't go to a job.
00:55:00.960 I didn't have to go anywhere.
00:55:02.480 I could have my groceries delivered.
00:55:05.700 So for me, the pandemic was over when I got my vaccination.
00:55:10.360 Now, some people have said that they disagreed with my opinion about how I felt.
00:55:24.840 Do you think that's fair?
00:55:27.220 Do you think that you should fact-check how I felt?
00:55:31.480 Because the tweet was how I felt.
00:55:36.280 Why are you fact-checking my feelings?
00:55:39.920 I felt like my problem was over.
00:55:43.380 And was I correct?
00:55:45.160 Was I correct that that was the end of the pandemic for me?
00:55:48.900 Basically, yes.
00:55:50.760 "Basically, selfish."
00:55:53.520 Are you late?
00:55:54.600 Never mind.
00:56:00.100 Yeah.
00:56:03.340 Yes, it's fair to disagree.
00:56:05.500 It's fair for you to have an opinion about yourself.
00:56:08.820 Of course.
00:56:10.100 But people fact-checked my opinion about myself.
00:56:13.560 How does that make sense?
00:56:14.560 Do I think Lex Fridman is a good fit for a Twitter executive?
00:56:25.620 Well, I don't know his management capabilities.
00:56:29.840 We know he's intellectually capable, but managing is a different skill.
00:56:36.720 And Paradigm is arguing my opinion.
00:56:39.240 So there's actually somebody who's so fucking stupid that they're arguing my opinion.
00:56:45.900 So this says, "The pandemic was not over for you."
00:56:50.140 You realize that it was an opinion about how it feels.
00:56:55.380 And you're literally fact-checking me on my internal feelings.
00:57:00.520 That's actually happening here.
00:57:03.880 If you say it wasn't over for you, do you think I would argue it?
00:57:07.620 Oh, no, because you'd be saying how you felt.
00:57:12.600 Well, I would, in fact, check your feelings.
00:57:16.960 Yeah, I said Dr. Shiva wants to run Twitter too.
00:57:24.500 "You're blocking people for calling you selfish now?"
00:57:27.860 Yes, because that's just a personal insult.
00:57:30.940 And goodbye.
00:57:32.680 You're gone.
00:57:37.620 All right.
00:57:46.360 Joel says they fact-check opinions for the gotcha factor.
00:57:48.980 Yeah.
00:57:49.880 Anybody else want to fact-check my personal opinion?
00:57:57.900 There is almost no news today, and there will probably be no news tomorrow.
00:58:02.500 We are so newsless that we're going to have to figure out something else to do if we're going to meet here every day.
00:58:09.560 But things will pick up at the beginning of the year.
00:58:14.280 Janie's on vacation?
00:58:18.040 Yes, I did know that ChatGPT can...
00:58:22.500 By the way, ChatGPT version 4, I understand, is available maybe this spring, or before spring, and it's 100 times more powerful than the existing one.
00:58:34.580 I'm not sure if you'll notice, because I think maybe it has to be 10,000 times better to, you know, like, really impress us.
00:58:48.840 All right.
00:58:52.640 Yes, I plan to be happy all week.
00:58:57.780 Did cursing at people sound like I wasn't happy?
00:59:00.420 I was totally happy when I was cursing out the trolls.
00:59:07.620 All right.
00:59:11.320 That is all I got for today.
00:59:13.120 YouTube, I'm going to say goodbye to you.
00:59:15.140 I will talk to you soon.
00:59:16.720 Bye for now.