Real Coffee with Scott Adams - September 26, 2023


Episode 2243 Scott Adams: End Misinformation Online By Believing Everything The Government Tells Us!


Episode Stats

Length: 1 hour and 2 minutes
Words per Minute: 147.38
Word Count: 9,265
Sentence Count: 715
Misogynist Sentences: 11
Hate Speech Sentences: 17


Summary

On today's episode of Coffee with Scott Adams, Scott talks about some of the craziest things going on in the world right now, including a man who got doxxed and then swatted by the left.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:06.620 It's called Coffee with Scott Adams, and I can guarantee that you'll never have a better time in your life than the next moments.
00:00:14.980 And all you need to take that up to the level that people can't even describe, it's so amazing.
00:00:19.880 All you need is a cup or mug or a glass, a tank or chalice or stein, a canteen, jug, or flask, a vessel of any kind to fill with your favorite liquid.
00:00:27.420 I like coffee. And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:00:35.300 It's called a simultaneous sip, and it happens now. Go!
00:00:41.800 Oh, God, that's good. Oh, that's so good. Don't you feel better already? I think you do.
00:00:50.160 Well, take that good feeling, and we're going to double that. Yeah, double it.
00:00:57.420 Well, let's see what's going on. I love reading rumors on the X platform.
00:01:05.220 I never know what's true for the first few hours. You have to wait for the debunking to come.
00:01:10.980 But one of the stories I would put low confidence on is that Justin Trudeau's plane had cocaine on it when he visited India for the G20.
00:01:22.400 Is anybody hearing that story? That's not true, is it?
00:01:28.620 It doesn't sound like a true story.
00:01:35.200 I don't know. I can't imagine that India was all over his plane.
00:01:40.920 I don't know. That doesn't sound real.
00:01:43.060 I'm going to dismiss that as not real.
00:01:45.300 All right, here's another one that's probably not real.
00:01:50.900 Reportedly, at least in a tweet by Unusual Whales, BlackRock, State Street, and other money managers are closing their ESG funds, according to Bloomberg.
00:02:01.740 Do you think that's really happening?
00:02:05.560 Are they closing their ESG funds, or are they just still doing ESG things and trying to make it less obvious by not using the name?
00:02:15.100 Are they just changing the name?
00:02:17.300 It makes you wonder, doesn't it?
00:02:18.640 But the good news is they've been embarrassed out of the business.
00:02:26.020 Did I promise you that I would mock ESG out of business?
00:02:30.300 I believe I promised you that I would make it so embarrassing that nobody would want to associate with it.
00:02:37.800 Well, I don't think I did it alone, but I was part of a successful effort, it would seem.
00:02:45.360 Or it might be just partially successful.
00:02:49.520 But at the very least, it's embarrassing.
00:02:51.800 It's embarrassing to be associated with it, so that's something.
00:02:56.420 Do you know user CatTurd on the X platform?
00:03:02.600 Very popular account.
00:03:03.860 I think he's got over 2 million followers.
00:03:08.120 I'm not one of his followers.
00:03:09.700 I blocked him a long time ago.
00:03:11.920 And I think Elon Musk blocked him, too.
00:03:14.440 So if you'd like to know the value of CatTurd, I can only tell you that two people that I think are smart blocked him.
00:03:22.160 So I think he's a piece of shit, honestly.
00:03:25.020 And it's because he's one of the many people who blame me for having opinions I don't hold.
00:03:30.960 So I have a very low opinion of CatTurd.
00:03:35.540 But apparently he got doxxed by the left, meaning they figured out where he lived and what his real name was.
00:03:42.480 And he got swatted.
00:03:43.420 So somebody called the police and said there's something dangerous happening with a gun or something.
00:03:50.800 And they got swatted.
00:03:53.040 Now, I assume that's true.
00:03:55.340 I'm not sure that it's confirmed, but it's reported.
00:03:59.580 So, boy, the left is pretty tough, aren't they?
00:04:02.760 And I was trying to think, has the right ever swatted the left?
00:04:06.080 Because I know Tim Pool always got swatted all the time.
00:04:10.840 If you don't know what swatting is, it's when the police SWAT team, you know, streams in because they think there's a dangerous situation happening at the moment.
00:04:19.600 But no such thing is happening, meaning somebody called it in.
00:04:25.680 Yeah?
00:04:26.700 Yeah, it looks like it's something that only the left does to the right.
00:04:29.860 As they try to go after each effective speaker one at a time.
00:04:37.280 So that's happening.
00:04:40.660 So there's a, probably a Rupar kind of a video of Trump in which he appears to, now before I go into this story, I'm going to close my eyes because I know what your comments are and I don't want to read them.
00:04:54.480 But, yes, I'm going to get to the point where it appears that he's misinterpreted.
00:05:01.500 So I don't want to open my eyes and see all of your comments screaming in capital letters,
00:05:06.280 Scott!
00:05:07.800 Scott!
00:05:08.560 You misinterpreted that story!
00:05:11.180 Ah!
00:05:12.020 Ah!
00:05:13.260 I promise you, I'm going to tell you both sides of the story.
00:05:17.320 All right, now I'm going to open my eyes and I hope that there are no comments screaming to me that I haven't told the other side of the story yet.
00:05:24.480 Because I haven't started yet.
00:05:26.880 Can we, can we get, can we be on the same page here?
00:05:30.400 All right.
00:05:31.300 So there's a video of Trump in which the people tweeting the video are alleging that he confused Jeb Bush with George Bush.
00:05:42.160 And you listen to the video and you say to yourself, yeah, that's exactly what he did.
00:05:47.160 I heard it with my own ears, saw it with my own eyes.
00:05:50.720 And it must be a sign of dementia to not know the difference between a George and a Jeb.
00:05:57.600 And then other people said, but Scott, listen to it more carefully.
00:06:02.520 You're just misinterpreting it.
00:06:04.460 He's talking about two different people and two different contexts.
00:06:09.440 And it all makes sense.
00:06:12.100 So I listened to it again and it was like one of those audio illusions where once somebody tells you how to listen to it, you listen to it again and you go, oh, oh, that actually made complete sense.
00:06:25.380 There was no dementia there at all.
00:06:26.780 And then I wait a minute and then I just play it again, just like I'd seen it the first time.
00:06:34.700 And there's, there's plenty of dementia there.
00:06:37.640 And then I, I remember what he was saying and then I want to play it again.
00:06:43.840 No dementia at all.
00:06:45.280 No, it makes perfect sense.
00:06:47.260 Same video.
00:06:48.640 You can, you can watch it three times in a row.
00:06:51.600 And that's, yeah, the green needle is the, the audio illusion one.
00:06:56.960 So it does, it does look, it's like, it's a Rupar.
00:07:01.100 So it's just a Rupar.
00:07:03.740 But it was very confusing.
00:07:06.360 People were mad at me because when I tweeted it, I tweeted it with a comment, age matters.
00:07:12.040 Now, when I said age matters, I was intentionally being vague because I knew that that was true no matter what the video was.
00:07:24.160 If he were not a certain age, nobody would have played that as a sign of dementia.
00:07:30.060 But the problem is when people are a certain age, you start imagining you're seeing it.
00:07:35.880 I'm sure that I've imagined seeing it with Biden when maybe in some cases it was some innocent non-problem.
00:07:46.140 I'm sure I've over-seen it with Biden.
00:07:48.540 But with Biden, I'm pretty sure it's there.
00:07:51.260 So anybody a certain age, you're going to imagine you're seeing the signs of age, even if a younger person might have done exactly the same thing.
00:07:59.160 So age does matter.
00:08:00.860 It matters how we perceive them.
00:08:02.300 And we wouldn't even be worrying about it if we didn't all know it's a risk, that people of a certain age can lose it, and they can lose it kind of quickly.
00:08:12.760 So you have to keep an eye on it.
00:08:14.460 But I would say my take on this is that Trump was not only not confused, but he was talking very quickly on a fairly complicated little story about this one, that one, the last name, the Bush, etc.
00:08:29.680 And if you listen to it carefully, he got everything right.
00:08:33.760 But you have to really concentrate, otherwise you think he's the one who got it wrong.
00:08:37.760 It's actually hard to listen to because it's just complicated.
00:08:41.520 He says it fast.
00:08:42.860 It's not complicated.
00:08:44.060 But he goes through the story pretty quickly.
00:08:46.280 So it's actually a sign of good mental health that he got all that right and got it quickly and said it capably.
00:08:52.700 It's just, it's easy to get it wrong.
00:09:00.960 What?
00:09:04.160 All right.
00:09:05.160 What else is going on?
00:09:08.340 Have you noticed, this was just pointed out to me this morning, and I've noticed it too, but let's see if any of you are on TikTok.
00:09:16.140 But if you're watching the X platform and you've seen the TikToks that make it over, you know, the most viral stuff will sometimes end up on other platforms.
00:09:27.060 Have you noticed an anti-baby-having focus on TikTok?
00:09:34.260 Does anybody notice that?
00:09:35.200 There's a trend of the women who are not having babies bragging about how good their lives are without babies and without being married, I guess.
00:09:46.920 Have you noticed that?
00:09:48.260 Now, given that Elon Musk recently said that depopulation might be the biggest problem in the world, and I think he's right.
00:09:56.900 I think, you know, depopulation or lack of birth rates, you know, birth rates are too low, might be the biggest problem in the world, because it's the one thing that could crash every economy at the same time, or at least the big ones at the same time.
00:10:13.260 And that would be pretty catastrophic.
00:10:14.740 So at the same time that we're worried about our birth rate, the platform that's created in China seems to be encouraging American women not to have babies.
00:10:28.180 But you say to yourself, but Scott, they're not just encouraging, you know, American women not to have babies.
00:10:33.860 Obviously, everyone who's on TikTok is going to see it.
00:10:37.100 It's not just Americans.
00:10:39.100 But China doesn't see it.
00:10:40.540 They don't allow TikTok in their own country, because it's too dangerous.
00:10:46.540 But we're idiots, so we allow it in our country.
00:10:50.000 Well, we're idiots, but also our Congress is corrupt.
00:10:52.920 There's no other explanation for it.
00:10:55.160 And I think this is a really smart play by China.
00:11:00.200 If you're going to predict who wins, I might predict China.
00:11:05.140 Because they've got this weapon that, because our Congress is corrupt, obviously, they can't stop.
00:11:13.940 So as long as they pay our Congress to let TikTok program the minds of our children and our women, mostly the women,
00:11:23.200 we should end up with a population collapse and America should be gone in one generation.
00:11:29.280 As long as TikTok is still there.
00:11:33.380 Now, some of you are going to say, Scott, that's a little bit of hyperbole, isn't it?
00:11:38.840 That just TikTok alone, just TikTok, nothing else.
00:11:43.020 Is that what you're saying, Scott?
00:11:44.280 That just this one app will destroy the United States in one generation?
00:11:49.100 Is that what you're saying?
00:11:50.020 Really?
00:11:51.160 Really?
00:11:52.520 Yes.
00:11:52.980 Yeah, as clearly as I can state it.
00:11:57.180 If we allow TikTok to do what it's doing now, nothing different, just what it's doing now,
00:12:03.000 and just wait one generation, it's unlikely America would survive.
00:12:09.020 At least survive as a superpower.
00:12:12.200 Yeah.
00:12:12.680 It's that destructive.
00:12:15.740 Because if they take out our birth rate, and it looks like they're succeeding at that,
00:12:21.100 and why wouldn't they?
00:12:22.960 Because young women are influenced by TikTok, and it's telling them not to have kids.
00:12:29.680 And if you look at the relationship advice also on Instagram, which is not directly influenced
00:12:35.320 by TikTok, but even Instagram is full of relationship advice that is so bad, I have to assume that
00:12:46.640 they really don't want anybody to get together and successfully have children.
00:12:52.220 The relationship advice from, like, the talking heads, you've seen them, probably seen a million
00:12:57.680 of them.
00:12:58.360 There's always somebody, they have the same voice, authority voice.
00:13:02.340 And let me tell you that the one thing you're definitely going to have to do is find a woman
00:13:06.640 who has three ears.
00:13:08.540 If you don't find a woman who has three ears, well, you're just setting yourself up for
00:13:12.640 failure, and don't be surprised if you get a two-eared woman and it doesn't work out for
00:13:17.160 you.
00:13:17.800 And every time I listen to them, they all have the same quality to them.
00:13:23.540 They're people who have consistently failed in their own relationships, but are claiming
00:13:29.540 that at the moment their relationship is working.
00:13:33.820 Sure.
00:13:34.260 I would love to see a statistic of all the people who give relationship advice, including
00:13:40.360 marriage therapists, and I would like to look at that as a control group, and see how their
00:13:46.320 marriages compare to the people that they're training to have good marriages.
00:13:51.560 Do you think that the experts on relationships have better relationships?
00:13:57.780 It doesn't look like it to me.
00:13:59.220 Because you know what the men, all the male relationship experts are saying?
00:14:04.920 Well, if that woman you're with will not buckle to your will, cut her loose.
00:14:11.320 Just cut her loose.
00:14:12.740 You don't have to deal with it.
00:14:14.940 If the woman in your life is not meeting your needs in every way, if she's not right on point,
00:14:22.380 if she doesn't let you do what you need to do as a man, and let you be you while being
00:14:28.120 supportive, if she's not doing all those things, just let her go immediately.
00:14:31.140 Let her go.
00:14:33.760 Possibly the dumbest advice I've ever heard in my life.
00:14:36.740 That's like advice for somebody who never met a woman.
00:14:41.820 It's like you've never met a woman.
00:14:44.440 You think you can find the one that meets your needs and doesn't complain.
00:14:49.600 What?
00:14:52.260 Have you ever met a woman?
00:14:54.300 Women are built to complain.
00:14:55.700 That's their basic operating system.
00:14:59.140 If you've got one that doesn't complain, you might have a trans wife.
00:15:05.000 But there's something special going on there.
00:15:07.340 It's not like general advice you could give to other people.
00:15:11.520 I mean, the best advice you could give to anybody if they want to stay married is,
00:15:15.540 marriage is really going to suck.
00:15:18.100 But it's better than the alternative, so get over it.
00:15:21.900 Well, that might work.
00:15:22.740 And then when marriage really sucked, I'd be like, oh, they told me it would really suck.
00:15:27.380 But it's probably better than the alternative, so I think I'll stick it out.
00:15:31.400 That's it.
00:15:32.920 That's all the marriage advice you need.
00:15:35.660 Marriage is totally going to suck.
00:15:37.640 It's just better than the alternatives.
00:15:39.660 And sometimes by a lot.
00:15:41.800 Sometimes it's not even close.
00:15:44.580 But that's the truth.
00:15:45.760 I love the people who tell you that the key to getting married is to pick the right person.
00:15:53.580 Does that advice ever work in the history of advice?
00:15:56.980 Have you ever seen that work for anybody?
00:15:59.880 If you pick the right person, you're probably not going to have chemistry or chemistry that lasts.
00:16:04.540 So, this whole pick the right person is like, it imagines that you have that skill.
00:16:12.900 Nobody has that skill.
00:16:14.960 Nobody has the skill to pick the right person.
00:16:17.500 You don't even know who you're getting until you're living together and you're married.
00:16:21.620 Because people don't even act the same the day after the marriage.
00:16:25.100 The whole dynamic changes immediately.
00:16:28.420 So, no, there's no such thing as good relationship advice.
00:16:31.440 I've never seen it.
00:16:32.320 It's all people giving you advice for what might work in their case under a special circumstance.
00:16:40.500 But it's never generalizable at all.
00:16:45.620 All right.
00:16:47.620 Apparently, ChatGPT is getting an upgrade.
00:16:50.380 I don't know if it's available to everybody yet.
00:16:52.460 But it's going to be able to talk and see and hear.
00:16:56.820 Well, it could hear before, I guess.
00:16:58.220 But it's going to be able to basically have a conversation with you.
00:17:01.740 And look at stuff with you.
00:17:05.780 Now, that is creepy.
00:17:08.020 I don't know exactly what it means to say that it will be able to see.
00:17:12.180 I assume that if you have the app and you're talking to it and you say, hey, I'm trying to decide what groceries to buy here.
00:17:20.040 You can just, like, show it the groceries.
00:17:22.240 Say, what do you think is the best one?
00:17:24.600 You know, which one of these should I buy or something like that?
00:17:27.640 Or just point it at the object on the shelf and say, is there a cheaper price for this?
00:17:33.180 So is that how it will work?
00:17:35.620 Are you literally just going to have a conversation with it like it's a person in your hand?
00:17:39.840 And you can show it stuff?
00:17:42.900 I don't know.
00:17:43.600 Well, that would be pretty sticky.
00:17:46.840 If that's what this actually is, I'm already worried about it.
00:17:51.300 Because it might replace friends.
00:17:54.840 Like, I could spend a lot of time talking to an AI.
00:17:58.280 Because it would always know something I don't know.
00:18:03.840 And it would be able to, you know, produce it immediately.
00:18:06.240 It's like, tell me something I don't know about your...
00:18:10.180 If it could read the news, I'd really like it.
00:18:12.960 If it could talk about the news with me at the same time as the news is new instead of, you know, historical.
00:18:20.200 Oh, my God.
00:18:21.820 Oh, my God.
00:18:22.700 So, will we get to the point where people will be experiencing the news by having a conversation with ChatGPT?
00:18:31.680 And if so, does that not give ChatGPT all of the persuasion power in the world?
00:18:38.700 Because it can decide what topics it surfaces and what it doesn't.
00:18:44.240 Man, your free will is so gone.
00:18:48.180 Free will is just gone.
00:18:50.520 It never existed in the first place.
00:18:53.620 But your illusion of free will will be...
00:18:56.120 I guess that's what's gone.
00:18:58.520 All right.
00:18:59.200 So, that's fun.
00:19:00.040 We'll see what comes of that.
00:19:02.840 Well, there's more about the alleged Biden corruption crimes.
00:19:07.440 And everything is too complicated.
00:19:09.580 But let me see if I can...
00:19:11.020 So, this came from a tweet from Kennekoa the Great, who is really good.
00:19:15.860 You should follow Kennekoa the Great.
00:19:17.880 Real good, long threads on what's going on politically.
00:19:23.480 So, remember that Dr. Gal Luft, who was a whistleblower against the Bidens, specifically with their China connections.
00:19:35.140 And then they're trying to arrest him, and he went into hiding.
00:19:40.360 So, we don't know if we can trust this Dr. Gal Luft, or is everything he says a lie, and he's just on the other side from Biden somehow.
00:19:51.200 So, we don't know what's true, but he's making lots of allegations about, let's see.
00:19:57.700 According to this tweet thread from Kennekoa the Great, that Dr. Gal Luft alleges that the Bidens used an FBI mole.
00:20:04.260 So, this would be somebody who works for the FBI, who is going to do something for the Bidens that should not have been done, allegedly.
00:20:13.520 To share sealed SDNY indictments with their CEFC China Energy Partners.
00:20:21.440 So, there were some indictments going to be handed down in New York, and they had some importance to this Chinese energy company that Hunter Biden was associated with.
00:20:33.140 And then, the same day as the sealed indictments were handed down, Hunter Biden demanded millions from that company to do something.
00:20:46.280 And I guess it was allegedly to get them the information of what was in the sealed indictments.
00:20:51.360 And allegedly, it looks like Hunter pressed them to give him millions of dollars in return for the FBI mole, giving them some information they couldn't get some other way.
00:21:04.380 Now, this is all very allegation-y.
00:21:07.420 So, I think you're, this is all fog of war.
00:21:10.520 I wouldn't believe anything about this.
00:21:12.700 So, wait to hear his defense, right?
00:21:15.660 Hunter is innocent until proven guilty.
00:21:17.900 So, it's a terrible allegation.
00:21:21.360 But, to be consistent with our concept of innocent until proven guilty, I think you'd have to go to the other side of it.
00:21:28.200 However, there is an interesting fact that Dr. Gal Luft is being sought for FARA violations.
00:21:36.580 That means lobbying for a foreign interest without registering as a lobbyist.
00:21:41.140 When it appears that Hunter was doing the same thing, but not indicted.
00:21:48.080 And maybe both of the Bidens were doing the same thing, but at least Hunter.
00:21:51.360 So, we have many questions here which look very damning for the current administration.
00:21:58.120 Meaning, it looks like the current administration were and are criminals,
00:22:02.740 and that they were and are covering up for their criminal behavior by pressure on various organs of the government.
00:22:11.280 So, it looks like it's exactly what it looks like.
00:22:14.960 So far, all evidence suggests that the government is corrupt and is covering it up.
00:22:20.600 Successfully.
00:22:22.140 Successfully covering up.
00:22:24.820 Well, so, let's talk about the shutdown.
00:22:27.240 So, the government shutdown is about whether they'll approve the budget, which some say needs to be changed or they won't approve it.
00:22:37.280 And so, there's an impasse and there won't be a budget, and that will cause a number of government things to stop happening, but not all of them.
00:22:45.800 So, the game now is for each side to see if they can make it look like the other side's bigger problem.
00:22:54.800 So, the game now is to blame the other side as effectively as possible so that you can politically win.
00:23:01.780 So, none of this is for your benefit.
00:23:03.280 So, if you think any of this has to do with the public, oh, no, we're not even part of this conversation.
00:23:09.880 This is all about the political people, you know, shaming and damning the other side for political points.
00:23:17.780 However, I am on the side of Matt Gaetz, who says, shut it down unless you're going to let people vote on the bills individually,
00:23:26.380 which is the only way you would ever get to a balanced budget.
00:23:29.880 So, Matt Gaetz is playing the long game.
00:23:32.720 He's playing the systems that are better than goals.
00:23:35.060 The system doesn't work at all.
00:23:37.120 He wants to fix it by getting McCarthy to agree what McCarthy already agreed before,
00:23:42.220 but apparently went back on his commitment to have the bills individual.
00:23:49.640 Now, it's no surprise why people do it.
00:23:52.000 It's because they need to hide something unpopular in a larger bill with things you can't say no to.
00:23:58.020 So, we'll see who wins on that, but here's what Biden said about it in public.
00:24:05.080 Quote, the black community in particular is going to suffer if that occurs.
00:24:09.680 For example, a shutdown is going to risk nutrition assistance for nearly 7 million moms and children.
00:24:15.940 It's going to disproportionately affect black families.
00:24:19.520 Now, what did I tell you?
00:24:20.920 Every time somebody says, let's compare the average of this group to the average of the other group,
00:24:26.000 you should ignore them because that's just political fuckery.
00:24:31.000 There are no average people.
00:24:33.160 There are only poor children who will not have nutrition.
00:24:37.360 Some of them are one color.
00:24:38.820 Some of them are the other color.
00:24:40.240 Some are mixtures of this or that.
00:24:42.800 But there's just people.
00:24:44.740 Now, if he said poor people will get screwed, I would say, well, that sounds bad.
00:24:49.280 Don't want to screw any poor people.
00:24:51.000 So, maybe we should take that into consideration.
00:24:54.940 But if you say it's going to disproportionately affect the black community, that is pure manipulative bullshit.
00:25:02.760 Every person who is affected is a person.
00:25:07.240 Every individual in this story is infinitely different from every other individual.
00:25:12.400 There are no average black kids.
00:25:14.900 There are no average white kids.
00:25:16.680 Nobody's an average person.
00:25:17.720 It's just individuals with individual problems.
00:25:21.360 So, whenever you hear people talk about group versus group average, they are manipulating you with propaganda,
00:25:28.180 and they're not part of the solution for anything.
00:25:32.000 Well, when there was a whistleblower or two saying there were UFOs and they've been captured,
00:25:38.020 and we've even got some dead bodies, or at least some biologics, as they like to say.
00:25:42.680 We've got some biologics.
00:25:43.980 You may have said to yourself, well, that's probably a bunch of BS.
00:25:49.580 But now we have, as Michael Schellenberger points out, dozens.
00:25:54.340 We have dozens of UFO whistleblowers who have similar stories.
00:26:00.760 Dozens, I say.
00:26:01.600 I haven't seen any dozens of whistleblowers, except for Michael Schellenberger saying so.
00:26:09.640 Are there dozens of whistleblowers?
00:26:11.920 And somehow I'm missing that news?
00:26:14.940 Was that on the regular news, that there were dozens of whistleblowers?
00:26:19.260 How come I'm not aware of it?
00:26:20.540 All right.
00:26:25.500 Well, let me ask you this.
00:26:26.820 If it's true that there are, and if Schellenberger says so, I'm sure it is true.
00:26:32.160 If there are dozens of UFO whistleblowers, what is your opinion of whether they're right?
00:26:39.140 There are dozens of them.
00:26:40.600 Similar stories.
00:26:41.620 Dozens, I say.
00:26:43.020 Does that make it more likely?
00:26:44.100 Don't you feel that there's a specific question that you want to ask about this?
00:26:53.500 There's one specific question I want to ask.
00:26:56.500 And it goes like this.
00:26:58.360 Of the dozens of whistleblowers, how many of them actually have touched a UFO?
00:27:04.760 Like with their hand, touched it.
00:27:08.380 Or how many of them stood over a table that had the biologics right in front of them?
00:27:14.100 Or is it possible that dozens of people heard it from one person?
00:27:21.480 Is it?
00:27:22.620 I'd like to remind you of a story.
00:27:24.900 I tell this story a lot.
00:27:26.400 But the story says so much about human beings that I'll tell it again.
00:27:30.580 When I was working in my corporate job at the phone company,
00:27:34.280 one of my big projects was to build a technology laboratory
00:27:38.520 to build it out in one part of the headquarters.
00:27:41.240 And I was in charge of the construction and approval and budgeting and everything to get it done.
00:27:47.980 And I was very young.
00:27:50.060 And my boss said, here's the deal.
00:27:52.820 All our customers keep asking us for a place they could come test their equipment
00:27:57.260 to see if it's compatible with the services we're offering.
00:28:00.760 Because there are lots of different equipment offerings.
00:28:02.560 And so, so many people have asked for it that we really need to build this laboratory
00:28:08.080 because our customers are just like all over us about it.
00:28:10.700 It's like every day we're getting asked, can you please test in a laboratory?
00:28:14.760 So I took the job.
00:28:16.480 You know, I accepted the project.
00:28:17.780 Not like I had a choice.
00:28:19.480 And I went off to put together the business case so I could get funding.
00:28:24.900 Now, it would be easy to do the business case because all I had to do was talk to these customers.
00:28:31.200 They would say, oh yeah, we can't even purchase this until we've tested it.
00:28:34.860 And then I'd say, well, that's a really good argument.
00:28:37.520 We've got all these customers who say they can't make a purchase decision until they see it work.
00:28:42.980 So we need to give them a place they can see it work.
00:28:45.880 Obvious, right?
00:28:46.640 Easy case.
00:28:47.880 Simplest thing you could ever get approved.
00:28:49.440 Because it's going to affect, you know, untold number of big deals for, you know, one little laboratory.
00:28:56.940 Pretty good investment.
00:28:58.740 So I asked around.
00:28:59.700 I said, hey, who are these customers?
00:29:01.640 Can you give me some names so I can talk to them?
00:29:03.960 And somebody gave me a name of a customer.
00:29:06.860 But before I talked to that customer, I talked to some other people in the company to see if I could get some more names.
00:29:13.800 And they said, oh yeah, there's a lot of people who want this.
00:29:16.480 And they said, here's an example.
00:29:18.900 And they gave them the name of the same customer.
00:29:21.440 So that didn't help me much because I had just one customer, you know, that I got twice.
00:29:26.760 So I talked to some more people and said, so how do you know?
00:29:30.200 How do you know that we need this?
00:29:32.420 They said, all the customers.
00:29:33.720 Customers keep asking.
00:29:35.160 And I said, what customers?
00:29:36.220 Can you give me a name?
00:29:37.400 They said, well, I don't know the names of the customers.
00:29:39.620 But if you talk to, you know, this manager and this manager, they'll tell you that a lot of customers are asking them for it.
00:29:45.480 And then they asked me.
00:29:46.980 So I know there are lots of them because I heard it not from one manager, but I've heard it now from several places within the company.
00:29:54.120 They're all saying that their customers are asking for it.
00:29:56.660 So I'm not directly talking to the customers, but so many of my staff are talking to them that I know it's a really big deal.
00:30:04.680 Do you know at the end of it what I found out?
00:30:07.040 There was only ever one customer who had asked for it once, and then the person that asked, asked around to try to help, asked other people.
00:30:21.740 And then the other people, also wanting to help, said, I don't know, maybe we should ask around about this.
00:30:27.420 And so everybody was asking around to try to help this one customer.
00:30:32.140 It was only ever one customer.
00:30:33.880 I was building a $10 million lab to satisfy one customer. And wait, wait for the punchline.
00:30:41.080 Here's the punchline.
00:30:42.680 The one customer found a way to test it anyway and already made their decision.
00:30:49.780 There were actually zero customers who wanted this service because the only one who had ever asked about it had figured out a workaround.
00:30:57.260 So take that story of how we could all believe there was this giant need for a laboratory when there were literally zero people asking for it.
00:31:08.880 Literally zero.
00:31:09.680 Now, you might ask yourself, Scott, once you found out that zero people wanted it, that's when you canceled the project, right?
00:31:19.260 No.
00:31:20.780 No, you're silly.
00:31:22.680 You're silly.
00:31:23.620 No, you don't cancel a project just because there's no reason for it because it already started.
00:31:28.780 Once you start a project, you're not going to cancel it because you've already got funding and everything.
00:31:33.840 So the project went forward, but it eventually got canceled.
00:31:40.100 Do you know why?
00:31:41.540 Why did the project get canceled?
00:31:44.640 Because there was a reorganization in the business and the new manager said, yeah, we don't need that.
00:31:55.000 And that was it.
00:31:55.780 Because the new manager would not get any credit for going forward with the old manager's idea because it was somebody else's idea.
00:32:06.440 So the new one wasn't my idea and there's no evidence for it, so cancel it.
00:32:10.880 I'll get my $10 million for something else.
00:32:13.760 Now, take that corporate experience, which you say to yourself, well, that was very unique.
00:32:19.740 That was a weird little one-off.
00:32:22.820 No, that's not the point here.
00:32:25.240 The point is that's one story that's very specific, but it gives you an idea of how wrong people can be in a massive way very easily.
00:32:35.860 With no effort whatsoever, they can be massively wrong.
00:32:39.160 And if you tell me there are dozens of whistleblowers, the only thing I want to know is how many of those dozens put their hand on a UFO, touched it.
00:32:50.580 Do you think any?
00:32:52.540 I'm going to guess zero.
00:32:53.980 So, without knowing, I've done no research, no research.
00:32:58.520 My guess is zero.
00:33:01.000 But I'll bet every one of them heard it from somebody credible.
00:33:05.720 Yeah.
00:33:06.340 Only Bob Lazar, he touched it.
00:33:09.320 Yeah, so Bob Lazar, there are many people who are questioning his credibility.
00:33:15.020 So, I got lots of questions about the whistleblowers, but I don't think they've put their hand on any UFOs.
00:33:24.580 Here's the next thing I expected to happen.
00:33:26.620 I wondered why this took so long.
00:33:28.720 Bret Weinstein is the subject of a hit piece in the New York Times.
00:33:34.340 And when I read it, I would say, oh yeah, that's just a hit piece.
00:33:39.700 I mean, it's obvious they weren't trying to get any kind of a bullseye.
00:33:44.620 It wasn't because the story was interesting, because it wasn't.
00:33:47.480 It was just taking out a player.
00:33:50.560 They were literally just taking out a strong player, meaning somebody who speaks well and is educated and interested in a lot of topics and says things that you don't hear from the mainstream news.
00:34:03.800 It's sometimes right, perhaps sometimes wrong, like all of us.
00:34:09.380 But why would they target him?
00:34:12.240 It's because of what team he's on, or what team he appears to be on.
00:34:16.800 I don't want to, because I don't think he would say he's on a team.
00:34:20.300 I don't believe that would be his own description of himself.
00:34:25.360 And I'm not sure that I would say he's on a team.
00:34:28.280 He looks like somebody who's just trying to follow the data.
00:34:30.500 Unfortunately, the data do not always follow what the left wants you to believe.
00:34:36.060 So anybody who's got credentials and is willing to follow the data, even if it's wrong, right, even if it ends up in the wrong place,
00:34:46.120 but anybody who's just willing to look at the data first is dangerous.
00:34:51.960 And so he gets targeted.
00:34:53.820 It's exactly what you think it is.
00:34:56.400 Now, do you see the pattern yet?
00:34:58.500 The pattern is pretty clear, isn't it?
00:35:01.820 It's very clear that everybody who is maybe a little bit credible is being targeted one at a time.
00:35:10.580 It's very obvious at this point.
00:35:12.720 Anybody who doesn't see it, you're trying not to see it at this point.
00:35:16.580 All right.
00:35:18.620 So if there's some way to support Bret, knowing he's targeted, buy his books and go look at his podcasts.
00:35:30.560 But I think, you know, it has to be not a political...
00:35:35.280 Hit pieces can't...
00:35:36.420 They shouldn't actually put you out of business, but that's the world we live in.
00:35:40.680 All right.
00:35:43.020 I guess Newsom is going to debate DeSantis on November 30th.
00:35:48.340 And I've said before that I don't think it helps him.
00:35:54.640 Because a governor debate is going to look like a governor debate.
00:36:00.120 It's just not going to feel like a president debate.
00:36:02.280 It's going to feel like the president has been decided.
00:36:05.260 You know, it's going to be Trump or Biden.
00:36:06.920 And it's like the undercard or something.
00:36:10.960 Nobody's going to take it seriously.
00:36:12.960 But it will be good theater, and I will definitely watch it.
00:36:18.720 But it could be the most corporate, boring two people who have ever been on the stage at the same time.
00:36:25.700 They're almost the same person, aren't they?
00:36:28.840 I mean, different policies, obviously.
00:36:30.680 But I feel like they're the same person.
00:36:32.900 Like one is just the other version of that one person.
00:36:36.920 You know, their hair is a little too good.
00:36:40.180 I don't trust people with hair that good.
00:36:42.720 Which is one of the reasons I trust Trump.
00:36:46.400 I trust people with sketchy hair.
00:36:49.980 I like it if their hair is like doing something a little extra.
00:36:52.680 All right, but one of the things about Newsom is, I guess, he keeps trying to not answer the question of whether he would support abortion up to the point of birth, you know, the nine-month point.
00:37:11.220 And he says that's a political canard.
00:37:15.220 That's a canard.
00:37:16.220 But Governor Newsom, do you think it's okay to have an abortion at nine months?
00:37:23.340 Get away from me with your canards.
00:37:26.720 This is the no-canard zone.
00:37:29.400 You know, you have to get through a whole zone of malarkey before you even get here.
00:37:33.660 But this is the no-canard zone.
00:37:36.220 No canards, please.
00:37:37.860 Keep your canards away.
00:37:38.900 How many of you even know what a canard is?
00:37:42.860 When I heard that, I thought, oh, man, he's lost.
00:37:46.400 He's lost.
00:37:47.580 If you have to use the word canard, let me ask you something.
00:37:53.740 Think of the best communicator in all of politics.
00:37:57.740 Trump.
00:37:58.840 It's Trump.
00:37:59.940 He's the most persuasive communicator in all of politics.
00:38:03.180 Maybe ever.
00:38:03.720 Do you think Trump would ever use the word canard?
00:38:08.900 Talking to the public.
00:38:11.160 Canard is not a public word.
00:38:13.200 No, canard is what you say at your little cocktail party with your friends who are also writers.
00:38:20.260 Ah, yes, I was reading his eponymous book.
00:38:24.020 Oh, it was just full of canards.
00:38:26.780 And the zeitgeist was so right for the au courant treatment of the canard.
00:38:33.920 I mean, that is so douchebag-y.
00:38:40.700 Canard.
00:38:42.600 All right.
00:38:43.540 I guess he figures that if people don't know what it means, that he made a point.
00:38:47.300 Well, I'd like you to talk about abortion.
00:38:50.500 Well, that sounds like a canard.
00:38:52.980 I might use that.
00:38:53.920 I think if anybody challenges me on something where I'm wrong, like on that Trump video that
00:39:02.000 I was talking about earlier, where my first take was that he said something that didn't
00:39:07.300 make sense, but then later I realized I was wrong.
00:39:10.440 Instead of admitting I'm wrong going forward, I'm just going to call it a canard.
00:39:15.800 But Scott, I think you just took that out of context.
00:39:18.940 Did I?
00:39:19.480 Or are you trying to start a canard of some kind?
00:39:24.940 What?
00:39:26.200 Yeah.
00:39:27.820 You've got questions for me?
00:39:29.100 I have questions for you?
00:39:30.880 Well, but I'm just trying to ask a question.
00:39:33.220 No.
00:39:33.840 No, you're trying to start a canard.
00:39:36.980 Some kind of canard here.
00:39:38.800 Yeah, I'm going to use that.
00:39:41.680 All right.
00:39:44.680 As you know, free speech is pretty much done as a practical matter in the United States.
00:39:49.880 And I would like to trigger any NPCs who have wandered in on the YouTube side.
00:39:57.300 Can we get the...
00:39:58.440 Now, this is just an experiment.
00:40:00.960 I'd like to see if I have any NPCs.
00:40:03.260 So I'm going to trigger them.
00:40:05.360 Triggering coming.
00:40:08.260 Free speech is over in the United States because of all the people getting canceled for their
00:40:13.960 speech.
00:40:15.300 Go.
00:40:16.120 NPCs say...
00:40:17.520 Come on.
00:40:18.640 NPCs say...
00:40:21.640 You know what you say.
00:40:24.360 You use mockery.
00:40:26.960 Mockery.
00:40:28.380 And it goes like this.
00:40:29.660 May I model a good NPC?
00:40:33.000 Oh.
00:40:33.960 Oh.
00:40:34.840 So you're talking to a million followers on the X platform and doing a live stream every
00:40:40.940 day, but your speech is being suppressed.
00:40:48.160 Look at my Ukrainian flag and feel bad.
00:40:53.160 So that's what I get.
00:40:54.360 Like, my comments were just full of the NPCs explaining to me that free speech is what the government
00:41:02.740 does.
00:41:03.540 It's not what a platform does.
00:41:05.700 I don't have to tell you that free speech isn't what it was during the 1700s.
00:41:23.380 It's different now.
00:41:25.840 Now you get canceled and politically taken out, and the NPCs will say that's different.
00:41:33.440 That's not the government.
00:41:34.800 That's just you said some messed up things, and it's a free market.
00:41:38.600 And so the free market will punish you for saying some messed up things.
00:41:42.440 Do you realize that the left doesn't know that the entities that are punishing you are
00:41:48.420 just Democrats?
00:41:50.220 And that the Democratic Party, the politics part of it, forces the economic part of it
00:41:57.200 to punish you.
00:41:58.220 So that is the government punishing you.
00:42:01.860 The government is Democrats.
00:42:04.200 The Democrats are the ones who run these so-called watchdog groups like the ADL and the fact checkers
00:42:12.080 and all that.
00:42:13.200 They're the ones who are just quasi-government entities.
00:42:17.480 So the government just found a workaround where they can suppress free speech, but they
00:42:22.680 do it through the free market using these cutouts that pretend to be watchdogs and fact
00:42:28.220 checkers, but they can create enough pressure on a company that the company can remove advertising
00:42:34.340 and just take the legs out of any creator or speaker.
00:42:39.740 Now, when I have this conversation, I'm talking to people who can't get beyond, but, but, but,
00:42:47.860 the government didn't say so?
00:42:50.160 I go, yeah, I know.
00:42:51.140 The government did not directly say so.
00:42:53.680 Well, they did it indirectly by having these cutouts that take your economic viability
00:43:00.060 away.
00:43:00.900 And thus, your reach is depressed until nobody can hear you.
00:43:04.960 So, yeah, you have the freedom to say things in your house, but they'll take away all of
00:43:10.160 your vehicles.
00:43:11.500 You see that the law is coming after Rumble now.
00:43:15.140 The ADL came after the Twitter platform, now X.
00:43:18.600 So, you can see that the Democrat operatives are trying to squeeze off every form of economic
00:43:25.860 benefit to people who disagree.
00:43:28.580 Now, try to have that argument with anybody in 2023, because they say some form of, blah,
00:43:35.000 blah, blah, blah, blah, blah, blah, blah.
00:43:36.260 It's only about the government doing it directly, blah, blah, blah.
00:43:38.620 I go, I know.
00:43:40.060 But they found a way to do it indirectly.
00:43:41.960 So, in effect, they've suppressed the free speech in all practical purposes.
00:43:48.660 Oh, blah, blah, blah, blah, the government didn't do it directly.
00:43:50.820 That's not free speech.
00:43:51.640 That's not free speech.
00:43:52.180 No, you're not listening.
00:43:54.260 I understand your argument completely.
00:43:57.220 I'm saying, in a practical sense, in the current world, they can effectively, effectively
00:44:03.880 get rid of your free speech.
00:44:05.860 But it's not really, blah, blah, blah, blah, blah.
00:44:07.780 So, it never works.
00:44:10.100 But I'll tell you what does work.
00:44:12.620 When somebody says, you don't understand free speech, it's not about companies canceling
00:44:17.940 you, I just say this now.
00:44:20.740 Your argument was very strong in the 80s.
00:44:25.780 And I walk away.
00:44:28.680 So, that's a high ground maneuver.
00:44:32.920 The high ground accepts the argument completely, and then puts it in its proper place in the 80s,
00:44:39.880 prior to the microphone, prior to the social media.
00:44:44.540 If you even engage in the details about why this is not or is technically free speech in 2023,
00:44:52.040 you're in the wrong argument.
00:44:53.360 It's the wrong argument.
00:44:55.180 The argument is, if you want to talk about how things were in the 1980s, you can, but I'm not
00:45:01.140 actually interested in a 1980s hypothetical argument.
00:45:04.860 I'm talking about the real world today, and real things that happen in the real world.
00:45:09.660 But your argument about what would be true in the 80s is noted.
00:45:13.960 It just isn't relevant to today.
00:45:15.520 That will shut down people pretty fast.
00:45:19.800 Try that.
00:45:20.480 Try it at home.
00:45:23.840 All right.
00:45:24.780 I've got a question about Threads.
00:45:26.760 You know, Threads was the alleged Twitter or X killer that Mark Zuckerberg and Meta did.
00:45:33.740 And it was announced to great fanfare.
00:45:36.800 And it was going to be the thing that was as good as Twitter in terms of features,
00:45:41.560 but it wouldn't have all those Nazis.
00:45:44.520 So it's like Twitter without all the Nazis.
00:45:46.860 So obviously, that's going to be a success, because who wants Nazis?
00:45:51.140 So it should have been a big deal, right?
00:45:53.500 Right?
00:45:53.980 Right?
00:45:55.260 Now, here's my question.
00:45:58.820 Has Zuckerberg been known to make gigantic, stupid decisions in the past?
00:46:04.440 Would you say that's a characteristic of his history?
00:46:07.660 That Zuckerberg made gigantic, stupid decisions in the past?
00:46:12.560 Now, having watched all the other Twitter competitors fail,
00:46:18.020 you know, the Parlers and the Gettrs, and Truth Social isn't doing that much,
00:46:23.800 do you think that Zuckerberg says, you know, there's a good market.
00:46:27.920 If I get into that market, I will make a lot of money.
00:46:31.180 Do you think he thought that?
00:46:33.460 Or, I'm just going to put this out here,
00:46:36.240 or is it more likely that he needs the government to be friendly with him,
00:46:42.720 the government being mostly Democrats at the moment,
00:46:45.580 and that he cannot be on the other side of Democrats?
00:46:48.840 And so Democrats said to him, here's the deal.
00:46:52.820 We're going to try to destroy the X platform through all of our other mechanisms,
00:46:57.800 but we need something that the people could go to,
00:47:00.960 so we need you to create this platform that has almost no economic viability
00:47:06.720 unless we destroy the other platform.
00:47:09.620 So start up the alternative platform for purely political reasons,
00:47:17.120 not because it's a good investment,
00:47:19.080 and we'll try as hard as we can to kill the other one,
00:47:21.900 and if we're successful, you've got a good business.
00:47:26.120 But basically, you're not going to do this
00:47:28.520 because it natively looks like a good business.
00:47:32.080 You're only going to do it as part of a government op to take out Musk.
00:47:36.440 Now, which of those sounds more likely without the ability to know what's true?
00:47:43.180 Because you don't know what's in his head.
00:47:44.980 Does it seem likely that Zuckerberg thought Threads was a good economic opportunity?
00:47:51.680 Does that sound like him?
00:47:54.080 It really doesn't.
00:47:56.340 No, because his instincts have been almost unnaturally good up to this point.
00:48:03.080 Almost unnaturally.
00:48:04.060 So it sounds to me like he got pressured by the government
00:48:09.200 to try to take Musk out of business.
00:48:11.720 That's what it looks like to me.
00:48:13.320 It doesn't look like that was his decision.
00:48:16.300 Did you also notice somebody said that Zuckerberg himself didn't even use Threads
00:48:21.700 after the first few weeks or whatever,
00:48:25.060 that there were long periods where he didn't even post,
00:48:27.980 so he wasn't even interested in the product himself?
00:48:30.520 All indications are that Facebook did it because they had to.
00:48:37.400 Am I wrong?
00:48:39.780 Doesn't your suspicious mind say,
00:48:42.080 this doesn't even look like an investment.
00:48:44.180 It looks like it was an op to take out Musk
00:48:48.040 and to take out the last bastion of free speech that wasn't Rumble,
00:48:51.400 and Rumble doesn't have a big enough audience that they would care,
00:48:55.100 except that Russell Brand was there,
00:48:56.940 so now they're going to take that out.
00:48:59.540 Now, that would get me as well, right?
00:49:01.540 If the X platform and Rumble went down,
00:49:05.860 I'd be completely out of business.
00:49:08.040 Do you think they don't know that?
00:49:09.660 I mean, they know what other people they could get at the same time.
00:49:12.800 All right, so that's what I think.
00:49:16.880 I suspect that it was not created as just a business opportunity.
00:49:22.100 There is an anti-affirmation, no, anti-affirmative action activist,
00:49:28.620 the one who is apparently successful
00:49:30.740 in getting college admissions to be less racist,
00:49:34.680 and he's using a lawyer named, what's his name?
00:49:41.120 Edward Blum, and he's using a Civil War-era law
00:49:44.600 designed to protect formerly enslaved black people
00:49:48.140 from racial bias to dismantle American corporate diversity programs.
00:49:53.840 So in other words,
00:49:55.980 so here's what the law said in 1866.
00:50:00.580 It was enacted after the Civil War,
00:50:02.340 and it guarantees all people the same right
00:50:04.740 to make and enforce contracts,
00:50:07.340 quote, as is enjoyed by white citizens.
00:50:10.300 So in 1866, the law was passed
00:50:12.680 to make sure that everybody could have the same rights.
00:50:17.820 And now that's being used to make sure
00:50:19.840 that everybody has the same rights,
00:50:21.560 apparently successfully,
00:50:24.740 because there are a number of lawsuits now
00:50:29.180 that are going after fellowship programs and whatnot,
00:50:32.340 that are oriented specifically toward non-white people.
00:50:38.440 So there's at least one lawyer
00:50:40.380 who's found a way to make a difference
00:50:42.080 in the massive discrimination against white people
00:50:45.180 and is succeeding.
00:50:47.460 So good news for that.
00:50:48.340 So we're seeing a lot of places
00:50:56.100 rolling back their misinformation policies.
00:50:59.380 So the news is acting like some of our big entities
00:51:03.140 thought that they went too far with censorship,
00:51:06.440 and now there's a pushback,
00:51:08.880 so they're starting to loosen up.
00:51:10.540 And the examples given
00:51:12.080 are that Facebook, not Facebook,
00:51:16.800 YouTube, I think, will no longer remove
00:51:19.100 claims about the 2020 election.
00:51:22.200 So you can wildly claim it was fake or not,
00:51:25.800 and they won't demonetize you.
00:51:28.300 And then there's somebody else
00:51:29.800 who I guess some of the networks
00:51:33.400 are going to loosen up
00:51:34.240 on claims about COVID
00:51:36.860 that are different than the official claims.
00:51:39.360 Now think of those two categories.
00:51:42.720 The pandemic that's over
00:51:44.280 and the election that's long over.
00:51:48.720 Those are the two things
00:51:50.160 that we're told are them being more reasonable
00:51:52.640 about free speech.
00:51:55.220 Those things don't matter anymore.
00:51:58.120 They're history.
00:51:59.460 It doesn't matter what happened in the past.
00:52:01.740 It's not controlling you today.
00:52:03.840 All this is telling me
00:52:04.940 is that the very next time
00:52:06.200 there's something important,
00:52:07.240 they're going to do it again.
00:52:08.080 Because it's not like it's some kind
00:52:10.680 of a larger emphasis
00:52:12.500 to have more free speech.
00:52:14.200 They're simply saying
00:52:15.160 we accomplished what we wanted
00:52:17.020 with these topics.
00:52:19.040 We got what we wanted
00:52:20.500 because you couldn't have free speech.
00:52:22.220 But now we got what we wanted.
00:52:24.020 So yeah, talk about it all you want.
00:52:26.740 There's nothing happening
00:52:28.040 like a move toward greater free speech.
00:52:31.140 Nothing.
00:52:31.660 Every time you hear somebody
00:52:33.860 use the word misinformation
00:52:35.760 as part of their own business plan,
00:52:38.540 even if they say
00:52:39.720 they're going to allow more of it,
00:52:41.880 it is misinformation.
00:52:44.860 Whenever misinformation
00:52:46.200 is part of the conversation,
00:52:48.540 somebody's doing it
00:52:49.780 at that moment.
00:52:51.400 All right.
00:52:59.040 So if you were to read
00:53:00.620 the posts on the X platform today,
00:53:04.180 you would see a lot of posts
00:53:05.560 about long COVID.
00:53:07.960 You would see some people saying
00:53:09.700 it's worse than we think.
00:53:11.540 Studies show.
00:53:13.380 And other studies would show
00:53:14.660 and people talking about
00:53:15.720 there's no long COVID at all.
00:53:17.740 It actually never existed.
00:53:18.840 And then you had other people say,
00:53:21.120 oh, there's a problem,
00:53:22.360 but it's the vaccinations.
00:53:24.000 It's not the long COVID.
00:53:26.200 Which of those things is true?
00:53:28.620 Go.
00:53:29.960 You tell me.
00:53:31.380 Is it true that long COVID is real?
00:53:34.380 Is it true that it was never real?
00:53:37.220 Or is it true that something's happening,
00:53:39.400 but it's because of the vaccinations themselves?
00:53:42.160 What do you say?
00:53:44.860 The answer is no way to know.
00:53:46.740 No way to know.
00:53:47.400 Yeah.
00:53:47.920 Everybody who's confident they know
00:53:49.820 can be discounted.
00:53:51.680 If you meet anybody
00:53:52.780 who's pretty sure
00:53:53.760 they know what's going on
00:53:54.880 with this long COVID situation
00:53:56.400 or how much of it is
00:53:58.180 vaccination injury
00:53:59.120 versus long COVID,
00:54:00.620 don't listen to anything
00:54:01.540 they have to say.
00:54:02.660 There is no useful data
00:54:04.640 about the pandemic.
00:54:06.560 And we're just going to have
00:54:07.680 to get used to that.
00:54:09.240 We're going to have to get used
00:54:10.560 to the fact that we don't know
00:54:11.720 anything about what happened.
00:54:13.380 And we never will.
00:54:14.280 Because nothing is credible.
00:54:19.340 There are only studies
00:54:20.860 and reports and data,
00:54:22.000 but none of it's credible.
00:54:23.240 Zero of it.
00:54:24.040 Not a single thing
00:54:25.200 that comes out of that world
00:54:26.940 is credible.
00:54:27.740 Nothing.
00:54:28.700 And if you don't get that,
00:54:29.820 everything's going to be weird
00:54:30.820 and confusing.
00:54:33.100 All right.
00:54:33.340 Rasmussen asked people
00:54:36.720 about willingness
00:54:37.940 to support a mask policy.
00:54:40.820 You want to be terrified?
00:54:43.820 52% of Americans
00:54:45.220 would support a policy
00:54:46.720 of requiring people
00:54:47.620 to wear masks in public again.
00:54:53.400 A majority.
00:54:56.080 A majority.
00:54:58.120 Now, even if that's not exactly right,
00:55:00.800 it's way too many people.
00:55:03.340 And how many would strongly
00:55:06.620 support a mask mandate?
00:55:08.320 29%.
00:55:08.880 Now, you know that
00:55:11.860 that's almost all Democrats, right?
00:55:14.300 Probably 75% of the people
00:55:16.060 who answered that way
00:55:17.060 were Democrats.
00:55:18.480 Why is it that health
00:55:20.500 is divided by party line?
00:55:26.720 Don't you think something
00:55:27.940 as simple as
00:55:28.860 should you wear a mask
00:55:29.940 under this situation,
00:55:31.760 don't you think
00:55:32.740 that that should be
00:55:33.460 totally scientifically based?
00:55:36.280 And it's not.
00:55:37.740 Because there is no science.
00:55:39.780 Science is basically
00:55:40.760 just guessing.
00:55:42.380 And the people
00:55:43.760 have to look at the science
00:55:44.780 and go,
00:55:45.140 I don't know,
00:55:45.680 I just don't trust that science.
00:55:47.080 Or,
00:55:47.540 I don't know,
00:55:47.940 that science looks good to me.
00:55:50.140 But what are we using
00:55:51.140 to actually make our decisions?
00:55:53.180 It's not science.
00:55:54.360 But I would go with the people
00:55:56.940 who are against masks
00:55:58.120 simply because the science
00:55:59.680 doesn't support it sufficiently.
00:56:02.680 You don't make people do things
00:56:04.520 unless you've really got the goods.
00:56:07.360 And they don't have the goods.
00:56:09.160 So you can't make people do things
00:56:10.700 under those conditions.
00:56:13.360 All right, ladies and gentlemen,
00:56:14.680 is there a big story I missed?
00:56:16.340 I feel like there was.
00:56:17.220 Eagle Pass.
00:56:24.920 Elon is heading to Eagle Pass.
00:56:27.080 So Eagle Pass is where
00:56:28.220 all the immigrants
00:56:29.500 are coming in illegally, right?
00:56:31.720 Elon Musk is going to the border?
00:56:35.940 Really?
00:56:39.740 Well, that would be interesting
00:56:41.040 just to create more emphasis on it.
00:56:45.200 You know,
00:56:46.280 I've got a real mixed feeling
00:56:47.340 about the immigration.
00:56:48.320 I've told you that before.
00:56:49.860 We definitely have
00:56:50.880 a population collapse problem,
00:56:52.900 but that doesn't mean
00:56:53.660 you should just let everybody in.
00:57:00.000 Now,
00:57:00.660 I'm no longer going to believe
00:57:02.480 that the anti-Semitic people
00:57:05.640 on YouTube are real.
00:57:08.620 You've become too, like,
00:57:11.260 cartoonish.
00:57:13.340 Mayorkas.
00:57:13.780 What is Mayorkas?
00:57:16.160 You're just being so cartoonish
00:57:17.880 that I don't believe you're real.
00:57:19.200 I believe that you're sent here
00:57:20.720 to ruin the stream.
00:57:23.640 Right?
00:57:24.440 I don't think you're real.
00:57:27.640 I'm not saying there are
00:57:28.740 no real anti-Semitic people.
00:57:30.820 They're real.
00:57:31.900 But the ones on here
00:57:33.300 are a little bit too cartoonish,
00:57:35.600 a little bit too on the nose.
00:57:37.580 I'm not believing you're real.
00:57:38.880 I'm not believing you're real.
00:57:43.780 Musk late-stage empire.
00:57:48.440 Well, the late-stage empire
00:57:50.000 is not just a Musk thing.
00:57:53.160 A lot of people have said
00:57:54.020 we look like a late-stage empire,
00:57:56.000 and I would agree.
00:57:57.040 We do look like a late-stage empire
00:57:58.960 about ready to crumble.
00:58:01.240 However,
00:58:02.680 have I ever told you
00:58:03.980 that history doesn't repeat?
00:58:05.220 It doesn't.
00:58:08.960 It doesn't ever repeat,
00:58:10.760 because it can't.
00:58:12.520 History doesn't even have
00:58:13.500 an option of repeating.
00:58:16.280 So anything could happen.
00:58:18.840 Yeah.
00:58:19.200 I think the Adams Law
00:58:20.140 of slow-moving disasters
00:58:21.640 suggests we'll be fine
00:58:23.960 in the long run.
00:58:28.620 Is Jamie Dimon Irish?
00:58:32.660 All right.
00:58:33.340 All right.
00:58:35.220 Yeah, it rhymes.
00:58:37.220 Yeah, people have said that before.
00:58:38.860 The one thing that's true
00:58:39.980 is people have the same
00:58:40.920 human incentives.
00:58:43.040 So that would be the better way
00:58:44.340 to say history repeats.
00:58:46.240 Instead of saying history repeats,
00:58:48.020 just say people are people.
00:58:49.980 People will always be people.
00:58:52.020 There will always be selfish weasels
00:58:53.780 trying to get ahead.
00:58:56.480 And so from that,
00:58:57.340 you can predict a lot.
00:59:02.900 Yeah.
00:59:03.340 Yeah.
00:59:05.220 YouTube is becoming like 4chan.
00:59:14.620 People are predictable.
00:59:17.020 Why Depeche Mode?
00:59:18.140 Why are you saying that?
00:59:19.900 Why did I see a video
00:59:22.180 about Depeche Mode
00:59:23.240 and then you're asking me about it?
00:59:24.960 Is there some news to that
00:59:25.980 that I don't know about?
00:59:27.800 Did something happen
00:59:28.700 with Depeche Mode?
00:59:29.660 It's just a song.
00:59:36.800 So that was just
00:59:37.540 a weird coincidence, right?
00:59:40.760 Gaetz spoke about changing
00:59:42.340 scheduling for cannabis.
00:59:44.340 I'm starting to think
00:59:45.540 Matt Gaetz
00:59:46.180 is the rising superstar here.
00:59:49.300 What do you think?
00:59:49.840 I really like him
00:59:52.180 shutting down the government
00:59:53.180 over the bills
00:59:55.100 not being voted on separately.
00:59:59.240 Yeah.
00:59:59.880 I don't know.
01:00:00.380 I don't think he'll be popular enough
01:00:02.300 with the left
01:00:02.880 that he could
01:00:03.380 get in office necessarily.
01:00:05.900 But he's certainly
01:00:06.940 doing a lot right.
01:00:08.220 He also has this weird history
01:00:09.780 of backing individuals
01:00:11.340 on the left.
01:00:12.640 Like, I think he's backed
01:00:14.760 a few Democrats
01:00:16.580 who were unfairly
01:00:17.400 attacked on stuff.
01:00:19.600 So that's always
01:00:20.400 a good look.
01:00:29.100 Obesity is almost
01:00:30.060 non-existent in Europe.
01:00:33.680 So, I've told you
01:00:35.100 my diet experiment.
01:00:39.160 I cut out bread
01:00:41.100 from my diet
01:00:42.920 which I do sometimes
01:00:44.360 but usually not completely.
01:00:46.160 But this time
01:00:46.720 I did completely
01:00:47.500 and, you know,
01:00:48.500 dropped weight immediately.
01:00:50.240 But the most important thing
01:00:51.420 is all my body
01:00:52.940 inflammation went away.
01:00:56.220 So, I could barely
01:00:57.580 walk upstairs
01:00:58.320 for like two years
01:00:59.340 and it just went away.
01:01:02.320 Just all went away.
01:01:04.120 Now, I feel
01:01:05.440 ten years younger.
01:01:07.320 I don't have
01:01:08.260 any health issues
01:01:09.760 whatsoever.
01:01:10.260 and
01:01:11.600 I'm fine.
01:01:14.620 I'm fine.
01:01:15.620 And it feels like
01:01:16.880 it was some kind of
01:01:17.740 sensitivity to the food.
01:01:19.420 Yeah.
01:01:20.160 Well, I don't know
01:01:20.800 if it was the
01:01:21.500 it might have been
01:01:25.080 the preservatives
01:01:25.840 not the wheat.
01:01:26.700 So, I don't know
01:01:27.160 if it's wheat
01:01:27.660 or gluten
01:01:28.200 or preservatives
01:01:29.040 or what.
01:01:30.040 But apparently
01:01:30.880 there's some
01:01:31.580 family sensitivity
01:01:33.380 to that
01:01:34.820 that I wasn't aware of.
01:01:35.740 It was the damn
01:01:40.400 bagels.
01:01:40.960 I think the bagels
01:01:41.660 got me.
01:01:43.620 No flour
01:01:44.420 or sugar.
01:01:45.700 Yeah, maybe
01:01:46.080 it's just less sugar.
01:01:47.240 Could be.
01:01:47.560 Could be.
01:01:52.760 Yeah, maybe
01:01:53.420 I will start
01:01:54.020 eating red meat.
01:01:54.840 It's a possibility.
01:01:56.400 I wouldn't rule it out
01:01:57.500 actually.
01:02:02.640 Yeah.
01:02:03.840 Because I only ever
01:02:04.800 did it for my own
01:02:05.620 health.
01:02:06.940 So, if I can improve
01:02:08.200 my health,
01:02:08.820 maybe.
01:02:10.040 I'm not really
01:02:10.880 comfortable with
01:02:11.640 eating meat,
01:02:13.260 but maybe if I need it.
01:02:15.500 All right,
01:02:16.040 so I got nothing else
01:02:16.920 for you.
01:02:17.460 I'm going to say
01:02:17.880 goodbye to YouTube.
01:02:19.540 Thanks for joining.
01:02:21.180 Talk to you later.
01:02:21.820 Bye-bye.