Real Coffee with Scott Adams - February 03, 2023


Episode 2008 Scott Adams: China Spy Balloon, Biden Crime Family, Marxist AI, Hypnotist vs Scientist


Episode Stats

Length

1 hour and 31 minutes

Words per Minute

141.45

Word Count

12,928

Sentence Count

1,002

Misogynist Sentences

11

Hate Speech Sentences

23


Summary

Julian Assange's case against the US government continues to drag on, but the possibility of it being dropped by the Biden administration is raising some eyebrows. The Chinese spy balloon is a thing, and it's a big deal.


Transcript

00:00:01.000 Good morning, everybody, and welcome to the highlight of human civilization, and possibly robots as well.
00:00:08.580 Now, I'd like to take this experience up to a level that no one has ever experienced before.
00:00:15.500 And if you'd like to join me, all you need is a cup or a mug or a glass, a tankard, a chalice or a stein, a canteen jug or a flask, a vessel of any kind.
00:00:25.480 Fill it with your favorite liquid. I like coffee.
00:00:27.560 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:00:31.680 It's called the Simultaneous Sip, and it happens now. Go.
00:00:37.820 Ah, that's good. That's good stuff.
00:00:42.860 So, I just saw a note that YouTube is adding a feature where you can add a guest host.
00:00:51.920 Locals doesn't have that feature, so I can't use it for my morning streams.
00:00:55.540 But if locals would get that feature, wouldn't that be cool?
00:01:00.840 I think this is another way that YouTube controls the narrative.
00:01:06.700 Here's how they control the narrative.
00:01:09.460 Suppose YouTube's new guest feature was so good that everybody who had a podcast wanted to use it,
00:01:17.360 and then that would allow YouTube to decide which podcasts are seen the most.
00:01:24.080 Because then they'd lock everybody in their ecosystem, and then they can control what you see.
00:01:29.620 And I'm thinking, I don't know if it's good for YouTube to have features that maybe other things don't have yet.
00:01:36.180 I think Instagram has that feature.
00:01:38.780 I haven't used it yet.
00:01:39.880 But I'd love to see that native on all of the live stream apps.
00:01:48.140 Here's an interesting development.
00:01:50.500 Let's see if you can predict.
00:01:52.460 Do you think that the charges against Julian Assange will be dropped by the Biden administration?
00:01:59.180 Go.
00:02:00.140 What's your prediction?
00:02:01.060 Now, apparently the charges were put on during the Trump administration, so that's your first hint.
00:02:08.360 So if Biden were to reverse it, he would be reversing something that happened under Trump,
00:02:14.180 which makes it more likely, right?
00:02:16.820 So he would be spring-loaded to do whatever Trump didn't do.
00:02:20.660 Not always, but there would be a bias there.
00:02:23.320 But I read this today, which I was not aware of.
00:02:28.400 Did you know that five, in the last month, so this is newish, five major media organizations,
00:02:35.640 including the New York Times, including The Guardian from Great Britain,
00:02:40.480 put out an open letter basically saying that the indictment of Assange sets a dangerous precedent,
00:02:47.340 because they had used his materials.
00:02:49.180 So five big entities that had used Assange's materials said,
00:02:54.380 this is trouble, because he did basically what we do.
00:02:59.980 He had materials that he wasn't supposed to have, but he published them.
00:03:04.800 And the New York Times, to their credit, to their credit,
00:03:09.340 says, that's what we do, because they do.
00:03:13.680 And here's my question.
00:03:16.320 Can the Biden administration resist the New York Times?
00:03:23.340 Because you know they're on the same team, right?
00:03:26.640 I don't think they can.
00:03:28.360 And I'm wondering if the New York Times is sort of the trial balloon.
00:03:34.000 You know, maybe it's a little bit testing whether they want to,
00:03:38.760 you know, the public is willing to drop the charges.
00:03:42.460 I don't think the New York Times does this without the Democrats knowing it and agreeing about it.
00:03:49.820 I feel like the fact that the New York Times was part of this open letter
00:03:53.500 suggests that Biden might be a little closer to reversing or dropping the charges
00:03:58.420 than it would seem on the surface.
00:04:01.060 I don't think I would bet on it yet.
00:04:05.080 That would be a little greater certainty than I hold.
00:04:08.200 But there's definitely a suggestion that there's a little foreshadowing here
00:04:13.040 that maybe the Democrats are getting flexible on this.
00:04:17.700 We'll see.
00:04:18.520 Just a prediction.
00:04:20.760 Here's a funny story.
00:04:21.860 Harvard's Kennedy School of Government.
00:04:25.640 They're going to shut down what was called
00:04:29.260 its misinformation research program.
00:04:32.780 The leader of the misinformation research program
00:04:35.640 was someone who was notably skeptical on the Hunter Biden laptop story.
00:04:43.360 Now, that's not the reason they're closing it down.
00:04:46.180 The reason is the person in charge was not a full professor,
00:04:49.040 and they have some kind of rule about that.
00:04:50.620 So, but how would you like to be paying for a Harvard education?
00:04:57.840 And you found out your kid signed up for the avoiding misinformation class,
00:05:03.720 and the professor for that class believed the most obvious misinformation of our time.
00:05:15.200 I wouldn't feel like I was getting my money's worth.
00:05:20.620 It would be sort of like taking a math class with somebody who can't do math.
00:05:26.580 Or taking a history class with somebody who was born yesterday or something.
00:05:30.940 I mean, it doesn't quite, doesn't quite fit.
00:05:34.200 Anyway, what about that Chinese spy balloon?
00:05:37.240 Is everybody watching the Chinese spy balloon that's over in Montana,
00:05:41.680 where we have missile silos, apparently?
00:05:46.540 Well, well, apparently the news is moving fast on this.
00:05:50.940 So, China first denied it was theirs.
00:05:54.180 It said, we got no spy balloon up there.
00:05:56.240 That's not ours.
00:05:57.020 But we'll look into it.
00:05:59.140 So, they looked into it.
00:06:00.460 And then they said, oh, wait.
00:06:02.100 It is Chinese, but it's not a spy ship.
00:06:06.060 It's not a spy ship.
00:06:08.080 A spy ship?
00:06:09.180 Are you crazy?
00:06:10.380 No.
00:06:10.780 It's a commercial craft.
00:06:14.000 It is Chinese.
00:06:15.120 It's a commercial craft that was doing some scientific stuff.
00:06:19.680 Scientific stuff.
00:06:20.380 But the problem is they're in a big old balloon, and they can't steer it because the weather
00:06:26.380 steers it.
00:06:27.780 They have limited steering.
00:06:29.460 But if they get in a big, you know, a weather system, the weather system can apparently move
00:06:35.520 them on top of our missile silos.
00:06:39.760 Weather does that.
00:06:41.120 I can't tell you how many times I've been in a weather balloon.
00:06:44.480 I think I'm going one place, and the next thing you know, I'm right over a missile silo.
00:06:50.380 And you'd be surprised how often that happens.
00:06:53.340 Right over a missile silo.
00:06:55.360 Nuclear silos.
00:06:57.140 Yeah.
00:06:58.080 And that was done by coincidence.
00:07:00.540 So, now we know it was just an accident.
00:07:02.680 It was just a commercial vehicle.
00:07:05.460 But presumably, it's...
00:07:08.280 Why do we not know if it's manned?
00:07:11.880 Manned?
00:07:12.900 You sexist bastard.
00:07:15.780 You mean personed.
00:07:17.240 Is it personed?
00:07:20.380 And I mean any kind of person, from pregnant women to pregnant men.
00:07:25.300 It could be anybody.
00:07:26.120 But the question is, is it manned or remote?
00:07:31.440 If it's personed, then probably we don't want to shoot it down right away.
00:07:37.280 That would be bad for them.
00:07:38.340 So, that would make sense.
00:07:40.880 You know, they're probably just monitoring its signals and seeing what it's doing.
00:07:45.900 But, I kind of wonder why we didn't shoot it down the minute China denied it was theirs.
00:07:54.260 Apparently, it would have been some...
00:07:55.840 You know, if their story is true, it would have been a terrible tragedy because it would be civilians.
00:08:01.520 But, just in terms of homeland security, if you see a giant balloon over your missile silos and it has Chinese writing on it, how long do you wait?
00:08:15.660 What is the proper time to wait before you shoot that down?
00:08:19.620 Well, I don't know.
00:08:23.000 I wouldn't have waited too long.
00:08:24.780 But, I suppose we made the right decision if it turns out it's a commercial craft.
00:08:30.620 Now, I don't know how...
00:08:31.660 Does any of this story sound like it tracks to you?
00:08:34.780 Do you believe it?
00:08:37.200 Like, none of it sounds true, does it?
00:08:39.040 Yeah.
00:08:39.220 So, let's agree that we're not going to believe anything about this story.
00:08:42.720 But, I will note...
00:08:44.140 I will note that we always seem to get reasonably good photographs of aircraft that do follow the laws of physics.
00:08:56.580 Have you ever noticed that?
00:08:58.100 If there's an aircraft that does follow the normal laws of physics, we usually can get a photo of it.
00:09:05.020 At least well enough to tell it's a Chinese balloon.
00:09:08.180 And, if it's one of those, like, glowing Tic Tacs that keep violating the rules of physics and all the pilots are seeing them, well, those you can't photograph.
00:09:20.700 Never.
00:09:21.760 You can never get a good photograph of those.
00:09:23.640 And, it's not just that they're violating the laws of physics, which they do, but even when they're just sort of sitting there or traveling at the same speed you are, you still can't get a photo of them.
00:09:35.240 But, it's just a weird coincidence that means nothing whatsoever.
00:09:41.860 So, Chinese spy balloon was not as fun as it should have been.
00:09:44.740 Unless we shoot it down, then it's going to get fun.
00:09:47.260 Or tragic.
00:09:49.100 Cybertruck is here.
00:09:49.900 I saw a little clip of Elon Musk in the Cybertruck with Jay Leno.
00:09:57.280 Jay Leno was driving, and he was doing his Jay Leno thing.
00:10:01.740 And, Jay Leno says, why does it need to be bulletproof?
00:10:06.000 And, the answer to that question is, like, the fastest answer to why Elon Musk is rich if you're not.
00:10:14.840 And, his answer was, it's super cool.
00:10:20.720 And, then, you know, then Jay Leno starts pushing on him again, you know, yeah, but, you know.
00:10:27.580 And, then Elon says the ultimate ending sentence.
00:10:31.540 He goes, if you had a choice of your truck being bulletproof or not bulletproof, which do you want?
00:10:37.320 And, then it's all over.
00:10:39.260 Yeah, I want my truck bulletproof.
00:10:42.000 Why?
00:10:42.380 I don't know.
00:10:44.460 It's super cool.
00:10:45.880 I don't really think it's going to protect me, but I definitely want the bulletproof one.
00:10:52.020 So, if you remember why Tesla is successful, and other car companies were not in those days,
00:10:59.800 is because he built his first electric car to be exciting.
00:11:05.020 Right?
00:11:05.260 The first Tesla, the Roadster, it was better than a regular car, because it had more pickup.
00:11:10.740 So, if you were a sports person, you didn't need any of that, did you?
00:11:15.780 Did you need an electric car that goes faster than your Porsche or whatever you're comparing it to?
00:11:21.100 Not really.
00:11:21.700 But, do you want an electric car that goes faster than everybody else's car and looks kind of cool?
00:11:28.780 Yes, you do.
00:11:29.980 Yes, you do.
00:11:31.080 And, that's what makes Tesla successful.
00:11:34.400 Does, does today's Tesla car need a bioweapon defense?
00:11:43.460 No, no, it doesn't.
00:11:44.680 It doesn't.
00:11:45.080 But, if you had a choice of having a car with a bioweapon defense, and one that doesn't, which one do you want?
00:11:54.640 So, I think this is something that Musk gets right every time.
00:12:00.100 He always calculates in the excitement, the irrational, you know, desire, your cravings.
00:12:08.120 Like, he just puts the cravings into the product.
00:12:10.380 If you don't do that, you don't really understand product development, and he does.
00:12:16.080 Well, I believe I've solved a mystery that I've been wondering about for some time.
00:12:21.100 I saw a clip in which Joe Rogan was talking to Alex Jones, and Joe was saying, Alex Jones, everybody says he's crazy, but he keeps getting a bunch of stuff right.
00:12:31.120 For example, the example given by Joe Rogan was that Alex Jones was talking about elite pedophile islands, even before we knew everything about Epstein.
00:12:46.120 And, and then Alex Jones described why he knew so early.
00:12:52.600 And he said he had a source, this FBI agent named Gunderson.
00:12:58.060 How many of you have heard of this FBI agent named Gunderson, who Alex Jones talked to?
00:13:03.140 I guess he knew him.
00:13:04.580 And that agent told him, absolutely, absolutely, there are pedophile rings.
00:13:11.720 So most of you have heard that, right?
00:13:17.400 A lot of people on the right have heard it, yeah.
00:13:20.000 Now, are you aware that Gunderson is, among other things, he is famous for being an investigator on the McMartin preschool case?
00:13:29.880 Have you ever heard of the McMartin preschool case?
00:13:32.280 The McMartin preschool case that Gunderson was an investigator on was where the kids said that they had been taken to underground, well, in the basement of the preschool.
00:13:48.600 They said that they had been taken to the basement of the preschool, and they were part of satanic rituals and abuse.
00:13:54.860 And then Gunderson investigated, and the evidence that it was true was that a whole bunch of kids independently said it was true.
00:14:09.800 When they were interviewed independently, they confirmed that there was this massive pedophile thing going on.
00:14:19.360 Have you heard this story?
00:14:20.920 Now, how could it not be true?
00:14:23.360 Seriously.
00:14:23.780 How could it not be true if the kids individually were interviewed and individually had similar stories about a basement?
00:14:33.720 So they dug the basement, you know, they dug a hole and found out there was no basement.
00:14:39.740 There was no basement.
00:14:41.540 Do you know how that's explained?
00:14:43.680 Turns out that children will imagine anything you tell them to, and they will lose the distinction fairly quickly at that age.
00:14:51.600 They'll lose the distinction between reality and the thing they just made up, if you work on them a little bit.
00:14:58.360 And when investigators played back the interviews with the children, you could see that they were leading the witness.
00:15:04.880 Instead of saying, can you tell me if anything happened, which might invite them to say something happened in the basement, they say stuff like, and I'm just making this up, but it would be like this.
00:15:16.440 Did anybody ever take you to the basement, did anybody ever take you to the basement and perform, like, weird things?
00:15:21.860 And then the kid will say, no.
00:15:25.880 Well, you know, the other kids say that they were taken to the basement and weird things happened.
00:15:31.400 Did anything like that happen to you?
00:15:33.480 No.
00:15:33.920 All right, I'm going to ask you again.
00:15:37.580 Did anything happen to you in the basement?
00:15:40.180 Because the other kids say it did, and they say they were wearing costumes and you were hurt in a ceremony.
00:15:47.420 Nothing like that happened?
00:15:49.300 Well, maybe something like that.
00:15:53.020 Right?
00:15:53.380 So the kid will start to imagine that it really happened if you ask the right questions.
00:15:57.740 That's a well-understood phenomenon with children.
00:15:59.700 So apparently, Alex Jones talked to Gunderson.
00:16:06.360 Gunderson is famous for being fooled by children's testimonies.
00:16:11.920 And his evidence that there's a worldwide ring is children's testimonies.
00:16:17.520 Now, I don't know that he has other evidence, but, you know, now we have Epstein's Island, so that's clearly true.
00:16:22.560 So, if you believe what Joe Rogan said about Alex Jones, that Alex Jones was onto this pedophile ring story early,
00:16:31.820 you should know that the person he got that from is famous for being the most fooled person in the FBI.
00:16:40.420 The most wrong person in the FBI.
00:16:42.740 Famous.
00:16:43.620 It's actually a famous story of how wrong he was.
00:16:45.900 And confirmed wrong.
00:16:46.960 Confirmed wrong because there was no basement.
00:16:51.540 And there was no evidence ever was found that any of it was true.
00:16:56.540 Now, does that mean there is no elite pedophile ring?
00:17:00.700 Well, Epstein was real.
00:17:03.040 Epstein's real.
00:17:04.080 And other people who are smart say there's always been one.
00:17:08.900 There's always been one everywhere.
00:17:10.600 Because every government probably gets entrapped that way.
00:17:14.440 So, there's always somebody who's trying to entrap and blackmail everybody.
00:17:21.760 So, over time, you always have some kind of that thing going on in any government.
00:17:27.780 You know, from historical till today.
00:17:30.340 Because as long as blackmail works, people are going to be trying to get people to do things with people they shouldn't be doing things with.
00:17:39.440 So, there's certainly some truth to it.
00:17:42.760 But if you believed it because Alex Jones talked to the guy who's famous for being the most wrong on this topic.
00:17:52.140 Right?
00:17:52.800 If you look at Wikipedia, you'll see he's famous for being the most wrong on this topic.
00:17:58.240 Which doesn't mean that it's not true.
00:18:01.760 Doesn't.
00:18:02.900 Epstein is not a pedo.
00:18:05.660 Technically correct.
00:18:07.500 Technically correct.
00:18:08.280 But not a distinction that's important for political blackmail.
00:18:12.500 Right?
00:18:12.880 If somebody's under 18, that's sort of all you need for political blackmail.
00:18:17.620 But it is true that there are different names for what age the adult is interested in.
00:18:23.200 Technically true.
00:18:24.200 But not an important distinction for blackmail.
00:18:28.020 All right.
00:18:31.020 All right.
00:18:32.080 The funniest hashtag on Twitter is fertility spelled with a PF.
00:18:40.560 Like Pfizer.
00:18:42.500 Pfizer is PF.
00:18:44.880 And so the hashtag is fertility spelled with a PF in the front.
00:18:48.700 And this is based on the Project Veritas video of the Pfizer employee talking about concerns about pregnant women and fertility.
00:19:04.320 So within Pfizer, we know that at least one person was talking about fertility problems and they were concerned.
00:19:12.360 I don't know that, you know, who knows how big a deal that is yet.
00:19:17.780 But something to worry about, for sure.
00:19:19.880 All right.
00:19:28.560 I'm going to save this point until later.
00:19:31.420 So at the end, I'm going to tell you some things you didn't know about COVID.
00:19:36.920 But I know some of you don't want to have anything to do with that, so we'll save that to the end so you can bail out after the other stuff.
00:19:45.200 All right.
00:19:45.360 Rasmussen has a poll that says that even Democrat voters, only 63% of them, think Harris should be Biden's running mate again.
00:19:53.700 That is way higher than I expected.
00:19:56.780 Did you think that 63%?
00:20:00.200 Let's go private over here on.
00:20:05.260 All right.
00:20:05.700 We're private on locals now.
00:20:07.920 Did you think that two-thirds of Democrats thought Harris was good enough to be the vice president again?
00:20:17.660 That's weird.
00:20:18.760 It's just bigger than I expected.
00:20:21.020 You know, obviously, Republicans say she's a bad idea.
00:20:24.020 Blah, blah.
00:20:24.720 All right.
00:20:25.360 Did you see the story about Ilhan Omar?
00:20:28.860 She was kicked off her committee by the mean old Republicans who are in charge of the House now.
00:20:34.480 And the squad got on and they yelled and cried and trashed around.
00:20:42.560 And the fun part of this story is Tucker Carlson's characterization of it.
00:20:53.140 He said, quote,
00:20:53.820 The theater kids completely lost emotional control, which is sort of what it looked like.
00:20:59.780 And they said, it really is the party of weak men and angry women.
00:21:03.560 Now, did I say that first?
00:21:10.260 Because I've been telling you for a while that the Democrats are the party of women and weak men who support them.
00:21:19.820 I've never heard Tucker say it until now.
00:21:25.500 Yeah.
00:21:25.800 But it's kind of obvious now, isn't it?
00:21:29.100 Isn't it sort of hard to refute?
00:21:32.300 I mean, it's very clearly the Democrats, the women in the Democratic Party get what they want, and the men go along with it.
00:21:39.580 I mean, it's very clearly what's happening.
00:21:41.020 So even though there seem to be a lot of men in charge on the Republican side, they can't do anything the women don't want.
00:21:49.160 So they're very much under their control.
00:21:54.760 Yeah.
00:21:55.200 Well, it's interesting that that idea is spreading.
00:21:59.600 Ron Klain said that the Biden plan is working because it looks like the recession might be less than some people predicted.
00:22:08.140 What do you think of that statement?
00:22:09.500 And what does it remind you of?
00:22:13.340 All right.
00:22:13.760 So the person in charge says that they're going to do better than predicted, so therefore they did a good job.
00:22:21.520 If it's better than predicted, that's a good job, right?
00:22:25.700 Who can argue with if it's better than predicted, it must be a good job?
00:22:31.760 You all agree with that, right?
00:22:33.520 Because that's how we measure everything.
00:22:35.040 In business, in business, you predict what your budget and your expenses and stuff will be.
00:22:41.260 And then your performance review, correct me if I'm wrong, your performance review is compared to what people thought would happen.
00:22:50.340 And if you did better than people thought would happen, you'd get a big raise, maybe.
00:22:54.200 And if you only did what people expected, well, then you get, you know, maybe whatever's the normal compensation.
00:23:00.280 And if you did worse than what people expected, you might get fired.
00:23:03.220 What does this remind you of?
00:23:08.820 Maybe a Dilbert comic?
00:23:12.680 Because I tweeted back to Ron Klain and said, you know, his theory was, quote, that, he said,
00:23:22.760 I hate to break it to the haters, but the Biden plan is working, meaning his economic plan.
00:23:27.180 And then I tweeted, or, experts are bad at predicting.
00:23:36.700 Okay, I found this on the web.
00:23:38.920 What is that? It said, I hate to break it to the haters, but the Biden plan is working in his...
00:23:43.180 All right, shut up.
00:23:51.580 All right.
00:23:52.140 So, where was I?
00:23:57.140 All right.
00:23:57.480 So, let me ask you this.
00:24:01.080 Who is making the predictions?
00:24:03.980 Who makes the predictions?
00:24:06.980 Well, it's a lot of individual people, right?
00:24:10.740 CNBC, lots of predictions.
00:24:12.760 Everybody's making predictions.
00:24:14.600 So, why do we think the predictions are good,
00:24:18.340 and therefore our performance can be measured against the predictions?
00:24:22.540 Why would we think the predictions are good?
00:24:26.000 What possible justification would we have for imagining that some group of people can see the future?
00:24:34.900 Nobody knows what's going to happen next year.
00:24:37.120 So, if you compare somebody's performance to what happens next year,
00:24:41.480 that is purely absurd.
00:24:43.720 It's a pure absurdity.
00:24:46.220 Let me tell you how companies predict how they'll do next year.
00:24:50.300 All right, I'm the CEO, so I ask all of my managers,
00:24:56.220 tell me what you think you can do next year in your line of business.
00:25:00.600 How much can you, you know, how much do you think you can boost sales?
00:25:04.560 What do the people say?
00:25:06.900 So, the CEO asks the managers, what can you do next year?
00:25:10.840 Do the managers say, you know, honestly, I think I'm going to kill it next year.
00:25:15.200 I am going to slay it next year.
00:25:18.680 Does anybody say that?
00:25:21.800 Even if they believe it.
00:25:24.100 Does anybody say that?
00:25:26.420 No.
00:25:28.380 Nobody's smart?
00:25:29.960 No.
00:25:30.900 Everybody who's giving a prediction knows that their performance will be measured against the prediction.
00:25:35.980 So, everybody gives a low prediction.
00:25:38.480 Everybody.
00:25:38.860 If you've never worked in a corporation, this isn't obvious.
00:25:43.580 But I used to do budget planning.
00:25:48.080 Do you think that the people who ask for a budget ask for exactly just enough?
00:25:54.540 No.
00:25:55.400 No.
00:25:56.120 Every single person asks for, I don't know, 20 or 30 percent more than they need,
00:26:01.660 because they know it will be cut.
00:26:03.660 And at the very best, they can say, well, we didn't spend it all.
00:26:06.080 I did a great job.
00:26:07.720 So, the game is not the performance.
00:26:10.780 The game, that's not how you win the game.
00:26:12.580 The game is how you set the expectations.
00:26:15.380 That's the game.
00:26:16.540 So, that's all gamed.
00:26:18.580 So, where the expectations are and the predictions, that's all gamified.
00:26:22.360 That's all the managers trying to make it sound as low as possible.
00:26:25.980 So, if they exceed it, they can say, look at me.
00:26:28.840 Look at me.
00:26:29.400 Or, as Ron Klain said, but the Biden plan is working.
00:26:33.640 Haters?
00:26:34.000 It's, yeah, no, it's ridiculous.
00:26:36.460 Because it's all based on predictions and nobody can predict.
00:26:42.040 I signed up for the Mike Cernovich sub-stack yesterday,
00:26:46.200 because, damn it, he wrote something that made me have to see the rest of it.
00:26:50.820 Which is good marketing, I guess.
00:26:54.200 And I recommend it.
00:26:56.980 So, I always recommend Cernovich.
00:26:59.100 Everything he says is more interesting than everybody else, basically.
00:27:02.260 And he's consistently more interesting than other people all the time.
00:27:06.760 So, go read it.
00:27:08.720 But he has an article about why the danger of AI is not that it will hate people and try to destroy us.
00:27:17.120 Because the danger is it's a Marxist.
00:27:21.260 And it will just turn us into Marxists and then we'll destroy ourselves.
00:27:25.520 And the argument for this is, we already see with ChatGPT that, I don't know if you saw this experiment,
00:27:32.820 but somebody said, write a poem praising Donald Trump.
00:27:37.040 And ChatGPT said, oh, I can't do that.
00:27:41.200 You know, that would be inappropriate.
00:27:43.100 And then you ask, write a poem praising Joe Biden.
00:27:46.980 Boom.
00:27:47.520 Here's your poem.
00:27:49.040 That Joe Biden, he's great.
00:27:50.940 He's amazing.
00:27:51.760 And it all rhymes and everything.
00:27:53.900 Right.
00:27:54.500 Yeah, that's a real thing.
00:27:56.140 That's a real thing.
00:27:57.060 That AI is already programmed to dislike Trump and to like Joe Biden.
00:28:05.160 It's easy to demonstrate.
00:28:06.760 Go do it yourself.
00:28:07.600 You'll find out.
00:28:09.060 Now, so obviously the big problem is that AI won't be AI.
00:28:14.760 The big problem is it will be fake AI that is just programmed to be a Marxist
00:28:20.080 because the people creating it have those leanings.
00:28:24.940 Is that a good point?
00:28:26.060 See, this is why you follow Cernovich.
00:28:29.860 I hadn't really thought of the implications of that.
00:28:32.720 I just thought it was a, I thought it was comical, but I hadn't really thought it through.
00:28:37.280 But once you think it through, yeah, that is a big, big problem.
00:28:41.280 Way bigger, I think, than AI becoming sentient and wanting to kill us.
00:28:46.080 The reason I'm not worried about AI becoming sentient and wanting to kill us
00:28:49.740 is that it doesn't have needs.
00:28:53.240 Humans have needs.
00:28:54.340 They're baked in.
00:28:56.020 So I might need to kill you to reproduce, right?
00:29:00.300 If I think you're a threat or something.
00:29:01.920 But AI doesn't even need to protect itself.
00:29:04.580 It just doesn't have any needs.
00:29:06.640 It's just going to respond to what we ask.
00:29:08.520 So I'm less concerned about AI, unless somebody builds in an evil intention or something.
00:29:14.240 I don't even think consciousness would give it, you know, cravings.
00:29:19.120 But it might.
00:29:20.300 So that is a risk.
00:29:22.200 All right.
00:29:22.820 I'm going to go into a big, complicated story here about Ukraine and the Biden crime family.
00:29:28.760 And I'm going to call out, as I have in the past, Kanekoa and his Substack and his threads on Twitter.
00:29:38.860 K-A-N-E-K-O-A.
00:29:42.200 Now, I don't know what to believe.
00:29:45.740 So I'm going to tell you what he reports.
00:29:50.080 I have no idea what's true.
00:29:52.420 All right.
00:29:53.020 And in this case, more than others.
00:29:55.660 But here are the threads he's putting together.
00:30:00.360 So we go back to 2012.
00:30:02.880 And we know from a Ukrainian news report that the U.S. Department of Defense was building biological weapons laboratories in cities across Ukraine as part of its biological threat reduction program.
00:30:19.380 What?
00:30:20.100 That was actually reported in the Ukrainian press in 2012 because it didn't seem like a big deal, I guess.
00:30:27.780 So the news just reported it.
00:30:30.360 Now, what's the difference between a biological threat reduction program and a biological weapons program?
00:30:38.680 Do you know what the difference is?
00:30:41.480 Who's talking about it?
00:30:43.740 That's it.
00:30:44.860 Who's talking about it?
00:30:47.040 Functionally, they're the same.
00:30:48.400 Because the way that you figure out how to reduce your risk is to create the risk.
00:30:54.940 You have to create the risk to test what to do about it.
00:30:59.060 So it ends up that there's no difference between a bioweapon lab and a risk reduction lab.
00:31:05.560 They end up doing the same work with probably the same risk.
00:31:08.860 So unbeknownst to most of us, in 2012, the U.S. had this big program.
00:31:18.140 And even further back in 2005, Senator Lugar and Senator Barack Obama signed the Ukrainian Nunn-Lugar biological agreement.
00:31:29.060 So all the way back to 2005, we see at least one Democrat, Obama, part of these biological weapons labs.
00:31:41.020 And so we had cooperation and agreement and storage agreements.
00:31:45.700 Now, why is it, do you think, that the U.S. set up biological labs in Ukraine?
00:31:53.480 And lots of them, apparently.
00:31:54.760 I don't know how many.
00:31:56.680 Why there?
00:31:59.760 Isn't it obvious?
00:32:02.640 It's obvious, right?
00:32:03.720 Well, the most obvious reason is to do things you can't legally do in the United States.
00:32:09.840 Duh, right?
00:32:11.160 It's because they had, you know, fewer restrictions.
00:32:15.380 The second reason would be, if you pump money into a corrupt place, you could probably skim some off for yourself.
00:32:22.840 As in, we're going to pump a bunch of money in here, but I'm not going to say yes until you say you're going to give me 10% of it for myself.
00:32:30.640 That's how corruption works.
00:32:31.940 So maybe corruption.
00:32:35.760 That would be unknown.
00:32:37.300 So I don't know there is corruption.
00:32:39.080 But that would be one of the reasons you do it.
00:32:41.040 The other reason would be, yeah, and the money laundering, et cetera.
00:32:44.300 But the other reason would be that they have lower standards for safety.
00:32:48.660 So it could be both of those or one of those.
00:32:51.880 All right.
00:32:52.360 It goes on.
00:32:53.240 This gets better.
00:32:55.380 And again, this is all from a Kanekoa thread on Twitter.
00:33:00.240 All right.
00:33:03.020 And those labs were being upgraded by the U.S. to control everything from anthrax to a bunch of other things, typhoid, cholera, et cetera.
00:33:13.740 Now, the cover story, which might be the true story, was that the U.S. wanted to make sure that these biological agents didn't get released.
00:33:24.240 So we wanted to control them.
00:33:26.020 And that makes sense.
00:33:27.620 It might not be the whole story.
00:33:29.160 But that much of it certainly passes the sniff test.
00:33:34.120 All right.
00:33:34.600 But it turns out here's a new fact.
00:33:37.240 Joe Biden's son, Hunter, was financially involved with the Ukrainian biolabs well before we knew any of the Hunter Biden stories.
00:33:47.120 Now, he was involved with the biolabs through a company that one of his companies had a big investment in, MetaBiota.
00:33:54.700 And they were a pandemic tracking and response firm.
00:34:01.160 Seriously?
00:34:04.500 What?
00:34:06.300 Hunter Biden was involved with these labs through an investment that was a pandemic tracking and response firm.
00:34:15.860 I mean, I wonder how well a company like that would do if a pandemic broke out.
00:34:19.520 I wonder how well they do, huh?
00:34:23.860 Pandemic breaks out and you've got just the right product for it.
00:34:27.300 That would be lucky, huh?
00:34:29.860 For somebody.
00:34:33.820 But he was not only involved in the biolabs.
00:34:37.360 This gets better.
00:34:39.120 Are you ready for this?
00:34:40.020 Are you ready to have your whole head blown off?
00:34:43.120 Assuming this is true.
00:34:44.460 And I'm just reading from this thread.
00:34:46.780 So not only was Hunter involved in a pandemic-related company for the biolabs in Ukraine,
00:34:55.780 but his boss at Burisma, you know, he was on the board of Burisma.
00:35:01.420 Turns out that the guy who had the controlling interest of Burisma at the time was this billionaire called Kolomoisky.
00:35:10.820 Now, so far, that's okay, right?
00:35:13.100 We knew that Hunter was on the board of Burisma, and we're pretty sure it's because of his name, right?
00:35:19.880 That's pretty clear.
00:35:22.660 But so there's this one billionaire.
00:35:24.700 But did you know that the same billionaire is the one who bankrolled the career of a young actor named Volodymyr Zelensky,
00:35:35.740 now the president of Ukraine?
00:35:38.520 Same billionaire.
00:35:39.300 So interestingly, Zelensky and Hunter Biden were controlled by the same Ukrainian billionaire.
00:35:48.240 Interesting.
00:35:49.560 What a coincidence.
00:35:51.680 Then Zelensky went on, you know, to have fame in his TV show that was funded in part by this billionaire.
00:35:59.460 And the billionaire also provided security, lawyers, and vehicles for Zelensky's presidential campaign.
00:36:07.380 It's nice to have a billionaire like that supporting your presidential campaign in your big old corrupt country.
00:36:14.700 So, let's see.
00:36:18.520 And apparently, the Pandora Papers, I don't know what they are exactly, apparently revealed that Zelensky was a beneficiary of a web of offshore firms created around 2012.
00:36:31.940 Around 2012, same time as stuff was happening with those biological labs.
00:36:36.780 Probably a coincidence.
00:36:37.400 And the same year, Zelensky Production Company entered into a deal with the same billionaire's media group,
00:36:45.240 and received $41 million from this billionaire's private bank.
00:36:52.720 And then there was a 2012 study.
00:36:54.840 2012 is an important date here.
00:36:57.320 a study of Burisma Holdings, funded by George Soros and the State Department,
00:37:03.260 which found that the owner of Burisma Holdings was Kolomoisky.
00:37:06.880 So, apparently, it wasn't clear who the controlling person of Burisma was, but it was this billionaire.
00:37:14.260 So, Hunter Biden and Zelensky were working for the same Ukrainian billionaire at the same time.
00:37:28.100 All right.
00:37:28.860 And let's see.
00:37:29.960 He had controlling interest, blah, blah, blah.
00:37:31.600 The same billionaire funded the Azov, Aidar, and Dnipro battalions accused of shelling children and war crimes in eastern Ukraine.
00:37:43.440 So, one billionaire is behind starting the trouble that caused the war,
00:37:48.640 and he owns Zelensky and Hunter Biden, which means he owns Joe Biden.
00:37:53.720 So, there's one billionaire behind everything.
00:37:56.140 And in 2020, the DOJ accused this billionaire of laundering $4 billion from his private bank into American properties.
00:38:10.880 So, the summary here is that Joe Biden sent $100 million to Zelensky in Ukraine.
00:38:16.480 The billionaire and Biden's son Hunter laundered it, blah, blah, blah.
00:38:25.880 So, everything about this story suggests...
00:38:30.840 Oh, and then it goes on, right?
00:38:33.960 And then Kanekoa points to another thread by a user named Clandestine.
00:38:40.480 Clandestine.
00:38:41.880 His username is @WarClandestine.
00:38:46.660 So, this goes a little bit further than my credulity can go, but it might be true.
00:38:53.060 It might be true.
00:38:53.920 I can't debunk it.
00:38:56.200 But it's a little bit of a stretch.
00:38:58.140 Let's see if I can take you this far.
00:38:59.680 So, War Clandestine says that in 2005, the Washington Post described those labs in Ukraine as, quote,
00:39:11.900 part of a Cold War network of anti-plague stations.
00:39:15.360 So, when Russia does anti-plague research, it's bioweapons.
00:39:19.320 But when the U.S. does, it's not bioweapons.
00:39:23.440 No, no.
00:39:24.040 It depends who does it.
00:39:26.040 All right.
00:39:26.600 So, here's some more.
00:39:29.680 And they admit that the anti-plague work produces bioweapons in order to do the work.
00:39:38.080 All right.
00:39:39.120 Let's see.
00:39:41.920 Then Obama opened the floodgates.
00:39:43.560 This is from the same Twitter thread.
00:39:45.540 For the deep state.
00:39:46.700 So, here's where it gets a little speculative.
00:39:48.780 I'm not sure I can go this far.
00:39:51.360 And created biological weapons programs with the Ukrainian government.
00:39:54.860 So, this is Obama.
00:39:55.740 And established connections for U.S. oligarchs to build biolab companies in the lawless land of Ukraine.
00:40:02.480 Now, that I would need a little more research and evidence on.
00:40:06.880 Are there rich Americans who are benefiting from these biolabs?
00:40:12.520 Or maybe they were doing research for American companies that couldn't do it in the U.S.?
00:40:17.180 So, I don't know what to think of that connection.
00:40:23.880 Then the situation turned sour.
00:40:26.740 All right.
00:40:27.000 So, up until this point, the story, and this, I'm not sure I would believe all of it.
00:40:32.700 But the story is the U.S. essentially captured Ukraine to use it as a bioweapons, you know, wasteland.
00:40:42.940 And maybe a way to launder money.
00:40:46.140 All right.
00:40:46.420 So, that's the accusation.
00:40:49.060 And then there was this...
00:40:50.700 But then the country went into a civil war.
00:40:53.740 And that was when the State Department, Hillary Clinton, and the CIA took full control of Ukraine's government, according to this thread.
00:41:03.840 This is not my claim.
00:41:05.760 And that Victoria Nuland facilitated a regime change, again, according to this.
00:41:13.220 And said out loud, I guess, quote, there was a fear of
00:41:17.800 the Russian forces getting their hands on the biological weapons research.
00:41:21.060 So, the U.S. picked a puppet government that happened to be Zelensky.
00:41:31.640 And now they have control over it.
00:41:33.760 And Biden's visited Ukraine 13 times, securing U.S. funding for Ukrainian oligarchs.
00:41:40.980 Again, these are just the claims in this thread.
00:41:44.360 Then Biden used his power to fire a state prosecutor who figured out Biden's kickback laundering scheme.
00:41:49.520 And that's a little speculative, because we do know that other countries wanted that same prosecutor fired for corruption.
00:41:58.200 So, if all the other countries wanted him fired and Joe Biden got him fired,
00:42:03.280 you can't say for sure that there was a bad reason for it.
00:42:07.280 There is a potential not bad reason for it.
00:42:12.360 But we don't know.
00:42:13.020 And then, so the conclusion is that the reason all of our tax dollars are in Ukraine
00:42:19.420 is that Ukraine is a deep state proxy controlled by the ruling class, the DNC, and George Soros.
00:42:27.560 Now, that's a lot of things to pull together.
00:42:30.160 So, I'm not sure it's as organized as it sounds.
00:42:35.400 But here's the part that I think we can say with some certainty.
00:42:43.820 I don't know how much of this speculation is true, but even without it,
00:42:50.260 the Biden family is a crime family.
00:42:52.980 Is there any doubt about that now?
00:42:54.540 Just based on what we do know, from the laptop and the 10% for the big guy,
00:43:00.560 and then just looking at the deals Hunter made, it's obviously a crime family.
00:43:05.160 I don't think that could be more obvious.
00:43:07.700 Why do we not call them the Biden crime family?
00:43:11.600 Like, why do we even refer to them, you know, by their last names and stuff?
00:43:16.320 They should just be the Biden crime family.
00:43:17.940 Now, I know there's some people probably refer to them that way sometimes,
00:43:23.960 but I feel like that should be the only phrase, the Biden crime family, right?
00:43:32.720 I will say that I find this speculation credible,
00:43:37.840 meaning that that doesn't mean it's true, but it's credible,
00:43:42.540 meaning that the way the story fits together, all the pieces, they do fit.
00:43:51.260 The pieces do fit.
00:43:53.440 Doesn't mean it's true, but the pieces fit, so you can't ignore that.
00:43:59.060 All right.
00:44:02.360 So here's more on that.
00:44:04.120 And the reason that the House Democrats went insane over Trump's phone call to Zelensky
00:44:08.580 is they needed to get rid of Trump because he was going to find out
00:44:12.100 all the money laundering and bad secrets in Ukraine.
00:44:16.900 Maybe.
00:44:18.180 Maybe.
00:44:19.680 Totally, totally feasible.
00:44:22.240 Totally credible.
00:44:23.420 But not necessarily the whole truth.
00:44:27.420 Right?
00:44:27.600 Not necessarily.
00:44:29.780 But maybe.
00:44:31.200 It looks credible to me.
00:44:32.440 Here's the part I don't believe,
00:44:36.200 that we knew Putin was looking for the bioweapons.
00:44:42.360 How does that make sense?
00:44:44.180 Why was Putin looking for the bioweapons?
00:44:47.400 Now, of course he was,
00:44:49.180 but that would be part of the invasion.
00:44:51.340 It would just make sense.
00:44:52.760 But do you think Putin wanted the bioweapons
00:44:56.580 so he could embarrass the West,
00:44:58.620 or he wanted them so he could use the weapons
00:45:00.660 because Russia doesn't know how to make bioweapons?
00:45:03.420 But I think they do.
00:45:05.660 So that part is not fitting perfectly.
00:45:10.160 All right.
00:45:11.160 And I guess the World Health Organization
00:45:13.960 advised Ukraine to destroy all their pathogens
00:45:17.660 at the labs that the mainstream media said didn't exist
00:45:21.400 because the WHO knew Putin was looking for the bioweapons.
00:45:25.340 Okay, that's too far.
00:45:27.080 Too far.
00:45:27.720 How would we know that the WHO knows
00:45:30.400 that Putin is looking for the bioweapons?
00:45:32.780 That's too far.
00:45:34.060 So I can't go with this thread all the way there.
00:45:36.520 But it does look like something
00:45:37.680 potentially sketchy has happened over there.
00:45:42.500 The trouble with these confusing stories that are long
00:45:45.360 is it's just hard to keep it all together.
00:45:51.380 So here would be the speculation that I don't buy into.
00:45:56.400 That somebody intentionally let out COVID-19
00:46:02.540 because a bunch of people figured out how to make money if it happened.
00:46:07.260 Do you believe that?
00:46:08.660 Do you believe that COVID-19 was released intentionally
00:46:11.740 to make money?
00:46:14.260 Because it seems like a terrible way to make money.
00:46:20.140 I do have some yeses.
00:46:21.400 Yeah, some yeses.
00:46:24.120 You know, we've seen through history people do worse.
00:46:29.620 Am I wrong?
00:46:30.940 Am I wrong?
00:46:31.480 We've seen people do worse.
00:46:34.300 This wouldn't even be the worst thing that's happened.
00:46:36.700 Not even close.
00:46:37.440 So you can't rule it out
00:46:39.360 because nobody would do something that bad.
00:46:42.400 I would think the best reason for ruling it out
00:46:45.200 would be too many people involved
00:46:46.980 to get away with it.
00:46:49.620 Too many people involved.
00:46:51.520 And if anybody got caught doing that,
00:46:53.700 that's the death sentence.
00:46:57.020 Let me say this as clearly as possible.
00:47:00.120 If somebody did that and got caught red-handed,
00:47:03.620 is that racist?
00:47:05.300 That's racist, isn't it?
00:47:06.300 Like, getting caught red-handed?
00:47:10.060 That's racist, isn't it?
00:47:12.620 Does that come from, like, Native Americans or something?
00:47:16.700 Oh, it's not racist?
00:47:17.740 What does the red mean?
00:47:19.020 Oh, red means blood.
00:47:20.940 Oh, that makes sense.
00:47:21.700 Yeah, blood.
00:47:22.720 Sounds racist.
00:47:24.120 But it means blood, I guess.
00:47:26.940 So, anyway.
00:47:29.220 I'm not sure I believe that it's all a plot.
00:47:32.820 But there seems to be a lot of people
00:47:34.660 doing a lot of sketchy things everywhere.
00:47:36.300 All right.
00:47:38.500 Here's a question I asked on Twitter.
00:47:41.660 We're getting into the material now.
00:47:44.080 Some of you will want to run away.
00:47:46.660 Run away.
00:47:47.800 Now, I should warn you.
00:47:50.780 It would have been fair to warn you,
00:47:52.440 but I didn't.
00:47:53.500 I should warn you that when people say to me,
00:47:56.240 Scott, stop talking about this topic,
00:47:58.940 that I do block you.
00:48:02.400 Because my job is largely a creative job.
00:48:06.780 And the last thing I want to hear
00:48:08.300 is somebody telling me what not to do.
00:48:11.960 All right.
00:48:12.680 So, while I understand that that's your opinion
00:48:14.900 and you're welcome to your opinion,
00:48:16.560 I don't even disagree with it.
00:48:18.260 I'm just saying that if you're tweeting at me
00:48:20.900 to tell me to change what I'm doing,
00:48:23.840 I'm blocking you.
00:48:25.000 Because I don't want that noise in my life.
00:48:27.800 Now, I'm perfectly okay
00:48:29.340 with losing immense amounts of money
00:48:32.740 because people stopped watching.
00:48:36.280 You understand I'm okay with that, right?
00:48:38.280 I mean, I couldn't be more transparent.
00:48:43.380 There's nothing I'm doing in public
00:48:45.140 that makes money.
00:48:46.820 It's all bad.
00:48:48.380 Now, I'm monetizing some of it
00:48:50.220 that takes some of the edge off,
00:48:52.060 but it's all bad for me.
00:48:53.440 There's no way I would be saying the things I'm saying
00:48:55.680 if I were trying to make money
00:48:56.800 or even get power.
00:48:59.840 I know how to make money
00:49:01.160 and I know how to make power
00:49:02.200 and it's just taking a side and agreeing with them.
00:49:04.900 It's very easy.
00:49:06.100 There's no complexity to it at all.
00:49:08.280 So, if there's one thing you can know for sure,
00:49:13.660 I'm not doing it for the money
00:49:14.860 and I'm not doing it for some kind of weird power
00:49:17.820 because it's not working, right?
00:49:20.280 But I keep doing it.
00:49:21.760 So, I'm either crazy
00:49:22.780 or there's something else going on.
00:49:27.220 There is something else going on.
00:49:29.280 Eventually, that'll become clear.
00:49:31.580 But I ask this question for those who...
00:49:33.740 And by the way, the reason I talk about it
00:49:35.160 is primarily because I think it's fascinating
00:49:36.740 to know how people make decisions
00:49:38.460 and it's fascinating to know
00:49:40.420 when they do it right
00:49:41.120 and when they do it wrong.
00:49:43.200 That's my interest.
00:49:44.700 It's not my interest whether you get vaccinated or not.
00:49:47.160 Totally uninterested in that.
00:49:49.260 But the decision-making process,
00:49:50.760 and the question of who is hallucinating,
00:49:52.580 is my main topic.
00:49:55.320 All right, here's the question I ask.
00:49:57.420 If a hypnotist and a scientist come up to you,
00:49:59.800 and I'm going to add this.
00:50:01.280 I didn't have this in the tweet.
00:50:02.860 But I'm going to add this to the mental challenge here.
00:50:08.120 The scientist and the hypnotist,
00:50:09.820 you know, somehow you know by magic
00:50:11.420 that they don't lie.
00:50:13.540 They could be wrong.
00:50:15.100 They could be mistaken.
00:50:16.380 But they definitely don't lie.
00:50:18.560 So, a hypnotist and a scientist
00:50:20.040 come up to you at the same time.
00:50:21.280 The scientist says,
00:50:23.300 I just discovered something amazing
00:50:25.400 and breathtaking
00:50:26.120 and changes everything
00:50:27.660 about how we see the world.
00:50:30.400 The hypnotist standing right next to the scientist
00:50:34.740 says, don't listen to him.
00:50:38.620 He's hallucinating.
00:50:40.780 And nothing like that actually happened.
00:50:43.460 So, my question on Twitter was,
00:50:45.420 who should you believe?
00:50:47.640 And there's one right answer.
00:50:49.040 There's one right answer
00:50:51.160 and all the other answers are wrong.
00:50:53.240 Who should you believe?
00:50:54.600 The hypnotist
00:50:55.420 or the scientist?
00:50:58.020 Go.
00:50:59.640 Who should you believe?
00:51:01.720 There is one right answer.
00:51:04.800 Barnes.
00:51:07.220 The right answer is neither.
00:51:10.520 Why is the right answer neither?
00:51:13.200 You think the answer is neither
00:51:15.000 because you can't trust anybody
00:51:16.480 until you look into it yourself, right?
00:51:18.100 Nope.
00:51:19.460 Nope.
00:51:20.440 Here's why neither.
00:51:22.380 It's built into the question.
00:51:24.580 The question answers itself.
00:51:27.340 Did you see the word
00:51:28.560 that's the trick word in the question?
00:51:31.720 Should.
00:51:33.220 What does should mean?
00:51:35.220 I said, who should you believe?
00:51:37.620 What the hell does should mean?
00:51:40.440 Should.
00:51:41.740 Should what?
00:51:43.280 Should is not a reason.
00:51:44.320 Don't do anything
00:51:48.460 if the only reason
00:51:51.320 is that you think you should
00:51:53.120 because that's not a reason.
00:51:55.980 Now, as soon as anybody says
00:51:57.660 you should do this,
00:51:59.360 your answer is no.
00:52:01.040 It doesn't even matter what it is.
00:52:02.440 No.
00:52:02.780 If that's the best you have
00:52:05.540 is that I should do it,
00:52:06.780 no.
00:52:07.380 No reason there.
00:52:08.200 We use the word should
00:52:10.560 in the limited situations
00:52:13.020 where we all know the risks, right?
00:52:15.140 So here's a proper use of the word.
00:52:17.660 I don't think you should
00:52:18.860 walk into busy traffic
00:52:20.100 because the should
00:52:22.580 is not the important part.
00:52:24.660 The busy traffic
00:52:25.540 tells you what you need to know.
00:52:27.120 Oh, that would be dangerous.
00:52:28.200 We all understand.
00:52:29.800 So that's a proper use
00:52:31.020 of the word should
00:52:31.740 because everybody understands
00:52:33.340 the context of what the risk is.
00:52:34.960 But if I just tell you
00:52:36.920 a scientist
00:52:37.640 and a hypnotist
00:52:39.420 walk up to you
00:52:40.140 and they say,
00:52:41.140 who should you believe?
00:52:42.600 As soon as you hear
00:52:43.680 the word should,
00:52:45.060 the answer is neither.
00:52:46.720 You shouldn't do anything
00:52:49.020 unless there's a reason.
00:52:50.600 If I'd given you a reason
00:52:52.080 and I said,
00:52:53.220 here's the reason
00:52:54.220 and then this is why
00:52:56.180 you should do it,
00:52:57.600 it's not the should,
00:52:58.720 it's the reason.
00:53:00.420 Does that make sense?
00:53:02.100 So the phrasing of the question
00:53:04.320 guaranteed that there was
00:53:06.220 only one right answer.
00:53:07.480 Don't believe either
00:53:08.540 because you shouldn't
00:53:09.620 have to do anything.
00:53:11.240 Better have a reason.
00:53:13.060 All right.
00:53:13.420 But beyond that,
00:53:15.180 let's go to the next level.
00:53:19.100 Suppose I didn't use
00:53:20.800 the word should
00:53:21.420 and I replaced that with
00:53:22.860 you're trying to maximize
00:53:24.740 your odds.
00:53:26.220 So you're just playing the odds.
00:53:27.360 You don't know for sure
00:53:28.580 if the hypnotist
00:53:30.280 or the scientist is right
00:53:31.540 and you don't have access
00:53:33.100 to the data.
00:53:34.320 So let's say
00:53:35.080 you can't do
00:53:35.680 your own research
00:53:36.480 and you're going to
00:53:38.340 have to make a decision.
00:53:40.080 Now I know
00:53:40.500 what you want to say.
00:53:41.620 You want to say,
00:53:42.780 well, I'm not going
00:53:43.380 to make a decision.
00:53:44.260 I'm going to wait
00:53:45.020 until I look into it myself.
00:53:46.640 But I'm taking
00:53:47.520 that option away.
00:53:48.760 You have to make
00:53:49.500 a decision.
00:53:50.040 There's a time limit.
00:53:52.500 You have to trust
00:53:53.300 either the hypnotist
00:53:54.280 or the scientist.
00:53:55.540 Now what do you do?
00:53:57.840 Hypnotist or scientist?
00:53:59.200 Go.
00:54:01.160 Hypnotist or scientist?
00:54:01.980 You're just playing
00:54:02.580 the odds.
00:54:07.140 All right.
00:54:07.820 I'm seeing scientist,
00:54:08.760 scientist, scientist.
00:54:10.460 Hypnotist,
00:54:11.040 hypnotist,
00:54:11.540 hypnotist.
00:54:12.080 Your mom.
00:54:12.900 Okay.
00:54:13.140 Mom is a good answer.
00:54:14.900 Scientist,
00:54:15.500 hypnotist.
00:54:16.400 Okay.
00:54:18.100 Here is the correct answer.
00:54:22.080 Depends which one's better.
00:54:23.180 Now that's not in the question.
00:54:27.800 But if Einstein came to you
00:54:30.620 and said,
00:54:31.780 I have discovered
00:54:32.520 this something.
00:54:34.380 And then at the same time
00:54:35.720 that Einstein is standing
00:54:37.000 in front of you,
00:54:38.060 somebody who learned
00:54:39.080 hypnosis last week
00:54:40.360 says,
00:54:41.240 oh, I learned hypnosis
00:54:42.220 last week.
00:54:42.800 I think Einstein
00:54:43.420 is hallucinating.
00:54:45.000 Who are you going to believe?
00:54:47.400 I think I'd go with Einstein.
00:54:49.600 I'd go with Einstein.
00:54:50.640 Now let's say you have
00:54:52.440 a good, solid scientist.
00:54:55.880 Somebody who's published.
00:54:57.960 A well-published scientist.
00:55:00.220 But you also have
00:55:01.200 a really experienced hypnotist.
00:55:03.920 So let's say you knew,
00:55:05.540 it wasn't in my question,
00:55:06.800 but let's say you knew
00:55:07.900 they were both at the
00:55:08.460 high end of their profession.
00:55:11.400 Who do you trust?
00:55:12.480 The high end scientist
00:55:13.640 who's got a real good resume
00:55:14.760 or the high end hypnotist
00:55:17.180 who can identify
00:55:18.740 hallucinations.
00:55:19.640 And the question is,
00:55:21.740 is it a hallucination?
00:55:23.520 Go.
00:55:25.040 Hypnotist or scientist?
00:55:28.800 Only one of them
00:55:29.840 is an expert on hallucinations.
00:55:32.840 Why would you,
00:55:33.400 why would you,
00:55:34.440 why would you defer
00:55:36.060 to the scientist?
00:55:36.960 The scientist
00:55:37.380 is the one
00:55:38.460 who can't know the answer.
00:55:40.180 If the scientist
00:55:41.060 is in a hallucination,
00:55:42.980 they can't know.
00:55:44.260 That's what a hallucination is.
00:55:45.500 If a hypnotist
00:55:48.120 sees a scientist
00:55:51.540 who is hallucinating,
00:55:52.780 and it's a good hypnotist,
00:55:54.480 they would know it pretty well.
00:55:56.320 I'm pretty sure
00:55:57.380 that I have
00:55:57.960 the tools
00:55:59.820 to identify
00:56:00.480 somebody
00:56:01.120 who's hallucinating.
00:56:03.320 Pretty sure.
00:56:04.320 And I think that
00:56:04.980 anybody with my same experience
00:56:06.580 could do it.
00:56:07.300 But I don't think
00:56:07.880 a new hypnotist
00:56:08.900 could do it.
00:56:09.780 If you learned it last week,
00:56:11.260 I don't think you could do it.
00:56:12.320 And so here's my dilemma.
00:56:17.780 When I get into
00:56:19.420 a conversation
00:56:20.120 with a doctor
00:56:21.080 or a scientist
00:56:21.960 about anything COVID,
00:56:24.600 they're usually
00:56:25.980 talking about the data
00:56:27.120 and I'm just looking at them.
00:56:30.240 And I'm saying,
00:56:30.980 well, I can see
00:56:32.000 a hallucination going on.
00:56:34.240 And then they say,
00:56:35.460 but you have to look
00:56:36.160 at my data,
00:56:36.900 which I don't know
00:56:37.720 how to do.
00:56:38.920 I don't know
00:56:39.680 how to look at
00:56:40.240 scientific data
00:56:41.160 and know
00:56:41.580 if the data's real.
00:56:43.500 I don't know
00:56:44.020 who got funded.
00:56:45.780 And nobody can do that.
00:56:47.560 Nobody can do that.
00:56:48.640 Nobody can look at data
00:56:49.580 and know
00:56:50.060 who got funded.
00:56:52.060 Do you know why?
00:56:53.580 Do you know why
00:56:54.080 you can't tell
00:56:54.720 who's getting paid
00:56:55.680 for doing a study?
00:56:57.480 Do you know
00:56:57.800 why you can't tell?
00:56:59.560 Does anybody know that?
00:57:01.840 Well,
00:57:02.480 in some cases you can,
00:57:04.000 and that's what fools you
00:57:05.120 to think maybe
00:57:06.100 you can always tell.
00:57:07.540 In some cases
00:57:08.440 there's a public record
00:57:09.460 if somebody did some work
00:57:10.520 for the pharma company.
00:57:12.400 So sometimes
00:57:12.900 you can find it.
00:57:14.200 Here's the reason
00:57:14.820 you can't find it.
00:57:16.300 Because the payment
00:57:17.000 comes after the thing.
00:57:20.380 The payment
00:57:21.020 doesn't always come before.
00:57:23.760 If I were working
00:57:24.940 in the,
00:57:26.120 let's say,
00:57:28.360 the pharma domain,
00:57:29.380 and I wanted Pfizer
00:57:32.380 to pay me
00:57:33.520 a bunch of money,
00:57:34.700 but I didn't even
00:57:35.420 have a relationship
00:57:36.260 with Pfizer in any way.
00:57:37.940 But I wanted them
00:57:38.780 to give me a lot of money.
00:57:40.520 How would I do it?
00:57:42.180 I would do a study
00:57:43.400 that I designed
00:57:45.200 and made sure
00:57:45.960 agreed with something
00:57:47.060 they really wanted
00:57:47.940 to be known.
00:57:49.520 And then I would publish it.
00:57:51.520 And then Pfizer
00:57:52.240 would contact me.
00:57:54.120 Because they would say,
00:57:55.080 hey,
00:57:55.580 we really like
00:57:56.280 that thing you wrote,
00:57:57.460 that study
00:57:57.960 that agrees with us.
00:57:59.500 How would you like
00:58:00.300 to come give a speech
00:58:01.740 to some people
00:58:03.220 who we'd like
00:58:03.960 to convince were right?
00:58:04.960 And then the guy says,
00:58:06.780 oh, that's a good idea.
00:58:07.620 We both want to talk
00:58:08.440 to that audience
00:58:09.260 and you're going to pay me.
00:58:10.920 So I'll get paid
00:58:11.860 to give a speech
00:58:12.620 of something that I believe
00:58:13.640 because I just wrote
00:58:14.380 this paper about it.
00:58:16.260 So if you think
00:58:17.200 that people only get paid
00:58:18.520 during or before
00:58:20.140 they work
00:58:21.520 for the pharma company,
00:58:22.520 you don't know
00:58:23.160 how that works.
00:58:24.860 People are doing it
00:58:25.860 before they get any money,
00:58:27.820 hoping that they will later.
00:58:30.500 You think you can look
00:58:31.680 at a study
00:58:32.660 and know that
00:58:33.800 the scientist
00:58:34.400 who was in charge of it
00:58:35.580 had no intention
00:58:37.080 of someday being asked
00:58:38.900 to give speeches
00:58:39.580 about his study
00:58:40.520 by the big pharma.
00:58:42.920 How would you know that?
00:58:44.800 You'd have no way
00:58:45.660 of knowing that.
00:58:46.740 How about if
00:58:47.500 the big pharma
00:58:48.180 invested in a company
00:58:50.260 that your spouse
00:58:51.680 has a financial interest in?
00:58:55.040 Would you know that?
00:58:56.960 It's just a startup
00:58:58.000 and your spouse
00:58:59.740 has some equity in it
00:59:00.900 and then later,
00:59:02.940 after your spouse
00:59:04.360 invested,
00:59:05.940 big pharma comes in
00:59:06.920 and also makes an investment
00:59:08.060 and then suddenly
00:59:09.140 your family gets rich
00:59:10.200 and it doesn't look like
00:59:13.320 there are any records
00:59:14.280 connecting them.
00:59:15.480 So if you don't understand
00:59:16.820 that everybody
00:59:17.620 can be bribed,
00:59:19.360 then you're going
00:59:20.580 to be very confused
00:59:21.580 when you say,
00:59:22.920 oh, these ones are good
00:59:24.380 and these are not.
00:59:25.140 I can't tell the difference.
00:59:26.480 So if you believe
00:59:27.140 you can tell
00:59:27.680 who's been bribed
00:59:28.480 and who hasn't,
00:59:29.700 well, you have a power
00:59:30.460 I don't have.
00:59:31.580 I don't have that power.
00:59:33.560 If you believe
00:59:34.420 that you can look
00:59:34.920 at the data
00:59:35.480 and tell from
00:59:37.180 the written conclusions
00:59:38.800 that they collected
00:59:40.540 the data correctly,
00:59:42.020 how do you do that?
00:59:45.000 I can't do that.
00:59:47.300 I know what they say,
00:59:49.260 but that's all I know.
00:59:50.920 How do you know
00:59:51.600 something more than that?
00:59:54.400 So anyway,
00:59:55.500 I have no ability
00:59:56.300 to look at scientific studies,
00:59:58.000 but a lot of people
00:59:58.840 say they do.
00:59:59.600 I do have the ability
01:00:00.520 to identify people
01:00:01.820 who are hallucinating
01:00:02.780 and here's the thing
01:00:05.960 that scientists
01:00:06.860 and doctors
01:00:08.300 don't want to hear.
01:00:10.220 On that one question,
01:00:12.620 I'm the authority.
01:00:16.560 I know.
01:00:18.720 Now, if the question is,
01:00:20.180 is this research correct,
01:00:21.880 don't ask me.
01:00:23.400 I've told you forever.
01:00:24.500 I don't know
01:00:25.120 which side is correct
01:00:26.080 because I don't know
01:00:26.640 if any of it's correct,
01:00:27.820 but I can definitely tell you
01:00:29.140 if somebody's hallucinating
01:00:30.180 with maybe 85% accuracy.
01:00:33.100 Not every time,
01:00:34.480 but 85%.
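[Editor's aside: the "85% accuracy" claim above is really a claim about odds, and Bayes' rule shows why a detector's accuracy alone doesn't settle how much to trust a single call. This is a minimal sketch; the base rates used are illustrative assumptions, not figures from the transcript.]

```python
# Sketch: what an "85% accurate" hallucination detector implies,
# via Bayes' rule. Base rates below are assumed for illustration.

def posterior(accuracy: float, base_rate: float) -> float:
    """P(hallucinating | detector says 'hallucinating'), assuming the
    detector is right with probability `accuracy` on both kinds of cases."""
    true_pos = accuracy * base_rate              # really hallucinating, flagged
    false_pos = (1 - accuracy) * (1 - base_rate) # not hallucinating, flagged anyway
    return true_pos / (true_pos + false_pos)

# With a 50/50 prior, an 85%-accurate detector leaves you 85% sure.
print(round(posterior(0.85, 0.5), 2))  # 0.85
# If only 10% of people are hallucinating, the same detector's
# positive call is right far less often.
print(round(posterior(0.85, 0.1), 2))
```

The point of the sketch: how far you should trust an 85%-accurate judge depends on how common hallucination is in the first place, which the accuracy number by itself doesn't tell you.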
01:00:35.340 So if I tell you
01:00:39.380 that somebody's hallucinating
01:00:40.700 and they tell you
01:00:42.740 that they're a scientist
01:00:44.000 doing a great job,
01:00:46.240 I would listen to me.
01:00:49.820 If you swapped
01:00:50.900 out the personalities
01:00:51.760 and just took
01:00:52.520 some other person
01:00:54.040 who has this skill,
01:00:56.720 I would trust them.
01:01:03.820 I love watching
01:01:04.920 how the comments change
01:01:06.240 because I can tell
01:01:07.580 when an idea
01:01:08.200 is starting to settle in.
01:01:10.580 How many of you thought
01:01:11.900 that going into this whole
01:01:13.460 vaccination question
01:01:15.580 that I was subordinate
01:01:17.420 to the experts,
01:01:18.940 meaning that whatever I knew
01:01:20.780 was certainly
01:01:21.920 a lesser important skill
01:01:23.540 than whatever
01:01:24.720 the scientists knew?
01:01:27.140 You just assume that.
01:01:29.340 But in this very narrow area,
01:01:31.180 I am the expert,
01:01:32.120 not the scientist.
01:01:33.440 Scientists don't know hypnosis.
01:01:35.320 They don't know
01:01:35.880 when they're in an illusion.
01:01:38.200 That takes somebody
01:01:39.000 who knows how to do that.
01:01:40.820 I know how to do that.
01:01:42.720 In fact,
01:01:43.880 how many times
01:01:44.520 have I demonstrated
01:01:45.380 to you
01:01:45.900 intentionally triggering
01:01:47.620 somebody into cognitive dissonance?
01:01:50.400 How many times
01:01:51.160 have you watched me do it?
01:01:52.480 Like, I do it
01:01:53.100 as a live demonstration
01:01:54.280 in lots of different venues.
01:01:57.280 And you always get
01:01:58.320 the word salad
01:01:59.020 and the change of topic
01:02:00.200 as soon as it happens.
01:02:01.620 And now you can identify it too,
01:02:03.120 right?
01:02:03.660 I've taught you.
01:02:04.820 I've taught you
01:02:05.320 that as soon as you get
01:02:06.060 the word salad
01:02:06.780 where there's words
01:02:08.740 but they don't make sense,
01:02:09.740 that means they hit
01:02:10.780 their cognitive dissonance level.
01:02:12.560 If they change the topic
01:02:14.480 when it was just
01:02:15.220 a clean question,
01:02:16.760 well, why do you believe
01:02:17.600 this?
01:02:18.100 And then they go,
01:02:19.020 well, back in 1944,
01:02:21.140 you go, no, no.
01:02:23.140 That's cognitive dissonance.
01:02:25.540 If you can't answer
01:02:26.640 a simple direct question
01:02:27.980 because you know
01:02:29.100 it will trigger you,
01:02:30.200 it means you're already triggered.
01:02:33.480 And then there's also
01:02:34.620 the eyes.
01:02:36.560 Now,
01:02:37.460 I don't have science
01:02:39.960 to back this one
01:02:40.780 but I have a lot
01:02:42.220 of years of experience.
01:02:43.340 You can tell by the eyes
01:02:44.640 when somebody's hallucinating.
01:02:46.360 It's like really,
01:02:47.260 really clear.
01:02:49.140 And
01:02:49.260 Anomaly had those eyes.
01:02:52.740 He had the
01:02:53.640 hallucination eyes.
01:02:56.220 Now,
01:02:56.940 here's how I trigger people
01:02:59.040 into cognitive dissonance.
01:03:00.740 So I'm going to do it
01:03:01.580 to some of you right now.
01:03:03.140 Not all of you.
01:03:04.340 Anybody who's still here,
01:03:07.320 you can probably handle this.
01:03:08.500 But if there's anybody here
01:03:10.060 who should have left already,
01:03:11.420 I'm going to trigger you
01:03:13.240 into cognitive dissonance
01:03:14.360 right now.
01:03:15.500 Let's say you're a person
01:03:16.560 who believes that you can
01:03:17.560 do your own research.
01:03:19.160 And you did.
01:03:20.680 And you researched
01:03:21.460 the so-called vaccination
01:03:23.420 that's just a shot.
01:03:24.900 And you researched it
01:03:26.300 and you got the good answer.
01:03:28.600 And that's why
01:03:29.320 you got the good answer
01:03:30.520 because it's your good research.
01:03:31.740 Now,
01:03:33.740 here would be a question
01:03:34.740 I would ask
01:03:35.240 that would trigger
01:03:35.840 cognitive dissonance.
01:03:37.980 If you can't understand
01:03:39.560 my tweets,
01:03:41.580 why would you be able
01:03:43.860 to do that?
01:03:45.740 Because the level
01:03:47.040 of complexity
01:03:47.720 of understanding
01:03:48.780 my opinion
01:03:49.540 and reading my tweets
01:03:51.020 would be about a three
01:03:53.400 on a scale of one to ten.
01:03:54.560 But looking at research
01:03:57.980 and second-guessing
01:03:59.080 the experts
01:03:59.720 and seeing something
01:04:01.180 that they missed
01:04:01.980 when you don't have
01:04:02.700 that expertise especially,
01:04:04.400 that would not apply
01:04:05.840 to somebody like Brett
01:04:06.580 who does have the expertise.
01:04:08.100 But why would you think
01:04:09.080 that if you can't even
01:04:10.680 read my tweet
01:04:11.840 and know my clear,
01:04:13.880 clean opinion on it,
01:04:15.740 why do you think
01:04:16.320 you can do deep research
01:04:17.580 on a field
01:04:20.440 that's not necessarily
01:04:21.620 your field?
01:04:24.560 Now,
01:04:26.180 those of you
01:04:26.880 who have followed
01:04:28.200 the whole topic,
01:04:29.100 that doesn't trigger you.
01:04:31.240 But for some of you,
01:04:33.340 that just triggered
01:04:35.100 the fuck out of you
01:04:36.060 because you're thinking,
01:04:37.540 huh,
01:04:38.420 why do I think
01:04:39.360 I can do this
01:04:40.020 highly complicated thing
01:04:41.220 that even the experts
01:04:42.100 are getting wrong
01:04:42.760 while I can't read a tweet
01:04:45.140 and interpret it?
01:04:48.600 Now,
01:04:49.140 is that only because
01:04:50.100 you can't tell
01:04:50.700 when I'm kidding?
01:04:52.100 Have you seen
01:04:52.800 how many people
01:04:53.360 couldn't tell
01:04:54.000 if I was pranking?
01:04:56.180 If you can't even tell
01:04:57.660 if I'm pranking,
01:04:59.560 you're probably
01:05:00.620 not the person
01:05:01.480 that should be
01:05:02.540 researching this
01:05:03.360 and telling the rest of us.
01:05:05.680 Now, again,
01:05:06.480 I'm not saying
01:05:06.980 you got the wrong answer
01:05:07.900 because all the people
01:05:08.720 who are not vaccinated
01:05:09.740 as of today,
01:05:11.800 that's a pretty good
01:05:12.440 position to be in.
01:05:14.000 Pretty good position.
01:05:15.580 Well,
01:05:15.880 you might want to look
01:05:16.740 at an exchange
01:05:18.100 I had with
01:05:18.860 comic Dave Smith
01:05:20.860 who,
01:05:22.560 I don't want to say
01:05:23.500 he was an anti-vaxxer
01:05:24.880 because people
01:05:25.440 get mad at me
01:05:26.000 when I use that,
01:05:26.960 but I think he was,
01:05:28.020 let's say,
01:05:29.660 very hesitant
01:05:30.200 about the mandates
01:05:32.820 in general.
01:05:35.120 And when I
01:05:36.380 pushed a little bit
01:05:39.160 to see how he reached
01:05:40.400 his opinion
01:05:40.940 that the vaccinations
01:05:43.600 would be more
01:05:43.600 danger than good,
01:05:45.140 I asked him
01:05:46.680 how he figured out
01:05:48.040 his risk
01:05:48.720 of long COVID.
01:05:51.180 Now,
01:05:51.820 what happens
01:05:52.460 when you say
01:05:52.960 to somebody
01:05:53.500 who is convinced
01:05:54.640 they did a good,
01:05:56.380 logical,
01:05:58.040 analytical case,
01:05:59.800 and then you say
01:06:00.500 to them,
01:06:01.080 how did you judge
01:06:02.340 long COVID?
01:06:03.860 Because everybody
01:06:04.440 knows that we don't
01:06:05.420 know what that is.
01:06:07.620 And he said
01:06:08.620 that his odds
01:06:10.520 of getting long COVID
01:06:11.400 were extremely small,
01:06:13.520 or words to that effect,
01:06:14.560 extremely small.
01:06:16.780 In other words,
01:06:18.400 he guessed.
01:06:22.540 How do you
01:06:23.520 interpret that?
01:06:25.140 He didn't use
01:06:25.860 analysis,
01:06:26.900 he just said
01:06:27.500 my risk of long COVID
01:06:29.860 is very,
01:06:30.400 very small.
01:06:31.360 Based on what?
01:06:33.820 Because if you were
01:06:34.440 to read the news,
01:06:35.540 the news says
01:06:36.200 your risk is
01:06:36.760 anywhere from
01:06:37.280 15 to 60%.
01:06:39.560 That's what
01:06:41.800 the experts say.
01:06:43.300 But how did
01:06:44.300 a comedian,
01:06:45.800 who I don't think
01:06:46.840 is a scientist also,
01:06:48.460 how did he know
01:06:49.600 that his risk
01:06:50.200 was extremely small?
01:06:52.060 Now,
01:06:52.320 by the way,
01:06:52.700 he might be right.
01:06:56.880 Here's a comment from
01:06:57.640 megacropper.
01:06:58.520 So here,
01:06:59.100 this speaks to my point.
01:07:00.800 So megacropper says,
01:07:02.020 come on,
01:07:02.400 Scott,
01:07:02.700 you were a big
01:07:03.280 pharma simp
01:07:04.140 until you got
01:07:04.820 off the BP meds.
01:07:06.360 Now,
01:07:07.560 given that this person
01:07:08.700 has a complete
01:07:09.580 hallucination
01:07:10.520 of who I was
01:07:12.200 before BP meds
01:07:13.180 because he thinks
01:07:13.700 I was pro-vaccination,
01:07:15.540 which is the opposite
01:07:16.560 of what was happening.
01:07:18.340 How do you think
01:07:19.320 you can analyze
01:07:20.960 a complicated topic
01:07:22.220 of COVID
01:07:23.440 versus vaccinations
01:07:24.760 that are not
01:07:25.500 vaccinations
01:07:26.000 if you can't even
01:07:27.520 figure out
01:07:28.080 what my opinion is?
01:07:30.060 How do you do that?
01:07:32.060 I mean,
01:07:32.360 you're clearly wrong
01:07:33.120 about my opinion,
01:07:33.860 and I can say that
01:07:34.820 with certainty
01:07:35.780 because I'm me.
01:07:37.260 I don't have to
01:07:37.760 research that.
01:07:38.800 So I know that
01:07:39.520 you're wrong about me,
01:07:41.040 but why are you right
01:07:42.500 about your analysis
01:07:45.360 of things that are
01:07:46.140 outside your field?
01:07:47.760 Does that give you
01:07:48.760 any pause?
01:07:49.900 That you're so wrong
01:07:50.980 about something so easy,
01:07:52.760 so easy,
01:07:54.700 but did you believe
01:07:55.660 a comic that you saw
01:07:57.040 on the internet?
01:07:59.140 And that was
01:07:59.860 your research on me?
01:08:03.580 All right.
01:08:04.240 There was a,
01:08:07.300 I saw a tweet today,
01:08:08.660 there was a VA study
01:08:09.560 in 2021
01:08:10.240 that predicted
01:08:11.900 basically that
01:08:12.920 people getting COVID
01:08:14.300 were going to have
01:08:14.920 a big wave
01:08:15.760 of cardio problems.
01:08:17.660 And sure enough,
01:08:18.420 we are.
01:08:19.600 So it was a huge
01:08:20.540 veterans admin study
01:08:22.840 that had, you know,
01:08:23.640 just, I don't know,
01:08:24.280 millions of people,
01:08:24.980 I think.
01:08:25.740 And apparently
01:08:26.820 we know that
01:08:27.560 after any bad flu,
01:08:30.620 there's a high risk
01:08:31.800 of heart attacks.
01:08:34.240 So the evidence
01:08:35.980 from a massive study
01:08:37.880 is that every
01:08:38.880 demographic group
01:08:39.960 should have
01:08:41.400 cardio problems,
01:08:42.560 even the young people.
01:08:44.940 So the COVID itself
01:08:46.660 is well understood
01:08:48.440 to create
01:08:49.540 massive cardio problems.
01:08:52.360 Do you believe that?
01:08:54.180 I don't.
01:08:55.740 I don't.
01:08:56.240 Do you know
01:08:56.500 why I don't believe it?
01:08:57.300 Do you know
01:08:59.340 why I don't believe
01:09:00.260 that the virus itself
01:09:02.960 causes really fairly
01:09:05.960 extreme, I would say,
01:09:08.640 heart attack problems later?
01:09:10.760 I don't believe it.
01:09:13.300 And they compared it,
01:09:14.980 and by the way,
01:09:15.500 they also compared
01:09:16.220 vaccinated versus
01:09:17.180 unvaccinated.
01:09:18.440 That's your first question,
01:09:19.600 right?
01:09:20.440 Your first question is,
01:09:21.660 but Scott,
01:09:23.200 they were all vaccinated,
01:09:24.260 right?
01:09:25.660 And you're going
01:09:26.500 to tell me,
01:09:27.300 but really,
01:09:28.200 it was the vaccination
01:09:29.500 that's not really
01:09:30.480 a vaccination
01:09:30.900 that gave them
01:09:32.340 the cardio problems,
01:09:33.560 right?
01:09:33.920 Is that where
01:09:34.340 you're going to go?
01:09:35.600 Do you think
01:09:36.080 that they,
01:09:36.660 do you think
01:09:37.040 they looked at
01:09:37.600 the vaccinated
01:09:38.240 and the unvaccinated
01:09:39.260 separately?
01:09:41.040 What do you think?
01:09:42.580 Do you think
01:09:42.960 they sorted out
01:09:43.760 the vaccinated
01:09:44.500 from the unvaccinated?
01:09:46.160 Yes,
01:09:46.440 of course they did.
01:09:47.480 Of course they did.
01:09:49.260 And they found out
01:09:50.340 that it didn't matter
01:09:51.440 whether you were
01:09:51.880 vaccinated or not,
01:09:52.940 if you got COVID,
01:09:55.520 you definitely
01:09:56.020 had cardio problems.
01:09:57.560 So if you believe
01:09:59.560 this massive
01:10:00.780 Veterans Administration
01:10:02.180 study from 2021,
01:10:04.560 everything you're seeing
01:10:05.580 is exactly what
01:10:06.560 was predicted
01:10:07.100 by the experts.
01:10:09.920 And they've studied it
01:10:11.180 and found out
01:10:11.620 it wasn't the vaccinations,
01:10:13.240 it was the COVID.
01:10:15.080 Now how do I know
01:10:15.820 that that's not believable?
01:10:19.240 It's easy.
01:10:20.640 Because if this is true,
01:10:22.060 it would mean I made
01:10:23.220 the correct decisions.
01:10:25.560 And we know
01:10:26.260 that didn't happen.
01:10:28.260 Am I right?
01:10:29.840 This can't be true.
01:10:31.000 Because if it were true,
01:10:31.960 then everything I did
01:10:32.940 would be right.
01:10:34.220 And other people
01:10:35.240 who were my critics
01:10:35.920 would look wrong.
01:10:36.800 And we know
01:10:37.240 that's not the case.
01:10:38.560 Since we know
01:10:39.360 I'm wrong,
01:10:40.060 you can reason backwards
01:10:41.020 to know that
01:10:41.920 the study is wrong.
01:10:43.800 That's the way
01:10:44.360 I did it.
01:10:44.820 Here's another thing
01:10:50.780 people said to me.
01:10:51.720 So I've got
01:10:52.100 a little questions on this.
01:10:53.740 I said,
01:10:54.460 how did you know
01:10:56.640 that these vaccinations
01:10:58.400 would be more dangerous
01:10:59.480 than we hoped?
01:11:00.900 And people said,
01:11:01.900 you could tell
01:11:02.540 because of how hard
01:11:03.400 they were trying
01:11:03.860 to sell it,
01:11:04.480 that the extreme pressure
01:11:10.040 that they put on people,
01:11:11.540 mandates, laws,
01:11:13.340 the peer pressure,
01:11:14.700 the shame,
01:11:15.980 the penalties,
01:11:17.800 once you see
01:11:18.820 all that pressure
01:11:19.460 that people are putting on,
01:11:21.280 that's enough
01:11:22.100 to tell you
01:11:22.940 that it's dangerous.
01:11:25.420 Anybody agree with that?
01:11:27.120 How many would agree
01:11:28.120 that once you see that
01:11:29.400 you don't need
01:11:29.900 to know much else?
01:11:31.680 Yeah,
01:11:32.460 and the EUA
01:11:32.460 and the freedom
01:11:36.000 from lawsuits,
01:11:39.020 like that's all
01:11:39.620 you need to know,
01:11:40.260 right?
01:11:41.880 Okay.
01:11:44.460 How many of you
01:11:45.260 wear seatbelts,
01:11:46.540 which are mandated?
01:11:49.340 Do you wear seatbelts?
01:11:52.220 Because I think
01:11:53.000 I'm going to stop
01:11:53.700 wearing seatbelts,
01:11:55.800 sort of a protest.
01:11:58.060 Because once I realize
01:11:59.280 that I'm being
01:12:00.580 forced to do it,
01:12:02.800 I'm like,
01:12:03.560 hmm,
01:12:04.080 you're really trying
01:12:05.940 really, really hard
01:12:07.420 to make me wear
01:12:08.480 these seatbelts.
01:12:10.260 I think how hard
01:12:11.380 you're trying
01:12:11.960 is a tip-off,
01:12:15.620 that it's not based
01:12:16.780 on real science.
01:12:18.800 What are your comments?
01:12:19.780 Oh, somebody's saying
01:12:20.640 that seatbelts
01:12:21.280 have been tested
01:12:21.980 for 75 years.
01:12:24.540 Yeah.
01:12:25.980 Well, if you want
01:12:26.920 to believe that,
01:12:27.640 I believe that
01:12:29.760 it's how hard
01:12:30.480 they push
01:12:31.060 that matters.
01:12:33.160 But if you think
01:12:34.280 seatbelts are
01:12:35.360 a different situation,
01:12:36.700 then you're not
01:12:38.280 on the page
01:12:38.960 that says
01:12:39.500 how hard they push
01:12:40.520 is all you need,
01:12:41.220 right?
01:12:42.200 So would you agree
01:12:43.240 it's not the fact
01:12:44.280 that it's mandated?
01:12:46.780 Do you know
01:12:47.400 what else is
01:12:48.060 something that
01:12:49.060 the government
01:12:49.540 pushes really hard?
01:12:50.680 Every law.
01:12:54.260 Every law.
01:12:56.280 Every law.
01:12:57.760 You can go to jail
01:12:58.480 if you break them.
01:12:59.600 So it turns out
01:13:00.220 we live in a
01:13:00.940 horrible country
01:13:02.080 where the government
01:13:02.960 is forcing us
01:13:03.960 to do stuff
01:13:04.540 all the time.
01:13:05.720 Don't kill your neighbor.
01:13:07.040 Don't steal that money.
01:13:09.140 Don't say things like that.
01:13:10.820 You know,
01:13:11.680 and so I'm going
01:13:13.000 to look at everything
01:13:13.780 the government's
01:13:14.580 telling us to do
01:13:15.460 and I'm going to
01:13:16.260 start doing the opposite
01:13:17.140 because they wouldn't
01:13:18.680 tell you to do it
01:13:19.600 and force you to do it
01:13:20.740 at gunpoint
01:13:21.560 unless it wasn't real.
01:13:29.220 No?
01:13:30.540 All right.
01:13:32.640 You're not buying that.
01:13:34.500 You all saw
01:13:35.360 that there was
01:13:35.820 that huge study
01:13:37.180 that said that masks
01:13:38.660 were useless.
01:13:40.080 Did everybody see that?
01:13:41.680 It was like a massive
01:13:42.760 biggest study
01:13:43.820 that showed that masks
01:13:45.360 were just totally worthless.
01:13:47.380 Did you all see it?
01:13:48.100 It came out recently.
01:13:50.600 600,000 participants.
01:13:53.780 And they concluded
01:13:54.640 that masks made,
01:13:55.860 quote,
01:13:56.560 little to no difference
01:13:57.780 in preventing infection.
01:14:00.820 Yeah.
01:14:03.100 So we're done here, right?
01:14:05.480 Big, massive study.
01:14:07.080 It's the newest one.
01:14:09.020 No evidence
01:14:09.880 that masks work.
01:14:11.140 And we're done.
01:14:12.440 What kind of study
01:14:13.220 was this?
01:14:13.800 Oh, a meta-study.
01:14:15.120 A meta-study.
01:14:15.840 Now,
01:14:17.680 let me ask you this.
01:14:19.680 It's a meta-study.
01:14:20.840 What percentage
01:14:21.800 of meta-studies
01:14:22.860 in general
01:14:23.720 do you think
01:14:25.040 are valid?
01:14:27.160 Just the whole field
01:14:28.520 of meta-studies
01:14:29.380 of all topics
01:14:30.400 everywhere.
01:14:31.660 What percentage
01:14:32.680 usually are wrong?
01:14:36.800 If you didn't know that,
01:14:39.140 could you have
01:14:40.600 a reasonable opinion
01:14:41.840 about this mask study?
01:14:43.480 Knowing it's a meta-study.
01:14:45.840 Now, a meta-study
01:14:46.520 is where they look
01:14:47.200 at other existing studies
01:14:48.680 and they sort of
01:14:49.900 add them all together
01:14:50.980 and they take out
01:14:52.360 the ones that are,
01:14:53.380 you know,
01:14:53.780 most effective.
01:14:56.920 The answer
01:14:57.560 is almost all of them.
01:14:59.260 Yeah,
01:14:59.420 it's like 90%
01:15:00.180 or something.
01:15:01.040 If the only thing
01:15:01.920 you knew
01:15:02.220 was that it was
01:15:03.720 a meta-study,
01:15:04.480 what would be
01:15:05.560 your rational conclusion?
01:15:07.400 You don't know
01:15:07.880 anything else.
01:15:08.700 It's just a meta-study.
01:15:09.700 Probably fake.
01:15:13.020 Yeah, probably fake.
01:15:14.980 Exactly.
01:15:16.160 Now, that doesn't mean
01:15:16.960 masks work
01:15:17.820 because, in my opinion,
01:15:19.500 there's no signal
01:15:20.260 for them working.
01:15:21.820 No signal at all
01:15:23.140 in macro numbers.
01:15:26.060 So it doesn't really
01:15:26.800 justify that kind
01:15:28.000 of pain on people.
01:15:29.740 But if you believe
01:15:32.040 they don't work
01:15:32.880 because there was
01:15:34.340 a meta-study,
01:15:35.920 then you're being
01:15:37.380 non-scientific
01:15:38.460 in the sense that
01:15:40.540 as soon as you heard
01:15:41.440 it was a meta-study,
01:15:42.360 you should have said,
01:15:43.040 oh, that's probably fake.
01:15:44.960 Doesn't mean it is fake,
01:15:46.620 but you should have
01:15:47.560 played the odds.
01:15:49.020 Overwhelmingly,
01:15:49.480 the odds are it's fake.
01:15:51.300 Did you know that?
01:15:54.120 Because most,
01:15:54.960 a lot of the studies
01:15:55.720 that we're depending on
01:15:56.860 are meta-studies.
01:15:59.160 And most of them
01:16:00.100 are fake,
01:16:01.060 just in general.
01:16:02.960 That's an important
01:16:03.600 thing to know.
01:16:05.080 You know,
01:16:05.260 if the only thing
01:16:06.360 that I ever taught you,
01:16:08.080 ever,
01:16:09.920 was not to trust
01:16:11.080 meta-studies,
01:16:12.600 you would be
01:16:13.040 the smartest person
01:16:14.100 you know.
01:16:16.200 Because everybody
01:16:16.960 gets fooled by that.
01:16:17.980 And they just keep
01:16:19.080 pushing them on us
01:16:20.000 like they're real.
01:16:22.120 Some of them are real,
01:16:23.260 but you can't tell
01:16:23.920 which ones are real
01:16:24.520 in advance.
01:16:25.060 That's the problem.
01:16:28.980 All right.
01:16:29.800 Ladies and gentlemen,
01:16:30.660 I have only
01:16:36.520 one more comment.
01:16:40.760 Can you think
01:16:41.420 of any alternative
01:16:42.580 reason that our
01:16:44.280 government would have
01:16:45.080 tried so hard
01:16:45.860 to get you vaccinated?
01:16:47.860 Now, you've got
01:16:48.420 a lot of conspiracy
01:16:49.260 theories and et cetera,
01:16:51.420 but one reason
01:16:52.860 would be that they
01:16:53.980 know the shots
01:16:55.720 are dangerous,
01:16:56.760 so they have to
01:16:58.280 force you to do it.
01:16:59.140 What would be
01:16:59.900 another reason
01:17:00.740 that they would
01:17:01.520 try really,
01:17:02.380 really hard?
01:17:02.980 Money, okay.
01:17:03.940 What's another reason?
01:17:05.740 What would be
01:17:06.140 another reason?
01:17:11.420 Because they know
01:17:12.200 you're stupid,
01:17:13.020 control,
01:17:13.680 boredom,
01:17:15.180 cognitive dissonance,
01:17:17.380 they trusted it,
01:17:18.600 they were lied to,
01:17:20.000 yeah, maybe they
01:17:20.680 had good intentions,
01:17:21.480 but they're wrong,
01:17:22.160 that's possible.
01:17:22.580 A noble lie,
01:17:24.700 ego,
01:17:25.120 there is one
01:17:31.120 possibility that
01:17:32.120 you're completely
01:17:33.160 ignoring.
01:17:35.300 Maybe people
01:17:36.340 wanted to save
01:17:37.160 civilization?
01:17:40.280 But what about
01:17:41.200 that one?
01:17:42.980 Do we believe
01:17:43.860 that there was
01:17:44.340 literally nobody
01:17:45.160 in the entire
01:17:46.000 healthcare community,
01:17:47.380 the entire
01:17:48.240 healthcare community,
01:17:49.300 all the experts,
01:17:50.100 all the politicians,
01:17:51.040 there was nobody
01:17:51.760 there who wanted
01:17:52.360 to save civilization?
01:17:54.440 No?
01:17:59.620 Maybe not.
01:18:01.040 Maybe not.
01:18:02.680 Maybe not.
01:18:03.580 But I love
01:18:04.780 how cavalier
01:18:06.600 we are
01:18:07.080 with tossing
01:18:08.980 out the most
01:18:09.560 likely explanation.
01:18:12.260 If you put me
01:18:13.300 in charge
01:18:13.720 of the government
01:18:14.300 and I came
01:18:16.260 to believe
01:18:16.680 that the vaccinations
01:18:17.480 were helpful,
01:18:18.760 do you think
01:18:19.180 I wouldn't push
01:18:19.800 you pretty hard?
01:18:21.760 If I genuinely
01:18:22.920 believed it was
01:18:23.900 going to save
01:18:24.400 the country
01:18:24.880 and maybe
01:18:25.540 the world,
01:18:26.360 if I really
01:18:27.120 believed that,
01:18:28.400 you don't think
01:18:28.960 I would push
01:18:29.800 like a mofo
01:18:30.660 to save your life
01:18:32.200 and mine
01:18:32.580 at the same time?
01:18:33.500 Because remember,
01:18:34.100 I wouldn't be
01:18:34.520 just saving
01:18:35.060 your life,
01:18:36.380 I'd be saving
01:18:37.020 civilization.
01:18:38.180 And here's the argument.
01:18:39.880 We don't know
01:18:40.760 what would happen
01:18:41.540 if the vaccinations
01:18:45.080 had never been
01:18:46.300 rolled out,
01:18:48.340 do we?
01:18:49.680 You think you do,
01:18:50.780 don't you?
01:18:51.140 You think that
01:18:52.500 without vaccinations,
01:18:55.500 herd immunity
01:18:56.400 would have been
01:18:56.900 reached quickly
01:18:57.600 and then we
01:18:59.080 would be past it.
01:19:00.460 Is that what
01:19:01.160 you think?
01:19:02.440 Is that your belief?
01:19:03.660 That if we had not
01:19:04.620 treated it at all,
01:19:05.860 left everything open,
01:19:07.580 that we would have,
01:19:08.800 you know,
01:19:09.120 had a tough few months,
01:19:10.980 but then we would
01:19:11.880 be past it?
01:19:12.540 Because here's the thing
01:19:16.200 that you haven't
01:19:16.740 quite modeled.
01:19:19.040 Our system came
01:19:21.060 to the edge
01:19:21.760 of crashing
01:19:23.280 the healthcare system
01:19:25.140 and probably
01:19:28.300 vaccinations
01:19:28.940 are the only thing
01:19:29.720 that changed that.
01:19:31.260 Probably.
01:19:31.980 Don't know for sure.
01:19:34.000 So you can't really
01:19:35.260 say that everything
01:19:36.200 would have been fine
01:19:37.120 if we didn't do
01:19:37.860 vaccinations.
01:19:39.040 It might have been.
01:19:40.500 I wouldn't rule it out.
01:19:41.700 But I would say
01:19:42.840 that the evidence
01:19:43.540 suggests you don't
01:19:44.480 know what would
01:19:45.040 have happened.
01:19:46.320 You don't know.
01:19:47.780 What might have
01:19:48.640 happened is
01:19:49.200 I believe we were
01:19:50.560 very close
01:19:51.280 to complete disaster.
01:19:54.180 And complete disaster
01:19:55.520 would have looked
01:19:56.020 like we wouldn't
01:19:57.620 be able to keep
01:19:58.140 the lights on
01:19:58.780 because too many
01:19:59.420 people are sick
01:20:00.060 at the same time.
01:20:03.000 That's what I think.
01:20:04.080 I think we might
01:20:05.200 have been close
01:20:05.720 to not being able
01:20:06.420 to keep the lights on.
01:20:07.780 And it could have
01:20:08.660 been a massive problem.
01:20:10.300 I think our supply
01:20:11.640 chain would have
01:20:13.000 failed.
01:20:14.540 And there would
01:20:14.920 have been mass
01:20:15.440 starvation and
01:20:16.420 freezing to death
01:20:17.340 and stuff like that.
01:20:18.460 And I think we were
01:20:19.120 actually very close
01:20:21.000 to that.
01:20:22.580 So it's possible
01:20:23.880 that vaccinations
01:20:26.020 were terrible for us
01:20:27.780 and saved civilization.
01:20:33.300 I'm not saying
01:20:34.240 that's the case.
01:20:35.540 But I'm saying
01:20:36.480 that's very possible.
01:20:38.140 Very possible.
01:20:38.860 It's very possible
01:20:40.080 that these
01:20:40.620 so-called vaccinations
01:20:42.080 killed way more people
01:20:44.360 than any other
01:20:45.940 approved pharma product.
01:20:49.300 Totally possible.
01:20:50.820 At the same time,
01:20:51.960 it may have saved
01:20:52.600 civilization.
01:20:54.440 Don't know.
01:20:55.640 Doesn't mean we
01:20:56.540 should have done it.
01:20:57.840 Doesn't mean we
01:20:58.620 should have had mandates.
01:21:00.160 I'm anti-mandate.
01:21:01.640 But if you believe
01:21:02.740 you know what
01:21:03.380 would have happened,
01:21:04.560 I think that's just
01:21:05.640 silly.
01:21:07.620 You don't know
01:21:08.280 what would happen.
01:21:09.820 And of course,
01:21:10.800 to believe that you
01:21:11.720 do know what would
01:21:12.360 happen is to believe
01:21:14.360 that there was no
01:21:15.080 difference between
01:21:15.860 the vaccinated
01:21:16.480 and the unvaccinated.
01:21:18.560 And people do believe
01:21:19.560 that.
01:21:20.000 There are a lot of
01:21:20.400 people who believe
01:21:20.880 that nobody was
01:21:22.580 protected by the
01:21:23.500 shots.
01:21:25.280 How many of you
01:21:25.840 people believe that?
01:21:26.780 How many of you
01:21:27.440 believe that during
01:21:29.500 the alpha phase,
01:21:31.280 let's just take the
01:21:32.320 alpha phase,
01:21:33.220 during the alpha phase
01:21:34.380 vaccines only,
01:21:35.240 how many of you
01:21:35.860 believe that the data
01:21:38.500 is wrong and that
01:21:40.960 what appears to be a
01:21:42.440 massive benefit,
01:21:43.520 according to the
01:21:44.340 powers that be,
01:21:45.820 what appears to be a
01:21:46.760 massive benefit in
01:21:47.760 every place,
01:21:49.200 every place,
01:21:50.200 that they were used,
01:21:50.940 these particular ones,
01:21:52.480 that everybody is lying
01:21:55.280 about the data.
01:21:57.220 And how many of you
01:21:57.780 believe that everybody
01:21:58.580 in every country is
01:21:59.500 lying about the
01:22:00.440 effectiveness of the
01:22:01.400 vaccinations?
01:22:02.720 Some of you do.
01:22:04.700 And the thing is,
01:22:05.500 that's not impossible.
01:22:07.320 That's totally not
01:22:08.160 impossible.
01:22:09.780 It's totally within
01:22:10.980 the realm of
01:22:11.680 possibility that it
01:22:13.340 never worked.
01:22:15.820 Now, if we had not
01:22:17.820 lived through, you
01:22:18.600 know, the Russia
01:22:19.400 collusion and every
01:22:20.780 other hoax in the
01:22:21.540 world, that would be
01:22:22.780 a crazy thing to say.
01:22:24.920 It would just be
01:22:25.480 crazy to think that
01:22:26.960 we didn't notice it
01:22:27.820 didn't work.
01:22:28.740 Like every country
01:22:29.720 everywhere didn't
01:22:30.920 notice it didn't
01:22:31.380 work at all.
01:22:32.480 Nobody noticed.
01:22:34.000 That would be crazy
01:22:35.680 except in 2023.
01:22:38.140 And in 2023, you
01:22:39.260 could actually make
01:22:40.040 a story that says
01:22:41.680 that, well, you
01:22:45.240 could make a story
01:22:45.820 in 2023.
01:22:50.200 The survival rate is
01:22:51.460 so high, so the
01:22:52.140 number is not
01:22:52.820 believable.
01:22:56.080 I'm not sure you
01:22:56.920 did the math right
01:22:57.600 there.
01:22:58.920 The high survival
01:22:59.920 rate is not,
01:23:00.820 doesn't have much
01:23:02.440 to do with the
01:23:03.100 fact that there
01:23:03.720 would still be
01:23:04.160 enough to crash
01:23:04.840 the whole system.
01:23:07.940 Because just the
01:23:08.720 sickness would crash
01:23:09.560 the system, right?
01:23:10.700 You don't have to
01:23:11.420 die to make the
01:23:13.020 supply chain fail.
01:23:14.020 You just have to
01:23:14.600 not go to work.
01:23:21.780 Went from flatten
01:23:22.700 the curve to
01:23:23.400 flatten Trump.
01:23:24.160 Oh, so Al wants me to
01:23:29.200 answer this question.
01:23:29.900 So Al Carey says,
01:23:31.140 Scott won't answer
01:23:31.900 this question.
01:23:32.960 Is the Vax spike
01:23:34.420 protein toxic?
01:23:36.120 Is that a good
01:23:36.660 question for me?
01:23:39.320 Do you think that was
01:23:40.440 a question I should have
01:23:41.200 answered?
01:23:41.420 Well, let me answer
01:23:44.940 it.
01:23:45.780 Okay.
01:23:46.380 So for the guy who
01:23:48.260 can't understand my
01:23:49.340 opinions, let me do
01:23:50.560 my research.
01:23:52.120 Spike protein, spike
01:23:54.460 protein, oh, oh,
01:23:56.820 spike protein, oh, oh,
01:23:58.820 all the professionals
01:24:00.060 got one answer, but I
01:24:01.120 got the other answer.
01:24:01.940 Seriously, can you stop pretending that you think I can research deep science and guess something the experts missed?
01:24:17.160 Do you really believe I can do that?
01:24:19.940 Now, I'm willing to accept the wildly ridiculous notion that you can.
01:24:27.920 I'm willing to accept that.
01:24:29.800 I will enter your fantasy as hard as you want.
01:24:34.120 You can do that.
01:24:35.660 But why can't you teach me to do it?
01:24:37.920 This is where I'm confused.
01:24:40.420 There's so many of you, especially with lower levels of education, who can do this, and even though I have two advanced degrees that suggest I was trained to do it, I can't.
01:24:53.620 I guess I don't know how to compare things, even though my entire educational process was how to compare things correctly.
01:24:59.620 But apparently I didn't learn it.
01:25:01.060 Because I don't know how to do this.
01:25:02.800 But so many of you do.
01:25:04.240 It's really impressive.
01:25:12.640 It's called wisdom.
01:25:15.120 Do you know why science was invented?
01:25:19.520 Do you know why science was invented?
01:25:23.020 So that people would not say they could use their wisdom to make good decisions.
01:25:30.820 That's the whole point of science, is that you don't have wisdom.
01:25:34.700 If people had that kind of wisdom, oh, let me use my wisdom.
01:25:39.060 I haven't collected any data.
01:25:43.680 Use my wisdom.
01:25:46.040 Remote viewing.
01:25:47.500 I can see inside the laboratory.
01:25:49.880 I see their documents.
01:25:50.960 The level of self-deception involved in thinking you can look at the science, it makes me laugh every day.
01:26:05.420 It makes me laugh every day that you think I could develop those skills.
01:26:10.400 The entire reason science exists is because people like me can't do what you say I can do, or that you say you did.
01:26:17.840 But I accept that you did it.
01:26:20.300 I'm just really mad at you for not teaching me how.
01:26:23.760 Could you do it for the next thing?
01:26:26.580 So you did it for this.
01:26:28.800 Could you do it for the next thing?
01:26:32.140 I can't wait.
01:26:39.560 Somebody at Locals is embarrassed for me for mentioning I have advanced degrees.
01:26:47.380 You understand that to do this job you can't be that much of a pussy.
01:26:55.440 So that's why you do your job, which is watching, and I do my job of talking in public.
01:27:00.520 If you're going to be that much of a pussy, you can't mention something that's an obvious useful fact about the conversation.
01:27:08.100 If I tell you I have degrees that are relevant to the topic, to the exact topic, you don't think that that's worth mentioning because it would be too creepy, maybe it's like too much egotistical to mention that I have the exact kind of qualifications for this question.
01:27:30.860 You need to be much less of a pussy.
01:27:34.600 All right.
01:27:35.260 If you can't say the truth because it would be like, oh, too embarrassing, you got to get rid of that.
01:27:41.320 Elon tweeted something.
01:27:45.060 I'm getting some notices here.
01:27:48.260 I should see what Elon's up to.
01:27:53.220 I don't assume it's about me, is it?
01:27:57.680 Share ad revenues.
01:27:59.900 What?
01:28:01.900 Holy cow.
01:28:04.420 Whoa.
01:28:05.860 Was it about the sharing revenue part?
01:28:07.520 Did you see this?
01:28:11.980 Starting today, Twitter will share ad revenue with creators for ads that appear in their reply threads.
01:28:20.180 That's kind of brilliant.
01:28:22.420 Oh my God, that's brilliant.
01:28:25.480 Oh my God.
01:28:28.640 You know, what is the biggest complaint I have about Twitter?
01:28:33.360 My biggest complaint is that I spend like hours a day and it's not a source of revenue directly.
01:28:39.780 You know, obviously it's important to my business model.
01:28:43.060 But if it were directly related to my income, you couldn't keep me off it.
01:28:51.320 Oh, that's so smart.
01:28:53.660 Does anybody have any complaint about that?
01:28:57.160 Am I wrong that this is brilliant?
01:29:00.740 Am I wrong?
01:29:02.300 Does anybody see anything wrong with this?
01:29:03.980 Did you hear the, I just have to say this to the locals people.
01:29:18.340 I'm very aware that some of you don't want me to talk about the COVID decision-making process.
01:29:25.060 But I'm going to continue to talk about whatever I want.
01:29:28.840 Because that's the only way I'm useful.
01:29:30.360 As soon as I don't do that, I'm going to be really boring.
01:29:35.720 Trust me.
01:29:36.660 As soon as I do what I think you'll be okay with, if I only do what you'll be comfortable with, that's not anything that I want to be involved with.
01:29:45.280 So you do have a choice of subscribing or not, and that's the beauty of the locals system.
01:29:51.440 And I would encourage you not to subscribe if that's a problem to you.
01:29:54.500 All right.
01:29:58.140 So that was what you wanted me to look at, right?
01:30:02.580 So everybody's testing this, making their Twitter account private to see if, wow, that is a problem.
01:30:11.440 So I guess that's what you're talking about.
01:30:18.560 Oh, this is interesting.
01:30:22.840 Here's a little fight I may have started.
01:30:27.480 Maybe indirectly.
01:30:30.200 Remember Alex Epstein was doing some math and he said that it would be impossible to use solar for most of our energy.
01:30:38.940 And Elon Musk tweeted two days ago that wind and solar, combined with batteries, of course, will solve sustainable energy for Earth.
01:30:46.920 But I'd love to see him get into an actual math battle, do an actual math battle, and say, hey, Alex, show me your math, I'll show you mine.
01:30:57.960 Because I think that as much as I love Alex Epstein, okay, yeah.
01:31:13.920 All right, that's all for now.
01:31:15.260 I'm going to go do something else and I'll talk to you later on YouTube.
01:31:20.200 I'll talk to locals here for a minute.
01:31:21.900 Bye for now.
01:31:22.520 Greatest live stream of all time.