Real Coffee with Scott Adams - August 28, 2023


Episode 2214 Scott Adams: Science & Coin-Flipping In Tense Battle For Legitimacy & All News Is Fake


Episode Stats

Length

59 minutes

Words per Minute

155.89319

Word Count

9,248

Sentence Count

739

Misogynist Sentences

24

Hate Speech Sentences

19


Summary

Today's episode is a mashup of some of my favorite bits from a live stream I did on the Locals channel last night, including a story that went over really well on my Man Cave live stream.


Transcript

00:00:00.000 Do-do-do-do-do-do, do-do-do-do-do.
00:00:05.360 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:11.300 Today we're testing, also streaming live on X for the first time.
00:00:19.200 And we're doing it over on the locals channel, where they get a little extra before and after,
00:00:24.840 and lots of other extras other times.
00:00:26.400 And, of course, on YouTube.
00:00:28.040 Welcome, everybody.
00:00:30.000 Good to see you all.
00:00:31.980 All right.
00:00:32.960 I just noticed that if I turn my camera landscape, which is what I did,
00:00:38.420 the comments are not upside down, they're sideways.
00:00:43.500 So there might be a little more development that Elon needs to do on the live stream
00:00:49.240 because most people want to do it the way I'm doing it.
00:00:53.580 And that means that your comments are all sideways.
00:00:56.620 So if you can deal with that.
00:00:59.360 All right.
00:01:00.380 Everybody, if you'd like to take this triple platform experience
00:01:05.260 up to levels that nobody's even imagined could be possible,
00:01:09.520 well, all you need is a cup or a mug or a glass, a tankard, chalice, or stein,
00:01:13.460 a canteen jug or a flask, a vessel of any kind.
00:01:17.400 Fill it with your favorite liquid.
00:01:19.640 I like coffee.
00:01:21.080 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:01:24.200 the thing that makes everything better.
00:01:26.280 It's called the simultaneous sip.
00:01:28.840 And it happens now.
00:01:30.820 Go.
00:01:36.160 So good.
00:01:37.060 Well, I saw a tweet today from Sam Altman.
00:01:42.620 He's, you know, leading ChatGPT and other billionaire stuff.
00:01:49.440 One of the smartest people on the planet, Sam Altman.
00:01:52.220 And he says something very compatible with something I've said.
00:01:56.600 So he said, quote, in a tweet, give yourself a lot of shots to get lucky.
00:02:01.460 It's even better advice than it appears on the surface.
00:02:04.940 Luck isn't an independent variable, but increases super linearly with more surface area.
00:02:12.560 You meet more people, make more connections between new ideas, learn patterns, etc.
00:02:17.160 Now, what is my version of that from my book?
00:02:25.380 Reframe Your Brain.
00:02:26.860 Yeah, my version of that is that you can make luck happen by going where there's more of it.
00:02:34.880 So if you stay in your little small town and you know three neighbors and that's all you know of the entire world, what are the odds you're going to get lucky?
00:02:44.660 Because there just aren't many things happening.
00:02:46.520 There's not much luck to happen.
00:02:48.060 But if you go somewhere, let's say, an exciting industry and an exciting place and you meet more people and you network and you invite people over and you make more friends and you do all the things you know to put yourself in a greater energy situation, just more stuff happening.
00:03:05.880 That's where luck happens.
00:03:08.020 So the reframe is that the old view is that some people are lucky, some people are not.
00:03:14.320 And that's the end of the story.
00:03:15.560 But that's sort of a loser's frame.
00:03:19.600 The truth is that you can make your luck happen by going where there's more of it.
00:03:25.220 So that's your first little lesson.
00:03:27.780 And as Sam Altman says, although he talks about increasing your surface area superlinearly, I just say go where there's more stuff happening.
00:03:38.580 That's the dumb guy's version, my version.
00:03:41.480 All right.
00:03:43.000 Allow me to tell you a story that went over really well on my Man Cave live stream last night on Locals.
00:03:50.500 So I think you'll like it, too.
00:03:52.340 It goes like this.
00:03:53.920 There are two stories and then I'm going to connect them.
00:03:56.700 Story number one.
00:03:57.920 If you have children, especially teens, you may have had this experience.
00:04:02.720 That you think they're not listening to any of your wisdom because you're always trying to drop it in.
00:04:07.540 You know, you're trying to be subtle about it.
00:04:10.180 It's like, well, you know, in this situation, it's always a good idea to handle it this way.
00:04:14.700 You know, you're just sort of like slowly letting your magic of your wisdom seep in.
00:04:21.000 But you can never tell if it's working because they're not going to give you feedback.
00:04:26.460 Like the thing you're not going to hear is, my goodness, that's the best idea I've heard all day.
00:04:31.460 I've been alive for 10 years already and I don't think I've heard a finer idea.
00:04:36.480 I'm going to adopt your idea now because it seems so sound.
00:04:40.840 No, that doesn't happen.
00:04:42.480 That doesn't happen.
00:04:43.880 So just keep in mind that sometimes they're listening and absorbing, but you're not going to know right away.
00:04:50.480 All right, now take that story, put it aside, but don't forget it.
00:04:55.420 It's coming back.
00:04:57.560 So yesterday, I finally got my own copies of my own book because when you do independent publishing, you have to buy your own book.
00:05:05.500 If you do regular publishing, the publisher sends you a box of them.
00:05:09.440 You can, you know, give them to your friends and stuff.
00:05:11.740 But I had to wait for my own book to come just like you did, order it from Amazon.
00:05:16.460 And I was standing in the kitchen and I was flipping through it, you know, just to make sure it looked the way I wanted it to look.
00:05:24.520 It looked great.
00:05:25.540 And I sort of randomly picked out a chapter and just started reading it.
00:05:30.940 Now, which is weird because I wrote it, but it's been so long since I wrote it, it's actually like a new experience sometimes when I read it.
00:05:38.660 And the reframe that I was reading about, I'm going to share with you.
00:05:43.160 So you get a freebie.
00:05:44.640 Now, I swear, I swear this will change some of your lives in a small way, right?
00:05:51.460 This one's not going after a big thing.
00:05:53.780 This is going after a small annoyance and I'm going to completely solve it for you right now.
00:05:58.700 Watch how happy you are when you try this reframe.
00:06:02.460 And the reframe is to solve the problem of deciding where to eat with your partner.
00:06:08.880 Have you ever had that problem?
00:06:10.000 Where do you want to eat?
00:06:10.800 I don't care.
00:06:11.320 Where do you want to eat?
00:06:12.080 How about here?
00:06:12.720 No, right?
00:06:13.780 It's a continuous problem that we all have.
00:06:17.960 There is a solution.
00:06:20.300 And it requires a reframe.
00:06:22.320 The frame you've probably adopted is that you're trying to make a decision with somebody who's not good at making decisions, or who doesn't make them the way you'd like them to.
00:06:32.240 Now, let's take the classic, let's say, traditional stereotype situation of a boring, boring hetero couple.
00:06:40.900 They're so boring, they bore me.
00:06:42.080 And we'll just use that as an example.
00:06:45.440 Obviously, I can't speak for the LGBTQ community.
00:06:48.760 Maybe stuff, maybe it's a little different there.
00:06:51.240 I don't know.
00:06:52.020 Wouldn't know.
00:06:52.740 So I won't speak for them.
00:06:54.200 But in the hetero community, generally it goes like this.
00:06:58.160 The guy says, hey, do you want to eat at X place?
00:07:02.020 Because he wants to take charge.
00:07:05.320 Like guys, even women like it.
00:07:07.240 They say it all the time.
00:07:08.320 Women like it when the man will sometimes make a plan.
00:07:12.120 Because women are planning all day and often taking care of kids.
00:07:15.640 And, you know, so they're kind of planned out and they're kind of like it.
00:07:20.640 It would feel really good if the guy would just say, hey, I got a plan.
00:07:23.720 Let's do this.
00:07:25.140 So if you're a guy, you have two things you need to accomplish.
00:07:29.140 Number one, you have to be the take charge person who makes a plan for dinner.
00:07:33.940 But also, you've got to make sure that you don't do any decision making on your own because you're going to get clowned in about a minute.
00:07:41.280 You know you're going to get clowned.
00:07:42.520 So you can't make a decision, but also you must make a decision.
00:07:47.520 So that's impossible.
00:07:49.120 How do you make a decision and not make a decision at the same time?
00:07:52.400 That's the only way you can satisfy the situation.
00:07:55.900 So you reframe it.
00:07:57.700 Instead of saying, I'm trying to make an eating decision with a crazy person for which there is no solution.
00:08:05.180 Nobody's ever come up with a solution.
00:08:06.940 You do the Kobayashi Maru, Star Trek reference, and you reframe it so that the real problem is you have a puzzle to solve.
00:08:17.160 You're not deciding where to eat.
00:08:19.360 You're solving a puzzle, which is how do you take charge and not take charge at the same time.
00:08:24.260 That's the puzzle.
00:08:24.880 And here's how you do it.
00:08:28.240 You pick two places you're willing to eat, and you say to your partner, usually the guy saying to the wife in this classic stereotype situation,
00:08:37.800 and you say, let's go eat.
00:08:40.060 I suggest one of these two places.
00:08:42.620 Now, what you've done is you've taken charge.
00:08:44.800 You've said, let's go out.
00:08:46.180 You've narrowed it to two.
00:08:47.880 But still, still the wife has a choice.
00:08:50.860 Excellent, right?
00:08:51.940 She has a choice.
00:08:53.600 So then she says, oh, that's a good idea.
00:08:56.640 I love it.
00:08:57.080 You're taking charge.
00:08:57.860 She's thinking that, not saying it.
00:08:59.720 And then she says, let's go to this one.
00:09:02.160 And you say to yourself, wow, that worked pretty well.
00:09:05.220 Now, you get in the car.
00:09:07.320 If any of you are married, you can back me on this.
00:09:09.920 So she's chosen one of the two choices.
00:09:11.640 You get in the car, and then she says, after you start the car.
00:09:15.720 But you know, we have eaten at that last place kind of recently.
00:09:23.440 Yeah.
00:09:24.200 You know what would be even better than the two choices you gave me would be this third place I've been thinking about?
00:09:30.320 And then you go for the close.
00:09:33.060 You're the guy, and you've given two choices.
00:09:35.940 She said yes to one, and then she's doing an audible in the car.
00:09:39.580 Here's the closer.
00:09:42.000 Yes, that would be great.
00:09:43.320 I love it.
00:09:43.940 Boom.
00:09:46.960 Problem solved.
00:09:49.040 Yeah.
00:09:49.980 The woman says he took charge, and yet she still got to eat exactly what she wanted to eat.
00:09:55.660 Now, I've already heard from people who have tried it, and it totally works.
00:10:00.480 Right?
00:10:01.640 Now, remember this story and put it aside.
00:10:06.120 Hold.
00:10:06.780 Hold.
00:10:07.920 Let's go back to the other story.
00:10:10.540 Now I'm going to insert a third story between the two stories, and then I'm going to connect them together.
00:10:18.060 I swear this happened.
00:10:19.400 This really happened.
00:10:20.500 This is a real, real thing.
00:10:21.760 But as I was reading the part of my own book about how to decide where to eat, my stepdaughter comes walking down the hall, and we had made plans earlier to go out to lunch, but hadn't picked a place.
00:10:34.820 She walks directly up to me as I'm reading this section, and she says, we can eat at pastas or the railroad cafe.
00:10:44.420 Now, it's up to you to connect the two stories.
00:10:56.740 Apparently, she's been listening to me.
00:10:59.560 Apparently, she paid attention.
00:11:02.120 So what did I say?
00:11:04.460 I said, pastas.
00:11:07.700 And then we drove to pastas and had a perfectly great lunch.
00:11:10.820 So that's your reframe of the day.
00:11:14.500 This one is the most trivial one, in the sense that it's just getting rid of a little frustration.
00:11:20.920 But the book is full of ways to change your mental and physical health and your career and everything else.
00:11:26.840 Even your view of reality.
00:11:29.600 So that's your funny story of the day.
00:11:32.280 Let's do backward causation science.
00:11:34.740 Are you ready?
00:11:35.900 Saw a tweet by William Costello.
00:11:37.740 There's a study out of Harvard that tracked people over 84 years.
00:11:44.140 And what they found was that good relationships are the greatest predictor of happiness.
00:11:52.180 The people with the best relationships have the best happiness.
00:11:57.740 Huh.
00:11:59.020 That's quite a surprising result.
00:12:01.460 I wonder if there's any other way to look at the same set of data.
00:12:07.660 Oh, whoa.
00:12:09.500 Let's see.
00:12:10.460 You've got a choice of two people to mate with.
00:12:14.160 One is grumpy and unhappy, and one is happy.
00:12:17.420 Who to choose?
00:12:18.520 Who to choose?
00:12:19.900 Who to choose?
00:12:20.880 Grumpy and unhappy all the time, or happy all the time.
00:12:24.680 Other things about equal.
00:12:28.060 I don't know.
00:12:28.740 What do you think people do in that situation?
00:12:30.340 Backward science.
00:12:34.440 Have you ever seen so much backward science in your life?
00:12:38.260 Yeah, here's how you know backward science.
00:12:40.860 Every time somebody says doing something makes you healthier, or doing something makes you happier,
00:12:47.080 it's a lie.
00:12:49.280 Because the happy people do those things, and the healthy people do those things.
00:12:53.340 And it's not that doing those things makes you those things.
00:12:56.480 It's that people who are those things do those things.
00:12:58.980 Do you know what being healthy allows me to do?
00:13:04.080 Just take a guess.
00:13:05.380 So I'm in good health now.
00:13:06.740 Do you know what that allows me to do?
00:13:09.540 Exercise.
00:13:09.940 If you looked at it, you'd say, hey, he exercises, and he's healthy.
00:13:18.840 Well, there is a causation that way.
00:13:20.660 Of course there is.
00:13:22.280 But do unhealthy people exercise?
00:13:24.440 I mean, it's kind of hard.
00:13:26.260 So anyway, always look for the backwards causation.
00:13:29.380 Often it's a both ways causation, like exercising and being healthy.
00:13:33.220 You know, they cause each other.
00:13:35.460 But don't believe, just don't believe any science on these topics.
00:13:39.760 What makes you happier or healthier?
00:13:41.660 A lot of that's just BS.
00:13:43.900 All right.
00:13:44.640 Speaking of science, there's a new study that says these eco-friendly straws might be as bad or worse than regular straws.
00:13:54.220 Because the eco-friendly ones are adding some chemicals to the environment that we didn't have.
00:14:02.240 But the plastic ones maybe last forever.
00:14:05.140 The microplastics get in your water supply.
00:14:07.640 So I've been watching this pitched battle that we've had recently between science on one hand and flipping a frickin' coin on the other hand.
00:14:19.220 Now, when I was young, I feel like science was ahead.
00:14:23.980 Like it could stay ahead of the coin flip.
00:14:26.740 But I feel like that's at least reversed or maybe drawn even.
00:14:32.940 So whenever you see a study that's a binary, there are two things that are possible.
00:14:37.780 So in this case, there are two things possible.
00:14:40.280 Either the eco-straws were an improvement or they were not.
00:14:45.340 You could argue it was exactly the same, but that would be unusual.
00:14:49.220 So they're either an improvement or not.
00:14:52.560 So do you think that flipping a coin would have gotten you, on average, a worse result than studying it with your science?
00:15:01.960 Well, you'd love to think that the science would win every time, wouldn't you?
00:15:06.700 But I'm here to tell you it's sort of a really even battle between flipping coins and science when it comes to a lot of stuff.
00:15:14.880 Now, I should be quick to tell you that science starts out shaky.
00:15:22.800 And in a perfect world, it improves over time as people reproduce things and find different ways to test things.
00:15:29.080 And they get skeptical of things, et cetera.
00:15:31.640 So over time, it probably does crawl closer to the truth than whatever else we were using.
00:15:38.700 So that's good.
00:15:39.700 I don't recommend getting rid of science.
00:15:42.200 But for new stuff, new stuff like the straws, that's kind of new.
00:15:47.440 Coin flip.
00:15:49.720 All right.
00:15:50.020 Apparently, Canadians aren't allowed to have news.
00:15:55.600 I don't know.
00:15:56.020 Do you even care about the details?
00:15:59.340 Yeah.
00:16:00.020 The summary.
00:16:01.060 The summary is Canadians are not allowed to have news now.
00:16:04.960 Now, if you were worried that the Canadians were slowly losing their rights,
00:16:09.420 you know, I have to admit, I haven't really been super worried about Canada.
00:16:13.480 And the reason is I like Canadians.
00:16:17.080 And I think Canadians are, generally speaking, quite well-balanced, rational, you know, high-functioning people.
00:16:25.380 So I've been telling myself for a long time, well, the Canadians will work this out.
00:16:30.220 Right?
00:16:30.460 We don't need to go help the Canadians.
00:16:32.720 The Canadians are real, you know, smart, solid.
00:16:35.860 They've been running this good country for the longest time and doing a good job, in my opinion.
00:16:40.620 So, you know, they don't need any help from us.
00:16:45.320 They don't need my opinion.
00:16:47.580 Well, writer Gad Saad is reporting that he had a post blocked by Canadian Facebook for alleged misinformation.
00:16:59.180 But then there was a community note that says it was blocked,
00:17:02.680 but maybe not for the reason that he thinks, not for misinformation.
00:17:06.100 Now, the first part of the story is, imagine being a pretty famous author.
00:17:13.740 You know, Gad Saad, very famous.
00:17:15.880 Imagine that your work is blocked on social media and you don't even know why.
00:17:23.520 Because that's part of the story.
00:17:25.140 The point of the story is not that he's wrong.
00:17:27.280 It's that he didn't have a mechanism to know why.
00:17:30.820 Imagine being censored and you don't know why.
00:17:33.480 All right, here's the reason why, according to the community note.
00:17:37.980 Doesn't mean he's wrong, by the way, but I'll just tell you what the community note says.
00:17:44.300 The fact that this is real, I'm going to read this.
00:17:47.940 I'm just saying it's from the community notes.
00:17:50.800 I feel like it couldn't be real, but it might be.
00:17:55.640 I'll read it to you.
00:17:56.440 You decide.
00:17:57.640 Community notes.
00:17:58.520 The Canadian government has not forbidden posting news articles,
00:18:02.740 but did start requiring social media companies to share revenue when they link to news sites.
00:18:10.740 What?
00:18:13.160 In response, Facebook made a business decision not to allow display or posting of such links by Canadian users.
00:18:25.580 Really?
00:18:26.220 Seriously?
00:18:27.720 Seriously?
00:18:29.960 Seriously?
00:18:31.720 That Canada was so fucking dumb that they thought that Facebook would just start sharing the revenue with the news people that people link to?
00:18:41.440 That wasn't going to happen.
00:18:43.880 Of course that wasn't going to happen.
00:18:46.320 That would be the stupidest thing ever.
00:18:48.400 You know, you can make fun of Zuckerberg all day long for, you know, what his preferences are politically, et cetera.
00:18:58.700 You know, you can make fun of him for looking like an android.
00:19:02.120 But he's not an idiot.
00:19:04.240 He's not stupid.
00:19:05.880 He knows what a business model looks like.
00:19:08.760 Of course he's not going to do business with Canada under those conditions.
00:19:12.640 At least not the way they want it.
00:19:14.640 Of course not.
00:19:15.560 There's no head of a social media company that would have ever agreed to that.
00:19:20.600 So imagine this.
00:19:22.580 You're a Canadian using social media.
00:19:26.160 And on Facebook, you're not allowed to see news from your own country.
00:19:31.000 You can see news from other countries, but you're not allowed to see your own news.
00:19:36.240 Now, keep in mind that the links were links to drive your traffic to the news site.
00:19:44.160 Now, that would be advertising, we call it in this country, or marketing.
00:19:49.880 It's advertising or marketing when people from somewhere else are encouraged to go to your product.
00:19:58.140 But now that's not going to happen.
00:20:01.680 It's not illegal, but Canada decided to make it non-economic to look at the news.
00:20:08.380 That actually happened.
00:20:09.720 In the real world, Canada is making it un-economical to know what's happening in your country.
00:20:18.040 So, all of you Canadians, if you'd like to follow people like me on X,
00:20:22.980 you might find, for the first time, what's happening in your country.
00:20:27.820 I got some surprises for you.
00:20:30.120 You're not going to like it.
00:20:32.220 I'm just saying, you're not going to like it.
00:20:35.140 All right.
00:20:36.140 Poor Canadians.
00:20:37.000 Well, flying cars, I think, are here, although, as somebody said,
00:20:44.340 we should stop calling them flying cars,
00:20:47.380 because they're just tiny little personal quadricopters.
00:20:51.520 It's basically like a little helicopter with, you know,
00:20:54.040 four little electric motorized things like a drone.
00:20:59.280 So, it's not a car, but it's kind of a car.
00:21:01.380 And in the Dilbert comic, which you can't see unless you're a subscriber on X or on the Locals community,
00:21:09.780 Dilbert has ordered his own flying car.
00:21:13.480 He was trying to compete with a character named Topper,
00:21:16.840 who was bragging that he ordered a Cybertruck.
00:21:19.980 And a Cybertruck would be really, really cool.
00:21:23.160 But it's no flying car, is what I'm saying.
00:21:25.660 So, I want to get my Tesla flying car.
00:21:29.980 Apparently, China's got one that's flying around in prototype, and it looks pretty awesome.
00:21:35.500 The only thing that was keeping us from having private aircraft was the power of the battery,
00:21:44.800 which is now sufficient.
00:21:47.000 The processing speed, so that you could do a lot of processing to keep it stable,
00:21:51.740 which is, we got all that.
00:21:54.540 And I would imagine some kind of GPS, you know, we've had for a long time.
00:21:58.900 And then you would need the regulators to be okay with it.
00:22:02.300 But basically, everything that would be needed to have your own little flying personal vehicle,
00:22:08.340 it's here.
00:22:09.460 There's no technology left to design.
00:22:11.760 It's just productizing it and making it legal.
00:22:14.900 Now, the one I saw was kind of hilarious,
00:22:18.140 because it looked like the four wings on the quadcopter were external.
00:22:25.100 As in, if you walked over to it, it would chop you in half.
00:22:29.060 And they're kind of low, the same height as the car.
00:22:33.180 And I'm thinking to myself,
00:22:34.260 I'm positive that I've seen those where the blade thing was protected.
00:22:39.380 I feel like they have to at least add that, right?
00:22:42.280 You know, where there's a casing around the blade.
00:22:43.880 Well, I'm no inventor of flying planes or cars.
00:22:50.480 All right, DeSantis, I guess I missed this during the debate,
00:22:54.920 but DeSantis was proud of it.
00:22:56.620 So he was tweeting it around or X-ing it around.
00:23:00.160 At the debate, he said,
00:23:02.460 I'm not going to send troops to Ukraine,
00:23:05.040 but I am going to send them to our southern border.
00:23:07.500 When these drug smugglers bring fentanyl across the border,
00:23:11.220 we're going to leave them cold, stone-cold dead.
00:23:15.200 Applause, applause, applause, applause.
00:23:17.140 Sorry, too weak.
00:23:19.260 Too weak.
00:23:21.100 So weak.
00:23:22.660 You're going to wait for them to come across the border?
00:23:25.840 Seriously?
00:23:27.280 Like, if they're standing at one foot on the other side of the border,
00:23:30.100 they're like, nanny, nanny, nanny, nanny.
00:23:31.540 We'll be like, oh,
00:23:33.220 if only there was something we could do about that guy.
00:23:35.880 Really, that's our plan.
00:23:39.860 We're going to watch them, the coyotes,
00:23:42.200 bringing people to the border
00:23:43.580 and then stopping on their side while the immigrants are across.
00:23:47.800 And we're going to be okay with that?
00:23:49.820 No, we should be dropping them with sniper fire.
00:23:54.020 You don't think that would slow the flow?
00:23:57.300 We should have snipers taking out all the coyotes
00:24:00.760 as soon as we spot them.
00:24:02.180 Now, we can shoot from our side
00:24:03.720 if he wants to be like that.
00:24:05.880 But, you know, I'm not going to support...
00:24:08.100 I just can't support anybody who's weak on fentanyl.
00:24:10.900 This is weak.
00:24:12.240 You're going to stay on our side of the border
00:24:13.960 while they're coming over to kill us.
00:24:15.800 No, you go kill them where they're in their sleep.
00:24:19.040 Kill them in their sleep.
00:24:21.640 All right?
00:24:22.340 Bomb their facilities,
00:24:23.840 take their shit,
00:24:25.200 and make it their full-time job
00:24:27.400 trying not to get droned.
00:24:28.680 That should be their full-time job
00:24:30.840 if you're in the cartel,
00:24:32.240 looking up.
00:24:33.600 Your full-time job should be looking up
00:24:36.020 because we should have that whole place
00:24:37.800 blanketed with drones by now,
00:24:39.600 and we should just be raining death
00:24:41.400 on the cartels every single day
00:24:43.900 with no remorse.
00:24:45.480 Anything else is stupid and weak,
00:24:50.600 and we're just asking for what we get.
00:24:53.140 So, as much as I like DeSantis in a lot of ways,
00:24:56.440 I don't exactly...
00:24:57.760 You know, I suppose he's maybe tested it or something
00:25:00.700 and it sounded good in a poll.
00:25:02.020 I don't know.
00:25:02.660 I mean, I don't know why you would have this opinion, even.
00:25:06.820 All right.
00:25:07.580 I'm not going to talk a lot about masks.
00:25:09.480 I'm just going to tell you
00:25:10.300 the two worst takes on either side.
00:25:13.080 Okay?
00:25:14.160 So we're not going to talk about
00:25:15.440 what the good take is.
00:25:17.640 I'm just going to tell you
00:25:18.580 the ones I don't want to see anymore.
00:26:21.040 The worst take on the pro-mask side
00:25:22.880 is that the data shows that they work.
00:25:25.720 Do you agree?
00:25:27.040 The worst take on masks
00:25:29.080 is that the data shows they work.
00:25:33.100 At a population level.
00:25:35.380 At a population level.
00:25:37.020 Because there's no dependable data.
00:25:40.120 There's no dependable data
00:25:41.820 on anything, really.
00:25:43.140 But there's no dependable data
00:25:44.700 at a, let's say,
00:25:46.240 a city or a county level
00:25:47.760 that supports it.
00:25:49.680 So saying that there is,
00:25:51.140 I think it's the worst argument.
00:25:52.840 Now, to be clear,
00:25:54.640 I don't know what's true.
00:25:56.560 I just know that the data
00:25:57.920 doesn't support it.
00:25:59.140 So you don't ask citizens
00:26:00.420 to do something
00:26:01.320 that's quite extreme like that,
00:26:03.380 wearing masks everywhere.
00:26:04.700 That's pretty extreme.
00:26:05.520 If you can't demonstrate it
00:26:09.140 unambiguously,
00:26:11.980 that's a no-go.
00:26:13.380 Because freedom requires
00:26:14.880 that if you're going to take
00:26:16.480 any of it away,
00:26:17.900 you've got to have nailed down,
00:26:19.540 absolute,
00:26:21.120 incontrovertible science,
00:26:22.400 and we're nowhere near that.
00:26:24.180 In fact, we're not even
00:26:25.360 in an environment
00:26:26.740 in which incontrovertible science
00:26:29.060 would be useful.
00:26:30.200 It's not even useful.
00:26:31.080 Because there's no credibility
00:26:33.240 or faith or believability
00:26:35.300 from any organization
00:26:36.900 that would create data.
00:26:38.420 So in a situation
00:26:39.620 in which you can't know
00:26:40.840 what's true,
00:26:42.480 don't tell us that the data
00:26:43.700 says it's true
00:26:44.400 when we don't see it
00:26:45.620 at a city level
00:26:46.420 and you're taking
00:26:47.960 our freedoms away.
00:26:49.140 So it's a terrible argument
00:26:50.280 that you think they work
00:26:53.220 and therefore I should wear a mask.
00:26:54.720 Sorry.
00:26:55.900 What you think works
00:26:57.280 has no,
00:26:58.680 that should have no impact on me.
00:27:00.060 You know what should
00:27:01.260 make an impact on me?
00:27:03.540 What I think works.
00:27:05.080 That's it.
00:27:06.320 What I think works
00:27:07.300 should matter.
00:27:08.300 What you think works,
00:27:09.280 no, no bueno.
00:27:11.280 Here's the worst argument
00:27:12.580 on the anti-mask side.
00:27:14.840 I'm completely anti-mask.
00:27:17.120 No mandates.
00:27:18.020 If you want to do it,
00:27:18.740 do it.
00:27:19.660 Same with anything.
00:27:21.380 But no mandates.
00:27:23.900 And there's no wiggle room
00:27:25.300 on that.
00:27:26.300 I have no wiggle room,
00:27:28.000 no mandates on masks.
00:27:30.060 However,
00:27:31.160 I am embarrassed
00:27:32.500 to be on that point of view
00:27:34.120 when I keep seeing people
00:27:36.980 tweeting around
00:27:38.080 this thing that shows
00:27:39.880 the size of a virus
00:27:41.180 versus the size
00:27:42.960 of the holes in the mask.
00:27:45.000 If you don't say,
00:27:47.100 I intentionally left out
00:27:48.300 the size of the water droplets
00:27:49.880 so that I could fool you
00:27:51.900 into being scientifically illiterate,
00:27:55.180 well, that context is missing.
00:27:56.900 Let me ask you this.
00:27:59.260 Would you say that
00:27:59.940 Dr. Jordan Peterson
00:28:01.040 is very anti-mask?
00:28:03.080 Yes or no?
00:28:04.120 Dr. Jordan Peterson,
00:28:05.280 is he anti-mask?
00:28:06.280 Yes or no?
00:28:08.600 He's anti-mask.
00:28:10.780 Let me answer the question.
00:28:12.540 He's very anti-mask.
00:28:14.860 Now, would you say that also
00:28:16.020 he understands the topic
00:28:17.820 better than most people
00:28:19.420 because he's scientific by nature?
00:28:21.960 Would you say he's
00:28:23.040 one of the smartest people around
00:28:25.420 and he's looked into it enough?
00:28:27.480 He's got a good take on it, right?
00:28:29.440 Do you think you'll ever see
00:28:31.240 Dr. Jordan Peterson,
00:28:33.280 who agrees with you completely
00:28:34.940 about no-mask mandates,
00:28:37.420 do you think you'll ever see him
00:28:38.680 tweeting that little meme
00:28:41.300 about the size of a virus
00:28:42.660 compared to the size
00:28:43.520 of the holes in the mask?
00:28:45.040 No, you'll never see that
00:28:46.420 because that is not an argument
00:28:48.880 that smart people make
00:28:50.300 or informed people.
00:28:52.220 If you're uninformed,
00:28:53.280 do you think the virus
00:28:54.140 travels on its own?
00:28:55.900 If the virus left your mask
00:28:57.260 on its own
00:28:57.880 without any water droplet,
00:29:00.480 it would just fall to the ground
00:29:01.620 or be dead
00:29:03.760 because it had no water droplet.
00:29:05.200 It'd be dried out.
00:29:06.020 I don't know.
00:29:08.480 So here's the question I ask you.
00:29:11.720 If you were to measure
00:29:12.700 the amount of COVID
00:29:14.460 on the inside of a mask
00:29:15.860 for somebody who had COVID,
00:29:17.480 let's say they had COVID
00:29:18.400 and didn't know it,
00:29:19.760 if you measure the amount of COVID
00:29:21.560 on the inside of the mask,
00:29:22.880 would there be any?
00:29:26.260 Would you borrow a mask
00:29:27.620 from somebody who had COVID
00:29:28.720 if you knew they had COVID?
00:29:33.000 No.
00:29:34.360 So some of it's on the mask,
00:29:38.020 but clearly it doesn't seem
00:29:39.420 to make any difference
00:29:40.200 at a population level.
00:29:41.880 And probably the reason
00:29:42.680 that masking doesn't work
00:29:43.820 is that we all violated it at home.
00:29:46.460 Did anybody wear masks at home?
00:29:49.200 Even if people came over?
00:29:50.580 Even if you had guests?
00:29:51.560 Did you wear masks at home
00:29:52.480 during the pandemic?
00:29:53.680 No.
00:29:55.280 No.
00:29:56.040 Did you have,
00:29:56.760 did your kids have playdates
00:29:58.420 at your house
00:29:59.000 and nobody wore masks?
00:30:00.760 Yes.
00:30:01.860 Yes.
00:30:02.180 Many playdates.
00:30:03.560 No masks.
00:30:04.840 Yep.
00:30:05.940 Now, obviously,
00:30:06.760 the parents had a choice
00:30:07.720 of being in a mask,
00:30:09.620 sending their kids
00:30:10.580 to a maskless situation or not,
00:30:12.240 but nobody seemed
00:30:13.480 to have a problem with it.
00:30:17.400 In Spain,
00:30:18.220 people wore masks at home
00:30:19.440 and during Christmas.
00:30:20.600 I'm sorry.
00:30:23.040 I'm sorry about that.
00:30:25.540 Yeah, you can,
00:30:26.420 you can require masks
00:30:27.700 all you want,
00:30:28.460 but we're not going to wear them
00:30:29.200 at home,
00:30:29.800 so good luck with that.
00:30:30.760 And that's the last time
00:30:34.860 I want to talk about masks.
00:30:36.060 Until,
00:30:36.500 well,
00:30:37.120 the next time I'll talk about it
00:30:39.120 will be when I'm helping
00:30:40.500 to organize the destruction
00:30:41.920 of the first corporation
00:30:43.260 that requires it.
00:30:45.080 All right.
00:30:45.440 I don't want to go after
00:30:46.380 any healthcare organizations
00:30:47.760 because that's just messed up.
00:30:50.180 They got enough problems.
00:30:51.300 You know,
00:30:51.460 they're trying to fight
00:30:52.420 through a tough situation
00:30:53.380 if there is one.
00:30:55.420 I don't want to go after them.
00:30:57.180 And I'd rather go after people
00:30:58.780 who have customers
00:31:00.060 who have options,
00:31:01.040 like Bud Light.
00:31:02.640 You can decide
00:31:03.500 not to shop at Target,
00:31:05.260 but changing your whole
00:31:06.260 healthcare situation
00:31:07.280 is,
00:31:07.960 that's like more trouble
00:31:09.000 than the protests
00:31:10.520 would be worth.
00:31:11.720 So the first time
00:31:12.940 there's a public company
00:31:14.340 that requires masks,
00:31:16.880 of the customers at least,
00:31:19.020 you got to close them down.
00:31:20.540 You got to put them
00:31:21.020 right out of business.
00:31:21.980 You got to make it fast
00:31:23.000 and brutal
00:31:23.580 and you got to make sure
00:31:25.000 there's no ambiguity about it.
00:31:27.860 Short of that,
00:31:28.600 you deserve masks.
00:31:30.580 If you want to say
00:31:32.680 what would make us
00:31:33.640 deserve to be masked
00:31:34.940 and be slaves to
00:31:36.720 our overlords,
00:31:38.160 well,
00:31:38.500 not fighting it.
00:31:40.040 So if you don't shut down
00:31:41.040 the first corporation,
00:31:41.960 well,
00:31:43.620 you're asking for it.
00:31:45.460 All right,
00:31:45.760 there's a bunch of news
00:31:46.680 that I'll call
00:31:47.320 the black and white stuff,
00:31:49.200 meaning black people
00:31:50.280 and white people.
00:31:51.460 I'm supposed to get excited
00:31:52.840 about a shooter
00:31:54.080 because there was
00:31:54.940 a racial element.
00:31:56.140 I'm not.
00:31:58.040 I'm actually surprised
00:31:59.240 that there isn't
00:31:59.800 more racial violence.
00:32:01.060 Aren't you?
00:32:02.520 I'm not recommending it
00:32:03.660 in case anybody
00:32:04.560 was unclear on that.
00:32:06.580 But given the media landscape,
00:32:09.420 I'm surprised
00:32:09.960 there isn't a lot more.
00:32:10.860 And if there is
00:32:12.940 a lot more,
00:32:13.460 that would be
00:32:14.040 kind of predictable
00:32:15.560 because the news
00:32:16.940 is turning all the stories
00:32:18.020 into black-white stories.
00:32:19.680 What would be the,
00:32:20.680 what's the most,
00:32:21.640 the most obvious outcome
00:32:23.840 of stories
00:32:25.280 which focus on
00:32:26.360 a person of this type
00:32:27.540 killed a person
00:32:28.180 of that type
00:32:28.840 and vice versa?
00:32:30.580 Would be more of it,
00:32:32.020 of course.
00:32:32.760 Because do you know
00:32:33.520 what causes you to act?
00:32:35.340 There's a requirement
00:32:36.380 for action.
00:32:38.280 There's a few requirements,
00:32:39.400 but one of the requirements
00:32:40.340 is that you have
00:32:41.960 a thought
00:32:42.440 because we think
00:32:44.860 about the thing
00:32:45.360 we're going to do
00:32:46.020 and then we do it.
00:32:47.140 But if you'd never
00:32:47.800 thought of something,
00:32:49.460 if it had never been
00:32:50.260 in your mind
00:32:50.760 in the first place,
00:32:51.840 the odds of you doing it
00:32:53.020 are pretty low.
00:32:54.580 So the news
00:32:55.500 is putting thoughts
00:32:56.420 in our heads
00:32:56.960 of black-on-white crime
00:32:59.760 with these videos
00:33:00.580 and then white-on-black crime
00:33:02.880 with the mass shootings
00:33:03.760 and then we're doing
00:33:04.720 this weird math
00:33:05.700 where we're competing.
00:33:06.640 All right,
00:33:07.720 did the black people
00:33:08.580 kill more white people?
00:33:09.840 Well, it depends
00:33:10.460 if you're looking
00:33:10.940 at percentages
00:33:11.640 or you're looking
00:33:12.600 at raw numbers.
00:33:13.900 And how often
00:33:14.760 do the white people
00:33:15.420 kill the black people?
00:33:16.800 If you engage
00:33:17.920 in that conversation,
00:33:19.760 then you have been
00:33:21.320 basically owned
00:33:23.060 by the people
00:33:23.600 who like to own minds.
00:33:25.680 If you want to be
00:33:26.780 an independent thinker,
00:33:28.380 reject the stories.
00:33:29.940 Don't talk about them
00:33:31.140 in the way
00:33:32.060 that they frame them,
00:33:33.000 black and white.
00:33:34.120 There was an individual
00:33:35.220 who killed
00:33:35.780 some other individuals
00:33:36.700 and it's a huge tragedy.
00:33:40.360 Not greater than
00:33:41.380 any of the other tragedies
00:33:42.520 that happened today.
00:33:44.180 A tragedy.
00:33:45.180 You know,
00:33:45.360 it's a maximum tragedy.
00:33:46.820 But not bigger
00:33:47.740 than all the other ones.
00:33:49.940 Right?
00:33:50.260 Other people died
00:33:51.400 in terrible ways as well.
00:33:53.380 So I'm just not
00:33:54.380 going to participate
00:33:55.100 in that.
00:33:57.240 We'll all wait
00:33:57.760 for the...
00:33:59.100 Oh, the other story
00:34:00.000 that I think is bullshit.
00:34:01.620 You're seeing
00:34:02.060 a bunch of...
00:34:03.040 Trump is going
00:34:04.040 to be gaining
00:34:04.600 with black people
00:34:05.560 and maybe it's
00:34:07.100 because he was
00:34:07.900 unfairly accused.
00:34:09.480 Maybe it's
00:34:10.060 because Biden's
00:34:11.160 not delivering.
00:34:13.280 I don't believe
00:34:14.140 any of it.
00:34:15.200 Really.
00:34:16.140 And here's why
00:34:16.640 I don't believe
00:34:17.060 any of it.
00:34:18.300 It's not that
00:34:18.920 the polls are inaccurate.
00:34:20.100 I'm not saying that.
00:34:21.180 They might be.
00:34:21.980 They might not be.
00:34:22.780 I wouldn't know.
00:34:23.940 But there's going
00:34:25.320 to be another
00:34:25.800 George Floyd op.
00:34:28.680 So whatever you're
00:34:29.740 thinking about
00:34:30.280 racial relations now,
00:34:31.540 that will be
00:34:32.060 completely transformed
00:34:33.420 with the next
00:34:34.940 George Floyd.
00:34:36.700 So you know
00:34:37.680 that there could
00:34:38.280 have been
00:34:38.540 George Floyds
00:34:39.320 for four years
00:34:40.200 during Biden, right?
00:34:42.680 Easily there could
00:34:43.500 have been more
00:34:43.980 George Floyds.
00:34:44.980 I'm pretty sure
00:34:45.900 that whatever
00:34:47.280 problems there are
00:34:49.440 between police
00:34:50.180 and the black community,
00:34:51.560 I'm pretty sure
00:34:52.240 they didn't get fixed
00:34:53.260 when Biden
00:34:54.280 became president, right?
00:34:55.540 But the way
00:34:56.060 we treated them,
00:34:56.920 which ones we highlighted,
00:34:58.420 what the news
00:34:59.100 decided was news,
00:35:00.140 probably CIA
00:35:02.120 influence on the news.
00:35:05.420 So there will be
00:35:06.740 guaranteed
00:35:07.920 something that was
00:35:09.680 going to happen
00:35:10.100 on its own.
00:35:10.660 I don't think
00:35:11.060 it would be an op
00:35:11.800 to create a crime,
00:35:13.240 but I think
00:35:13.800 that some crime
00:35:14.500 that was going
00:35:14.900 to happen on its own
00:35:15.820 will have just
00:35:17.000 the right elements
00:35:17.860 to make it
00:35:18.980 the only thing
00:35:19.560 that's a story.
00:35:20.640 You know,
00:35:20.800 they'll George Floyd
00:35:21.740 it.
00:35:22.300 So it doesn't matter
00:35:23.100 what Trump's
00:35:23.680 numbers are today.
00:35:25.720 Those will be
00:35:26.400 wiped out
00:35:26.860 by the next
00:35:27.540 brainwashing operation,
00:35:29.880 and it's,
00:35:31.100 I mean,
00:35:31.360 you could just,
00:35:32.060 you could put it
00:35:33.620 on your calendar
00:35:34.220 practically.
00:35:35.320 It's guaranteed.
00:35:37.740 All right,
00:35:38.560 and then there's
00:35:39.640 a story about
00:35:40.100 the Maryland
00:35:40.940 schools,
00:35:42.260 and I guess
00:35:42.580 the Baltimore
00:35:43.020 schools in particular,
00:35:44.680 in which
00:35:45.320 there are
00:35:47.060 zero people
00:35:47.940 who learned
00:35:50.260 anything in school.
00:35:51.920 That's the summary,
00:35:53.180 and none of the
00:35:53.840 students learned
00:35:54.460 anything in their
00:35:55.100 schools.
00:35:55.760 They're all
00:35:56.200 failures.
00:35:56.640 to which I
00:35:58.560 say,
00:35:59.040 is that a
00:35:59.600 national story,
00:36:00.400 or is that
00:36:00.680 people getting
00:36:01.200 exactly what
00:36:01.880 they wanted?
00:36:05.040 They keep
00:36:05.680 voting for it.
00:36:06.560 They must
00:36:06.920 want it.
00:36:08.100 If they wanted
00:36:08.940 something different,
00:36:09.800 I imagine they
00:36:10.620 would go get it
00:36:11.140 for themselves.
00:36:12.460 Do you know
00:36:12.760 why we don't
00:36:13.900 have this problem
00:36:14.520 in my town?
00:36:16.280 So in my town
00:36:17.140 we have pretty
00:36:17.640 highly rated schools
00:36:18.740 by California
00:36:19.640 standards?
00:36:21.740 Because people
00:36:22.420 in my town
00:36:22.960 want good
00:36:23.540 schools.
00:36:25.240 That's it.
00:36:26.640 They want it.
00:36:28.620 That's the whole
00:36:29.560 story.
00:36:30.080 They just want it.
00:36:31.120 So they make
00:36:31.820 sure that they
00:36:32.280 get it.
00:36:33.820 Baltimore has
00:36:34.660 no functioning
00:36:35.340 schools,
00:36:36.160 as far as I
00:36:36.940 can tell,
00:36:37.940 but every
00:36:39.280 year they
00:36:40.000 vote,
00:36:41.320 they go vote
00:36:42.360 for it,
00:36:42.860 and they vote
00:36:43.360 for more of
00:36:43.920 this,
00:36:44.220 apparently.
00:36:45.180 So I don't
00:36:46.280 care about
00:36:46.680 Baltimore's
00:36:47.320 failing schools.
00:36:48.920 They're getting
00:36:49.400 what they ask
00:36:49.860 for.
00:36:50.780 And they ask
00:36:51.600 for it every
00:36:52.320 two to four
00:36:53.140 years.
00:36:54.020 They re-ask
00:36:54.860 for it,
00:36:55.400 and they re-ask
00:36:56.120 for it.
00:36:56.840 They know
00:36:57.340 what to do
00:36:57.680 to fix it.
00:36:58.420 They don't
00:36:58.700 want to.
00:37:00.400 I don't know
00:37:00.860 why.
00:37:01.860 And when I
00:37:02.280 say they
00:37:02.600 don't want
00:37:03.000 to,
00:37:03.240 that never
00:37:03.620 means everybody,
00:37:04.660 right?
00:37:05.140 It means
00:37:05.680 there's not
00:37:06.120 enough people
00:37:06.620 who care.
00:37:08.140 So what
00:37:09.020 should you
00:37:09.560 do about
00:37:10.260 that situation?
00:37:11.800 Should you
00:37:12.260 take some
00:37:12.760 of your money
00:37:13.400 that you
00:37:13.760 earned and
00:37:14.940 maybe try to
00:37:15.720 give it to
00:37:16.200 them?
00:37:17.180 No, that
00:37:17.600 would be
00:37:17.840 crazy.
00:37:18.840 You should
00:37:19.140 just avoid
00:37:20.280 it in any
00:37:20.900 way you can.
00:37:21.420 Just get
00:37:21.740 away.
00:37:22.920 Just go
00:37:24.000 where that's
00:37:24.420 not a problem.
00:37:25.000 Move to my
00:37:25.480 town.
00:37:25.720 My town
00:37:27.040 has good
00:37:27.400 schools.
00:37:28.120 Problem solved.
00:37:29.760 All right.
00:37:30.240 Yeah, move
00:37:30.660 away.
00:37:31.080 Let the cities
00:37:32.140 fail.
00:37:32.920 The faster the
00:37:33.760 cities fail,
00:37:34.580 the better off
00:37:35.060 we all will
00:37:35.660 be.
00:37:36.300 Because they
00:37:36.740 need to hit
00:37:37.160 bottom.
00:37:38.220 And apparently
00:37:38.760 they're not
00:37:39.140 close.
00:37:39.540 so they've
00:37:40.960 got to get
00:37:41.260 a lot worse.
00:37:42.900 So when
00:37:43.820 nobody in
00:37:44.360 Baltimore can
00:37:44.920 read and
00:37:46.200 everybody dies
00:37:47.020 by murder,
00:37:47.980 that would be
00:37:48.460 about the
00:37:49.020 time that
00:37:49.380 something will
00:37:49.760 change.
00:37:51.680 All right.
00:37:54.500 But I'll
00:37:55.000 tell you what
00:37:55.360 I'm not going
00:37:55.820 to do.
00:37:56.620 I'm not going
00:37:57.240 to make it
00:37:57.600 my problem.
00:38:00.480 Everybody
00:38:00.880 agree with
00:38:01.360 that?
00:38:02.260 I'm not going
00:38:02.900 to make that
00:38:03.280 my problem.
00:38:03.780 If you want
00:38:05.980 your problem
00:38:06.600 to be solved,
00:38:07.340 you need to
00:38:07.800 leave.
00:38:09.260 It's not
00:38:09.920 really the
00:38:10.440 hardest thing
00:38:11.100 in the world
00:38:11.540 to understand
00:38:12.420 what needs
00:38:12.840 to be done.
00:38:13.620 And if you
00:38:14.160 want to leave
00:38:14.840 and you
00:38:15.120 can't,
00:38:16.220 that's something
00:38:16.900 I would put
00:38:17.400 some energy
00:38:17.820 into.
00:38:18.560 If it turns
00:38:19.120 out there
00:38:19.420 were a lot
00:38:19.720 of people
00:38:20.020 who just
00:38:20.260 wanted to
00:38:20.620 get out
00:38:20.900 of their
00:38:21.100 bad situation
00:38:21.860 physically
00:38:22.380 and just
00:38:23.380 go anywhere,
00:38:24.420 just anywhere,
00:38:25.860 that there's
00:38:26.420 a real school,
00:38:28.100 I would be
00:38:28.640 behind throwing
00:38:30.100 some energy
00:38:30.780 and money
00:38:31.140 behind that.
00:38:32.180 But no,
00:38:32.700 I'm not going
00:38:33.060 to fix a
00:38:33.640 place where
00:38:34.100 the people
00:38:34.400 don't want
00:38:34.800 it fixed.
00:38:35.500 That's not
00:38:35.960 going to
00:38:36.080 happen.
00:38:37.360 All right.
00:38:38.520 I keep
00:38:39.260 seeing the
00:38:40.880 presidential race
00:38:42.080 like the
00:38:44.120 Avengers.
00:38:45.760 And what's
00:38:46.440 interesting is
00:38:47.140 the Democrats
00:38:48.660 are trying as
00:38:49.300 hard as
00:38:49.680 possible to
00:38:50.280 take down
00:38:50.720 Iron Man.
00:38:52.320 Let's say
00:38:52.760 that's Trump.
00:38:54.280 But what's
00:38:55.080 different this
00:38:55.640 time is that
00:38:56.380 it's not just
00:38:57.600 one superhero.
00:39:00.780 Because the
00:39:02.820 GOP also
00:39:03.600 has a
00:39:03.980 Hulk.
00:39:05.360 They don't
00:39:05.960 just have
00:39:06.400 Iron Man
00:39:06.840 anymore.
00:39:07.560 Now they
00:39:08.020 have a
00:39:08.260 Hulk.
00:39:09.260 If you
00:39:09.680 take Trump
00:39:10.180 down,
00:39:10.600 you're going
00:39:10.860 to get
00:39:11.120 Vivek.
00:39:12.440 You think
00:39:13.260 you're going
00:39:13.580 to get
00:39:13.800 DeSantis?
00:39:15.240 Surprise.
00:39:16.660 You're not.
00:39:18.100 You're not.
00:39:19.520 You're not
00:39:20.280 going to get
00:39:20.660 a surprise.
00:39:21.280 If Trump
00:39:22.080 goes down,
00:39:23.080 you're going
00:39:23.820 to get the
00:39:24.240 person who
00:39:24.720 is going
00:39:25.000 to fix it.
00:39:26.460 Do you
00:39:26.880 think that's
00:39:27.320 DeSantis?
00:39:28.660 Has he been
00:39:29.420 talking like
00:39:29.960 somebody who's
00:39:30.460 going to fix
00:39:31.040 it?
00:39:32.240 No.
00:39:33.000 No.
00:39:33.560 There's one
00:39:34.120 person who
00:39:34.600 has the
00:39:34.900 capability and
00:39:37.000 the clear
00:39:37.800 message that
00:39:38.600 he's going
00:39:39.320 to fix it.
00:39:40.060 Now you
00:39:40.360 say to
00:39:40.660 yourself,
00:39:41.060 but Scott,
00:39:41.900 I've heard
00:39:42.720 before that
00:39:43.300 people are
00:39:43.660 trying to
00:39:44.080 clear the
00:39:44.680 swamp.
00:39:45.840 We've heard
00:39:46.380 this before.
00:39:47.620 Do you
00:39:47.920 know the
00:39:48.880 only way to
00:39:49.600 clear the
00:39:50.060 swamp is
00:39:51.840 the way
00:39:52.140 Vivek is
00:39:52.780 suggesting?
00:39:53.800 And I
00:39:54.040 didn't realize
00:39:54.620 the genius
00:39:55.420 of this until
00:39:56.420 really just
00:39:57.420 yesterday I
00:39:58.140 figured it
00:39:58.480 out.
00:39:58.620 the only
00:40:00.140 way you
00:40:00.540 can take
00:40:00.980 a corrupt
00:40:01.600 organization
00:40:02.360 and make
00:40:03.480 it less
00:40:03.820 corrupt is
00:40:04.440 to make
00:40:04.740 it go
00:40:05.060 away.
00:40:06.540 And that's
00:40:06.940 what he
00:40:07.160 plans to
00:40:07.580 do.
00:40:08.080 He's going
00:40:08.440 to spread
00:40:08.780 the FBI's
00:40:09.600 functions across
00:40:10.480 other agencies.
00:40:11.660 He's going
00:40:12.060 to take the
00:40:12.480 Department of
00:40:13.140 Education and
00:40:13.920 take that
00:40:14.340 money and
00:40:14.800 give it to
00:40:15.220 the states.
00:40:16.300 You have
00:40:17.300 to take
00:40:17.720 away the
00:40:19.160 entire entity
00:40:19.960 if it's
00:40:21.220 that corrupt.
00:40:22.580 And I think
00:40:23.060 we've reached
00:40:23.440 the point where
00:40:23.880 the entity has
00:40:24.660 to go away,
00:40:25.320 but not the
00:40:25.900 function.
00:40:26.240 The function
00:40:27.720 needs to stay
00:40:28.460 exactly as
00:40:29.300 Vivek wants
00:40:29.840 to do.
00:40:30.640 So if you
00:40:31.240 have fewer
00:40:31.720 people, you
00:40:33.260 have fewer
00:40:33.800 places that
00:40:34.480 corruption can
00:40:35.160 hide.
00:40:36.120 If you have
00:40:36.620 less complexity,
00:40:38.220 do you know
00:40:38.500 how many people
00:40:39.000 just work in
00:40:39.620 that Hoover
00:40:41.840 building?
00:40:42.740 I mean, just
00:40:43.420 filled with
00:40:44.100 people doing
00:40:44.640 complicated things
00:40:45.660 that you don't
00:40:46.160 understand that
00:40:47.120 interact with
00:40:48.220 each other?
00:40:48.840 That's where all
00:40:49.560 corruption hides
00:40:50.280 in complexity.
00:40:51.760 But if you
00:40:52.200 were just to
00:40:52.660 simplify it, make
00:40:54.160 things more
00:40:54.700 local, make
00:40:55.900 them more
00:40:56.340 on point, the
00:40:59.020 field agents
00:41:00.640 would be
00:41:01.080 working on
00:41:01.620 discrete things
00:41:02.960 in different
00:41:04.220 organizations.
00:41:06.540 You're more
00:41:07.400 likely to find
00:41:08.520 corruption by
00:41:10.740 having fewer
00:41:11.380 people, more
00:41:12.200 simplification, and
00:41:13.740 bringing it closer
00:41:14.480 to the local
00:41:15.040 level.
00:41:16.540 So that's a
00:41:17.780 real plan.
00:41:19.020 So good luck
00:41:19.620 taking Iron Man
00:41:20.380 down, because
00:41:21.260 you don't want
00:41:22.900 the Hulk.
00:41:23.940 You really
00:41:24.300 don't.
00:41:25.900 All right, all
00:41:28.280 things being
00:41:28.820 equal, the
00:41:29.540 Democrats have
00:41:30.500 an impenetrable
00:41:31.860 advantage at
00:41:34.920 the moment.
00:41:35.640 Impenetrable.
00:41:36.580 So unless this
00:41:37.480 changed, Democrats
00:41:38.740 will just win
00:41:39.380 everything in
00:41:40.520 2024.
00:41:41.960 And this has
00:41:43.020 to do with the
00:41:43.520 fact that fear
00:41:44.920 persuasion is the
00:41:46.100 strongest.
00:41:47.680 Fear persuasion
00:41:48.800 just beats
00:41:49.320 everything.
00:41:49.660 Now, the
00:41:50.840 reason that
00:41:51.420 Trump won
00:41:52.240 the first
00:41:53.000 time is that
00:41:54.200 his fear
00:41:54.720 persuasion was
00:41:55.540 excellent.
00:41:56.580 You know, you
00:41:56.860 can say it's
00:41:57.780 inappropriate or
00:41:58.860 immoral or
00:42:00.060 whatever you want
00:42:00.600 to say.
00:42:01.640 But it was
00:42:02.180 effective, because
00:42:03.600 he said people
00:42:04.380 coming across the
00:42:05.180 border, you know,
00:42:06.480 too many criminals
00:42:07.460 and all that, and
00:42:08.400 got people scared.
00:42:09.940 And, you know,
00:42:10.580 basically you were
00:42:12.140 scared of everything
00:42:12.880 by the time he was
00:42:13.620 done.
00:42:14.780 Scared of China.
00:42:15.700 But, right now,
00:42:18.560 the Democrats
00:42:19.120 own fear.
00:42:20.440 So they've got
00:42:20.900 climate change
00:42:21.720 fear that doesn't
00:42:23.480 affect Republicans
00:42:24.260 too much, but
00:42:25.060 their base is all
00:42:25.980 about it.
00:42:27.100 They've got the
00:42:28.060 white supremacy
00:42:28.860 fear, or just
00:42:30.880 white men fear,
00:42:33.060 which is something
00:42:33.800 that the
00:42:34.520 Republicans can't
00:42:35.800 match.
00:42:37.000 You know, they
00:42:37.420 could try showing
00:42:38.280 lots of social
00:42:39.540 media videos of,
00:42:41.600 you know, criminal
00:42:42.600 behavior and stuff,
00:42:44.040 but it doesn't,
00:42:44.820 it's not really
00:42:45.480 getting through.
00:42:46.880 I think the
00:42:47.480 white supremacy
00:42:48.140 message by the
00:42:48.960 mainstream media
00:42:49.600 is pretty strong.
00:42:51.300 And then last,
00:42:54.080 women losing
00:42:55.200 bodily autonomy,
00:42:56.780 as they would
00:42:57.600 say, over
00:42:59.660 abortion.
00:43:01.380 Imagine if
00:43:02.100 somebody said,
00:43:02.920 we're going to,
00:43:04.080 the state is
00:43:04.960 going to have
00:43:05.420 control over
00:43:06.640 your body.
00:43:08.600 Well, that's
00:43:09.260 what women are
00:43:09.780 imagining right
00:43:10.380 now.
00:43:11.120 You don't have
00:43:11.560 to imagine it.
00:43:12.940 That's actually
00:43:13.580 what the state is
00:43:14.180 saying.
00:43:14.460 And by the way,
00:43:14.940 if you think
00:43:16.080 that I'm
00:43:16.400 giving you my
00:43:17.120 opinion on
00:43:17.740 abortion, that
00:43:18.400 won't be here.
00:43:19.400 I don't do
00:43:19.900 that.
00:43:20.740 I let women
00:43:21.460 take the lead
00:43:22.040 on abortion.
00:43:23.360 Whatever they
00:43:24.120 collectively decide
00:43:25.140 in their state,
00:43:26.720 I back the
00:43:28.200 women.
00:43:28.900 So I'm not
00:43:29.380 backing Republicans
00:43:30.260 or Democrats.
00:43:32.460 I'm not backing
00:43:33.560 pro-abortion or
00:43:34.640 anti-abortion.
00:43:35.740 I'm backing
00:43:36.420 women, because
00:43:37.940 this is an
00:43:38.460 impossible decision,
00:43:40.500 and you need to
00:43:41.400 at least make
00:43:42.180 the people with
00:43:43.780 the most skin
00:43:44.300 in the game.
00:43:45.360 At least they
00:43:45.920 should have to
00:43:46.300 take a lead.
00:43:47.140 Now, I'm not
00:43:47.800 saying you should
00:43:48.220 give up your
00:43:48.700 vote.
00:43:49.560 If you want to
00:43:50.020 vote, go vote.
00:43:51.080 You have that
00:43:51.500 right.
00:43:52.200 I'm just saying
00:43:52.780 for me, I can't
00:43:55.020 justify being
00:43:56.900 persuasive in this
00:43:58.100 domain.
00:43:59.020 I want to be
00:43:59.800 anti-persuasive on
00:44:01.500 this and let women
00:44:02.300 take the lead.
00:44:03.060 Anyway, but if you're
00:44:05.100 a woman, that would
00:44:05.700 be one of the
00:44:06.240 scariest things.
00:44:07.040 I can imagine if I
00:44:07.860 put myself in that
00:44:08.660 position, which I'm
00:44:09.520 not, if you told
00:44:11.960 me somebody was
00:44:12.580 going to take
00:44:12.960 away one of my
00:44:13.920 ways to manage my
00:44:15.920 life, and I didn't
00:44:17.720 think that a fetus
00:44:18.920 was a human life,
00:44:19.900 let's take that
00:44:20.840 assumption, I'd be
00:44:22.620 pretty scared.
00:44:24.160 So what is it that
00:44:25.140 the Republicans have
00:44:27.180 to offer, fear-wise,
00:44:29.440 that would match, at
00:44:30.620 least for the voters
00:44:31.720 on the other side?
00:44:32.580 Obviously, the
00:44:33.180 Republicans have
00:44:33.960 Republican votes
00:44:34.860 already.
00:44:35.700 But if they want to
00:44:36.680 influence the other
00:44:37.300 side, what do they
00:44:38.720 have?
00:44:39.940 Crime.
00:44:42.980 COVID lockdowns.
00:44:46.420 Poverty is too
00:44:47.280 slow-moving.
00:44:49.600 Racism.
00:44:51.660 No, racism is what the left is using.
00:44:56.060 Zombie cities.
00:44:58.380 I don't know.
00:44:59.360 See, the trouble is
00:45:00.040 the cities don't care
00:45:01.020 about themselves
00:45:01.680 enough.
00:45:03.140 If the cities cared
00:45:04.420 about themselves,
00:45:05.300 they would have
00:45:05.600 already fixed it.
00:45:07.280 So it's hard for me to care about, say, San Francisco. I honestly don't.
00:45:12.940 You know, I live
00:45:13.880 near it.
00:45:14.660 It's like a major
00:45:15.680 part of my whole
00:45:16.360 life.
00:45:17.280 But San Francisco
00:45:18.160 wants to be what
00:45:19.240 it is.
00:45:21.000 The moment they
00:45:22.160 decide not to be
00:45:23.140 that, they'll change
00:45:23.960 it.
00:45:24.560 So it's not up to me
00:45:25.760 to tell them that
00:45:26.720 they can't walk in
00:45:27.760 feces when they go
00:45:29.280 outside if they want
00:45:30.160 to.
00:45:30.340 And obviously this is
00:45:33.060 hyperbole.
00:45:33.980 They don't all want
00:45:35.060 to.
00:45:35.520 Yeah, I get that.
00:45:36.500 But they're acting
00:45:37.240 in the way that gives
00:45:38.860 them one result.
00:45:40.240 If you want to get a
00:45:41.140 different result, act
00:45:42.300 differently, but don't
00:45:43.440 make it my problem.
00:45:45.600 I'm happily living away
00:45:46.980 from San Francisco.
00:45:48.360 So, you know, work it
00:45:49.800 out yourself.
00:45:50.720 So no, that doesn't
00:45:51.460 give me any fear.
00:45:53.200 I think that's just
00:45:54.180 evolution of, you know,
00:45:55.780 it's just change.
00:45:59.520 So inflation, you
00:46:01.140 know, if you look at
00:46:01.820 what the Republicans
00:46:03.160 like to push, they
00:46:04.500 push, you know,
00:46:05.160 inflation is bad.
00:46:06.260 That's true.
00:46:06.860 Nobody likes inflation.
00:46:08.360 But it doesn't give
00:46:09.480 you a visceral fear.
00:46:11.460 It makes you feel bad,
00:46:13.120 but you're not, like,
00:46:14.300 afraid it's going to
00:46:15.020 come kill you.
00:46:17.160 Climate change,
00:46:18.100 people think, is going
00:46:18.860 to actually kill them.
00:46:20.960 Right?
00:46:21.460 White supremacy, if you
00:46:23.040 watch CNN or MSNBC,
00:46:24.760 there's a whole bunch
00:46:26.040 of people who think
00:46:26.580 the white supremacists
00:46:27.480 are arming up and
00:46:28.420 are coming for you
00:46:29.240 or something.
00:46:31.020 But, you know,
00:46:31.520 that's not true.
00:46:32.900 I mean, not at scale.
00:46:36.220 Anyway, so keep an eye
00:46:37.560 on that.
00:46:37.980 The really predictive
00:46:39.120 feature is who has
00:46:40.820 the better fear
00:46:41.500 persuasion, and right
00:46:43.360 now, that's all
00:46:44.700 Democrats.
00:46:45.960 Big advantage.
00:46:47.740 There's another story
00:46:48.600 that Kamala Harris
00:46:49.360 tried to talk in
00:46:50.480 public again.
00:46:53.560 Summary.
00:46:54.760 Didn't go well.
00:46:57.860 Yeah, that's the
00:46:59.160 whole story there.
00:47:00.500 Yep, Kamala Harris
00:47:01.400 trying to talk
00:47:02.460 coherently in public.
00:47:04.180 Didn't go well.
00:47:06.340 Here's more evidence
00:47:07.360 that the Democrats
00:47:09.840 are an organized
00:47:10.780 crime party.
00:47:12.040 So there's emails
00:47:12.880 now that show
00:47:13.640 that U.S.
00:47:14.480 Attorney Weiss
00:47:15.220 talked to the
00:47:16.960 DOJ to thwart
00:47:19.140 congressional
00:47:19.920 questioning.
00:47:20.860 That's the charge.
00:47:21.700 But once again,
00:47:22.760 we have entities
00:47:23.940 in the government,
00:47:25.120 Democrats,
00:47:25.920 who are talking
00:47:27.280 to each other
00:47:27.940 in a way that
00:47:28.700 appears on the
00:47:29.760 surface to be
00:47:31.280 organized and
00:47:32.300 coordinated for
00:47:33.680 illegitimate purposes.
00:47:36.640 Now, as I've said
00:47:38.220 before, the Democrat
00:47:39.560 party has become
00:47:40.560 an organized crime
00:47:41.860 party, which is not
00:47:43.560 to say that the
00:47:44.460 individuals in it
00:47:45.580 are worse than the
00:47:47.580 individuals who are
00:47:48.400 Republican or
00:47:49.220 independent.
00:47:49.680 So here now, I'm
00:47:53.060 not talking about
00:47:53.780 the voters.
00:47:55.120 I'm talking about
00:47:55.600 the leadership.
00:47:57.320 Here's what I think
00:47:58.500 is a key difference,
00:48:00.060 just observationally,
00:48:01.500 between the Democrats,
00:48:03.860 the party, and the
00:48:05.220 Republican party.
00:48:06.760 See if this tracks
00:48:07.720 with your observation.
00:48:09.500 So this is not a
00:48:10.240 science, just
00:48:10.800 observation.
00:48:12.320 It seems to me
00:48:13.280 that when Republicans
00:48:14.900 are pushing bullshit,
00:48:16.960 which happens a lot,
00:48:18.020 it's coming from
00:48:19.760 citizens and
00:48:21.020 sometimes boiling
00:48:22.000 up until you'll
00:48:23.580 see somebody in
00:48:24.280 Congress, you know,
00:48:25.000 some of the more
00:48:25.560 controversial ones,
00:48:26.560 it'll come out of
00:48:27.040 their mouths.
00:48:28.100 But basically,
00:48:29.800 you know, everything
00:48:30.400 from Pizzagate to
00:48:32.240 you name it,
00:48:33.520 started with
00:48:34.600 citizens who
00:48:36.140 happen to be
00:48:36.640 Republican.
00:48:37.840 You know, maybe
00:48:38.400 it was on Reddit
00:48:39.080 or 4chan, and
00:48:40.680 then it bubbles
00:48:41.180 up, and then
00:48:42.240 maybe it comes out
00:48:43.100 of somebody's
00:48:43.600 mouth.
00:48:44.500 Right?
00:48:44.740 But, when
00:48:48.120 the Democrats
00:48:48.780 run an op, it
00:48:50.360 is very clearly
00:48:51.240 organized by the
00:48:52.240 leadership, and
00:48:53.620 then trickles down
00:48:54.520 through the news to
00:48:55.420 the rest of us.
00:48:56.880 So one of them is
00:48:57.920 a bullshit bottom-up
00:48:59.380 party, and the
00:49:00.380 other is a bullshit
00:49:01.320 top-down party.
00:49:03.880 And it's very
00:49:04.740 consistent.
00:49:05.600 So I'll give you an
00:49:06.300 example.
00:49:06.800 On the Democrat
00:49:07.520 side, you've got
00:49:08.740 your Russia collusion
00:49:09.720 hoax, you know,
00:49:11.000 coordinated across
00:49:11.960 multiple entities.
00:49:12.860 You've got your
00:49:14.120 Hunter Biden
00:49:15.140 laptop hoax, again,
00:49:18.080 coordinated in a
00:49:19.120 sense, you know,
00:49:19.880 the media was in
00:49:20.660 on it, the
00:49:21.280 Democrats were in
00:49:22.180 on it, the Intel
00:49:23.000 operations were in
00:49:23.940 on it.
00:49:26.460 You've seen the
00:49:27.600 prosecutions.
00:49:29.400 So the prosecution
00:49:30.180 depends on the
00:49:33.000 organized nature of
00:49:34.360 the crime on the
00:49:35.420 Democrat side.
00:49:36.560 If it were not an
00:49:38.320 organized approach
00:49:40.240 to take Trump
00:49:40.920 down, what would
00:49:42.160 the news have
00:49:42.820 reported about
00:49:43.820 his perfect phone
00:49:44.960 call to find
00:49:45.760 votes?
00:49:46.860 If it were not
00:49:48.380 an organized
00:49:49.140 attempt, what
00:49:50.960 would the news
00:49:51.580 say if they were
00:49:52.260 just describing it
00:49:53.260 objectively?
00:49:55.120 They would say
00:49:56.060 people question
00:49:57.020 elections all the
00:49:57.860 time, perfectly
00:49:59.100 legal.
00:50:00.380 "Find votes," in the context of thinking that an audit would show he had actually won, was just talking. That's what they would call it.
00:50:09.200 Well, there was
00:50:10.580 an example of a
00:50:11.560 phone call, and
00:50:12.340 it involved some
00:50:13.080 talking.
00:50:14.560 That's literally
00:50:15.300 the only thing
00:50:16.040 that happened.
00:50:17.100 But because it's
00:50:18.160 an organized event,
00:50:20.360 not only do you
00:50:21.220 have the media
00:50:21.820 covering for it,
00:50:22.820 the Democrats
00:50:23.460 covering for it,
00:50:24.520 but you have
00:50:24.940 actual prosecutors
00:50:26.040 who have arrested
00:50:28.100 and indicted the
00:50:30.940 probable next
00:50:31.860 president of the
00:50:32.820 United States on
00:50:34.240 what is clearly an
00:50:35.200 organized op.
00:50:36.060 And how about
00:50:37.940 the Hunter, or
00:50:40.420 the, let's say,
00:50:41.320 the Biden crime
00:50:42.040 family allegations?
00:50:44.640 Do you think that
00:50:45.640 the Democrats are
00:50:46.500 coordinated in
00:50:48.000 covering up the
00:50:48.880 importance of that
00:50:49.700 story?
00:50:50.100 Of course.
00:50:50.960 It's very
00:50:51.340 coordinated.
00:50:52.180 The media, the
00:50:53.060 Democrats, probably
00:50:54.460 the intel people,
00:50:55.420 all is very
00:50:56.600 coordinated.
00:50:58.120 But do you see
00:50:59.120 the difference now?
00:51:01.120 Have I told you that the Democrats are organized, that they actually plan and execute multi-agency operations that are gaslighting operations?
00:51:13.460 Republicans, too,
00:51:14.960 gaslight themselves
00:51:15.820 and try to gaslight
00:51:16.820 other people, but
00:51:17.860 it almost always
00:51:18.760 starts on 4chan or
00:51:20.520 Reddit or someplace
00:51:21.320 like that.
00:51:22.620 You know, many of you believe that, because 4chan started a rumor about me, all of my pandemic opinions were reversed by 4chan, and then that caused me this huge problem: people thought my opinions were the opposite of what they are.
00:51:39.580 And that didn't
00:51:41.260 come from the
00:51:41.780 top, that came
00:51:43.660 from just people.
00:51:45.460 Just people who
00:51:46.400 were probably on
00:51:47.100 4chan and Reddit
00:51:48.060 and stuff like
00:51:48.680 that.
00:51:49.560 So once you see
00:51:50.780 it, watch how
00:51:52.040 often you see more
00:51:52.920 of it.
00:51:54.280 The Democrats are literally an organized criminal enterprise.
00:51:59.620 It's organized at
00:52:00.860 the top, it's
00:52:02.100 implemented through
00:52:02.900 various agencies,
00:52:04.020 including the
00:52:04.720 news, which, as
00:52:05.500 we know, is
00:52:06.540 corrupt at this
00:52:07.260 point.
00:52:10.020 All right.
00:52:12.180 Apparently, Tucker
00:52:13.460 Carlson can interview
00:52:14.780 Putin now.
00:52:16.820 Looks like nothing
00:52:17.720 will stop him,
00:52:18.580 because he's an
00:52:19.940 independent person
00:52:20.820 now.
00:52:21.800 So, won't that be
00:52:23.040 interesting?
00:52:25.400 Yeah, won't that be
00:52:26.580 interesting?
00:52:27.480 Now, nothing that
00:52:28.340 Putin says can be
00:52:29.280 trusted, obviously.
00:52:30.400 But, I'd love to
00:52:32.700 hear what he has
00:52:33.160 to say.
00:52:34.880 All right.
00:52:36.540 I told you the other day about one of the things that Vivek does in his campaign, and that Trump also did well.
00:52:45.840 And let's see if you
00:52:46.560 can agree with me
00:52:47.320 on this.
00:52:48.040 One of the things
00:52:48.720 Trump did better
00:52:49.520 than anybody is he
00:52:51.020 could read the
00:52:51.660 room, because he
00:52:52.720 was always following
00:52:53.460 social media, and
00:52:54.940 he was seeing
00:52:55.380 immediately how
00:52:56.020 people reacted.
00:52:57.040 And then he was
00:52:57.540 riffing off of
00:52:58.480 that.
00:52:59.620 And so, one of
00:53:00.360 the things that
00:53:00.740 happened was that
00:53:02.360 if you had a good
00:53:03.240 idea in the Trump
00:53:04.600 administration, and
00:53:05.580 I've given you
00:53:06.060 examples of this
00:53:06.840 before, I won't do
00:53:07.580 that again.
00:53:08.300 If you had a good
00:53:09.060 idea, it would
00:53:10.980 immediately trickle
00:53:12.040 up to the
00:53:12.720 president of the
00:53:13.280 United States, and
00:53:14.340 he would say, oh,
00:53:15.320 that's a pretty good
00:53:15.880 idea.
00:53:16.940 And then he would
00:53:17.460 implement it.
00:53:18.660 It could be an
00:53:19.440 executive action, it
00:53:20.680 could be a campaign
00:53:22.000 change.
00:53:22.820 And you saw it all
00:53:23.420 the time.
00:53:24.360 You saw one
00:53:25.480 individual who
00:53:26.220 would have an
00:53:26.500 idea, and the
00:53:27.700 next thing you
00:53:28.200 know, you
00:53:29.940 know, Trump
00:53:30.300 would have
00:53:30.600 adopted it, only
00:53:31.760 because it was a
00:53:32.400 good idea.
00:53:33.420 You know, it
00:53:33.760 didn't matter where
00:53:34.360 it came from.
00:53:35.640 Just a good idea
00:53:36.580 is a good idea.
00:53:37.880 Well, Vivek is
00:53:39.080 the same.
00:53:40.360 I gave you this
00:53:41.140 example, but now
00:53:41.880 it's implemented.
00:53:43.160 So I was whining
00:53:44.580 and bitching on
00:53:45.660 X that I kept
00:53:48.640 having to explain
00:53:50.300 away his various,
00:53:52.240 the rumors about
00:53:53.280 him, and the
00:53:54.020 WEF, and Soros,
00:53:55.760 and Big Pharma,
00:53:56.980 you know, all
00:53:57.600 these rumors
00:53:58.160 swirling about
00:53:58.860 him.
00:53:59.220 And I was
00:53:59.720 getting exhausted,
00:54:00.520 and I said, you
00:54:02.000 guys just got to
00:54:02.700 put up a page that
00:54:04.540 says what the
00:54:05.100 rumor is, what's
00:54:06.800 your version of
00:54:08.160 the truth, and
00:54:10.300 then I could just
00:54:11.420 link to it.
00:54:12.560 Well, now it
00:54:13.100 exists.
00:54:14.360 So now there's a
00:54:15.380 page for all of
00:54:17.000 those rumors, and
00:54:17.740 I looked, I think he
00:54:18.600 got them all.
00:54:19.680 By the way, I was
00:54:20.340 worried that he
00:54:21.100 might skip a rumor,
00:54:22.600 and then you'd say,
00:54:23.320 oh, why did he skip
00:54:24.400 this one?
00:54:24.820 But I think he
00:54:25.360 got them all.
00:54:26.400 Actually, it looked
00:54:27.180 pretty extensive.
00:54:28.940 And when I read
00:54:29.740 through it, I said,
00:54:30.600 okay, those are
00:54:31.200 good answers.
00:54:32.700 Here's the one I
00:54:33.340 liked the best, that
00:54:35.180 he accepted a
00:54:36.160 Soros scholarship
00:54:37.280 that had no
00:54:38.080 strings attached.
00:54:40.760 What's the only
00:54:41.780 right answer to
00:54:42.660 that accusation?
00:54:45.000 And it wasn't
00:54:45.600 George Soros, by
00:54:46.520 the way, it was
00:54:47.020 his dead brother.
00:54:50.720 He said, here's
00:54:52.340 the only right
00:54:53.020 answer, and
00:54:53.640 Vivek had it.
00:54:56.140 If you're not
00:54:56.980 willing to take
00:54:57.740 free money when
00:54:58.580 it's offered, you
00:55:00.000 shouldn't be
00:55:00.540 anywhere near the
00:55:01.520 White House.
00:55:02.740 Boom.
00:55:04.600 Done.
00:55:06.440 Done.
00:55:08.700 Who is it who
00:55:09.920 once said that?
00:55:12.060 What does that
00:55:12.640 sound like?
00:55:16.960 Trump during the
00:55:17.980 debate.
00:55:19.260 Mr. Trump, you know, you've also donated money, I think it was donating money to politicians, and he said... no, no, it was: Mr. Trump, you've used these, you know, these tax loopholes or whatever to save on your taxes.
00:55:34.940 And Trump says,
00:55:36.060 absolutely.
00:55:37.420 Of course I use
00:55:38.200 them.
00:55:39.080 They're legal, they're
00:55:39.920 available, and every
00:55:40.900 one of Hillary's
00:55:42.700 donors uses them
00:55:43.540 too.
00:55:45.820 Can you beat that
00:55:46.780 answer?
00:55:47.400 Of course I use
00:55:48.420 them.
00:55:49.000 They're legal, they're
00:55:49.880 available, and every
00:55:50.780 one of your donors
00:55:51.460 uses them.
00:55:52.900 If you want me not
00:55:53.980 to do it, change the
00:55:54.980 law.
00:55:56.780 How do you beat
00:55:57.480 that?
00:55:58.500 How do you beat
00:55:59.420 that argument?
00:56:00.820 So, you know,
00:56:01.920 Vivek didn't only
00:56:02.880 find the high ground,
00:56:04.720 you know, he was
00:56:05.320 sort of like floating
00:56:06.240 in the clouds above
00:56:07.800 this argument,
00:56:08.860 because you're a
00:56:09.740 freaking idiot if you
00:56:10.740 don't take free money,
00:56:12.420 no strings attached,
00:56:15.000 and I think somebody
00:56:15.740 said he already had
00:56:17.220 money, he'd made
00:56:17.980 some money by that
00:56:18.680 time.
00:56:19.760 So?
00:56:21.880 Do you know how
00:56:22.680 he made money?
00:56:25.500 By not being a
00:56:26.460 dumbass.
00:56:27.560 How about by not
00:56:28.300 being a dumbass?
00:56:29.540 And after he made
00:56:30.340 his money, guess
00:56:31.320 what?
00:56:31.560 He didn't turn
00:56:32.040 into a dumbass.
00:56:33.500 When somebody
00:56:34.200 offered him some
00:56:34.880 more free money,
00:56:36.280 he took the free
00:56:38.420 money.
00:56:39.080 I don't want the
00:56:39.940 guy who can't take
00:56:40.660 the free money,
00:56:41.980 I don't want that
00:56:43.100 person anywhere near
00:56:44.080 the White House.
00:56:45.760 So that was a good
00:56:46.520 answer.
00:56:48.820 So I tweeted the
00:56:50.940 link, it would be
00:56:52.740 near the top of my
00:56:53.800 tweet feed today,
00:56:54.980 or the X feed,
00:56:56.340 and if you'd like to
00:56:57.560 have that link, I
00:56:58.580 recommend keeping it
00:56:59.460 handy if you're
00:57:01.240 running into this
00:57:02.060 conversation.
00:57:03.520 Just keep it in your
00:57:04.940 notes, cut and paste
00:57:06.860 it every time you
00:57:07.520 need it, we'll see
00:57:09.460 what happens.
00:57:10.360 But I do love a
00:57:12.260 candidate who responds
00:57:14.360 to a good idea.
00:57:16.060 That was a good
00:57:16.800 idea, right?
00:57:17.940 I'm not, am I blowing
00:57:20.260 too much smoke up my
00:57:21.280 own butt?
00:57:22.440 No, it was just an
00:57:23.000 obvious good idea, put
00:57:24.200 all the answers in one
00:57:24.980 place.
00:57:26.680 And I've always said
00:57:29.980 that Trump should do
00:57:31.760 the same.
00:57:33.260 You know, Trump had
00:57:33.960 so many hoaxes against
00:57:35.420 him, it would be nice
00:57:36.680 just to have a hoax
00:57:37.500 page, wouldn't it?
00:57:39.460 So that the, you
00:57:42.140 know, every time I get
00:57:42.920 in a conversation,
00:57:43.780 which is like three
00:57:44.460 times a week about the
00:57:45.480 fine people hoax, I
00:57:47.200 just give them the
00:57:47.920 link.
00:57:48.920 Here's the full
00:57:49.540 transcript, here's the
00:57:50.940 part they always leave
00:57:51.720 out, it's a Rupar,
00:57:53.720 blah, blah.
00:57:55.100 Save us all some time.
00:57:58.600 All right.
00:58:01.000 Ladies and gentlemen,
00:58:02.640 it looks like we have
00:58:03.480 almost, well, 600 people
00:58:06.600 watching live on X right
00:58:09.860 now.
00:58:11.480 And, hey, people on X,
00:58:13.160 how's the feed?
00:58:14.560 You got good audio, good
00:58:15.620 sound?
00:58:16.680 Are you happy with what
00:58:17.520 you're seeing in terms of
00:58:18.720 the quality?
00:58:21.840 It's a great feed.
00:58:23.240 Oh, we're getting good
00:58:24.580 technical performance
00:58:25.760 here.
00:58:26.460 Excellent.
00:58:27.900 Good to know.
00:58:28.720 There's no external microphone,
00:58:29.920 but the iPhone has a good
00:58:31.260 microphone built in.
00:58:33.320 All right.
00:58:34.220 Now, I don't know if that
00:58:35.100 might have, you know,
00:58:36.600 some impact on one of
00:58:37.660 the other platforms.
00:58:38.600 We'll see.
00:58:41.020 Is there a chat?
00:58:42.120 Yes, there's chat
00:58:43.420 messages going by, but
00:58:44.540 they're sideways, so
00:58:45.520 they're a little harder
00:58:46.060 for me to read.
00:58:48.560 Loud and clear.
00:58:49.720 Lighting's good, too.
00:58:50.840 I think we're nailing it.
00:58:53.000 Look at this.
00:58:54.800 Just nailing it.
00:58:56.740 All right, ladies and
00:58:57.440 gentlemen, I've got some
00:58:58.300 more things to do to go
00:58:59.480 talk to people, promote
00:59:01.900 my book, make some
00:59:03.600 comics, and generally
00:59:05.980 enjoy my day.
00:59:07.180 The rest of you, I hope
00:59:08.020 you have a great day,
00:59:09.120 too.
00:59:09.700 I'm going to say goodbye
00:59:10.360 to YouTube and to X,
00:59:14.060 and I'm going to spend a
00:59:15.000 little more time talking
00:59:15.720 to the Locals people, the subscribers.
00:59:18.420 Thanks for joining.
00:59:19.200 Thank you.