Real Coffee with Scott Adams - April 14, 2024


Episode 2444 CWSA 04/14/24


Episode Stats

Length: 1 hour and 9 minutes
Words per Minute: 147.2
Word Count: 10,241
Sentence Count: 859
Misogynist Sentences: 21
Hate Speech Sentences: 61


Summary

A Chinese study says that 13,000 islands around the world have increased their landmass over two decades, supposedly disproving climate change alarms. Is climate change a binary problem, or is it something else entirely? And which is more likely to be true?


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization that's called
00:00:06.660 Coffee with Scott Adams, the best time you've ever had. And today we've got a special show for you.
00:00:12.380 What makes it special? Nothing. It's just awesome, like always. And if you'd like to
00:00:18.440 take your experience up to a level that a human brain can barely comprehend, all you need for
00:00:23.920 that is a cup or a mug or a glass, a tankard or a chalice or a stein, a canteen jug or a flask
00:00:28.440 or a vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me
00:00:33.560 now for the unparalleled pleasure of the dopamine of the day, the thing that makes everything
00:00:37.300 better. It's called the simultaneous sip, and it happens now. Go. So, so good. Well, let's
00:00:54.260 jump right into the news. By the way, if you're not watching Dilbert Reborn, only available
00:01:01.320 by subscription on the X platform or scottadams.locals.com where you get the comic plus a lot more, you
00:01:08.820 would not know that Dave the Engineer is complaining about racism in the office. I'm not going to
00:01:16.220 tell you any more, but it doesn't go the way you expected. I am so happy having my creative
00:01:22.600 freedom to actually do actual real topics that real people are experiencing in the real
00:01:29.040 workplace. Oh my goodness. It's just delightful. Anyway, that's one of the funniest things I've
00:01:35.780 ever written in my life. And if you subscribe, you can see it. And by the way, for the locals
00:01:43.900 people, I took the subscription wall down. So if anybody in locals wants to share that one,
00:01:50.180 that one's available for sharing. Well, here's some news. A Chinese study says that
00:01:56.700 approximately 13,000 islands around the world have increased their landmass over two decades,
00:02:04.040 disproving climate change alarms. Oh, well, that's kind of awkward, isn't it? 13,000 islands
00:02:12.520 got bigger. That's a lot of islands. Now, let me tell you the little bit that I know about the sea
00:02:21.420 level. The little bit I know. The little bit I know is that sea level doesn't change everywhere the
00:02:28.620 same way. Because the landmass itself sometimes goes up and sometimes goes down. And heat, as I
00:02:37.640 understand it, will increase your volume of water. So the warmer places might look a little higher
00:02:45.880 unless they cool off, et cetera. So you've got a lot of moving parts. But apparently climate change
00:02:51.640 has been debunked as one of the causes of rising sea level because CO2 is up, but the sea level is not.
00:03:00.460 Now, you do know that there are two completely different movies on climate change, right?
00:03:08.160 I can sit here all day long and tell you about studies that prove it doesn't exist.
00:03:13.660 But you could change the channel to the other channel where all the studies prove it exists.
00:03:19.960 You know that exists, right? It doesn't matter how many times I tell you there's a study
00:03:25.440 that says it totally doesn't exist. There will be one that says it does coming out at the same time.
00:03:32.740 Which one is true? Well, let's get back to the basics. What percentage of studies in general are true?
00:03:43.080 Just about anything. Any scientific study, what are the odds it's true? 50%. Because they've studied
00:03:51.500 papers, and they know that it's about 50%. Half the time they're wrong.
00:03:59.240 Now, is climate change a binary? Binary meaning it's either happening or it's not happening, right?
00:04:08.100 Because if it's happening, you know, we're not talking about it's too slow or anything.
00:04:11.960 It's either happening a lot and it's really dangerous, or maybe it's not happening.
00:04:17.420 Now, if you've got a binary where it could be happening or not happening, yes or no,
00:04:23.660 and you've got a new paper that has a 50% chance of being right and a 50% chance of being wrong,
00:04:29.140 what have you learned by the new paper? Nothing. Nothing. It's a coin flip.
00:04:37.860 I could have come to you and said, hey, let's flip a coin to see if climate change is real.
00:04:41.960 It would be exactly as useful as this new study about these islands, exactly as useful, meaning not.
00:04:51.600 It has no information value whatsoever. It's a 50-50. It's a coin flip.
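
To put numbers on that: here's a minimal Bayes'-rule sketch in Python, using the 50/50 prior and the 50% study-accuracy figure from above as assumptions. A study that's right half the time leaves a yes/no question exactly where it started; only a more reliable study actually moves the posterior.

```python
# A minimal sketch of the coin-flip argument, assuming a yes/no claim
# with a 50/50 prior and a study that is right 50% of the time.

def posterior_given_study(prior: float, study_accuracy: float) -> float:
    """P(claim is true | study says it's true), by Bayes' rule."""
    # Probability the study says "true" at all: right about a true claim,
    # or wrong about a false one.
    p_says_true = study_accuracy * prior + (1 - study_accuracy) * (1 - prior)
    return (study_accuracy * prior) / p_says_true

print(posterior_given_study(0.5, 0.50))  # 0.5  -> the study told you nothing
print(posterior_given_study(0.5, 0.95))  # 0.95 -> a reliable study moves you
```
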
00:04:58.520 Now, some of you should be quick to jump on and say, but Scott, let me science-splain you.
00:05:09.160 Is there anybody here who wants to science-splain me? You know, telling me the things that literally
00:05:13.940 everybody already knows as if I'm the only person who doesn't know it? Okay. I'm going to science-splain
00:05:19.520 myself as if I'm my critic. Scott, don't you understand how science works? It's not about one study.
00:05:27.960 It's about, you know, reproducing studies and moving toward the truth slowly over time. Sometimes
00:05:33.620 we make mistakes, but we're moving toward the truth in a directional kind of way, Scott. Don't you
00:05:38.640 understand that these studies don't mean anything individually? You have to look at them collectively
00:05:42.660 with all of your knowledge and things. Science-splainin'. There you go. So, to me, it looks like
00:05:51.600 all the forces of nature are moving in the direction of proving that Trump was right and that climate
00:05:59.420 change was always bullshit. Just in time for the election. Is it my imagination or have we been hearing
00:06:10.040 a lot less about climate change than one would expect going into an election? Huh. Why would we hear
00:06:18.540 less about climate change going into an election where the choice of presidents could make the
00:06:25.800 difference between surviving as a species and not surviving? How can we ignore such a gigantic
00:06:33.720 existential risk? Unless the people pushing it are no longer as confident? Because as I told you
00:06:41.380 yesterday, the entire public reason for climate change is dissolved. The public argument is that
00:06:52.320 all the scientists are on the same side. Like, we're not scientists, so how would we know?
00:06:57.780 But if all the scientists, let's say 97%, if they're all on one side, they tell us, well, that must be
00:07:04.580 pretty scientifically accurate with all those people. Of course, the pandemic and, you know, the 51 people
00:07:13.440 on the laptop and all that, and all of our experience lately of our captured experts, et cetera,
00:07:20.680 have largely proved that the number of people on one side means absolutely nothing. Because you can get
00:07:26.800 a whole bunch of people to agree to just about anything, as long as their paycheck depends on it.
00:07:32.620 If you make their paycheck depend on it, they'll tell you anything you want.
00:07:36.160 What do you want to hear? I'll tell you. Just pay me.
00:07:41.160 So I think the climate change argument is just falling apart like crazy.
00:07:46.080 Over in Japan, let's talk about some more fake news.
00:07:49.480 There's a Japanese study that says that cancer spikes after your third dose of COVID vaccine.
00:07:56.800 mRNA vaccines, specifically.
00:08:00.100 How many of you believe that is true?
00:08:03.060 Because it has been determined by a Japanese study.
00:08:06.520 Big, big difference in cancer after your third dose.
00:08:10.960 Sound real to you?
00:08:14.440 Well, science. Come on. It must be real.
00:08:18.500 Well, let's check the alternate story.
00:08:21.040 Over on USA Today, they had a fact check from just one month ago.
00:08:26.260 Now, this is before the Japanese study.
00:08:30.700 But one month ago, USA Today said there is no evidence that COVID-19 vaccines cause cancer
00:08:38.000 or are associated with a greater risk of developing cancer.
00:08:41.900 Experts say there's no evidence of it.
00:08:45.200 So that means it's true, right?
00:08:49.580 It's true that there's no correlation because there's no evidence of it.
00:08:54.420 There's also no evidence that the 2020 election was rigged
00:08:59.660 because no courts have found it to be true.
00:09:03.360 Yeah. No court found any rigging in 2020.
00:09:08.180 And also, there's no evidence that the shots cause cancer.
00:09:16.780 Do you see the trick?
00:09:19.020 Everybody see the trick?
00:09:20.020 No evidence is really different from saying it doesn't happen.
00:09:26.000 Do you see that?
00:09:27.800 No evidence means we don't know.
00:09:31.120 No evidence means we don't know.
00:09:34.560 Here's what it didn't say.
00:09:35.820 Many controlled, high-quality studies have been performed
00:09:40.980 and have determined there's no signal for extra cancer.
00:09:45.180 Now, that would be a good answer, wouldn't it?
00:09:47.880 Multiple, repeated, gold-standard, randomized, controlled tests.
00:09:55.700 And every time we do the test, we just keep repeating this test.
00:09:58.840 And the last five times, we didn't find a signal at all.
00:10:02.620 And by the way, the tests were all funded by independent people,
00:10:07.540 not pharma companies.
00:10:10.380 Now, that would mean something, wouldn't it?
00:10:12.980 But if you tell me that there's no evidence,
00:10:15.340 do you know what my first inclination is?
00:10:18.280 Is there no evidence because nobody funded an expensive trial to look for it?
00:10:23.600 Whose job would it be to fund the trial to look for the cancer?
00:10:28.840 I don't know.
00:10:30.040 And if somebody did fund a trial and it didn't find any cancer,
00:10:34.620 wouldn't you ask some questions about who funded the trial?
00:10:37.940 Because it sounds like something big pharma might do.
00:10:41.400 Except I think big pharma just doesn't bother doing the long-term studies
00:10:45.080 because they don't have to.
00:10:47.640 There was a time when I believed that the people who made the vaccinations
00:10:53.840 would track people's health over time
00:10:56.260 to make sure that not only was it safe the first year they got it,
00:11:00.400 but that it would remain safe for 10, 20 years.
00:11:04.480 That's what I thought.
00:11:06.160 You know that doesn't happen, right?
00:11:09.100 I'm pretty sure nobody's checking after a certain amount of time.
00:11:12.960 Like if you didn't die in the first year
00:11:14.840 or they didn't see any extra deaths in whatever study they did,
00:11:19.080 they don't really follow up, do they?
00:11:20.860 Does anybody know the answer to that?
00:11:23.460 But I'm pretty sure there's no follow-up.
00:11:25.460 You just assume that there is.
00:11:27.740 Like me, you assume that, well, that's the most natural thing.
00:11:31.460 You would just track a bunch of people who took it,
00:11:34.980 just track them every year,
00:11:36.220 ask them what their situation is compared to the norm,
00:11:38.880 and no, no, I don't believe that happens.
00:11:41.680 So, do we believe the Japanese study found a strong correlation?
00:11:49.260 I don't.
00:11:50.260 I don't believe it at all.
00:11:52.560 Now, I'm not saying it's wrong,
00:11:54.300 and I'm not saying that the shots are completely safe.
00:11:57.580 I wouldn't know.
00:11:58.820 I'm just saying that this is like having no information at all.
00:12:02.940 So the fact check is completely worthless.
00:12:05.560 This was a month ago.
00:12:07.080 Completely worthless because it said no evidence.
00:12:09.220 No evidence is completely different from saying,
00:12:12.560 yeah, we checked it, and it's not there.
00:12:14.980 No evidence means we didn't check.
00:12:17.800 Right?
00:12:19.860 I'm not wrong about that, no way.
00:12:21.480 No evidence means we didn't check.
00:12:24.400 If they had checked,
00:12:25.700 they would have said all the studies say it's not there.
00:12:28.920 But no evidence says we didn't check.
00:12:31.580 I'm pretty sure that's what that means.
00:12:34.960 But, again, you've got a study,
00:12:37.160 and a 50-50 chance it's wrong
00:12:40.240 because it's a study.
00:12:43.980 So, at best, it's a coin flip.
00:12:46.620 Again, the vaccinations were either going to be good for you or not.
00:12:50.800 It's binary.
00:12:52.460 Studies are either right or wrong,
00:12:54.500 and half the time they're wrong.
00:12:56.500 It's just another coin flip.
00:12:58.160 It didn't tell you anything.
00:12:59.320 But how many of you saw that researcher guy, John,
00:13:04.380 somebody who's always the one telling you
00:13:07.220 all the studies are going to, you know,
00:13:08.500 the COVID's going to kill you?
00:13:10.100 It's always that same guy.
00:13:12.400 Like, he doesn't really get out of his domain too much.
00:13:16.320 So, I don't know.
00:13:18.200 I'm worried about the health risks,
00:13:20.840 but I would say it's short of being proven.
00:13:25.220 It might be.
00:13:25.940 It could be proven someday.
00:13:27.100 Okay.
00:13:29.320 Neuralink brain chip could give users orgasms on demand.
00:13:35.440 So, the Star is reporting that the Neuralink
00:13:38.000 could give orgasms on command.
00:13:42.920 You know, hypnotists can do that too.
00:13:48.380 That's right.
00:13:49.560 Hypnotists can do that.
00:13:50.660 You don't need Neuralink for that.
00:13:53.040 All right.
00:13:53.520 But a lot of people are going to buy that Neuralink
00:13:56.680 if you can just sit around giving yourself orgasms.
00:13:59.320 Anyway, so Sharyl Attkisson has a new video out.
00:14:06.100 She's interviewing some doctors or doctor.
00:14:08.820 And we're learning that medical schools are just bullshit.
00:14:13.040 The medical schools are just terrible.
00:14:16.140 Apparently, the medical schools teach you lessons
00:14:19.420 that are compatible with what Big Pharma
00:14:21.460 and the junk food industry want you to hear,
00:14:23.920 want them to do.
00:14:27.920 Just think about the fact
00:14:29.440 that your doctor is being trained by Big Pharma and Big Food.
00:14:35.960 And Big Pharma and Big Food are the things you're the most worried about.
00:14:39.740 Big food especially.
00:14:40.880 Now, I don't know the connection between junk food and medical school,
00:14:46.160 except maybe there's some kind of funding thing going on.
00:14:50.300 I don't know.
00:14:51.840 But no, your doctor, and I guess doctors confirm this.
00:14:55.940 Doctors who have gone through the system
00:14:57.880 will tell you that they're not getting the right education.
00:15:01.020 There's some kind of money bias built into their education.
00:15:05.340 Great, great.
00:15:09.100 Well, there's new software
00:15:13.200 that's similar to the Star Trek holodeck
00:15:16.500 that can create all these virtual worlds
00:15:19.360 that didn't exist before.
00:15:21.340 So you can say, give me a bar scene,
00:15:23.860 or we're outdoors, and it just creates that world.
00:15:27.240 Now, I guess they're using it to train AI.
00:15:29.620 So you use one AI to create worlds,
00:15:31.840 and then a second AI to look at that world
00:15:35.460 and learn from it as if it had been a real world.
00:15:39.760 But here's what I say.
00:15:44.360 We're getting closer and closer
00:15:46.040 to removing all doubt that we are a simulation.
00:15:50.340 You all see that, right?
00:15:52.580 The path toward realizing we're a simulation
00:15:56.020 is so clear at this point.
00:15:58.640 Every time a new thing comes out,
00:16:00.440 it's all in the same direction.
00:16:03.280 Now, here's what this is going to do.
00:16:05.520 Once you realize that AI
00:16:07.440 can create an environment on demand,
00:16:11.480 the next thing you're going to see
00:16:13.440 is that you can take a walk through it.
00:16:16.980 So, for example,
00:16:18.480 you can say, I'm in a house,
00:16:20.820 and then the house will be there perfectly.
00:16:22.840 And then every time you need to go into another room,
00:16:25.540 not until you need it,
00:16:29.220 the room will form before you see it.
00:16:31.500 So when you reach for the doorknob,
00:16:33.540 for the first time,
00:16:35.080 the software will build the room behind the door
00:16:37.200 because it didn't need to do it until then.
00:16:40.080 So it'll do it on demand.
00:16:42.000 Because it's not going to create the universe,
00:16:44.360 in case you go everywhere in the universe.
00:16:46.060 It's going to give you what you need as you need it.
00:16:48.540 And then when you walk outside,
00:16:51.420 maybe for the first time,
00:16:53.580 the outdoors will be created where you can see it.
00:16:56.600 And as you walk through the forest,
00:16:58.340 the forest will be created ahead of you
00:17:00.580 so you keep seeing things.
00:17:03.560 Once you've experienced that,
00:17:06.100 all doubt will be removed
00:17:07.540 about whether you live in a simulation
00:17:09.240 because that's your actual experience.
00:17:12.940 And specifically, it's going to teach you
00:17:15.000 that history is created by the present.
00:17:18.360 It doesn't work the other way.
00:17:20.340 Let me give you that example.
00:17:22.040 You're in the video game.
00:17:23.860 And until you open the door,
00:17:26.100 there's nothing behind it.
00:17:28.120 But as you turn the knob
00:17:29.460 in your virtual reality and open it,
00:17:31.800 the computer gives you the room.
00:17:35.240 Once you open that room,
00:17:37.640 it's not just that it's there at the moment,
00:17:40.080 but its history was created too.
00:17:43.020 In other words,
00:17:44.460 if there's some furniture in the room,
00:17:46.420 there's an implied history.
00:17:48.800 Something delivered the furniture to the room.
00:17:51.760 So you're going to experience in the virtual realm
00:17:54.800 that the impression of history
00:17:57.440 is being created by things you're seeing in the moment.
00:18:01.140 And then you're going to look at the double-slit experiment
00:18:03.160 and Schrödinger's cat,
00:18:05.820 and I get that they work at the quantum level
00:18:08.120 and not at the big person level.
00:18:10.060 But still, it's going to be proof to you
00:18:13.620 that the past is generated by the present.
00:18:18.720 Once you realize that,
00:18:20.480 you know we're in a simulation
00:18:21.600 and everything's heading in that direction.
00:18:23.620 It's going to be obvious.
00:18:24.600 And probably in less than a year,
00:18:26.460 it will be generally assumed that we're a simulation.
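
As an aside, the "room appears when you reach for the doorknob" behavior he's describing is just lazy, on-demand generation, and it's easy to sketch. The toy below is purely illustrative (the class and names are hypothetical, not from any real engine): a room, along with its implied history of furniture, is created the first time you enter it and never before.

```python
# A toy sketch of on-demand world generation: nothing behind a door
# exists until the first time you open it. All names are illustrative.
import random

class LazyWorld:
    def __init__(self, seed: int = 42):
        self.rooms = {}                 # only the rooms visited so far
        self.rng = random.Random(seed)  # seeded, so the world stays consistent

    def enter(self, room_id: str) -> dict:
        """Return the room, generating it (and its implied history) on first visit."""
        if room_id not in self.rooms:
            self.rooms[room_id] = {
                "furniture": self.rng.sample(
                    ["sofa", "lamp", "desk", "rug", "bookshelf"], k=2),
                "doors": [f"{room_id}/{i}" for i in range(self.rng.randint(1, 3))],
            }
        return self.rooms[room_id]

world = LazyWorld()
hall = world.enter("entry-hall")  # the hall exists only as of this call
print(hall["furniture"], hall["doors"])
```

The seeded generator is the point of the design: the world is deterministic without ever being stored in advance, which is the "history created by the present" effect described above.
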
00:18:30.020 All right.
00:18:33.660 There's a story about a guy
00:18:35.000 who's spending $10,000 per month on AI girlfriends.
00:18:38.620 And just like dating apps,
00:18:40.860 he has more than one.
00:18:42.300 So he's got one app for this girlfriend,
00:18:44.300 one app for the other one.
00:18:45.600 And apparently,
00:18:47.160 the man who can afford it,
00:18:49.360 I guess it's a man who's got some money
00:18:51.120 and he's single.
00:18:52.900 So he's got some extra money.
00:18:54.580 And he says it's great.
00:18:56.420 It's completely working and satisfying.
00:19:00.680 Now, add that to the holodeck.
00:19:05.380 And I'm pretty sure human reproduction
00:19:09.260 is going to come to a screaming end really quickly.
00:19:13.740 Really quickly.
00:19:15.260 But here's the thing I wonder.
00:19:18.200 In the old, old days,
00:19:20.500 when humans were more like animals,
00:19:22.660 was it true that everybody got to reproduce?
00:19:27.180 Or is that just a modern thing?
00:19:30.140 Where, you know,
00:19:30.960 it doesn't matter how you rank
00:19:32.560 in the hierarchy of your tribe,
00:19:35.660 you all get to reproduce.
00:19:38.500 Now, what happens if in the virtual world,
00:19:42.200 the people who were maybe not so competitive
00:19:44.800 for reproducing,
00:19:46.080 they just take a pass.
00:19:47.700 They say, all right, we won't reproduce.
00:19:49.000 We'll just watch video games
00:19:51.500 and, you know,
00:19:52.860 play out the rest of our simulation.
00:19:56.320 Will that cause everybody to not reproduce?
00:19:59.300 Or will it make the billionaires
00:20:01.680 reproduce like crazy?
00:20:03.640 You know, the Elon Musk model
00:20:05.040 where he can just have as many babies as he wants.
00:20:08.500 He can just keep doing it
00:20:09.740 because there's no real limit on him.
00:20:12.280 So are we going to go back to a situation
00:20:14.560 where the most capable
00:20:16.240 are having all the babies?
00:20:18.360 And is that bad for us
00:20:21.100 or good for us?
00:20:23.700 I don't know.
00:20:25.060 I mean, if the people
00:20:26.720 who are taken out of the reproduction cycle
00:20:29.520 are completely happy about it,
00:20:32.000 let's say the men,
00:20:32.800 I don't know about the women,
00:20:34.080 but if the men are happy about it,
00:20:35.580 are they worse off?
00:20:37.560 Because they didn't really have
00:20:38.640 this great marriage option anyway.
00:20:41.760 So they have this great virtual reality option.
00:20:44.480 Maybe it's better.
00:20:45.540 But then the people who can afford it
00:20:47.140 and are really bringing some powerful DNA to the mix
00:20:51.440 because they're smart or strong
00:20:52.960 or handsome or tall
00:20:53.960 or whatever it is,
00:20:55.340 they still will probably be mating like crazy.
00:20:59.340 So we might end up
00:21:01.200 changing the gene pool of humanity
00:21:04.060 by simply making it impractical
00:21:06.780 for the people
00:21:07.900 on the struggling end of things
00:21:10.140 to reproduce.
00:21:11.340 And they might be just as happy
00:21:12.480 not reproducing.
00:21:14.000 I don't know.
00:21:14.300 Nothing is predictable.
00:21:17.080 But get your AI girlfriend.
00:21:18.600 Ontario, the wait is over.
00:21:21.520 The gold standard of online casinos has arrived.
00:21:24.660 Golden Nugget Online Casino is live,
00:21:26.920 bringing Vegas-style excitement
00:21:28.360 and a world-class gaming experience
00:21:30.380 right to your fingertips.
00:21:32.260 Whether you're a seasoned player
00:21:33.460 or just starting,
00:21:34.600 signing up is fast and simple.
00:21:36.720 And in just a few clicks,
00:21:38.000 you can have access to our exclusive library
00:21:40.000 of the best slots
00:21:41.080 and top-tier table games.
00:21:42.940 Make the most of your downtime
00:21:44.220 with unbeatable promotions
00:21:45.780 and jackpots
00:21:46.580 that can turn any mundane moment
00:21:48.360 into a golden opportunity
00:21:49.880 at Golden Nugget Online Casino.
00:21:52.440 Take a spin on the slots,
00:21:53.780 challenge yourself at the tables,
00:21:55.100 or join a live dealer game
00:21:56.600 to feel the thrill
00:21:57.600 of real-time action,
00:21:58.920 all from the comfort
00:21:59.860 of your own devices.
00:22:01.120 Why settle for less
00:22:02.100 when you can go for the gold
00:22:03.600 at Golden Nugget Online Casino?
00:22:06.420 Gambling problem?
00:22:07.320 Call Connex Ontario
00:22:08.420 1-866-531-2600.
00:22:11.640 19 and over.
00:22:12.540 Physically present in Ontario.
00:22:13.660 Eligibility restrictions apply.
00:22:15.540 See GoldenNuggetCasino.com
00:22:17.240 for details.
00:22:18.020 Please play responsibly.
00:22:20.040 What are the happiest cities?
00:22:21.720 There's another study
00:22:22.480 that ranked
00:22:23.920 the happiest cities.
00:22:25.820 And cities from 33 states
00:22:29.380 made the list.
00:22:30.780 That means that the rest
00:22:31.880 of the states had
00:22:32.780 not a single city
00:22:34.920 that was one of the happiest ones.
00:22:37.620 California had the most.
00:22:39.860 California had 16
00:22:40.980 of the happiest cities.
00:22:44.200 Then it's Florida at 12.
00:22:45.500 What's that tell you?
00:22:48.420 It's the weather.
00:22:50.600 The first decision I made
00:22:52.500 upon graduating college
00:22:54.800 in upstate New York
00:22:56.760 was why would I live
00:22:59.140 in this weather
00:23:00.480 when I could live
00:23:01.620 in good weather?
00:23:02.880 Because at the time,
00:23:04.140 you know,
00:23:04.360 I was a free agent.
00:23:05.380 It was after college.
00:23:06.380 I could go where I wanted.
00:23:07.220 So I went to California
00:23:09.160 because Florida
00:23:11.260 is a little bit muggy.
00:23:13.400 But I guess
00:23:13.840 they're still happy there.
00:23:15.880 And sure enough,
00:23:17.920 Californians got
00:23:19.300 their legal weed
00:23:20.200 and they got
00:23:20.700 their good weather.
00:23:22.180 And you can screw up
00:23:23.800 just about everything else
00:23:25.480 and that still works.
00:23:28.720 Now, do you think
00:23:29.820 that sun
00:23:31.740 makes a difference?
00:23:33.340 Who gets the most sun?
00:23:34.680 Hawaii, California, Florida.
00:23:39.460 What are the happiest states?
00:23:42.060 Hawaii, California, Florida.
00:23:46.180 How many data points
00:23:48.980 do you need to see
00:23:49.900 to get in the freaking sun?
00:23:53.860 I mean, it couldn't be
00:23:55.520 more obvious.
00:23:56.780 We went through the pandemic
00:23:57.920 and it's like,
00:23:58.920 get in the sun,
00:23:59.600 vitamin D, vitamin D,
00:24:00.780 vitamin D,
00:24:01.420 vitamin D,
00:24:02.020 vitamin D,
00:24:02.640 get in the sun.
00:24:03.160 And then you do
00:24:04.540 the happiest cities.
00:24:06.320 Sunny, sunny, sunny.
00:24:08.280 Are you getting
00:24:09.100 the message yet?
00:24:11.020 It really is
00:24:12.420 pretty simple.
00:24:15.260 Get some sun.
00:24:16.320 Get outdoors.
00:24:17.500 There's no way
00:24:18.440 that these are coincidences.
00:24:21.060 Get some sun.
00:24:23.020 All right.
00:24:26.940 NPR,
00:24:28.020 oh, Dr. Phil
00:24:28.800 crushed a DEI advocate.
00:24:30.840 So, Dr. Phil
00:24:33.520 had somebody
00:24:34.020 who was arguing
00:24:34.880 some batshit crazy woman
00:24:37.220 who was,
00:24:38.760 and she looked
00:24:39.280 batshit crazy.
00:24:40.660 I'm just judging from,
00:24:42.220 she just looked it.
00:24:43.300 I mean,
00:24:43.500 she had
00:24:43.760 those batshit crazy eyes
00:24:46.440 and stuff.
00:24:47.600 And
00:24:48.040 he got her
00:24:49.640 to say that
00:24:50.420 DEI was about
00:24:51.620 equal outcomes
00:24:52.560 and then he
00:24:53.380 mocked that out
00:24:54.220 of existence
00:24:54.780 because equal outcomes
00:24:55.840 is what stupid people want.
00:24:57.700 It's not a real thing.
00:24:58.820 You can't really have
00:24:59.520 equal outcomes.
00:25:00.940 And as Dr. Phil
00:25:01.960 explained it,
00:25:03.220 well,
00:25:03.500 you got a wall
00:25:04.200 and you got short people
00:25:05.260 who can't see over it
00:25:06.180 and tall people who can.
00:25:07.300 Are you going to fix that?
00:25:08.840 You know,
00:25:09.120 sort of his metaphor
00:25:10.480 for everything else.
00:25:12.420 And
00:25:12.820 here's my take.
00:25:16.060 DEI
00:25:16.460 is not a philosophical
00:25:17.520 difference.
00:25:20.600 I'm going to say this
00:25:21.500 over and over
00:25:21.980 until you all agree.
00:25:23.720 The only philosophical
00:25:24.980 difference between
00:25:25.820 the left and the right
00:25:26.820 is abortion.
00:25:28.840 Everything else
00:25:29.600 is smart versus stupid
00:25:30.860 or, you know,
00:25:32.680 somebody selfishly
00:25:33.760 wants more money.
00:25:35.160 That's it.
00:25:36.820 And DEI
00:25:37.680 is just stupid.
00:25:39.460 There's nobody
00:25:40.200 smart in favor
00:25:41.200 of DEI.
00:25:42.780 Can I say that?
00:25:43.640 There's nobody
00:25:43.980 smart in favor of it.
00:25:45.640 At least,
00:25:46.580 and when I say
00:25:47.280 there's nobody
00:25:47.700 smart in favor of it,
00:25:49.600 I mean that
00:25:50.240 the smarter people
00:25:51.120 who pretend
00:25:52.020 to be in favor of it,
00:25:53.860 let's take your
00:25:54.560 Mark Cubans,
00:25:55.220 for example,
00:25:55.980 smart,
00:25:56.560 unambiguously
00:25:57.200 smart person
00:25:58.000 says in public
00:26:00.080 he's in favor of it.
00:26:01.320 This stuff is not real.
00:26:03.720 All right.
00:26:04.000 He's in favor of it
00:26:05.120 if you define it differently
00:26:06.540 as in not
00:26:07.700 equality of outcomes.
00:26:09.280 So the people
00:26:10.020 who say they're in favor
00:26:10.880 are playing the game
00:26:11.840 where they pretend
00:26:12.540 it's something it isn't
00:26:13.580 or they have to say it
00:26:15.240 because of their company
00:26:16.000 or they're,
00:26:17.300 you know,
00:26:17.460 there's just some pressure
00:26:18.340 they have to say it.
00:26:19.140 There's no smart person
00:26:20.540 who in a room alone
00:26:22.580 says DEI
00:26:23.640 is a good idea.
00:26:24.460 Not a black person,
00:26:25.400 not a white person,
00:26:26.240 not a woman,
00:26:26.780 not a man.
00:26:27.780 There's no adult.
00:26:29.400 None.
00:26:30.760 None.
00:26:31.320 There are zero smart people
00:26:32.480 who think DEI
00:26:33.240 is a good idea
00:26:33.960 if you define it properly
00:26:36.020 as outcomes being equal.
00:26:38.600 None.
00:26:39.000 There's not one.
00:26:39.980 There's not a single
00:26:40.960 smart person
00:26:41.700 in the whole fucking world
00:26:42.920 who in a private conversation
00:26:44.980 with you,
00:26:45.540 if they're not lying
00:26:46.480 and they know
00:26:47.720 that you're not going
00:26:48.240 to out them,
00:26:49.220 there's not a single person,
00:26:50.580 not one,
00:26:51.740 who's smart
00:26:52.340 and in favor of DEI.
00:26:53.920 Can we just say that out loud?
00:26:56.280 This has nothing to do
00:26:57.640 with philosophy.
00:26:59.500 It has nothing to do
00:27:00.700 with what's good for people.
00:27:02.800 It's stupid versus smart
00:27:04.520 and, you know,
00:27:06.060 there's a layer of grifting
00:27:07.320 on top of it
00:27:08.160 that keeps it all going.
00:27:10.740 You know,
00:27:11.200 and people make money from it
00:27:12.360 so it's sort of
00:27:12.940 self-perpetuating.
00:27:14.240 But to imagine
00:27:15.080 this is a philosophical difference,
00:27:16.980 I'm not going to play
00:27:18.040 that anymore.
00:27:18.580 I can't pretend
00:27:20.300 that's a philosophical difference.
00:27:22.720 It's fucking stupidity.
00:27:26.980 Speaking of stupidity,
00:27:28.760 NPR continues
00:27:29.680 to humiliate itself
00:27:30.860 and Jonathan Turley
00:27:32.160 is happy to
00:27:33.020 mock them for it
00:27:34.720 in his latest article.
00:27:37.940 Anyway,
00:27:39.120 the new,
00:27:39.720 the NPR,
00:27:40.560 not new,
00:27:41.040 but the NPR CEO,
00:27:42.600 Katherine Maher.
00:27:44.460 Oh,
00:27:45.100 okay.
00:27:45.420 I'd love to see
00:27:47.420 a picture of her.
00:27:49.400 I just have a theory
00:27:50.560 about what her eyes
00:27:51.720 probably look like,
00:27:52.760 but I'm just guessing here.
00:27:54.200 I haven't seen
00:27:54.600 a picture of her.
00:27:56.360 Maybe saucer-sized?
00:27:57.780 Maybe.
00:27:58.280 I don't know.
00:27:59.820 But
00:28:00.040 she,
00:28:02.560 of course,
00:28:03.220 she's responding
00:28:03.840 to the big
00:28:05.180 dust-up
00:28:06.460 because one of her
00:28:07.340 editors said
00:28:08.400 that the place
00:28:09.000 was just a
00:28:09.820 liberal hellhole.
00:28:11.720 He didn't say hellhole,
00:28:13.040 but he said
00:28:13.380 there's basically,
00:28:14.560 not basically,
00:28:15.140 there are zero
00:28:15.960 Republicans working
00:28:17.000 as editors there
00:28:17.900 out of,
00:28:18.820 I don't know,
00:28:19.140 80-some.
00:28:20.540 And
00:28:20.860 NPR
00:28:23.740 is not even
00:28:24.260 really pretending
00:28:25.100 to be any kind
00:28:26.900 of an independent
00:28:27.600 news outlet.
00:28:28.320 It's just a
00:28:29.060 lefty organ.
00:28:30.420 So once,
00:28:31.480 once somebody
00:28:32.320 who really did
00:28:33.140 know what's happening,
00:28:34.440 you know,
00:28:34.600 not somebody guessing,
00:28:35.760 somebody who was
00:28:36.280 in the middle
00:28:36.740 of the belly
00:28:37.360 of the beast,
00:28:38.280 called them out
00:28:39.080 for what they really are,
00:28:40.720 what did the CEO do?
00:28:42.380 Did the CEO say,
00:28:43.580 those are some
00:28:44.120 good criticisms?
00:28:45.900 We'd better
00:28:46.580 re-examine
00:28:47.320 how we do business
00:28:48.920 and you're right.
00:28:50.240 We need some
00:28:51.220 diversity of thought
00:28:52.560 because,
00:28:54.160 oh,
00:28:54.440 I guess
00:28:54.700 I got it right.
00:28:56.020 Somebody just
00:28:56.620 put her picture there.
00:28:57.740 Oh,
00:28:58.240 God.
00:29:04.980 Oh,
00:29:05.540 God.
00:29:06.740 Yeah.
00:29:07.980 Okay.
00:29:09.460 Anyway.
00:29:09.900 And she said that
00:29:14.780 so what she did
00:29:17.220 instead of saying
00:29:18.020 that NPR
00:29:18.560 is broken
00:29:19.360 and biased
00:29:20.280 and they need
00:29:20.820 to fix it
00:29:21.400 to get credibility
00:29:22.280 with their audience,
00:29:23.840 she said that
00:29:24.760 the guy who
00:29:25.400 outed them
00:29:25.980 is profoundly
00:29:26.640 disrespectful,
00:29:27.680 hurtful,
00:29:28.020 and demeaning.
00:29:29.560 And calling out
00:29:30.580 his colleagues
00:29:31.120 like that
00:29:31.580 was just
00:29:32.400 not right.
00:29:35.080 That's right.
00:29:35.760 She attacked
00:29:36.860 the whistleblower.
00:29:40.420 She attacked
00:29:41.200 the whistleblower.
00:29:42.700 I don't know
00:29:43.420 if there's any way
00:29:44.220 you could make
00:29:44.700 that any worse.
00:29:46.220 Yeah.
00:29:46.620 Let's shit
00:29:47.100 on the white guy.
00:29:49.080 That's the way
00:29:50.260 to play this.
00:29:52.220 Now,
00:29:53.300 may I say again
00:29:54.740 that whatever
00:29:56.600 was happening
00:29:57.180 at NPR,
00:29:58.640 if you say
00:29:59.300 that NPR
00:29:59.880 had a
00:30:00.540 philosophical
00:30:02.020 argument
00:30:02.720 and that's
00:30:03.840 why they act
00:30:04.760 the way they are,
00:30:05.920 it's based
00:30:06.380 on the philosophy.
00:30:08.740 It's not based
00:30:09.500 on the philosophy.
00:30:10.720 These are
00:30:11.240 batshit crazy people.
00:30:13.540 They're batshit crazy.
00:30:15.260 They're clearly
00:30:15.920 suffering from TDS
00:30:17.240 because remember
00:30:18.080 it all happened
00:30:18.660 from the Trump
00:30:19.700 time on,
00:30:20.740 it got worse.
00:30:22.320 No,
00:30:22.700 this is
00:30:23.120 mental health
00:30:24.920 and to imagine
00:30:26.680 this is anything
00:30:27.320 but bad.
00:30:29.260 Oh my God.
00:30:31.780 The locals people,
00:30:33.000 not everybody
00:30:33.760 can do this
00:30:34.320 but the locals people
00:30:35.340 can include
00:30:36.100 images.
00:30:37.840 They're showing me
00:30:38.720 images of the
00:30:39.560 NPR CEO.
00:30:42.880 It's exactly
00:30:43.800 what you think
00:30:44.420 it is.
00:30:47.120 That's perfect.
00:30:49.700 All right,
00:30:50.660 so I saw a post
00:30:51.920 by Tom Elliott
00:30:52.680 on the X platform
00:30:53.780 and it was
00:30:55.040 an MSNBC clip
00:30:56.500 of somebody
00:30:57.580 doing something.
00:30:59.520 Now,
00:31:00.460 I'm not even
00:31:01.720 going to tell you
00:31:02.120 what it was about
00:31:02.840 because it's not
00:31:03.680 relevant.
00:31:04.700 Here's the funny
00:31:05.520 story.
00:31:06.800 If you're on
00:31:07.580 the X platform
00:31:08.340 and somebody
00:31:10.220 reposts a clip
00:31:11.580 from MSNBC
00:31:12.640 and they don't
00:31:14.440 put a comment
00:31:15.000 on it,
00:31:15.580 they don't put
00:31:16.280 their own comment,
00:31:17.220 it's just
00:31:17.560 take a look
00:31:18.320 at this.
00:31:20.500 It's always
00:31:21.240 for humor.
00:31:22.500 Have you noticed
00:31:23.260 that?
00:31:24.420 That the right
00:31:25.280 actually uses
00:31:26.280 MSNBC,
00:31:27.340 no joke,
00:31:28.020 this is not
00:31:28.500 hyperbole,
00:31:29.740 we use it
00:31:30.320 like Saturday
00:31:30.860 Night Live
00:31:31.580 with no comments
00:31:33.500 needed.
00:31:34.620 So I was looking
00:31:35.300 at one today
00:31:35.900 and, you know,
00:31:37.040 as soon as I saw
00:31:37.900 it was reposted,
00:31:39.240 I knew it was
00:31:39.880 for comedy.
00:31:41.580 I knew it had
00:31:42.360 nothing to do
00:31:42.880 with their philosophy
00:31:43.700 or their difference
00:31:44.580 of opinion.
00:31:45.440 In fact,
00:31:45.860 I don't even
00:31:46.380 remember what
00:31:46.960 it was about.
00:31:47.900 I just remember
00:31:48.440 it was funny
00:31:49.020 because I got
00:31:50.360 to see some
00:31:50.920 batshit crazy
00:31:51.620 people
00:31:53.340 talking.
00:31:54.320 And if you see
00:31:54.880 batshit crazy
00:31:55.680 people,
00:31:56.940 and I've said
00:31:57.340 this before,
00:31:57.960 but once you
00:31:58.380 see it,
00:31:58.700 it's hilarious.
00:31:59.760 If you're only
00:32:00.760 watching somebody
00:32:01.520 crazy,
00:32:02.160 it's disturbing.
00:32:03.680 That's not funny.
00:32:05.140 Would you agree?
00:32:06.480 If you see
00:32:07.020 somebody who's
00:32:07.540 in mental distress,
00:32:09.340 that's not funny.
00:32:11.680 There's no way
00:32:12.280 you can make
00:32:12.660 that funny.
00:32:13.140 The only way
00:32:14.600 it can be funny
00:32:15.360 is when it's
00:32:16.960 combined with
00:32:17.860 smugness.
00:32:19.760 And that's
00:32:20.480 what MSNBC
00:32:21.300 does.
00:32:22.120 They combine
00:32:22.960 it with the
00:32:23.420 smugness that
00:32:24.320 they're right.
00:32:25.620 And when you
00:32:26.100 add the mental
00:32:26.780 illness to the
00:32:27.580 smugness,
00:32:28.680 I can't stop
00:32:30.100 watching it.
00:32:31.700 Honestly,
00:32:32.960 I just want to
00:32:33.980 watch Lawrence
00:32:35.100 O'Donnell and
00:32:36.000 Rachel Maddow
00:32:37.220 with their
00:32:37.740 smug smile.
00:32:39.440 Well,
00:32:40.920 let me see if
00:32:41.920 I can do an
00:32:42.360 impression
00:32:43.140 of being
00:32:44.140 crazy and
00:32:45.260 smug at the
00:32:45.880 same time.
00:32:47.720 Well,
00:32:49.260 I don't think
00:32:51.320 a lot of
00:32:51.640 Republicans
00:32:52.080 understand that
00:32:54.240 oil doesn't
00:32:55.000 even need to
00:32:55.400 be drilled
00:32:55.740 for.
00:32:56.080 It's free.
00:32:57.400 It's all over
00:32:58.100 the place.
00:32:58.580 It's on the
00:32:58.920 ground.
00:33:00.100 I've got oil
00:33:00.660 in my pocket.
00:33:03.400 And I
00:33:05.220 guess if you're
00:33:05.780 a Republican,
00:33:06.860 you think you
00:33:07.400 have to drill
00:33:08.980 holes in the
00:33:09.460 ground for it.
00:33:11.360 Am I right,
00:33:12.140 people?
00:33:13.140 They're out
00:33:13.580 there like
00:33:14.000 drilling holes
00:33:15.060 in the
00:33:15.300 ground.
00:33:17.760 Oh,
00:33:18.320 smugly.
00:33:19.680 I'm so
00:33:20.340 smug.
00:33:22.720 Oil's free.
00:33:24.140 Oil's free,
00:33:24.840 people.
00:33:25.960 It's laying
00:33:26.560 all over the
00:33:27.000 place.
00:33:27.700 In fact,
00:33:28.200 if you go
00:33:28.600 out in your
00:33:28.840 car,
00:33:29.820 you can go
00:33:30.500 out in your
00:33:30.740 car and it's
00:33:31.120 full of oil.
00:33:33.040 Does anybody
00:33:33.680 say that?
00:33:34.220 No.
00:33:34.820 No.
00:33:35.120 Does Trump
00:33:35.540 ever mention
00:33:36.140 that there's
00:33:37.440 oil right
00:33:37.860 there in
00:33:38.140 your car?
00:33:39.460 No.
00:33:39.880 He needs
00:33:40.180 to start
00:33:40.580 a war to
00:33:42.380 go look for
00:33:42.860 some oil
00:33:43.300 like a
00:33:43.680 dictator.
00:33:46.880 That's
00:33:47.240 how a
00:33:47.460 dictator
00:33:47.700 does.
00:33:49.620 They start
00:33:50.280 a war to
00:33:51.460 look for
00:33:51.820 oil, but
00:33:52.220 they don't
00:33:52.500 need to
00:33:52.820 because your
00:33:53.220 car has
00:33:53.680 oil right
00:33:54.200 in it.
00:33:55.080 If you
00:33:55.220 don't believe
00:33:55.560 me, open
00:33:56.600 the hood.
00:33:57.580 There's a
00:33:57.940 little round
00:33:58.280 thing.
00:33:58.580 You can
00:33:58.840 take that
00:33:59.320 out and
00:34:00.080 there's a
00:34:00.460 stick in
00:34:00.740 it.
00:34:01.520 It's called
00:34:02.060 the dipstick,
00:34:02.380 just
00:34:03.500 like President
00:34:04.080 Trump.
00:34:04.580 Am I
00:34:04.780 right?
00:34:07.080 Dipstick.
00:34:08.320 Well, isn't
00:34:08.860 that a
00:34:09.140 coincidence?
00:34:11.460 Yeah.
00:34:12.260 I think
00:34:12.900 that's enough
00:34:13.440 of Smug
00:34:14.360 Scott and
00:34:14.920 MSNBC.
00:34:18.420 Well,
00:34:18.980 Julian
00:34:19.280 Assange, of
00:34:19.860 course we're
00:34:20.280 going to
00:34:20.460 talk about
00:34:20.980 Iran and
00:34:21.660 Israel.
00:34:22.520 Of course
00:34:22.920 we are.
00:34:24.300 Of course
00:34:24.780 we are.
00:34:25.640 But it's
00:34:26.020 the most
00:34:26.280 boring war
00:34:26.940 ever so
00:34:27.460 far, so
00:34:28.060 it didn't
00:34:28.800 even get
00:34:29.200 in the
00:34:29.420 top ten
00:34:29.800 stories
00:34:30.200 today.
00:34:31.480 Israel,
00:34:32.220 Iran, you're
00:34:32.980 going to
00:34:33.240 need something
00:34:34.140 a little
00:34:36.100 more, you
00:34:36.960 know, a
00:34:38.480 little more
00:34:38.780 sizzle.
00:34:41.680 Oh, we
00:34:42.360 had a war
00:34:42.820 today, nobody
00:34:43.820 was killed.
00:34:46.520 Really?
00:34:48.200 And that's
00:34:48.960 supposed to
00:34:49.400 impress me?
00:34:50.840 You had a
00:34:51.260 war with
00:34:51.640 nobody killed?
00:34:52.120 We'll get
00:34:52.480 to it.
00:34:53.100 We'll get
00:34:53.600 to it.
00:34:53.920 It's just a
00:34:54.440 boring war.
00:34:55.920 It's going
00:34:56.500 to be, it's
00:34:57.420 coming up.
00:35:00.000 Julian
00:35:00.420 Assange warned
00:35:01.200 us a while
00:35:01.700 ago that the
00:35:02.600 goal of
00:35:03.680 war is
00:35:04.220 never to
00:35:04.720 end the
00:35:05.080 war, and
00:35:06.380 it's never
00:35:06.720 to win.
00:35:07.240 It's to
00:35:07.780 drain money
00:35:08.780 out of the
00:35:09.300 pockets of
00:35:09.880 the citizens
00:35:10.360 and put it
00:35:11.580 in the
00:35:11.860 pockets of
00:35:12.460 the powerful
00:35:13.040 people.
00:35:13.980 Is Julian
00:35:14.760 Assange wrong
00:35:15.580 that the
00:35:16.720 real purpose
00:35:17.420 of war is
00:35:18.040 to transfer
00:35:18.620 your money
00:35:19.180 to the rich
00:35:20.020 people who
00:35:20.480 already have
00:35:20.920 a lot?
00:35:22.540 He's not
00:35:23.280 wrong.
00:35:24.020 It's not the
00:35:24.660 only thing
00:35:25.120 happening.
00:35:26.820 But he's
00:35:27.420 not wrong,
00:35:28.740 meaning that it
00:35:29.660 does in
00:35:30.120 fact do
00:35:30.560 that, and
00:35:31.660 those powerful
00:35:32.280 people have
00:35:32.920 more control
00:35:33.640 over whether
00:35:34.140 we go to
00:35:34.580 war than
00:35:35.100 you do.
00:35:36.660 So it's
00:35:37.560 sort of
00:35:37.840 right.
00:35:38.720 It's not
00:35:39.100 the whole
00:35:39.400 story.
00:35:41.000 What are
00:35:41.400 you showing
00:35:41.740 me here?
00:35:44.720 Let me
00:35:45.080 look at
00:35:45.540 this.
00:35:48.600 The bright
00:35:49.340 side is
00:35:49.860 world wars
00:35:50.580 are the
00:35:50.840 only ones
00:35:51.280 we win.
00:35:52.000 Right.
00:35:53.000 Exactly.
00:35:53.480 All right.
00:35:58.320 We'll get
00:35:59.000 to that.
00:36:01.420 But did
00:36:02.180 you notice
00:36:02.620 that this
00:36:03.400 time Israel
00:36:04.920 put a price
00:36:05.600 tag on
00:36:06.560 their latest
00:36:07.000 action?
00:36:08.460 So Iran,
00:36:09.260 you know,
00:36:09.780 we'll talk
00:36:10.540 more, but
00:36:11.000 they sent
00:36:11.400 some missiles
00:36:11.900 and some
00:36:12.420 drones, and
00:36:13.180 Israel defeats
00:36:14.580 almost all of
00:36:15.260 them with
00:36:16.000 their help,
00:36:16.920 and then
00:36:18.100 they put a
00:36:18.480 price tag
00:36:18.920 on it.
00:36:19.680 They said,
00:36:20.220 well, it
00:36:20.720 costs a
00:36:21.160 little over
00:36:21.540 $1 billion
00:36:22.160 to defend.
00:36:24.100 Have you
00:36:24.620 ever seen
00:36:24.980 anybody put
00:36:26.200 a price tag
00:36:26.900 on a
00:36:27.280 single battle?
00:36:30.480 Has that
00:36:31.060 ever happened
00:36:31.480 before?
00:36:32.960 Do you know
00:36:33.560 why they put
00:36:33.940 a price tag
00:36:34.500 on it?
00:36:38.280 Because
00:36:38.720 they're asking
00:36:39.280 for money
00:36:39.900 from the
00:36:40.460 United States.
00:36:41.460 So Israel
00:36:42.020 put a price
00:36:42.640 tag on
00:36:43.100 their action.
00:36:44.500 Well, here's
00:36:44.900 another billion
00:36:45.440 dollars.
00:36:45.920 That's why
00:36:46.220 we're asking
00:36:46.620 for money.
00:36:48.040 Now, the
00:36:48.820 fact that
00:36:49.240 they had
00:36:49.660 the guts
00:36:50.640 to put a
00:36:51.820 price tag
00:36:52.360 on that,
00:36:53.480 is number
00:36:54.940 one, really
00:36:55.780 good persuasion.
00:36:58.400 It's very
00:36:59.060 good persuasion
00:36:59.840 because it
00:37:01.320 fits with
00:37:01.820 their
00:37:02.000 trying to
00:37:02.520 get funding,
00:37:03.160 et cetera.
00:37:04.320 But it's
00:37:05.920 so on the
00:37:06.560 nose.
00:37:07.860 The trouble
00:37:08.400 is that when
00:37:09.000 you've got
00:37:09.380 this Assange
00:37:10.080 quote sitting
00:37:10.860 out there,
00:37:11.820 that it's
00:37:12.040 really all
00:37:12.520 about transferring
00:37:13.380 money to rich
00:37:14.160 people, and
00:37:15.180 then Israel
00:37:15.660 gives you a
00:37:16.180 price tag for
00:37:16.900 the battle,
00:37:17.320 people, and
00:37:19.180 it's over
00:37:19.560 a billion
00:37:19.920 dollars, that
00:37:21.920 does really
00:37:22.520 play to
00:37:23.020 Assange's
00:37:23.820 view that
00:37:24.700 maybe these
00:37:26.220 wars are a
00:37:26.800 little more
00:37:27.200 optional than
00:37:28.100 they're
00:37:28.500 presented to
00:37:29.100 us.
00:37:30.080 Now, I'm
00:37:31.460 no fool, so
00:37:32.420 I know the
00:37:32.880 Middle East is
00:37:33.400 complicated, and
00:37:34.220 it's not all
00:37:34.780 about the
00:37:35.200 money, but
00:37:36.360 the money is
00:37:36.940 always going to
00:37:37.440 be a big
00:37:37.860 part of it.
00:37:39.280 It's not
00:37:40.040 nothing.
00:37:41.040 I don't think
00:37:41.680 it's most of
00:37:42.280 it, and I
00:37:42.700 don't think it's
00:37:43.120 the main reason
00:37:43.780 anybody's doing
00:37:44.340 anything.
00:37:44.680 But it's
00:37:46.040 a lot.
00:37:49.340 All right,
00:37:49.580 Bill Gates
00:37:50.040 and Jamie
00:37:51.940 Dimon both
00:37:53.300 think that
00:37:53.840 AI could lead
00:37:54.640 to shorter
00:37:55.080 work weeks.
00:37:56.100 I say those
00:37:57.020 are terrible
00:37:57.780 predictions because
00:38:00.260 billionaires don't
00:38:01.960 work less when
00:38:02.680 they become
00:38:03.100 billionaires.
00:38:05.000 The fact that
00:38:05.800 you're able to
00:38:06.700 work less doesn't
00:38:07.860 seem to be
00:38:08.480 related to how
00:38:09.160 much people
00:38:09.580 work.
00:38:10.920 Am I right?
00:38:11.780 I mean, Elon
00:38:12.280 Musk, he's not
00:38:13.000 taking a day
00:38:13.580 off.
00:38:13.860 Bill Gates
00:38:14.880 himself, he
00:38:15.620 just changed
00:38:16.120 what he did.
00:38:17.080 He changed
00:38:17.500 it to charity,
00:38:20.460 to giving it
00:38:20.920 away, but
00:38:21.360 he's still
00:38:22.380 working.
00:38:23.680 And Jamie
00:38:24.800 Dimon, does
00:38:25.640 Jamie Dimon
00:38:26.280 need to work?
00:38:27.900 How much
00:38:28.520 money does
00:38:28.960 Jamie Dimon
00:38:29.560 have?
00:38:29.960 He doesn't
00:38:30.260 need to work,
00:38:31.500 but he still
00:38:31.980 works.
00:38:32.860 So the very
00:38:33.660 people who are
00:38:34.260 saying that we're
00:38:34.860 going to work
00:38:35.340 less when we
00:38:37.120 don't need to
00:38:37.680 work are the
00:38:39.040 ones who are
00:38:39.580 working even
00:38:40.460 though they don't
00:38:40.960 need to work.
00:38:41.560 I feel
00:38:42.620 like they're
00:38:43.380 disproving their
00:38:44.140 own argument.
00:38:46.960 I don't need
00:38:47.720 to work, but
00:38:49.940 you see me here
00:38:50.580 literally seven
00:38:51.340 days a week.
00:38:52.440 I work seven
00:38:53.260 days a week, but
00:38:54.780 I don't need to.
00:38:56.740 It's just who I
00:38:57.620 am.
00:38:58.400 I wouldn't know
00:38:59.040 how to not
00:38:59.500 work.
00:39:00.440 So here's what I
00:39:01.180 think.
00:39:01.460 I think the
00:39:01.960 aggressive people
00:39:02.780 who need to
00:39:03.280 work for whatever
00:39:03.980 reason will use
00:39:05.900 AI to work their
00:39:07.180 usual long hours.
00:39:08.580 They'll just get
00:39:09.200 more done because
00:39:10.540 they have more
00:39:10.940 tools.
00:39:12.080 So no, it's
00:39:12.700 not going to, it
00:39:13.940 might actually take
00:39:14.920 the lazy people
00:39:15.700 down to three
00:39:16.320 days a week.
00:39:17.000 So I think
00:39:17.340 that's true.
00:39:18.380 But it will
00:39:20.980 just be an
00:39:21.440 acceleration of a
00:39:22.580 current trend,
00:39:23.400 which is the
00:39:23.880 aggressive people
00:39:24.540 keep working and
00:39:25.500 the lazy people
00:39:26.300 try to stop.
00:39:29.600 Facebook says
00:39:30.560 it's going to
00:39:31.760 spend $20
00:39:32.360 billion and
00:39:33.240 has 40,000
00:39:34.160 people working
00:39:34.880 on safeguarding
00:39:36.400 elections worldwide.
00:39:37.400 Okay, that
00:39:40.240 doesn't sound too
00:39:41.080 suspicious.
00:39:42.660 Let me pull some
00:39:43.800 things together for
00:39:44.680 you.
00:39:45.240 In our world, it's
00:39:46.560 almost impossible to
00:39:47.720 get as big as
00:39:48.420 Facebook or any
00:39:49.820 other big company
00:39:50.620 unless the
00:39:52.240 government is on
00:39:52.960 your side.
00:39:55.680 Right?
00:39:56.720 Because at some
00:39:57.640 point, you need to
00:39:58.840 be able to control
00:39:59.760 other countries and
00:40:01.920 the laws need to be
00:40:02.920 on your side and you
00:40:04.000 need to not get
00:40:04.700 sued for stuff.
00:40:05.640 And at some
00:40:06.800 point, you just
00:40:07.620 need to have the
00:40:08.640 government on your
00:40:09.300 side and they
00:40:09.960 know that.
00:40:11.380 So the government
00:40:11.960 and their spooks can
00:40:14.080 pretty much control
00:40:14.920 any large company
00:40:16.040 because any large
00:40:17.760 company needs the
00:40:18.560 government to be on
00:40:19.260 their side.
00:40:19.960 There's just too
00:40:20.400 many obstacles that
00:40:21.320 only a government
00:40:22.080 can remove.
00:40:23.800 So here we have
00:40:25.040 Zuckerberg who has
00:40:28.260 a social media
00:40:29.000 company that was
00:40:29.860 built for the
00:40:30.540 purpose of guys
00:40:31.460 trying to get laid
00:40:32.380 with other college
00:40:33.500 people
00:40:34.560 and
00:40:35.880 turned into this
00:40:36.580 behemoth and
00:40:38.000 now he wants to
00:40:38.760 spend 20 billion
00:40:39.700 and 40,000 people
00:40:41.100 to make sure that
00:40:42.380 elections in other
00:40:43.780 countries, this
00:40:44.600 isn't just about
00:40:45.280 America, people.
00:40:46.360 This is global.
00:40:49.420 And that Facebook
00:40:50.140 will have the
00:40:50.780 largest fact-checking
00:40:52.020 network with
00:40:53.240 partners in South
00:40:54.220 Africa, blah, blah,
00:40:56.320 blah.
00:40:56.420 And it seems to
00:41:00.960 me that what is
00:41:03.600 happening is that
00:41:04.520 the CIA has told
00:41:07.120 Facebook, you guys
00:41:08.780 couldn't possibly be
00:41:09.860 in business without
00:41:10.620 us and so you're
00:41:12.400 going to now help
00:41:13.040 us control the
00:41:13.880 governments and all
00:41:14.700 these other places.
00:41:15.960 If you can control
00:41:17.280 the media, meaning
00:41:19.240 the fact-checking
00:41:20.280 and the social media
00:41:21.740 online and the
00:41:22.700 news, if you can
00:41:23.940 control those things,
00:41:24.880 you don't need to
00:41:25.480 control the voting
00:41:26.800 machines.
00:41:28.140 You don't need to
00:41:29.180 have any miscounting
00:41:31.420 or shenanigans at
00:41:33.120 the vote if you've
00:41:34.480 controlled all of the
00:41:35.500 information up until
00:41:36.700 election day.
00:41:38.100 So here, right in
00:41:39.080 front of us, we see
00:41:40.340 this huge entity, which
00:41:41.780 almost certainly is
00:41:42.740 being driven by our
00:41:43.900 intelligence people,
00:41:45.480 creating a massive
00:41:46.420 structure to brainwash
00:41:48.640 people in our country
00:41:49.700 and other countries so
00:41:51.040 that the elections are
00:41:52.080 the unimportant part
00:41:53.180 of the process.
00:41:55.700 So the election, the
00:41:57.280 actual voting, is the
00:41:58.680 unimportant part of
00:41:59.600 the process.
00:42:00.400 The important part is
00:42:01.620 what people think,
00:42:03.620 because then the vote
00:42:04.420 just comes from that.
00:42:06.220 So changing how people
00:42:07.180 think is exactly what
00:42:09.680 Facebook is telling you
00:42:11.020 they're doing.
00:42:11.860 They're telling you
00:42:12.780 they're in the business
00:42:13.520 of mind control,
00:42:14.960 except they're phrasing
00:42:16.960 it as giving people
00:42:18.400 accurate information.
00:42:19.300 Do you think that's
00:42:20.980 the only goal, is to
00:42:22.300 give people accurate
00:42:23.140 information?
00:42:24.520 Well, I'm sure that
00:42:25.460 people working on this
00:42:26.420 think of it that way.
00:42:28.740 But no, it's to give
00:42:30.660 them accurate information
00:42:31.880 until they believe what
00:42:33.680 you're telling them, and
00:42:34.520 then you can tell them
00:42:35.300 anything, because you've
00:42:36.780 trained them that it's
00:42:37.500 accurate information.
00:42:39.060 As soon as people
00:42:39.800 believe it's accurate
00:42:40.660 information, you own
00:42:41.820 them.
00:42:43.180 Take Walter Cronkite.
00:42:45.080 When I grew up, and
00:42:47.220 even today, people say,
00:42:49.160 you know, Walter
00:42:49.760 Cronkite, he was the
00:42:51.400 real news.
00:42:53.180 And, you know, you got
00:42:54.520 the straight news, no
00:42:56.000 bias, that Walter
00:42:57.640 Cronkite, I wish we
00:42:59.880 could go back to those
00:43:00.680 times, because he
00:43:01.540 played it right down
00:43:02.520 the middle.
00:43:03.700 Now, look at Walter
00:43:04.880 Cronkite with the
00:43:06.580 benefit of the goggles
00:43:07.860 which you've learned to
00:43:08.980 wear in our modern
00:43:10.020 times.
00:43:11.300 Do you think there's
00:43:12.940 any chance at all that
00:43:14.180 Walter Cronkite was not
00:43:15.380 owned by our
00:43:16.180 intelligence people?
00:43:17.380 Does anybody think
00:43:19.640 that was honest and
00:43:20.800 unbiased news?
00:43:22.660 Of course it wasn't.
00:43:24.860 Of course not.
00:43:26.560 It's never been.
00:43:28.340 Yeah.
00:43:30.340 Now, it does seem to
00:43:31.480 me that Walter Cronkite
00:43:33.420 was probably allowed to
00:43:34.640 tell the news just the
00:43:35.640 way he wanted to, 98%
00:43:37.820 of the time.
00:43:39.100 So if you're watching,
00:43:40.360 you know, some human
00:43:41.220 interest story or somebody
00:43:42.380 got murdered, it's
00:43:43.660 probably exactly what
00:43:44.640 happened.
00:43:44.960 It's only when
00:43:46.680 something matters, as
00:43:48.480 in what's the point of
00:43:49.600 this war or are we
00:43:51.300 winning the war?
00:43:52.760 Are we getting to
00:43:54.000 space?
00:43:54.400 Are we the greatest
00:43:55.080 country?
00:43:55.920 The things that the
00:43:57.060 intelligence people would
00:43:58.000 care about, there's no
00:43:59.680 way in the world he's
00:44:00.540 just winging it on those
00:44:01.620 things.
00:44:02.260 I think they're just
00:44:03.360 telling him what to
00:44:05.340 say and always have.
00:44:09.240 That's my view of the
00:44:10.200 world.
00:44:10.380 So, yes, when Facebook
00:44:12.840 says they're spending
00:44:13.580 $20 billion and have
00:44:14.820 40,000 people dedicated
00:44:16.800 to tell you what the
00:44:17.820 facts are, that means
00:44:20.200 that elections don't
00:44:21.040 matter anymore.
00:44:22.580 That's what it means.
00:44:24.020 Because if they can tell
00:44:25.300 you what the facts are,
00:44:26.620 they can tell you who's
00:44:27.820 going to win the next
00:44:28.560 election.
00:44:30.100 It's the same story.
00:44:32.460 All right.
00:44:34.260 Fog of war over in
00:44:35.500 Israel.
00:44:35.780 As you know, Israel
00:44:38.400 had, I don't know, a
00:44:40.560 few weeks ago, attacked
00:44:41.680 an Iranian asset in
00:44:45.700 another country, an
00:44:46.560 embassy or a consulate or
00:44:47.840 something, killed some
00:44:48.920 people.
00:44:49.660 So Iran said it had to
00:44:50.820 retaliate, and it did,
00:44:52.780 to send around, I
00:44:53.900 don't know, two to
00:44:54.480 three hundred projectiles.
00:44:56.700 So a bunch of drones and
00:44:58.060 a bunch of intercontinental
00:44:59.380 missiles.
00:45:00.720 Allegedly, America and UK
00:45:03.580 and Israel and some
00:45:05.260 other friends knocked
00:45:06.880 down just about every
00:45:08.140 one of those drones and
00:45:09.580 missiles, only maybe a
00:45:10.860 couple got through,
00:45:12.220 causing minor damage,
00:45:13.920 nothing major.
00:45:16.420 Now, have you ever heard
00:45:18.500 the phrase, a theater of
00:45:21.600 war?
00:45:22.940 You used to hear a lot
00:45:23.900 about World War II.
00:45:26.260 It was like the Pacific
00:45:27.260 theater.
00:45:28.840 It was like a theater of
00:45:30.180 war.
00:45:30.460 Did you ever think that
00:45:33.400 was a weird kind of way
00:45:35.440 to say it?
00:45:36.600 Because theater seems like
00:45:37.720 acting, whereas war seems
00:45:39.440 like the most real thing
00:45:40.460 there could ever be.
00:45:41.720 It's a weird combination,
00:45:42.980 a theater of war.
00:45:44.900 But then you fast forward
00:45:46.340 to Israel and Iran, who
00:45:49.120 have apparently negotiated
00:45:50.480 the war in advance.
00:45:52.640 All right, guys, we're
00:45:53.940 going to take out one of
00:45:55.060 your consulate guys.
00:45:57.080 What you'll do is you'll
00:45:58.300 respond with a bunch of
00:45:59.260 missiles, but give us a
00:46:00.500 warning.
00:46:01.120 Make sure that we have
00:46:01.940 our anti-missile stuff in
00:46:04.180 place.
00:46:05.260 We'll shoot down your
00:46:06.500 missiles.
00:46:08.620 But if a few get through,
00:46:10.840 and we'll put a price tag
00:46:12.460 on it, make it sound
00:46:13.220 expensive and dangerous,
00:46:14.700 and then we might need to
00:46:16.460 respond also.
00:46:18.480 So we'll probably respond
00:46:19.840 in some way that's sort of
00:46:21.980 indirect and not quite
00:46:23.900 enough to start a war.
00:46:25.080 And it's literally theater.
00:46:29.400 It's literally theater.
00:46:31.320 It's two countries
00:46:32.300 pretending to be at war.
00:46:34.180 And not only are they
00:46:35.360 pretending, they're telling
00:46:37.120 us they're pretending, and
00:46:38.900 then they're doing it in
00:46:39.640 front of us.
00:46:41.220 I'm going to pretend to
00:46:42.460 fight back, because that's
00:46:44.200 how my population will feel
00:46:45.860 good that we did something.
00:46:47.580 All right, great.
00:46:48.400 We understand that this is a
00:46:49.720 pretend attack.
00:46:50.520 So we won't attack back
00:46:52.640 too hard.
00:46:54.120 But we might need to do a
00:46:57.340 little bit, just a little
00:46:59.040 mop-up pretend attack after
00:47:01.180 this.
00:47:02.400 And Iran would be, ah, we
00:47:04.420 thought that our pretend
00:47:05.340 attack would be all we
00:47:06.340 needed.
00:47:06.660 But I can see that a couple
00:47:08.260 of missiles got through.
00:47:09.680 Yeah.
00:47:10.760 Damn it.
00:47:11.520 You know, it wasn't as
00:47:12.260 clean as we hoped.
00:47:13.400 Yes, you're going to have
00:47:14.160 to probably, I don't know,
00:47:16.440 kill one of our people.
00:47:17.440 Am I wrong that that
00:47:19.840 entire attack from Iran
00:47:21.140 killed zero people?
00:47:23.100 It was one minor injury?
00:47:25.480 I'm not even sure about
00:47:26.480 that.
00:47:27.740 There were zero, it was a
00:47:29.260 major military action with
00:47:30.560 zero deaths.
00:47:32.220 That is a pretend war.
00:47:35.180 That's a pretend war.
00:47:38.480 Now, look at Assange.
00:47:41.200 Assange says that war is
00:47:42.560 basically to transfer assets
00:47:44.200 to other people.
00:47:45.640 Right.
00:47:47.440 Why do you fight a pretend
00:47:49.420 war, other than for some other
00:47:52.240 reason, like putting your
00:47:55.200 assets somewhere else?
00:47:57.160 I don't know.
00:47:57.940 It's all very pretend.
00:48:01.260 So here's what I think we
00:48:02.540 should do.
00:48:05.020 I think Israel should say
00:48:07.940 that Iran is the problem,
00:48:09.320 not the proxies, because
00:48:11.460 fighting each other's proxies
00:48:13.000 is a pretend war.
00:48:14.840 It's just theater.
00:48:15.620 Oh, I'll pretend I'm
00:48:17.380 fighting you by fighting
00:48:18.360 your proxy.
00:48:19.400 Well, damn you.
00:48:20.900 I'll pretend I'm fighting
00:48:22.020 back by fighting your
00:48:23.060 proxy.
00:48:25.500 But now you've got this
00:48:26.740 pretend attack on the
00:48:27.900 mainland.
00:48:29.280 Now, it was a real attack
00:48:30.760 on, you know, the actual
00:48:32.280 homeland of Israel.
00:48:33.560 But because both sides knew
00:48:35.740 that their anti-drone
00:48:38.100 defenses would be really
00:48:39.280 good, drones move slowly,
00:48:41.380 I think everybody knew it
00:48:43.780 was going to end this way
00:48:44.640 with not much in the way of
00:48:45.800 damage.
00:48:46.520 So that's kind of a pretend
00:48:47.820 war.
00:48:48.940 But here's how Israel can make
00:48:50.480 it real, if they wanted to.
00:48:53.380 I think Israel should demand
00:48:55.400 reparations from Iran, because
00:48:58.060 Iran has been their nemesis for
00:49:00.800 years and caused all of their
00:49:02.300 expenses.
00:49:03.540 Iran caused them to spend a
00:49:04.920 billion dollars to defend
00:49:06.160 themselves.
00:49:07.220 Iran causes them to defend
00:49:08.860 themselves against Hezbollah,
00:49:10.160 because they back them.
00:49:11.560 October 7th was because of
00:49:13.220 Iran's backing.
00:49:14.140 Every terrorist attack is
00:49:15.460 because of Iran.
00:49:17.420 So here's what you do.
00:49:19.440 You take Gaza, and you call
00:49:21.920 it reparations.
00:49:23.500 And you say, hey, all you
00:49:25.080 people, if you want to
00:49:26.280 resettle, Iran is the cause
00:49:28.540 of this.
00:49:29.520 Iran needs to take you.
00:49:31.080 And if they don't take you,
00:49:32.060 we'll just leave you in these
00:49:32.980 camps.
00:49:34.580 Forever.
00:49:35.020 Forever.
00:49:36.220 So, because your home is in
00:49:37.340 Iran now.
00:49:38.420 If you let Iran be your
00:49:39.820 daddy, well, you live with
00:49:41.900 your daddy.
00:49:43.420 All right?
00:49:44.140 Israel isn't your daddy.
00:49:46.420 Your daddy funded your war.
00:49:48.680 Go live with your daddy.
00:49:50.480 And we don't care if you do or
00:49:51.680 you don't.
00:49:52.720 But one thing for sure is you
00:49:54.940 just gave away Gaza forever.
00:49:57.240 Gaza will become an Israeli
00:49:58.620 property.
00:50:00.080 Managed by Israelis, and I'm not
00:50:01.840 even sure any Palestinians or
00:50:03.460 Hamas people will ever be let
00:50:05.060 back in.
00:50:06.180 Because we got fucking
00:50:08.360 nothing from trying to make
00:50:10.740 peace.
00:50:11.740 We got fucking nothing.
00:50:14.020 So we're just going to take
00:50:14.980 Gaza.
00:50:16.460 And anybody who complains,
00:50:18.060 complain to Iran.
00:50:20.060 You can take your complaint to
00:50:21.380 Iran.
00:50:21.740 I'm not even going to answer
00:50:22.440 your question.
00:50:23.600 Blah, blah, blah.
00:50:24.580 Bad things happened to the
00:50:26.500 Gazans.
00:50:27.080 Yes, it did.
00:50:28.160 Terrible things happened to
00:50:29.240 them.
00:50:29.740 Talk to Iran.
00:50:30.400 Again, we're done pretending
00:50:32.920 this is our fault.
00:50:34.880 Now, in the real world, is
00:50:36.780 Israel at fault for the current
00:50:38.480 situation?
00:50:39.120 Of course.
00:50:40.440 Of course.
00:50:43.280 They are our ally.
00:50:45.920 But did they participate in
00:50:48.480 creating the current situation?
00:50:50.000 Of course.
00:50:51.220 Are there any innocent people?
00:50:52.640 I don't think so.
00:50:54.100 No.
00:50:54.440 I think everybody there sucks,
00:50:55.780 basically.
00:50:56.980 Just in different ways.
00:50:57.960 I'm talking about the
00:50:58.640 government, not the people.
00:51:00.080 People are fine.
00:51:01.440 I like the people.
00:51:03.260 But, you know, governments are
00:51:04.580 sketchy organizations, and
00:51:06.720 especially if you're in a
00:51:07.920 sketchy part of the world.
00:51:11.160 So if Israel wants to be
00:51:13.780 serious about this, take Gaza,
00:51:15.620 keep it, call it reparations,
00:51:18.560 and say, we'd like some more
00:51:20.980 land, too.
00:51:21.680 So if we want to keep going,
00:51:23.300 we'd like to own a little bit
00:51:24.800 of Lebanon.
00:51:27.340 Because Hezbollah's got some
00:51:28.560 nice property up there.
00:51:29.640 I'd hate to see them lose it.
00:51:32.820 See what I'm saying?
00:51:35.220 Israel has to get bigger every
00:51:36.700 time they get attacked.
00:51:37.760 Otherwise, it just happens
00:51:38.700 forever.
00:51:40.140 So I'm in favor of them just
00:51:42.340 keeping Gaza.
00:51:44.040 Do I think that that would
00:51:45.200 constitute a genocide?
00:51:47.180 Yes.
00:51:48.500 Yes, it would.
00:51:49.220 Do I think that going back to
00:51:54.400 the way it was is some kind
00:51:55.800 of an advantage?
00:51:56.600 No.
00:51:57.380 Going back to the way it was
00:51:58.680 will get you more of this.
00:52:00.700 There's nothing good going back
00:52:02.180 to the way it was.
00:52:03.700 Repopulating it would be just
00:52:04.980 stupid.
00:52:07.280 So you need a good excuse for
00:52:09.260 not repopulating it and keeping
00:52:10.900 it.
00:52:12.040 And Iran just gave you one.
00:52:13.200 They just attacked your
00:52:14.440 homeland.
00:52:15.520 You attack our homeland, we're
00:52:17.460 going to send you back your
00:52:18.400 people.
00:52:19.360 I know they're not your
00:52:20.340 people, but they are now.
00:52:22.460 Because we can't take care of
00:52:24.780 them and we don't have the
00:52:26.240 resources.
00:52:27.380 But Iran does.
00:52:28.620 So maybe Iran should take care
00:52:29.980 of their people instead of
00:52:31.500 putting them in a situation
00:52:32.440 where they can only die.
00:52:39.640 All right.
00:52:40.720 Scott is a bit weak on that
00:52:42.480 idea.
00:52:44.460 How about you use a not
00:52:46.140 fucked up word for your
00:52:48.060 opinion?
00:52:49.380 How about you say what you
00:52:50.580 think is important that
00:52:52.520 hasn't been mentioned instead
00:52:54.200 of saying I'm weak on it like
00:52:56.360 a fucking turd?
00:52:58.520 How about not saying that
00:52:59.620 while I'm sitting there reading
00:53:00.680 the comments in front of you?
00:53:02.020 If I weren't here, that would
00:53:03.200 be a perfectly fine thing to
00:53:05.160 say.
00:53:06.200 Like if you talk about somebody
00:53:07.420 who's not going to read it, you
00:53:08.380 say, well, they're weak on that
00:53:09.560 topic.
00:53:10.900 But I'm right here in front of
00:53:12.260 you.
00:53:12.440 So if you say I'm weak on it
00:53:16.760 and you don't tell me what is
00:53:18.720 the objection, you're just being
00:53:20.660 an asshole.
00:53:22.900 So I'm going to call you out on
00:53:24.140 that.
00:53:26.760 I need reparations from the King
00:53:28.480 of England.
00:53:30.640 Yes, you do.
00:53:34.140 All right.
00:53:34.680 That's my idea.
00:53:37.440 Hezbollah got in on the airstrikes.
00:53:39.160 They don't know that their real
00:53:40.340 estate's at risk now.
00:53:42.080 Now, I do understand that Israel
00:53:43.420 tried to occupy some of Lebanon
00:53:45.560 at one point and it was just
00:53:47.580 untenable.
00:53:49.720 So that's a different situation.
00:53:51.440 But they've already essentially
00:53:53.140 conquered Gaza and depopulated it.
00:53:55.980 So that's it.
00:53:56.820 I wouldn't compare that to Lebanon.
00:54:00.680 All right.
00:54:01.320 Israel pledged an unprecedented
00:54:04.040 response to Iran's attack.
00:54:07.260 What do you think would be an
00:54:08.580 unprecedented response?
00:54:12.120 Keeping Gaza.
00:54:13.700 Just say, all right, we're just
00:54:15.180 going to keep Gaza.
00:54:16.520 There'd never be a better time to
00:54:17.760 do it.
00:54:20.520 Thomas Massey is quite sure that
00:54:22.480 the Israel situation is going to
00:54:24.800 lead to funding Israel and then
00:54:26.800 making it a package to fund
00:54:28.160 Ukraine.
00:54:29.860 And they'll be
00:54:31.280 packaged together because that's
00:54:32.720 how the people who like wars can
00:54:35.440 get their war.
00:54:36.380 They'll make one of them
00:54:37.480 irresistible and package it with
00:54:39.360 the one that's resistible.
00:54:41.700 So that's going to happen.
00:54:43.980 Yes, Massey, you're completely
00:54:45.460 right.
00:54:48.800 Elon Musk said in a just sort of an
00:54:51.920 offhand quip to a question that
00:54:54.360 AI will run the U.S.
00:54:55.660 government by 2032.
00:54:57.260 Now, I'll vote against that
00:55:00.600 prediction.
00:55:03.280 I don't think there's any chance
00:55:04.640 we're going to give up the country
00:55:06.560 to AI in 2032.
00:55:08.720 But it's a provocative thing to say.
00:55:10.720 I'm not sure he was totally serious
00:55:12.500 when he said it, but
00:55:13.560 it's a head-scratcher.
00:55:16.960 I mean, when it comes from Musk,
00:55:18.800 I think what he's responding to is
00:55:21.800 the fact that by 2032,
00:55:23.400 AI will be so super intelligent
00:55:26.000 that we'd wish it were running the
00:55:29.000 country.
00:55:29.740 But we're not going to let it.
00:55:31.500 That's a different thing.
00:55:34.260 At a Trump rally, people started,
00:55:37.260 I guess last night, people started
00:55:38.960 chanting, genocide, Joe.
00:55:41.780 And Trump let them chant, as he
00:55:45.880 sometimes does, you know, he walks
00:55:47.820 away from the podium for a moment
00:55:49.620 to let the crowd do its thing.
00:55:51.740 And they're chanting, genocide, Joe.
00:55:54.140 And then.
00:55:57.960 And then what Trump said was, you
00:56:00.260 know, they're not wrong.
00:56:01.920 He goes, they're not wrong.
00:56:04.160 But you see the problem here.
00:56:07.900 The problem here is that he sided
00:56:10.160 with the people saying it's
00:56:11.380 genocide.
00:56:11.820 And he doesn't want to do that
00:56:16.240 because he's calling our ally Israel
00:56:19.780 a genocidal company or, you know,
00:56:23.500 country.
00:56:24.340 So I think that he was just
00:56:26.680 operating on feel and instinct when
00:56:29.140 he agreed with the crowd because
00:56:31.400 he's reading the crowd and
00:56:32.900 getting them on his side.
00:56:34.800 But I feel like he's going to have to
00:56:36.900 fix that.
00:56:37.880 He's probably going to have to have
00:56:40.980 a statement that says the
00:56:43.260 genocide is just the fact that
00:56:45.260 there's a war in general.
00:56:47.560 So whatever you call the
00:56:49.040 genocide, you could just say it's
00:56:51.620 the war.
00:56:52.540 So, yes, Biden is responsible for
00:56:54.500 all the war bad.
00:56:55.820 But technically, let's not call it a
00:56:57.920 genocide.
00:56:58.640 But, you know, it's a funny
00:57:00.200 nickname.
00:57:02.260 So I think he might have a little
00:57:03.920 explaining to do on that because
00:57:06.060 it's not always about the nickname.
00:57:08.420 It has to be compatible with his
00:57:10.960 actual philosophy.
00:57:15.100 I'm going to say this again until
00:57:16.600 I get an answer.
00:57:17.740 How do I process the fact that
00:57:20.060 pollsters are telling us
00:57:21.320 unambiguously that Trump is
00:57:23.780 gaining with black and Hispanic
00:57:25.520 voters big time?
00:57:28.120 But yet he's tied with Joe Biden.
00:57:31.860 Who is he losing?
00:57:32.860 What story or trend?
00:57:40.020 He's not losing women because that
00:57:41.900 would be a story.
00:57:43.620 If Trump were bleeding female
00:57:46.060 support, there would be a story just
00:57:48.740 like there's a story that he's
00:57:49.920 gaining blacks and Hispanics.
00:57:53.960 Right?
00:57:55.160 That would be a national headline
00:57:56.760 almost every day if he were losing
00:57:59.260 female support at some
00:58:00.860 significant level.
00:58:02.100 Who is he losing?
00:58:03.680 Here's what it feels like.
00:58:05.740 It feels like this is your tell
00:58:07.760 that polling isn't real.
00:58:11.780 Because it can't be real.
00:58:13.360 It can't be true that he's losing
00:58:14.820 substantial parts of his most
00:58:17.160 important, you know, the bulwark of
00:58:19.200 the Democrat power base.
00:58:21.640 At the same time, he's not making any
00:58:24.000 difference at the top line.
00:58:26.040 It can't be that the bottom lines are
00:58:27.820 all moving while
00:58:28.340 the top line stays stable.
00:58:29.540 Well, what's going on?
00:58:31.820 And by the way, I'm looking at your
00:58:32.840 comments right now.
00:58:33.600 You have no idea, do you?
00:58:36.280 Losing suburban women.
00:58:38.500 Is that the story?
00:58:39.600 Losing suburban women?
00:58:41.460 Because I thought he'd already lost
00:58:43.500 them.
00:58:47.240 Are you sure that's not just a
00:58:48.660 continuation of the forever story
00:58:50.860 that he doesn't have suburban women?
00:58:53.060 Are they actually moving?
00:58:54.580 Yeah, I know.
00:58:58.620 So let's look into it.
00:59:00.060 Let's look into suburban women and
00:59:02.180 see if that's a move.
00:59:03.300 I feel like they're not moving.
00:59:05.960 Like, that doesn't feel real to me.
00:59:08.320 I feel like, you know, a lot of them
00:59:10.520 are Democrat, but I don't feel like
00:59:11.780 they're moving.
00:59:13.860 I don't know.
00:59:15.600 Why would they not already be there?
00:59:17.760 It doesn't make sense.
00:59:18.600 Well, Libs of TikTok is showing a
00:59:23.880 clip about, there's a, what they
00:59:27.240 would call a male, a trans
00:59:30.500 athlete competing in a girls' varsity
00:59:33.760 200 meters in Oregon.
00:59:35.840 And the trans athlete set a new record
00:59:39.220 for the girls.
00:59:41.240 The trans athlete, like, just
00:59:43.100 practically, they lapped the field.
00:59:44.500 And it's pretty hilarious when you
00:59:47.180 look at it, because
00:59:48.680 pretty big difference in performance.
00:59:52.520 Now, there are many ways to look at
00:59:54.860 these stories.
00:59:56.860 And I, you know, one way is, my God,
00:59:59.380 what's happening to the country?
01:00:01.720 That's one way.
01:00:02.740 Another is, this is so unfair for the
01:00:05.720 women.
01:00:06.940 And I totally get that.
01:00:08.800 I totally get that.
01:00:11.060 But I prefer to take the more
01:00:12.760 optimistic view.
01:00:14.500 If you don't mind, you know,
01:00:17.320 nothing's terribly all
01:00:19.500 wrong or all good.
01:00:21.620 There's always an upside.
01:00:23.100 Do you mind if I take the optimistic
01:00:24.740 view of this?
01:00:26.740 Well, in my lifetime, I've seen
01:00:29.080 women make tremendous gains.
01:00:32.120 And I think most anybody in my age
01:00:34.060 group would say the same thing.
01:00:35.560 You know, I went from a world in
01:00:37.080 which, you know, women really
01:00:39.180 couldn't work any job they wanted.
01:00:41.480 There really wasn't an option.
01:00:43.020 But now it is.
01:00:43.720 In fact, women are doing better in
01:00:45.260 college, better at buying their first
01:00:47.440 homes.
01:00:48.800 Women are doing great.
01:00:50.720 So women are more highly educated,
01:00:53.260 commit less crime than men.
01:00:56.180 Certainly, DEI is helping them in
01:00:58.380 employment, and they're doing great.
01:01:00.440 But, but what about men?
01:01:04.500 Here's what I say.
01:01:06.060 I believe that men have what I call
01:01:08.140 an indomitable spirit.
01:01:10.100 And you can, you can try to hold men
01:01:13.900 down and you can say, men, we're going
01:01:17.360 to, white men, white men, we're going
01:01:20.380 to take your jobs.
01:01:21.840 We're going to give them to people of
01:01:24.380 color and women.
01:01:26.320 But you're just going to have to put up
01:01:28.040 with it, men.
01:01:28.540 And I think the white men said,
01:01:32.540 huh, what if we don't?
01:01:35.540 What if we decided that we don't want
01:01:37.280 to lose?
01:01:39.500 And what if we want to win this race so
01:01:41.800 badly that we will remove our penis just
01:01:44.880 to do it?
01:01:47.360 So you might say to yourself, my God, this
01:01:49.760 is so unfair to women.
01:01:50.780 And you'd be right.
01:01:52.560 You might say to yourself, this is, it
01:01:54.980 makes a mockery of sports.
01:01:56.840 You'd be right.
01:01:58.620 But I think you have to appreciate how
01:02:01.340 hard men will try to win when the rules
01:02:03.960 are against them.
01:02:05.320 All right, here are the rules.
01:02:07.520 You can't win.
01:02:10.340 Well, what if I change some things?
01:02:13.300 Like what?
01:02:14.360 My penis.
01:02:14.980 If I get rid of my penis, can I win?
01:02:18.600 Well, maybe.
01:02:20.520 Well, here's how I summarize this entire
01:02:23.840 situation over the years.
01:02:26.760 Women, take a back seat, men.
01:02:29.700 This is our time.
01:02:31.400 Men, hold my baton.
01:02:36.740 That's all.
01:02:37.360 That's the summary.
01:02:38.480 That whole thing was just to work
01:02:39.740 over that one joke.
01:02:41.260 Hold my baton.
01:02:42.840 That's it.
01:02:44.020 Yeah.
01:02:44.420 The whole thing was just for that one
01:02:46.420 joke.
01:02:47.060 If it didn't pay off, I'm sorry.
01:02:48.280 All right, J.D. Vance is talking about
01:02:51.380 Ukraine and says funding Ukraine is
01:02:53.580 crazy because they don't have soldiers
01:02:56.220 and we can only make about 10% of the
01:03:00.020 artillery that they'll ever need.
01:03:02.400 So they can never, they'll never have
01:03:04.600 an offensive option.
01:03:05.720 So this is J.D. Vance's analysis.
01:03:10.540 They'll never have an offensive option
01:03:13.300 because they won't have the soldiers.
01:03:16.080 And there's nobody left.
01:03:17.520 They just don't have enough people left.
01:03:19.700 And we can't make more than, say, 10% of
01:03:23.180 the artillery they need for the next few years.
01:03:25.060 So he suggests that they use the Russia
01:03:29.680 strategy, which is dig in, you know, create
01:03:32.120 defensive lines that the Russians would be
01:03:34.260 crazy to assault, just as the Ukrainians
01:03:37.340 were crazy to assault the dug-in Russian
01:03:40.060 positions.
01:03:41.360 Now, this, of course, would lead you toward
01:03:43.380 a negotiated settlement once you realize that
01:03:46.400 neither side can beat the other.
01:03:48.580 So I think J.D. Vance did a good job here.
01:03:53.200 By the way, what's his background?
01:03:55.720 I know he's an author.
01:03:57.280 But wasn't he working for Peter Thiel?
01:04:00.860 J.D. Vance?
01:04:03.080 Do I have that right?
01:04:05.160 And what was, what's his educational background?
01:04:08.380 Harvard?
01:04:10.380 What was his major?
01:04:13.140 Lawyer?
01:04:14.580 So he's a Harvard lawyer?
01:04:18.420 Harvard-trained lawyer.
01:04:19.940 And then he worked for Peter Thiel.
01:04:22.580 Oh, he was, I thought he was a VC.
01:04:25.060 Yeah, right.
01:04:26.340 Okay, that's what I was looking for.
01:04:27.920 Peter Thiel, venture capitalist.
01:04:29.760 J.D. Vance worked for him, even though his
01:04:31.760 background is Harvard lawyer.
01:04:33.940 He no doubt learned to think like a venture
01:04:36.080 capitalist.
01:04:36.500 And when I look at his analysis,
01:04:40.200 I say to myself, holy cow,
01:04:44.520 somebody who knows how to analyze things
01:04:46.520 just analyzes something.
01:04:48.040 Do you know how rare that is?
01:04:50.160 How rare is that that somebody who knows
01:04:52.340 how to analyze something
01:04:53.520 analyzes something?
01:04:55.280 Do you know who else is good at this?
01:04:57.760 Thomas Massey, MIT, engineer.
01:05:01.280 When he analyzes something,
01:05:04.300 you say to yourself, oh,
01:05:06.300 a good analyzer just analyzed something.
01:05:08.740 So you should listen to that.
01:05:10.460 But if you see somebody telling you
01:05:12.720 the moon is made of gas,
01:05:15.100 maybe don't listen to that one.
01:05:17.720 But you should know who are the ones
01:05:19.680 who actually know what they're talking about.
01:05:21.600 And if you're trained as an engineer
01:05:23.420 or a venture capitalist,
01:05:25.120 you probably know how to look at the costs
01:05:27.740 and the benefits.
01:05:28.700 That's what they do.
01:05:30.600 They're trained to.
01:05:31.600 Same with the economics people.
01:05:34.680 So I always say that people think
01:05:36.860 they know how to analyze situations.
01:05:40.720 But they don't.
01:05:42.340 I was going to name a name,
01:05:43.760 but I'm not going to do it.
01:05:44.780 There's somebody who's well known
01:05:47.120 on the conservative libertarian side of things
01:05:50.080 who is very strident in analyzing
01:05:54.580 the situation in public
01:05:56.200 and doesn't have any skill at it.
01:06:00.020 And I keep wanting to not be a jerk
01:06:02.740 and point it out.
01:06:05.940 So I'm not going to name a name.
01:06:07.740 But there's somebody in the public domain
01:06:10.260 who is so unqualified
01:06:11.860 to look at the big situations.
01:06:15.600 They just don't have the skill.
01:06:18.480 And, well, you know,
01:06:20.600 no, it's not Tucker.
01:06:21.480 Somebody said, is it Tucker?
01:06:22.800 You know what Tucker does?
01:06:24.920 Tucker has humility.
01:06:26.660 And that goes a long way.
01:06:28.880 See if you agree.
01:06:29.960 When he's right,
01:06:30.960 of course, he can be cocky
01:06:33.000 when he knows he's right.
01:06:35.120 But Tucker will be the first person
01:06:37.200 to tell you why he doesn't understand.
01:06:41.140 That goes a long way for credibility.
01:06:43.720 Am I right?
01:06:45.160 When there's something he doesn't understand,
01:06:47.180 like, you know,
01:06:47.900 the deep economics of something,
01:06:50.160 he'll just tell you.
01:06:51.480 I don't know why this isn't the thing.
01:06:54.080 This is all I know.
01:06:55.300 I'm just watching like you are.
01:06:57.600 So the thing about Tucker
01:06:58.900 is that he talks about the news
01:07:02.500 like he was watching it at home.
01:07:06.740 But better than you.
01:07:08.360 Right?
01:07:08.740 Because he's smarter.
01:07:09.680 So, and that's real valuable.
01:07:13.660 That's sort of what I try to do.
01:07:15.320 I try to watch it
01:07:16.280 like I'm watching at home with you,
01:07:18.360 but that I can add maybe some,
01:07:20.760 you know, analytical things
01:07:22.500 just because I'm trained to do it.
01:07:24.740 If you're not trained to do it,
01:07:26.100 you think you can,
01:07:27.900 but you can't.
01:07:29.720 So I'm not going to name names,
01:07:31.180 but you'll notice them.
01:07:32.740 Anyway, J.D. Vance
01:07:33.740 apparently is trained to do this.
01:07:36.360 Thomas Massey apparently is trained
01:07:39.380 to look at complicated pictures
01:07:41.980 and see both sides.
01:07:43.520 So pay attention to that.
01:07:44.780 Give them a little more attention.
01:07:46.640 If you see somebody who
01:07:48.220 has a background
01:07:50.120 that doesn't include those skills,
01:07:52.480 take that into consideration.
01:07:54.260 All right.
01:08:01.680 Ladies and gentlemen,
01:08:04.900 Scott, you're confusing J.D. Vance
01:08:06.560 with Blake Masters.
01:08:08.660 Well, I read today
01:08:09.700 that he worked for Peter Thiel.
01:08:11.880 Did I get that wrong?
01:08:14.980 Give me a fact check before I go
01:08:16.640 because that's a
01:08:17.520 really good question.
01:08:19.860 Did I get that wrong?
01:08:21.120 Because I think they both worked
01:08:22.360 for Peter Thiel.
01:08:24.260 Give me a fact check on that.
01:08:28.600 I don't want to get something
01:08:29.720 that important wrong.
01:08:31.160 So make sure.
01:08:34.880 Okay.
01:08:35.460 Yes.
01:08:36.000 All right.
01:08:36.620 So they both worked for Peter Thiel,
01:08:38.140 so I'm not confusing them.
01:08:42.840 Yeah.
01:08:43.280 They both worked for Thiel.
01:08:45.920 I think that's all we need.
01:08:47.400 All right.
01:08:48.200 Ladies and gentlemen,
01:08:49.000 I'm going to say some more words
01:08:50.860 to the locals people,
01:08:52.460 so they're going to stay on here.
01:08:53.560 But the rest of you
01:08:54.420 will be seeing me tomorrow.
01:08:59.620 All right.
01:09:01.000 See you tomorrow, everybody else.
01:09:03.000 Locals, stick with me.
01:09:05.320 See you tomorrow.
01:09:05.920 Thank you.
01:09:08.800 Bye.