Real Coffee with Scott Adams - January 30, 2025


Episode 2735 CWSA 01/30/25


Episode Stats

Length: 1 hour and 23 minutes
Words per Minute: 150.29
Word Count: 12,615
Sentence Count: 909
Misogynist Sentences: 3
Hate Speech Sentences: 15


Summary

The collision of an American Airlines jet and a Black Hawk helicopter over the Potomac leaves at least 60 dead, with more still missing. Scott Adams turns to an expert in Black Hawk helicopters to try to figure out what happened.


Transcript

00:00:00.600 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:04.840 It's called Coffee with Scott Adams, and you've never had a better time.
00:00:09.040 But if you want to take this experience up to levels that nobody can even understand
00:00:14.300 with their tiny, shiny human brains, all you need for that is a cup or a mug
00:00:18.260 or a glass, a tank or chalice or stein, a canteen, a jug, a flask, a vessel of any kind.
00:00:21.840 Fill it with your favorite liquid. I like coffee.
00:00:24.460 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:27.800 the thing that makes everything better, it's called.
00:00:30.460 That's right, the simultaneous sip, and it happens now. Go.
00:00:40.760 I feel smarter with every sip, and so do you. So do you.
00:00:46.620 Well, as you know, the big news, tragically, is a helicopter collided with an American Airlines
00:00:53.900 regional flight. It was a Black Hawk helicopter, and they fell in the Potomac, I guess.
00:01:01.220 The debris from both went down, and everybody died. It was a mid-air collision, so there was not much chance,
00:01:06.860 and they fell in the icy water.
00:01:08.140 So it looks like at least 60 people, which would be all the people on the jet,
00:01:14.080 and maybe a crew of three, I think. Not everybody has been located, but there's not much chance
00:01:21.000 of survival. Zero, basically.
00:01:23.160 So we're still in the fog of war, period, but people are speculating, how could this happen?
00:01:33.900 You know, it's such a normal thing for there to be traffic in that part of the world,
00:01:38.700 and all of that advanced technology should have seen each other.
00:01:44.360 But that's what people who are not pilots say. Would you like to hear what a pilot says?
00:01:50.860 Which is very different from what you and I are saying, because here's you and I trying to figure
00:01:55.980 out this situation. Huh. If I looked out the window of my helicopter, would I be able to see a
00:02:03.040 gigantic airplane coming my way? I think I would. So it doesn't make sense I didn't see it.
00:02:09.300 And if I were in a giant airline, would I be able to see a helicopter coming toward me?
00:02:15.880 Well, of course I would, because, you know, I don't know anything about airplanes,
00:02:19.620 but I can look out a window and I can see a thing. But here's what an actual,
00:02:25.340 an expert in Black Hawk helicopters tells us. Here's somebody who follows me on X,
00:02:31.420 so I was alerted to this one. Mark McEthrin says,
00:02:37.060 I was a Black Hawk helicopter crew chief in the army. Okay. That's exactly who I want to hear from.
00:02:44.020 And not only that, but he was a flight instructor. Okay. Now we're talking to the right person.
00:02:49.960 I want to know, somebody who's an expert in these helicopters, how hard is it to spot other traffic?
00:02:58.380 And the bottom line is, it's super hard, even for the experts. So you could actually have this
00:03:05.720 accident happen without much going wrong. I hate to say it, but it might not be that anybody did
00:03:14.360 anything wrong in quotes. It could have been, this is just a really hard thing to do.
00:03:20.860 So he talks about the massive responsibility of the people who are the crew for the helicopter,
00:03:27.320 Black Hawk specifically. And part of their job is to look, be the extra eyes for the pilot.
00:03:33.040 So they're the ones who are looking for the extra things that the pilot might not spot.
00:03:38.460 And he said, I can tell you after doing this for hundreds of hours,
00:03:42.500 even when you know exactly where a Black Hawk is and you have night vision goggles on,
00:03:49.800 it is extremely, in all capital letters, hard to see the aircraft.
00:03:54.240 So from the perspective of the commercial flight, seeing it probably wasn't even an option. I mean,
00:04:03.340 it's just really, really, really, really hard to see. So my current view on this is that it's still
00:04:11.080 fog of war. You know, we don't know exactly what went wrong. But the most likely explanation is that it was just a
00:04:20.400 really, really, really hard situation for even experienced pilots, even with all the electronics
00:04:26.580 in the world. And it probably was just a very unfortunate, perfect storm of something being
00:04:33.080 in exactly the wrong place at the wrong time. That's my guess. My guess is that, you know,
00:04:38.980 there will be no specific blame. I feel like it was just hard. And some people say,
00:04:45.840 we're lucky we haven't had more of these, because the odds of something like this happening are
00:04:50.800 pretty good, just in general. So the fact that one happened doesn't mean anything new is added to
00:04:59.300 the story, necessarily. But as you know, Aaron Rupar, sometimes considered the worst person in the
00:05:06.900 media, who's literally famous for fake news, so famous that his name itself is used as one of the
00:05:15.820 synonyms for fake news. So a Rupar edit is something that used to be true until it got edited to look
00:05:24.240 like the opposite. So he, as soon as the accident happened, he posts a news story about how Trump
00:05:34.880 had gutted a key aviation safety committee and fired the head of the TSA. Now, that only happened
00:05:42.600 this week. I don't think that the firing of the head of the TSA affected the capability
00:05:51.020 of the Black Hawk pilot or the commercial airline pilot. I think it's pretty safe for all of us to say,
00:05:59.200 whatever went wrong, it wasn't because of Trump. He's been there a week, and the only change he made
00:06:05.800 couldn't have possibly had an operational impact. Or at least you better connect the dots a lot
00:06:12.240 better if you're going to make that theory. So what's funny about it is that there was a time
00:06:17.900 when people would have just argued about the truth as a statement, or, hey, is that true
00:06:23.620 that something Trump did could be part of the story, or is that not true?
00:06:28.500 But this time, and I'm very happy to report, almost all of the heat that Rupar got was for
00:06:36.420 being that Rupar guy. So, oh, I'm happy about that. Because the reason I came up with the term
00:06:46.220 Rupar, and used his last name to be synonymous with a certain kind of fake news, is that when he does
00:06:54.040 something on a topic like this, your first thought should be, oh, it's Rupar, Rupar doing a Rupar.
00:07:02.160 Because that's the healthiest thing you could think. The moment you think, ooh, here's somebody
00:07:07.180 making a point I disagree with, it doesn't really look like he's trying to make points.
00:07:13.420 It looks like he's being Rupar. You know, whatever his motives are, we don't know. But it doesn't look
00:07:20.420 accidental that he's wrong all the time, does it? It doesn't look accidental. Can't read his mind.
00:07:28.340 But he doesn't make it look like an accident. So use your own judgment. But watching people just
00:07:34.040 destroy him reputationally, instead of dealing with the ridiculousness of the point of view,
00:07:40.140 was fun. Meta agreed to, and let me just circle back for a moment. I don't want to have fun with
00:07:52.100 the tragedy. So I'll have fun with the Rupar. But 60 people, this is like one of the worst air
00:08:00.260 accidents in a long time. We'll argue about, you know, how long it's been. But this is seriously
00:08:07.140 bad. I mean, even Trump, his optimism kind of left him last night. He even posted on Truth,
00:08:14.600 you know, it's a terrible night for the United States. So it's just a terrible night. And,
00:08:20.160 you know, complete sympathy for the families and the victims. And incredible respect for the recovery
00:08:29.120 crew. Imagine working at night in freezing water. You know, obviously they're equipped, but in freezing
00:08:38.580 water. And the awfulness of what they have to do, just, that's the job they signed up for. Imagine
00:08:47.280 signing up to be in that line of work. To, if a bunch of bodies end up in a freezing river,
00:08:55.480 you're the one to go get them. I mean, the fact that we have humans who will take that job
00:09:03.200 and then do it well, my God. So full respect. All right, another story. Meta has agreed to pay
00:09:14.840 Trump $25 million to settle the lawsuit about Trump getting kicked off of Facebook back when he got
00:09:23.900 kicked off. The, I love this settlement. You know, I don't always love settlements or even court
00:09:32.760 decisions, but there's something, there's something really healthy about this one. First of all, the
00:09:38.160 25 million isn't going to hurt Facebook's business. That's good. That's good. I don't want them to go
00:09:43.760 out of business. It's big enough. So we all get the point, right? 25 million. All right. You have my
00:09:51.800 attention. We get the point that the censorship of Trump was just flat out wrong. I'm pretty sure
00:09:59.240 Zuckerberg says that directly at this point. He does, right? He says that directly. So once
00:10:04.860 Zuckerberg and Meta have said we acted wrong, a settlement makes sense. And from Trump's point
00:10:14.300 of view, once they've admitted that they were wrong, and it looks genuine, by the way, I think
00:10:18.360 Zuckerberg means everything he says about this. Um, you don't need to get a billion dollars,
00:10:24.040 right? Yeah. You just, you don't need that. You just need a nice, clean, solid apology slash
00:10:34.720 settlement. It's just the right number. And even better, um, 20, I guess, 22 million of the 25
00:10:42.180 will go to the Trump library instead of, you know, Trump's pocket. You don't want that. And, um,
00:10:50.840 the rest goes to pay the lawyers. Nicely done. That, that would be, you know, I take this back to
00:10:59.280 how apologies work, the difference between men and women. I've mentioned this before. Uh, when men
00:11:07.300 apologize, I apologize to men, you can, you can sort of see this settlement as like a Zuckerberg
00:11:13.500 apologizing to Trump. Men accept apologies. As long as it looks like you mean it, you know,
00:11:22.200 we're not, we're not going to delve into your personal thoughts, but if you say it like you mean
00:11:27.260 it and you act like you mean it and you change your behavior, we're like, great, done. You're,
00:11:34.560 you're even more awesome than I thought. So apologies really work for men, if they're real.
00:11:40.160 For women it might be a different situation. I'm no expert on women. So I can't say one way or the
00:11:47.220 other, but it never seems to work. If you've ever tried, it doesn't seem to work the same way,
00:11:53.980 but I can tell you from my own personal experience, you can change me from your raging anger to,
00:12:01.460 oh, we're good now, with just an apology, you know, depending on the situation. So,
00:12:07.400 so this looks like a man to man apology, uh, Zuckerberg to Trump. And it looks like the two
00:12:12.840 of them handled it like men. Nicely done. Um, some of you are going to say, oh, he's just covering
00:12:19.840 his ass. Yeah, that's, that's what we do. Right. Um, and there's some element to that. So there is
00:12:26.420 never just one thing, you know, can't Zuckerberg do what makes sense and what he thinks is right,
00:12:31.460 but it also is good for business. It's also good for covering his ass. Nothing wrong with that.
00:12:37.880 Give him a threefer. Meanwhile, Elon Musk says that, uh, now that the full self-driving capabilities of
00:12:45.740 the Teslas have really reached breakout level, uh, breakout, meaning there's no question that the
00:12:53.200 self-driving is safer than the human driver. Just no question. I believe, I believe that's over.
00:13:00.800 Uh, I think the argument about who's safer, you could just put it completely to rest.
00:13:07.820 So we'll see. I mean, we need data to prove it, but, but I think he's got the data on that.
00:13:13.580 Anyway, so Musk says that, uh, Tesla's going to launch, uh, full, uh, full self-driving
00:13:19.620 unsupervised, meaning you don't have to touch the steering wheel or look at the road. Um,
00:13:26.480 and this service will be a paid ride share service in Austin in June. Now that's a good
00:13:33.200 way to, you know, take a halfway movement into, um, the full cyber cab world. I think there's
00:13:41.340 going to be a lot of, a lot of work to get that up and running, but I'm pretty sure Tesla
00:13:45.320 can make that work. So in June, um, that would be a great time to go to Austin just to see how it
00:13:52.860 works. You know, I probably won't do that, but I can see how people would. That'd be, that'd be an
00:14:00.200 awesome American vacation just to go see the future, see how it feels. That'd be great.
00:14:06.240 At the same time, I saw, about the Optimus, also a Tesla product, the Optimus robot, a little more
00:14:13.500 information than we had about it. And, uh, it's 5'8", and it weighs 125 pounds. It can lift 150 pounds
00:14:26.000 and it can walk at a speed of 1.34 miles per hour. And it looks like the cost, maybe not the first
00:14:33.960 models, but, but very soon when they get to production of a, you know, a million a year,
00:14:39.000 I think they expect the cost per unit to be 20 to $30,000. So, and, uh, it looks like it'd be
00:14:46.560 launched this year. So 2025, if, uh, if Tesla hits its target, this will be the year that you've got a
00:14:57.040 robot. Now, I don't know what the first one is going to cost. Um, you know, I, I hate to be the
00:15:04.500 one who spends, you know, way too much on the first robot. And then six months later, it's $25,000,
00:15:10.140 but I really want one. So I probably would overpay. Uh, I wouldn't overpay stupid, crazy,
00:15:20.280 you know, idiot money, but I might overpay a little bit. So I'd love to see a price. I'll tell
00:15:27.440 you if I'll pay it, uh, that might help them with the research. But, uh, why did they have to make the
00:15:33.480 thing exactly my height? They almost made it my height and my weight to make me feel like I live
00:15:42.500 in this simulation. Cause one of the, one of the things I worry about with a robot, uh, is the same
00:15:48.320 thing I worry about with a dog. I made sure that when I, that when we selected Snickers, I wanted to
00:15:56.120 make sure that if things got out of control, I could still win in a fair fight against the dog.
00:16:01.240 You know, if you get, you get some giant, you know, some giant dog or pit bull, that's one of
00:16:07.500 the big muscular ones. You say to yourself, well, I don't know, I think I could win a fight, but
00:16:12.980 I'm not sure I could. And it's dangerous to have something living in your house that's both
00:16:19.200 unpredictable and could beat you in a fair fight. So I just try to avoid that. Right. Um, so the robot,
00:16:26.080 the robot looks like it's going to be stronger than me because, um, could I lift 150 pounds? Well,
00:16:33.200 yeah, I can. I mean, that's literally my exact weight. So, so it can lift my exact weight and
00:16:39.800 it's my size. So it can carry me around. I'm going to make my robot carry me around. See, it won't work
00:16:48.000 for you because you're 151 pounds or more, but I'm exactly 150. I'll be like, Carl, can you carry me to
00:16:56.700 bed? Oh, again, master? Yes. Again. You know, you could walk, master. I know. And why do you make
00:17:08.200 me call you master? I just like it. So robots are coming. Senator Bob Menendez, you know him as
00:17:15.840 gold bar Bob. He got sentenced to 11 years in prison. Does that sound right? Does it sound right
00:17:22.980 if a, uh, high-level federal elected official is selling access for money, gold bars? Is 11
00:17:32.900 years about right? Yeah. Yeah. It actually, yeah. It feels about right. Um, I don't always say that.
00:17:42.760 It seems like the, the sentences are always too light or too long, but yeah, I think they got one.
00:17:50.220 I think they got that one right. So good work, uh, the court system. Good work, the prosecutors.
00:17:59.060 I think he got that all completely right. Well, there's going to be a, uh, bunch of Democrats who
00:18:07.460 have formed a group to object to Trump's idea to cut taxes. That sounds like a punchline, doesn't it?
00:18:16.960 This is real. There is an organized group of Democrats who are going to oppose Trump's call
00:18:24.420 for lower taxes. Now lower just means he wants to extend the current situation. That's what they
00:18:30.820 call lower, just doing what we're already doing. But, um, the funny part is that they named their
00:18:39.120 Democrat group Families Over Billionaires. Their idea being that the billionaires are going to get
00:18:44.780 the tax breaks. Families Over Billionaires. And the Families Over Billionaires, who don't want the
00:18:51.860 billionaires to have so much control and get the benefits, uh, are going to have an alleged
00:18:56.540 eight-figure funding. So $10 to $99 million. Who could afford to give a group like this
00:19:05.820 over $10 million? Let's see a millionaire, a millionaire. Oh no. 10 million is more than a
00:19:12.440 million. So you couldn't, you couldn't really donate that if you're like just a basic millionaire
00:19:17.800 who, uh, maybe a billionaire. Are there any billionaires that donate to Republican or to
00:19:24.620 Democrats? Yes, there are. It's called Alexander Soros. It's called Reid Hoffman. Yes. So they think
00:19:34.040 that their billionaires are the good ones and, uh, they would claim that the Republican billionaires
00:19:38.960 are the bad ones. So they're going to have their billionaires, uh, fund a fake organization that's
00:19:46.160 going to pretend it's in favor of higher taxes. Their whole thing is to pretend they're, they're
00:19:53.180 in favor of higher taxes. Okay. All right. I'm glad that you woke up today and went to, went to fight,
00:20:01.400 went to fight for the things that you believe in higher taxes. Um,
00:20:06.820 so that's not ideal. Ontario, the wait is over. The gold standard of online casinos has
00:20:18.700 arrived. Golden Nugget Online Casino is live, bringing Vegas-style excitement and a world-class
00:20:24.260 gaming experience right to your fingertips. Whether you're a seasoned player or just starting,
00:20:29.260 signing up is fast and simple. And in just a few clicks, you can have access to our exclusive library
00:20:34.860 of the best slots and top-tier table games. Make the most of your downtime with unbeatable
00:20:40.140 promotions and jackpots that can turn any mundane moment into a golden opportunity at Golden Nugget
00:20:46.000 Online Casino. Take a spin on the slots, challenge yourself at the tables, or join a live dealer game
00:20:51.440 to feel the thrill of real-time action, all from the comfort of your own devices. Why settle for less
00:20:56.960 when you can go for the gold at Golden Nugget Online Casino? Gambling problem? Call ConnexOntario
00:21:03.260 1-866-531-2600. 19 and over, physically present in Ontario. Eligibility restrictions apply. See
00:21:10.520 GoldenNuggetCasino.com for details. Please play responsibly. Uh, meanwhile, if you saw the RFK
00:21:17.480 junior confirmation hearings, you probably enjoyed it. Uh, my take on it was I thought RFK junior answers
00:21:25.860 tough questions better than I maybe have ever seen anybody answer tough questions because he had the
00:21:34.700 data, you know, right in the back of his or the front of his head. He didn't even have to go to the
00:21:39.960 back of his head. He knew exactly what he was talking about. And every one of the gotchas, he had an
00:21:46.700 explanation that when you were done, you'd say, Oh, Oh, well, that actually is not what I thought it was.
00:21:52.020 And it really tells you how fake the fake news was because that's all they had. It turns out that
00:21:59.920 the only thing they had was fake news. So when he sits there and calmly explains, you know, why it's
00:22:07.240 fake and what the real context is, it's really powerful. So I don't think they laid a glove on him.
00:22:13.380 He's got maybe another day or so of, uh, testifying to a slightly different group, I think,
00:22:18.760 but it's some of the best, best answers I've seen. Um, but the, the funniest thing that came out of it
00:22:27.200 was Bernie Sanders and his onesies. If you haven't seen the clip yet, Bernie Sanders thinks he has this
00:22:35.320 real gotcha because somebody developed a onesie, which would be a clothing item for a baby, I guess,
00:22:42.540 uh, and the baby clothing had some anti-vax message on it. So Bernie puts that
00:22:49.420 up, uh, puts up a picture of the onesies with the anti-vax message. He says, are you supportive of
00:22:55.460 these onesies? And of course, uh, RFK Jr. had nothing to do with the onesies, not directly, not indirectly.
00:23:04.540 So he decides not to answer that dumb question. And he just says, I'm supportive of vaccines.
00:23:10.900 Now, first of all, that's a perfect answer. Anything he said other than this sentence,
00:23:16.560 I'm supportive of vaccines would have been a mistake of all the billions and billions of
00:23:21.120 things you could have said. There was only one perfect thing. And he said it now. I really noticed
00:23:28.240 that. If there's only one perfect thing and everything else is a mistake, if you can find
00:23:34.000 the one perfect thing and you lead with it, I'm supportive of vaccines. There's no hedging on that.
00:23:42.260 Now it doesn't mean every vaccine doesn't mean he wants to do it without testing, but he's generally
00:23:47.240 supportive. So that should have been the end of the questioning, right? Once he says I'm in favor of
00:23:54.140 vaccines, which is the opposite of the message on that onesie, well, we're done here, right?
00:24:00.660 But Bernie apparently had gone through the trouble to make this,
00:24:03.460 to make this visual and he wasn't going to quit on it.
00:24:09.220 So he starts doing this ridiculous, are you supportive of the onesie? Uh, well, I support
00:24:15.980 vaccines, but what about the onesie? What about the onesie? Tell us about the onesie.
00:24:20.820 And you could just see people behind RFK Jr. Like Megyn Kelly, just laughing. And then RFK Jr.,
00:24:30.320 he can't stifle his own laugh. So the video that got the most play was RFK Jr. literally laughing
00:24:40.160 at Bernie being just a total idiot on this point. I mean, I respect Bernie in a lot of ways.
00:24:47.220 He's, you know, partly his stick-to-itiveness and, you know, he seems pretty committed to his
00:24:54.520 principles, whether you like him or not. But sorry, Bernie, this was the most absurd, ridiculous,
00:25:03.000 uh, anti-science, anti-useful, complete waste of time. But you made a clown of yourself and it was
00:25:10.760 entertaining. So we like that. We like the entertaining part. Anyway, um, here's what I think.
00:25:21.820 I think that the thing about RFK Jr. that is really unique is that it's not political.
00:25:29.680 It's a political process, but, you know, he's a lifelong Democrat and he is selected for one of the
00:25:37.220 top jobs by the top Republican. That's as non-political as you can get when the top Republican
00:25:45.520 picks one of the most famous Democrats. And by the way, RFK Jr. has never said,
00:25:51.520 oh, now I'm a Republican. That never happened. He's the same guy he always was. It's just that
00:25:58.000 he wants this one mission and it's important enough that he'll do it in whatever way can get it done.
00:26:03.980 So it's the least political thing you'll ever see in your life, which makes me like it the most.
00:26:10.860 It's also one of the most important things. I can't really come up with something besides the
00:26:16.120 debt. The debt is existential, but beyond the debt, this is really my number one and probably should be
00:26:23.280 most people's number one. Now I'll tell you why I'm more of a, more on the war path for this than
00:26:30.520 other people. I've actually done the experiment where I cut off all processed foods for months.
00:26:41.280 You won't believe how well you feel. If you do the experiment of just getting rid of, you know,
00:26:48.700 it's expensive because processed food is cheaper and more convenient and everything else. But if you can do
00:26:54.280 it and you just do your basic proteins and your basic organic, if you can do it, vegetables and
00:27:01.400 fruits and stuff, you're going to find out that a lot of what you thought were your medical problems
00:27:06.940 were food related. I thought I had terrible allergies all year long, all the time, no matter what.
00:27:14.740 I don't. I had a reaction to poison food and I never had any unpoisoned food, I guess, as an adult.
00:27:24.860 So I didn't really notice. I didn't think it was food because no matter what I ate, I had the same
00:27:30.820 reaction. But I had to actually cut down the number of things I eat to just this tiny sliver of things
00:27:37.400 I allow myself and then all my symptoms go away. Now, if you haven't experienced that, you don't quite
00:27:44.220 understand what's at stake. What's at stake is chronic illness for all your children forever.
00:27:53.080 They'll die. They'll suffer. Their lives will be terrible. They'll barely be able to pay attention
00:27:58.940 in school. And I don't even know if they can mate. It's the end of the fucking world if we don't get
00:28:06.500 him in there. Maybe, you know, you can't guarantee that, of course. But this is life and death for the
00:28:12.920 children. This is bigger than abortion. You know, I guess you could argue that. But one of the things
00:28:21.020 I always appreciated about the most conservative Republicans, without necessarily agreeing with
00:28:27.860 their opinion, I respect the fact that the Republicans said, we're going to move this decision
00:28:35.420 out of the federal government, put it in the courts, and we're going to get killed in the elections.
00:28:39.880 Now, that has my respect. If you think that's important enough, and again, this is their opinion,
00:28:47.720 not mine. I'm just respecting that the thing they decided to, you know, die on that sword,
00:28:53.580 they knew they were going to die on the sword. They knew they'd get killed in the midterms. They'd get,
00:28:58.600 you know, maybe lose 2020 or whatever. So the Republicans who said, we believe this so hard,
00:29:04.580 we're going to die on the sword. I really respect that. Even if you don't agree with them on abortion,
00:29:12.240 that that is a respected approach to life. Now, when I see RFK Jr. taking this kind of personal,
00:29:20.340 professional risk to get this done, to save the children, save the families, save America,
00:29:27.080 my God, do I have respect for that. You know what I don't respect? The people trying to stop him.
00:29:35.200 Now, I can kind of understand maybe Democrats, you know, just doing the political thing,
00:29:41.260 blah, blah, blah, but there are enough Republicans, so it shouldn't matter.
00:29:45.920 Unless a handful of Republicans turn on him, because they're getting funded by the pharma or big food
00:29:52.680 lobbies, which is probably the only reason I can think of it would happen. And let me just say,
00:29:58.580 this isn't politics as normal. This is not politics as normal. If Republicans kill the RFK Jr. thing,
00:30:06.680 we're not going to forget. Because this would be like driving up to your family and punching every
00:30:12.780 one of your family members in front of you, and then driving away and saying, yeah, well, that was
00:30:17.280 yesterday. That was yesterday I drove up and punched in the face every one of your kids.
00:30:23.540 Well, are you going to forget it? No, I'm not going to forget it. If you punch my children in the face
00:30:29.220 while I'm standing there and walk away, 40 years from now, you're still going to pay if I have a chance.
00:30:35.560 Right? We're coming for you. I don't mean physically, obviously. No violence, please. No violence.
00:30:42.020 But in terms of career, in terms of reputation, in terms of money, in terms of politics, yeah,
00:30:47.660 it's, this is to the end. So you can retire, and we're still coming after you. Reputationally.
00:30:56.940 Reputationally, not physically. So I'm with Nicole Shanahan as hard as you can be with anything. So
00:31:06.060 Nicole says that if you vote against this, especially if you're Republican, she doesn't say
00:31:11.240 that part, but I do. Especially if you're Republican, it's so clearly obvious that you're
00:31:17.420 being bought out and that you've traded the lives of our fucking kids for whatever you're getting
00:31:23.440 out of somebody. You're going to fucking pay for it. This isn't like the other stuff. The other stuff
00:31:29.840 is just politics. We get it. We don't win every time. We can't win everything. Sometimes we win.
00:31:35.360 Sometimes we lose. We get it. We get over it. We're not going to fucking get over this. We're
00:31:40.280 not going to fucking get over it. This is to the end. Again, not physical. We're not talking about
00:31:46.700 any violence. But you think you're going to stay in politics if you vote against RFK Jr. and you
00:31:52.040 kick him out? No, we're not getting over it. We're not getting over it. And you're not going to get
00:31:58.500 over it either. We're going to make sure of that reputationally. That's a promise. Edward Snowden
00:32:06.800 said about Tulsi Gabbard, who's also going to be in the confirmation process. So I guess Tulsi Gabbard
00:32:15.540 has been in favor of Edward Snowden being pardoned, if I have the background right. And then Snowden
00:32:22.480 not wanting her to fail in the confirmations because of him, he said in a post today that
00:32:32.280 Tulsi Gabbard will be required to disown all prior support for whistleblowers, meaning himself
00:32:39.280 and others, as a condition of confirmation today. I encourage her to do so. In other words,
00:32:44.920 to disavow even Snowden. Tell them I harmed national security and hurt the sweet, soft feelings
00:32:51.120 of staff. In D.C., that's what passes as the Pledge of Allegiance.
00:32:59.220 So I'm not sure that Snowden is helping. Yeah. But it might. I mean, I don't know that it
00:33:07.920 hurt. But she might actually. No, I don't think she will. I don't think she's going to disavow
00:33:15.920 him. It doesn't feel like something she would do.
00:33:22.220 Bank more encores when you switch to a Scotiabank banking package. Learn more at
00:33:27.680 scotiabank.com slash banking packages. Conditions apply. Scotiabank. You're richer than you think.
00:33:34.040 Yeah, I'm seeing the comments. There's a story going around that some are saying that Lyme disease
00:33:43.280 wasn't naturally occurring. It was also a lab leak. Now, I would consider that so far a rumor.
00:33:52.960 But you know how these rumors turn into a real thing if you wait long enough? It sounds like a
00:33:58.700 conspiracy theory. But I'm not going to be the one who says, well, six months from now, remember when
00:34:05.920 I said that was a conspiracy theory and then some information came out? So I'll say that I don't know
00:34:12.120 enough about that story to say it's true or false. It's just out there. If you had told me that Lyme
00:34:19.280 disease was made in a bioweapon lab, if you told me that 10 years ago, I would have just said, come on.
00:34:26.920 Come on. That doesn't really happen in the real world, does it? Somebody makes a bioweapon. Suddenly
00:34:34.520 it gets out and, you know, millions of people are infected. That's not a real thing. But it's a real
00:34:41.800 thing. So whether it happened with Lyme disease or not, don't know. No, no. I guess we'll find out
00:34:48.420 more. But I don't rule it out. It's like the years of ruling out things just because they sound like
00:34:56.780 they're ridiculous. Can't do it anymore. We're in the ridiculous world now. Anyway.
00:35:04.200 So we'll see how Tulsi Gabbard does. And I guess Kash Patel's hearing is today as well. Is that right?
00:35:14.620 Is Mike Benz saying the Lyme disease? Does he confirm it or just give us the background so we
00:35:20.580 can make up our own opinion? I'll check that out. I'll see. I'll see what Benz is saying.
00:35:25.240 Remember, Benz uses, or at least when he documents his opinions, he uses public information.
00:35:33.980 So if Benz has an opinion on this that leans more toward the lab leak theory, it's going to be based
00:35:42.500 on stuff you can check yourself. You know, he shows the receipts. So that would be interesting. I'll
00:35:48.300 check that out for you. I was watching the Daily Show, whose initials are TDS, which is important,
00:35:57.080 the Daily Show. They even put the initials on the background for part of the opening segment.
00:36:02.880 It actually says TDS all over the background. You would think they would have fixed that by now.
00:36:10.160 But Jon Stewart, he came on and he said this about how the Democrats are responding to Trump so far.
00:36:18.300 He said, quote, things are going to get fascisty, fascisty, you know, more fascist. And he
00:36:26.120 cautioned Democrats that they don't want to ruin all their credibility by complaining about things
00:36:32.260 that are, you know, not factually correct or not important. So Jon Stewart is warning people
00:36:42.600 that they're not being rational in their complaints about Trump. He's doing that on a show whose letters
00:36:51.840 are TDS. And here's the best part. Let me pull it all together now. You probably have all seen maybe
00:36:59.340 more than once a famous clip where, long before the COVID lab leak in Wuhan was established as the
00:37:07.240 most likely source of it, Jon Stewart broke ranks with the popular opinion. And on the Colbert show,
00:37:15.060 he mocked the fact that there was any doubt that the Wuhan, what was it? The Wuhan Institute of
00:37:25.040 Virology, which happened to be across the street from the wet market. He just mocked people for
00:37:31.960 thinking it was anything but the most obvious thing, which was the very lab that was working on
00:37:36.820 the very thing. Now, you remember how he did it, right? He just mocked the fact that the name of the
00:37:45.980 lab was the answer to the mystery. You didn't have to go any deeper than the name of the lab.
00:37:52.940 The name of the lab. We're done here, people. Look at the name of the lab. Now, that's hilarious
00:38:06.660 because we're all so complicated in our thinking that you really didn't need to go past the
00:38:06.660 name of the lab. The reason it's funny is that it's 100% correct. You don't need to think deeper.
00:38:14.300 It's a sign on the door. Yeah, that tells you everything you need to know right on the sign.
00:38:20.520 But the same guy who brought us that piece of brilliance and bravery, which was pure bravery,
00:38:28.200 by the way, because he was going against pretty big forces when he said that, Jon Stewart
00:38:33.280 sits on a set that literally says TDS and warns people that things are going to get fascisty.
00:38:42.960 Now, I got to pay back Jon Stewart a little bit here. And by the way, I love his whole thing,
00:38:50.160 even when I disagree with him. He's very good at what he does. I got to go full Jon Stewart on you,
00:38:56.060 Jon Stewart. I got to go full Jon Stewart. Jon, the letters on your set say TDS. Not just once,
00:39:08.920 probably a hundred times. TDS. TDS. Do you think that might be the better explanation of why you
00:39:19.340 think things might get fascisty? Do you think those are unrelated? That you're sitting in front of a TDS
00:39:28.960 background saying that everything was fine, but I think it could turn fascisty. Maybe he'll steal
00:39:39.220 your democracy. How long does he have to go without doing any of those things
00:39:45.180 before you realize it's just the Wuhan Institute of Virology? It's right on the sign. Just read the
00:39:54.660 sign. Now, I know that's an analogy, so it's not a perfect argument. It's just kind of so simulation
00:40:01.420 perfect. I can't stop looking. Meanwhile, over at the big situation about DeepSeek,
00:40:09.320 the Chinese open source AI that some say cost only 5% as much as the American AI and is just as good,
00:40:19.080 destroying our industry and all that. Well, I did a little research and I found out how you could
00:40:26.920 make an AI that's 5% of the cost of the United States AI. Are you ready? Number one, you lie about
00:40:34.940 how much it costs. That's important to the process. So you say, it only costs 5%. So that's very
00:40:46.800 important. If you told the truth, it would sound like we have way more NVIDIA GPUs than you think,
00:40:55.960 even though we're not supposed to have any. We've got a whole data center or two that's just stacked
00:41:01.120 with them. And it costs us millions and millions and millions and billions of dollars. So the first
00:41:07.780 thing is you just don't mention that. And then later, later, if somebody says, uh, you know, actually,
00:41:13.900 I think there was like a giant data center involved because otherwise you couldn't get to where you
00:41:18.320 are. Then it's too late, because you've already gotten it out there. The 5% is already in people's
00:41:24.340 minds. Oh, it only costs them 5 million. Wow. Cheap. So when later you find out, oh, they
00:41:30.960 certainly had a data center full of very expensive equipment to get there. You just kind of forget
00:41:36.560 that part. So that's the first thing. Second thing is instead of using training data that you've
00:41:43.520 scraped from the entire internet, the way the big U.S. companies do, you steal it from the people who
00:41:51.220 stole it. So if the big AI companies stole my IP and my copyrighted works, but tried to cover it up by,
00:42:00.200 you know, generalizing it. Um, first the U.S. companies steal it and pay nothing to people like
00:42:05.800 me. That's a separate, separate conversation. But then if you want to really save some money,
00:42:12.360 you steal it from the people who stole it. So if you steal from stealers, it looks like it's free.
00:42:21.240 It's not free to me, uh, being one of the original copyright holders who has tons of material,
00:42:27.520 which apparently AI is trained on. How do I know? Cause I can ask it and it knows a hell of a lot
00:42:33.220 about me. So yeah, it trained on me pretty hard. Um, so that's how you do it. You lie about how much
00:42:41.460 hardware you used and then you just steal what somebody else already stole and then it looks
00:42:47.660 cheap. And then you lie about how many people are working on it and all that stuff. So that works.
00:42:54.160 Um, however, I would like to, uh, add this thought.
00:43:31.040 Think about all the important people in time, you know, like Plato and Socrates and all them.
00:43:39.900 If people stop reading and start using AI, which apparently is happening and, um, AI reaches sort of a
00:43:48.800 training limit roughly now, meaning that there's not much else to train on in the real world. It's,
00:43:56.720 it's sucked up all the real world stuff. So it's got to use, you know, artificial data to extend.
00:44:03.840 What that means, correct me if I'm wrong, but if there were some modern voices in the world that were
00:44:12.560 unusually powerful, would they not forever be part of AI's personality?
00:44:20.880 And would they not be more important for being current and alive at the moment and creating a lot
00:44:27.420 of documents than let's say somebody who was really smart, but died a thousand years ago,
00:44:32.740 and we only have a few surviving texts, that sort of thing. So would it be true that the people who
00:44:41.060 are, let's say, best-selling authors or public figures who've got a lot of opinions, would it be
00:44:48.980 true that their impact on humanity is sort of locked in now? Meaning that AI is sort of
00:44:59.760 permanently affected by the people who are the most persuasive at the moment, because that's where all
00:45:05.920 the training happened. And then after the training happened, all they do is run some updates. You know,
00:45:11.440 I guess some new stuff happened, but you know, it's hardly going to change the whole thing. Is it true
00:45:18.560 that some people will be more locked in as the personality of AI than other people?
00:45:24.400 And am I one of those people? Let me give you an example. If you go to AI, and it depends which AI
00:45:32.960 you're looking at, and you ask this question, which I have asked, uh, if you say,
00:45:38.880 what is the impact of Scott Adams on politics? Now I asked that question, I forget which AI,
00:45:45.920 it might've been Perplexity. And I think Grok has a similar answer, but, um, I think it was Perplexity
00:45:53.120 that told me that my contribution to politics was that I changed the national conversation
00:46:00.720 from policy to persuasion. Now that's what AI says. Now, is that true? It certainly seems like it,
00:46:13.440 because if you look at the way, you know, any podcaster or anybody else is talking about anything,
00:46:18.480 we do talk about the policy, but we spend way more time as I have already today talking about the liars.
00:46:26.320 We talk about the fake news because that's persuasion. We talk about Rupar and these techniques.
00:46:32.720 We talk about the white house, uh, publishing a hoax list. That's up to four hoaxes, right?
00:46:39.920 Right. Right. That's mostly me. That's mostly me. And that appears to be now locked in to what AI
00:46:51.520 thinks about the world. So cancel me or not, it's too late. I'm baked into the AI and probably you'll never
00:47:02.640 get it out of there. Probably all of my books, the main themes of all of my main books,
00:47:10.080 such as systems being more important than goals. Um, the idea of a talent stack, AI knows that stuff.
00:47:20.000 It knows it and it'll repeat it back to you. Now, is that a, um, is that because it
00:47:27.040 stole my copyrighted work? Not necessarily because so many people have talked about my work
00:47:33.920 that if they just trained on the people talking about it, they'd probably get almost everything
00:47:39.040 they needed. So those are several contributions that seem like they might be
00:47:47.280 permanent in the AI brain. And then there's Dilbert itself, you know, 36 years of Dilbert comics,
00:47:55.280 which certainly changed America. If you were watching the business book market at that time,
00:48:02.080 business books that promised to tell you how to do everything great. If you just,
00:48:06.720 you know, did what the business book author said, they became like the, just the biggest thing.
00:48:11.680 And then Dilbert came along and mocked all of that bullshit for being completely worthless crap.
00:48:17.760 And the business market for business books collapsed and never recovered. Now every once
00:48:24.000 in a while, there'll be a big book, but the whole idea that any consultant can write a book
00:48:28.320 and it's a bestseller that for a while, it just seemed like anybody with a business name was
00:48:33.760 writing a bestseller about how to be successful. That kind of all went away as bullshit.
00:48:38.960 And it was replaced by what I call a Dilbert point of view, which is sort of cynical and,
00:48:45.280 you know, it's about your bosses looking out for themselves, et cetera.
00:48:49.440 And I'm pretty sure that AI has absorbed all of that, all of it. So even let's say a more minor
00:48:58.640 example. We'll see in the comments how many of you have ever seen this. How many of you
00:49:04.000 have seen a one page document I wrote years ago on how to be a better writer? How many of you have
00:49:12.000 seen that? It's one of the most viral things for years and years and years.
00:49:15.680 Now that's going around everywhere. And I would assume that AI has absorbed it because it's been
00:49:24.320 in so many places and repeated and repeated and recopied and it's a meme and everything else.
00:49:30.480 So the reason it got around is that nobody disagreed with it. You know, the experts looked at it and
00:49:35.280 said, yeah, that looks about right. So have I, have I become a permanent part of what it takes to be a
00:49:43.760 good writer in the future? And would AI itself be influenced by anything I said? Does it recognize,
00:49:51.040 oh, here's the guide to being a good writer. I'm trying to be succinct. I'll just do that.
00:49:57.440 I don't know. No way to know that. But here's what I'm saying more generally. You know,
00:50:02.960 I'm using myself as an example because I know the most about my own work. But don't you think,
00:50:08.240 don't you think that the people who are the most effective voices at the moment on social media
00:50:18.320 are going to get permanently a higher status in AI until the end of time? Because when AI decided to do
00:50:27.840 its big learning, it seems like that's going to be the bias it will have forever. So good luck
00:50:38.320 canceling me now, suckers. I'm in the machine. I'm in the machine. All right.
00:50:46.000 All right. So the US Copyright Office, according to Just the News, did a ruling that artists can
00:50:58.080 copyright some work that is created with the help of AI. So what is not copyrighted is if the only thing
00:51:06.240 you did is put in a prompt, like, show me an image of Trump riding a bicycle. Can't copyright that. But
00:51:14.720 if you had Trump riding a bicycle and then painted your own image of something over it so
00:51:25.120 it's part AI and it's part you, you can copyright that. Copyright the whole thing. So there's going to be
00:51:31.200 a lot of gray area. Well, but I like that they've at least taken that step, that if the human artist has
00:51:40.480 substantially added to the AI, the AI is just a tool. And then the, you know, the ownership and
00:51:48.160 artistry of the creator still gets credit. So this is a step in the right direction. It's going to get
00:51:53.680 really, really gray and messy, but at least we sorted that out. I like that. According to also on
00:52:03.200 in Just the News, somebody named Drew Horn, Drew Horn, which sounds a lot like what I used to do when
00:52:13.120 I was doodling. I would just draw horns on people sometimes because I wondered what they would look
00:52:18.080 like with horns. So that's his name, Drew Horns. Something I've done so many times. Anyway,
00:52:27.600 he's the CEO of something called Green Met and he said that Greenland deciding to leave Denmark and
00:52:37.520 have some kind of association or joining America would be easier than you think.
00:52:44.880 Easier than you think. And indeed, if he's right, Greenland only has to vote on it. So apparently
00:52:52.640 Denmark has been quite open-minded about letting Greenland manage itself. So Denmark seems to care
00:53:00.320 about the national security, the big picture. They don't control the laws in Greenland. And the
00:53:08.240 current law, according to Drew Horn, is that if they wanted to, the people in Greenland could simply
00:53:15.840 have a vote and they can vote their own independence and Denmark would respect it because that's the
00:53:22.720 current system. The current system gives them the right to vote anything they want. And if that's what
00:53:27.200 they want to vote on, there's nothing stopping them. So in other words, when this whole conversation
00:53:34.640 started about how hard it would be for Trump to possibly pull this off, nobody could ever pull this
00:53:41.520 off. It'd be the hardest thing. It might take one hour of the people in Greenland doing a little vote
00:53:48.960 on paper and then counting it. That could be the whole thing. The entire process might be, hey, we're
00:53:56.480 going to make you a proposition in Greenland. You can stick with Denmark or you can make more money
00:54:04.320 going with us and you'll probably be safer too. How about that? Why don't you vote on it?
00:54:10.960 That could be the whole thing. It could be literally just seven bullet points of what we can do for you
00:54:17.680 versus what Denmark can do for you. Just bullet points. That's it. You don't even have a document.
00:54:23.120 Just seven bullet points. Could be fewer. Could be five bullet points. Just what we'll do,
00:54:29.440 what we won't do. Have a vote. One hour. In one hour, Greenland could completely
00:54:38.480 determine its independence and what it wanted to do with the United States.
00:54:42.000 In one hour. So how many times do you have to see
00:54:47.680 that Trump picks some objective that really looks impossible? Like in the real practical,
00:54:54.000 complicated world, it just looks like it can't be done. And then he just does it in an hour.
00:55:02.400 He's got kind of a reputation for that. Just doing in an hour the thing that can't be done.
00:55:07.440 So working with Elon Musk on other things that people say can't be done, like cutting the budget,
00:55:13.600 it's right on brand for Trump. The stuff that can't be done that he can do in an hour.
00:55:17.920 Guantanamo Bay is going to get a new lease on life. Pete Hegseth and the president want to put
00:55:26.640 the illegal criminal migrants, the ones who have committed more crimes than just coming to the country,
00:55:35.120 in Guantanamo Bay temporarily, prior to shipping them back to the country of origin.
00:55:41.840 Because in some cases, the country of origin will take a little leaning on to make them say yes.
00:55:49.760 That seems to me like a perfect use of it. Number one, it's hard for AOC to visit
00:55:57.120 because you don't want people crying at the fence. So it gives it a little hard to get to quality,
00:56:05.360 which is probably good. Some reporters, I assume, will get there. I don't think it's completely
00:56:12.720 inaccessible. But maybe the ones who go there will be vetted so they're not Rupars and AOCs.
00:56:20.320 So that seems like a good use of something that already exists.
00:56:25.840 Meanwhile, this one's fun. Fox News, Caitlin McFaul is reporting on this, that the incoming UK
00:56:33.200 ambassador, so this is who the UK has decided will be the ambassador to the United States.
00:56:39.280 And remember, we have a special relationship, a special relationship with the UK. It's special.
00:56:47.120 And their new ambassador is coming in, Lord Peter Mandelson.
00:56:51.840 He says good things about Trump today, but he didn't always say good things about him because in
00:57:00.960 2019, he had said that Trump was, quote, a danger to the world, a danger to the world.
00:57:07.600 Which is another way of saying you have TDS and you're worried that Trump will be more fascisty.
00:57:15.440 Things will get more fascisty. That's kind of what he was saying in 2019. But now he's changed his
00:57:20.960 tune. And he says that Trump could be one of the most consequential American presidents of his lifetime.
00:57:26.560 Oh, hold on. Hold on. Nope. Nope. If you read that fast, it sounds like a compliment.
00:57:34.240 But just being most consequential would not be counter to his earlier opinion that he was a
00:57:40.880 danger to the world. A danger to the world would also be the most consequential.
00:57:53.760 So he'd better do better than that. Does he have anything better to say than most consequential?
00:57:53.760 Well, he also said, I consider my remarks about President Trump as ill-judged and wrong.
00:58:01.280 Huh. But why? Were they ill-judged and wrong because his opinion was wrong? Or were they ill-judged
00:58:10.400 and wrong because it's inconvenient to his current career objectives? Hmm. I think I need to know more
00:58:17.440 about why you think you were wrong. But he goes on. And he said, I think that times and attitudes
00:58:23.760 toward the president have changed. Okay. You're still not saying what your attitude is. I get it
00:58:32.000 that other people's attitudes have changed. You're so close to saying something right and good,
00:58:38.480 but you're not there yet. You're not there. Can he take it over the line? And then he said,
00:58:46.320 I think that he, meaning Trump, I think that Trump has won fresh respect, he added. He certainly has
00:58:53.920 for me. Oh, okay. Now we're talking. So he has fresh respect. And that is going to be the basis of all the
00:59:02.640 work I do as His Majesty's ambassador in the United States.
00:59:11.040 Now, there's some rumor that the US, Trump in particular, would reject their ambassador,
00:59:19.520 would reject their ambassador. But would you reject him if he's on the right page now?
00:59:26.640 Because all I really want from other countries is that they treat the US with respect. And they
00:59:33.600 understand that that's the way it has to be. You know, you don't have to agree with everything.
00:59:38.320 That's not a requirement. But respect. Yeah. Don't call our leader a danger to the world
00:59:44.160 and act like you can work with him. Yeah. Okay. So on one level, you'd say to yourself,
00:59:49.920 hmm, I don't know, maybe we could do better. Could we get somebody who didn't once hate him?
00:59:56.480 That seems like a safer play. But on the other hand, somebody who was like an ex-smoker,
01:00:02.000 you know, who's admitting he was wrong. And now he's trying to make good. Well, that could be good,
01:00:06.560 too. Maybe he'll try harder to show that he's on Trump's side. Maybe they'll work in our favor.
01:00:11.520 But it does make me wonder if the, quote, special relationship, are they using the word special
01:00:22.480 in what context? Is it Special Olympics? Or is it special like it's just good? Now, no disrespect to
01:00:33.600 the Special Olympics, which is pretty awesome. But we do wonder what they mean by that special
01:00:39.760 relationship. Because that word can be interpreted in more than one way. But if this guy thought Trump
01:00:46.960 was a danger to the world, and then he found out he was completely wrong in his political worldview,
01:00:53.200 is that the guy you want representing your country? The guy who was wrong about something so basic?
01:00:59.280 And the reason he was wrong is not because of his opinion, but because he fell for brainwashing.
01:01:06.080 If you fall for brainwashing, and it's public, and everybody can see it,
01:01:12.800 is that the one you want sitting in the US for your special relationship? I don't know. So here's my
01:01:19.840 take. I think he's, you know, maybe minimally acceptable. But if Trump decided to get him out of
01:01:29.360 here, just get him out, I'd be okay with that. I think that's up to Trump. That's a personal
01:01:35.360 decision for him. And I could go either way. Meanwhile, according to the Daily Wire, the White
01:01:41.280 House has received over 7400 applications to be some kind of new media White House reporter.
01:01:50.080 Because the White House has opened up the question-asking process for the
01:01:57.440 press events. And now, if you're a podcaster or something, you can apply. And if they
01:02:04.400 like you, and you look like a serious news related entity, you can get some kind of press credentials.
01:02:15.040 Here's what I think. First of all, this is brilliant. Everything about this is good.
01:02:20.560 Why wouldn't the Democrats do it? Well, let me give you the obvious reason. The Democrats benefit
01:02:28.720 when the traditional corporate media is healthy, because those are the ones that have their back.
01:02:35.200 So the corporate media is basically a Democrat platform, most of it. But we can't live without it,
01:02:43.600 because you also need the news. I've said this before, but if the corporate media died tomorrow,
01:02:51.520 I wouldn't have much to talk about. Because I mostly riff off of things they did stupidly.
01:02:59.040 Right? And then there are things that the White House announces. So a lot of the news is just the
01:03:04.960 White House announced something or they answered a question, or they have a plan. Podcasters can do that.
01:03:10.880 We don't need NBC to be in the room. Tell us what happened in the room. And then we go and make
01:03:18.400 our podcast mocking NBC's bad interpretation of it, but also looking at the original so we can see what
01:03:24.480 they got wrong. So that's what I do all day. And it looks like maybe the Trump administration is not
01:03:33.360 only opening up their access, which they like to do, but maybe they're killing the corporate news.
01:03:39.520 Because if a podcaster is getting the direct news from the White House staff and has access to all
01:03:47.440 the right people and can ask all the right questions, I don't need NBC. I don't need MSNBC. I don't need
01:03:54.320 ABC. I could just go to whichever podcaster got the credentials and spent the most time at the White
01:03:59.840 House. So this could be a death blow to the corporate news. I don't know if they mean it to be that,
01:04:07.680 but it could be that way. Christy Noem, she's Homeland Security Secretary, and she announced an end to
01:04:20.800 grant funding abused by NGOs for aiding illegal migrants. Wow. That is so good.
01:04:30.560 I asked the other day. I said, when are we going to, just on X, I said, when is the government going to cut the funding to these NGOs?
01:04:42.560 Now, I said all of them, but what I really meant was the bad ones. And it looks like the bad ones are all out of money now, or at least government funding. So, man, it makes me wonder, how was it ever a thing
01:04:59.040 that a gigantic percentage of our budget was going to these really just money-laundering, highly political
01:05:09.920 things that the country would never have agreed to if they knew it was happening? So, man, cleaning that up, that's quite a swamp cleanup. I'm happy about that.
01:05:20.480 Meanwhile, Trump has signed what NBC News is calling a sweeping executive order. Let's see. If all the podcasters had access to all the White House, would I need NBC News to tell me what Trump just signed? Nope. I could get that from Tim Poole or whoever's going to get credentials.
01:05:45.200 Anyway, so is it a big deal? Why is it a big deal? He said he would prioritize and free up federal funding to expand school choice programs. Oh, it's a big deal, because there's big money behind it. And the government, at least the federal government, would be strongly backing homeschooling.
01:06:07.260 I feel like that. I feel like that might be one of the most historic decisions in the history of the United States.
01:06:16.220 If you don't follow the homeschooling thing, you're thinking to yourself, well, Trump did a lot this week. That's not like the biggest one or anything. It might be. That might be the biggest thing.
01:06:27.360 Because if we don't fix education, everything breaks. And education is completely broken right now. It's completely broken.
01:06:36.440 Here's what I never hear. I've never heard this once, actually. I've never heard somebody who homeschools their child say, that was the biggest mistake I ever made.
01:06:47.320 Should have sent them to public school. Not once. And I know quite a few homeschoolers at this point.
01:06:54.500 Don't you? You all know some homeschoolers, right? Have you ever heard any homeschooler, maybe the student might say, I wish I had more friends or something, but I don't think any parents have ever said, as long as they had a system that worked, I don't think they've ever said, I wish they went to public school.
01:07:14.380 I have to be honest. I've interacted with enough people who are products of homeschooling and enough people who are products of public schools.
01:07:23.840 Like, you can really, really tell the difference. Am I wrong about that? Have you all experienced that? That when you meet a homeschooler versus meeting a public schooler, oh, there's a difference.
01:07:43.820 And I don't think it's just selection. It's not just selection. It's what you turn into with those two experiences as your contrast. So that could be a big deal.
01:07:56.620 Meanwhile, the Justice Department, according to NBC News, dropped a classified documents case against Trump's co-defendants.
01:08:06.980 So I have some question about the timing of that. Is this an extension from some old news or is it really new news?
01:08:14.260 But I don't want to see any of Trump's co-defendants go to jail when Trump himself gets dropped from the case because he's president.
01:08:23.820 That just wouldn't feel right to anybody. So I don't know the details of that. It just sounds like something good happened in that regard.
01:08:33.100 Speaking of Denmark, how does Denmark get in the news twice?
01:08:36.900 I remember going years without mentioning Denmark even once, but today twice.
01:08:43.320 So Denmark, for reasons that are escaping me, they're letting Russia plug the Nord Stream 2, the one that got blown up, mysteriously blown up.
01:08:56.020 We don't know who. Nobody knows. Yeah, we all know.
01:09:01.260 Why would we be in the middle of the Ukraine war, hopefully closer to the end, but still in it,
01:09:08.400 and Denmark's going to allow them to rebuild the pipeline that was one of the biggest risks?
01:09:17.760 How does that make sense?
01:09:19.560 Well, not everybody's happy about it.
01:09:22.620 Poland says, what the hell are you doing?
01:09:24.900 Poland does not want that to happen because it just makes Russia stronger,
01:09:29.460 gives them economic leverage over that part of the world and Europe.
01:09:32.940 And I don't know what's going on.
01:09:36.120 So my take on this is fog of war.
01:09:41.380 There's something about this story that we don't know.
01:09:45.540 It could be,
01:09:48.140 let me give you my most optimistic take,
01:09:51.380 it could be
01:09:52.380 that all sides see that the conflict
01:09:56.160 between Ukraine and Russia
01:09:58.260 Russia and NATO
01:09:59.460 are about ready to wrap up.
01:10:02.540 If they know it's about ready to wrap up
01:10:05.240 and they feel that part of that will be
01:10:08.000 normal business will be allowed to,
01:10:10.820 you know, happen,
01:10:12.540 I could see Denmark saying,
01:10:14.940 oh, we're going to,
01:10:16.000 it's going to take us months
01:10:17.100 to just get prepared to do this
01:10:19.460 and give approval.
01:10:20.500 So we'll just say it now
01:10:21.940 because we're pretty sure the war is
01:10:24.380 going to be wrapped up in a month.
01:10:26.660 You know, we'll be back to normal.
01:10:27.980 We need the gas.
01:10:30.260 So why not?
01:10:31.920 It's possible.
01:10:34.660 I wouldn't bet on it,
01:10:36.580 but it's possible
01:10:37.880 that maybe the United States said,
01:10:41.100 you know, don't refer to us,
01:10:43.080 but if you want to get this going,
01:10:44.880 we're not going to stop you
01:10:45.820 because we think we're going to wrap things up.
01:10:48.360 It could be
01:10:49.200 that that's one of the things
01:10:50.860 that Trump has promised Russia
01:10:54.100 because you can imagine
01:10:56.540 that no matter what the deal is
01:10:58.300 to end the war in Ukraine,
01:10:59.880 it's going to be some shit we don't like
01:11:02.160 and some shit he doesn't like.
01:11:05.280 Otherwise, it doesn't happen.
01:11:07.320 All right.
01:11:07.480 There's no such thing
01:11:08.400 as an end to a war
01:11:09.800 where both sides got what they wanted.
01:11:11.560 That's not a thing.
01:11:12.920 You know, don't even think about that.
01:11:14.820 The way you know the war is ending
01:11:16.800 is when people are doing things,
01:11:18.820 you know, they didn't want to do.
01:11:19.880 That's the signal you're looking for.
01:11:23.040 People doing things
01:11:24.140 they didn't want to do.
01:11:25.180 Now you're talking.
01:11:26.660 That means that people
01:11:27.980 have moved off their hard positions.
01:11:30.300 So on one hand,
01:11:31.840 it doesn't make sense to us
01:11:33.480 with what we know
01:11:35.180 that Denmark would be giving a green light
01:11:38.020 to rebuild the Nord Stream 2.
01:11:40.920 But if that's part of what Trump has said,
01:11:45.500 you know, I don't like it,
01:11:46.720 but I'm going to give you this.
01:11:48.400 You know, I'd rather not compete
01:11:49.560 against it.
01:11:50.280 We'd rather, you know,
01:11:52.400 we didn't compete.
01:11:53.520 But if that's what it's going to take,
01:11:56.060 you know, if that's what makes people
01:11:57.680 stop dying,
01:11:59.700 we'll just try to out-compete
01:12:01.320 your pipeline instead of blowing it up.
01:12:04.420 I could see that.
01:12:06.200 So here's what I think.
01:12:07.360 I think that's the canary
01:12:08.460 in the coal mine.
01:12:10.680 I think that Denmark wouldn't go rogue
01:12:13.260 unless somebody had whispered
01:12:15.500 in their ear.
01:12:16.960 Just speculating.
01:12:18.420 Just pure speculation.
01:12:19.560 But I feel like you're going to find
01:12:21.860 that a deal is almost ready
01:12:24.100 because, you know,
01:12:25.480 deals are ready before you know
01:12:27.360 that they're even talking about it
01:12:28.700 and that this might be part of it.
01:12:31.680 So I'm going to take the optimistic view,
01:12:34.280 which is not entirely called for,
01:12:36.700 but it's a sign of the times.
01:12:40.280 The sign of the times is
01:12:41.700 I can take the optimistic view on this
01:12:43.600 and I won't be mocked
01:12:45.440 even if it doesn't work out
01:12:46.760 because if you choose optimism
01:12:49.300 in the middle of the beginning
01:12:51.460 of the golden age
01:12:52.400 when it does seem like
01:12:53.780 a lot of things are going our way,
01:12:55.500 you're not crazy.
01:12:57.520 You might be wrong,
01:12:58.740 but I won't look crazy in the end.
01:13:03.500 Well, here's a study in PsyPost.
01:13:06.240 Eric Dolan is writing about it.
01:13:07.780 It says that, regarding human evolution in the U.S.,
01:13:09.980 I was talking about that as a topic,
01:13:13.740 education-linked genes
01:13:15.940 are being selected against,
01:13:18.220 meaning that more people
01:13:20.260 are being born without the genes linked to higher education,
01:13:25.700 or without the genes to want to pursue it, I guess.
01:13:29.680 I don't know if the genes make any difference
01:13:31.620 to wanting to be educated,
01:13:33.200 but probably.
01:13:35.500 And that these findings offer new insight,
01:13:38.400 blah, blah, blah, blah, blah.
01:13:39.280 All right.
01:13:41.340 You know what I'm going to say.
01:13:45.620 Is there anybody watching
01:13:47.060 who is not fully aware
01:13:49.280 that people with less education
01:13:52.560 have more babies?
01:13:54.720 You all knew that, right?
01:13:58.080 It seems like it's true almost everywhere.
01:14:01.700 I mean, maybe in some weird case
01:14:05.140 like China in some situation,
01:14:07.580 but generally speaking,
01:14:09.280 poor people have more babies.
01:14:11.840 And there's obvious reasons for it.
01:14:13.760 If you have the option to go to college,
01:14:16.340 it's tough to have a family at the same time.
01:14:19.460 So you do one first,
01:14:20.720 and then you've got less time,
01:14:23.240 and you've got better opportunities
01:14:25.400 than just having kids,
01:14:27.420 as awesome as it might be.
01:14:29.060 So everything about that makes sense.
01:14:30.920 But you could have just asked me.
01:14:33.040 I'm pretty sure that dumb people have more babies,
01:14:36.040 and I could give you five reasons,
01:14:37.680 and they would all check out.
01:14:39.740 But I'm going to go deeper.
01:14:41.320 If it's true today
01:14:44.180 that the lesser educated people
01:14:48.380 are outproducing the others,
01:14:50.660 was it always true?
01:14:52.940 Is it possible it's always been true
01:14:55.280 since maybe the Younger Dryas time?
01:14:59.020 Because the thing that I think is the funniest
01:15:02.700 is that we sit around watching TV,
01:15:05.220 and we can't figure out
01:15:06.060 how the pyramids were built.
01:15:07.880 Like, how did they move these big rocks?
01:15:09.620 How did they carve them so precisely
01:15:11.880 and then put them in place?
01:15:13.860 And I think to myself,
01:15:15.160 maybe we were just smarter,
01:15:17.760 and we've evolved since that time
01:15:20.180 just getting dumber and dumber,
01:15:21.680 and it's not going to stop.
01:15:24.080 We're just going to keep getting dumber
01:15:25.680 until, you know, civilization ends.
01:15:29.600 Maybe.
01:15:30.380 The only hope is that the robots
01:15:32.460 will fill in the gap
01:15:33.840 for how dumb we are,
01:15:35.300 and they'll do the smart stuff
01:15:36.340 so we'll not even notice.
01:15:38.460 So maybe we found a way to hack that.
01:15:43.000 In other news, in The Guardian,
01:15:45.920 the scientists developed a patch
01:15:47.700 that you can put on your heart to repair it.
01:15:50.500 But the patch is made from your own blood,
01:15:53.880 and they trick your blood
01:15:55.020 into turning it into stem cells,
01:15:57.560 and then they take the stem cells,
01:15:59.360 and they turn it into heart muscles
01:16:01.020 that are your heart muscles,
01:16:02.320 and then they put it on a patch,
01:16:04.120 and they slap it on your existing heart
01:16:06.500 that's damaged,
01:16:07.920 and it helps that part of the heart beat.
01:16:11.200 So it doesn't repair the thing it's over.
01:16:14.740 It just operates as if it's the thing.
01:16:17.060 So the patch becomes the muscle.
01:16:20.500 And then the damaged muscle
01:16:22.140 does what it does,
01:16:23.760 but it can't do as much
01:16:24.960 of what it wants to do.
01:16:26.800 That would be a gigantically big deal
01:16:29.780 because apparently it would be effective
01:16:31.520 for people with serious heart injuries.
01:16:34.120 So not just somebody
01:16:35.280 with a little bit of a problem.
01:16:36.960 This could be something
01:16:39.300 like the end of most heart disease.
01:16:41.520 Well, it looks like you got
01:16:42.460 a little damage on your heart.
01:16:44.440 It's going to take us about
01:16:46.260 a two-hour operation.
01:16:48.440 First, we've got to take a little blood,
01:16:49.900 turn it into stem cells,
01:16:51.100 turn it into heart muscles,
01:16:52.700 turn it into a little patch,
01:16:54.480 slap it on the side of your heart,
01:16:55.860 you're good to go.
01:16:57.080 You'll be up and running in a week.
01:17:00.000 That could be a real thing,
01:17:02.300 and very soon.
01:17:04.020 That's kind of cool.
01:17:04.780 Anyway, according to the New York Post,
01:17:11.120 there's nationwide testing of school kids
01:17:13.780 found out that their reading levels
01:17:16.040 have plummeted to the lowest level
01:17:18.140 in 32 years.
01:17:21.360 So that's bad, right?
01:17:23.160 The reading levels of U.S. kids
01:17:25.020 have plummeted to the lowest level
01:17:26.940 in 32 years.
01:17:28.220 You know what else is bad?
01:17:29.200 The ability of the news
01:17:32.640 to understand data and statistics.
01:17:35.980 I haven't done a study,
01:17:37.960 but I'll bet you the ability of the news
01:17:39.960 to interpret data and statistics
01:17:41.820 is at, I'm going to guess, a 32-year low
01:17:45.020 because this is pretty bad.
01:17:47.980 Let me just give you a guess
01:17:50.400 of what they left out of the story.
01:17:53.600 Anybody want to guess?
01:17:55.140 So they look at the total average,
01:17:56.900 and the total average is
01:17:58.500 reading levels are the worst in 32 years.
01:18:01.500 Is there anything they may have left
01:18:03.840 out of the story?
01:18:06.040 I wonder.
01:18:07.860 Yeah, this is the one story
01:18:09.480 where the demographics were left out.
01:18:13.940 Isn't this exactly the kind of story,
01:18:17.160 something about schoolchildren's performance,
01:18:20.660 isn't this usually where they include it?
01:18:22.520 I always thought that it would help us
01:18:27.000 if we knew what the demographic breakdown was
01:18:29.760 because that would be a deeper insight.
01:18:31.720 We would know, for example,
01:18:32.980 if systemic racism was any part of that.
01:18:37.980 Although, presumably,
01:18:39.060 that's not getting worse,
01:18:40.740 so it would be hard to say
01:18:42.240 it's systemic racism
01:18:43.300 because that would be similar
01:18:45.860 to how it's been for 32 years.
01:18:47.720 I don't think it got worse, did it?
01:18:49.740 Here's what else is missing.
01:18:54.800 So here's how they handled the fact
01:18:56.480 that they left out the demographics.
01:19:00.840 They said also, part of the same story,
01:19:03.260 the gap between the high
01:19:04.560 and low-performing students
01:19:05.920 also showed signs of trouble.
01:19:08.240 Oh, so it's also a problem
01:19:10.580 if there's a bigger gap.
01:19:14.120 Oh, okay.
01:19:14.980 So let's go deeper on that.
01:19:16.360 And, but what they said is
01:19:18.780 that the best 10% of the students
01:19:21.340 were doing better than ever.
01:19:24.120 The lowest 10% of the students
01:19:26.180 were doing worse than ever.
01:19:30.040 So let me say it again.
01:19:31.380 The best 10% of the students
01:19:33.120 were not having anything like
01:19:35.380 a 32-year low in reading.
01:19:37.820 They were better than ever,
01:19:39.780 the top 10%.
01:19:40.840 Better than ever.
01:19:43.100 The top 10% are the only ones
01:19:44.900 who make any difference.
01:19:46.520 I hate to tell you,
01:19:47.960 but the bottom 90%
01:19:49.180 are not inventing iPhones
01:19:51.100 and, you know, making robots.
01:19:54.380 It's all the top 10%.
01:19:56.260 The top 10% is better
01:19:58.200 than it's ever been.
01:19:59.640 Our top 10% rocks.
01:20:02.500 They're really, really good.
01:20:04.880 So then here's the other thing
01:20:07.340 that they left out.
01:20:08.120 Now, if you went to one
01:20:10.400 of my local schools,
01:20:11.920 which would be considered
01:20:13.400 one of the top schools
01:20:14.620 in California,
01:20:16.000 my local school,
01:20:17.620 top, probably top 20%
01:20:19.560 or something, you know,
01:20:20.580 not elite or anything like that,
01:20:22.880 but people move to where I live
01:20:24.980 specifically because the schools
01:20:27.380 have a good reputation.
01:20:29.080 If you went into one of the classes
01:20:31.320 in any high school
01:20:33.560 or any grade school too,
01:20:36.220 what would look different
01:20:37.700 than it looked 32 years ago?
01:20:40.520 Well, let me tell you,
01:20:42.620 there would be more people
01:20:43.680 in the class
01:20:44.240 who don't speak English
01:20:45.380 than at any time
01:20:47.300 in 32 years.
01:20:50.180 Every class,
01:20:51.620 every class locally
01:20:53.380 has an unusually high number
01:20:55.940 of non-English speakers.
01:20:57.960 Do you think they should
01:20:58.820 have mentioned that?
01:21:00.380 Do you think that at all
01:21:01.740 changes your average?
01:21:03.360 Yes.
01:21:04.180 Forget about IQ.
01:21:05.600 Forget about training.
01:21:07.360 Forget about your background.
01:21:08.400 If you can't speak
01:21:09.600 the frickin' language,
01:21:11.740 how's your reading comprehension
01:21:13.180 in English going to be?
01:21:14.780 Should be the lowest
01:21:15.660 it's ever been.
01:21:16.880 You're going to be
01:21:17.380 in the lowest 10%
01:21:18.380 and you'll be worse
01:21:19.920 than the old 10%
01:21:21.000 because the old 10%
01:21:22.020 could at least read English.
01:21:24.340 You not only are brand new,
01:21:26.580 but you can't even
01:21:27.240 read the language.
01:21:28.840 So if you don't take out
01:21:30.220 the recent arrivals,
01:21:31.940 because that's not fair,
01:21:34.260 and you don't acknowledge
01:21:35.820 that pretty much every class
01:21:38.000 at this point
01:21:38.720 has a good chunk
01:21:40.300 that it never had before,
01:21:43.560 that's going to be something.
01:21:46.340 Anyway, so anytime you see
01:21:48.580 an average of the entire group
01:21:50.380 of American children,
01:21:52.040 just know that that's
01:21:53.260 probably hiding something.
01:21:55.840 It's probably hiding something.
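The statistical point being made here, that a headline average can hide a shift in who is being averaged, can be sketched in a few lines of Python. The numbers below are invented purely for illustration and are not the actual test scores:

```python
# Hypothetical illustration of a composition effect (made-up numbers,
# not real test data): the overall average can hit a "record low"
# even when every original subgroup holds steady or improves, simply
# because a new low-scoring subgroup was added to the tested pool.

def overall_average(groups):
    """Weighted mean over (count, mean_score) subgroups."""
    total = sum(n * mean for n, mean in groups)
    count = sum(n for n, _ in groups)
    return total / count

# 32 years ago: two subgroups, both fluent English readers.
then = [(900, 250), (100, 300)]          # (students, mean score)

# Today: the same subgroups score a bit HIGHER, but a new group of
# non-English speakers joins the pool with low English reading scores.
now = [(900, 252), (100, 305), (200, 150)]

print(overall_average(then))  # 255.0
print(overall_average(now))   # lower than then, despite every incumbent group improving
```

So a "lowest average in 32 years" is consistent with the incumbent groups doing as well as or better than ever, which is why the subgroup breakdown matters.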
01:21:58.500 All right.
01:21:58.820 Meanwhile, according to live science,
01:22:02.040 Chinese astronauts
01:22:02.900 figured out how to make
01:22:03.820 rocket fuel
01:22:04.560 and oxygen in space
01:22:07.180 using only artificial photosynthesis.
01:22:12.080 So they can make oxygen
01:22:14.000 and they can make rocket fuel
01:22:16.400 in space.
01:22:18.000 And they did it in space,
01:22:19.160 so they're not wondering.
01:22:20.940 They actually have a ship
01:22:22.840 that's circling,
01:22:24.380 orbiting the Earth right now,
01:22:25.660 and they made some
01:22:26.760 artificial photosynthesis.
01:22:28.040 So that might be valuable
01:22:29.540 for some future moon
01:22:31.180 or Martian base.
01:22:33.160 I remind you that
01:22:34.240 if you don't have
01:22:35.100 your Dilbert calendar
01:22:36.140 for 2025,
01:22:36.820 it's not too late.
01:22:38.180 You can get a 10% discount
01:22:39.680 for missing the month
01:22:41.080 of mostly January.
01:22:43.560 But really,
01:22:44.440 it's a calendar
01:22:45.100 that's sort of like
01:22:46.180 a book in calendar form.
01:22:47.660 So the point of it
01:22:48.260 is to read a bunch of comics.
01:22:49.480 It's not about telling you
01:22:50.560 what day it is
01:22:51.300 because you've got a phone
01:22:52.760 and you know what day it is.
01:22:54.840 But some people
01:22:55.520 like calendars.
01:22:56.120 So you can get that.
01:22:58.260 Go to Dilbert.com
01:22:59.420 and you'll see the link
01:23:00.380 to buy that.
01:23:01.420 And when you get there,
01:23:02.980 it'll show you
01:23:03.560 the discount code
01:23:04.580 when you check out.
01:23:05.740 You won't see it
01:23:06.360 until you check out.
01:23:07.440 But it'll show you
01:23:08.300 the discount code
01:23:09.100 for your 10%.
01:23:10.200 And if you do that,
01:23:12.000 you're going to be so happy.
01:23:14.220 People seem to like
01:23:15.080 the calendar a lot.
01:23:16.960 All right.
01:23:17.380 And my book,
01:23:18.020 Win Bigly,
01:23:18.620 is now in its
01:23:20.500 second edition,
01:23:22.620 available on Amazon.
01:23:23.400 If you buy
01:23:26.040 Win Bigly,
01:23:26.600 you will understand
01:23:27.500 the persuasion perspective
01:23:29.780 of President Trump.
01:23:31.480 And you'll learn
01:23:32.180 how to do it yourself.
01:23:33.420 It's easy.
01:23:34.960 So those are the things
01:23:36.000 I wanted to tell you.
01:23:36.680 I'm going to say hi
01:23:37.240 to the locals people privately.
01:23:39.780 And then we're going to watch
01:23:40.640 some confirmation processes,
01:23:42.800 Kash and Tulsi.
01:23:44.320 See how that goes.
01:23:45.820 All right.
01:23:46.040 Locals,
01:23:46.540 coming at you privately
01:23:47.820 if you're on YouTube
01:23:49.180 or X or Rumble.
01:23:50.920 Thanks for joining
01:23:52.420 and I will see you
01:23:53.820 same time next time.