The crash of an American Airlines jet and a Black Hawk helicopter in the Potomac leaves at least 60 dead, and many more missing. Scott Adams talks to an expert in Black Hawk helicopters to try and figure out what happened.
00:01:23.160So we're still in the fog of war, period, but people are speculating, how could this happen?
00:01:33.900You know, it's such a normal thing for there to be traffic in that part of the world,
00:01:38.700and all of that advanced technology should have seen each other.
00:01:44.360But that's what people who are not pilots say. Would you like to hear what a pilot says?
00:01:50.860Which is very different from what you and I are saying, because here's you and I trying to figure
00:01:55.980out this situation. Huh. If I looked out the window of my helicopter, would I be able to see a
00:02:03.040gigantic airplane coming my way? I think I would. So it doesn't make sense I didn't see it.
00:02:09.300And if I were in a giant airline, would I be able to see a helicopter coming toward me?
00:02:15.880Well, of course I would, because, you know, I don't know anything about airplanes,
00:02:19.620but I can look out a window and I can see a thing. But here's what an actual,
00:02:25.340an expert in Black Hawk helicopters tells us. Here's somebody who follows me on X,
00:02:31.420so I was alerted to this one. Mark McEthrin says,
00:02:37.060I was a Black Hawk helicopter crew chief in the army. Okay. That's exactly who I want to hear from.
00:02:44.020And not only that, but he was a flight instructor. Okay. Now we're talking to the right person.
00:02:49.960I want to know, somebody who's an expert in these helicopters, how hard is it to spot other traffic?
00:02:58.380And the bottom line is, it's super hard, even for the experts. So you could actually have this
00:03:05.720accident happen without much going wrong. I hate to say it, but it might not be that anybody did
00:03:14.360anything wrong in quotes. It could have been, this is just a really hard thing to do.
00:03:20.860So he talks about the massive responsibility of the people who are the crew for the helicopter,
00:03:27.320Black Hawk specifically. And part of their job is to look, be the extra eyes for the pilot.
00:03:33.040So they're the ones who are looking for the extra things that the pilot might not spot.
00:03:38.460And he said, I can tell you after doing this for hundreds of hours,
00:03:42.500even when you know exactly where a Black Hawk is and you have night vision goggles on,
00:03:49.800it is extremely, in all capital letters, hard to see the aircraft.
00:03:54.240So from the perspective of the commercial flight, seeing it probably wasn't even an option. I mean,
00:04:03.340it's just really, really, really, really hard to see. So my current view on this is that it's still
00:04:11.080fog of war. You know, we don't know exactly what went wrong. But the most likely explanation is that it was just a
00:04:20.400really, really, really hard situation for even experienced pilots, even with all the electronics
00:04:26.580in the world. And it probably was just a very unfortunate, perfect storm of something being
00:04:33.080in exactly the wrong place at the wrong time. That's my guess. My guess is that, you know,
00:04:38.980there will be no specific blame. I feel like it was just hard. And some people say,
00:04:45.840we're lucky we haven't had more of these, because the odds of something like this happening are
00:04:50.800pretty good, just in general. So the fact that one happened doesn't mean anything new is added to
00:04:59.300the story, necessarily. But as you know, Aaron Rupar, sometimes considered the worst person in the
00:05:06.900media, who's literally famous for fake news, so famous that his name itself is used as one of the
00:05:15.820synonyms for fake news. So a Rupar edit is something that used to be true until it got edited to look
00:05:24.240like the opposite. So he, as soon as the accident happened, he posts a news story about how Trump
00:05:34.880had gutted a key aviation safety committee and fired the head of the TSA. Now, that only happened
00:05:42.600this week. I don't think that the firing of the head of the TSA affected the capability
00:05:51.020of the Black Hawk pilot or the commercial airline. I think it's pretty safe for all of us to say,
00:05:59.200whatever went wrong, it wasn't because of Trump. He's been there a week, and the only change he made
00:06:05.800couldn't have possibly had an operational impact. Or at least you better connect the dots a lot
00:06:12.240better if you're going to make that theory. So what's funny about it is that there was a time
00:06:17.900when people would have just argued about the truth as a statement, or, hey, is that true
00:06:23.620that something Trump did could be part of the story, or is that not true?
00:06:28.500But this time, and I'm very happy to report, almost all of the heat that Rupar got was for
00:06:36.420being that Rupar guy. So, oh, I'm happy about that. Because the reason I came up with the term
00:06:46.220Rupar, and used his last name to be synonymous with a certain kind of fake news, is that when he does
00:06:54.040something on a topic like this, your first thought should be, oh, it's Rupar, Rupar doing a Rupar.
00:07:02.160Because that's the healthiest thing you could think. The moment you think, ooh, here's somebody
00:07:07.180making a point I disagree with, it doesn't really look like he's trying to make points.
00:07:13.420It looks like he's being Rupar. You know, whatever his motives are, we don't know. But it doesn't look
00:07:20.420accidental that he's wrong all the time, does it? It doesn't look accidental. Can't read his mind.
00:07:28.340But he doesn't make it look like an accident. So use your own judgment. But watching people just
00:07:34.040destroy him reputationally, instead of dealing with the ridiculousness of the point of view,
00:11:40.140was fun. Meta agreed to, and let me just circle back for a moment. I don't want to have fun with
00:07:52.100the tragedy. So I'll have fun with the Rupar. But 60 people, this is like one of the worst air
00:08:00.260accidents in a long time. We'll argue about, you know, how long it's been. But this is seriously
00:08:07.140bad. I mean, even Trump, his optimism kind of left him last night. He even posted on Truth,
00:08:14.600you know, it's a terrible night for the United States. So it's just a terrible night. And,
00:08:20.160you know, complete sympathy for the families and the victims. And incredible respect for the recovery
00:08:29.120crew. Imagine working at night in freezing water. You know, obviously they're equipped, but in freezing
00:08:38.580water. And the awfulness of what they have to do, just, that's the job they signed up for. Imagine
00:08:47.280signing up to be in that line of work. To, if a bunch of bodies end up in a freezing river,
00:08:55.480you're the one to go get them. I mean, the fact that we have humans who will take that job
00:09:03.200and then do it well, my God. So full respect. All right, another story. Meta has agreed to pay
00:09:14.840Trump $25 million to settle the lawsuit about Trump getting kicked off of Facebook back when he got
00:09:23.900kicked off. The, I love this settlement. You know, I don't always love settlements or even court
00:09:32.760decisions, but there's something, there's something really healthy about this one. First of all, the
00:09:38.16025 million isn't going to hurt Facebook's business. That's good. That's good. I don't want them to go
00:09:43.760out of business. It's big enough. So we all get the point, right? 25 million. All right. You have my
00:09:51.800attention. We get the point that the censorship of Trump was just flat out wrong. I'm pretty sure
00:09:59.240Zuckerberg says that directly at this point. He does, right? He says that directly. So once
00:10:04.860Zuckerberg and Meta have said we acted wrong, a settlement makes sense. And from Trump's point
00:10:14.300of view, once they've admitted that they were wrong, and it looks genuine, by the way, I think
00:10:18.360Zuckerberg means everything he says about this. Um, you don't need to get a billion dollars,
00:10:24.040right? Yeah. You just, you don't need that. You just need a nice, clean, solid apology slash
00:10:34.720settlement. It's just the right number. And even better, um, 20, I guess, 22 million of the 25
00:10:42.180will go to the Trump library instead of, you know, Trump's pocket. You don't want that. And, um,
00:10:50.840the rest goes to pay the lawyers. Nicely done. That, that would be, you know, I take this back to
00:10:59.280how apologies work, the difference between men and women. I've mentioned this before. Uh, when men
00:11:07.300apologize to men, you can, you can sort of see this settlement as like Zuckerberg
00:11:13.500apologizing to Trump. Men accept apologies. As long as it looks like you mean it, you know,
00:11:22.200we're not, we're not going to delve into your personal thoughts, but if you say it like you mean
00:11:27.260it and you act like you mean it and you change your behavior, we're like, great, done. You're,
00:11:34.560you're even more awesome than I thought. So apologies really work for men, if they're real.
00:11:40.160For women it might be a different situation. I'm no expert on women. So I can't say one way or the
00:11:47.220other, but it never seems to work. If you've ever tried, it doesn't seem to work the same way,
00:11:53.980but I can tell you from my own personal experience, you can change me from your raging anger to,
00:12:01.460oh, we're good now, with just an apology, you know, depending on the situation. So,
00:12:07.400so this looks like a man to man apology, uh, Zuckerberg to Trump. And it looks like the two
00:12:12.840of them handled it like men. Nicely done. Um, some of you are going to say, oh, he's just covering
00:12:19.840his ass. Yeah, that's, that's what we do. Right. Um, and there's some element to that. So there is
00:12:26.420never just one thing, you know, can't Zuckerberg do what makes sense and what he thinks is right,
00:12:31.460but it also is good for business. It's also good for covering his ass. Nothing wrong with that.
00:12:37.880Give him a threefer. Meanwhile, Elon Musk says that, uh, now that the full self-driving capabilities of
00:12:45.740the Teslas have really reached breakout level, uh, breakout, meaning there's no question that the
00:12:53.200self-driving is safer than the human driver. Just no question. I believe, I believe that's over.
00:13:00.800Uh, I think the argument about who's safer, you could just put it completely to rest.
00:13:07.820So we'll see. I mean, we need data to prove it, but, but I think he's got the data on that.
00:13:13.580Anyway, so Musk says that, uh, Tesla's going to launch, uh, full, uh, full self-driving
00:13:19.620unsupervised, meaning you don't have to touch the steering wheel or look at the road. Um,
00:13:26.480and this service will be a paid ride share service in Austin in June. Now that's a good
00:13:33.200way to, you know, take a halfway movement into, um, the full cyber cab world. I think there's
00:13:41.340going to be a lot of, a lot of work to get that up and running, but I'm pretty sure Tesla
00:13:45.320can make that work. So in June, um, that would be a great time to go to Austin just to see how it
00:13:52.860works. You know, I probably won't do that, but I can see how people would. That'd be, that'd be an
00:14:00.200awesome American vacation just to go see the future, see how it feels. That'd be great.
00:14:06.240At the same time, I saw, about the Optimus, also a Tesla product, the Optimus robot, a little more
00:14:13.500information than we had about it. And, uh, it's 5'8", and it weighs 125 pounds. It can lift 150 pounds
00:14:26.000and it can walk at a speed of 1.34 miles per hour. And it looks like the cost, maybe not the first
00:14:33.960models, but, but very soon when they get to production of a, you know, a million a year,
00:14:39.000I think they expect the cost per unit to be 20 to $30,000. So, and, uh, it looks like it'd be
00:14:46.560launched this year. So 2025, if, uh, if Tesla hits his target, this will be the year that you've got a
00:14:57.040robot. Now, I don't know what the first one is going to cost. Um, you know, I, I hate to be the
00:15:04.500one who spends, you know, way too much on the first robot. And then six months later, it's $25,000,
00:15:10.140but I really want one. So would I probably overpay? Uh, I wouldn't overpay stupid, crazy,
00:15:20.280you know, idiot money, but I might overpay a little bit. So I'd love to see a price. I'll tell
00:15:27.440you if I'll pay it, uh, that might help them with the research. But, uh, why did they have to make the
00:15:33.480thing exactly my height? They almost made it my height and my weight to make me feel like I live
00:15:42.500in this simulation. Cause one of the, one of the things I worry about with a robot, uh, is the same
00:15:48.320thing I worry about with a dog. I made sure that when I, that when we selected Snickers, I wanted to
00:15:56.120make sure that if things got out of control, I could still win in a fair fight against the dog.
00:16:01.240You know, if you get, you get some giant, you know, some giant dog or pit bull, that's one of
00:16:07.500the big muscular ones. You say to yourself, well, I don't know, I think I could win a fight, but
00:16:12.980I'm not sure I could. And it's dangerous to have something living in your house that's both
00:16:19.200unpredictable and could beat you in a fair fight. So I just try to avoid that. Right. Um, so the robot,
00:16:26.080the robot looks like it's going to be stronger than me because, um, could it lift 150 pounds? Well,
00:16:33.200yeah, it can. I mean, that's literally my exact weight. So, so it can lift my exact weight and
00:16:39.800it's my size. So it can carry me around. I'm going to make my robot carry me around. See, it won't work
00:16:48.000for you because you're 151 pounds or more, but I'm exactly 150. I'll be like, Carl, can you carry me to
00:16:56.700bed? Oh, again, master? Yes. Again. You know, you could walk, master. I know. And why do you make
00:17:08.200me call you master? I just like it. So robots are coming. Senator Bob Menendez, you know him as
00:17:15.840gold bar Bob. He got sentenced to 11 years in prison. Does that sound right? Does it sound right,
00:17:22.980if a, uh, high-level federal elected official is selling access for money, gold bars, is 11
00:17:32.900years about right? Yeah. Yeah. It actually, yeah. It feels about right. Um, I don't always say that.
00:17:42.760It seems like the, the sentences are always too light or too long, but yeah, I think they got one.
00:17:50.220I think they got that one right. So good work, uh, the court system. Good work, the prosecutors.
00:17:59.060I think he got that all completely right. Well, there's going to be a, uh, bunch of Democrats who
00:18:07.460have formed a group to object to Trump's idea to cut taxes. That sounds like a punchline, doesn't it?
00:18:16.960This is real. There is an organized group of Democrats who are going to oppose Trump's call
00:18:24.420for lower taxes. Now lower just means he wants to extend the current situation. That's what they
00:18:30.820call lower, just doing what we're already doing. But, um, the funny part is that they named their
00:18:39.120Democrat group Families Over Billionaires. Their idea being that the billionaires are going to get
00:18:44.780the tax breaks. Families over billionaires. And the Families Over Billionaires people, who don't want the
00:18:51.860billionaires to have so much control and get the benefits, uh, are going to have alleged
00:18:56.540eight-figure funding. So 10 to $99 million. Who could afford to give a group like this
00:19:05.820over $10 million? Let's see, a millionaire? A millionaire? Oh no. 10 million is more than a
00:19:12.440million. So you couldn't, you couldn't really donate that if you're like just a basic millionaire.
00:19:17.800Who, then? Uh, maybe a billionaire. Are there any billionaires that donate to Republicans or to
00:19:24.620Democrats? Yes, there are. It's called Alexander Soros. It's called Reid Hoffman. Yes. So they think
00:19:34.040that their billionaires are the good ones and, uh, they would claim that the Republican billionaires
00:19:38.960are the bad ones. So they're going to have their billionaires, uh, fund a fake organization that's
00:19:46.160going to pretend it's in favor of higher taxes. Their whole thing is to pretend they're, they're
00:19:53.180in favor of higher taxes. Okay. All right. I'm glad that you woke up today and went to, went to fight,
00:20:01.400went to fight for the things that you believe in higher taxes. Um,
00:20:06.820so that's not ideal.
00:21:10.520Uh, meanwhile, if you saw the RFK
00:21:17.480Jr. confirmation hearings, you probably enjoyed it. Uh, my take on it was I thought RFK Jr. answers
00:21:25.860tough questions better than I maybe have ever seen anybody answer tough questions because he had the
00:21:34.700data, you know, right in the back of his or the front of his head. He didn't even have to go to the
00:21:39.960back of his head. He knew exactly what he was talking about. And every one of the gotchas, he had an
00:21:46.700explanation that when you were done, you'd say, Oh, Oh, well, that actually is not what I thought it was.
00:21:52.020And it really tells you how fake the fake news was because that's all they had. It turns out that
00:21:59.920the only thing they had was fake news. So when he sits there and calmly explains, you know, why it's
00:22:07.240fake and what the real context is, it's really powerful. So I don't think they laid a glove on him.
00:22:13.380He's got maybe another day or so of, uh, testifying to a slightly different group, I think,
00:22:18.760but it's some of the best, best answers I've seen. Um, but the, the funniest thing that came out of it
00:22:27.200was Bernie Sanders and his onesies. If you haven't seen the clip yet, Bernie Sanders thinks he has this
00:22:35.320real gotcha because somebody developed a onesie, which would be a clothing item for a baby, I guess,
00:22:42.540uh, and the baby clothing had some anti-vax message on it. So Bernie puts that
00:22:49.420up, uh, puts up a picture of the onesies with the anti-vax message. He says, are you supportive of
00:22:55.460these onesies? And of course, uh, RFK Jr. had nothing to do with the onesies, not directly, not indirectly.
00:23:04.540So he decides not to answer that dumb question. And he just says, I'm supportive of vaccines.
00:23:10.900Now, first of all, that's a perfect answer. Anything he said other than this sentence,
00:23:16.560I'm supportive of vaccines, would have been a mistake. Of all the billions and billions of
00:23:21.120things you could have said, there was only one perfect thing. And he said it. Now, I really noticed
00:23:28.240that. If there's only one perfect thing and everything else is a mistake, if you can find
00:23:34.000the one perfect thing and you lead with it, I'm supportive of vaccines. There's no hedging on that.
00:23:42.260Now it doesn't mean every vaccine doesn't mean he wants to do it without testing, but he's generally
00:23:47.240supportive. So that should have been the end of the questioning, right? Once he says I'm in favor of
00:23:54.140vaccines, which is the opposite of the message on that onesie, well, we're done here, right?
00:24:00.660But Bernie apparently had gone through the trouble to make this,
00:24:03.460to make this visual and he wasn't going to quit on it.
00:24:09.220So he starts doing this ridiculous, are you supportive of the onesie? Uh, well, I support
00:24:15.980vaccines, but what about the onesie? What about the onesie? Tell us about the onesie.
00:24:20.820And you could just see people behind RFK Jr. Like Megyn Kelly, just laughing. And then RFK Jr.,
00:24:30.320he can't stifle his own laugh. So the video that got the most play was RFK Jr. literally laughing
00:24:40.160at Bernie being just a total idiot on this point. I mean, I respect Bernie in a lot of ways.
00:24:47.220He's, you know, partly his stick-to-itiveness and, you know, he seems pretty committed to his
00:24:54.520principles, whether you like him or not. But sorry, Bernie, this was the most absurd, ridiculous,
00:25:03.000uh, anti-science, anti-useful, complete waste of time. But you made a clown of yourself and it was
00:25:10.760entertaining. So we like that. We like the entertaining part. Anyway, um, here's what I think.
00:25:21.820I think that the thing about RFK Jr. that is really unique is that it's not political.
00:25:29.680It's a political process, but, you know, he's a lifelong Democrat and he is selected for one of the
00:25:37.220top jobs by the top Republican. That's as non-political as you can get when the top Republican
00:25:45.520picks one of the most famous Democrats. And by the way, RFK Jr. has never said,
00:25:51.520oh, now I'm a Republican. That never happened. He's the same guy he always was. It's just that
00:25:58.000he wants this one mission and it's important enough that he'll do it in whatever way can get it done.
00:26:03.980So it's the least political thing you'll ever see in your life, which makes me like it the most.
00:26:10.860It's also one of the most important things. I can't really come up with something besides the
00:26:16.120debt. The debt is existential, but beyond the debt, this is really my number one and probably should be
00:26:23.280most people's number one. Now I'll tell you why I'm more of a, more on the war path for this than
00:26:30.520other people. I've actually done the experiment where I cut off all processed foods for months.
00:26:41.280You won't believe how well you feel. If you do the experiment of just getting rid of, you know,
00:26:48.700it's expensive because processed food is cheaper and more convenient and everything else. But if you can do
00:26:54.280it and you just do your basic proteins and your basic organic, if you can do it, vegetables and
00:27:01.400fruits and stuff, you're going to find out that a lot of what you thought were your medical problems
00:27:06.940were food related. I thought I had terrible allergies all year long, all the time, no matter what.
00:27:14.740I don't. I had a reaction to poison food and I never had any unpoisoned food, I guess, as an adult.
00:27:24.860So I didn't really notice. I didn't think it was food because no matter what I ate, I had the same
00:27:30.820reaction. But I had to actually cut down the number of things I eat to just this tiny sliver of things
00:27:37.400I allow myself and then all my symptoms go away. Now, if you haven't experienced that, you don't quite
00:27:44.220understand what's at stake. What's at stake is chronic illness for all your children forever.
00:27:53.080They'll die. They'll suffer. Their lives will be terrible. They'll barely be able to pay attention
00:27:58.940in school. And I don't even know if they can mate. It's the end of the fucking world if we don't get
00:28:06.500him in there. Maybe, you know, you can't guarantee that, of course. But this is life and death for the
00:28:12.920children. This is bigger than abortion. You know, I guess you could argue that. But one of the things
00:28:21.020I always appreciated about the most conservative Republicans, without necessarily agreeing with
00:28:27.860their opinion, I respect the fact that the Republicans said, we're going to move this decision
00:28:35.420out of the federal government, put it in the courts, and we're going to get killed in the elections.
00:28:39.880Now, that has my respect. If you think that's important enough, and again, this is their opinion,
00:28:47.720not mine. I'm just respecting that the thing they decided to, you know, die on that sword,
00:28:53.580they knew they were going to die on the sword. They knew they'd get killed in the midterms. They'd get,
00:28:58.600you know, maybe lose 2020 or whatever. So the Republicans who said, we believe this so hard,
00:29:04.580we're going to die on the sword. I really respect that. Even if you don't agree with them on abortion,
00:29:12.240that is a respectable approach to life. Now, when I see RFK Jr. taking this kind of personal,
00:29:20.340professional risk to get this done, to save the children, save the families, save America,
00:29:27.080my God, do I have respect for that. You know what I don't respect? The people trying to stop him.
00:29:35.200Now, I can kind of understand maybe Democrats, you know, just doing the political thing,
00:29:41.260blah, blah, blah, but there are enough Republicans, so it shouldn't matter.
00:29:45.920Unless a handful of Republicans turn on him, because they're getting funded by the pharma or big food
00:29:52.680lobbies, which is probably the only reason I can think of that it would happen. And let me just say,
00:29:58.580this isn't politics as normal. This is not politics as normal. If Republicans kill the RFK Jr. thing,
00:30:06.680we're not going to forget. Because this would be like driving up to your family and punching every
00:30:12.780one of your family members in front of you, and then driving away and saying, yeah, well, that was
00:30:17.280yesterday. That was yesterday I drove up and punched in the face every one of your kids.
00:30:23.540Well, are you going to forget it? No, I'm not going to forget it. If you punch my children in the face
00:30:29.220while I'm standing there and walk away, 40 years from now, you're still going to pay if I have a chance.
00:30:35.560Right? We're coming for you. I don't mean physically, obviously. No violence, please. No violence.
00:30:42.020But in terms of career, in terms of reputation, in terms of money, in terms of politics, yeah,
00:30:47.660it's, this is to the end. So you can retire, and we're still coming after you. Reputationally.
00:30:56.940Reputationally, not physically. So I'm with Nicole Shanahan as hard as you can be with anything. So
00:31:06.060Nicole says that if you vote against this, especially if you're Republican, she doesn't say
00:31:11.240that part, but I do. Especially if you're Republican, it's so clearly obvious that you're
00:31:17.420being bought out and that you've chosen whatever you're getting out of somebody over the life of
00:31:23.440our fucking kids. You're going to fucking pay for it. This isn't like the other stuff. The other stuff
00:31:29.840is just politics. We get it. We don't win every time. We can't win everything. Sometimes we win.
00:31:35.360Sometimes we lose. We get it. We get over it. We're not going to fucking get over this. We're
00:31:40.280not going to fucking get over it. This is to the end. Again, not physical. We're not talking about
00:31:46.700any violence. But you think you're going to stay in politics if you vote against RFK Jr. and you
00:31:52.040kick him out? No, we're not getting over it. We're not getting over it. And you're not going to get
00:31:58.500over it either. We're going to make sure of that reputationally. That's a promise. Edward Snowden
00:32:06.800said about Tulsi Gabbard, who's also going to be in the confirmation process. So I guess Tulsi Gabbard
00:32:15.540has been in favor of Edward Snowden being pardoned, if I have the background right. And then Snowden
00:32:22.480not wanting her to fail in the confirmations because of him, he said in a post today that
00:32:32.280Tulsi Gabbard will be required to disown all prior support for whistleblowers, meaning himself
00:32:39.280and others, as a condition of confirmation today. I encourage her to do so. In other words,
00:32:44.920to disavow even Snowden. Tell them I harmed national security and the sweet, soft feelings
00:32:51.120of staff. He's hurt their feelings. In D.C., that's what passes as the Pledge of Allegiance.
00:32:59.220So I'm not sure that Snowden is helping. Yeah. But it might. I mean, I don't know that it
00:33:07.920hurt. But she might actually. No, I don't think she will. I don't think she's going to disavow
00:33:15.920him. It doesn't feel like something she would do.
00:33:34.040Yeah, I'm seeing the comments. There's a story going around that some are saying that Lyme disease
00:33:43.280wasn't naturally occurring. It was also a lab leak. Now, I would consider that so far a rumor.
00:33:52.960But you know how these rumors turn into a real thing if you wait long enough? It sounds like a
00:33:58.700conspiracy theory. But I'm not going to be the one who says, well, six months from now, remember when
00:34:05.920I said that was a conspiracy theory and then some information came out? So I'll say that I don't know
00:34:12.120enough about that story to say it's true or false. It's just out there. If you had told me that Lyme
00:34:19.280disease was made in a bioweapon lab, if you told me that 10 years ago, I would have just said, come on.
00:34:26.920Come on. That doesn't really happen in the real world, does it? Somebody makes a bioweapon. Suddenly
00:34:34.520it gets out and, you know, millions of people are infected. That's not a real thing. But it's a real
00:34:41.800thing. So whether it happened with Lyme disease or not, don't know. No, no. I guess we'll find out
00:34:48.420more. But I don't rule it out. It's like the years of ruling out things just because they sound like
00:34:56.780they're ridiculous. Can't do it anymore. We're in the ridiculous world now. Anyway.
00:35:04.200So we'll see how Tulsi Gabbard does. And I guess Kash Patel is up today as well. Is that right?
00:35:14.620Is Mike Benz saying the Lyme disease? Does he confirm it or just give us the background so we
00:35:20.580can make up our own opinion? I'll check that out. I'll see. I'll see what Benz is saying.
00:35:25.240Remember, Benz uses, or at least when he documents his opinions, he uses public information.
00:35:33.980So if Benz has an opinion on this that leans more toward the lab leak theory, it's going to be based
00:35:42.500on stuff you can check yourself. You know, he shows the receipts. So that would be interesting. I'll
00:35:48.300check that out for you. I was watching the Daily Show, whose initials are TDS, which is important,
00:35:57.080the Daily Show. They even put the initials on the background for part of the opening segment.
00:36:02.880It actually says TDS all over the background. You would have thought they would have fixed that by now.
00:36:10.160But Jon Stewart, he came on and he said this about how the Democrats are responding to Trump so far.
00:36:18.300He said, quote, things are going to get fascisty, fascisty, you know, more fascist. And he
00:36:26.120cautioned Democrats that they don't want to ruin all their credibility by complaining about things
00:36:32.260that are, you know, not factually correct or not important. So Jon Stewart is warning people
00:36:42.600that they're not being rational in their complaints about Trump. He's doing that on a show whose letters
00:36:51.840are TDS. And here's the best part. Let me pull it all together now. You probably have all seen maybe
00:36:59.340more than once a famous clip where long before the COVID lab leak in Wuhan was established as the
00:37:07.240most likely source of it, Jon Stewart broke ranks with the popular opinion. And on the Colbert show,
00:37:15.060he mocked the fact that there was any doubt that the Wuhan, what was it? The Wuhan Institute of
00:37:25.040Virology, which happened to be across the street from the wet market. He just mocked people for
00:37:31.960thinking it was anything but the most obvious thing, which was the very lab that was working on
00:37:36.820the very thing. Now, you remember how he did it, right? He just mocked the fact that the name of the
00:37:45.980lab was the answer to the mystery. You didn't have to go any deeper than the name of the lab.
00:37:52.940The name of the lab. We're done here, people. Look at the name of the lab. Now, that's hilarious
00:37:59.900because we're all so complicated in our thinking that you really didn't need to go past the
00:38:06.660name of the lab. The reason it's funny is that it's 100% correct. You don't need to think deeper.
00:38:14.300It's a sign on the door. Yeah, that tells you everything you need to know right on the sign.
00:38:20.520But the same guy who brought us that piece of brilliance and bravery, which was pure bravery,
00:38:28.200by the way, because he was going against pretty big forces when he said that, Jon Stewart now
00:38:33.280sits on a set that literally says TDS and warns people that things are going to get fascisty.
00:38:42.960Now, I got to pay back Jon Stewart a little bit here. And by the way, I love his whole thing,
00:38:50.160even when I disagree with him. He's very good at what he does. I got to go full Jon Stewart on you,
00:38:56.060Jon Stewart. I got to go full Jon Stewart. Jon, the letters on your set say TDS. Not just once,
00:39:08.920probably a hundred times. TDS. TDS. Do you think that might be the better explanation of why you
00:39:19.340think things might get fascisty? Do you think those are unrelated? That you're sitting in front of a TDS
00:39:28.960background saying that everything was fine, but I think it could turn fascisty. Maybe he'll steal
00:39:39.220your democracy. How long do we have to go before he doesn't do anything like any of those things
00:39:45.180before you realize it's just the Wuhan Institute of Virology? It's right on the sign. Just read the
00:39:54.660sign. Now, I know that's an analogy, so it's not a perfect argument. It's just kind of so simulation
00:40:01.420perfect. I can't stop looking. Meanwhile, over at the big situation about DeepSeek,
00:40:09.320the Chinese open source AI that some say was only 5% as much cost as the American AI and just as good
00:40:19.080destroying our industry and all that. Well, I did a little research and I found out how you could
00:40:26.920make an AI that's 5% of the cost of the United States AI. Are you ready? Number one, you lie about
00:40:34.940how much it costs. That's important to the process. So you say, it only costs 5%. So that's very
00:40:46.800important. If you told the truth, it would sound like we have way more NVIDIA GPUs than you think,
00:40:55.960even though we're not supposed to have any. We've got a whole data center or two that's just stacked
00:41:01.120with them. And it costs us millions and millions and millions and billions of dollars. So the first
00:41:07.780thing is you just don't mention that. And then later, later, if somebody says, uh, you know, actually,
00:41:13.900I think there was like a giant data center involved because otherwise you couldn't get to where you
00:41:18.320are. Then it's too late, because you've already gotten the number out; the 5% is already in people's
00:41:24.340minds. Oh, it only cost them 5 million. Wow. Cheap. So when later you find out, oh, they
00:41:30.960certainly had a data center full of very expensive equipment to get there, you just kind of forget
00:41:36.560that part. So that's the first thing. Second thing is instead of using training data that you've
00:41:43.520scraped from the entire internet, the way the big U.S. companies do, you steal it from the people who
00:41:51.220stole it. So if the big AI companies stole my IP and my copyrighted works, but tried to cover it up by,
00:42:00.200you know, generalizing it. Um, first the U.S. companies steal it and pay nothing to people like
00:42:05.800me. That's a separate, separate conversation. But then if you want to really save some money,
00:42:12.360you steal it from the people who stole it. So if you steal from stealers, it looks like it's free.
00:42:21.240It's not free to me, uh, being one of the original copyright holders who has tons of material,
00:42:27.520which apparently AI is trained on. How do I know? Cause I can ask it and it knows a hell of a lot
00:42:33.220about me. So yeah, it trained on me pretty hard. Um, so that's how you do it. You lie about how much
00:42:41.460hardware you used and then you just steal what somebody else already stole and then it looks
00:42:47.660cheap. And then you lie about how many people are working on it and all that stuff. So that works.
00:42:54.160Um, however, I would like to, uh, add this thought.
00:43:31.040Think about all the important people in time, you know, like Plato and Socrates and all them.
00:43:39.900If people stop reading and start using AI, which apparently is happening and, um, AI reaches sort of a
00:43:48.800training limit roughly now, meaning that there's not much else to train on in the real world. It's,
00:43:56.720it's sucked up all the real world stuff. So it's got to use, you know, artificial data to extend.
00:44:03.840What that means, correct me if I'm wrong, but if there were some modern voices in the world that were
00:44:12.560unusually powerful, would they not forever be part of AI's personality?
00:44:20.880And would they not be more important for being current and alive at the moment and creating a lot
00:44:27.420of documents than let's say somebody who was really smart, but died a thousand years ago,
00:44:32.740and we only have a few surviving texts, that sort of thing. So would it be true that the people who
00:44:41.060are, let's say, best-selling authors or public figures who've got a lot of opinions, would it be
00:44:48.980true that their, their impact on humanity is sort of locked in now? Meaning that AI is sort of
00:44:59.760permanently affected by the people who are the most persuasive at the moment, because that's where all
00:45:05.920the training happened. And then after the training happened, all they do is run some updates. You know,
00:45:11.440I guess some new stuff happened, but you know, it's hardly going to change the whole. Is it true
00:45:18.560that some people will be more locked in as the personality of AI than other people?
00:45:24.400And am I one of those people? Let me give you an example. If you go to AI, and it depends which AI
00:45:32.960you're, you're looking at, and you ask this question, which I have asked, uh, if you say,
00:45:38.880what is the impact of Scott Adams on politics? Now I asked that question, I forget which AI,
00:45:45.920it might've been perplexity. And I think Grok has a similar answer, but, um, I think it was perplexity
00:45:53.120that told me that my contribution to politics was that I changed the national conversation
00:46:00.720from policy to persuasion. Now that's what AI says. Now, is that true? It certainly seems like it,
00:46:13.440because if you look at the way, you know, any podcaster or anybody else is talking about anything,
00:46:18.480we do talk about the policy, but we spend way more time as I have already today talking about the liars.
00:46:26.320We talk about the fake news because that's persuasion. We talk about Rupar and these technique.
00:46:32.720We talk about the white house, uh, publishing a hoax list. That's up to four hoaxes, right?
00:46:39.920Right. Right. That's mostly me. That's mostly me. And that appears to be now locked in to what AI
00:46:51.520thinks about the world. So cancel me or not too late. I'm baked into the AI and probably you'll never
00:47:02.640get it out of there. Probably all of my books that the main themes of all of my main books,
00:47:10.080such as systems being more important than goals. Um, the idea of a talent stack, AI knows that stuff.
00:47:20.000It knows it and it'll repeat it back to you. Now, is that a, um, is that because it
00:47:27.040stole my copyrighted work? Not necessarily because so many people have talked about my work
00:47:33.920that if they just trained on the people talking about it, they'd probably get almost everything
00:47:39.040they needed. So those are several, several contributions that seem like they might be
00:47:47.280permanent in the AI brain. And then there's Dilbert itself, you know, 36 years of Dilbert comics,
00:47:55.280which certainly changed America. If you were watching the business book market at that time,
00:48:02.080business books that promised to tell you how to do everything great. If you just,
00:48:06.720you know, did what the business book author said, they became like the, just the biggest thing.
00:48:11.680And then Dilbert came along and mocked all of that bullshit for being completely worthless crap.
00:48:17.760And the business market for business books collapsed and never recovered. Now every once
00:48:24.000in a while, there'll be a big book, but the, the whole idea that any consultant can write a book
00:48:28.320and it's a bestseller that for a while, it just seemed like anybody with a business name was
00:48:33.760writing a bestseller about how to be successful. That kind of all went away as bullshit.
00:48:38.960And it was replaced by what I call a Dilbert point of view, which is sort of cynical and,
00:48:45.280you know, it's about your bosses looking out for themselves, et cetera.
00:48:49.440And I'm pretty sure that AI has absorbed all of that, all of it. So even let's say a more minor
00:48:58.640example, there's a, we'll see in the comments, how many of you ever seen that? How many of you
00:49:04.000have seen a one page document I wrote years ago on how to be a better writer? How many of you have
00:49:12.000seen that? It's one of the most viral things for years and years and years.
00:49:15.680Now that's going around everywhere. And I would assume that AI has absorbed it because it's been
00:49:24.320in so many places and repeated and repeated and recopied and it's a meme and everything else.
00:49:30.480So the reason it got around is that nobody disagreed with it. You know, the experts looked at it and
00:49:35.280said, yeah, that looks about right. So have I, have I become a permanent part of what it takes to be a
00:49:43.760good writer in the future? And would AI itself be influenced by anything I said? Does it recognize,
00:49:51.040oh, here's the guide to being a good writer. I'm trying to be succinct. I'll just do that.
00:49:57.440I don't know. No way to know that. But here's what I'm saying more generally. You know,
00:50:02.960I'm using myself as an example because I know the most about my own work. But don't you think,
00:50:08.240don't you think that the people who are the most effective voices at the moment on social media
00:50:18.320are going to get permanently a higher status in AI until the end of time? Because when AI decided to do
00:50:27.840its big learning, it seems like that's going to be the bias it will have forever. So good luck
00:50:38.320canceling me now, suckers. I'm in the machine. I'm in the machine. All right.
00:50:46.000All right. So the US Copyright Office, according to Just the News, did a ruling that artists can
00:50:58.080copyright some work that is created with the help of AI. So what is not copyrighted is if the only thing
00:51:06.240you did is put in a prompt, like, show me an image of Trump riding a bicycle. Can't copyright that. But
00:51:14.720if you had Trump riding a bicycle, but then you painted your own image of something over it so
00:51:25.120it's part AI and it's part you, you can copyright that. Copyright the whole thing. So there's going to be
00:51:31.200a lot of gray area. Well, but I like that they've at least taken that step, that if the human artist has
00:51:40.480substantially added to the AI, the AI is just a tool. And then the, you know, the ownership and
00:51:48.160artistry of the creator still gets credit. So this is a step in the right direction. It's going to get
00:51:53.680really, really gray and messy, but at least we sorted that out. I like that. According to also on
00:52:03.200in Just the News, somebody named Drew Horn, Drew Horn, which sounds a lot like what I used to do when
00:52:13.120I was doodling. I would just draw horns on people sometimes because I wondered what they would look
00:52:18.080like with horns. So that's his name, Drew Horns. Something I've done so many times. Anyway,
00:52:27.600he's the CEO of something called Green Met and he said that Greenland deciding to leave Denmark and
00:52:37.520have some kind of association or joining America would be easier than you think.
00:52:44.880Easier than you think. And indeed, if he's right, Greenland only has to vote on it. So apparently
00:52:52.640Denmark has been quite open-minded about letting Greenland manage itself. So Denmark seems to care
00:53:00.320about the national security, the big picture. They don't control the laws in Greenland. And the
00:53:08.240current law, according to Drew Horn, is that if they wanted to, the people in Greenland could simply
00:53:15.840have a vote and they can vote their own independence and Denmark would respect it because that's the
00:53:22.720current system. The current system gives them the right to vote anything they want. And if that's what
00:53:27.200they want to vote on, there's nothing stopping them. So in other words, when this whole conversation
00:53:34.640started about how hard it would be for Trump to possibly pull this off, nobody could ever pull this
00:53:41.520off. It'd be the hardest thing. It might take one hour of the people in Greenland doing a little vote
00:53:48.960on paper and then counting it. That could be the whole thing. The entire process might be, hey, we're
00:53:56.480going to make you a proposition in Greenland. You can stick with Denmark or you can make more money
00:54:04.320going with us and you'll probably be safer too. How about that? Why don't you vote on it?
00:54:10.960That could be the whole thing. It could be literally just seven bullet points of what we can do for you
00:54:17.680versus what Denmark can do for you. Just bullet points. That's it. You don't even need a document.
00:54:23.120Just seven bullet points. Could be fewer. Could be five bullet points. Just what we'll do,
00:54:29.440what we won't do. Have a vote. One hour. In one hour, Greenland could completely
00:54:38.480determine its independence and what it wanted to do with the United States.
00:54:42.000In one hour. So how many times do you have to see
00:54:47.680that Trump picks some objective that really looks impossible? Like in the real practical,
00:54:54.000complicated world, it just looks like it can't be done. And then he just does it in an hour.
00:55:02.400He's got kind of a reputation for that. Just doing in an hour the thing that can't be done.
00:55:07.440So working with Elon Musk on other things that people say can't be done, like cutting the budget,
00:55:13.600it's right on brand for Trump. The stuff that can't be done that he can do in an hour.
00:55:17.920Guantanamo Bay is going to get a new lease on life. Pete Hegseth and the president want to put
00:55:26.640the illegal criminal migrants, the ones who have committed more crimes than just coming to the country,
00:55:35.120wants to store them temporarily in Guantanamo Bay prior to shipping them back to the country of origin.
00:55:41.840Because in some cases, the country of origin will take a little leaning on to make them say yes.
00:55:49.760That seems to me like a perfect use of it. Number one, it's hard for AOC to visit
00:55:57.120because you don't want people crying at the fence. So it gives it a little hard-to-get-to quality,
00:56:05.360which is probably good. Some reporters, I assume, will get there. I don't think it's completely
00:56:12.720inaccessible. But maybe the ones who go there will be vetted so they're not Rupars and AOCs.
00:56:20.320So that seems like a good use of something that already exists.
00:56:25.840Meanwhile, this one's fun. Fox News, Caitlin McFall is reporting on this, that the incoming UK
00:56:33.200ambassador, so this is who the UK has decided will be the ambassador to the United States.
00:56:39.280And remember, we have a special relationship, a special relationship with the UK. It's special.
00:56:47.120And their new ambassador is coming in, Lord Peter Mandelson.
00:56:51.840He says good things about Trump today, but he didn't always say good things about him because in
00:57:00.9602019, he had said that Trump was, quote, a danger to the world, a danger to the world.
00:57:07.600Which is another way of saying you have TDS and you're worried that Trump will be more fascist-y.
00:57:15.440Things will get more fascist-y. That's kind of what he was saying in 2019. But now he's changed his
00:57:20.960tune. And he says that Trump could be one of the most consequential American presidents of his lifetime.
00:57:26.560Oh, hold on. Hold on. Nope. Nope. If you read that fast, it sounds like a compliment.
00:57:34.240But just being most consequential would not be counter to his earlier opinion that he was a
00:57:40.880danger to the world. A danger to the world would also be the most consequential.
00:57:46.640So he'd better say better than that. Does he have anything better to say than most consequential?
00:57:53.760Well, he also said, I consider my remarks about President Trump as ill-judged and wrong.
00:58:01.280Huh. But why? Were they ill-judged and wrong because his opinion was wrong? Or were they ill-judged
00:58:10.400and wrong because it's inconvenient to his current career objectives? Hmm. I think I need to know more
00:58:17.440about why you think you were wrong. But he goes on. And he said, I think that times and attitudes
00:58:23.760toward the president have changed. Okay. You're still not saying what your attitude is. I get it
00:58:32.000that other people's attitudes have changed. You're so close to saying something right and good,
00:58:38.480but you're not there yet. You're not there. Can he take it over the line? And then he said,
00:58:46.320I think that he, meaning Trump, I think that Trump has won fresh respect, he added. He certainly has
00:58:53.920for me. Oh, okay. Now we're talking. So he has fresh respect. And that is going to be the basis of all the
00:59:02.640work I do as His Majesty's ambassador in the United States.
00:59:11.040Now, there's some rumor that the US, Trump in particular, would reject their ambassador,
00:59:19.520would reject their ambassador. But would you reject him if he's on the right page now?
00:59:26.640Because all I really want from other countries is that they treat the US with respect. And they
00:59:33.600understand that that's the way it has to be. You know, you don't have to agree with everything.
00:59:38.320That's not a requirement. But respect. Yeah. Don't call our leader a danger to the world
00:59:44.160and act like we can work with you. Yeah. Okay. So on one level, you'd say to yourself,
00:59:49.920hmm, I don't know, maybe we could do better. Could we get somebody who didn't once hate him?
00:59:56.480That seems like a safer play. But on the other hand, somebody who was like an ex-smoker,
01:00:02.000you know, who's admitting he was wrong. And now he's trying to make good. Well, that could be good,
01:00:06.560too. Maybe he'll try harder to show that he's on Trump's side. Maybe they'll work in our favor.
01:00:11.520But it does make me wonder if the, quote, special relationship, are they using the word special
01:00:22.480in what context? Is it Special Olympics? Or is it special like it's just good? Now, no disrespect to
01:00:33.600the Special Olympics, which is pretty awesome. But we do wonder what they mean by that special
01:00:39.760relationship. Because that word can be interpreted in more than one way. But if this guy thought Trump
01:00:46.960was a danger to the world, and then he found out he was completely wrong in his political worldview,
01:00:53.200is that the guy you want representing your country? The guy who was wrong about something so basic?
01:00:59.280And the reason he was wrong is not because of his opinion, but because he fell for brainwashing.
01:01:06.080If you fall for brainwashing, and it's public, and everybody can see it,
01:01:12.800is that the one you want sitting in the US for your special relationship? I don't know. So here's my
01:01:19.840take. I think he's, you know, maybe minimally acceptable. But if Trump decided to get him out of
01:01:29.360here, just get him out, I'd be okay with that. I think that's up to Trump. That's a personal
01:01:35.360decision for him. And I could go either way. Meanwhile, according to the Daily Wire, the White
01:01:41.280House has received over 7400 applications to be some kind of new media White House reporter.
01:01:50.080Because the White House has opened up the question answering or the question asking process for the
01:01:57.440press, the press events. And now, if you're a podcaster or something, you can apply. And if they
01:02:04.400like you, and you look like a serious news related entity, you can get some kind of press credentials.
01:02:15.040Here's what I think. First of all, this is brilliant. Everything about this is good.
01:02:20.560Why wouldn't the Democrats do it? Well, let me give you the obvious reason. The Democrats benefit
01:02:28.720when the traditional corporate media is healthy, because those are the ones that have their back.
01:02:35.200So the corporate media is basically a Democrat platform, most of it. But we can't live without it,
01:02:43.600because you also need the news. I've said this before, but if the corporate media died tomorrow,
01:02:51.520I wouldn't have much to talk about. Because I mostly riff off of things they did stupidly.
01:02:59.040Right? And then there are things that the White House announces. So a lot of the news is just the
01:03:04.960White House announced something or they answered a question, or they have a plan. Podcasters can do that.
01:03:10.880We don't need NBC to be in the room. Tell us what happened in the room. And then we go and make
01:03:18.400our podcast mocking NBC's bad interpretation of it, but also looking at the original so we can see what
01:03:24.480they got wrong. So that's what I do all day. And it looks like maybe the Trump administration is not
01:03:33.360only opening up their access, which they like to do, but maybe they're killing the corporate news.
01:03:39.520Because if a podcaster is getting the direct news from the White House staff and has access to all
01:03:47.440the right people and can ask all the right questions, I don't need NBC. I don't need MSNBC. I don't need
01:03:54.320ABC. I could just go to whichever podcaster got the credentials and spent the most time at the White
01:03:59.840House. So this could be a death blow to the corporate news. I don't know if they mean it to be that,
01:04:07.680but it could be that way. Kristi Noem, she's Homeland Security Secretary, and she announced an end to
01:04:20.800grant funding abused by NGOs for aiding illegal migrants. Wow. That is so good.
01:04:30.560I asked the other day. I said, when are we going to, just on X, I said, when is the government going to cut the funding to these NGOs?
01:04:42.560Now, I said all of them, but what I really meant was the bad ones. And it looks like the bad ones are all out of money now, or at least government funding. So, man, it makes me wonder, how was it ever a thing
01:04:59.040that gigantic percentage of our budget was going to these really just money laundering, highly political
01:05:09.920things that the country would never have agreed to if they knew it was happening? So, man, cleaning that up is, that's quite a swamp cleanup. I'm happy about that.
01:05:20.480Meanwhile, Trump has signed what NBC News is calling a sweeping executive order. Let's see. If all the podcasters had access to all the White House, would I need NBC News to tell me what Trump just signed? Nope. I could get that from Tim Poole or whoever's going to get credentials.
01:05:45.200Anyway, so is it a big deal? Why is it a big deal? He said he would prioritize and free up federal funding to expand the school choice programs. Oh, it's a big deal, because there's big money behind it. And the government would be behind, at least the federal government, would be strongly backing homeschooling.
01:06:07.260I feel like that. I feel like that might be one of the most historic decisions in the history of the United States.
01:06:16.220If you don't follow the homeschooling thing, you're thinking to yourself, well, Trump did a lot this week. That's not like the biggest one or anything. It might be. That might be the biggest thing.
01:06:27.360Because if we don't fix education, everything breaks. And education is completely broken right now. It's completely broken.
01:06:36.440Here's what I never hear. I've never heard this once, actually. I've never heard somebody who homeschools their child say, that was the biggest mistake I ever made.
01:06:47.320Should have sent them to public school. Not once. And I know quite a few homeschoolers at this point.
01:06:54.500Don't you? You all know some homeschoolers, right? Have you ever heard any homeschooler, maybe the student might say, I wish I had more friends or something, but I don't think any parents have ever said, as long as they had a system that worked, I don't think they've ever said, I wish they went to public school.
01:07:14.380I have to be honest. I've interacted with enough people who are products of homeschooling and enough people who are products of public schools.
01:07:23.840Like, you can really, really tell the difference. Am I wrong about that? Have you all experienced that? That when you meet a homeschooler versus meeting a public schooler, oh, there's a difference.
01:07:43.820And I don't think it's just selection. It's not just selection. It's what you turn into with those two experiences as your contrast. So that could be a big deal.
01:07:56.620Meanwhile, the Justice Department, according to NBC News, dropped a classified documents case against Trump's co-defendants.
01:08:06.980So I have some question about the timing of that. Is this an extension from some old news or is it really new news?
01:08:14.260But I don't want to see any of Trump's co-defendants go to jail when Trump himself gets dropped from the case because he's president.
01:08:23.820That just wouldn't feel right to anybody. So I don't know the details of that. It just sounds like something good happened in that regard.
01:08:33.100Speaking of Denmark, how does Denmark get in the news twice?
01:08:36.900I remember going years without mentioning Denmark even once, but today twice.
01:08:43.320So Denmark, for reasons that are escaping me, they're letting Russia plug the Nord Stream 2, the one that got blown up, mysteriously blown up.
01:08:56.020We don't know who. Nobody knows. Yeah, we all know.
01:09:01.260Why would we be in the middle of the Ukraine war, hopefully closer to the end, but still in it,
01:09:08.400and Denmark's going to allow them to rebuild the pipeline that was one of the biggest risks?