The Matt Walsh Show - February 12, 2025


Ep. 1535 - Tyrannical, Power-Hungry Judges Attempt To Seize Total Control Of The Government


Episode Stats

Length

1 hour and 7 minutes

Words per Minute

173.9584

Word Count

11,784

Sentence Count

864

Misogynist Sentences

32

Hate Speech Sentences

19


Summary


Transcript

00:00:00.000 Today on the Matt Walsh Show, out of control, tyrannical, power-hungry judges are attempting
00:00:03.580 to override Trump and claim total control over the executive branch. We'll talk about why Trump's
00:00:07.660 only choice is to ignore the lawless judges that are issuing these orders and dare them to do
00:00:12.060 something about it. Also, J.D. Vance attends a summit on AI. He says the Trump administration
00:00:16.320 plans to take the lead on AI, which is good, but taking the lead also means putting up guardrails.
00:00:20.760 We'll talk about that. And a left-wing protest goes wildly off the rails in truly horrifying
00:00:25.660 ways that I will force you to witness. And the morbidly obese woman suing Lyft made
00:00:30.080 an appearance on Breakfast Club this week. Her whole case and their furniture fell apart
00:00:35.460 almost immediately. We'll talk about all that and more today on the Matt Walsh Show.
00:00:55.660 Did you know that over 85% of grass-fed beef in stores is imported? Kind of like the U.S.
00:01:09.640 population ever since Biden's disastrous border policies. But anyway, since 2015's repeal of
00:01:15.240 country-of-origin labeling, meat can get a product of USA label even if it's from overseas. Most grocery
00:01:21.460 store meat contains unwanted additives, antibiotics, hormones, and seed oils. But GoodRanchers.com
00:01:27.640 is different. Their meat is 100% American-born and raised and harvested with complete transparency
00:01:32.480 about its origins all the way from farm to table. I've tried many of their steaks and other choices,
00:01:37.440 and they are quite possibly the most tender, tasteful, clean protein options that I have ever
00:01:42.140 had. Right now, Good Ranchers is running the New Year New Meat promotion. So in 2025, you can make choices
00:01:47.940 that you feel good about. If you subscribe to any of their boxes, you'll get free ground beef,
00:01:51.620 chicken breasts, or wild-caught salmon in every order for a year. Plus, $25 off and use my code
00:01:57.940 WALSH at checkout. That is clean, high-quality protein in every order for free for an entire
00:02:03.020 year. You don't want to miss out. My family and I, we eat Good Ranchers at least once a week,
00:02:07.520 and we all love it. It's not too late to start the year right. Invest in your health and support
00:02:12.020 American farms with every purchase. Visit GoodRanchers.com now. Use my code WALSH for that exclusive
00:02:16.720 offer. Good Ranchers, American meat delivered. Whenever a new administration takes over the
00:02:21.480 White House, there are obviously a lot of major changes that take place. We talked about many
00:02:25.660 of these changes already in the past few weeks with Trump's executive orders and so on. But there are
00:02:29.780 also some minor changes that nobody really talks about or takes into account, even though they may
00:02:34.640 have symbolic significance. In particular, one of the first moves that Trump made when he took office
00:02:40.140 back in 2017 was the installation of a portrait of Andrew Jackson in the White House. Jackson was a war
00:02:46.340 hero who served in both the Revolutionary War and the War of 1812 when he led American troops to a
00:02:51.420 historic victory in New Orleans. Jackson was also a populist who wanted the United States to expand
00:02:55.860 and grow its borders because he believed it would benefit the average American. And so for four years,
00:03:01.500 Jackson's portrait remained in the Oval Office. Then the minute he became president, Joe Biden quickly
00:03:06.420 removed the portrait. No one in Biden's administration could stand to look at Andrew Jackson, but apparently
00:03:11.080 they didn't destroy the portrait. They just put it in storage. So now, four years later,
00:03:14.800 the game of Andrew Jackson tug of war is continuing. The portrait is officially back.
00:03:21.520 And as you can see, when you look at the portrait in the Oval Office, Abe Lincoln is on the left and
00:03:26.880 Jackson is on the right. And if you ask a typical Biden supporter about this, they'll tell you that this is
00:03:31.780 an outrage because Andrew Jackson was a slave owner. On top of that, he wasn't exactly a big fan of the
00:03:36.980 Indians. And therefore, you know, he's canceled. But that explanation actually misses one of the
00:03:43.500 defining moments of Jackson's presidency. This is a moment that very soon could have a lot of relevance
00:03:48.860 for the second Trump term. I'm talking about a decision by the U.S. Supreme Court in 1832,
00:03:53.740 which held that the Cherokee Indians were a sovereign nation that had the right to govern themselves
00:03:59.000 without interference from the states. This was a ruling that threatened to disrupt Jackson's plans
00:04:03.520 for American expansion into the West. It was also a major blow to the concept of states' rights,
00:04:08.760 and it would have significant ramifications for Georgia in particular. And of course, as Jackson
00:04:13.520 saw it, it was also an unlawful ruling. So in response to this decision, according to various
00:04:17.900 accounts, Jackson uttered some version of this quote, Chief Justice John Marshall has made his decision.
00:04:24.800 Now let him enforce it. In other words, when a court, even the Supreme Court, exceeds the limits of
00:04:30.660 its authority, then there's no requirement that anybody follow the court's rulings. There's
00:04:35.600 certainly no way for the court to compel the executive branch to do anything. So if a court ever
00:04:40.820 goes rogue and begins to issue rulings that flagrantly disregard the Constitution, as well as the outcome
00:04:45.840 that a majority of Americans want, that's how you deal with it. Jackson set that precedent.
00:04:52.640 A century later, FDR built on that idea when the courts began striking down his New Deal
00:04:56.840 legislation. He threatened to pack the courts. Very quickly, the Supreme Court backed down
00:05:00.740 and started upholding some of those laws. And depending on who you ask, Andrew Jackson and FDR
00:05:06.560 were either villains or heroes in those stories. But either way, their actions illustrate an important
00:05:13.280 point, which is that in our system of checks and balances, everything depends on legitimacy.
00:05:19.020 If courts are seen as legitimate, then their orders are enforced. If courts are seen as
00:05:23.900 illegitimate, then nothing they do really matters. Donald Trump, as a student of Andrew Jackson,
00:05:29.580 certainly understands that. And judges, we can assume, understand this principle as well. And
00:05:34.340 that's why the flood of federal injunctions that have already been issued in Trump's second term
00:05:39.140 are best understood as a deliberate provocation by the judiciary. Because every other day, an unelected
00:05:45.800 federal judge, usually a left-wing judge, is issuing a nationwide injunction barring some aspect
00:05:51.540 of Trump's agenda. And this is happening so frequently that most people don't really understand
00:05:57.220 or comprehend the scale of the problem. So let me run down a brief, non-exhaustive list of just some
00:06:05.100 of these injunctions, okay? These have all been issued in the past couple of weeks. A Biden judge
00:06:11.640 blocked Trump's halt on federal grant spending. A Clinton judge in Massachusetts blocked Trump's plan
00:06:17.320 to issue buyouts to federal workers and also blocked the Trump administration from transferring
00:06:21.940 a trans-identifying man to a men's prison facility where he belongs. An Obama judge in New York blocked
00:06:27.360 both Doge and the treasury secretary himself from accessing internal treasury records. A Biden judge in
00:06:33.380 Massachusetts blocked Trump from cutting billions of dollars in fraudulent spending that was earmarked
00:06:37.980 for scientific research, including wasteful overhead, as we discussed yesterday. Another judge in
00:06:42.560 Washington, D.C., appointed by Trump, halted the suspension of U.S. aid workers who were
00:06:47.260 spending billions of dollars orchestrating foreign coups and sponsoring
00:06:51.380 transgender theater and that sort of thing. An Obama judge in D.C. ordered Trump to reinstate the head
00:06:57.040 of a special counsel office. A Reagan judge in D.C. blocked prison officials from transferring
00:07:01.980 trans-identifying men to men's prisons. Three judges, including Biden, Bush, and Reagan judges in
00:07:07.380 New Hampshire, Washington, and Maryland, blocked Trump's executive order ending birthright citizenship.
00:07:12.020 And most recently, a D.C. judge named John Bates issued what may be the single most obviously
00:07:18.700 unconstitutional ruling written by a federal judge in modern history. Now, before I get into that
00:07:25.540 ruling, take stock of the sheer magnitude of orders that I just mentioned and where they're coming from.
00:07:33.260 I mean, without exception, these are judges located in left-wing jurisdictions, most of them appointed
00:07:40.720 by Democrats. And in every case, they are unilaterally issuing emergency temporary restraining
00:07:47.220 orders, which are supposed to be an extraordinary and drastic remedy according to established legal
00:07:51.900 precedent. And if you're trying to portray the courts as legitimate, nonpartisan institutions,
00:07:58.420 then put simply, this does not look good. In fact, it's a travesty. And it gets much worse when you
00:08:04.460 look at what these rulings are saying. So let's go back to the ruling by John Bates that I just
00:08:09.340 mentioned. Okay, so this ruling prevents the Trump administration from removing, or is supposed to
00:08:15.860 prevent them, from removing content from websites that the Trump administration controls. In fact, the
00:08:22.900 ruling goes further than that. It requires the Trump administration to keep old content from the
00:08:28.940 Biden administration online. And this is not an exaggeration. The order, by its own terms, bans the Trump
00:08:34.500 administration from, quote, removing or modifying health-related web pages and data sets, while also
00:08:40.960 compelling the administration to, quote, restore web pages and data sets that they have already removed or
00:08:45.500 modified. In other words, you have an unelected federal judge who's just named himself the official
00:08:52.580 web editor of the entire federal government. He's saying, in effect, that the executive branch
00:08:58.220 has no power whatsoever. I mean, they can't even control what they're putting on the internet. Trump,
00:09:04.260 as the president of the United States and the most powerful man in the world, actually doesn't even
00:09:08.760 have the power to make alterations to a website. Specifically, the ruling comes in response to the
00:09:15.100 Trump administration's executive order ending gender ideology in the federal government. Pursuant to that
00:09:20.620 order, the administration removed a lot of content from websites that are run by the CDC, HHS, FDA,
00:09:26.960 and other agencies. And some of that content was related to sex changes and cross-sex hormones.
00:09:31.820 Some of it included activist research on the risk of suicide among trans-identifying individuals,
00:09:37.200 which, of course, has been used to justify child castration instead of treating their obvious mental
00:09:41.820 health condition. Some of the materials had to do with nonsense concepts like environmental justice.
00:09:46.840 Some of it was about drugs that gay people can take in order to prevent the spread of HIV while
00:09:53.100 they're engaging in reckless and inherently dangerous sexual activity, and on and on. Now,
00:09:57.540 whatever you think of these materials, and you should think that they're garbage because they are,
00:10:02.220 it's clearly within the federal government's purview to delete them from their own websites.
00:10:08.860 This is why we have elections. But Judge Bates disagreed. He ruled that, quote,
00:10:13.920 by removing long relied upon medical resources without explanation, it is likely that each agency
00:10:19.400 failed to examine the relevant data and articulate a satisfactory explanation for its action.
00:10:24.880 And he goes on to say that the Trump administration has violated something called the Paperwork Reduction
00:10:29.860 Act, which is a law that nobody knows anything about or cares about at all. But there's more to the
00:10:37.360 ruling. But let's stop there and just address this part of it. Okay, he's saying that there's no
00:10:43.160 explanation for why this content was removed. So apparently we can't reduce this particular
00:10:48.340 paperwork. And this would be a compelling argument if the judge were, say, living in a cave with no
00:10:55.080 internet access or access to newspapers or other human interaction for the past decade or so.
00:11:00.540 But everyone else understands exactly why the Trump administration is gutting these websites.
00:11:04.800 As we discussed yesterday, activists have taken over scientific and medical research in this
00:11:08.940 country. They've promoted child butchery and other obviously immoral practices, which the
00:11:13.720 overwhelming majority of Americans reject. And Trump ran against all of that. And he won a resounding
00:11:20.500 victory. So that's all the explanation the federal judge should require. And that's if we even need to
00:11:31.000 explain anything to him anyway; his personal opinion about their reasons for deleting something
00:11:37.380 from a website is irrelevant, or it should be. But then the judge goes on to offer another explanation
00:11:44.180 for his ruling. Quoting again from the decision, he says, quote, if those doctors cannot provide these
00:11:48.920 individuals the care they need and deserve within the scheduled and often limited timeframe, there's a
00:11:53.760 chance that some individuals will not receive treatment, including for severe life-threatening
00:11:57.820 conditions. So now what's happening is that the judge is claiming that doctors are reliant on
00:12:03.580 government websites in order to treat their patients. And as proof of this, he cites a doctor
00:12:07.940 from Yale who says that it now takes her a little bit longer to prescribe contraception for her patients.
00:12:13.980 And without these websites, she's totally clueless, apparently. Her medical school training wasn't
00:12:18.740 enough. Her residency wasn't enough. Her years of practice weren't enough. Her access to thousands of
00:12:24.020 research papers is not enough. She needs a small handful of government websites in order to do her
00:12:29.140 job, apparently. Now, if we take this claim literally, of course, it means that this doctor
00:12:33.420 is incompetent to practice medicine. But instead, the judge takes it as proof that he needs to dictate
00:12:39.880 what content goes on the internet. And if you go through any of the other 10 million injunctions
00:12:44.820 that judges have filed against Trump so far, you'll find that this kind of reasoning, as egregious as it is,
00:12:49.980 is pretty much the norm. The judges are taking anecdotal claims from activists, and they're
00:12:55.400 using them to throw up roadblocks in the way of the new administration. And they're doing it so that
00:13:00.060 CNN can report that Trump is causing a constitutional crisis, when in fact the judges are the ones doing
00:13:06.940 that. Watch. We are three weeks into the second Trump presidency, three weeks, and tonight there are
00:13:13.560 warnings that the U.S. is dangerously close to a constitutional crisis. Now, the first shoe on this
00:13:19.300 dropped when a federal judge today said the White House is defying his order to unfreeze billions
00:13:24.780 of dollars in federal aid, marking the first time that we've had a judge expressly accuse the Trump
00:13:30.080 administration of ignoring a court ruling. And in a separate case today, federal employees here in
00:13:35.800 Washington told a judge that the administration was defying another order by not reinstating workers
00:13:41.760 who had been put on leave. Now, this all has prominent Democrats and many of the nation's top
00:13:46.560 constitutional scholars declaring that the U.S. is on the brink of a reckoning. The Trump Justice
00:13:51.900 Department says the president should have the authority to decide how to run the government
00:13:55.340 and that these judges are overreaching. And some of the president's allies say the judges should not
00:14:00.920 be judging any of the moves to shrink the federal government. Now, as you just heard, there are hints in
00:14:06.900 that reporting that the Trump administration may not comply with some of these orders. Trump himself
00:14:10.560 hasn't come out and said that. He's apparently weighing his options at this point. But as this continues,
00:14:14.980 which it inevitably will, the Andrew Jackson solution is going to become more and more
00:14:19.680 appealing. And ultimately, it will be necessary. It is necessary. You know, to be very clear about
00:14:26.000 this, there is nothing in the Constitution that gives any random federal judge absolute power to
00:14:31.620 override anything the president does or any decision he makes at any time just because they personally
00:14:37.080 don't like it. You know, if a single judge in a place like New Hampshire or Washington can decide
00:14:42.020 the president can't reduce foreign aid or fire his own employees or kick men out of women's prisons or
00:14:46.880 even control the content that's posted on government websites, then essentially the president has no
00:14:52.860 power. The judges are the presidents. They have all the power. Nothing else matters. And that's the
00:14:59.960 result the left obviously wants. Now it is anyway. And now that the deep state and career bureaucrats are
00:15:06.460 being terminated, they see the courts as their only hope. That's why Chuck Schumer just spoke on the
00:15:11.560 floor of the Senate declaring that courts are essentially infallible and their rulings must
00:15:16.680 always be honored no matter what. Watch.
00:15:19.960 Donald Trump is not free to bulldoze his way through the rule of law. Donald Trump is not free
00:15:28.420 to bulldoze his way through the rule of law. He is an executive, not a monarch. He swore an oath
00:15:35.760 faithfully to execute the duties of his office. And when the courts speak, Donald Trump must accept
00:15:42.380 their judgments and honor the Constitution.
00:15:46.760 When the courts speak, you just have to honor their judgments, no matter what they say,
00:15:51.780 no matter what. If a federal judge tomorrow issues a ruling saying that, you know, everybody has to wear a
00:15:56.740 red shirt on Tuesday, well, we just got to do it. You know, the fact that he has no power to issue
00:16:03.540 that ruling and no power to enforce it, and he's way outside the bounds of the Constitution, doesn't
00:16:08.260 matter. He's a judge in a robe. He said it, so we all got to do it. I mean, that's what Chuck Schumer
00:16:12.140 was saying. But if you remember just a couple of years back, Democrats like Chuck Schumer were standing
00:16:16.680 in front of the Supreme Court threatening individual judges over their rulings. AOC was telling Anderson
00:16:22.720 Cooper that Biden should just ignore Supreme Court rulings, and he did. He admitted that the court
00:16:27.740 had struck down his plan to cancel student loans, for example, so he just went ahead and did it
00:16:31.220 anyway. Now these same Democrats are maintaining that judges should never be questioned under any
00:16:37.360 circumstances. The same Democrats who are organizing protests outside the houses of Supreme Court
00:16:45.920 justices. Now they're saying, well, no, if the judge says it, honor it, doesn't matter.
00:16:52.400 Now the reality, of course, is somewhere in the middle of these two extremes that the Democrats have
00:16:58.500 bounced between. Judges are not infallible. They can deliberately and maliciously violate the
00:17:04.980 Constitution just like anyone else can. And when that happens, when a judicial coup is underway,
00:17:10.740 a response is necessary. We're not talking about one or two bad rulings here. We're not talking
00:17:15.700 about rulings that block a handful of policy goals or anything like that. We're at the point where
00:17:19.980 the president is not being allowed to do anything. He can't even edit a website. Left-wing judges are
00:17:27.880 issuing emergency injunctions on everything without even deciding actual cases. You know, the only way
00:17:35.280 to get us out from under this judicial tyranny is for Trump to disregard these orders and for Congress
00:17:40.360 to impeach the judges responsible for them. Throughout our history, there have only been a handful of times
00:17:45.160 when presidents have needed to consider drastic actions like that. The country was clearly better
00:17:50.500 off because Andrew Jackson did it. And now two centuries later, it's equally clear that this
00:17:56.120 country would be better off if the Trump administration followed in Jackson's footsteps
00:17:59.520 and dared these judges to enforce these rulings. They can't do it, obviously. You know, they complain and
00:18:07.700 issue more injunctions and more opinions. Fine. Meanwhile, the rest of us, people who want to see this
00:18:14.540 country improve, will get exactly what we voted for. Now let's get to our five headlines.
00:18:26.820 If you own a handgun, you know the dilemma. You either keep it locked away somewhere secure but tough to
00:18:31.860 access or you compromise on security for quick access. Neither option is ideal. So that's why I was
00:18:38.380 genuinely impressed when I tried out the Stopbox Pro. I've had mine for a while now and it solves that
00:18:43.460 access versus security problem brilliantly. What makes it different is that it uses this incredibly
00:18:48.200 clever push-button mechanical system. No keys to lose, no batteries to fail, just reliable access
00:18:54.640 when you need it. What really sold me was testing it out at home. The build quality is exceptional
00:18:59.660 and it should be since they make everything right here in the USA. I could access it quickly in the
00:19:04.140 dark, which gives me real peace of mind while simultaneously making whoever is breaking and entering
00:19:08.280 learn what regret is very rapidly. Plus, for those of you who travel, it's TSA compliant. So you can
00:19:14.120 actually fly with it properly secured in checked baggage because it turns out TSA isn't a fan of
00:19:18.680 just throwing loose weapons into your bag. Don't try it. For a limited time only, our listeners are
00:19:23.720 getting a crazy deal. Not only do you get 10% off of your entire order when you use code MATTWALLSHOW
00:19:28.660 at stopboxusa.com. But they're also giving you buy one, get one free for their Stopbox Pro. That's
00:19:33.900 10% off and a free Stopbox Pro when you use code MATTWALLSHOW at stopboxusa.com. Discover a better way
00:19:41.920 to balance security and readiness with Stopbox. Daily Wire reports. Vice President J.D. Vance addressed
00:19:48.380 world leaders at the Artificial Intelligence Action Summit in Paris on Tuesday, pointing to the U.S. as
00:19:53.700 the global leader on one of the most promising technologies we've seen in generations. The
00:19:58.580 conference included world leaders, including French President Emmanuel Macron and Canada's outgoing
00:20:03.920 Prime Minister Justin Trudeau, as well as tech executives. In his speech, the Vice
00:20:09.580 President outlined four key points he says the Trump administration will strive for in its AI policy:
00:20:15.440 making America the gold standard of AI, fighting excessive regulation in the industry, preventing
00:20:20.320 political bias in AI and advocating for American workers as the technology continues to develop.
00:20:26.440 Here's some of Vance's address to the summit.
00:20:30.240 AI, we believe, is going to make us more productive, more prosperous, and more free.
00:20:36.560 The United States of America is the leader in AI, and our administration plans to keep it that way.
00:20:42.980 The U.S. possesses all components across the full AI stack, including advanced semiconductor design,
00:20:50.320 frontier algorithms, and, of course, transformational applications.
00:20:54.680 Now, the computing power this stack requires is integral to advancing AI technology. And to safeguard
00:21:01.520 America's advantage, the Trump administration will ensure that the most powerful AI systems are built
00:21:07.280 in the U.S. with American-designed and manufactured chips.
00:21:12.700 Now, just because we're the leader doesn't mean we want to or need to go it alone, of course.
00:21:17.180 And let me be emphatic about this point. America wants to partner with all of you.
00:21:23.640 And we want to embark on the AI revolution before us with the spirit of openness and collaboration.
00:21:32.140 But to create that kind of trust, we need international regulatory regimes that fosters the creation of AI
00:21:40.840 technology rather than strangles it. And we need our European friends in particular to look to this
00:21:47.180 new frontier with optimism rather than trepidation.
00:21:50.440 So, you know, I agree with him that we should lead the way with this technology. This is the correct
00:21:59.940 position for a presidential administration to have. There's nothing he said that I disagree with.
00:22:05.680 That said, I don't think that our approach to AI should be to simply embrace it wholesale
00:22:10.140 unquestioningly, which isn't what Vance was saying. I'm just putting this
00:22:14.820 forward as my own opinion. In fact, Vance talked about needing to advocate for and protect American
00:22:18.960 workers. And I agree, that's important. And to me, that means that there have to be guardrails put
00:22:27.020 in place. There have to be lines drawn where AI just will not cross. We have always wanted to make
00:22:38.360 things faster and more efficient and cheaper. And that makes sense. You know, the faster, the more
00:22:45.980 efficient, the cheaper, the better. That's the idea. But we're now at a point in history where fast,
00:22:51.500 efficient, and cheap cannot be the be-all and end-all. Right? We can't just say, well, whatever's faster,
00:22:58.580 more efficient, and cheaper is automatically the better way to go. We're at a point now where it's not that
00:23:05.000 simple. I mean, you could argue it's never been that simple. But it certainly isn't now. Because
00:23:09.220 if we do approach it that way, human beings will simply just be replaced. And they will be replaced
00:23:18.640 in every facet of life. We're talking millions of jobs lost. Millions. And most of them will not be
00:23:26.880 converted into some other kind of job. We're not talking about jobs evolving or changing. We're talking
00:23:33.260 about jobs just lost. If AI is allowed to just simply take over with no guardrails, no regulations
00:23:40.660 of any kind. You know, people like to draw all kinds of comparisons. Anytime you hear someone like me
00:23:49.020 talking about the dangers of AI, what we always hear from the other side is, well, people have always
00:23:54.060 said that about every new technology. You know, when the car replaced the horse and buggy, there are
00:23:59.620 people saying, well, what about the carriage drivers? Well, yeah, but that's different. AI is
00:24:06.300 just a different kind of technology. It is its own category of thing. It's not comparable to any of
00:24:12.880 this other stuff. Because, you know, when the horse and buggy was replaced with the car, it meant that
00:24:17.400 if you were a carriage driver for a living, well, now you can become a cab driver in a car. It just
00:24:24.400 means that the tool that you're using is different, but you're still essentially doing the same thing,
00:24:28.920 which is getting people from point A to point B. And now you're doing it in a different vehicle.
00:24:34.700 And now you can do it a lot faster. You can carry a lot more people around. So it's better for you,
00:24:39.620 you know, in the long term and in the short term. And then when cabs started getting replaced with
00:24:45.660 Uber, it meant that cab drivers became Uber drivers. Whether that was actually an improvement
00:24:51.860 for the cab driver is a different conversation. But, you know, that's pretty clear. It's just like
00:24:55.660 the technology that you're using has changed, but it's still you doing the thing.
00:25:02.120 But as AI takes over, many jobs will just go away. I mean, they're just not going to exist anymore.
00:25:08.460 Self-driving cars means everyone who drives for a living doesn't have a job anymore. It's not like,
00:25:14.200 well, no, now you go do something else. Hopefully you do go do something else, but it's not that the tool of your trade
00:25:20.340 has changed; your trade is gone. That's the difference. The job does not evolve into something
00:25:27.200 else. It's just gone. And this same thing will happen over and over and over again in almost
00:25:33.760 every area of life. And there will be less and less for human beings to actually do. And that's the way
00:25:41.740 it goes if we embrace AI wholesale. If we don't value the human component at all, if we view humans
00:25:50.360 as just another type of machine, then we lose the competition because AI will be a superior machine
00:25:57.900 for many applications from a purely utilitarian perspective.
00:26:02.880 If the only thing we care about is, well, what's the cheapest, fastest, most efficient way to do
00:26:08.360 this? Well, then humans are going to lose. And human beings losing ultimately is not what we want
00:26:20.120 because the whole idea, like what's the point of life? What's the point of any of this? I mean,
00:26:25.540 what's the point of having a country? What's the point of human civilization?
00:26:32.640 We want to see that human life is thriving. And if you're going in a direction that's going to
00:26:42.760 ultimately hurt human beings for the sake of machines, then clearly it's not the right direction.
00:26:48.680 A future where millions of jobs are extinct, replaced with nothing, and there's very little for human
00:26:53.320 beings to actually do is a dystopian, hellish future. It's an unhuman future. It's the future
00:26:58.680 that, I mean, it's the future that like every dystopian sci-fi writer ever has warned us about.
00:27:05.060 And I'm not saying that AI necessarily creates that future or leads to that future or needs to lead to it.
00:27:13.320 And, you know, J.D. Vance said
00:27:16.780 we want to be leaders in AI. We don't want to strangle innovation. I agree. We don't want to
00:27:23.200 strangle it. Total agreement there. I'm saying that the wholesale, uncritical, absolute embrace
00:27:29.660 of AI with no guardrails, no lines drawn, that leads to dystopia. Like guaranteed. I mean, guaranteed
00:27:37.260 dystopia. So what do guardrails mean? It means that there have to be lines, again, lines drawn where we
00:27:43.880 say, okay, AI could replace this thing, this job, this aspect of human existence, but we will not let
00:27:53.580 it. We have to be willing to say that, you know, and we're not going to say that for everything,
00:28:02.100 but that's part of the debate is where are the lines drawn? But they have to be drawn somewhere.
00:28:08.360 And I do think there are some people that are really bullish on AI that really don't want the lines
00:28:12.100 drawn anywhere at all. They're just like, let it do what it's going to do. Wherever it can replace a
00:28:18.520 person, let's just do it. I do believe that's the opinion of some people. And I think
00:28:24.300 that you're just strolling into dystopia. I mean, it's mind-boggling.
00:28:33.740 So using the cab or the Uber example, truck drivers, another example, you know, in that case,
00:28:40.100 it would mean passing a law that bans self-driving rideshare and self-driving commercial trucks.
00:28:45.420 It would mean passing a law where we say, no, we're not going to allow these entire industries
00:28:50.080 full of jobs that people rely on to feed their families to just be wiped out overnight.
00:28:55.440 And look, we can debate whether we should pass that law with cabs and trucks
00:28:59.700 specifically. I'm using that as an example of what drawing the line would mean and what it would
00:29:05.260 look like. Maybe you could make a good argument that we should just let it happen. We should let all
00:29:09.240 the truck drivers and Uber and cab drivers and everyone else who drives for a living lose their
00:29:12.320 jobs. It'd be very difficult to convince me of that argument because I tend to think we should
00:29:20.020 not allow that to happen. But maybe you can make that argument. My point is that if we're not willing
00:29:25.760 to pass those kinds of laws for anything, well, then AI will just replace all of us and we will live
00:29:33.020 in a future in a world run and operated by computers. And not computers that people are
00:29:38.800 commanding and running, but computers that are operating on their own. I mean, when people talk
00:29:44.180 about the promise of AI and the excitement of AI, that's what they mean. It's what makes it so
00:29:49.280 promising. It's that it's just, it can do this stuff without people being involved hardly at all,
00:29:54.080 and eventually not at all. And we will just be walking willingly into an unhuman future,
00:29:59.840 a future where humanity has been devalued into nothingness. And of course, you can look at jobs
00:30:07.040 where this has already happened, where the human element has been basically removed.
00:30:11.340 Take something obvious like grocery store cashiers, or really cashiers in general,
00:30:16.500 but let's just take grocery stores. This is not AI, you know, but it is automation. And
00:30:21.640 so I think back to when I was a grocery store cashier many moons ago, maybe 24, 25 years ago,
00:30:28.360 there was no self-checkout at that time. At least, if it existed, it was not prevalent.
00:30:33.620 We didn't have it where I worked. So we had whatever it was, 10 or 12 checkout lanes.
00:30:40.560 And during the busy times, every lane was manned by a human cashier. And most of the cashiers had
00:30:46.860 baggers, you know, had people that were still behind them and did the bagging. And then outside,
00:30:52.860 we had parcel pickup, which really doesn't exist anymore. But that's where you had people that
00:30:58.520 stood outside. And not only were they putting carts away, but they would actually,
00:31:02.300 you know, help the old lady bring the bags to her car, load the bags in the car.
00:31:10.020 And this was a relatively small grocery store, but there were many jobs. There were many, many jobs,
00:31:14.900 mostly done by young people. I was like 14 or 15 when I was working the cash register. I was terrible at it.
00:31:21.300 I got booted out to parcel pickup because I was throwing, you know, watermelons in with eggs and
00:31:29.100 that sort of thing. But anyway, there were a lot of jobs being done. And because we had actual human
00:31:35.340 beings working all these jobs, it created a kind of a community atmosphere, a relationship with the
00:31:40.360 customers. I remember we had these old men. One of them looked like, I remember, looked like the old
00:31:47.180 guy from Home Alone, the old guy with the shovel and the beard in Home Alone. And these retired guys,
00:31:54.200 and they would come to the grocery store. This is what they did every single day. Every day,
00:31:57.080 they'd come to the grocery store. They'd get a donut and coffee. They'd sit outside on a bench
00:32:00.920 outside the grocery store, and they would just sit there. They would talk to the employees. They
00:32:04.500 would talk to the customers. Everyone kind of knew each other. None of that exists anymore.
00:32:11.280 You know, most of those jobs just don't exist. They did not evolve. They didn't become something else;
00:32:14.980 they just don't exist. They're gone. Everything's automated. And so you go to the grocery store now,
00:32:19.560 and you talk to no one. You look at no one. You check out your own groceries. You have no
00:32:25.060 interaction with anybody. And you leave. And the few employees who still work there have not much to
00:32:31.420 do. They're much more removed from the customer. I mean, you have one employee just kind of standing,
00:32:37.880 monitoring eight self-checkouts, right? And there's no relationship with the customers at all.
00:32:44.500 There's no relationship between the customers, you know, at all. And you just can't convince me that
00:32:50.840 this setup is better than it was back when I worked these jobs. Like, it was just better. It was
00:33:01.120 better then. It was better in almost every way. The new setup is
00:33:06.620 better for the CEOs of these companies because self-checkout is cheaper, and so it helps their
00:33:11.560 bottom line. But it's not better for anyone else. It's not better for the employees because there are
00:33:15.500 a lot less of them, and they're a lot less engaged. It's not better for the customers,
00:33:19.600 ultimately. I mean, yeah, it's quicker for the customers, but quicker is not always
00:33:24.660 better is what I'm saying. As human beings, it's not always better. And so I'm not saying
00:33:29.580 that we should ban self-checkouts, okay? That's not my point; that ship has long since sailed. I mean,
00:33:35.660 that's just, that's already happened. There's no putting the toothpaste back in that tube.
00:33:41.860 What I'm saying is that a world where this process plays out everywhere with everything
00:33:49.560 is not a world that any of us should want to live in. And none of us will actually be happy
00:33:57.960 with it when it arrives. So that's something we need to think about. And as we're talking about
00:34:04.940 embracing AI and all the promise of AI, there needs to be like a serious conversation
00:34:11.240 about, again, where these lines are drawn. And that is going to mean somewhere saying,
00:34:20.560 yeah, AI could do this. It could replace this whole industry, but we will not let it.
00:34:25.660 We're just not going to let that happen. And, you know, there's
00:34:32.060 a certain kind of conservative for whom the market is king, always. And for that type
00:34:37.880 of conservative, that's just a conversation they never want to have. They're very uncomfortable
00:34:41.920 with any notion of ever putting any kind of roadblock in the way of making things just quicker
00:34:49.680 and cheaper. But if we are human, if being human means something, then it has to mean
00:34:56.860 that, you know, there are things we value even above quickness and cheapness and efficiency.
00:35:08.500 And the thing that we should value above that is like having a meaningful life, human connection.
00:35:15.300 I mean, these things actually do matter. You can't measure them, but they do matter.
00:35:20.280 All right. Here's Trump talking about why he is requiring federal workers to work from the office.
00:35:30.020 Let's listen.
00:35:31.780 We talk about reporting to work, right? I happen to be a believer that you have to go to work.
00:35:37.300 I don't think you can work from a home. I don't know. It's like there's a whole big,
00:35:41.780 oh, you can work from home. Nobody's going to work from home. They're going to be going out.
00:35:46.120 They're going to play tennis. They're going to play golf. They're going to do a lot of things.
00:35:48.920 They're not working. It's a rare person that's going to work. You might work 10 percent of the
00:35:53.500 time, maybe 20 percent. I don't think you're going to work a lot more than that. And I think
00:35:58.160 they have an obligation to work. And they have an obligation not to have a second job when they're
00:36:03.060 supposed to be working for the federal government. You're going to find that a lot of these people
00:36:06.780 have second jobs instead of working. They'll be collecting a federal government check and they'll
00:36:12.280 be working two jobs. And that's that's big trouble for them. So he's right about this, of course. I
00:36:19.220 mean, we can have the debate about working from home generally. But one thing for sure, in my view,
00:36:22.760 is that government workers should be at the office because they're working on the taxpayer dime.
00:36:27.520 They need to be monitored. We need to be sure that they're actually doing their jobs. And if they have
00:36:31.520 a job that doesn't require them to do much of anything at all in the first place, then the job should be
00:36:34.600 liquidated. You know, I know I just talked about preserving jobs, but this is one area where I'm
00:36:38.920 okay with wiping out as many jobs as we can. That's the federal government. You know, that's one area
00:36:44.820 where drastically shrinking the number of jobs available is a positive step. So for sure, government
00:36:52.180 workers should be at the office. You know, taxpayers shouldn't just have to take their word for it. All these federal
00:36:56.980 employees are saying, no, I work even more when I'm at home. Yeah, okay, you say that, but we don't
00:37:03.000 know that. And I don't trust you. Sorry, I shouldn't have to. We shouldn't just have to take your word
00:37:08.220 for it that you're not going off and playing tennis half the time. So you should be at the office. And of
00:37:14.040 course, we know that even in an office, a lot of time can still be wasted. But if there's any hope
00:37:19.500 of holding people accountable, you got to require them to go to the office. And so that's for government
00:37:27.640 jobs. I do think this applies to many jobs in the private sector as well. Not so much the monitoring
00:37:32.180 piece. That's not what it's about. But I think, look, I just, I don't think you can totally replace
00:37:37.240 in-person collaboration with communication over the phone or through a screen. I think there is real
00:37:42.140 value to being in-person physically together. And it's not value that can be precisely quantified
00:37:51.280 or measured. The inability to measure it, though, doesn't mean that there's no value. It means the
00:37:56.320 value is literally immeasurable. And this actually does kind of go back to the AI point.
00:38:02.920 You know, we have to start valuing things of immeasurable value. Like we cannot demand that
00:38:09.020 everything, oh yeah, well, you say that has value. Show it to me on this chart. Show me the study.
00:38:14.320 Show me the data that proves that this thing has value. Not everything can be reduced to that.
00:38:21.700 Um, so, and now I don't feel as strongly about the work from home stuff as I do about AI.
00:38:31.800 A future where everybody is working from home is not nearly as bleak to me as a future where
00:38:36.100 nobody is working because algorithms are just doing everything. Um, but I do, I do think that
00:38:42.060 this is still important. Okay. Some anti-Trump protests this week. As a change of pace,
00:38:48.240 it seems that some leftists are out protesting Trump. So it's not something you see very
00:38:52.880 often. It had been at least three or four hours since the last one. So I have two clips and
00:38:58.600 honestly, I'm not sure if these are from the same protest. I think they probably are, but they all kind
00:39:02.980 of bleed together. It doesn't really matter. So first here is representative Maxine Dexter of Oregon
00:39:08.720 with, uh, with, uh, her, well, just listen. I just, I, I've been told I have 30 seconds. So I am going
00:39:17.620 to tell you that we do have to, I don't swear in public very well, but we have to Trump.
00:39:25.720 Trump, please don't tell my children that I just did that. Um, I mean, don't tell your husband
00:39:46.660 either. That's a, you know, well, not that you probably don't have a husband. I don't know. Um,
00:39:50.940 I, uh, so I, you know, look, Trump has a, he gets a lot of threats, but out of all the threats
00:39:58.960 that Trump has ever gotten, this has to be the most terrifying for him. You know, that,
00:40:05.540 and that woman looks like she means it. That woman, she's on a mission. She has to, she says she has to,
00:40:13.060 she has to F Trump. I mean, that's what she said. She, she must, she's, she's overwhelmed by her
00:40:20.120 carnal desire for president Trump. If I were Trump, I would be right now. And look, I like to
00:40:26.760 think that I've, I've, um, that I can be brave in some situations, but if I were Trump, I'd be
00:40:33.140 holed up in a White House bunker right now, surrounded by Secret Service, hiding under a
00:40:38.160 table, trembling in fear, knowing what this woman intends to do to me. So that's,
00:40:44.180 I mean, that's pretty bad, but it's not as bad as this. Now, this is from the same
00:40:51.840 protest or a different one. I don't know. It doesn't matter. All I can tell you is that
00:40:54.860 the clip I'm about to play is a true test of your mental fortitude. Um, it's two minutes long. I don't
00:41:04.040 know if we're going to play the whole thing. We probably can't, but I challenge you, I challenge you
00:41:10.500 to not hit mute. Don't turn off the show. You're going to want to. I'm warning you ahead of time.
00:41:18.360 You're going to want to, you're going to want to take your phone or your computer and throw it into
00:41:22.240 a bathtub full of acid. If you happen to have one in the house, but I challenge you to stick with it.
00:41:27.980 We'll get through this together and we will be better for it. Okay. Here it is.
00:41:33.980 Which side are you on? Which side are you on? Which side are you on? Which side are you on?
00:41:47.760 We'll fight against Doge. We'll fight Elon Musk. No, we won't let scabs within our walls. We'll fight from
00:41:58.980 dawn to dusk. Oh, which side are you on? Which side are you on? Which side are you on?
00:42:13.900 Trump's coming for our unions. He wants us all to fail. He wants us to bow to him, but we want him in jail.
00:42:26.220 Oh, which side are you on? Which side are you on? Tell me.
00:42:35.100 I mean, I'm on the side of the, of the deaf community. That's the side I want to be on. That's,
00:42:39.580 that's, that's who I want to be with. You know, I saw a video recently of, I think it was a Siberian
00:42:47.840 Husky, a dog trying to sing. And well, he was just howling, but the, you know, someone was playing
00:42:56.280 the piano and they said it was singing. It's not really singing, but he was a significantly better
00:43:00.220 singer than any of the people in that video. Because that sounded like 50 Huskies with mental
00:43:06.720 illnesses drowning in a river while being eaten by piranhas. Or maybe that's just the fate that I
00:43:17.020 wish I would have suffered instead of having to hear that. I'm depressed now. I actually feel sick
00:43:25.400 in my soul after I, I do. I feel, I feel this deep well of despair in, in the, in the depths of my
00:43:34.760 soul after listening to that. And I know we said we get, that we'd get through it and we'd be better
00:43:40.760 for it, but we're not, we're not better. I was wrong. We are not better for it. I actually think
00:43:45.060 that all of our lives are ruined now. I think I just ruined your life and my own because your dreams
00:43:51.320 will be haunted by the sound of boomer feminists singing so off pitch that it actually causes brain
00:43:58.260 damage when you hear it. And I, I already had brain damage. So I don't, I, you know, I can't afford
00:44:02.780 this. I cannot afford to have suffered this, but I just did. And, um, wow. Anyway, I can't even think
00:44:15.340 anymore. I've totally lost my train of thought. This is really the left's problem in a nutshell,
00:44:20.100 illustrated in both of those videos. And, um, you know, the problem is they can't sing and they
00:44:26.300 want to have, they secretly want to have sex with Donald Trump. I mean, that's, uh, that's the
00:44:30.640 surface level problem, but what's underneath it is that they're just lost at sea and they have no
00:44:35.340 message. And then you end up with this. I think we just need to quit while we're
00:44:45.340 ahead. Let's get to the, uh, comment section. Tax season, that magical time when we all remember
00:45:00.220 how much fun it is being a responsible adult. Still haven't filed those returns? Well, if
00:45:04.940 you're operating on a maybe-if-I-ignore-it-it'll-go-away type strategy, let me save you some
00:45:09.460 suspense: that is not going to work. The IRS has a whole menu of ways to make your life,
00:45:15.480 let's say, interesting. Wage garnishments, frozen accounts. And if you're really lucky,
00:45:20.180 maybe you get some property seizures thrown in. It's like a game show where all the prizes
00:45:24.320 are things you do not want. And now that we're in tax season, they're feeling particularly motivated.
00:45:29.080 But before you consider moving to a remote island with no extradition treaty, look, there's
00:45:33.400 tax network USA. They've been playing this game of tax chess for years. And unlike most of us who can
00:45:39.060 barely remember which receipts to save, they actually know what they're doing. They've helped
00:45:43.560 taxpayers save over a billion dollars in tax debt and filed over 10,000 tax returns. Because let's
00:45:48.400 face it, you've got better things to do than argue with the IRS about tax deductions. Look, I get it.
00:45:53.060 Dealing with the IRS is about as fun as a root canal, but ignoring the problem is not going to make
00:45:56.880 it go away. So here's what you got to do for a complimentary consultation. Call today at 1-800-958-1000
00:46:02.340 or visit their website at TNUSA.com slash Walsh. That's 1-800-958-1000 or visit TNUSA.com slash Walsh
00:46:10.080 today. Don't let the IRS take advantage of you. Get the help you need with Tax Network USA.
00:46:16.060 No amount of Sesame Street will influence a culture. I was raised in Puerto Rico, U.S. territory. We have
00:46:20.640 Sesame Street and absolutely every American movie, music, and show. Assimilation has never been
00:46:25.980 accomplished. They have their own culture and identity, very different from Americans. We are lovely
00:46:29.360 people, by the way, but Sesame Street has nothing to do with it. Right. And by the way, that's precisely
00:46:35.680 why Puerto Rico should never be the 51st state, because it's not American. As you point out, it's
00:46:42.220 just not American. It has some aspects of American culture, mainly
00:46:47.020 American media and entertainment, but it's not American and it will never be.
00:46:51.980 So if parents are making the decision to lobotomize their kids with their child's doctor, that would
00:47:00.060 make it okay? The why do you even care doesn't affect you argument is so stupid. Yeah, I've
00:47:05.260 made the lobotomy comparison many times. I mean, people listen to the show know that this is something
00:47:09.020 that through the years of combating gender ideology and the gender transition racket, I've
00:47:16.840 brought up this comparison. And if you aren't familiar with it, if anyone isn't familiar,
00:47:22.400 you should really go back and look into it. Because this was a widely accepted medical practice
00:47:28.800 for many, many years. And lobotomy was as barbaric as it sounds. And when you go back and research this
00:47:38.180 stuff, you'll be shocked to see how often the most barbaric and disturbing forms of quote-unquote
00:47:44.320 medical treatment became widely accepted and practiced by medical experts. And the thing is,
00:47:50.080 you don't have to go back to the Middle Ages, right? You don't have to go back to a time of
00:47:53.740 bloodletting and using leeches for whatever. No, you don't have to go back far at all. I mean,
00:48:01.340 this has been happening in modern medicine for a long time. And in fact, the most barbaric practices
00:48:08.920 that are practiced by and accepted by mainstream medicine are happening today. We have gender
00:48:17.720 transitions, which hopefully are on their way out, but then abortion also. And even that, you'll hear
00:48:26.360 the argument all the time that abortion has existed forever and people have always, and it's always
00:48:33.360 been a practice. And in a sense, yes, in the sense that murder itself has always existed. Like people
00:48:40.180 have always murdered each other since Cain and Abel. I mean, so it's always been a part of human
00:48:47.000 existence and abortion is murder. And so in that sense, yes, we've always had it ever since the fall
00:48:55.500 of man. But the difference with abortion now is not just the fact that it's legal and there's
00:49:10.440 a whole industry, a billion-dollar industry, behind it, but that it's presented as a medical procedure.
00:49:16.800 You know, if you go back centuries to find societies where unborn children were killed and that did
00:49:26.040 happen, they didn't pretend that it was medicine. And so this is an innovation. So abortion itself
00:49:36.060 is not an innovation of modern medicine, but pretending that it is medicine is an innovation of modern
00:49:44.020 medicine. And so when you look at these things, when you look at, you know, it wasn't all that long
00:49:50.480 ago that mainstream medicine would have told me that lobotomies are okay. It is in the
00:49:57.360 current day that mainstream medicine tells me that, you know, we should chemically castrate
00:50:01.280 children and that dismembering an infant in the womb is a form of medical care. And when you look at that
00:50:06.800 stuff, there's no way your faith in the medical industry is not almost entirely
00:50:14.500 destroyed. And, uh, anytime you say that, you'll always hear from the other side that, well, well,
00:50:22.800 like you're telling people not to trust their doctors. You know how dangerous it is to tell people
00:50:25.820 that? It is dangerous, I agree. Like it is not good that we can't trust the medical industry.
00:50:34.060 And I don't, I don't have an exact answer for that. I don't, I don't know how to make that better
00:50:39.460 right now. I don't have some, like, there's nothing I can say that will, I agree. It's a terrible
00:50:44.300 situation to be in. Uh, but it is the situation we are in and that's not our fault, right? That's the
00:50:54.440 fault of the medical industry. Um, rather than funding shows like Sesame Street, why not encourage
00:51:03.720 missionaries to go to these countries and share the gospel? Churches already fund overseas missions,
00:51:07.600 so the government doesn't need to spend anything. The government can focus on ensuring missionary
00:51:11.180 safety and access. The gospel can change lives in ways that Big Bird never will.
00:51:17.120 Yeah, that's, that's, uh, precisely it. I talked about the problem of spreading our values, you know,
00:51:22.780 this idea that we need to go and spread our values overseas. We heard this
00:51:29.400 from Senator Chris Coons yesterday when he was arguing for taxpayer funding for Sesame Street
00:51:34.980 in Iraq and Afghanistan, wherever else. Um, but one of the biggest problems with quote unquote
00:51:44.220 spreading our values is that the people that are doing the spreading, you know, the establishment,
00:51:50.980 I don't agree with what they consider our values to be in the first place. And most of us don't agree.
00:51:58.240 Because these are the same people. You bring up missionaries. Well,
00:52:03.900 I feel rather confident that Chris Coons, who says that we need to spread our
00:52:09.440 values through Sesame Street and taxpayer-funded theater plays for the LGBT community
00:52:15.180 or whatever the hell, that's what he thinks. But if you were to ask him about missionaries
00:52:21.240 going into these third-world countries, I bet you he would tell you that, well,
00:52:26.100 that's a little problematic, you know, imposing your belief system is
00:52:32.080 problematic. So the same people that say we should spread our values will tell you that
00:52:34.980 missionaries are problematic, because they don't actually think that we should
00:52:39.180 just be spreading values. They have a very specific set of values, and those are
00:52:43.500 left-wing secular values that they want to spread. And that really is the fundamental problem.
00:52:48.240 When you join Daily Wire Plus, it's not just a subscription. It is a statement, a refusal to
00:52:52.720 be spoon-fed the nonsense shoved down your throat by the media and Hollywood and every self-righteous
00:52:57.940 blue-haired activist with a TikTok account. You've heard the lies, you've seen the manipulation,
00:53:02.060 you know the game is rigged and you refuse to be played. Becoming a Daily Wire Plus member isn't
00:53:06.220 just about access to content. It's about standing up for truth in a world that treats truth like a
00:53:10.540 disposable inconvenience. It's about rejecting propaganda. It's about demanding facts and logic and
00:53:15.740 reality when the culture wants you gaslit into submission. So when you join, you know exactly
00:53:20.620 what you're doing, backing a movement that doesn't just report on the culture, but reshapes it. Every
00:53:24.740 dollar you spend goes directly into building the future because America's future won't build itself.
00:53:29.700 Join the fight today at dailywire.com slash subscribe.
00:53:38.900 You know, it's no secret that it's very hard, if not impossible, to sympathize with the vast majority
00:53:43.400 of lawyers. We have lawyer jokes for a reason after all, and it's especially true when lawyers
00:53:48.540 are bringing frivolous lawsuits and doing the whole ambulance chaser routine. You'll find more
00:53:55.220 popular support for acne and herpes than you'll find for these people. But even with that in mind,
00:54:00.320 there are a couple of lawyers that, if you have any heart at all, you have to feel kind of bad for
00:54:06.080 at this particular moment. These are lawyers who are clearly being punished for some sins they
00:54:10.360 committed in a past life, and those sins must have been truly horrific because right now they're
00:54:14.620 having a very bad time. I'm talking, of course, about the legal team of a Detroit rapper who uses
00:54:18.900 the name Dank DeMoss. These are not ambulance chasers so much as they are, I guess you would say,
00:54:24.300 chubby chasers in the most literal sense. You might remember that a couple of weeks ago we discussed
00:54:28.260 the sordid story of Ms. Dank DeMoss. She's the woman who was far too obese to fit into a Lyft. She
00:54:34.340 weighed something like 500 pounds. So the driver told her that she couldn't enter his vehicle.
00:54:39.360 And in case you somehow missed the story, despite its plus-sized importance to our country,
00:54:45.320 and indeed to the whole solar system, here's a very quick recap.
00:54:50.920 Blanding tells us she was just trying to get to a Detroit Lions watch party this month when her
00:54:56.060 Lyft rolled up. As I'm walking, I see him like making faces or whatever. I'm like, oh, man.
00:55:00.520 She already knew. I can fit in this car.
00:55:03.380 No, believe me, you can. Yes, I can.
00:55:05.940 Believe me.
00:55:06.480 He told her there's not enough room in his car.
00:55:09.960 The kicker part was when he started to talk about his tires. You know, I feel like that
00:55:14.160 was a slap in the face. That was like my tires. You know, like.
00:55:18.360 The driver said his tires could not handle her weight.
00:55:22.260 Every big person you turn down because they can't fit in your car?
00:55:25.220 Yeah, because they need to order the Uber XL.
00:55:29.700 No, I don't never have to order Uber XL.
00:55:32.240 Now, as we talked about at the time, Madam Dank does not come across as a sympathetic figure in
00:55:39.700 this footage, but her lawyers saw an opportunity there. They claim that in Michigan, it's a crime
00:55:43.800 to discriminate against anybody on the basis of weight. They said that being obese is basically
00:55:47.900 a protected class. So in this case, they're arguing that the Lyft driver might as well have
00:55:52.360 said no black people are allowed in his car. They're equating this woman's decision to be heavier
00:55:57.180 than a polar bear with being born with a certain skin color. And now they want Lyft to pay up
00:56:01.400 millions of dollars. If Dank could not fit in his car or exceeded its weight limit, then I guess by
00:56:06.900 these lawyers' logic, it was his responsibility to come back with a dump truck or whatever vehicle
00:56:10.900 could handle the load, you know. But I have to admit, as something of an amateur observer in the
00:56:16.300 area, it didn't seem like the most compelling legal argument at the time. Kind of seemed like a
00:56:21.860 shakedown where they're basically just demanding a payout from Lyft so that the story goes away.
00:56:26.280 Naturally, that led me to the conclusion that Ms. Dank's lawyers deserve no sympathy whatsoever.
00:56:29.980 But then I saw this footage of Dank DeMoss appearing on The Breakfast Club. And in just the first 30
00:56:35.820 seconds of her appearance, Dank DeMoss completely obliterates her lawyers' case. She wastes absolutely
00:56:40.640 no time in wrecking every hour of work that these attorneys have put into this lawsuit. It's like
00:56:46.320 watching someone spend a month laying the foundation for a house only to see a morbidly obese woman
00:56:51.660 walk on top of it and collapse the whole thing. Metaphorically, I mean. And one might even say that
00:56:58.300 Dank DeMoss, you know, chewed up her lawyers' case and then in an uncharacteristic display of
00:57:03.380 restraint, spat it out. There is nothing left to work with at this point after these 30 seconds.
00:57:09.560 And here's the moment that I'm talking about.
00:57:12.360 Wake that ass up.
00:57:14.640 In the morning.
00:57:15.640 The Breakfast Club.
00:57:16.900 Peace, big day.
00:57:18.900 How you?
00:57:19.900 Good morning.
00:57:20.900 You good?
00:57:21.900 Good morning.
00:57:22.900 What's going on? Nice to meet you.
00:57:23.900 Hi. How are you?
00:57:24.900 Stormy. How you doing?
00:57:25.900 Peace, Keith. How you, brother? You good?
00:57:26.900 Nice to meet you, Keith.
00:57:27.900 Got it.
00:57:28.900 Yep.
00:57:29.900 It's the only seat y'all got?
00:57:31.900 What you want? What you need?
00:57:33.900 A bigger chair or something.
00:57:35.900 This is what I'm talking about. Good. This is accommodation.
00:57:42.900 There you go.
00:57:43.900 Now, in case you couldn't make it out, Dank walks into the studio and then tries to sit
00:57:54.740 in the chair and immediately asks, is this the only seat y'all got? She's not satisfied
00:58:00.520 with the chair that they're offering, even though it looks pretty large. And the other people
00:58:05.700 are confused because they're like, well, what do you mean? It's a human chair. The only
00:58:12.580 chair. It's a chair. What other kind of chair do you want? And then, I mean, it's like if
00:58:17.920 she walked in and like broke through the floor and then said, is that the only floor y'all
00:58:22.020 got? You got any other kinds of floors? No, it's a floor. It's a floor. It's like, the
00:58:27.260 floor, it's a floor. It's worked perfectly fine for every other person that's ever walked
00:58:29.980 across it. But then she said she needs a bigger chair, so they roll out a couch for her.
00:58:34.140 Which, by the way, in case you're not familiar, a couch is a seat
00:58:40.540 made for multiple people, like three or four people. And she took up the whole thing.
00:58:47.500 Now, I'm not a lawyer, but if you're trying to make the claim that you could fit in a
00:58:51.200 Lyft driver's tiny sedan and that he violated the law by turning you away, then it stands
00:58:55.800 to reason this particular moment poses a few problems for your case. Dank, by her own
00:59:00.600 admission, cannot fit into an oversized office chair in a large studio with no other obstructions.
00:59:07.220 Unlike in a sedan, there's no roof over your head, or at least not one directly
00:59:11.060 over your head. There are no doors right next to you, right? There's no seat belt you have
00:59:15.400 to put on. Total flexibility, but it still was not enough. She needed an entire couch.
00:59:20.820 So by her logic, if the Lyft driver was bigoted, then Dank is also bigoted against herself,
00:59:27.160 I guess. If the law still means anything in this country, which it probably doesn't,
00:59:30.720 if we're being honest, then Dank DeMoss' case has just imploded. So pour one out for her lawyers,
00:59:36.140 but the interview didn't end there. Somehow everybody maintained their composure after
00:59:39.500 this little snafu. And as the conversation continued, Dank actually drew something of an
00:59:43.940 interesting parallel that's worth taking note of. Here it is. Oh! I'm not saying that word.
00:59:50.820 Yeah, we don't use the word in my house either. I thought you were talking about the gay slur.
00:59:53.760 I'm like, wait, I'm confused. No, no, no, no, no, no, no, no, no. But I just feel like it should
00:59:58.000 be accommodated. Bigger people should be accommodated. Now you won't find a more succinct explanation
01:00:05.060 for why we should never pander to any group of deranged activists for any reason. She's saying
01:00:09.120 that because society tolerated the insanity of gender ideology, we should also tolerate her insanity.
01:00:14.840 And in a certain way, she has a point. I mean, that's why the solution is to never accommodate
01:00:19.360 any of this nonsense at any point. Once you start entertaining complete and total lunacy,
01:00:24.040 then you get a lot more of it. And as the interview continued, that's exactly what happened.
01:00:29.340 Things really went off the rails, particularly when DeMoss' healthcare provider, who uses the name
01:00:33.040 Stormy, showed up to the set. Watch as Stormy, who of course is wearing her white doctor coat for the
01:00:39.260 interview, explains that Ms. Dank isn't really responsible for weighing 500 pounds.
01:00:43.400 Instead, we're informed that the real culprit is Dank DeMoss' thyroid problem. Watch.
01:00:49.540 I do. I work on myself, you know, and when I feel like I'm getting, it's getting too much,
01:00:57.320 I try to fix it, you know, like, at my pace, you know, but, you know, you want to see her?
01:01:04.020 Yeah, and we have a weight specialist right here. Fix her a weight specialist. Fix her a mic.
01:01:07.800 And what's your name, ma'am?
01:01:10.040 Stormy Anderson.
01:01:10.560 Stormy.
01:01:11.360 Stormy Anderson.
01:01:12.420 Stormy gonna get it right.
01:01:13.380 Dr. Stormy.
01:01:14.160 Thank you.
01:01:14.660 One time, you know, when you have these conversations with plus-size people, it always comes up that,
01:01:20.640 oh, everybody that's, you know, big isn't unhealthy, and, you know, some people just can't lose the weight.
01:01:25.280 They're dealing with issues. Like, you know, she has a thyroid. What do you, talk to us about that.
01:01:28.480 So, I've been Dank provider for a little over three months, right, and she's lost 80 pounds.
01:01:34.080 Your 80 pounds gonna look different from her 80 pounds, right, because she started at 580.
01:01:39.100 So, even with her being at 500, people be like, she's still not losing, but that's not the case.
01:01:45.000 Well, her thyroid makes it very difficult for her to lose weight because her hormones is unbalanced.
01:01:49.680 So, when your hormones is unbalanced and they're all over the place, it controls a lot of things,
01:01:55.380 and it makes it very difficult for you to lose weight, but keep it off as well.
01:01:59.220 So, I think for her, she's consistent because she done lost 80 pounds.
01:02:05.840 It's a medical provider, a doctor.
01:02:08.860 She done lost 80 pounds. She done lost it.
01:02:13.900 But she's wearing the white coat, so it's okay. That's what's important.
01:02:19.340 Let's just assume here that Dr. Stormy actually collected the blood work and analyzed it.
01:02:22.820 Let's assume that Dr. Stormy is correct, that indeed, Dank DeMoss has a thyroid problem.
01:02:28.180 That's a real thing. It's possible.
01:02:30.460 But there is no thyroid condition on the planet that causes anyone to weigh a quarter of a ton.
01:02:39.940 Okay.
01:02:40.240 There is no thyroid condition that makes you gain so much fat that you now weigh more than two full-grown male ostriches.
01:02:49.340 I looked it up.
01:02:52.540 Okay.
01:02:52.920 You cannot blame a medical condition for the fact that you weigh as much as a Harley-Davidson motorcycle.
01:02:59.080 Okay.
01:02:59.440 And that's probably enough weight comparisons.
01:03:02.300 I'm just trying to put it into perspective.
01:03:03.660 At most, thyroid conditions make it a bit harder for some people to lose weight and easier to gain it.
01:03:11.360 That's it.
01:03:12.580 The only way that it's possible to exceed 500 pounds is to commit to a lifestyle of extreme sloth and gluttony.
01:03:20.440 I mean, it requires an obsessive commitment to at least two of the seven deadly sins.
01:03:26.760 And if you disagree with that assessment, then your problem isn't with me.
01:03:29.440 Your problem is with the laws of physics.
01:03:32.140 Energy doesn't spontaneously create itself.
01:03:35.080 Calories are a measure of energy.
01:03:37.920 Therefore, calories do not appear out of thin air and just invade Dank's body.
01:03:43.440 She is consuming them.
01:03:44.920 And in fact, based on her size, she's probably consuming them at every available point in the day.
01:03:51.240 But Stormy won't acknowledge any of this because she wouldn't make as much money if she did.
01:03:57.480 Dank wouldn't hire her.
01:03:59.260 Or at least there would just be one consultation, where she would say,
01:04:04.000 okay, well, how much are you eating?
01:04:06.000 And Dank would say, here's what I'm eating.
01:04:08.700 And then she would say, okay, eat like a tenth of that.
01:04:12.560 Let's start.
01:04:13.420 Cut out 90% of what you're eating.
01:04:17.100 And you will start losing weight right away.
01:04:20.440 Okay?
01:04:20.740 It's like that easy.
01:04:22.780 But she doesn't want to say that.
01:04:24.240 And she also would not get to appear on shows like Breakfast Club if she did.
01:04:28.200 So we've talked a lot recently about how doctors and scientists have discredited themselves.
01:04:33.620 And this is another way.
01:04:35.360 They've refused to be honest about what actually causes obesity.
01:04:39.660 Lazy people who eat too much become fat.
01:04:43.360 Extremely lazy people who eat extreme amounts of food become extremely fat.
01:04:47.360 That is the whole formula.
01:04:49.640 But the medical industry has not been honest about it.
01:04:52.380 And it's not the first time that this has happened.
01:04:54.660 I mean, the medical industry took a similar approach with something like HIV.
01:04:58.400 They were not, and still are not, honest about the fact that pretty much the only people
01:05:04.620 who get HIV, the only people who do get it,
01:05:09.240 are homosexuals and intravenous drug users.
01:05:11.680 So if you don't use intravenous drugs or engage in sodomy, you almost certainly won't get it.
01:05:19.100 You might get other diseases, but you almost definitely will not get that one.
01:05:25.680 But doctors didn't want to seem insensitive to homosexuals,
01:05:28.500 so they pretended that HIV is an equal opportunity disease.
01:05:31.780 It is not.
01:05:33.400 More recently, we saw the same approach with monkeypox.
01:05:35.820 We were supposed to believe that this was some grave public health emergency
01:05:39.200 that would affect, say, a straight 80-year-old man
01:05:42.260 just as much as it would affect a gay 25-year-old in Manhattan.
01:05:46.380 The common thread here is that there is a refusal to be honest
01:05:49.860 in the name of sensitivity, which has led to a lot of needless death and suffering.
01:05:55.180 It's also led to a ton of footage that is frankly embarrassing to this country
01:05:58.400 and to humanity in general.
01:06:00.020 I began this segment with some of that footage,
01:06:01.720 so it's only fitting to end things in the same way.
01:06:04.000 Here, reportedly, is one failed stunt performed by Dank DeMoss in Detroit
01:06:10.800 not too long ago.
01:06:13.860 Watch.
01:06:22.240 If you're listening to the audio podcast, she fell over.
01:06:24.800 That's what we just played.
01:06:26.780 Just to be clear, the failed stunt,
01:06:28.600 the stunt she was trying to perform was standing.
01:06:32.200 Okay, that's...
01:06:34.120 She tried to pull off the incredible stunt of standing,
01:06:38.760 and she was not able to.
01:06:40.400 So she fell over.
01:06:41.500 It doesn't quite stick the landing.
01:06:43.000 And yes, I played that clip, you know,
01:06:46.620 mainly for the entertainment value,
01:06:48.120 but if I had to find some other reason to justify playing it,
01:06:51.020 I'd say the footage of Dank DeMoss falling over backwards in front of a crowd
01:06:53.980 is, in fact, a solid metaphor for this whole story.
01:06:56.780 Dank's story began with so much promise,
01:06:59.700 her lawyers thought they had a multi-million dollar case against Lyft.
01:07:03.560 Her doctor thought she had a totally plausible explanation
01:07:05.900 for why her patient weighs more than a young hippopotamus.
01:07:08.740 And then it all fell apart
01:07:09.900 when people thought about it for about five seconds.
01:07:12.660 It collapsed, like so many other stationary objects in Dank's life.
01:07:16.940 And that is why Dank DeMoss and her doctor, Stormy,
01:07:20.680 who claims that people can balloon to 500 pounds
01:07:22.900 because of a thyroid problem,
01:07:24.900 are today canceled.
01:07:27.320 That'll do it for the show today.
01:07:28.040 Thanks for watching. Thanks for listening.
01:07:30.660 I see what you're about to do.
01:07:32.640 I'm just going to take my earpiece out
01:07:33.680 so that I won't hear it.
01:07:35.740 Anyway, have a great day.
01:07:38.280 Godspeed.
01:07:39.160 Which side are you on?
01:07:41.380 What's up? Can't hear it.
01:07:42.340 Which side are you on?