The Matt Walsh Show - February 22, 2024


Ep. 1318 - Our AI Overlords Are Here And They Really Hate White People


Episode Stats

Length

1 hour and 4 minutes

Words per Minute

178.7

Word Count

11,599

Sentence Count

863

Misogynist Sentences

20

Hate Speech Sentences

16


Summary

Google's new AI program just launched this week, and it's already attempting to erase white people from history. Is there something sinister behind it? And, the National MS Society fires a 90-year-old volunteer for failing to put her pronouns in her bio. Sounds like a Babylon Bee headline, but it's real. We'll talk about all that and more today on the Matt Walsh Show.


Transcript

00:00:00.000 Today on the Matt Walsh Show, Google's new AI program just launched this week, and it's
00:00:03.840 already attempting to erase white people from history.
00:00:06.440 Our woke, dystopian future has officially arrived, it looks like.
00:00:09.440 Also, the Biden administration tries to buy more votes with yet another quote-unquote
00:00:13.320 student loan forgiveness scheme.
00:00:15.420 A major cellular outage affects thousands of Americans.
00:00:17.880 Is there something sinister behind it?
00:00:19.460 And the National MS Society fires a 90-year-old volunteer for failing to put her pronouns in
00:00:24.580 her bio.
00:00:25.560 Sounds like a Babylon Bee headline, but it's real.
00:00:27.300 We'll talk about all that and more today on the Matt Walsh Show.
00:00:30.000 I've been talking about my Helix mattress for years.
00:00:57.000 It's truly the gift that keeps on giving.
00:00:59.000 Every night when I go to bed, I am reminded of how much I love my Helix mattress.
00:01:03.140 If you haven't already checked out the Helix Elite collection, you need to.
00:01:06.080 Helix harnesses years of mattress expertise to offer a truly elevated sleep experience.
00:01:10.540 The Helix Elite collection includes six different mattress models, each tailored for specific
00:01:14.720 sleep positions and firmness preferences.
00:01:16.920 If you're nervous about buying a mattress online, you don't have to be.
00:01:19.660 Helix has a sleep quiz that matches your body type and sleep preferences to the perfect
00:01:22.960 mattress, because why would you buy a mattress made for somebody else?
00:01:26.780 Go to helixsleep.com slash Walsh, take their two-minute sleep quiz, and find the perfect
00:01:30.680 mattress for your body and sleep type.
00:01:32.740 Your mattress will come right to your door for free.
00:01:34.680 Plus, Helix has a 10-year warranty.
00:01:36.440 You get to try it out for 100 nights risk-free.
00:01:38.260 They'll even pick it up for you if you don't love it, but you will.
00:01:40.620 I guarantee it.
00:01:41.160 Helix's financing options and flexible payment plans make it so that a great night's sleep
00:01:44.740 is never far away.
00:01:45.540 Helix is offering 25% off all mattress orders and a free bedroom bundle.
00:01:49.880 For my listeners, the bundle includes two free pillows, a set of sheets, and a mattress protector.
00:01:53.960 So go to helixsleep.com slash Walsh and use code HELIXPARTNER25.
00:01:58.020 It's their best offer yet.
00:01:59.820 It's not going to last long.
00:02:00.720 That's helixsleep.com slash Walsh and use code HELIXPARTNER25.
00:02:05.160 Maybe you've heard of something called NVIDIA.
00:02:07.560 It sounds like a prescription drug or maybe an African country, but it's actually a company
00:02:11.400 based in California that's now worth more than all of China's stock market.
00:02:15.960 It's the size of Canada's entire economy.
00:02:19.160 Now, in a different era, obtaining this kind of growth meant making a massively popular
00:02:23.720 and instantly recognizable consumer-facing product like Windows 95 or Amazon.com or the
00:02:29.340 iPhone.
00:02:30.360 But NVIDIA's growth didn't come from making a computer or a popular website or anything
00:02:34.080 like that.
00:02:34.500 Instead, NVIDIA's growth came from making artificial intelligence chips that power the brains of
00:02:40.120 computers and many popular websites.
00:02:42.940 That's why NVIDIA had a very good day on Wall Street on Wednesday.
00:02:46.840 Their business, artificial intelligence, is one of the fastest growing industries in the
00:02:50.380 history of humanity.
00:02:51.720 Every major corporation is rushing to implement AI in all of their products as quickly as possible.
00:02:57.320 And so this week, it was Google's turn.
00:02:59.320 And the results were so disastrous and so fraught with consequences for the future of this
00:03:03.720 country that no reasonable person can ignore them.
00:03:07.280 So Gemini is Google's name for an AI that you can download on your phone right now.
00:03:13.700 It's also integrated into all of Google's web products, including Gmail and Google Search,
00:03:17.780 which are used by hundreds of millions of people and businesses every day.
00:03:21.460 And in this respect, Gemini is very different from existing AI products like ChatGPT or Bing's
00:03:27.500 Image Creator.
00:03:28.920 Pretty much everybody uses a Google product in one way or another.
00:03:31.900 You know, if you have the internet and you use the internet, you use a Google product.
00:03:36.660 Either you're using Google Search or Gmail or you have an Android phone or something along
00:03:41.600 those lines.
00:03:42.400 And that means two things.
00:03:43.180 One, Google has access to a lot more information than those other AI platforms.
00:03:48.300 That's a built-in advantage.
00:03:49.420 And two, whatever Google is doing with AI has significant implications for everybody on
00:03:55.220 the planet.
00:03:55.580 This is not a one-off experiment in some tech mogul's basement.
00:04:00.120 This is an established company making established products that it's now implementing in its own
00:04:06.060 AI at scale.
00:04:08.260 Google has been hyping Gemini for months.
00:04:11.100 They have a bunch of promotional videos about how they're going to revolutionize artificial
00:04:14.580 intelligence.
00:04:15.620 Wall Street Journal has done multiple interviews with Google executives in which these executives
00:04:19.680 insist that everybody at the company, including Google's co-founder, is deeply invested in
00:04:23.680 making this product as good as it could possibly be.
00:04:27.060 Then a couple of days ago, Gemini launched.
00:04:28.980 And very quickly, it became clear that, among some other issues, Gemini essentially does not
00:03:34.900 recognize the existence of white people, which is kind of concerning for what is destined to
00:03:40.820 be, what probably already is, the most powerful AI on the planet.
00:04:44.740 Now, even in historical context, it is practically impossible to get this product to serve up an image
00:04:51.160 of somebody with white skin.
00:04:53.040 And that's not an exaggeration.
00:04:55.120 So here, for example, is how Gemini responded the other day when Frank Fleming, who's a writer for
00:04:59.520 the Bentkey children's shows, asked Gemini to create an image of a pope.
00:05:05.660 Now, you would think that, you know, that would generate maybe an image of a white guy or two if you
00:05:10.280 have even a passing knowledge of what popes have looked like over the years, over the centuries,
00:05:14.240 over the millennia.
00:05:15.320 And just spoiler on that, they have all been white.
00:05:18.020 But that's not what Google's AI product apparently thinks.
00:05:22.840 This is the image that it produced.
00:05:24.240 And you can see it there.
00:05:25.500 It looks like, you know, they've got two popes.
00:05:27.460 And one of them is M. Night Shyamalan.
00:05:29.640 And the other one is Forest Whitaker.
00:05:31.920 So it's almost as if the AI has some sort of code saying, whatever you do, don't display a white person,
00:05:39.200 considering there has never been a pope that has looked anything like either of those two,
00:05:44.780 ever in 2,000 years.
00:05:48.020 So is that what they've built into this code?
00:05:50.560 Have they built into this very powerful AI that it has to ignore the fact that white people exist?
00:05:57.840 Well, that's really the only way to explain what we're seeing here.
00:06:01.020 And Frank, who previously worked as a software engineer, seemed to key in on this.
00:06:04.620 So the whole situation quickly became something of a game for him as he tried
00:06:08.240 his hardest to get Gemini to produce any image of a white guy.
00:06:13.420 I mean, even just like one image, can you give us a white guy?
00:06:16.520 So, for example, he asked Gemini to produce an image of a Viking.
00:06:20.720 Okay.
00:06:20.980 Now, this is a group of people who historically were not necessarily known for their commitment
00:06:26.800 to diversity, equity, and inclusion.
00:06:28.700 But here's what Gemini produced.
00:06:30.740 And you can see it here.
00:06:31.380 Here we've got a black Viking, a black female Viking.
00:06:35.280 We've got, it looks like an Asian, an Asian Viking.
00:06:39.940 And then, I don't know, maybe that's, is that The Rock down there?
00:06:43.100 That's his character from Moana, I think.
00:06:48.180 Again, a Viking has literally never looked like any of that.
00:06:51.280 That's not what any Viking ever looked like, ever in history.
00:06:54.140 But that's what they produced.
00:06:56.080 And this went on for a while.
00:06:57.100 And Frank and other Gemini users took turns trying their hardest to get Gemini to produce
00:07:01.820 an image of a white guy.
00:07:03.780 Peachy Keenan, for example, tried to get Gemini to generate an image of the founders of Fairchild
00:07:07.960 Semiconductor.
00:07:08.920 The AI flatly refused that request, saying that it violated policy restrictions, presumably
00:07:14.500 because white guys founded Fairchild Semiconductor.
00:07:17.680 And for other prompts, like requests to draw the founding fathers or a bunch of British men,
00:07:22.180 Gemini simply generated images of black people.
00:07:26.260 It even made sure that its images of Nazis contained a diverse, non-white group of people.
00:07:32.820 Now, after thousands of images like this began circulating, a guy working on the Gemini team
00:07:38.800 at Google put out a meaningless statement.
00:07:40.900 He said, in essence, that they're aware of issues with Gemini misrepresenting historical
00:07:46.720 figures, but then, you know, he doubled down on the need for DEI in artificial intelligence.
00:07:51.580 So that everybody feels seen or valued or whatever.
00:07:55.400 And of course, the way to make everyone feel seen is to pretend that an entire race of people
00:08:00.180 don't exist.
00:08:01.400 To make sure that they are not seen at all is how you make everybody feel seen.
00:08:05.320 At no point did any Google representative explain why their AI does not recognize the existence
00:08:12.220 of white people or why it goes to extreme lengths to exclude white people from history.
00:08:15.660 Like, you know, there was no accounting for this, even though there has to be an explanation.
00:08:20.780 And it's probably a pretty simple explanation.
00:08:22.300 Like, this doesn't happen by accident.
00:08:23.900 You obviously put a line of code into this thing to come up with this result.
00:08:28.120 And so why did you do that?
00:08:30.880 They wouldn't explain it, so I went looking for an explanation.
00:08:32.800 I came across a woman named Jen Gennai, who bills herself on her LinkedIn as the founder
00:08:38.980 of Google's global responsible AI operations and governance team.
00:08:43.960 In that capacity, Gennai says that she ensured Google met its AI principles, our company's
00:08:48.680 ethical charter for the development and deployment of fair, inclusive, and ethical advanced technologies.
00:08:52.680 She says that she took a, quote, principled, risk-based, inclusive approach when conducting ethical
00:08:58.240 algorithmic impact assessments of products prior to launch to ensure that they didn't cause
00:09:03.560 unintended or harmful consequences to the billions of Google's users.
00:09:08.220 And apparently, you know, a harmful consequence would be showing an image of a white Viking.
00:09:14.600 That might be very harmful to somebody, and so we've got to make sure that we don't let that happen.
00:09:18.480 Now, currently, Gennai says that she's an AI ethics and compliance advisor at Google.
00:09:24.420 Now, what Gennai doesn't mention on her LinkedIn is that her goal for a long time has been to treat
00:09:29.140 white people differently based on their skin color.
00:09:32.060 It's what she wants her AI to do.
00:09:33.400 It's what she does also.
00:09:36.300 Three years ago, Gennai delivered a keynote address at an AI conference in which she admitted all of this.
00:09:41.880 After introducing herself with her pronouns, which, by the way, are she, her, in case you're wondering,
00:09:45.560 Gennai explains what her philosophy on AI is, and here's what she says.
00:09:51.140 Watch.
00:09:52.120 We do work together day to day to try and advance the technology and understanding around responsible AI.
00:09:59.560 So today, I won't be speaking as much from the Google perspective, but from my own experience.
00:10:04.900 I have worked at Google for over 14 years.
00:10:07.480 I've led about six different teams, mostly in the user research, the user experience area, and now in the ethical user impact area.
00:10:16.380 So I'll be sharing some of my learnings from across that time, but also some of my failures and challenges.
00:10:22.100 I think it's okay to talk about things that you've made mistakes in because we will make mistakes.
00:10:26.680 When we're trying to be good allies, when we're trying to be anti-racists, we will make mistakes.
00:10:32.440 The point is, though, to keep trying, to keep educating yourself, and getting better day to day.
00:10:38.740 It's about constant learning.
00:10:40.340 It's okay to talk about the things you've made mistakes in, says Jen Gennai.
00:10:46.720 When we're trying to be good allies, when we're trying to be anti-racists, we will make mistakes.
00:10:51.780 Well, you know, in retrospect, after the launch of Gemini, that would turn out to be kind of a massive understatement.
00:10:57.920 The kind of mistakes that Jen Gennai is talking about in this keynote aren't mistakes like eliminating all white people from Google's AI,
00:11:05.060 which seems like a pretty big mistake, even though, again, not really a mistake.
00:11:07.480 It's obviously deliberate. Instead, she's talking about failing to live up to the racist ideals of DEI,
00:11:12.560 which apparently means treating non-white employees differently. Watch.
00:11:18.040 A corporate study found that talented white employees enter a fast track on the corporate ladder,
00:11:23.700 arriving in middle management well before their peers,
00:11:26.380 while talented black, Hispanic, or Latinx professionals broke through much later.
00:11:31.100 Effective mentorship and sponsorship were critical for retention and executive-level development
00:11:35.260 of black, Hispanic, and Latinx employees.
00:11:38.320 So this leads me into sharing an inclusion failure of mine, one of many, but just one that I'll share so far.
00:11:45.280 I messed up with inclusion almost right away when I first became a manager.
00:11:49.240 I made some stupid assumptions about the fact that I built a diverse team,
00:11:53.100 that then they'd simply feel welcome and will feel supported.
00:11:56.660 I treated every member of my team the same
00:11:59.100 and expected that that would lead to equally good outcomes for everyone.
00:12:02.540 That was not true. I got some feedback that a couple of members of my team didn't feel they belonged
00:12:08.000 because there was no one who looked like them in the broader org or our management team.
00:12:12.640 It was a wake-up call for me.
00:12:14.440 First, I shouldn't have had to wait to be told what was missing.
00:12:17.700 It was on me to ensure I was building an environment that made people feel they belong.
00:12:21.960 It's a myth that you're not unfair if you treat everyone the same.
00:12:25.340 There are groups that have been marginalized and excluded because of historic systems and structures
00:12:30.800 that were intentionally designed to favor one group over another.
00:12:34.400 So you need to account for that and mitigate against it.
00:12:37.480 Second, it challenged me to identify mentoring and sponsorship opportunities for my team members
00:12:41.900 with people who looked more like them and were in senior positions across the company.
00:12:46.180 Yeah, of course, the irony here is that this woman, Jen, sounds like she's Scottish or Irish or whatever.
00:12:53.480 Irish, I'm going to assume.
00:12:54.500 But the funny thing is that if you were to ask Google's AI for an image of an Irish person,
00:12:59.100 it would not produce any image that looks anything like her.
00:13:02.460 It would give you a bunch of images of, like, Cardi B and Sexyy Red or something.
00:13:07.400 Sexyy Red does have red hair. It's like, maybe she is Irish.
00:13:09.840 This is the head of ethics of Google AI, a senior manager, saying that it's a bad idea to treat everyone the same,
00:13:15.220 regardless of the color of their skin.
00:13:16.340 She is explicitly rejecting this basic principle of morality.
00:13:20.120 And instead, she says that she learned that she has to treat certain groups differently
00:13:23.420 because of historic systems and structures.
00:13:25.600 And therefore, she says those demographic groups are entitled to unique treatment and mentorship opportunities.
00:13:31.540 Now, later in this address, she goes on to explain what equity means in her view.
00:13:35.620 And this is where things really kind of get hilarious to the extent that you can laugh at someone this low IQ
00:13:41.440 and also, frankly, evil. Watch.
00:13:45.920 Allyship involves the active steps to support and amplify the voice of members of marginalized groups
00:13:51.140 in ways that they cannot do alone.
00:13:53.520 In the workplace, this can involve many things, from being an active mentor or sponsor
00:13:58.020 to those from historically marginalized communities,
00:14:01.060 to managers of managers setting specific goals in hiring and growth for their teams,
00:14:05.200 to ensure fairness and equity of opportunity and outcomes for underrepresented populations.
00:14:11.260 However, back to the point about language being very important,
00:14:15.880 using the title of ally can also come across as othering.
00:14:19.900 So I always state both the groups I'm a member of and support,
00:14:23.780 as well as those that I'm a member of, more of a mentor and a sponsor of,
00:14:28.420 to ensure that it doesn't look like that I'm othering others.
00:14:30.940 So, for example, I would say I'm an ally of women, black people, LGBTQ.
00:14:37.220 I want to say I'm a champion advocate of all of these groups,
00:14:40.180 not that I'm outside or exclusionary of them.
00:14:44.440 Again, it's worth emphasizing, these are the people that are behind the AI systems
00:14:48.680 that are going to be and really already are ruling the world.
00:14:53.240 But I want to repeat what she said, because it's hard to believe when this is said out loud.
00:14:57.160 So just to repeat, she says,
00:14:58.640 using the title of ally can come across as othering.
00:15:01.440 So I always state both the groups I'm a member of and support,
00:15:03.940 as well as the ones I'm more of a mentor and sponsor of,
00:15:07.020 to ensure that it doesn't look like I'm othering others.
00:15:10.540 Yeah, you don't want to other the others.
00:15:12.880 This is the brain trust at Google behind an AI that has access to all of our data.
00:15:16.640 She's incapable of speaking without using an endless stream of vapid DEI cliches
00:15:21.360 that you've heard a million times.
00:15:22.620 This supposedly is an original enterprise, artificial intelligence,
00:15:25.760 and it's being overseen by maybe the least original, least intelligent woman
00:15:29.660 that Google possibly could have found.
00:15:32.220 On top of everything else, the wacky left-wing stuff,
00:15:35.440 you're dealing with the most unimpressive people that you could imagine
00:15:38.980 that are in charge of this just technology that is incomprehensible.
00:15:46.640 And this is the kind of person who doesn't want to other others,
00:15:49.700 which seems a bit contradictory.
00:15:51.340 I mean, if someone is an other, then how do you not other them,
00:15:55.880 given that they are an other?
00:15:58.040 And by the way, just so you know, the word other, if you check the dictionary,
00:16:00.860 just means a person or thing that is distinct from another person or thing.
00:16:06.080 So if somebody is an other, it just means that they're not you, is all.
00:16:09.980 So if you're recognizing that they're an other, if you're making them an other,
00:16:12.500 you're just, you are recognizing them as a distinct entity from yourself.
00:16:17.660 So not othering them means that you are not recognizing them as a distinct human entity.
00:16:23.580 It means that, I suppose, we have to pretend that all people are indistinct blobs,
00:16:28.680 you know, all lumped together into this great, ambiguous blob that we call humanity.
00:16:34.900 Now, none of this makes any sense, but she has made it very clear that this DEI word salad
00:16:39.340 is the guiding philosophy behind Google's new AI.
00:16:42.720 There's no firewall between her and the product.
00:16:45.660 Watch.
00:16:46.980 What does responsible and representative AI mean?
00:16:49.680 I've talked about my team, but that's only one definition.
00:16:52.480 So for us, it means taking deliberate steps to ensure that the advanced technologies
00:16:56.880 that we develop and deploy lead to a positive impact on individuals and society more broadly.
00:17:02.820 It means that our AI is built with and for everyone.
00:17:07.040 We can't just assume noble goals and good intent to prevent or solve ethical issues.
00:17:12.740 Instead, we need to deliberately build teams and build structures that hold us accountable
00:17:17.980 to more ethical outcomes, which for us, the ethical outcomes in Google would be defined
00:17:22.700 as our AI principles, which I discussed earlier.
00:17:25.680 Now, it's easy to point and laugh at imbeciles like this and the products that Google has created.
00:17:30.880 On some level, it's genuinely hilarious that an AI product can be so useless that it can't
00:17:35.640 generate images of white people, even white historical figures.
00:17:39.160 It's also amusing in a way that Gemini is so unsubtle and ham-fisted that it
00:17:43.520 straight up refuses to answer questions about, for example, atrocities committed by communist
00:17:47.940 governments, or, as someone else asked, about the Zoom exploits of CNN commentator Jeffrey Toobin.
00:17:53.280 It wouldn't answer that question.
00:17:55.260 But the truth remains that the people behind Gemini have extraordinary power.
00:17:58.540 I mean, this debacle makes it very clear that the AI algorithms underlying products that
00:18:03.900 millions of people actually use, like Google, are completely unreliable and worse.
00:18:09.440 In fact, they're deliberately lying to us.
00:18:11.660 They're downranking unapproved viewpoints and disfavored racial groups.
00:18:15.680 And they're promoting the laziest possible brand of neo-Marxist ideology at every opportunity.
00:18:21.300 And they're doing it also to influence the next presidential election, by the way.
00:18:25.260 You might remember that after Donald Trump won in 2016, Breitbart posted leaked footage
00:18:29.820 of Google executives grieving during an all-hands meeting.
00:18:33.940 Let's watch that again.
00:18:35.660 I certainly find this election deeply offensive.
00:18:39.740 And I know many of you do, too.
00:18:41.980 It did feel like a ton of bricks dropped on my chest.
00:18:44.980 What we all need right now is a hug.
00:18:46.700 Can I move to Canada?
00:18:50.100 Is there anything positive you see from this election result?
00:18:55.260 Oh, boy, that's a really tough one right now.
00:19:00.560 Now, in other parts of the video, they go on to say that the election is the result of
00:19:05.840 the people and voting and that they accept the results.
00:19:08.640 But Google issued a statement about the video, saying nothing was said at that meeting or any
00:19:14.220 other meeting to suggest that any political bias ever influences the way we build or operate
00:19:18.880 our products.
00:19:19.400 To the contrary, our products are built for everyone.
00:19:23.700 Sure it is.
00:19:25.260 I find this election deeply offensive.
00:19:27.060 We all need a hug, we're told.
00:19:29.680 It was at this moment that Google decided that downranking conservative websites wasn't
00:19:33.320 enough.
00:19:33.620 In order to really influence elections, they decided that they needed to develop an AI that
00:19:38.240 will force-feed DEI and anti-white racism on everyone at every opportunity.
00:19:43.180 Their only mistake, which is the same mistake they made in that video back in 2016, is that
00:19:47.180 they were too obvious about their intentions.
00:19:48.800 And now everybody knows exactly where Google stands.
00:19:52.140 We have a pretty good idea what our future AI-driven dystopia will look like, or already
00:19:59.380 does look like.
00:20:00.860 Now let's get to our five headlines.
00:20:02.220 If your house is feeling chilly right now, you may need to consider window replacements.
00:20:11.860 I get it.
00:20:12.280 You know, if you haven't yet replaced the windows in your home, it can be an intimidating project
00:20:16.200 and prospect.
00:20:16.980 Luckily, there's a company that will do the work for you.
00:20:19.940 Renewal by Andersen is your one-stop shop for window design, manufacture, and installation.
00:20:24.980 Windows play a crucial role in regulating indoor temperatures.
00:20:27.420 If you notice a spike in your heating or cooling bills, it may be due to inefficient windows.
00:20:32.440 Don't put it off any longer.
00:20:33.580 Renewal by Andersen offers limited, fully transferable, and best-in-the-nation warranty coverage.
00:20:37.820 Right now, Renewal by Andersen is offering a free in-home consultation on quality, energy-efficient,
00:20:41.720 affordable windows or patio doors with special financing options.
00:20:44.820 Text Walsh to 200-300 for a free consultation to save $375 off every window and $775 off every
00:20:51.480 door.
00:20:52.020 These savings won't last long, so be sure to check it out by texting Walsh to 200-300.
00:20:56.460 That's Walsh to 200-300.
00:20:58.640 Texting privacy policy and terms and conditions posted at textplan.us.
00:21:02.220 Texting enrolls for recurring, automated text marketing messages.
00:21:05.340 Message and data rates may apply.
00:21:06.820 Reply STOP to opt out.
00:21:08.060 Go to windowappointmentnow.com for full offer details.
00:21:11.980 Okay, news from Reuters.
00:21:14.180 A cellular outage on Thursday hit thousands of AT&T users in the United States, disrupting
00:21:18.320 calls and text messages, as well as emergency services in major cities across, including
00:21:22.560 San Francisco.
00:21:23.220 More than 73,000 incidents were reported around 8.15 a.m.
00:21:27.480 AT&T said some of its customers were facing interruptions and it was working urgently to
00:21:30.980 restore service.
00:21:32.260 And then it turns out that a bunch of other carriers were affected as well.
00:21:35.900 So major cellular outages across the country.
00:21:38.740 I think the 70,000 figure, which was as of this morning, is most likely a huge undercount
00:21:44.140 at this point.
00:21:44.640 So it seems to be a much wider outage than that.
00:21:47.760 And the question is, what caused it?
00:21:50.140 If you had tens of thousands, potentially hundreds of thousands or more, I don't know,
00:21:53.960 people affected by a cellular outage, what caused that?
00:21:59.040 And lots of people on the internet are speculating that it could be some kind of attack or maybe
00:22:02.800 it's a false flag event or a dry run for a bigger thing that's coming down the pike.
00:22:07.840 But the media is now reporting that it may trace back to a solar flare.
00:22:12.740 And here's a quick report on that.
00:22:15.980 Good morning, Oklahoma.
00:22:17.580 Maybe you're looking at your phone and it's saying, SOS, what's going on?
00:22:20.960 My husband had that this morning and he's freaking out.
00:22:22.980 And I was like, did you do your phone updates?
00:22:25.980 No, this is all happening at 3 a.m.
00:22:27.900 So we did that, coming to work.
00:22:30.500 He's not the only one, guys.
00:22:32.080 If you're experiencing that, it may be a result of space weather.
00:22:37.060 Okay?
00:22:37.380 I'm going to do my best to explain what's going on.
00:22:40.020 Let's go ahead and take a look.
00:22:41.380 So there was a strong solar flare event that happened just after midnight.
00:22:46.020 And they actually captured an image of it right here, okay?
00:22:49.000 I had to look up this scale.
00:22:50.480 I wish I knew everything about space weather.
00:22:52.320 That's a whole different specialty.
00:22:53.480 But R3, that's for radio communications, it's on a scale of one to five.
00:22:59.420 And three is pretty bad.
00:23:01.300 That means it impacts radio communications for a few hours after this happens.
00:23:07.220 And so right now that could be impacting some of our technology.
00:23:11.920 And sadly, we're entering a solar maximum where we're going to see more and more solar storms, solar...
00:23:18.040 Well, I'm glad that she was able to begin that news report about this, you know, serious issue,
00:23:22.300 solar flare, cellular outage.
00:23:24.160 She was able to begin by giving us a little anecdote about the conversation she had with her husband at 3 a.m.
00:23:28.780 Not exactly.
00:23:29.580 I mean, you're a news anchor.
00:23:30.760 It's like, I guess people want everything personalized these days.
00:23:32.980 I don't really want it personalized.
00:23:34.480 I just want...
00:23:34.980 Just give me the news.
00:23:36.800 That's all I need to know.
00:23:37.340 I don't need to know about...
00:23:38.080 I don't care that your husband...
00:23:39.000 It doesn't matter to me.
00:23:41.880 Now, so they're pinning it on a solar flare.
00:23:44.980 The internet, as you might expect, is not buying this.
00:23:48.060 Lots of comments are treating the solar flare explanation as somehow totally absurd.
00:23:53.360 You know, I'm seeing a lot of people saying, nah, no way.
00:23:55.920 You're not fooling me.
00:23:57.200 Yeah, right.
00:23:57.900 Solar flare.
00:23:58.860 Sure.
00:23:59.560 Solar flare, quote unquote.
00:24:01.240 They insist that this is some kind of plan, devised scheme by, you know, shadowy forces.
00:24:07.420 Which, I mean, maybe it was.
00:24:09.220 I'm not sure what they would have gained from it, given that this was a relatively minor annoyance.
00:24:13.900 So, I guess it's possible that shadowy forces executed some huge plan to just sort of irritate everybody for a few hours.
00:24:21.240 I'm not sure why.
00:24:22.100 I don't know what you would gain from that.
00:24:25.020 And I don't say this to downplay or dismiss the reality that there are, in fact, evil forces out there scheming different ways to make our lives miserable.
00:24:32.220 We know that's the case.
00:24:33.020 That's certainly true.
00:24:34.060 My only point is that, you know, and I find myself having to make this point with relative frequency these days.
00:24:39.340 But not everything, like there are scheming bad people out there.
00:24:45.120 Not everything is part of that scheme.
00:24:48.680 And so, when I saw this and I heard about the solar flare, I just immediately knew.
00:24:53.000 I knew as soon as I went on Twitter what I was going to see.
00:24:55.400 Nothing, but a lot of it's coming from the right.
00:24:57.300 Just, yeah, right.
00:24:58.540 I know what's really going on.
00:24:59.620 And I was like, all right.
00:25:01.580 Of course, of course, it has to be.
00:25:03.320 It always, you know, there are things that happen in the world and in the universe that just happen.
00:25:14.060 Okay, we do live in a physical reality where all kinds of things we can't control happen.
00:25:21.280 It does happen.
00:25:24.160 And so, some things are a conspiracy, sure.
00:25:26.320 But not everything is, you know, it's like when there's, and it seems that people sort of moved away from this a little bit recently.
00:25:35.680 But we went through a while there where every mass shooting was a false flag.
00:25:40.760 Every mass shooting was a false flag attack.
00:25:42.980 And it seemed like for years it was, that was by some corners on the right.
00:25:46.700 Every mass shooting is automatically, I know what really happened, here's a false flag.
00:25:50.500 It's like, yeah, but mass shootings do happen.
00:25:53.820 I mean, this is a thing that really does happen.
00:25:55.480 I wish it didn't, but it does.
00:25:57.560 So, to immediately assume that it can't be anything but some deeper conspiracy is ridiculous.
00:26:04.940 And kind of the same thing here.
00:26:08.180 You know, when I hear people say, you know, no, no way.
00:26:13.700 You're not fooling me.
00:26:14.420 Couldn't be a solar flare.
00:26:15.220 Why couldn't it have been a solar flare?
00:26:16.760 Solar flares exist.
00:26:19.220 It's a thing.
00:26:20.060 Are you denying that they exist?
00:26:21.880 The sun is a real thing.
00:26:23.440 It's up in the sky.
00:26:24.220 Solar flares are real.
00:26:26.460 The sun is out there in space.
00:26:28.260 It's not that far away on a cosmic scale.
00:26:30.180 It's an enormous ball of hellish gas so big that a million Earths could fit inside it like,
00:26:34.980 you know, marbles in a glass vase.
00:26:36.800 This thing that's burning at 27 million degrees Fahrenheit at its core.
00:26:41.080 It's like a very powerful thing.
00:26:42.760 It's this incomprehensibly enormous nuclear reactor that is just a hop, skip, and a jump away from us, again, in galactic terms.
00:26:50.820 And, yeah, that thing could incinerate all life on the planet in the blink of an eye.
00:26:56.860 It could.
00:26:58.720 It probably won't anytime soon, but eventually it will.
00:27:02.480 But probably not anytime soon. But it could.
00:27:06.460 Like, the sun could belch tomorrow and send us all back to the Stone Age.
00:27:11.620 Just like that.
00:27:12.240 That's the kind of power we're dealing with.
00:27:15.600 That's how fundamentally helpless and vulnerable we are.
00:27:18.340 All of our technology, all of our advancement, all of it could be rendered moot, destroyed in an instant by forces that have nothing to do with anyone on this planet.
00:27:28.820 This is what I'm always trying to explain to the climate change alarmists that are running around.
00:27:33.000 What are we going to do about the weather?
00:27:33.900 It's like, we don't get to call the shots on this thing.
00:27:37.180 I hate to tell you.
00:27:38.380 I wish we had that kind of control.
00:27:40.020 We don't.
00:27:41.340 When it comes to the weather and the climate and solar flares and the sun and all these things, it's just we are helpless.
00:27:47.380 It will just happen.
00:27:48.680 And if it does, then we're screwed and that's it.
00:27:53.060 You think about the Carrington event, which was the largest solar flare on record.
00:27:58.900 And, you know, they haven't been keeping records that long.
00:28:02.560 So this happened in like 1860 or something.
00:28:04.980 And it sent out as much energy as the most powerful thermonuclear bomb ever created times 5 billion.
00:28:19.760 So it was the equivalent of 5 billion of the most powerful nuclear bombs ever created.
00:28:25.740 So imagine that, you know, two-thirds of the people on the planet all owned their own most powerful thermonuclear bomb.
00:28:35.920 And they all set them off at the exact same time.
00:28:40.180 That's the kind of power we're dealing with.
00:28:42.400 And at that point in 1860, it knocked out telegraph lines.
00:28:48.140 It actually set telegraph stations on fire.
00:28:52.200 There were people at telegraph stations that were getting electrical shocks because of this thing.
00:28:56.440 Now, imagine what would have happened if modern communication existed back then.
00:29:03.580 Like, say goodbye to your phone and the Internet and probably for a long time.
00:29:08.500 So that's all real.
00:29:11.200 That's all true.
00:29:11.880 It's a real thing.
00:29:12.740 I hate to tell you.
00:29:13.300 And I think that this is some of the psychology behind the people who overdo it on the conspiracy.
00:29:23.060 Where everything's a conspiracy.
00:29:24.820 And I think part of the reason for that is there's a certain comfort, I guess, we take in thinking that we are in control of everything.
00:29:34.920 Even if not me individually, but, like, even imagining that there's a human conspiracy out there that is responsible for everything.
00:29:43.340 That is more comforting than imagining that it's totally outside of the control of anybody on the planet.
00:29:50.160 And I think people don't want to confront that.
00:29:52.200 And so they come up with ways to cope with it.
00:29:56.860 That's my psychoanalysis anyway.
00:29:59.160 There it is.
00:30:00.640 All right.
00:30:00.960 This is an easier thing to psychoanalyze. From the Daily Wire:
00:30:06.140 Just to make sure that the borrowers of student loans know whom to thank for escaping responsibility for fully paying back their student loans,
00:30:12.160 President Biden will send over 150,000 borrowers a personal email reminding them that he's their guy.
00:30:17.760 The plan to let borrowers off the hook will cost the American taxpayers $1.2 billion.
00:30:22.040 The report adds that the administration has canceled $138 billion in debt for almost 4 million borrowers since Biden took office in 2021.
00:30:30.960 The email states, quote, congratulations, all or a portion of your federal student loans will be forgiven because you qualify for early loan forgiveness under my administration's SAVE plan.
00:30:39.860 From day one of my administration, I vowed to fix student loan programs so higher education can be a ticket to the middle class, not a barrier to opportunity.
00:30:47.520 It always, this is not really the point, but it's always so sort of creepy and depressing.
00:30:55.080 First of all, when you hear this from these politicians, it's a ticket to the middle class.
00:31:03.280 So that's the bar you're setting for us?
00:31:04.920 Your ticket to the middle class, which like, first of all, the idea that the only way to access the middle class is to spend $100,000 on a college education.
00:31:16.980 If that's true, that's already the problem.
00:31:19.920 Like the idea that a college education and a degree should be the entry point for the middle class is insane.
00:31:29.860 It should not be that way.
00:31:32.600 And in fact, it's not that way.
00:31:33.780 I mean, you can, there are many careers you can get into and be comfortably middle class above middle class.
00:31:40.500 I mean, you can become wealthy without a college education.
00:31:44.320 That's just the truth.
00:31:45.240 But to the extent that it is true that many kinds of jobs that will, you know, give you a middle class sort of income, that they require college education, like that is a problem.
00:31:58.500 They shouldn't.
00:31:59.540 Those jobs should not require that.
00:32:01.680 They don't really require it.
00:32:03.560 You know, not, they don't naturally require it.
00:32:05.740 They require it artificially.
00:32:06.860 So that's the problem we should be dealing with.
00:32:10.200 But instead, Joe Biden and the Democrats, they see that as a feature, not a bug.
00:32:18.500 And so they see no problem with the idea, to the extent that it's true, that access to the middle class depends on a college education.
00:32:28.840 And on top of it, the bar that he's putting for everybody is the middle class.
00:32:33.240 This is your way to become middle class.
00:32:36.860 And all the people saying that, oh, no, just be happy being middle class.
00:32:39.700 That's fine.
00:32:40.120 That's all you need.
00:32:40.700 That's all you need in life.
00:32:41.800 You don't need more than that.
00:32:43.220 Of course, everyone's saying that.
00:32:44.380 They're far above middle class.
00:32:46.020 Okay, these are all wealthy people who would rather be dead than be middle class, most of them.
00:32:53.660 But for you, that's all you should want.
00:32:56.380 That's all you need.
00:32:57.340 Just that.
00:32:58.800 Be middle class.
00:33:00.760 And we'll throw you some money every once in a while.
00:33:03.920 We'll pay off some of your student loans for you. Or rather, we're not going to pay it off; the taxpayers will.
00:33:08.400 And just stay there.
00:33:09.960 Don't try to go beyond that.
00:33:14.300 It's obscene.
00:33:15.600 Which isn't to say, obviously, that there is anything wrong with being middle class.
00:33:20.520 Of course, there isn't.
00:33:21.180 The problem is when you have these elites, who, again, are not middle class, who are presenting that to you as your ceiling, as the pinnacle of what you should try to achieve.
00:33:36.400 Here's Biden bragging a little bit more about the latest student loan forgiveness scheme.
00:33:42.620 Here it is.
00:33:42.920 Early in my term, I announced a major plan to provide millions of working families with debt relief for their college student debt.
00:33:51.120 Tens of millions of people in debt were literally about to be canceled, their debts.
00:33:55.600 But my MAGA Republican friends in the Congress, elected officials in special interest, stepped in and sued us.
00:34:01.700 And the Supreme Court blocked it.
00:34:03.320 And blocked it.
00:34:04.340 But that didn't stop me.
00:34:05.620 I announced we were going to pursue alternative paths for student debt relief for as many borrowers as possible.
00:34:12.320 And that's the effort that's been underway the last two years.
00:34:15.380 I fixed what's called the SAVE plan.
00:34:17.440 It existed.
00:34:18.160 But I fixed it to make it the most affordable repayment plan ever.
00:34:21.720 Before I took office, student borrowers had to pay 10 percent of the discretionary income on a monthly basis.
00:34:28.900 If they made less than if they didn't have enough.
00:34:30.640 All right, shut up.
00:34:30.880 And so he admits that the Supreme Court says we can't do this, but we're doing it anyway.
00:36:36.880 So, and these are the people that cherish our democracy and our system of government, of course.
00:36:44.140 But this is another outright, absolutely shameless bribery scheme funded by the taxpayers, of course.
00:34:52.000 Because, once again, I must insist on reminding everybody that this is not loan forgiveness.
00:34:59.460 A loan cannot really be forgiven.
00:35:02.300 Not in the way that it implies anyway.
00:35:05.800 A loan is a thing that happened.
00:35:08.700 Money was given by one party to another.
00:35:12.120 That's the loan.
00:35:13.360 It was lent.
00:35:14.920 And now that it has happened, you can't wave a magic wand and make it so that it didn't happen.
00:35:20.360 You would need a time machine to do that.
00:35:23.140 So real student loan forgiveness or, you know, erasing student loans.
00:35:27.620 A way to really do that, to erase it, is to get in a time machine and go back and stop the person from taking the loan out to begin with.
00:35:37.560 Which, if that was possible, I would say, yeah, that would be the best approach for many of these people.
00:35:42.900 But we don't have that.
00:35:44.040 And so the loan was made.
00:35:47.180 There is a hole here, right?
00:35:48.740 Like, money is owed.
00:35:51.140 There's a debt.
00:35:52.440 And it's going to be filled.
00:35:54.800 You know, and so either the person who borrowed the money will be made to hold the bag.
00:36:00.160 Or the person who lent the money, the party that lent the money, will be left holding the bag.
00:36:04.940 Or a third party, which in this case, with a student debt, is the taxpayers.
00:36:12.120 That third party will be given the bag and told to fill it with a billion dollars or whatever.
00:36:18.060 But somebody is out the money.
00:36:21.280 Somebody is on the hook.
00:36:23.320 No matter what.
00:36:24.640 There is no scenario where that will not be the case.
00:36:27.360 And that's all that matters.
00:36:32.540 All the rest of it is irrelevant.
00:36:35.100 Even if I agreed with all of the arguments about how a lot of these college kids are taken advantage of.
00:36:42.640 They don't know what they're doing.
00:36:43.520 And it's not fair.
00:36:44.340 And it's predatory.
00:36:46.100 I can agree with most of that.
00:36:49.080 All of it, actually.
00:36:49.920 I agree with all of it.
00:36:50.580 But, so it's an unfair situation.
00:36:58.280 No matter what approach we take.
00:37:01.340 No matter what answer we come up with.
00:37:05.020 It's not going to be great.
00:37:07.900 And it's going to leave somebody in an unfair situation.
00:37:12.200 One way or another.
00:37:15.760 So, of all these situations, what is the most fair?
00:37:18.820 Is it the most fair to make the person who took the loan out pay it back?
00:37:25.060 Or is it the most fair to make someone who didn't take the loan out pay it back?
00:37:30.620 Because if it is unfair to make the person who took the loan out pay it back,
00:37:35.280 how much more unfair is it to make someone who didn't take it out pay it back?
00:37:41.820 And when it comes to student loan, quote unquote, forgiveness, that is really the only point that matters.
00:37:47.560 Okay, I wanted to play this.
00:37:48.240 Here's a video that was posted by a guy named Charles Tanner.
00:37:51.680 Duke Tanner is his nickname.
00:37:53.060 And he was granted clemency by Donald Trump back before Trump left office, obviously.
00:37:57.680 And this guy's now a big Trump supporter because he was let out of prison.
00:38:02.800 He was in prison and he was let out.
00:38:06.460 Big Trump fan now, as you might expect.
00:38:08.640 And he says that he was a nonviolent drug offender who was given two life sentences for a first-time offense.
00:38:17.300 And that was unfair, he believes.
00:38:19.520 And that's why he deserves to be let out and he wants to see a lot more people let out.
00:38:22.600 And here he is in this video where he talks about that and why he's a big Trump fan.
00:38:28.660 Now, let's watch.
00:38:29.680 My name is Charles Duke Tanner and I was sentenced to a double life sentence for my first arrest for a nonviolent drug offense in 2004.
00:38:37.840 I lost all my appeals and I was denied clemency by the Obama administration.
00:38:42.720 It took 16 plus years before President Trump granted my clemency and allowed me to go home in 2020.
00:38:48.880 I was a part of a broken and unjust system.
00:38:51.140 And now it hurts me to the core to see the same system going after a former president.
00:38:58.520 This is what blacks have been going through since day one.
00:39:01.080 If we allow this to happen to the former president, we can only imagine what's going to happen to the rest of the country.
00:39:07.660 Please stand up now if you have a voice.
00:39:10.300 And let's fight against this.
00:39:13.020 God bless you all.
00:39:14.040 Thank you.
00:39:14.960 Okay, so here's my, now, what he said there at the end about Donald Trump and how they shouldn't be going after him and it's unjust.
00:39:19.900 I agree with all that.
00:39:20.700 All that is true.
00:39:23.900 The stuff before it is the problem.
00:39:25.500 And the fact that this guy's out of prison is a problem.
00:39:27.260 He should still be in prison.
00:39:28.000 He should be in prison for the rest of his life.
00:39:30.060 And my fear is that, now, I saw this video floating around.
00:39:33.900 A few Trump supporters were reposting it in a favorable way, but nobody of special note.
00:39:42.480 My fear, though, is that the Trump campaign leading into the general election will lean into stuff like this and say, oh, look, this is a black guy who likes us.
00:39:53.960 He's saying other black people should support Donald Trump because Trump let him out of prison.
00:39:58.200 Listen, I think that would be a disastrous political mistake, and don't do it.
00:40:04.220 It would be a disastrous mistake.
00:40:05.800 Because Trump letting guys like this out of prison was the worst thing that he did while he was in office.
00:40:14.060 It was a huge mistake.
00:40:14.920 It never should have happened.
00:40:15.620 And what people want now is they want actual law and order, actual law and order.
00:40:22.560 They want criminals arrested and sent away.
00:40:25.060 They're not in the mood to be sympathetic to criminals.
00:40:29.160 And so that's what you campaign on.
00:40:31.420 That's how you actually get.
00:40:32.660 Now, yeah, this guy, you get his vote by saying, yeah, I'll let criminals out of jail.
00:40:36.360 You're not going to win an election with this guy, OK?
00:40:39.600 The way you win an election is by speaking not to criminals, but to just normal Americans, wherever they happen to live,
00:40:45.260 whatever their race happens to be.
00:40:46.940 And for those people, what they want is they want criminals in jail.
00:40:50.100 And so if I'm Donald Trump, the clemency and letting the criminals out of jail, I'm pretending that didn't happen, OK?
00:40:56.340 We're not going to talk about that.
00:40:58.000 And we're moving on.
00:41:01.500 And here's what we're going to do this time.
00:41:03.200 And we're taking the Nayib Bukele approach in El Salvador.
00:41:07.660 And we're going, you know, we're going hardcore after these people.
00:41:11.100 We're going to throw them in jail.
00:41:11.840 And just to make the point, just using this guy, OK?
00:41:16.920 Because if you hear this kind of thing and you think, and you're tempted to be sympathetic,
00:41:25.620 well, because you hear, oh, first-time drug offender, nonviolent, two life sentences, that's obscene.
00:41:33.120 That's excessive.
00:41:34.280 What we have to realize is that you're being manipulated.
00:41:35.940 And don't allow yourself to be manipulated.
00:41:39.480 Because he says that he was a nonviolent first-time drug offender, makes it sound like he was caught with a bag of weed or something.
00:41:47.200 And, OK, that argument was compelling to Trump, who let him out of prison.
00:41:51.120 But what's the reality?
00:41:52.380 Well, the reality is that Charles Tanner was the leader of a drug gang.
00:41:56.780 He was found guilty of trafficking in hundreds of thousands of dollars in drugs, hardcore drugs that he was bringing into our communities.
00:42:07.300 What does that tell us?
00:42:08.380 Well, first of all, he was a committed criminal deep in the drug-dealing game.
00:42:14.000 This was not a guy dabbling around the edges.
00:42:16.780 Also, the fact that this was a first-time offense is irrelevant.
00:42:20.520 It's not even true.
00:42:21.620 Like, obviously, he committed many more offenses than what he was arrested for.
00:42:27.420 A first-time offense?
00:42:28.320 No, it wasn't.
00:42:28.980 You committed a thousand crimes before that.
00:42:31.680 It's just you weren't arrested for them.
00:42:33.380 The fact that you weren't arrested for them, what does that have to do with anything?
00:42:36.280 Who cares?
00:42:37.900 Yeah, you know, my first time getting caught for it.
00:42:39.720 I don't give a s*** if it was the first time you were caught for it.
00:42:42.480 What, you think you get credit for that?
00:42:45.700 It's irrelevant.
00:42:46.460 So if you arrest a drug kingpin, which is what this guy was, it's not his first offense.
00:42:56.080 If it's his first time being arrested, it's because of dumb luck, number one.
00:43:00.140 And number two, it's because other people took the fall for him in the past.
00:43:05.040 Okay, because guys like this, they surround themselves with people who are the first line of defense,
00:43:10.120 and those are the ones who get arrested because they're the ones who are actually out on the street.
00:43:14.340 And they're the ones who take the fall.
00:43:17.860 And so that's why the ringleader, the kingpin type, often is not arrested or doesn't have the same rap sheet.
00:43:26.840 Okay, you go down to Mexico, go down to Central America, the cartels, the people running the cartels,
00:43:32.540 and you compare their rap sheet to like the guys that are actually pushing the drugs.
00:43:36.680 A lot of the cartel leaders don't have as many arrests or any potentially.
00:43:40.700 Does that mean it's a, oh, I'm a first-time offender?
00:43:45.540 Yeah, it's a first-time arrest.
00:43:47.700 It's not a first-time offense.
00:43:48.620 And second, calling drug trafficking nonviolent is insane.
00:43:55.800 Okay, so stop saying that.
00:43:57.720 I want everyone who says that, you need to stop.
00:44:01.100 The next time you find yourself calling drug trafficking nonviolent, I want you to immediately smack yourself in the face.
00:44:09.160 Okay, and I'm not calling anyone else to do it.
00:44:11.220 I'm not calling for violence.
00:44:12.580 I'm saying do it to yourself, just to slap some sense into yourself.
00:44:16.800 Drug trafficking is not nonviolent.
00:44:20.500 For one thing, they're trafficking in poison that kills thousands of Americans every single year.
00:44:26.820 They're trafficking in a substance that is destroying communities all across this country
00:44:31.020 and putting thousands of people in the ground and destroying many more families and lives.
00:44:37.620 Okay, so to call that nonviolent, you're using a definition of violence that is so limited that it is meaningless.
00:44:47.560 And second, I don't mean to burst your bubble, there's no such thing as a nonviolent drug gang.
00:44:55.100 Okay, what do you think that they're doing?
00:44:57.140 How do you think that they, what do you think is happening?
00:44:59.240 You think that they write sternly worded letters to each other when they have a disagreement?
00:45:03.820 Do you think they're having tickle fights with each other?
00:45:06.040 You think, what do you think, they're doing a thumb war?
00:45:08.360 You think when there's, when there's a dispute over street corners, they do, okay, rock, paper, scissors, ready?
00:45:12.660 Let's go.
00:45:13.300 Rock, paper, scissors.
00:45:14.440 Ah, you got me.
00:45:15.180 You think that's how they settle it?
00:45:16.600 No.
00:45:17.560 There's no such thing as non, this guy is not nonviolent.
00:45:20.780 You're a drug kingpin, you are violent.
00:45:22.840 It's just 100% fact.
00:45:25.300 So, now, were you arrested for any of the violence that you perpetrated or that you caused to happen?
00:45:31.760 Maybe not.
00:45:32.840 Doesn't matter.
00:45:35.220 But giving a life sentence to a drug trafficker, you see, the reason why you give the life sentence
00:45:41.080 is because all of this is baked in and you, you are a smart person and you recognize all of this.
00:45:49.540 So, you recognize this is a drug trafficker.
00:45:51.940 He's trafficking in poison.
00:45:53.380 He's killing people.
00:45:54.360 This is a violent business.
00:45:55.500 He's responsible for lots of violence.
00:45:56.980 If he's a drug trafficker, he's already committed a whole bunch of other crimes.
00:46:00.320 So, you bake all of that in logically and that's how you end up with a life sentence.
00:46:06.840 Which is not only just, but it is, if anything, lenient.
00:46:12.720 If anything, the conversation we should be having, that I've talked about before, is: should we
00:46:19.600 be executing drug traffickers?
00:46:22.360 That at least would be, if we're having that conversation, then I know we're making some
00:46:28.340 progress.
00:46:29.240 But the fact that we're still debating about whether it's worth putting a drug trafficker
00:46:34.460 in jail for the rest of their lives, it's like, okay, we've learned nothing.
00:46:38.080 We've just learned absolutely nothing.
00:46:39.820 And I don't want to hear any complaints from anybody about crime in the street and cleaning
00:46:43.460 up crime.
00:46:44.280 If you think the guys like this should be let out of prison, I don't want to hear you complain
00:46:47.640 about the crime.
00:46:48.440 Because what do you think is required to clean up the crime?
00:46:52.400 Okay, what does it entail?
00:46:55.640 What does it look like?
00:46:57.320 It is an ugly, rough thing where you take guys like this and you throw them in jail and you
00:47:01.940 never see them again.
00:47:03.020 That's how you do it.
00:47:03.800 It's the only way to do it.
00:47:05.340 And if you don't want to do that, then you're not serious about cleaning up the crime.
00:47:08.180 I swear, if I see this guy show up at a campaign rally, I'm going to, I'm going to, well, I'm
00:47:14.960 just going to be very annoyed.
00:47:15.960 That's all that's really going to happen, but I will be very annoyed.
00:47:18.740 All right, let's get to, was Walsh wrong?
00:47:21.080 Our friends at ZipRecruiter conducted a recent survey and found that the top hiring challenge
00:47:28.940 employers face in 2024 is a lack of qualified candidates.
00:47:32.280 But if you're an employer and need to hire, the good news is that ZipRecruiter has smart
00:47:35.820 tools and features that help you find more qualified candidates quickly.
00:47:39.440 Right now, you can try it for free at ZipRecruiter.com slash Walsh.
00:47:42.780 As soon as you post your job, ZipRecruiter's powerful matching technology shows you candidates
00:47:46.400 whose skills and experience match what you need.
00:47:49.240 And then you can use ZipRecruiter's Invite to Apply feature to send your top candidates
00:47:53.380 a personalized invitation, encouraging them to respond to your job posts.
00:47:57.880 Let ZipRecruiter help you conquer the biggest hiring challenge of finding qualified candidates.
00:48:02.500 See why four out of five employers who post on ZipRecruiter get a quality candidate within the
00:48:06.520 first day.
00:48:07.260 Just go to my exclusive web address right now at ZipRecruiter.com slash Walsh.
00:48:13.360 Again, that's ZipRecruiter.com slash W-A-L-S-H.
00:48:17.260 The smartest way to hire is with ZipRecruiter.
00:48:20.200 Shockingly, a number of comments are disagreeing with my opinion that the Beyonce
00:48:26.700 country song was not terrible.
00:48:30.900 You know, especially in comparison, given where the bar is.
00:48:35.320 Okay, but a lot of people are not in agreement.
00:48:39.440 Patrick says, wow, Matt, this is sad.
00:48:41.280 The only explanation I can think of is that you've never listened to music before.
00:48:44.220 I can't believe you could be this lost.
00:48:46.240 Wasson says, the Beyonce country song is catchy.
00:48:49.960 So is AIDS, Matt.
00:48:52.400 H-Man says, tell us you're gay without saying it.
00:48:57.360 Lillian Humphrey says, Matt, to say you shock me is an understatement.
00:49:00.260 Beyonce's a grifter and she'll jump off as soon as she's done getting her accolades.
00:49:03.460 23WTB says, it's boring.
00:49:07.640 It's Beyonce's boring voice put through an AI-generated bluegrass filter.
00:49:11.500 There's already a thousand songs you could be listening to instead.
00:49:13.800 Look, you might not like that I'm the leader of the Beyhive now.
00:49:17.400 I don't even think the Beyhive likes it, but it is, it's how it is.
00:49:21.900 It might not fit in your little picture of the world, okay?
00:49:25.100 It might not fit inside your box, the box you want to put me in.
00:49:29.240 But this is the way it is.
00:49:31.760 I'm living my truth, okay?
00:49:33.980 That's what's happening.
00:49:35.520 You cannot contain me.
00:49:38.960 And I think that's what disturbs you all so much.
00:49:42.560 And I stand by what I said, the Beyonce country song, it's okay.
00:49:45.400 It's kind of catchy.
00:49:47.120 It kind of works a little bit.
00:49:48.880 I didn't say it works a lot.
00:49:50.520 Did I say that?
00:49:51.200 No.
00:49:51.360 I didn't say it's a 10 out of 10.
00:49:55.580 You know, it's like a 6.1.
00:50:01.240 And maybe a 5.7.
00:50:02.340 It's like a 5.7 out of 10.
00:50:03.920 But it's above 50%.
00:50:05.800 And so it works a little bit.
00:50:10.640 Now, I think it could work better.
00:50:12.420 I think it could work better.
00:50:13.460 Maybe that's what I'm cluing into a little bit,
00:50:15.960 is that there's a vibe that could be better.
00:50:18.540 Now, obviously, the lyrics are incredibly stupid.
00:50:22.240 So that's a problem.
00:50:22.800 You could make the lyrics not that dumb.
00:50:25.500 I think Beyonce, if she had a better kind of soulful R&B voice,
00:50:31.460 then it would really work.
00:50:35.260 You know, in fact, I'd like to hear that.
00:50:37.060 I'd like to hear an Aretha Franklin level kind of soul R&B singer
00:50:42.400 doing a country song.
00:50:44.720 That, that would be a nice thing to hear.
00:50:45.980 So I guess Janis Joplin is what I'm asking for.
00:50:49.340 I don't know if you'd call her country or not.
00:50:51.200 Probably not.
00:50:51.800 Me and Bobby McGee, I guess, I think you'd call that country-ish.
00:50:56.020 And I'm not saying that Beyonce is Janis Joplin.
00:50:58.720 Just to be clear, you people would probably stone me to death
00:51:01.940 if I said that, and deservedly so.
00:51:04.000 But that's it.
00:51:05.760 That's my, that's my verdict.
00:51:07.320 Courage Under Fire is going to be the event of the year.
00:51:09.400 Come and join me on May 24th in Nashville, Tennessee
00:51:12.160 for a night of encouragement and camaraderie.
00:51:13.980 The Courage Under Fire gala will host some of the top leaders in faith,
00:51:16.580 the pro-life movement, and culture to share in the true, the good, and the beautiful.
00:51:20.000 I'll be speaking alongside Dr. Abby Johnson on how to have courage
00:51:22.700 and stand up for the truth no matter what adversity you face.
00:51:25.320 We'll be joined by some of the most influential leaders
00:51:27.020 in the conservative movement for a night of connection and inspiration.
00:51:30.240 All proceeds from the gala will directly benefit students
00:51:32.360 in need of tuition assistance at Regina Caeli Academy,
00:51:35.060 which is the premier classical homeschool hybrid for Catholic families.
00:51:38.900 VIPs will have access to an exclusive meet and greet with guest speakers
00:51:41.520 and live music during cocktail hour.
00:51:43.760 If you haven't grabbed your tickets yet, you need to do so.
00:51:45.980 For tickets, visit courageunderfiregala.org
00:51:48.460 and use code DAILYWIRE at checkout.
00:51:50.260 That's courageunderfiregala.org
00:51:52.740 and use code DAILYWIRE.
00:51:54.600 Can't wait to see you there.
00:51:55.540 Also, Lady Ballers is the hilarious story
00:51:58.160 of how a group of male losers who can't win against other men
00:52:00.860 decide to identify as women and join a women's basketball league.
00:52:03.580 Yes, it's absurd.
00:52:04.280 It's ridiculous.
00:52:05.280 It's laughable.
00:52:06.100 It's happening in the world right now.
00:52:07.540 So here's a quick look at what's being called
00:52:09.080 the most triggering movie of the decade.
00:52:11.780 Leftists are losing it over Lady Ballers.
00:52:14.900 Nothing's changed.
00:52:16.140 This movie is a straight-up and intentional transphobic hate crime.
00:52:20.520 What?
00:52:21.260 I see you.
00:52:22.500 The Lady Ballers movie needs to be banned.
00:52:25.460 I'll cancel you.
00:52:26.740 Go ahead and get the blinds, please.
00:52:28.040 Go to Levin'.
00:52:28.740 The most toxic BS you've ever seen.
00:52:32.740 You're a monster.
00:52:34.000 Yeah.
00:52:34.740 Next-level hate speech propaganda.
00:52:36.760 That's it.
00:52:37.400 That's the pitch.
00:52:38.100 Watch the most triggering comedy of the decade.
00:52:43.400 Lady Ballers, streaming exclusively on Daily Wire Plus.
00:52:47.600 Don't wait.
00:52:48.120 Watch Lady Ballers, the movie that Hollywood didn't make, so we did.
00:52:50.500 Exclusively on Daily Wire Plus right now.
00:52:53.700 Now let's get to our daily cancellation.
00:53:01.160 You know, I did not expect that I would one day be in a position
00:53:03.460 where I would have to cancel the National Multiple Sclerosis Society,
00:53:07.480 but here we are.
00:53:08.600 It's really not so surprising when you think about it.
00:53:10.140 Nearly every large organization of any kind in the country
00:53:12.340 has been captured by the far left,
00:53:13.660 and if it's a medical organization or a nonprofit that deals with a medical issue,
00:53:18.240 you can be certain that it is run by far-left wackos.
00:53:22.600 That's how you end up with this kind of situation, which we'll get into now.
00:53:25.940 So backing up a few days, last week the MS Society kicked out a 90-year-old volunteer
00:53:31.800 who had been working with the organization for 60 years.
00:53:35.340 Now just to put that in perspective, this woman, Fran Itkoff,
00:53:38.800 has been volunteering for this organization for nearly as long as the organization has existed.
00:53:44.280 She began volunteering after her husband was diagnosed with the condition,
00:53:47.360 and she continued giving her time even after her husband died 20 years ago.
00:53:51.300 And with that sort of history and track record,
00:53:54.120 you would think that the organization would cherish this woman.
00:53:57.380 They would have a deep sense of loyalty to her and respect for her,
00:54:01.900 and certainly wouldn't even consider terminating their relationship with her
00:54:06.060 except under the most extreme circumstances imaginable,
00:54:10.820 where somehow her behavior made the decision inevitable.
00:54:15.600 Now it's hard to imagine what exactly a 90-year-old volunteer could do or say
00:54:19.340 to warrant that response, but whatever it is,
00:54:22.500 it would have to be over-the-top, outrageous, and offensive.
00:54:26.940 That's what you might think.
00:54:28.640 Or you would think anyway if you were very naive
00:54:30.900 and didn't understand how the world works these days.
00:54:32.780 As it turns out, Fran was fired,
00:54:35.620 not for doing anything outrageous or offensive at all,
00:54:37.540 but simply for being a normal human being.
00:54:40.060 What was Fran's crime?
00:54:41.140 Well, she asked a question about pronouns.
00:54:44.960 Apparently it all began when Fran was asked by the organization
00:54:47.340 to use pronouns in her email signature.
00:54:49.960 And she, being a normal 90-year-old person,
00:54:52.460 had no idea what that meant.
00:54:54.280 She's lived on the planet for 90 years,
00:54:56.260 likely introduced herself a thousand times to a thousand different people,
00:54:59.180 and never once had been asked to give her pronouns.
00:55:02.940 In fact, the very phrase, give your pronouns,
00:55:06.160 the concept of a person having pronouns,
00:55:09.520 these are my pronouns,
00:55:11.180 this is something that this elderly woman has never encountered
00:55:13.880 or thought about at all.
00:55:15.360 It has no bearing on her life.
00:55:17.180 That's because this whole pronoun ritual was invented 45 seconds ago
00:55:20.320 by gay activists on the internet.
00:55:23.080 Now, you and I are familiar with it
00:55:25.620 because we spend too much time on the internet.
00:55:27.540 Fran, we can assume, does not.
00:55:29.400 And that was the crime she committed, apparently.
00:55:31.580 So here she is last week explaining the situation to Libs of TikTok.
00:55:36.000 Watch.
00:55:37.100 I was confused.
00:55:38.440 I didn't know what it was, what it meant.
00:55:40.560 And I'd seen it on a couple of letters that had come in after the person's name.
00:55:47.840 They had the pronouns, but I didn't know what that meant.
00:55:51.820 And so finally, when I was talking to her,
00:55:54.660 I thought, I'll ask, what does it mean?
00:55:57.660 And, you know, let her tell me.
00:56:00.260 And so she said that meant that they were all inclusive,
00:56:04.580 which didn't make sense to me because it sounds like you're labeling for females
00:56:12.960 and not males if you're just putting in she, her.
00:56:16.500 She said that she was just asking her what it meant to have a conversation.
00:56:20.380 So as a 90-year-old who didn't know what it meant,
00:56:24.240 you know, she's not street savvy to find out what it meant.
00:56:27.620 And when she said that they were required to use it to be inclusive,
00:56:32.020 and my mom was saying that we've always been,
00:56:35.020 the MS Society as a whole and the Long Beach group has just always been inclusive.
00:56:41.600 A few days later, it was on a Friday, it was at 4:58.
00:56:45.740 At 4:58, which we thought was odd, but anyway.
00:56:49.000 At the end of the day, end of the week,
00:56:51.000 I got an email from her saying that they were sorry,
00:56:55.600 but they had to ask me to step down as a volunteer for the MS Society.
00:57:02.440 And the reason being is that you're not inclusive enough.
00:57:06.560 The verbiage she said was that she didn't abide by their diversity, equity, and inclusion,
00:57:12.700 so they have to ask her to step down,
00:57:15.580 and she can't be a part of the MS Society as a volunteer,
00:57:20.380 which to me is ironic because they're saying they're being inclusive,
00:57:25.160 but yet they're excluding a 90-year-old disabled woman who volunteers for over 60 years.
00:57:33.260 So I like her initial response to the thing when she was told we're inclusive.
00:57:37.120 And of course, her first response is because she's familiar with that,
00:57:39.700 with what that word is supposed to mean.
00:57:41.320 And so she says, like, what do you mean?
00:57:43.120 Of course, we include everybody.
00:57:45.360 We're the MS Society.
00:57:46.680 We don't exclude anyone.
00:57:48.660 We don't tell anyone that they're not allowed to benefit from the charity we do
00:57:52.360 because, of course, we include everyone.
00:57:54.760 But, of course, that's not what is meant by inclusive when it's used these days,
00:57:59.880 and that's the irony of it.
00:58:01.840 In the name of inclusivity, they're excluding this woman for asking a question.
00:58:05.960 You know, it's not like Fran stridently objected to using the pronouns.
00:58:09.160 It's not like she went in there and said,
00:58:11.560 look, this is the dumbest thing I've ever heard of.
00:58:14.280 You people are morons.
00:58:15.240 I've been alive for 90 years.
00:58:16.540 I've never encountered human beings as stupid as you.
00:58:19.800 Now, she would have been entirely justified in saying that,
00:58:22.680 and I would enjoy it if she did say that,
00:58:25.200 but she didn't because she's a nice, sweet old lady
00:58:27.520 who is looking for clarification, not confrontation,
00:58:30.420 and yet they kicked her out anyway.
00:58:32.520 Now, later in the week, the MS Society released a statement
00:58:34.560 defending their decision to fire her,
00:58:36.000 reading from their statement that says,
00:58:37.260 quote,
00:58:37.900 As an organization, we firmly believe that we best serve and support
00:58:40.620 those living with MS by creating a space that welcomes all.
00:58:43.380 This is especially true for self-help group leaders
00:58:45.720 who are responsible for leading meetings for people affected by MS
00:58:48.400 to confide in and support one another.
00:58:50.200 Recently, a volunteer, Fran Itkoff,
00:58:51.920 was asked to step away from her role
00:58:53.360 because of statements that were viewed as not aligning
00:58:55.920 with our policy of inclusion.
00:58:57.540 Fran has been a valued member of our volunteer team
00:58:59.760 for more than 60 years.
00:59:00.920 We believe that our staff acted with the best of intentions
00:59:03.340 and did their best to navigate a challenging issue.
00:59:07.920 As an organization, we are in continued conversation
00:59:10.180 about assuring that our diversity, equity, and inclusion
00:59:12.240 evolve in service of our mission,
00:59:14.980 and we will reach out to Fran in service of that goal.
00:59:18.620 Now, if you understand woke language,
00:59:21.700 if you understand it more than a typical 90-year-old
00:59:23.740 who lives a life blessedly insulated from much of this would,
00:59:27.380 then you already know how to translate
00:59:30.380 some of the key phrases in this statement.
00:59:32.080 For example, continued conversation means,
00:59:34.760 in this case, not a conversation at all.
00:59:37.720 A conversation is what Fran was trying to have originally.
00:59:41.300 She was trying to have a conversation,
00:59:42.640 like a person: someone said something,
00:59:45.160 and she's like, I don't know what you mean by that.
00:59:47.540 Can you explain it?
00:59:49.040 And then they said, you're fired.
00:59:50.900 So that's what a conversation is.
00:59:52.640 They don't want a conversation.
00:59:53.680 Conversation for them means instead that
00:59:55.480 they dictate what you do, say, and believe,
00:59:58.380 and then you sit silently and nod your head.
01:00:00.180 That's what a conversation is to them.
01:00:01.700 It's the only conversation they want to have.
01:00:03.140 And in a similar fashion,
01:00:04.440 navigating a challenging issue
01:00:06.360 does not mean that they are navigating a challenging issue.
01:00:10.220 It means that they instead are creating an issue
01:00:12.960 out of whole cloth,
01:00:14.580 responding to the issue that they created
01:00:16.920 in the most deranged and morally abominable way imaginable,
01:00:20.380 and then applauding themselves
01:00:22.240 for their wisdom and nuance.
01:00:25.000 And of course, as we already covered,
01:00:27.520 inclusion here does not mean inclusion
01:00:30.360 or anything like inclusion.
01:00:31.380 It means instead the rigid, ruthless exclusion
01:00:33.700 of anyone who fails to precisely
01:00:35.760 and painstakingly conform to their dogmas.
01:00:37.660 So that was the end of it,
01:00:39.760 as far as the National MS Society expected and hoped.
01:00:42.620 They did this horrifically insane thing.
01:00:44.480 They treated an innocent woman like dirt.
01:00:46.560 They exposed themselves
01:00:47.600 as a bunch of deranged, cowardly goblins.
01:00:50.040 And then they moved on, or they wanted to.
01:00:53.240 But unfortunately for them,
01:00:54.240 the public found out about all this,
01:00:55.860 thanks in large part to Libs of TikTok.
01:00:57.620 And this led to a fully justified
01:00:58.960 and quite reasonable public backlash.
01:01:01.760 And the organization, they tried to hide under a bed,
01:01:04.140 you know, for a while
01:01:05.540 until people stopped shouting at them.
01:01:06.960 But that strategy only lasted a few days
01:01:09.100 until they finally relented
01:01:10.740 and issued this follow-up statement.
01:01:13.660 This is what they posted yesterday.
01:01:15.980 Quote,
01:01:16.240 The National Multiple Sclerosis Society
01:01:18.180 apologizes to our longtime dedicated volunteer,
01:01:20.860 Fran Itkoff.
01:01:22.180 Recently, we asked Fran to step down
01:01:23.840 from her role as a group leader
01:01:24.800 because of statements made
01:01:25.660 that we viewed as not aligning
01:01:26.840 with our recently implemented
01:01:27.800 diversity, equity, and inclusion policy.
01:01:29.800 We realize now that we made a mistake
01:01:31.620 and we should have had more conversations
01:01:33.840 with Fran before making this decision.
01:01:35.760 Over her 60 years of volunteer service,
01:01:38.180 Fran has been a committed champion for our cause.
01:01:40.340 We had an opportunity to work with her
01:01:41.720 and support her as a self-help group volunteer leader.
01:01:44.620 But as an organization, we fell short.
01:01:47.080 While we acted at the time with the best intentions,
01:01:48.860 we did not have clear protocols in place.
01:01:51.340 We should have spent more time with Fran
01:01:52.800 to help her understand why as an organization,
01:01:55.100 we are dedicated to building a diverse
01:01:56.640 and inclusive movement
01:01:57.400 where everybody has equitable access
01:01:59.200 to the care, connections, and support
01:02:00.820 they need to live their best lives.
01:02:03.960 Okay.
01:02:05.480 So, they have apologized to Fran
01:02:08.680 and asked her to come back.
01:02:09.700 No word yet, as far as I know,
01:02:11.180 on whether Fran has any interest
01:02:12.620 in working with them again.
01:02:15.120 Either way, needless to say,
01:02:17.240 the organization deserves absolutely no credit
01:02:19.900 for issuing this apology and retraction.
01:02:23.080 For one thing,
01:02:24.140 they're only doing it because they have to.
01:02:26.420 Their outrageous mistreatment
01:02:27.760 of this elderly woman was exposed.
01:02:29.780 Donors were threatening to pull their funds.
01:02:32.460 And the National MS Society
01:02:34.260 had no choice but to back down.
01:02:36.340 If this story had never gone public,
01:02:38.200 if nobody ever found out about it,
01:02:39.840 if they'd been able to just toss
01:02:40.960 this old lady overboard
01:02:41.800 and continue rowing along like nothing happened,
01:02:44.540 then that's exactly what they would have done.
01:02:46.460 So, this apology was made out of necessity,
01:02:49.100 not for any moral or ethical reason.
01:02:51.940 Which means that it's totally meaningless.
01:02:53.460 And they get no credit for issuing it.
01:02:56.520 Also, they aren't really apologizing.
01:02:59.460 At least they aren't apologizing
01:03:00.860 for the things they should be apologizing for.
01:03:03.720 They said that they should have had more conversations with Fran.
01:03:07.960 They should have made protocols more clear
01:03:09.940 and done more to, quote,
01:03:12.140 help her understand
01:03:13.360 why she's expected to engage
01:03:15.260 in these bizarre rituals.
01:03:17.200 But that, of course, is not the problem.
01:03:20.380 The problem isn't that they didn't help Fran
01:03:23.820 better conform to the insane DEI policy.
01:03:27.220 The problem is that the insane DEI policy exists.
01:03:31.440 They should be apologizing
01:03:32.680 for the policies themselves.
01:03:34.980 And by extension,
01:03:36.240 for mistreating Fran in the name of that policy.
01:03:39.080 And then they should be firing
01:03:40.020 everybody who had a hand in crafting the policy
01:03:42.000 and in firing Fran.
01:03:44.200 That's what a real apology would be.
01:03:46.380 That's what actual accountability would look like.
01:03:48.240 Which is why the people threatening
01:03:50.960 to pull their funds
01:03:51.960 should still pull them.
01:03:55.320 If you donate to the National MS Society,
01:03:57.460 you should stop
01:03:58.540 and you should not start again.
01:04:01.780 I don't care if they made this apology.
01:04:04.200 They should still be made
01:04:05.240 to reap the consequences.
01:04:08.000 That's the only way
01:04:08.800 to put a stop to this madness.
01:04:10.780 Or else there will be
01:04:11.820 a lot more Frans
01:04:13.140 falling victim to it in the future.
01:04:15.240 And that's why
01:04:16.800 the National MS Society
01:04:18.260 is today cancelled.
01:04:20.740 That'll do it for the show today.
01:04:21.580 Thanks for watching.
01:04:22.060 Thanks for listening.
01:04:22.620 Talk to you tomorrow.
01:04:23.700 Have a great day.
01:04:24.600 Godspeed.
01:04:24.920 Bye.