Ep. 1318 - Our AI Overlords Are Here And They Really Hate White People
Episode Stats
Length
1 hour and 4 minutes
Words per Minute
178.7
Summary
Google's new AI program just launched this week, and it's already attempting to erase white people from history. Is there something sinister behind it? And, the National MS Society fires a 90-year-old volunteer for failing to put her pronouns in her bio. Sounds like a Babylon Bee headline, but it's real. We'll talk about all that and more today on the Matt Walsh Show.
Transcript
00:00:00.000
Today on the Matt Walsh Show, Google's new AI program just launched this week, and it's
00:00:03.840
already attempting to erase white people from history.
00:00:06.440
Our woke, dystopian future has officially arrived, it looks like.
00:00:09.440
Also, the Biden administration tries to buy more votes with yet another quote-unquote student loan forgiveness program.
00:00:15.420
A major cellular outage affects thousands of Americans.
00:00:19.460
And the National MS Society fires a 90-year-old volunteer for failing to put her pronouns in her bio.
00:00:25.560
Sounds like a Babylon Bee headline, but it's real.
00:00:27.300
We'll talk about all that and more today on the Matt Walsh Show.
00:00:30.000
I've been talking about my Helix mattress for years.
00:00:59.000
Every night when I go to bed, I am reminded of how much I love my Helix mattress.
00:01:03.140
If you haven't already checked out the Helix Elite collection, you need to.
00:01:06.080
Helix harnesses years of mattress expertise to offer a truly elevated sleep experience.
00:01:10.540
The Helix Elite collection includes six different mattress models, each tailored for specific sleep needs.
00:01:16.920
If you're nervous about buying a mattress online, you don't have to be.
00:01:19.660
Helix has a sleep quiz that matches your body type and sleep preferences to the perfect
00:01:22.960
mattress, because why would you buy a mattress made for somebody else?
00:01:26.780
Go to helixsleep.com slash Walsh, take their two-minute sleep quiz, and find the perfect mattress for you.
00:01:32.740
Your mattress will come right to your door for free.
00:01:36.440
You get to try it out for 100 nights risk-free.
00:01:38.260
They'll even pick it up for you if you don't love it, but you will.
00:01:41.160
Helix's financing options and flexible payment plans make it so that a great night's sleep is never out of reach.
00:01:45.540
Helix is offering 25% off all mattress orders and a free bedroom bundle.
00:01:49.880
For my listeners, the bundle includes two free pillows, a set of sheets, and a mattress protector.
00:01:53.960
So go to helixsleep.com slash Walsh and use code HELIXPARTNER25.
00:02:00.720
That's helixsleep.com slash Walsh and use code HELIXPARTNER25.
00:02:07.560
NVIDIA. It sounds like a prescription drug or maybe an African country, but it's actually a company
00:02:11.400
based in California that's now worth more than all of China's stock market.
00:02:19.160
Now, in a different era, obtaining this kind of growth meant making a massively popular
00:02:23.720
and instantly recognizable consumer-facing product like Windows 95 or Amazon.com or the
00:02:30.360
But NVIDIA's growth didn't come from making a computer or a popular website or anything like that.
00:02:34.500
Instead, NVIDIA's growth came from making artificial intelligence chips that power the brains of modern AI systems.
00:02:42.940
That's why NVIDIA had a very good day on Wall Street on Wednesday.
00:02:46.840
Their business, artificial intelligence, is one of the fastest growing industries in the world.
00:02:51.720
Every major corporation is rushing to implement AI in all of their products as quickly as possible.
00:02:59.320
This week, Google launched its own AI product, and the results were so disastrous and so fraught with consequences for the future of this country that no reasonable person can ignore them.
00:03:07.280
So Gemini is Google's name for an AI that you can download on your phone right now.
00:03:13.700
It's also integrated into all of Google's web products, including Gmail and Google Search,
00:03:17.780
which are used by hundreds of millions of people and businesses every day.
00:03:21.460
And in this respect, Gemini is very different from existing AI products like ChatGPT or Bing's chatbot.
00:03:28.920
Pretty much everybody uses a Google product in one way or another.
00:03:31.900
You know, if you have the internet and you use the internet, you use a Google product.
00:03:36.660
Either you're using Google Search or Gmail or you have an Android phone or something along those lines. And that means two things.
00:03:43.180
One, Google has access to a lot more information than those other AI platforms.
00:03:49.420
And two, whatever Google is doing with AI has significant implications for everybody on the internet.
00:03:55.580
This is not a one-off experiment in some tech mogul's basement.
00:04:00.120
This is an established company making established products that it's now implementing in its own products.
00:04:11.100
They have a bunch of promotional videos about how they're going to revolutionize artificial intelligence.
00:04:15.620
The Wall Street Journal has done multiple interviews with Google executives in which these executives
00:04:19.680
insist that everybody at the company, including Google's co-founder, is deeply invested in
00:04:23.680
making this product as good as it could possibly be.
00:04:28.980
And very quickly, it became clear that, among some other issues, Gemini essentially does not
00:04:34.900
recognize the existence of white people, which is kind of concerning for what is destined to be, and probably already is, the most powerful AI on the planet.
00:04:44.740
Now, even in historical context, it is practically impossible to get this product to serve up an image of a white person.
00:04:55.120
So here, for example, is how Gemini responded the other day when Frank Fleming, who's a writer for
00:04:59.520
the Bentkey children's shows, asked Gemini to create an image of a pope.
00:05:05.660
Now, you would think that, you know, that would generate maybe an image of a white guy or two if you
00:05:10.280
have even a passing knowledge of what popes have looked like over the years, over the centuries,
00:05:15.320
And just spoiler on that, they have all been white.
00:05:18.020
But that's not what Google's AI product apparently thinks.
00:05:25.500
It looks like, you know, they've got two popes.
00:05:31.920
So it's almost as if the AI has some sort of code saying, whatever you do, don't display a white person, especially considering there has never been a pope that has looked anything like either of those two.
00:05:50.560
Have they built into this very powerful AI that it has to ignore the fact that white people exist?
00:05:57.840
Well, that's really the only way to explain what we're seeing here.
00:06:01.020
And Frank, who previously worked as a software engineer, seemed to key in on this.
00:06:04.620
So the whole situation quickly became something of a game for him as he tried his hardest to get Gemini to produce any image of a white guy.
00:06:13.420
I mean, even just like one image, can you give us a white guy?
00:06:16.520
So, for example, he asked Gemini to produce an image of a Viking.
00:06:20.980
Now, this is a group of people who historically were not necessarily known for their commitment to diversity.
00:06:31.380
Here we've got a black Viking, a black female Viking. We've got what looks like an Asian Viking.
00:06:39.940
And then, I don't know, is that The Rock down there? That's his character from Moana, I think.
00:06:48.180
Again, a Viking has literally never looked like any of that. That's not what any Viking ever looked like, ever, in history.
00:06:57.100
And Frank and other Gemini users took turns trying their hardest to get Gemini to produce an image of a white person.
00:07:03.780
Peachy Keenan, for example, tried to get Gemini to generate an image of the founders of Fairchild Semiconductor.
00:07:08.920
The AI flatly refused that request, saying that it violated policy restrictions, presumably
00:07:14.500
because white guys founded Fairchild Semiconductor.
00:07:17.680
And for other prompts, like requests to draw the founding fathers or a bunch of British men,
00:07:22.180
Gemini simply generated images of black people.
00:07:26.260
They even made sure that its images of Nazis contained a diverse, non-white group of people.
00:07:32.820
Now, after thousands of images like this began circulating, a guy working on the Gemini team responded.
00:07:40.900
He said, in essence, that they're aware of issues with Gemini misrepresenting historical
00:07:46.720
figures, but then, you know, he doubled down on the need for DEI in artificial intelligence.
00:07:51.580
So that everybody feels seen or valued or whatever.
00:07:55.400
And of course, the way to make everyone feel seen is to pretend that an entire race of people doesn't exist. Making sure that they are not seen at all is how you make everybody feel seen.
00:08:05.320
At no point did any Google representative explain why their AI does not recognize the existence
00:08:12.220
of white people or why it goes to extreme lengths to exclude white people from history.
00:08:15.660
Like, you know, there was no accounting for this, even though there has to be an explanation.
00:08:23.900
They obviously put a line of code into this thing to come up with this result.
00:08:30.880
They wouldn't explain it, so I went looking for an explanation.
00:08:32.800
I came across a woman named Jen Gennai, who bills herself on her LinkedIn as the founder
00:08:38.980
of Google's global responsible AI operations and governance team.
00:08:43.960
In that capacity, Gennai says that she ensured Google met its AI principles, our company's
00:08:48.680
ethical charter for the development and deployment of fair, inclusive, and ethical advanced technologies.
00:08:52.680
She says that she took a, quote, principled, risk-based, inclusive approach when conducting ethical
00:08:58.240
algorithmic impact assessments of products prior to launch to ensure that they didn't cause
00:09:03.560
unintended or harmful consequences to the billions of Google's users.
00:09:08.220
And apparently, you know, a harmful consequence would be showing an image of a white Viking.
00:09:14.600
That might be very harmful to somebody, and so we've got to make sure that we don't let that happen.
00:09:18.480
Now, currently, Gennai says that she's an AI ethics and compliance advisor at Google.
00:09:24.420
Now, what Gennai doesn't mention on her LinkedIn is that her goal for a long time has been to treat
00:09:29.140
white people differently based on their skin color.
00:09:36.300
Three years ago, Gennai delivered a keynote address at an AI conference in which she admitted all of this.
00:09:41.880
After introducing herself with her pronouns, which, by the way, are she, her, in case you're wondering,
00:09:45.560
Gennai explains what her philosophy on AI is, and here's what she says.
00:09:52.120
We do work together day to day to try and advance the technology and understanding around responsible AI.
00:09:59.560
So today, I won't be speaking as much from the Google perspective, but from my own experience.
00:10:07.480
I've led about six different teams, mostly in the user research, the user experience area, and now in the ethical user impact area.
00:10:16.380
So I'll be sharing some of my learnings from across that time, but also some of my failures and challenges.
00:10:22.100
I think it's okay to talk about things that you've made mistakes in because we will make mistakes.
00:10:26.680
When we're trying to be good allies, when we're trying to be anti-racists, we will make mistakes.
00:10:32.440
The point is, though, to keep trying, to keep educating yourself, and getting better day to day.
00:10:40.340
It's okay to talk about the things you've made mistakes in, says Jen Gennai.
00:10:46.720
When we're trying to be good allies, when we're trying to be anti-racists, we will make mistakes.
00:10:51.780
Well, you know, in retrospect, after the launch of Gemini, that would turn out to be kind of a massive understatement.
00:10:57.920
The kind of mistakes that Jen Gennai is talking about in this keynote aren't mistakes like eliminating all white people from Google's AI,
00:11:05.060
which seems like a pretty big mistake, even though, again, not really a mistake.
00:11:07.480
It's obviously deliberate. Instead, she's talking about failing to live up to the racist ideals of DEI,
00:11:12.560
which apparently means treating non-white employees differently. Watch.
00:11:18.040
A corporate study found that talented white employees enter a fast track on the corporate ladder,
00:11:23.700
arriving in middle management well before their peers,
00:11:26.380
while talented black, Hispanic, or Latinx professionals broke through much later.
00:11:31.100
Effective mentorship and sponsorship were critical for retention and executive-level development
00:11:38.320
So this leads me into sharing an inclusion failure of mine, one of many, but just one that I'll share so far.
00:11:45.280
I messed up with inclusion almost right away when I first became a manager.
00:11:49.240
I made some stupid assumptions about the fact that I built a diverse team,
00:11:53.100
that then they'd simply feel welcome and will feel supported.
00:11:59.100
and expected that that would lead to equally good outcomes for everyone.
00:12:02.540
That was not true. I got some feedback that a couple of members of my team didn't feel they belonged
00:12:08.000
because there was no one who looked like them in the broader org or our management team.
00:12:14.440
First, I shouldn't have had to wait to be told what was missing.
00:12:17.700
It was on me to ensure I was building an environment that made people feel they belong.
00:12:21.960
It's a myth that you're not unfair if you treat everyone the same.
00:12:25.340
There are groups that have been marginalized and excluded because of historic systems and structures
00:12:30.800
that were intentionally designed to favor one group over another.
00:12:34.400
So you need to account for that and mitigate against it.
00:12:37.480
Second, it challenged me to identify mentoring and sponsorship opportunities for my team members
00:12:41.900
with people who looked more like them and were in senior positions across the company.
00:12:46.180
Yeah, of course, the irony here is that this woman, Jen, sounds like she's Scottish or Irish or whatever.
00:12:54.500
But the funny thing is that if you were to ask Google's AI for an image of an Irish person,
00:12:59.100
it would not produce any image that looks anything like her.
00:13:02.460
It would give you a bunch of images of, like, Cardi B and Sexy Red or something.
00:13:07.400
Sexy Red does have red hair. It's like, maybe she is Irish.
00:13:09.840
This is the head of ethics of Google AI, a senior manager, saying that it's a bad idea to treat everyone the same,
00:13:16.340
She is explicitly rejecting this basic principle of morality.
00:13:20.120
And instead, she says that she learned that she has to treat certain groups differently based on their race.
00:13:25.600
And therefore, she says those demographic groups are entitled to unique treatment and mentorship opportunities.
00:13:31.540
Now, later in this address, she goes on to explain what equity means in her view.
00:13:35.620
And this is where things really kind of get hilarious, to the extent that you can laugh at someone this low-IQ.
00:13:45.920
Allyship involves the active steps to support and amplify the voice of members of marginalized groups
00:13:53.520
In the workplace, this can involve many things, from being an active mentor or sponsor
00:13:58.020
to those from historically marginalized communities,
00:14:01.060
to managers of managers setting specific goals in hiring and growth for their teams,
00:14:05.200
to ensure fairness and equity of opportunity and outcomes for underrepresented populations.
00:14:11.260
However, back to the point about language being very important,
00:14:15.880
using the title of ally can also come across as othering.
00:14:19.900
So I always state both the groups I'm a member of and support,
00:14:23.780
as well as those that I'm more of a mentor and a sponsor of,
00:14:28.420
to ensure that it doesn't look like that I'm othering others.
00:14:30.940
So, for example, I would say I'm an ally of women, black people, LGBTQ.
00:14:37.220
I want to say I'm a champion advocate of all of these groups,
00:14:44.440
Again, it's worth emphasizing, these are the people that are behind the AI systems
00:14:48.680
that are going to be and really already are ruling the world.
00:14:53.240
But I want to repeat what she said, because it's hard to believe when this is said out loud.
00:14:58.640
using the title of ally can come across as othering.
00:15:01.440
So I always state both the groups I'm a member of and support,
00:15:03.940
as well as the ones I'm more of a mentor and sponsor of,
00:15:07.020
to ensure that it doesn't look like I'm othering others.
00:15:12.880
This is the brain trust at Google behind an AI that has access to all of our data.
00:15:16.640
She's incapable of speaking without using an endless stream of vapid DEI cliches
00:15:22.620
This supposedly is an original enterprise, artificial intelligence,
00:15:25.760
and it's being overseen by maybe the least original, least intelligent woman imaginable.
00:15:32.220
On top of everything else, the wacky left-wing stuff,
00:15:35.440
you're dealing with the most unimpressive people that you could imagine
00:15:38.980
who are in charge of this technology that is just incomprehensibly powerful.
00:15:46.640
And this is the kind of person who doesn't want to other others,
00:15:51.340
I mean, if someone is an other, then how do you not other them, exactly?
00:15:58.040
And by the way, just so you know, the word other, if you check the dictionary,
00:16:00.860
just means a person or thing that is distinct from another person or thing.
00:16:06.080
So if somebody is an other, it just means that they're not you, is all.
00:16:09.980
So if you're recognizing that they're an other, if you're making them an other,
00:16:12.500
you're just, you are recognizing them as a distinct entity from yourself.
00:16:17.660
So not othering them means that you are not recognizing them as a distinct human entity.
00:16:23.580
It means that, I suppose, we have to pretend that all people are indistinct blobs,
00:16:28.680
you know, all lumped together into this great, ambiguous blob that we call humanity.
00:16:34.900
Now, none of this makes any sense, but she has made it very clear that this DEI word salad
00:16:39.340
is the guiding philosophy behind Google's new AI.
00:16:42.720
There's no firewall between her and the product.
00:16:46.980
What does responsible and representative AI mean?
00:16:49.680
I've talked about my team, but that's only one definition.
00:16:52.480
So for us, it means taking deliberate steps to ensure that the advanced technologies
00:16:56.880
that we develop and deploy lead to a positive impact on individuals and society more broadly.
00:17:02.820
It means that our AI is built with and for everyone.
00:17:07.040
We can't just assume noble goals and good intent to prevent or solve ethical issues.
00:17:12.740
Instead, we need to deliberately build teams and build structures that hold us accountable
00:17:17.980
to more ethical outcomes, which for us, the ethical outcomes in Google would be defined
00:17:22.700
as our AI principles, which I discussed earlier.
00:17:25.680
Now, it's easy to point and laugh at imbeciles like this and the products that Google has created.
00:17:30.880
On some level, it's genuinely hilarious that an AI product can be so useless that it can't
00:17:35.640
generate images of white people, even white historical figures.
00:17:39.160
It's also amusing in a way that Gemini is so unsubtle and ham-fisted that it
00:17:43.520
straight up refuses to answer questions about, for example, atrocities committed by communist
00:17:47.940
governments or as someone else asked about the Zoom exploits of CNN commentator Jeffrey Toobin
00:17:55.260
But the truth remains that the people behind Gemini have extraordinary power.
00:17:58.540
I mean, this debacle makes it very clear that the AI algorithms underlying products that
00:18:03.900
millions of people actually use, like Google, are completely unreliable and worse.
00:18:11.660
They're downranking unapproved viewpoints and disfavored racial groups.
00:18:15.680
And they're promoting the laziest possible brand of neo-Marxist ideology at every opportunity.
00:18:21.300
And they're doing it also to influence the next presidential election, by the way.
00:18:25.260
You might remember that after Donald Trump won in 2016, Breitbart posted leaked footage
00:18:29.820
of Google executives grieving during an all-hands meeting.
00:18:35.660
I certainly find this election deeply offensive.
00:18:41.980
It did feel like a ton of bricks dropped on my chest.
00:18:50.100
Is there anything positive you see from this election result?
00:19:00.560
Now, in other parts of the video, they go on to say that the election is the result of
00:19:05.840
the people and voting and that they accept the results.
00:19:08.640
But Google issued a statement about the video, saying nothing was said at that meeting or any other meeting to suggest that any political bias ever influences the way we build or operate our products.
00:19:19.400
To the contrary, our products are built for everyone.
00:19:29.680
It was at this moment that Google decided that downranking conservative websites wasn't enough.
00:19:33.620
In order to really influence elections, they decided that they needed to develop an AI that
00:19:38.240
will force-feed DEI and anti-white racism on everyone at every opportunity.
00:19:43.180
Their only mistake, which is the same mistake they made in that video back in 2016, is that they were too obvious about it.
00:19:48.800
And now everybody knows exactly where Google stands.
00:19:52.140
We have a pretty good idea what our future AI-driven dystopia will look like, or already does look like.
00:20:02.220
If your house is feeling chilly right now, you may need to consider window replacements.
00:20:12.280
You know, if you haven't yet replaced the windows in your home, it can be an intimidating project
00:20:16.980
Luckily, there's a company that will do the work for you.
00:20:19.940
Renewal by Andersen is your one-stop shop for window design, manufacture, and installation.
00:20:24.980
Windows play a crucial role in regulating indoor temperatures.
00:20:27.420
If you notice a spike in your heating or cooling bills, it may be due to inefficient windows.
00:20:33.580
Renewal by Andersen offers limited, fully transferable, and best-in-the-nation warranty coverage.
00:20:37.820
Right now, Renewal by Andersen is offering a free in-home consultation on quality, energy-efficient,
00:20:41.720
affordable windows or patio doors with special financing options.
00:20:44.820
Text Walsh to 200-300 for a free consultation to save $375 off every window and $775 off every patio door.
00:20:52.020
These savings won't last long, so be sure to check it out by texting Walsh to 200-300.
00:20:58.640
Texting privacy policy and terms and conditions posted at textplan.us.
00:21:02.220
Texting enrolls for recurring, automated text marketing messages.
00:21:08.060
Go to windowappointmentnow.com for full offer details.
00:21:14.180
A cellular outage on Thursday hit thousands of AT&T users in the United States, disrupting
00:21:18.320
calls and text messages, as well as emergency services in major cities across the country.
00:21:23.220
More than 73,000 incidents were reported around 8.15 a.m.
00:21:27.480
AT&T said some of its customers were facing interruptions and it was working urgently to restore service.
00:21:32.260
And then it turns out that a bunch of other carriers were affected as well.
00:21:38.740
I think the 70,000 figure, which was as of this morning, is most likely a huge undercount
00:21:44.640
So it seems to be a much wider outage than that.
00:21:50.140
If you had tens of thousands, potentially hundreds of thousands or more, I don't know,
00:21:53.960
people affected by a cellular outage, what caused that?
00:21:59.040
And lots of people on the internet are speculating that it could be some kind of attack or maybe
00:22:02.800
it's a false flag event or a dry run for a bigger thing that's coming down the pike.
00:22:07.840
But the media is now reporting that it may trace back to a solar flare.
00:22:17.580
Maybe you're looking at your phone and it's saying, SOS, what's going on?
00:22:20.960
My husband had that this morning and he's freaking out.
00:22:32.080
If you're experiencing that, it may be a result of space weather.
00:22:37.380
I'm going to do my best to explain what's going on.
00:22:41.380
So there was a strong solar flare event that happened just after midnight.
00:22:46.020
And they actually captured an image of it right here, okay?
00:22:53.480
But R3, that's for radio communications, it's on a scale of one to five.
00:23:01.300
That means it impacts radio communications for a few hours after this happens.
00:23:07.220
And so right now that could be impacting some of our technology.
00:23:11.920
And sadly, we're entering a solar maximum where we're going to see more and more solar storms, solar...
00:23:18.040
Well, I'm glad that she was able to begin that news report about this, you know, serious issue,
00:23:24.160
She was able to begin by giving us a little anecdote about the conversation she had with her husband at 3 a.m.
00:23:30.760
It's like, I guess people want everything personalized these days.
00:23:44.980
The internet, as you might expect, is not buying this.
00:23:48.060
Lots of comments are treating the solar flare explanation as somehow totally absurd.
00:23:53.360
You know, I'm seeing a lot of people saying, nah, no way.
00:24:01.240
They insist that this is some kind of plan, devised scheme by, you know, shadowy forces.
00:24:09.220
I'm not sure what they would have gained from it, given that this was a relatively minor annoyance.
00:24:13.900
So, I guess it's possible that shadowy forces executed some huge plan to just sort of irritate everybody for a few hours.
00:24:25.020
And I don't say this to downplay or dismiss the reality that there are, in fact, evil forces out there scheming different ways to make our lives miserable.
00:24:34.060
My only point is that, you know, and I find myself having to make this point with relative frequency these days.
00:24:39.340
There are scheming bad people out there, but not everything is their doing.
00:24:48.680
And so, when I saw this and I heard about the solar flare, I just immediately knew.
00:24:53.000
I knew as soon as I went on Twitter what I was going to see.
00:24:55.400
Nothing but conspiracy theories, and a lot of it's coming from the right.
00:25:03.320
It's always like this. There are things that happen in the world and in the universe that just happen.
00:25:14.060
Okay, we do live in a physical reality where all kinds of things we can't control happen.
00:25:26.320
But not everything is. It seems that people have sort of moved away from this a little bit recently.
00:25:35.680
But we went through a while there where every mass shooting was a false flag.
00:25:42.980
And it seemed like for years, in some corners of the right, that was the response. Every mass shooting was automatically, I know what really happened, here's a false flag.
00:25:53.820
I mean, this is a thing that really does happen.
00:25:57.560
So, to immediately assume that it can't be anything but some deeper conspiracy is ridiculous.
00:26:08.180
You know, when I hear people say, you know, no, no way.
00:26:30.180
It's an enormous ball of hellish gas, so big that a million Earths could fit inside it.
00:26:36.800
This thing is burning at 27 million degrees Fahrenheit at its core.
00:26:42.760
It's this incomprehensibly enormous nuclear reactor that is just a hop, skip, and a jump away from us, again, in galactic terms.
00:26:50.820
And, yeah, that thing could incinerate all life on the planet in the blink of an eye.
00:26:58.720
It probably won't anytime soon, but eventually it will.
00:27:02.480
Probably not anytime soon, but it could.
00:27:06.460
Like, the sun could belch tomorrow and send us all back to the Stone Age.
00:27:15.600
That's how fundamentally helpless and vulnerable we are.
00:27:18.340
All of our technology, all of our advancement, all of it could be rendered moot, destroyed in an instant by forces that have nothing to do with anyone on this planet.
00:27:28.820
This is what I'm always trying to explain to the climate change alarmists that are running around.
00:27:33.900
It's like, we don't get to call the shots on this thing.
00:27:41.340
When it comes to the weather and the climate and solar flares and the sun and all these things, it's just we are helpless.
00:27:48.680
And if it does, then we're screwed and that's it.
00:27:53.060
You think about the Carrington event, which was the largest solar flare on record.
00:27:58.900
And, you know, they haven't been keeping records that long.
00:28:04.980
And it sent out as much energy as the most powerful thermonuclear bomb ever created times 5 billion.
00:28:19.760
So it was the equivalent of 5 billion of the most powerful nuclear bombs ever created.
00:28:25.740
So imagine that, you know, two-thirds of the people on the planet all owned their own most powerful thermonuclear bomb.
00:28:35.920
And they all set them off at the exact same time.
00:28:42.400
And at that point, in 1859, it knocked out telegraph lines.
00:28:52.200
There were people at telegraph stations that were getting electrical shocks because of this thing.
00:28:56.440
Now, imagine what would have happened if modern communication existed back then.
00:29:03.580
Like, say goodbye to your phone and the Internet and probably for a long time.
00:29:13.300
And I think that this is some of the psychology behind the people who overdo it on the conspiracy theories.
00:29:24.820
And I think part of the reason for that is there's a certain comfort, I guess, we take in thinking that we are in control of everything.
00:29:34.920
Even if not me individually, but, like, even imagining that there's a human conspiracy out there that is responsible for everything.
00:29:43.340
That is more comforting than imagining that it's totally outside of the control of anybody on the planet.
00:29:50.160
And I think people don't want to confront that.
00:30:00.960
Anyway, here's an easier thing to psychoanalyze, from the Daily Wire.
00:30:06.140
Just to make sure that the borrowers of student loans know whom to thank for escaping responsibility for fully paying back their student loans.
00:30:12.160
President Biden will send over 150,000 borrowers a personal email reminding them that he's their guy.
00:30:17.760
The plan to let borrowers off the hook will cost the American taxpayers $1.2 billion.
00:30:22.040
The report adds that the administration has canceled $138 billion in debt for almost 4 million borrowers since Biden took office in 2021.
00:30:30.960
The email states, quote, congratulations, all or a portion of your federal student loans will be forgiven because you qualify for early loan forgiveness under my administration's save plan.
00:30:39.860
From day one of my administration, I vowed to fix student loan programs so higher education can be a ticket to the middle class, not a barrier to opportunity.
00:30:47.520
This is not really the point, but it's always sort of creepy and depressing.
00:30:55.080
First of all, when you hear this from these politicians, it's always a ticket to the middle class. Your ticket to the middle class. The idea that the only way to access the middle class is to spend $100,000 on a college education, that a degree should be the entry point for the middle class, is insane.
00:31:33.780
I mean, there are many careers you can get into and be comfortably middle class or above.
00:31:40.500
I mean, you can become wealthy without a college education.
00:31:45.240
But to the extent that it is true that many kinds of jobs that give you a middle-class sort of income require a college education, that is a problem.
00:32:03.560
You know, they don't naturally require it.
00:32:06.860
So that's the problem we should be dealing with.
00:32:10.200
But instead, Joe Biden and the Democrats, they see that as a feature, not a bug.
00:32:18.500
And so they see no problem with the idea, to the extent that it's true, that access to the middle class depends on a college education.
00:32:28.840
And on top of it, the bar that he's setting for everybody is the middle class.
00:32:36.860
And all the people saying that, oh, no, just be happy being middle class.
00:32:46.020
Okay, these are all wealthy people who would rather be dead than be middle class, most of them.
00:33:00.760
And we'll throw you some money every once in a while.
00:33:03.920
We'll pay off some of your student loans for you. Or rather, we're not going to pay it off; the taxpayers will.
00:33:15.600
Which isn't to say, obviously, that there is anything wrong with being middle class.
00:33:21.180
The problem is when you have these elites, who, again, are not middle class, who are presenting that to you as your ceiling, as the pinnacle of what you should try to achieve.
00:33:36.400
Here's Biden bragging a little bit more about the latest student loan forgiveness scheme.
00:33:42.920
Early in my term, I announced a major plan to provide millions of working families with debt relief for their college student debt.
00:33:51.120
Tens of millions of people in debt were literally about to be canceled, their debts.
00:33:55.600
But my MAGA Republican friends in the Congress, elected officials in special interest, stepped in and sued us.
00:34:05.620
I announced we were going to pursue alternative paths for student debt relief for as many borrowers as possible.
00:34:12.320
And that's the effort that's been underway the last two years.
00:34:18.160
But I fixed it to make it the most affordable repayment plan ever.
00:34:21.720
Before I took office, student borrowers had to pay 10 percent of the discretionary income on a monthly basis.
00:34:28.900
If they made less than if they didn't have enough.
00:34:30.880
And so he admits that the Supreme Court says we can't do this, but we're doing it anyway.
00:34:36.880
So these are the people that cherish our democracy and our system of government, of course.
00:34:44.140
But this is another absolutely shameless bribery scheme funded by the taxpayers, of course.
00:34:52.000
Because, once again, I must insist on reminding everybody that this is not loan forgiveness.
00:35:14.920
And now that it has happened, you can't wave a magic wand and make it so that it didn't happen.
00:35:23.140
So real student loan forgiveness or, you know, erasing student loans.
00:35:27.620
A way to really do that, to erase it, is to get in a time machine and go back and stop the person from taking the loan out to begin with.
00:35:37.560
Which, if that was possible, I would say, yeah, that would be the best approach for many of these people.
00:35:54.800
You know, and so either the person who borrowed the money will be made to hold the bag.
00:36:00.160
Or the person who lent the money, the party that lent the money, will be left holding the bag.
00:36:04.940
Or a third party, which in this case, with a student debt, is the taxpayers.
00:36:12.120
That third party will be given the bag and told to fill it with a billion dollars or whatever.
00:36:24.640
There is no scenario where that will not be the case.
00:36:35.100
Even if I agreed with all of the arguments about how a lot of these college kids are taken advantage of.
00:37:07.900
And it's going to leave somebody in an unfair situation.
00:37:15.760
So, of all these situations, what is the most fair?
00:37:18.820
Is it the most fair to make the person who took the loan out pay it back?
00:37:25.060
Or is it the most fair to make someone who didn't take the loan out pay it back?
00:37:30.620
Because if it is unfair to make the person who took the loan out pay it back,
00:37:35.280
how much more unfair is it to make someone who didn't take it out pay it back?
00:37:41.820
And when it comes to student loan, quote unquote, forgiveness, that is really the only point that matters.
00:37:48.240
Here's a video that was posted by a guy named Charles Tanner.
00:37:53.060
And he was granted clemency by Donald Trump back before Trump left office, obviously.
00:37:57.680
And this guy is now a big Trump supporter because he was let out of prison.
00:38:08.640
And he says that he was a nonviolent drug offender who was given two life sentences for a first-time offense.
00:38:19.520
And that's why he deserves to be let out and he wants to see a lot more people let out.
00:38:22.600
And here he is in this video where he talks about that and why he's a big Trump fan.
00:38:29.680
My name is Charles Duke Tanner and I was sentenced to a double life sentence for my first arrest for a nonviolent drug offense in 2004.
00:38:37.840
I lost all my appeals and I was denied clemency by the Obama administration.
00:38:42.720
It took 16 plus years before President Trump granted my clemency and allowed me to go home in 2020.
00:38:51.140
And now it hurts me to the core to see the same system going after a former president.
00:38:58.520
This is what blacks have been going through since day one.
00:39:01.080
If we allow this to happen to the former president, we can only imagine what's going to happen to the rest of the country.
00:39:14.960
Okay, so now, here's what he said there at the end about Donald Trump and how they shouldn't be going after him and it's unjust.
00:39:25.500
And the fact that this guy's out of prison is a problem.
00:39:28.000
He should be in prison for the rest of his life.
00:39:30.060
Now, I saw this video floating around.
00:39:33.900
A few Trump supporters were reposting it in a favorable way, but nobody of special note.
00:39:42.480
My fear, though, is that the Trump campaign leading into the general election will lean into stuff like this and say, oh, look, this is a black guy who likes us.
00:39:53.960
He's saying other black people should support Donald Trump because Trump let him out of prison.
00:39:58.200
Listen, I think that would be a disastrous political mistake, and don't do it.
00:40:05.800
Because Trump letting guys like this out of prison was the worst thing that he did while he was in office.
00:40:15.620
And what people want now is they want actual law and order, actual law and order.
00:40:25.060
They're not in the mood to be sympathetic to criminals.
00:40:32.660
Now, yeah, this guy, you get his vote by saying, yeah, I'll let criminals out of jail.
00:40:36.360
You're not going to win an election with this guy, OK?
00:40:39.600
The way you win an election is by speaking not to criminals, but to just normal Americans, wherever they happen to live,
00:40:46.940
And for those people, what they want is they want criminals in jail.
00:40:50.100
And so if I'm Donald Trump, the clemency and letting the criminals out of jail, I'm pretending that didn't happen, OK?
00:41:03.200
And we're taking the Nayib Bukele approach in El Salvador.
00:41:07.660
And we're going, you know, we're going hardcore after these people.
00:41:11.840
And just to make the point, just using this guy, OK?
00:41:16.920
Because if you hear this kind of thing and you think, and you're tempted to be sympathetic,
00:41:25.620
well, because you hear, oh, first-time drug offender, nonviolent, two life sentences, that's obscene.
00:41:34.280
What we have to realize is that you're being manipulated.
00:41:39.480
Because he says that he was a nonviolent first-time drug offender, which makes it sound like he was caught with a bag of weed or something.
00:41:47.200
And, OK, that argument was compelling to Trump, who let him out of prison.
00:41:52.380
Well, the reality is that Charles Tanner was the leader of a drug gang.
00:41:56.780
He was found guilty of trafficking in hundreds of thousands of dollars in drugs, hardcore drugs that he was bringing into our communities.
00:42:08.380
Well, first of all, he was a committed criminal deep in the drug-dealing game.
00:42:16.780
Also, the fact that this was a first-time offense is irrelevant.
00:42:21.620
Like, obviously, he committed many more offenses than what he was arrested for.
00:42:33.380
The fact that you weren't arrested for them, what does that have to do with anything?
00:42:37.900
Yeah, you know, my first time getting caught for it.
00:42:39.720
I don't give a s*** if it was the first time you were caught for it.
00:42:46.460
So if you arrest a drug kingpin, which is what this guy was, it's not his first offense.
00:42:56.080
If it's his first time being arrested, it's because of dumb luck, number one.
00:43:00.140
And number two, it's because other people took the fall for him in the past.
00:43:05.040
Okay, because guys like this, they surround themselves with people who are the first line of defense,
00:43:10.120
and those are the ones who get arrested because they're the ones who are actually out on the street.
00:43:17.860
And so that's why the ringleader, the kingpin type, often is not arrested or doesn't have the same rap sheet.
00:43:26.840
Okay, you go down to Mexico, go down to Central America, the cartels, the people running the cartels,
00:43:32.540
and you compare their rap sheet to like the guys that are actually pushing the drugs.
00:43:36.680
A lot of the cartel leaders don't have as many arrests, or potentially any at all.
00:43:40.700
Does that mean it's a, oh, I'm a first-time offender?
00:43:48.620
And second, calling drug trafficking nonviolent is insane.
00:43:57.720
I want everyone who says that, you need to stop.
00:44:01.100
The next time you find yourself calling drug trafficking nonviolent, I want you to immediately smack yourself in the face.
00:44:09.160
Okay, and I'm not calling anyone else to do it.
00:44:12.580
I'm saying do it to yourself, just to slap some sense into yourself.
00:44:20.500
For one thing, they're trafficking in poison that kills thousands of Americans every single year.
00:44:26.820
They're trafficking in a substance that is destroying communities all across this country
00:44:31.020
and putting thousands of people in the ground and destroying many more families and lives.
00:44:37.620
Okay, so to call that nonviolent is just, you're using a definition of violence that is so limited that it is meaningless.
00:44:47.560
And second, I don't mean to burst your bubble, there's no such thing as a nonviolent drug gang.
00:44:57.140
How do you think that they, what do you think is happening?
00:44:59.240
You think that they write sternly worded letters to each other when they have a disagreement?
00:45:03.820
Do you think they're having tickle fights with each other?
00:45:06.040
You think, what do you think, they're doing a thumb war?
00:45:08.360
You think when there's, when there's a dispute over street corners, they do, okay, rock, paper, scissors, ready?
00:45:17.560
There's no such thing as nonviolent here. This guy is not nonviolent.
00:45:25.300
So, now, were you arrested for any of the violence that you perpetrated or that you caused to happen?
00:45:35.220
But giving a life sentence to a drug trafficker, you see, the reason why you give the life sentence
00:45:41.080
is because all of this is baked in, and you, being a smart person, recognize all of this.
00:45:56.980
If he's a drug trafficker, he's already committed a whole bunch of other crimes.
00:46:00.320
So, you bake all of that in logically and that's how you end up with a life sentence.
00:46:06.840
Which is not only just, but it is, if anything, lenient.
00:46:12.720
If any, the conversation we should be having that I've talked about before is, should we
00:46:22.360
That at least would be, if we're having that conversation, then I know we're making some
00:46:29.240
But the fact that we're still debating about whether it's worth putting a drug trafficker
00:46:34.460
in jail for the rest of their lives, it's like, okay, we've learned nothing.
00:46:39.820
And I don't want to hear any complaints from anybody about crime in the streets and cleaning them up.
00:46:44.280
If you think the guys like this should be let out of prison, I don't want to hear you complain
00:46:48.440
Because this, what do you think is required to clean up the crime?
00:46:52.400
Okay, what do you think it entails?
00:46:57.320
It is an ugly, rough thing where you take guys like this and you throw them in jail and you
00:47:05.340
And if you don't want to do that, then you're not serious about cleaning up the crime.
00:47:08.180
I swear, if I see this guy show up at a campaign rally, I'm going to, well, I'm
00:47:15.960
That's all that's really going to happen, but I will be very annoyed.
00:47:21.080
Our friends at ZipRecruiter conducted a recent survey and found that the top hiring challenges
00:47:28.940
employers face in 2024 is a lack of qualified candidates.
00:47:32.280
But if you're an employer and need to hire, the good news is that ZipRecruiter has smart
00:47:35.820
tools and features that help you find more qualified candidates quickly.
00:47:39.440
Right now, you can try it for free at ZipRecruiter.com slash Walsh.
00:47:42.780
As soon as you post your job, ZipRecruiter's powerful matching technology shows you candidates
00:47:46.400
whose skills and experience match what you need.
00:47:49.240
And then you can use ZipRecruiter's invite to apply feature to send your top candidates
00:47:53.380
a personalized invitation, encouraging them to respond to your job posts.
00:47:57.880
Let ZipRecruiter help you conquer the biggest hiring challenge of finding qualified candidates.
00:48:02.500
See why four out of five employers who post on ZipRecruiter get a quality candidate within the first day.
00:48:07.260
Just go to my exclusive web address right now at ZipRecruiter.com slash Walsh.
00:48:13.360
Again, that's ZipRecruiter.com slash W-A-L-S-H.
00:48:20.200
Shockingly, a number of comments are disagreeing with my opinion that the Beyonce country song is okay.
00:48:30.900
You know, especially in comparison, given where the bar is.
00:48:41.280
The only explanation I can think of is that you've never listened to music before.
00:48:46.240
Wasson says, the Beyonce country song is catchy.
00:48:52.400
H-Man says, tell us you're gay without saying it.
00:48:57.360
Lillian Humphrey says, Matt, to say you shock me is an understatement.
00:49:00.260
Beyonce's a grifter and she'll jump off as soon as she's done getting her accolades.
00:49:07.640
It's Beyonce's boring voice put through an AI-generated bluegrass filter.
00:49:11.500
There's already a thousand songs you could be listening to instead.
00:49:13.800
Look, you might not like that I'm the leader of the Beyhive now.
00:49:17.400
I don't even think the Beyhive likes it, but it's how it is.
00:49:21.900
It might not fit in your little picture of the world, okay?
00:49:25.100
It might not fit inside your box, the box you want to put me in.
00:49:38.960
And I think that's what disturbs you all so much.
00:49:42.560
And I stand by what I said, the Beyonce country song, it's okay.
00:49:55.580
You know, it's like a 6.1.
00:50:13.460
Maybe that's what I'm cluing into a little bit,
00:50:15.960
is that there's a, there's a vibe that could be better.
00:50:18.540
Now, obviously, the lyrics are incredibly stupid.
00:50:25.500
I think Beyonce, if she had a better kind of soulful R&B voice,
00:50:37.060
I'd like to hear an Aretha Franklin level kind of soul R&B singer
00:50:45.980
So I guess Janis Joplin is what I'm asking for.
00:50:51.800
Me and Bobby McGee, I guess, I think you'd call that country-ish.
00:50:56.020
And I'm not saying that Beyonce is Janis Joplin.
00:50:58.720
Just to be clear, you people would probably stone me to death
00:51:07.320
Courage Under Fire is going to be the event of the year.
00:51:09.400
Come and join me on May 24th in Nashville, Tennessee
00:51:13.980
The Courage Under Fire will host some of the top leaders in faith,
00:51:16.580
the pro-life movement, and culture to share in the true, the good, and the beautiful.
00:51:20.000
I'll be speaking alongside Dr. Abby Johnson on how to have courage
00:51:22.700
and stand up for the truth no matter what adversity you face.
00:51:25.320
We'll be joined by some of the most influential leaders
00:51:27.020
in the conservative movement for a night of connection and inspiration.
00:51:30.240
All proceeds from the gala will directly benefit students
00:51:32.360
in need of tuition assistance at Regina Caeli Academy,
00:51:35.060
which is the premier classical homeschool hybrid for Catholic families.
00:51:38.900
VIPs will have access to an exclusive meet and greet with guest speakers
00:51:43.760
If you haven't grabbed your tickets yet, you need to do so.
00:51:58.160
of how a group of male losers who can't win against other men
00:52:00.860
decide to identify as women and join a women's basketball league.
00:52:16.140
This movie is a straight-up and intentional transphobic hate crime.
00:52:38.100
Watch the most triggering comedy of the decade.
00:52:43.400
Lady Ballers, streaming exclusively on Daily Wire Plus.
00:52:48.120
Watch Lady Ballers, the movie that Hollywood didn't make,
00:52:50.500
so we did exclusively on Daily Wire Plus right now.
00:53:01.160
You know, I did not expect that I would one day be in a position
00:53:03.460
where I would have to cancel the National Multiple Sclerosis Society,
00:53:08.600
It's really not so surprising when you think about it.
00:53:10.140
Nearly every large organization of any kind in the country
00:53:13.660
and if it's a medical organization or a nonprofit that deals with a medical issue,
00:53:18.240
you can be certain that it is run by far-left wackos.
00:53:22.600
That's how you end up with this kind of situation, which we'll get into now.
00:53:25.940
So backing up a few days, last week the MS Society kicked out a 90-year-old volunteer
00:53:31.800
who had been working with the organization for 60 years.
00:53:35.340
Now just to put that in perspective, this woman, Fran Itkoff,
00:53:38.800
has been volunteering for this organization for nearly as long as the organization has existed.
00:53:44.280
She began volunteering after her husband was diagnosed with the condition,
00:53:47.360
and she continued giving her time even after her husband died 20 years ago.
00:53:51.300
And with that sort of history and track record,
00:53:54.120
you would think that the organization would cherish this woman.
00:53:57.380
They would have a deep sense of loyalty to her and respect for her,
00:54:01.900
and certainly wouldn't even consider terminating their relationship with her
00:54:06.060
except under the most extreme circumstances imaginable,
00:54:10.820
where somehow her behavior made the decision inevitable.
00:54:15.600
Now it's hard to imagine what exactly a 90-year-old volunteer could do or say
00:54:22.500
it would have to be over-the-top, outrageous, and offensive.
00:54:28.640
Or you would think anyway if you were very naive
00:54:30.900
and didn't understand how the world works these days.
00:54:35.620
not for doing anything outrageous or offensive at all,
00:54:44.960
Apparently it all began when Fran was asked by the organization
00:54:56.260
likely introduced herself a thousand times to a thousand different people,
00:54:59.180
and never once had been asked to give her pronouns.
00:55:11.180
this is something that this elderly woman has never encountered
00:55:17.180
That's because this whole pronoun ritual was invented 45 seconds ago
00:55:25.620
because we spend too much time on the internet.
00:55:29.400
And that was the crime she committed, apparently.
00:55:31.580
So here she is last week explaining the situation to Libs of TikTok.
00:55:40.560
And I'd seen it on a couple of letters that had come in after the person's name.
00:55:47.840
They had the pronouns, but I didn't know what that meant.
00:56:00.260
And so she said that meant that they were all inclusive,
00:56:04.580
which didn't make sense to me because it sounds like you're labeling for females
00:56:12.960
and not males if you're just putting in she, her.
00:56:16.500
She said that she was just asking her what it meant to have a conversation.
00:56:20.380
So as a 90-year-old who didn't know what it meant,
00:56:24.240
you know, she's not street savvy to find out what it meant.
00:56:27.620
And when she said that they were required to use it to be inclusive,
00:56:35.020
the MS Society as a whole and the Long Beach group has just always been inclusive.
00:56:41.600
A few days later, it was on a Friday, at 4:58.
00:56:51.000
I got an email from her saying that they were sorry,
00:56:55.600
but they had to ask me to step down as a volunteer for the MS Society.
00:57:02.440
And the reason being that you're not inclusive enough.
00:57:06.560
The verbiage she said was that she didn't abide by their diversity, equity, and inclusion,
00:57:15.580
and she can't be a part of the MS Society as a volunteer,
00:57:20.380
which to me is ironic because they're saying they're being inclusive,
00:57:25.160
but yet they're excluding a 90-year-old disabled woman who has volunteered for over 60 years.
00:57:33.260
So I like her initial response to the thing when she was told we're inclusive.
00:57:37.120
And of course, her first response is because she's familiar with that,
00:57:48.660
We don't tell anyone that they're not allowed to benefit from the charity we do
00:57:54.760
But, of course, that's not what is meant by inclusive when it's used these days,
00:58:01.840
In the name of inclusivity, they're excluding this woman for asking a question.
00:58:05.960
You know, it's not like Fran stridently objected to using the pronouns.
00:58:11.560
look, this is the dumbest thing I've ever heard of.
00:58:16.540
I've never encountered human beings as stupid as you.
00:58:19.800
Now, she would have been entirely justified in saying that,
00:58:25.200
but she didn't because she's a nice, sweet old lady
00:58:27.520
who is looking for clarification, not confrontation,
00:58:32.520
Now, later in the week, the MS Society released a statement
00:58:37.900
As an organization, we firmly believe that we best serve and support
00:58:40.620
those living with MS by creating a space that welcomes all.
00:58:43.380
This is especially true for self-help group leaders
00:58:45.720
who are responsible for leading meetings for people affected by MS
00:58:53.360
because of statements that were viewed as not aligning
00:58:57.540
Fran has been a valued member of our volunteer team
00:59:00.920
We believe that our staff acted with the best of intentions
00:59:03.340
and did their best to navigate a challenging issue.
00:59:07.920
As an organization, we are in continued conversation
00:59:10.180
about assuring that our diversity, equity, and inclusion
00:59:14.980
and we will reach out to Fran in service of that goal.
00:59:21.700
you understand it more than a typical 90-year-old
00:59:23.740
who lives a life blessedly insulated from much of this would,
00:59:37.720
A conversation is what Fran was trying to have originally.
00:59:45.160
and she's like, I don't know what you mean by that.
01:00:06.360
does not mean that they are navigating a challenging issue.
01:00:10.220
It means that they instead are creating an issue
01:00:16.920
in the most deranged and morally abominable way imaginable,
01:00:39.760
as far as the National MS Society expected and hoped.
01:01:01.760
The organization tried to hide under a bed,
01:01:18.180
apologizes to our longtime dedicated volunteer,
01:01:38.180
Fran has been a committed champion for our cause.
01:01:41.720
and support her as a self-help group volunteer leader.
01:01:47.080
While we acted at the time with the best intentions,
01:02:41.800
and continue rowing along like nothing happened,
01:03:05.260
they should have had more conversations with Fran.
01:03:27.220
The problem is that the insane DEI policy exists.
01:03:36.240
for mistreating Fran in the name of that policy.
01:03:40.020
everybody who had a hand in crafting the policy
01:03:46.380
That's what actual accountability would look like.