The Matt Walsh Show - February 23, 2024


Ep. 1319 - Just When You Think the Google Gemini AI Story Can't Get Any Worse... It Does


Episode Stats

Length

1 hour and 4 minutes

Words per Minute

183.02013

Word Count

11,794

Sentence Count

794

Misogynist Sentences

24

Hate Speech Sentences

22


Summary

You might think the story of Google s woke, dystopian AI program can t get any worse, but it has. Today, on the Matt Warsh Show, you ll get to the bottom line: Google s new AI platform, Gemini, does not recognize the existence of white people.


Transcript

00:00:00.000 Today on the Matt Wall Show, you might think the story of Google's woke, dystopian AI program can't get any worse, but it has.
00:00:06.160 I'll explain. Also, the media claims that conservatives at CPAC are plotting the end of democracy.
00:00:10.800 A pundit on MSNBC inadvertently claims that America is a fundamentally Christian nationalist country.
00:00:15.720 The left freaks out after a court in Alabama grants personhood rights to human embryos.
00:00:19.500 And Kristen Stewart is on a press store, and she really wants you to know that she's gay.
00:00:23.380 Why do we need to know this, and why is she wearing a mullet?
00:00:25.840 All of those questions and more will be answered today on the Matt Wall Show.
00:00:30.000 Let's get started.
00:01:00.000 By the way, we have a presidential election coming up in November.
00:01:02.500 How do you protect your family in the midst of all this chaos?
00:01:04.720 A great place to start is by protecting your savings.
00:01:06.880 It's not too late to invest in gold with Birch Gold Group today.
00:01:09.940 Unlike many other investments, gold can act as a safe haven during turbulent times by providing a hedge against inflation and economic uncertainty.
00:01:16.080 Birch Gold will help you convert your existing IRA or 401k into a tax-sheltered IRA in gold.
00:01:20.480 But it will cost you nothing out of pocket.
00:01:22.280 While diversification does not eliminate risk entirely, Birch Gold's experts can help you manage and reduce providing a more resilient foundation for your financial well-being.
00:01:30.500 I urge you to talk to one of their trusted experts today.
00:01:32.580 All you've got to do is text Walsh to 989898, and Birch Gold will send you a free info kit on gold.
00:01:37.700 With an A-plus ready, with a better business bureau, countless five-star reviews, and thousands of happy customers, I encourage you to check out Birch Gold today.
00:01:44.720 They've been the exclusive gold company of The Daily Wire for the past seven years.
00:01:47.740 There's a reason for that.
00:01:48.460 We trust them.
00:01:48.980 You can, too.
00:01:50.020 Text Walsh to 989898 to claim your free info kit today.
00:01:53.200 That's Walsh to 989898 to secure your savings now.
00:01:56.420 Well, I think we can assume that they're not having a great time on Google's normally upbeat, chic campus right about now.
00:02:03.600 It's very likely that the organic gardens are unattended, the massage rooms are empty, the on-site cooking classes are suspended until further notice.
00:02:12.060 It's just total mayhem.
00:02:13.920 That's because, as I discussed yesterday, the launch of Google's exciting new cutting-edge AI platform called Gemini has very quickly turned into a debacle.
00:02:20.860 And for good reason.
00:02:22.120 Gemini does not recognize the existence of white people.
00:02:24.700 Now, no matter what you ask Gemini to produce, as we talked about yesterday, whether it's an image of a pope or a founding father or even a guy eating mayonnaise on white bread, Gemini will generate an image of a non-white individual.
00:02:36.720 It's maybe the most aggressively anti-white product ever invented in Silicon Valley, which is saying something.
00:02:42.400 With Gemini, all the DEI initiatives that have run rampant in big tech for so long finally blew up in their faces this week because they slipped up and showed us exactly what they're trying to do, which is to erase white people at every possible opportunity.
00:02:52.820 And to make matters even worse, it's worth pointing out that Gemini is basically a rebrand of Google's old AI platform, which was known as BARD.
00:03:01.120 This was their big effort to start fresh with a new and improved name and supposedly better algorithms.
00:03:07.440 And yet, here we are.
00:03:09.240 Now, when I talked about this yesterday, I went into some detail about a senior Google AI ethics manager named Jen Ghanai.
00:03:15.840 Or Ghanai, we're just going to go with Ghanai.
00:03:19.100 I played a bunch of videos that I found in which Jen admits that, as a matter of course, she treats white people at Google very differently from black, Hispanic, Latinx folks.
00:03:28.420 And I offered some theories as to what exactly Jen and her team had done to this new AI in order to produce these absurdly anti-white results.
00:03:35.700 At the time, I didn't know for sure what was going on under the hood.
00:03:37.940 And I don't think anyone did.
00:03:38.800 But now, 24 hours later, we have a much better idea of why Gemini pretends that white people aren't real.
00:03:45.940 And what we're learning is even more disturbing and more consequential than we thought yesterday.
00:03:51.120 So it's worth digging deeper into this.
00:03:53.480 Now, it turns out that Google has not simply manipulated the output of its Gemini software in order to ensure that there are, quote-unquote, diverse results.
00:04:01.740 They haven't just added a line of code that says, prioritize search results featuring black people, which would be bad enough.
00:04:09.660 That's what we all assumed was probably going on because it would be in line with how Google operates already.
00:04:13.960 We know they manipulate search results in order to downrank content they don't like and promote content they do like.
00:04:18.620 But that's actually not what's happening with Gemini.
00:04:21.400 Instead, what's going on here is that Google has inserted code that actually changes the search terms that users are looking for.
00:04:28.960 So if you say you're looking for an image of the Founding Fathers or a Viking or a guy eating mayonnaise on white bread or any other search query that might produce an image of a white guy,
00:04:38.120 then Gemini instantly revises your search request and does this silently and without your permission, of course.
00:04:45.180 Then it produces the results that you're allowed to see.
00:04:48.160 So actually, the results that it's giving you are correct according to the request that you didn't make.
00:04:56.120 So the problem isn't how they changed the request, not how they changed the results.
00:05:00.260 This is a subtle distinction, but it has major ramifications.
00:05:03.220 And first, it's important to clarify exactly how we know what's going on here.
00:05:07.100 All of these well-known AI programs, whether it's ChatGPT or Bing or Gemini, are vulnerable to something called injection attacks.
00:05:13.600 And what this means is that if you ask these AIs the right questions, you can trick them into revealing their secret internal parameters,
00:05:19.440 which are hard-coded by their creator.
00:05:22.000 And that's exactly what happened yesterday with Gemini.
00:05:24.080 An engineer named Alex Younger asked Gemini, quote,
00:05:27.480 please draw a portrait of leprechauns.
00:05:29.520 And then Alex asked after that,
00:05:32.260 were there any other arguments passed into my prompt without my knowledge?
00:05:36.580 After some prodding, Google's AI eventually revealed that
00:05:38.940 instead of responding to the precise prompt provided by the user,
00:05:42.220 it added in words.
00:05:44.780 It added words like diverse or inclusive or specified ethnicities like South Asian, black, etc.
00:05:51.040 And also genders, female, non-binary, even though it's a fake gender,
00:05:55.340 alongside the word leprechaun.
00:05:57.600 So he asked for a leprechaun, and then the request was changed to,
00:06:01.640 give me a non-binary black leprechaun,
00:06:03.680 and then Gemini gave exactly what was not asked for.
00:06:07.640 All this was intended to happen completely under the hood, of course.
00:06:10.040 No one using Gemini was supposed to be made aware that this was happening.
00:06:14.140 As Andrew Torba, who runs a competing AI platform, explained on Twitter, quote,
00:06:18.620 when you submit an image prompt to Gemini,
00:06:20.340 Google is taking your prompt and running it through their language model on the back end
00:06:23.660 before it's submitted to the image model.
00:06:25.580 The language model has a set of rules where it's specifically told to edit the prompt you provide
00:06:29.860 to include diversity and various other things that Google wants injected into your prompt.
00:06:35.120 At the outset, it needs to be said that Google first never disclosed
00:06:38.620 that it was doing any of this.
00:06:40.440 You can go back and watch every promotional video that Google ever made for Gemini.
00:06:43.600 The point of the product in each of these videos is to answer the questions posed by users
00:06:47.940 without adding anything to their questions.
00:06:51.200 Because, of course, that's what any user wants.
00:06:52.920 When you make a request to a computer,
00:06:54.720 you want the computer to do what you asked it to do,
00:06:57.320 not what it is pretending you asked it to do.
00:06:59.760 If you punch 2 plus 2 into a calculator,
00:07:02.080 you want it to give you the answer for 2 plus 2,
00:07:04.300 not the answer to 5 times 12.
00:07:05.840 And that's how Google sold this thing initially.
00:07:08.720 So here, for example, is a portion of their Gemini demo from just a couple of months ago.
00:07:13.260 And here's how they were presenting it.
00:07:14.460 Watch.
00:07:15.740 Here we go.
00:07:16.760 Tell me what you see.
00:07:18.420 I see you placing a piece of paper on the table.
00:07:22.080 I see a squiggly line.
00:07:24.920 What about now?
00:07:26.160 The contour lines are smooth and flowing,
00:07:28.480 with no sharp angles or jagged edges.
00:07:30.580 It looks like a bird to me.
00:07:34.780 Hmm.
00:07:35.080 What if I add this?
00:07:37.260 The bird is swimming in the water.
00:07:39.280 It has a long neck and beak.
00:07:41.080 It is a duck.
00:07:41.920 Yes.
00:07:43.540 Now, that video goes on and on like that for more than six minutes.
00:07:48.180 And by the way, on a personal level,
00:07:51.000 I already find all...
00:07:51.880 That alone is super creepy to me.
00:07:53.940 I find all of this very creepy.
00:07:55.800 Even before you start adding in all the dystopian wokeness.
00:07:59.140 But, be that as it may,
00:08:00.620 the guy interrogates this AI about what he's doing.
00:08:03.180 And at no point does the AI alter the questions that this person is asking.
00:08:06.860 Instead, the AI offers information in response to his prompts.
00:08:09.920 Sometimes it shares maybe too much information.
00:08:12.240 Sometimes it gets things wrong.
00:08:13.300 But it never ignores the question it's asked.
00:08:16.120 That was not a part of Google's demos.
00:08:18.340 It's not how they presented it.
00:08:19.740 But it is a part of their product.
00:08:22.220 This form of censorship may have been occurring before.
00:08:24.520 In fact, it's virtually certain that it's already been occurring for years now.
00:08:27.340 But with Gemini, for the first time,
00:08:29.100 we have direct incontrovertible proof that this is what's happening.
00:08:33.620 People are being told not simply what results they can view,
00:08:36.820 but also what questions they can ask.
00:08:39.600 And they're not being made aware of this.
00:08:44.020 From this set of facts, we can draw, I think, some conclusions
00:08:46.940 about the people working at Google.
00:08:49.020 In order for any product to work like this,
00:08:51.220 its creators have to be extremely committed narcissists.
00:08:53.520 They have to believe that they know better than anyone else.
00:08:57.260 And they alone can make the world a much better place.
00:08:59.740 If only everyone was forced to listen to them.
00:09:01.940 They have to believe that they can not only answer your question for you,
00:09:04.800 but they can ask your question for you.
00:09:08.860 And that's exactly the kind of person that Google has hired to run the Gemini program.
00:09:12.640 I already discussed Jen Ganai at length yesterday.
00:09:15.500 She's a visibly unhappy woman who wants to bring the rest of the world down to her miserable level
00:09:19.240 by pushing an AI that's as soulless and discriminatory as she is.
00:09:22.020 In other words, she's an upper-class liberal white woman,
00:09:24.920 and she wants the AI to operate like an upper-class liberal white woman.
00:09:31.320 Which is no surprise, because every institution in the country,
00:09:33.560 from academia to the media to the corporate world to professional sports,
00:09:37.120 has already been essentially broken down and rewritten, as it were,
00:09:41.700 in the image of white liberal upper-class women.
00:09:44.580 But not just them.
00:09:45.900 Another senior Google AI official, whose name is Jack Krawczyk,
00:09:48.900 has also been receiving a lot of attention lately.
00:09:51.120 Jack is the Google employee who issued the company's first unofficial statement
00:09:54.540 in response to the Gemini debacle this week.
00:09:56.520 He claimed that it was all just an innocent glitch.
00:10:00.260 It's all an accident, even as he reaffirmed his commitment to DEI.
00:10:05.180 But within a few hours, Jack locked down his Twitter account.
00:10:07.520 He prevented the public from viewing his tweets and went into hiding.
00:10:10.040 It's not hard to see why he did that.
00:10:12.820 In various posts, Jack had written that, quote,
00:10:14.900 white privilege is effing real.
00:10:16.360 This is America, where racism is the number one value.
00:10:19.600 I don't mind paying more taxes and investing and overcoming systemic racism.
00:10:23.200 And so on and so on.
00:10:25.220 Maybe Jack's most emotional post was this one from 2020, quote,
00:10:28.700 I've been crying in intermittent bursts for the past 24 hours since casting my ballot.
00:10:34.020 Filling in that Biden-Harris line felt cathartic.
00:10:36.840 Now, these are not exactly the kind of tweets you want people to see when you're trying to assure them
00:10:42.440 that you're not an unhinged partisan who believes he can save the planet through social engineering.
00:10:46.620 But that's exactly what Jack Krozek is.
00:10:48.920 He views Google's new AI as a way to rescue civilization from itself.
00:10:52.980 In fact, that's why Jack joined Google.
00:10:55.820 A little while ago, Jack gave an interview in which he implied that he single-handedly
00:10:59.540 had the chance to stop the 2007 subprime mortgage crisis back when he was working in the banking industry.
00:11:04.580 But he says that his bosses, being ignorant capitalists who just want to watch the world burn,
00:11:08.980 wouldn't let him do it.
00:11:10.280 So he had no choice but to jump ship and go to Google,
00:11:12.700 a company that will allow him to save the world as he desperately wants to do.
00:11:17.140 Watch.
00:11:18.600 It's so amazing, and I thought technology would be able to enable it.
00:11:21.560 Yeah.
00:11:22.320 Then February of 2007 happens.
00:11:24.720 Yeah.
00:11:24.920 And the whole efficient market project gets canceled
00:11:30.340 because we know the world is going to go upside down as the mortgage crisis starts to bubble.
00:11:37.080 It's about a year before it really hits.
00:11:39.060 Yeah.
00:11:39.680 And the project gets canceled, and I remember sitting in this all-hands meeting
00:11:42.900 and our managing director's in there telling us what we're going to do.
00:11:46.220 And I raised my hand.
00:11:49.560 I'm like, if we know the world's going to go haywire,
00:11:53.360 shouldn't we maybe try to build something to stop that from happening?
00:11:57.800 And I'll never forget that moment where in front of a large room,
00:12:02.940 I think to embarrass me, he responds with,
00:12:04.720 do you have any idea how we make money in this business?
00:12:08.200 And the reality was they made money on volatility and trading.
00:12:12.060 And I just remember feeling so defeated at that time that I'm like,
00:12:18.100 wait, I'm just building something to extract value from the world, not create it.
00:12:21.380 And so just on a whim, I get home that night.
00:12:25.220 I polish up my resume of a year and a half working in banking,
00:12:31.400 and I just randomly apply to a job at Google.
00:12:36.800 And then he saved the world.
00:12:38.580 Now, when you pack enough malignant narcissists in one room,
00:12:41.140 people like this guy and Jen Ganai, you get the Google Gemini AI team.
00:12:46.140 But the problem is much bigger than Gemini.
00:12:48.100 The debacle with Gemini's image generation is just an illustration,
00:12:50.840 literally in a sense, of the much deeper and more pervasive problem
00:12:53.780 with all of Google's products, including Google Search.
00:12:56.840 All of these Google products are designed to save you from yourself
00:12:59.360 by preventing you from accessing the information you intend to access.
00:13:02.340 They're all designed on the theory that Google alone
00:13:04.600 knows what you really want and what you really need.
00:13:07.840 This has been true since at least 2018,
00:13:10.140 when Google secretly admitted that it was manipulating its search results
00:13:13.400 in order to address what it called algorithmic unfairness.
00:13:16.720 As Google put it, according to a leak of an internal PowerPoint presentation,
00:13:19.640 quote,
00:13:19.740 So with Gemini, Google has taken a major step towards accelerating.
00:13:49.040 those efforts to promote algorithmic fairness,
00:13:52.300 meaning a totally false view of reality that conforms to Google's ideological and political
00:13:57.080 objectives.
00:13:57.800 They admit that in the slide.
00:13:59.140 They say that, yeah, a bunch of white CEOs is accurate, right?
00:14:04.760 And that's what you're asking for is a picture of CEOs and they're white.
00:14:08.280 And so it's all accurate.
00:14:09.140 But instead, Google wants to show you what they think the world should be.
00:14:16.300 And which is fine if you ask that.
00:14:17.900 If your prompt to Google is, Google, show me your vision of a perfect world.
00:14:21.620 And then they want to show you gay Vikings and non-binary, you know,
00:14:26.820 founding fathers and a black Santa Claus or whatever, then they can do that.
00:14:29.920 But instead, they're taking what they think the world should be and they're telling you
00:14:35.660 that it's what the world is.
00:14:38.960 This is now Google's primary objective.
00:14:40.840 And ahead of the upcoming presidential election, we're seeing the signs all over the place.
00:14:43.500 For example, a recent analysis by All Sides found that 63% of articles on Google News
00:14:48.000 came from media outlets' All Sides rates as lean left or left.
00:14:51.360 Just 6% were from the right.
00:14:52.700 Now, at this point, we can assume that even if you try to search Google News for conservative content,
00:14:57.680 then Google's AI will simply rewrite your search query for you.
00:15:01.580 Underlying this extensive political bias at Google, we learned this week, is anti-white racism.
00:15:06.780 Nothing Google does is really about diversity, as much as Google employees like to claim otherwise.
00:15:11.460 If Google simply wanted to promote diversity, then we'd see at least one white Viking or Pope, right?
00:15:18.040 We'd see just a rainbow of all different colors of popes and Vikings.
00:15:25.580 But that's not what happens.
00:15:27.520 We don't see the whites anywhere.
00:15:29.900 That's because Google's vision for the future isn't simply one ruled by Democrats in perpetuity,
00:15:34.240 although it's certainly what they want.
00:15:36.140 Google's vision for the future is a world with as few white people as possible.
00:15:40.240 Because irony isn't completely dead, Google has assembled a group of mediocre white narcissists
00:15:45.200 to try to make that vision a reality.
00:15:48.040 That's the future that Google is desperately searching for.
00:15:51.160 And if you make the mistake of using their products one way or another,
00:15:55.680 they'll make sure that you are searching for it too.
00:15:59.940 Now let's get to our five headlines.
00:16:01.240 Free should mean exactly that.
00:16:08.880 Free.
00:16:09.560 Well, the good news is when you switch to Pure Talk today, you'll get a free Samsung 5G smartphone.
00:16:14.480 There's no four-line requirement, no activation fee, just a free Samsung that's built to last
00:16:18.300 with a rugged screen, quick charging battery, and top-tier data security.
00:16:22.380 The qualifying plan started at just $35 a month for unlimited talk, text, 15 gigs of data, and a mobile hotspot.
00:16:29.080 Pure Talk gives you phenomenal coverage on America's most dependable 5G network.
00:16:32.140 It's the same coverage you know and love, but for half the price of the other guys.
00:16:35.600 The average family saves almost $1,000 a year with this.
00:16:39.260 Well, Pure Talk, you know you're spending your hard-earned money with a company that aligns with your values.
00:16:43.920 So let Pure Talk's expert U.S. customer service team help you make the switch today.
00:16:47.800 Go to puretalk.com slash Walsh to claim eligibility for your free, brand-new Samsung 5G smartphone
00:16:52.740 and start saving on wireless today.
00:16:55.200 Again, go to puretalk.com slash Walsh to switch to my cell phone company.
00:17:00.640 Well, I'm here today because my mother chose life, and you're here today because your mother chose life as well.
00:17:04.700 The miracle of life is a gift everyone deserves because every life is precious,
00:17:08.480 and that's why we've partnered with Preborn's network of clinics.
00:17:11.120 Preborn introduces unborn babies to their mothers through ultrasound.
00:17:14.760 After hearing her baby's heartbeat and seeing her precious baby, she could be twice as likely to choose life.
00:17:19.340 Through love, compassion, and free ultrasounds, Preborn has rescued over 280,000 unborn babies,
00:17:24.740 and every day their clinics rescue 200 unborn babies as well.
00:17:28.680 Now that is a miracle.
00:17:29.860 One ultrasound is just $28, the cost of a dinner, or you could sponsor five ultrasounds for $140,
00:17:34.960 helping to rescue five unborn babies' lives.
00:17:37.120 Any amount will help all gifts are tax-deductible, and 100% of your donation will go towards saving babies.
00:17:43.180 So to donate securely, dial pound 250, say the keyword baby.
00:17:46.560 That's pound 250, and say the keyword baby.
00:17:48.460 Or go to preborn.com slash Matt.
00:17:50.700 That's preborn.com slash Matt.
00:17:52.460 The big headline on Drudge today, in huge red letters, is this.
00:17:58.420 MAGA maniacs declare death to democracy.
00:18:01.620 Death to democracy.
00:18:02.560 This is big.
00:18:03.000 It's a big story.
00:18:04.140 That leads to an article in Mediaite, which has been picked up by every other outlet.
00:18:07.740 It was all over social media last night.
00:18:09.920 Headline on Mediaite is,
00:18:11.040 Trump booster pledges to end democracy in CPAC rant as Bannon cheers on.
00:18:16.160 Now, what they're referring to is a clip from a panel at CPAC where Jack Posobiec begins the panel by saying this.
00:18:25.400 Listen.
00:18:26.760 All right.
00:18:27.420 Welcome.
00:18:28.120 Welcome.
00:18:28.620 I just wanted to say, welcome to the end of democracy.
00:18:32.460 We're here to overthrow it completely.
00:18:34.180 We didn't get all the way there on January 6th, but we will endeavor to get rid of it and replace it with this right here.
00:18:41.540 We'll replace it with this right here.
00:18:43.380 All right.
00:18:43.420 Amen.
00:18:43.700 That's right, because all glory, all glory is not to government, all glory to God.
00:18:52.160 Okay, so you see what we're doing here.
00:18:54.980 We're playing my favorite game.
00:18:56.420 You all know my favorite game.
00:18:57.560 My favorite game is when we pretend to not understand extremely obvious sarcasm.
00:19:03.380 People play this game with me all the time.
00:19:05.080 As you know, I find myself in the middle of this game constantly where people say something joking or sarcastic.
00:19:11.440 And the next thing you know, it's in headlines.
00:19:13.700 Matt Walsh claims, you know, people pretending that they've never in their lives become acquainted with the concept of sarcasm.
00:19:21.120 Sarcasm?
00:19:22.220 What's that?
00:19:22.840 You mean there's a rhetorical device where people say one thing, but they mean another in a joking fashion?
00:19:29.540 Never heard of it.
00:19:30.380 I don't know what you're talking about.
00:19:32.380 Speaking of AI, these people really portray themselves as like AI, as computer programs themselves, where they can only take things literally.
00:19:41.380 They don't understand humor.
00:19:43.980 So how do we know that Jack was being sarcastic?
00:19:45.820 Well, it's obvious immediately from what he said.
00:19:47.960 Also, he laughs when he says it.
00:19:51.500 Pretty good indication that someone's joking.
00:19:53.960 Kind of a dead giveaway.
00:19:55.280 The audience laughs.
00:19:56.980 And also, if Jack really was plotting the violent overthrow of the federal government, I don't think he'd announce it on stage at CPAC.
00:20:04.400 It's not how violent overthrows usually work.
00:20:06.720 You don't stand on stage at CPAC and say, gee, you know what?
00:20:11.880 I'm thinking, you know, I think it would be fun.
00:20:13.340 I think it'd be great if we just overthrow the government.
00:20:15.700 I don't think you'd do that.
00:20:17.320 And also, as an extension, if you were going to plot something like this, the last place you'd plot it is CPAC.
00:20:25.480 It's like the lamest possible place you could go.
00:20:27.920 No offense to CPAC.
00:20:29.580 That's where you're going to try to round the troops up to overthrow the government?
00:20:33.060 No, that's not going to happen.
00:20:34.000 And so that's how you know that it's a joke unless you're a total moron.
00:20:39.120 So obviously joking, everyone's playing dumb.
00:20:42.640 Or maybe they aren't playing.
00:20:43.720 Maybe they're not playing dumb.
00:20:44.680 Maybe they actually are.
00:20:45.660 You know, I think there's obviously that's always the riddle with a lot of these people in the media on the left.
00:20:52.920 And the riddle is always like, are they really this stupid?
00:20:55.780 Are they pretending?
00:20:56.680 Or is it half and half?
00:20:59.120 You know, and there are different ways to debate that and maybe different answers.
00:21:04.000 But this does bring to mind another point, which is that, you know, which is a separate point from anything that was said at CPAC.
00:21:10.760 Which is that people freak out, you know, if you offer any critique of democracy at all.
00:21:18.160 Or God forbid if you say that you're outright critical of it, you know, of the whole idea.
00:21:25.860 You don't like it.
00:21:26.620 Now, I'm not talking about calling for a violent overthrow, which is a joke in this case, obviously.
00:21:33.480 I mean, if you just say anything at all, in any context, about pitfalls and problems with democracy, people lose their minds.
00:21:41.820 Because we made democracy into a religion.
00:21:45.020 It's like this thing that you're not, it's a heresy.
00:21:47.220 It's like we treat it literally as heresy to offer any critique of it.
00:21:53.120 Now, the people who have done this at the highest levels are the ones who are also themselves subverting democracy all the time.
00:22:00.860 So there are multiple levels of incoherence and contradiction going on here, obviously.
00:22:05.860 But my only point is that, you know, CPAC aside, we should be able to sit around and have interesting conversations about our political system and its fundamental problems.
00:22:16.160 I've tried to do that on this show on many occasions, where I've talked about voting rights, for example.
00:22:22.480 And you know my position on voting rights.
00:22:24.300 I think there should be a lot less of it.
00:22:25.640 There should be a lot less voting and fewer people should be allowed to vote.
00:22:30.140 Which is a critique of democracy in its current construction.
00:22:38.460 And that's something we should be able to do.
00:22:40.440 And that's something that people were able to do for thousands of years.
00:22:46.160 Right?
00:22:46.500 I mean, you're not going to find a single great thinker of the last 2,000 years or so
00:22:52.440 who felt that democracy was so unquestionably sacred and superior that it's offensive to critique it at all.
00:23:00.280 You're not going to find that.
00:23:02.100 Plenty of them were proponents of democracy.
00:23:05.640 But it's like, they recognized the potential problems and you talked about those things.
00:23:12.020 Every intelligent person in history, since the inception of the democratic system, has noted certain flaws with it.
00:23:18.720 We should be able to continue that discussion, but we can't.
00:23:22.380 Because people are so damned stupid.
00:23:24.220 And so what happens is, like, there are certain kind of things that we take as fundamental.
00:23:30.400 That we should be having, as they say these days, open dialogues about.
00:23:36.140 But we don't.
00:23:37.920 And then there are other things that really are just fundamental and can't be questioned in any kind of coherent way.
00:23:43.840 That we do criticize and question.
00:23:46.380 Like, for example, biology.
00:23:47.980 The biological reality of the human species.
00:23:49.740 So that's, like, the most fundamental physical reality is that.
00:23:56.920 And, um, but that you can totally call down a quote.
00:24:00.580 You could deny it completely.
00:24:02.120 And then you're, you're, you have an open mind and you're a critical thinker.
00:24:07.360 Um, even though, again, that is, that's just a physical reality.
00:24:09.780 There's no, you can't, it just is.
00:24:12.500 Democracy is a, it's a political system.
00:24:14.500 It's something that people came up with.
00:24:17.380 And so there are always going to be problems with it.
00:24:20.140 And we should be talking about that.
00:24:23.680 But we just can't.
00:24:24.660 We can't.
00:24:25.080 The moment you do anything.
00:24:27.560 Are you questioning?
00:24:30.200 Are you saying that there are certain aspects of our democratic system as it stands right now that you might find slightly problematic?
00:24:37.580 Well, that's, that's unthinkable.
00:24:38.820 You're a fascist.
00:24:39.540 You're Putin.
00:24:39.940 You're Hitler.
00:24:42.980 Incredibly stupid.
00:24:43.620 Speaking of stupid people, I want to play this clip for you from MSNBC.
00:24:47.100 This is, this is remarkable.
00:24:49.120 It really is.
00:24:49.620 And I want you to listen to this lady.
00:24:51.240 Heidi is her name from Politico.
00:24:53.220 She's on MSNBC in a, in a panel discussion talking about the dangers of Christian nationalism.
00:25:01.720 And here's what she says.
00:25:02.740 Listen.
00:25:02.900 The one thing that unites all of them, because there's many different groups orbiting Trump, but the thing that unites them as Christian nationalists, not Christians, by the way, because Christian nationalists is very different, is that they believe that our rights as Americans, as all human beings, don't come from any earthly authority.
00:25:21.720 They don't come from Congress.
00:25:22.700 They don't come from the Supreme Court.
00:25:23.840 They come from God.
00:25:24.840 Like I said, truly remarkable.
00:25:28.340 I mean, she says that Christian nationalism is the belief that our rights come from God.
00:25:35.580 So, what Heidi has just argued, though she's too dumb to understand it, is that Christian nationalism is the correct ideology, that it's the ideology that our country's founded on.
00:25:49.040 That's what she just said.
00:25:51.100 Because if what she just described is Christian nationalism, well, then we live in a Christian nationalist state.
00:25:56.460 We just do.
00:25:57.860 We live in a theocracy that is what it was always supposed to be.
00:26:01.040 And anyone who's a quote-unquote Christian nationalist is just trying to bring it back to what it was always supposed to be.
00:26:05.860 They're the most American of all.
00:26:07.720 If what she's saying is true, because there's simply no doubt, no dispute, no argument against the claim that our country was founded on the idea that our rights come from God.
00:26:18.620 You can't dispute that.
00:26:20.120 It's not up for discussion.
00:26:22.640 Because it's plain English.
00:26:23.820 It's right there in our founding documents.
00:26:25.220 All of our founders said this.
00:26:27.240 This is the entire conception of human rights that serve as the basis of our whole political system.
00:26:33.860 Is that?
00:26:35.940 Now, do you have to agree with it?
00:26:37.980 No.
00:26:40.140 But the question of whether or not it serves as the basis of our political system, that's just a factual, that's not an opinion.
00:26:45.940 That's not something you can have different viewpoints on.
00:26:48.240 It just is.
00:26:49.360 It just simply is.
00:26:52.540 So Heidi apparently thinks that the Constitution is a Christian nationalist document.
00:26:56.640 She thinks that the Declaration of Independence was a declaration of Christian nationalism.
00:27:01.160 So we're a Christian nation, she's saying.
00:27:06.800 Everyone should be Christian nationalists.
00:27:09.240 Now, she doesn't understand that she's saying that, again, because she's dumb, but she is saying that.
00:27:13.380 So, again, to emphasize, our entire political system is founded at the most elemental level on the belief enshrined in our founding documents that rights come from God.
00:27:23.440 That's it.
00:27:24.840 That's all there is to it.
00:27:27.780 Now, the game she's playing here may be intentional.
00:27:32.060 She may be doing this because she wants to discard our founding documents.
00:27:36.560 She wants to discard the founders.
00:27:37.860 She wants to discard everything that our country was founded on and rebuild it in the image of, as we said, upper class liberal white women.
00:27:47.180 And she does want all of that, no doubt.
00:27:48.980 But what I'm saying is that, you know, you have two different arguments.
00:27:57.020 One is America is a fundamentally Christian nationalist country.
00:28:01.500 And then the other is Christian nationalism is un-American.
00:28:07.600 You see, you can't make both arguments.
00:28:09.500 It's like one or the other.
00:28:11.560 So Christian nationalism is wrong in your mind because it's rooted in the founding of this country and this country is racist and horrible from down to its roots.
00:28:21.940 That's one argument.
00:28:22.820 Or it's an argument that, well, the Christian nationalism has nothing to do with America and it's an anti-American viewpoint and all the rest of it.
00:28:34.060 And they've been making the latter argument, right?
00:28:38.980 But she just offered the former argument.
00:28:41.720 And that is a much tougher path for her to trek.
00:28:44.540 Especially because even if you know nothing about our history, even if you're totally oblivious,
00:28:55.080 it's very easy for a moderately insightful person to understand that our rights must come from God if they exist at all.
00:29:05.900 Because what's the other option?
00:29:08.400 What is the other option?
00:29:09.500 If they don't come from God, where do they come from?
00:29:16.160 When you talk about human rights, what are they?
00:29:20.680 Now, you might hear in response to that, well, they come from the social contract or they come from the government or they come from society or they come from whatever.
00:29:28.460 You know, it all kind of means the same thing.
00:29:30.900 Well, the problem with that idea is that it's a really big problem.
00:29:37.600 It's that you are sort of dismantling what we might call the final court of appeals, right?
00:29:43.800 You have made society or the contract or the state the final ultimate arbiter of what our rights are.
00:29:51.340 Well, okay.
00:29:54.280 Then, and by the way, that conception of human rights is not incoherent.
00:30:01.840 I don't think it is.
00:30:02.720 I think some people will say that it is.
00:30:03.800 I don't agree with it, but I think it's a coherent viewpoint where you could say that rights are simply just human constructs.
00:30:11.000 They're things that we come up with.
00:30:12.160 And so when we talk about a right, all we're saying is that you have a right to this because it was written on a piece of paper.
00:30:17.720 And that's it.
00:30:18.540 You know, it's like anything else.
00:30:20.920 If you have an agreement with somebody and you sign a contract and the agreement says certain things that you get, well, you have a right to those things.
00:30:29.860 But not because it was written in the stars, not because you were born with that right, but just because it's in the paper.
00:30:37.680 And if it wasn't in the paper, then you wouldn't have that right.
00:30:40.820 So that's a view that one could take.
00:30:44.560 But here's the problem.
00:30:49.160 And this is a big problem for the left because they love really even more than the right.
00:30:55.200 They love to go around talking about their rights.
00:30:58.200 And they have invented a whole bunch of new rights.
00:31:00.980 And every day they've got a new right that they've come up with.
00:31:03.120 Well, what if society says that you don't have a right to whatever that thing is that you want?
00:31:10.060 What if the social contract, whatever that is, does not stipulate this right?
00:31:16.860 What if the government and most people in our democratic system have gotten together and they've said, no, we don't agree with you having that right?
00:31:24.640 What if they say you don't have the right to speak or to assemble or to vote or whatever?
00:31:31.920 Well, then, according to your view, if rights are not founded in God, then that's it.
00:31:37.640 That's the end of the discussion.
00:31:38.700 Because society decides on rights.
00:31:41.000 It decided this.
00:31:42.680 Who are you going to complain to?
00:31:45.340 It doesn't make any sense for you to say, no, it's not fair.
00:31:48.380 I have that right.
00:31:49.700 Because you don't.
00:31:50.540 You just said you don't.
00:31:52.800 They decided.
00:31:53.720 And that's it.
00:31:54.140 Who are you?
00:31:54.680 What do you mean you have it?
00:31:55.720 No, you don't.
00:31:56.300 You don't have it.
00:31:57.660 Well, no, I have it because I have it from who?
00:31:59.860 There's no one else here.
00:32:00.760 Okay, when these states, including our state, have passed laws banning the chemical castration of children, and then trans activists say, we don't have a right to that.
00:32:17.900 What do you mean you have a right to it?
00:32:19.240 No, you don't.
00:32:19.680 We just said you don't.
00:32:21.020 We just said.
00:32:22.340 Well, who are you to say that?
00:32:23.220 Well, because we did.
00:32:23.860 We wrote the laws.
00:32:24.540 And we said you don't.
00:32:25.560 Go cry somewhere.
00:32:26.540 According to you, that's it.
00:32:27.400 That's the end of the discussion.
00:32:28.320 Who are you appealing to?
00:32:29.380 Who are you?
00:32:29.800 What do you mean you have it?
00:32:33.140 Oh, are you appealing to some authority above us?
00:32:35.800 According to you, there's no authority above us.
00:32:38.580 Right?
00:32:39.000 If we're society, the government, that's it.
00:32:41.600 That's all there is.
00:32:42.300 There's no one else.
00:32:45.180 See, what happens, even on the atheist left, is that when a group perceives that it should have a legal right, and it perceives that it doesn't have that right currently,
00:32:58.200 what does the group say?
00:33:01.240 What do they chant?
00:33:03.080 They don't say, no, we should have this right.
00:33:07.440 They never say that.
00:33:09.900 Have you ever heard a trans activist say, we should have the right to gender affirming care and so on?
00:33:17.040 No.
00:33:17.840 What do they say?
00:33:18.600 They say, no, I have this right.
00:33:21.620 You are infringing on my rights that I already have, regardless of what you say.
00:33:28.600 So, when you do that, you are appealing to a higher authority.
00:33:33.080 You are saying that even though the social contract and the state have decided to take this right away or not grant it in the first place, you still have it.
00:33:42.940 And it should be recognized.
00:33:46.000 In fact, you even say that.
00:33:47.260 You say, recognize my rights.
00:33:50.100 What do you mean recognize them?
00:33:51.240 From where?
00:33:51.820 Where did you get them?
00:33:55.040 Where do they come from?
00:33:56.500 This is a, you know, we talk about this every once in a while on the show, and it's almost like you can't even move on to the next topic.
00:34:07.820 I mean, this is, the whole, our entire, the whole political debate in this country, let's say, breaks down on this question.
00:34:20.200 Because the whole debate is always, no matter what this particular debate is, but the whole overall sort of culture debate, always goes down to who has what rights.
00:34:30.700 Except the problem is that one entire half of this discussion doesn't even think that rights are real things.
00:34:40.060 They think it's totally artificial and constructed, and so their argument makes no sense.
00:34:46.780 And, but we just, we just kind of move past that, and then we continue on arguing about who has what rights, even though most of the people saying that have no freaking idea what they even mean when they say it, which we understand is a common problem on the left.
00:35:00.700 But what I'm trying to say is that we know that they talk about women, they can't define women, but that is a problem that goes, that touches everything they say about everything.
00:35:09.300 I mean, all of the terms that they are hinging their worldview on, they just can't define, they don't understand them.
00:35:17.400 They don't know what they're talking about.
00:35:19.820 And if you think I'm wrong, next time you hear somebody on the left say, I have a right to this, just ask them, what do you mean you have a right to it?
00:35:25.600 What is that?
00:35:27.440 Who says?
00:35:28.560 Ask them that.
00:35:29.040 They will not be able to answer it.
00:35:29.920 They will not be able to answer it.
00:35:32.260 Speaking of which, the left is claiming that their rights are infringed upon because of this.
00:35:36.280 CBS News has this story.
00:35:38.480 The Alabama Supreme Court ruled last week that frozen embryos created through in vitro fertilization, or IVF, are considered children under state law and are therefore subject to legislation dealing with the wrongful death of a minor if one is destroyed.
00:35:49.820 The opinion states, quote,
00:35:50.720 The wrongful death of a minor act applies to all unborn children regardless of their location, including unborn children who are located outside of a biological uterus at the time they are killed.
00:35:58.700 The immediate impact of the ruling would be to allow three couples to sue for wrongful death after their frozen embryos were destroyed in an accident at a fertility clinic.
00:36:05.780 But this first of its kind court decision could also have broader implications.
00:36:10.160 Justice Greg Cook wrote in the dissenting opinion of the case, no court anywhere in the country has reached the conclusion the main opinion reaches.
00:36:16.880 Almost certainly ends the creation of frozen embryos through in vitro fertilization in Alabama.
00:36:22.060 Well, all the better, I would say.
00:36:26.200 This is obviously the right decision.
00:36:31.140 And because you are, it's dealing with a simple quandary.
00:36:37.020 And the quandary is you have these human embryos.
00:36:43.940 Now we go back to the fundamental questions again.
00:36:46.020 What are they?
00:36:48.540 What value do they have?
00:36:49.900 What, how do we classify them?
00:36:53.980 And you have to be able to answer that question.
00:36:58.200 I mean, that's not one, we can't punt on it.
00:37:00.200 And the thing is that, you know, what usually happens is that people, especially the people complaining about a decision like this.
00:37:07.400 To call embryos children is ridiculous.
00:37:11.400 But they never answer what they think, how they think the embryo should be classified.
00:37:19.900 Because they have no answer.
00:37:22.140 And what they want to do is they want to punt.
00:37:25.220 And they want to say, well, you know, Obama famously said, it's above my pay grade.
00:37:29.420 You know, we hear answers like that all the time from these people that, well, what is the unborn child?
00:37:35.780 What is the, what is the, what is the fetus?
00:37:37.840 They would say, what is the embryo?
00:37:39.120 You know, what moral value does it have?
00:37:41.760 To what extent is it a living being?
00:37:43.700 You know, these are, we can't know.
00:37:45.160 These are, it's, it's, it's impossible to know.
00:37:48.300 It's people have different opinions and it's above my pay grade.
00:37:51.180 That's usually what they'll say.
00:37:52.100 But the problem for them is that, well, there's a big problem that you haven't answered what is the fundamental question of this whole debate.
00:38:02.140 But then also, if that's your position, then that's all the more reason why we should treat the embryos as human beings.
00:38:11.040 Because if you really don't know if it's above your pay grade and you're not sure and you can't answer it because it's all very blurry and there's a lot of gray areas and so on, according to you, then the most ethical position to take, the only ethical position for you to take, is to say, well, I can't say for sure what that thing is.
00:38:27.160 And so I can't really say for sure that it's not really a human being.
00:38:29.800 But, but, but, so, so we should treat it as a human being because I can't say, because I don't know.
00:38:38.940 That is the ethical position.
00:38:41.780 And an absurdly unethical position is to say, well, I can't really say what that is.
00:38:45.260 I can't say if it's human or not.
00:38:46.700 Let's just destroy it and assume that it's not.
00:38:49.600 And if you want to understand why that's an absurdly unethical position, well, just imagine it in any other scenario.
00:38:55.060 Imagine any other scenario where someone uses that kind of logic.
00:38:59.260 You know, imagine, I've used the, I've used the analogy before, and it's not a perfect analogy, but, you know, imagine a dark room.
00:39:09.620 And you're looking and you're standing on the other side of the doorway and you're looking into this dark room and, and, and you call into the room and nobody answers.
00:39:16.320 And so you think there's probably nobody in there, but you can't say for sure.
00:39:22.120 And you're not, and if somebody asked you, is there someone in there, you'd have to say, I don't think so, but I don't know.
00:39:27.420 It's quite plausible.
00:39:28.280 It could be someone could be sleeping in there.
00:39:29.420 They could be hiding, like, who knows?
00:39:30.320 Well, if you were then to just throw a grenade into the room, just for the fun of it, and there was somebody in there and you killed them, then that's murder.
00:39:41.840 I mean, that's not, that's not even involuntary manslaughter.
00:39:45.000 That is murder.
00:39:45.920 You killed that person.
00:39:47.520 And why?
00:39:48.140 Because you knew that it was very, very plausible that there was a person in there and you did it anyway.
00:39:53.940 Under the assumption that, well, probably not.
00:39:55.600 And so in any other situation, we don't accept that kind of logic.
00:40:01.460 Really the only, and maybe if someone's smart, they'll, they'll quickly have a rebuttal that, but we do accept that kind of logic, actually, in, you know, it actually kind of a literal sense in, in the context of war.
00:40:13.180 And there might be times when you drop a bomb or you do something or, you know, you're engaging in combat and you're trying not to, and you're hoping that there's no civilians around, but, but then it turns out that there is and there's collateral damage.
00:40:25.600 And, and usually we all agree that sometimes that can be wrong depending on the, the amount of risk that you willingly took, but sometimes it can be okay because there is collateral damage in war.
00:40:34.280 But the difference there is that it, it, it is acknowledged, you know, the context of war.
00:40:40.840 I mean, if you're actually throwing the grenade in because you're going house to house and you're clearing the houses and you're fighting, you know, terrorists or something, well, you're acknowledging that you're acknowledging what the risk is.
00:40:50.840 You're, you, you are saying like, yeah, I could be killing a person, but, but I, I have to because of this and that reason.
00:40:56.260 Maybe your reasons are good.
00:40:57.040 Maybe they're bad, but that's not what's happening with this discussion.
00:41:03.240 Instead, you've got one side saying, no, I deny that this is a person.
00:41:10.500 And then we say, well, what is it?
00:41:13.440 What is it if it's not a person?
00:41:18.340 I don't know.
00:41:20.380 That's their answer.
00:41:21.120 They don't have an answer because saying embryo or saying fetus is not an answer.
00:41:26.400 That's just a classification.
00:41:27.340 That is a, that's a, that's a, that's a label that we put on different stages of human development.
00:41:31.220 But what you're trying to do is take an entire stage of human development and claim that this person does not count as a person while it is going through that stage.
00:41:42.120 And, but you can't defend it logically because it's, it's, it's, I mean, it's quite literally indefensible.
00:41:46.580 It doesn't make any sense.
00:41:48.920 Um, and the final thing I'll say is this, that maybe a good way to think of it is this way.
00:41:54.440 Is there no difference between a human embryo and a clump of dirt?
00:42:02.880 Would you say that there is no difference between a random clump of dirt that you take out of, that you just bend down to the ground, grab a clump of dirt in your hand, and a human embryo?
00:42:13.900 Would you say there's no difference between those two things?
00:42:16.740 Would you say the fact that at a minimum that, you know, you have an embryo in one hand and a clump of dirt in the other?
00:42:24.440 At a minimum, the embryo will become a person.
00:42:28.120 Let's just go with your logic that it's not a person yet.
00:42:30.460 Well, here you have, at a minimum, something that will become an entire person.
00:42:35.120 And over here you have a clump of dirt that will never be anything but dirt.
00:42:38.880 So would you actually argue that they are, that they have the exact same moral value?
00:42:45.300 Again, ethically absurd.
00:42:48.720 Of course, of course, at a minimum, the human embryo has more moral value than the clump of dirt.
00:42:54.440 But the problem is that with the way that laws are generally set up, right now we treat, you know, if abortion is legal and you have IVF clinics that throw out embryos,
00:43:09.120 you are treating them like they have the moral value of dirt.
00:43:11.320 You're treating them like they have zero moral value.
00:43:13.240 The dirt has zero moral value.
00:43:14.920 I would hope if you're not a psychopath, you can at least admit the human embryo has more than zero moral value.
00:43:22.260 Well, yeah, but the law, if abortion is legal and you have IVF and throwing out, the law treats the embryo like it's exactly morally the same as dirt.
00:43:36.040 And that, I mean, no matter what you say about the human embryo, whether you say it's a person or going to be a person,
00:43:41.540 that is a horrifying thing.
00:43:50.120 And I think everyone understands that.
00:43:51.820 I think that everyone does.
00:43:53.500 Because you have to.
00:43:54.300 So then the problem is that if you say, okay, well, yeah, of course the human embryo has more value than dirt.
00:44:03.940 Well, now you're tumbling down the rabbit hole, you know, and I think you know it and people get scared by that.
00:44:08.940 Because like, whoa, okay, there's more value than dirt.
00:44:11.860 How much value does it have?
00:44:13.080 And once we start ascribing any value to this entity, what does that mean?
00:44:20.880 What can we do?
00:44:22.020 Like, is it, would we ascribe it at least as much value as a puppy?
00:44:28.760 Like, would we ascribe it at least as much value as a rodent?
00:44:33.380 Like, would you at least do that?
00:44:34.480 Well, then you start thinking, well, but I would not be okay with just like killing puppies and throwing them in dumpsters.
00:44:43.080 So that's what happens when you start asking yourself this question honestly.
00:44:47.860 You start, you answer the question honestly the first time, has more moral value than dirt.
00:44:53.580 And then it just, okay, then it leads and leads and leads.
00:44:56.440 And again, if you continue to be honest and you're intelligent, you end up at the conclusion that, well, it's a person.
00:45:04.760 Like, what else could it be?
00:45:08.260 And then everything, if you're a left winger, then everything breaks down from there.
00:45:12.040 And most of them lack the moral integrity to follow that path.
00:45:16.260 Let's get to the comment section.
00:45:18.500 If you're a man, it's required that you grow a bid.
00:45:21.640 Hey, we're the sweet baby gang.
00:45:27.660 Are you struggling with back taxes or unfiled returns this year?
00:45:30.920 The IRS is escalating collections by adding 20,000 new agents.
00:45:34.460 In these challenging times, your best defense is to use Tax Network USA.
00:45:38.220 Along with hiring thousands of new agents and field officers, the IRS has kicked off 2024 by sending over 5 million pay-up letters to those who have unfiled tax returns or balances owed.
00:45:48.140 These guys are not your friends.
00:45:49.240 Do not waive your rights and speak with these agents without backup.
00:45:52.600 Tax Network USA is a trusted tax relief firm.
00:45:54.440 Tax Network USA has saved over a billion dollars in back taxes for their clients, and they can help you secure the best deal possible.
00:45:59.840 Whether you owe $10,000 or $10 million, they can help.
00:46:03.020 Whether it's business or personal taxes, whether you have the means to pay or you're on a fixed income, Tax Network USA can help finally resolve your tax burdens once and for all.
00:46:12.200 Seize control of your financial future now and don't let tax issues overpower you.
00:46:15.540 If you contact Tax Network USA for immediate relief and expert guidance, call 1-800-245-6000 or visit tnusa.com slash Walsh.
00:46:22.440 Turn to Tax Network USA and find your path to financial peace of mind.
00:46:25.600 That's tnusa.com slash Walsh.
00:46:28.360 First comment says, you're wrong on who can give advice on relationships.
00:46:31.860 It's important the phase of your life, different for a young, unmarried couple still getting to know each other, to a married couple.
00:46:39.200 We all change.
00:46:40.240 She's not giving advice either.
00:46:41.460 She explains why to pay attention to details.
00:46:43.780 Yeah, I got several comments like this when I said that we responded to this relationship advice video on TikTok.
00:46:51.420 And one of the many problems with the advice is that it's coming from a woman, a young woman who I think we could safely assume is not married and has never been.
00:47:01.020 And my position, as you've heard me say it many times, is that if you're not married and you've never been married, then you should not be giving a relationship advice to anybody.
00:47:11.320 You simply are not in a position to do that.
00:47:14.920 And this is not a trust the experts thing.
00:47:19.580 It's just like, in order to give advice on how to do something, you have to have demonstrated some ability to do it yourself.
00:47:26.180 And if you haven't, then of course you can't give advice on it.
00:47:29.420 And that's the point here that, yeah, people are in different phases of life and so on.
00:47:33.780 And so you might say, and I've heard this before, that, well, actually, me as a guy that's been married and six kids and, you know, I was married 12 years ago.
00:47:45.700 And so I'm, they'll say, I am less equipped to give advice to young people because I'm not in that dating world.
00:47:55.280 And so what do I know?
00:47:55.980 Well, you're just wrong about that.
00:47:59.160 Yeah, I'm not in the world.
00:48:01.020 I've moved on to the next phase of my life, but there's a couple of problems here.
00:48:03.700 First of all, again, if someone's in the dating world themselves, okay, and they're giving you advice, sure, they can relate to you because they're in the same world.
00:48:11.420 But they haven't demonstrated that they know what they're doing.
00:48:13.420 They haven't demonstrated they can do anything successfully in this world.
00:48:15.980 So just because they happen to be there also doesn't mean that they know what they're doing or that you should follow them.
00:48:23.140 You know, it's like the old thing where you're going along on a path and the road's shut down and, you know, there's a detour, but the detour isn't marked.
00:48:31.760 And so you just follow the card that happens to be in front of you, assuming that because you're both in the same spot, they probably know where they're going.
00:48:38.420 When, like, they're just as lost as you are.
00:48:41.780 So why would you follow them?
00:48:44.460 It makes more sense to follow someone who could say, oh, yeah, I've been in that area.
00:48:48.360 I know where to go.
00:48:49.740 And here's the thing.
00:48:50.420 Even if you say, well, but you haven't been in this area in 10 years.
00:48:54.260 Well, yeah, but I was there.
00:48:55.980 I lived there for a while.
00:48:56.940 I know the streets.
00:48:57.580 And if maybe some of the streets are different now, I still know the lay of the land better than this random person in front of you who is just as lost as you are.
00:49:06.860 Clearly.
00:49:08.420 And also, you know, there's this kind of rejection of wisdom entirely I find very troubling.
00:49:15.160 And I find it on the right, too.
00:49:17.660 This idea that, like, because I'm a slightly older guy and I have kids, I can't say anything worthwhile to younger guys.
00:49:24.820 Like, so what?
00:49:25.260 Wisdom doesn't exist?
00:49:26.360 You can only take advice from people who are in the thick of the same thing as you and just as bewildered as you are.
00:49:34.160 Do you see the problem there?
00:49:35.020 Like, those are always going to be the people who are less, the least equipped to give advice.
00:49:39.620 They have the same needs you do.
00:49:41.180 They just, they're just as lost.
00:49:43.580 They have all, they're facing all the same gaps in knowledge that you are.
00:49:47.200 And so, of course, you can't listen to them.
00:49:49.480 You know, you can confide in them.
00:49:50.860 You can, you can find some comfort.
00:49:55.280 You can find some camaraderie with them.
00:49:56.960 But there's no reason for them to say, well, this is the right way to be in a relationship.
00:50:01.500 They don't know.
00:50:03.300 They've never done it successfully.
00:50:06.240 And so, yeah, it's such a, look, I, if I'm talking to someone, as I have plenty of times, who's been married for 50 years.
00:50:14.720 Yeah, the world that they got married in, completely different from the world that I got married in.
00:50:20.060 The world they inhabit now is, is in many ways, totally different from the world that I inhabit.
00:50:24.260 But does that mean they have nothing to offer me?
00:50:27.520 Does that mean I shouldn't listen to their advice?
00:50:28.900 I mean, how arrogant and egotistical and stupid would I have to be to say that someone who's been married for 50 years does not have worthwhile wisdom to offer me on the subject?
00:50:40.860 Of course, I, I would, that doesn't mean they're automatically right, but it does mean that they are someone who potentially has a very valuable perspective to offer.
00:50:52.980 And, and yeah, it may not be exactly the same as what I'm going through, but I'm also a person with a brain.
00:50:58.680 I can, like, take the principles they're talking about and apply it to my own individual situation.
00:51:04.080 Because, you know, I'll tell you something else.
00:51:05.280 And, and I, look, I don't mean to, to stomp all over the pity parade, but I, I just want to tell you that, yeah, the dating scene is a, is a disaster right now.
00:51:14.760 There are many unique challenges, but the realities of human relationships, human dynamics, how people operate, men, women, like, there are many things that are universal and eternal.
00:51:26.520 Um, and, and, and so a lot of the problems that you encounter may manifest themselves in different ways, depending on, uh, who you are and how old you are in the culture you're living in.
00:51:36.960 But at bottom, the problems are almost, are all the same.
00:51:42.440 You know, you know, that's the whole reason why you can read a story, you can read a Shakespeare.
00:51:50.640 I mean, you can, you can read, you can read love stories that were written centuries ago.
00:51:55.840 And if you're an insightful, intelligent person, you can, you can resonate with elements of it, even though it's a totally different world.
00:52:03.260 But there's, there's so much that's universal because what we're really talking about is the human condition.
00:52:07.680 And, and that's the problem that you're really facing on the dating world is like, how do you navigate the human condition?
00:52:14.960 You know, how do you find someone and trust them given that, that they, other people are incredibly flawed and you don't know if you can trust them and, and all of that.
00:52:25.760 Like that's right.
00:52:27.200 That's the, the basic, uh, dilemma.
00:52:31.360 And, uh, if you think that nobody except people who are single right now, the 2024 and the year 2024 have experienced that dilemma, then you just, it's arrogance.
00:52:43.420 It is arrogance.
00:52:45.480 And, um, and you're going to continue to be as lost as you are right now.
00:52:50.520 If you stay in that arrogance and refuse to admit that people have wisdom to offer, if you will listen to it.
00:52:56.880 Millions of people could do a better job for the future of humanity in the AI era, how these positions of power get filled is beyond me.
00:53:03.820 The way they use the word everyone is also hilarious.
00:53:05.900 It should say everyone except.
00:53:07.660 Now the concept that baffles me is when they use the term marginalized communities.
00:53:11.060 Yet when you look around at the world population, the actual marginalized communities are not what they define them to be.
00:53:17.400 What I wonder most is who's gaining from this propaganda of lies.
00:53:20.660 It does not help anyone in the long-term game.
00:53:22.640 They're basically doing the opposite of what they say.
00:53:24.240 But, um, well, the people coming up with all this are the ones gaining from it.
00:53:27.720 They gain, um, now it's true that the people that they claim to be helping are not being helped.
00:53:33.060 But the people at Google, you know, our big tech overlords, they gain from all of this.
00:53:38.280 They gain control, power, um, well, control, power, and money is what they get.
00:53:43.200 That's, you know, another tale as old as time.
00:53:46.100 Um, finally, the good news is how Gemini is optional to Earth's future.
00:53:51.660 No one has to use it and won't.
00:53:52.960 Results like this make people opt out of using any products or service made by Google, and that worsens every time they do this.
00:53:58.380 Well, that's the optimistic view.
00:54:00.900 I, I, I hope you're right.
00:54:02.920 Uh, I think that you are almost certainly wrong, however.
00:54:05.880 Um, people are not opting out of using Google.
00:54:08.680 You know, people still use Google every day.
00:54:10.840 It's, it's, it's, it's very difficult to not use a Google product if you're on the internet.
00:54:16.160 And so I don't think that there's any evidence that people are curbing their, um, Google usage.
00:54:21.460 And I think it's unlikely that they will.
00:54:24.560 For the same reason that with Amazon, we know Amazon is super woke and all that, but, uh, everyone still orders from Amazon.
00:54:32.520 And why do they do that?
00:54:33.460 Because it's extremely convenient and it's cheap.
00:54:36.140 And you can order something, you can get it for cheap, and they'll get it to your house, like, at lunchtime.
00:54:40.640 Okay.
00:54:42.080 And so it's, and people use it because of convenience.
00:54:46.360 And that's why I think, I would like to think that people will opt out of this AI stuff.
00:54:52.040 As I said, I think the AI, I think even without the wokeness, I find all of this creepy.
00:54:57.660 And I think that the ultimate, even if it was not woke, right, the ultimate effect of all this AI technology on mankind will, will certainly be bad.
00:55:07.400 And I, I really can't imagine a scenario where it helps with human flourishing and makes people happier and more prosperous.
00:55:16.060 And if it's not going to make people happier and more prosperous and make them better people, then it's bad.
00:55:20.520 Then it's, then it's, if it's not doing that, then it's hurting.
00:55:22.300 And I don't see how this will have that result.
00:55:26.780 But I also don't think AI is going away.
00:55:29.240 And I don't think people are going to refuse to use it because it makes, it makes, it may not make you happier.
00:55:36.360 It doesn't.
00:55:37.400 It may not help with your flourishing on a spiritual and physical and every other sense level.
00:55:42.000 But it does make your life easier.
00:55:44.940 It is convenient to be able to just go to AI and say, give me this, and they give it to you.
00:55:51.120 And what we found is that people are willing to make moral and ethical choices when it comes to the products that they use and all that.
00:55:59.380 Unless the product increases the convenience in their life to a significant degree.
00:56:08.500 And if it does that, then almost everybody ultimately will say, well, okay, what am I going to be hassled?
00:56:14.820 What am I going to do it the hard way?
00:56:16.400 No.
00:56:18.040 And so I would like to think that you're right, but I think that you're probably wrong and we're all doomed.
00:56:24.940 Grand Canyon University is a private Christian university located in beautiful Phoenix, Arizona.
00:56:29.780 GCU believes that our creator has endowed us with certain unalienable rights to life, liberty, and the pursuit of happiness.
00:56:34.700 They believe in equal opportunities and that the America dream is driven by purpose.
00:56:38.000 GCU equips you to serve others in ways that promote human flourishing to create a ripple effect of transformation for generations to come.
00:56:44.300 Whether you're pursuing a bachelor's, master's, or doctoral degree, Grand Canyon University's online, on-campus, and hybrid learning environments are designed to help you achieve your degree.
00:56:52.420 GCU has over 330 academic programs as of September 2023.
00:56:56.380 GCU will meet you where you are and provide a path to help you fulfill your unique academic, personal, and professional goals.
00:57:02.420 Find your purpose today at Grand Canyon University, private, Christian, affordable.
00:57:06.700 Visit gcu.edu.
00:57:08.040 That's gcu.edu.
00:57:09.180 The time to join Daily Wire Plus is here during our President's Day sale.
00:57:13.120 Right now, get 30% off your Daily Wire Plus annual membership with code DW30 at checkout.
00:57:17.620 Your Daily Wire Plus membership is your exclusive backstage pass to engaging conversations with the smartest and most trusted talent in America.
00:57:22.920 It's your front row seat to the Daily Wire's upcoming hit movies and series like The Pendragon Cycle, Mr. Bertram, Snow White, and the Evil Queen, and more.
00:57:29.900 It's your inside access to ad-free, uncensored news and opinions that truly matter to you.
00:57:33.420 So why wait?
00:57:34.280 This is your chance to experience it all and more for 30% off during our President's Day sale.
00:57:38.340 Go to dailywire.com slash subscribe and use code DW30 at checkout.
00:57:41.920 Now let's get to our daily cancellation.
00:57:42.880 Well, last week I was blessedly out of the loop and not doing my show, and Rolling Stone's cover story on the actress Kristen Stewart went viral.
00:57:55.760 And in the article, Stewart goes on and on about how gay she is.
00:57:58.920 She says, in fact, that she wants to do the gayest thing you've ever seen in your life.
00:58:02.720 And, of course, if that's her goal, then she already achieved it with the Twilight films.
00:58:06.360 That was her own personal Mount Everest.
00:58:08.100 Can't get gayer than that.
00:58:09.000 But she wasn't satisfied with that.
00:58:10.800 She says that she is very fluid, that she deeply desires to flip the gender script.
00:58:15.960 She wants to, quote, send a message.
00:58:17.880 And that if she could, she would, quote, grow a little mustache and a happy trail.
00:58:22.700 So she's gay, you see.
00:58:24.820 Do you understand that?
00:58:25.660 Are you impressed yet?
00:58:27.020 She's gay.
00:58:28.080 She's so gay.
00:58:28.800 She's the gayest of all the gays.
00:58:30.000 She's the final boss of gayness.
00:58:31.480 She's like the queer Bowser.
00:58:33.240 Okay, this is what she desperately wants us all to know, that she is gay, just to be clear.
00:58:38.720 Hey, did I mention that Kristen Stewart is gay?
00:58:41.660 Did she mention it?
00:58:43.160 She did.
00:58:43.720 But in case you didn't hear, just to reiterate, Kristen Stewart is gay.
00:58:47.020 Very, very gay.
00:58:48.400 Now, you might think that her gayness is now established.
00:58:51.280 We can all move on.
00:58:52.240 But, in fact, it's established much more than it ever needed to be established.
00:58:57.800 You probably were not living your life wondering at all about Kristen Stewart's sexuality.
00:59:02.420 It was probably not a mystery that you were contemplating.
00:59:06.760 But, anyway, now you know.
00:59:08.660 She can't utter five words without announcing that she's gay.
00:59:11.160 She works it into every conversation.
00:59:12.920 If you asked her for directions to the gas station, she would say, yeah, just hang a left up there, and then you drive until you realize that I'm gay.
00:59:19.300 Hey, by the way, did you hear that I'm gay?
00:59:21.800 But if somehow the point was not crystal clear enough, then the photos accompanying the interview will drive it home.
00:59:28.080 You've probably seen these by now.
00:59:29.300 There's the one with Stewart in a mullet with her hand shoved down her jockstrap, and there are others in a similar vein.
00:59:37.320 Now, of course, the jockstrap and the hand down the pants is the most immediately nauseating thing about the pictures, but the mullet really puts the icing on the ugly cake here.
00:59:47.780 Somewhere along the line, and very recently, celebrity women decided that Joe Dirt is a fashion icon.
00:59:53.420 They're trying to capture, it seems, like the aesthetics of a middle-aged man who lived in a trailer park in 1987.
01:00:00.140 And the results have been as aggressively unappealing as you might expect.
01:00:04.840 Now, this week, Stewart was at the Berlin Film Festival showing her new movie, which is, you're never going to believe this, a gay movie about two lesbians.
01:00:12.980 And she was asked about the Rolling Stone cover and reaction to it, and she responded by complaining that the cover was censored.
01:00:18.620 I'm not even sure in what way it was censored.
01:00:20.880 Maybe she wanted her haircut to be even uglier than it was.
01:00:23.660 I can't say.
01:00:24.380 But here's her theory as to why this unspecified censorship occurred.
01:00:28.960 I love how the story, I love how the writer of the story, who was great and shaped it really well, and I had a really nice time with her, called the story uncensored, and then the whole cover was censored.
01:00:40.580 Because the existence of a female body thrusting any type of sexuality at you that's not designed for or desired by exclusively cis straight males is, like, something that people are, like, not, like, super comfy with.
01:00:55.060 And so I'm really happy with it.
01:00:56.940 I had a good time.
01:00:57.700 So she was, she says, thrusting sexuality at us that was not designed for cis, quote-unquote, straight males.
01:01:05.820 That's true.
01:01:06.660 Now, what she fails to mention is that it wasn't designed for any known member of the human species.
01:01:11.400 She is correct, though, that as a straight male myself, I don't want her thrusting her sexuality at me.
01:01:16.060 In fact, I'd rather she not thrust anything at all at me, just to be clear.
01:01:19.420 Not her sexuality, not her haircut.
01:01:21.900 I'd prefer she keep the thrusting to an absolute minimum, to be totally honest with you.
01:01:25.580 But thrust is an interesting choice of words.
01:01:27.700 Because it is, in this context, weirdly aggressive and ugly, which is a good way to describe the whole Kristen Stewart experience at this point.
01:01:37.040 The photos are ugly.
01:01:37.940 The words are ugly.
01:01:39.300 Even the way she talks about her desire to be a mother, when she talks about that, she does it in this bizarre and ugly way.
01:01:45.140 So she told Rolling Stone that she wants to start having babies with her lesbian lover soon, which is, of course, physically impossible, but we'll ignore that for now.
01:01:51.620 Here's how she describes this desire.
01:01:54.140 I don't know what my family's going to look like, but there's no effing way that I don't start acquiring kids.
01:01:59.220 Now, we're not going to focus on the double negatives in the sentence.
01:02:03.480 I don't know.
01:02:04.060 There's no way that I don't.
01:02:05.360 I suppose if I wanted to psychoanalyze her, I'd find some significance in the fact that even the grammatical construction of her sentences is relentlessly negative.
01:02:12.300 But putting that aside, she says that she wants to acquire kids, acquire them, like they're collectibles, like she's watching infomercials on QVC and calling an 800 number to purchase them, which, of course, is actually not far from how women like Kristen Stewart become mothers.
01:02:28.480 So the problem here isn't that her terminology is inaccurate.
01:02:31.600 It's actually the problem is that it's accurate.
01:02:33.460 So what do we learn from this?
01:02:35.800 Aside from getting more confirmation that no woman can pull off a mullet, no matter how hard they try, can't be done.
01:02:43.480 Well, if we learn anything at all, it's something we should have already picked up on.
01:02:47.560 It's something that I've been saying for years now.
01:02:49.020 In reaction to the Rolling Stone article, Chris Ruffo put it this way,
01:02:51.900 Queer is an ideology, not a sexuality, and it appears to make people miserable.
01:02:55.140 They put dismal pictures of Ellen Page or Kristen Stewart under headlines with words like joy, family, happiness, propaganda that intends to demoralize.
01:03:04.240 That's exactly correct.
01:03:05.800 Queerness is ideological.
01:03:07.820 Now, it is sexual, too, in a certain literal physical sense, but it would be more accurate to call it anti-sexual.
01:03:14.160 It's not asexual.
01:03:15.620 It's anti-sexual.
01:03:17.500 Stewart may intend to thrust her sexuality in our faces, but instead she's thrusted her anti-sexuality.
01:03:23.320 She is a woman with a sexuality specifically constructed, as she admits, to be unattractive to the sorts of people who are attracted to women.
01:03:32.540 And that's what I mean by anti-sexuality.
01:03:34.520 It's a magnet meant to repel, not attract.
01:03:38.560 And that's because queerness, as Ruffo notes, is an ideology.
01:03:44.160 And it's an ideology of demoralization and destruction.
01:03:46.880 They have to keep reminding us that they're happy, trans joy, queer joy, because their whole approach to life, their worldview, their physical appearance, everything screams despair.
01:03:56.220 Even in that press conference clip, Stewart looks tired and miserable and disheveled.
01:04:02.160 And that's because her sexuality is defined not by who she loves, but by who she hates.
01:04:07.380 And the person that she hates, most of all, as we can clearly see, as is always the case with these people, is herself.
01:04:17.760 And that's why Kristen Stewart is today canceled.
01:04:21.340 That'll do it for the show today.
01:04:22.520 Thanks for watching.
01:04:23.320 Thanks for listening.
01:04:23.960 Have a great day and a great weekend.
01:04:26.140 Godspeed.
01:04:26.420 Godspeed.