ManoWhisper
The Matt Walsh Show
- November 01, 2019
Ep. 362 - Democrats Dehumanize The Unborn, Volume 989,128,002
Episode Stats
Length: 42 minutes
Words per Minute: 171.19
Word Count: 7,202
Sentence Count: 409
Misogynist Sentences: 17
Hate Speech Sentences: 3
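The words-per-minute figure above is just the word count divided by the runtime; a quick back-of-the-envelope sketch (the seconds-level duration is an assumption, since the page rounds the length to whole minutes):

```python
# Sanity-check the episode stats shown above:
# 7,202 words over 42 minutes at a reported 171.19 words/minute.
word_count = 7202
reported_wpm = 171.19087

# Naive division using the rounded 42-minute length:
naive_wpm = word_count / 42
print(round(naive_wpm, 2))  # ~171.48, slightly above the reported rate

# The unrounded duration implied by the reported rate:
implied_minutes = word_count / reported_wpm
print(round(implied_minutes, 2))  # ~42.07 minutes, consistent with "42 minutes"
```

The small gap between 171.48 and 171.19 suggests the site computes the rate from the exact audio duration rather than the rounded minute count.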
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo).
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000
You know, the cutoff really has to be double digits, I think. That's the cutoff. That's all
00:00:03.880
I'm saying. I think once you hit double digits, it's time to stop trick-or-treating. Get off my
00:00:08.340
porch and find a job at that point, once you get to 10 years old. Now, I was out with my kids last
00:00:12.580
night, and there were these teenagers hitting the same houses that we were hitting, and there were
00:00:17.240
a few houses ahead of us. They had no costumes. They were just putting their candy into their
00:00:24.780
book bags. It looked like they just got home from school, and they just decided to, on the way home,
00:00:28.940
might as well hit a couple houses and get some candy. And all I'm saying is, at least put some
00:00:35.160
effort into it. A 16-year-old, not in a costume, going up to a house on Halloween and taking candy.
00:00:43.700
It's like if a five-year-old was invited to his kindergarten friend's birthday, and the five-year-old's
00:00:50.840
16-year-old brother showed up at the party right when the cake was being cut, had some cake, and
00:00:57.140
left. It's like, it's not really for you, but if you are going to come, you should participate
00:01:04.120
in the whole thing. And so, in this case, put on a costume, grab one of those pumpkin candy
00:01:10.180
buckets, say trick-or-treat, the whole nine yards. People say, oh, well, it's better for teenagers to
00:01:14.760
be trick-or-treating than out causing trouble. Well, first of all, they can multitask. Second,
00:01:20.760
there is a third option. They could be picking up shifts at the coal mine, okay, like I did at their
00:01:28.000
age, making themselves useful. So, now, this is for older kids. If older kids want to trick-or-treat,
00:01:33.520
they have to wear costumes. But once you're not even a kid anymore, once you're an adult,
00:01:38.480
so at 18 years old, you're too old to trick-or-treat, obviously. You're also too old to wear costumes.
00:01:44.520
People don't like it when I say this, but adults should not be wearing costumes.
00:01:47.540
Costumes are for kids. Halloween is for kids. And if you think I'm being a Scrooge, I am. That's
00:01:53.960
kind of my thing. But it's not just that. Let me give you Exhibit A. Look at this from yesterday.
00:01:59.160
This picture here, this is Representative Katie Porter, a Democrat, obviously,
00:02:04.880
at the Capitol yesterday in a Batman costume. She's working at it. She's, you know, doing
00:02:11.900
supposedly serious business wearing a Batman costume. She looks like a mental patient.
00:02:19.220
This is what happens when adults wear costumes. Just look at that. It's a disgrace. It's an
00:02:24.400
embarrassment. Think about what you're doing. How do you not look at yourself in the mirror before
00:02:29.400
you leave the house and think to yourself, oh, my God, I'm an adult dressed like Batman.
00:02:33.960
Again, how does this not cause you to reevaluate not just your outfit for the day, but your whole
00:02:40.860
life? And this is why under my administration, the legal cutoff for costumes will be 18 years
00:02:48.100
old. All violators of this, anyone caught in a costume over the age of 18 will be stuffed
00:02:53.580
into a burlap sack and thrown into the sea. And as I always say, I take no pleasure in handing
00:03:02.000
out those kinds of punishments. It will hurt me more than it will hurt the person being thrown
00:03:06.380
into the ocean. But it's just we have to have law and order. And this kind of degeneracy simply cannot
00:03:12.580
be tolerated. Now, from one Katie to another, speaking of degeneracy, Katie Hill, you know that
00:03:21.380
lady that was having sex with her 22-year-old female campaign aide, paying her with campaign funds,
00:03:26.260
and then allegedly also having sex with another staffer who was also, of course,
00:03:32.540
being paid, this one with tax money. She gave her final speech before Congress yesterday
00:03:38.460
and she was not wearing a costume. It would have been kind of funny, actually, if she was the one
00:03:44.900
wearing, if she was the Katie wearing the Batman costume, giving the speech after resigning because
00:03:51.620
of ethics violations. Now, that would be worth it. But no, despite being a disgraced
00:03:56.700
politician resigning over ethics violations, she tried to turn the whole speech into some kind of
00:04:02.960
triumphant feminist moment. Watch this. This is the last speech that I will give from this floor as a
00:04:11.900
member of Congress. I wasn't ready for my time here to come to an end so soon. It's a reality I'm still
00:04:18.780
grappling with and I will be for a long time to come. The mistakes I made and the people I've
00:04:24.580
hurt that led to this moment will haunt me for the rest of my life and I have to come to terms
00:04:28.600
with that. Ever since those images first came out, I've barely left my bed. I've ignored all the calls
00:04:34.520
and the texts. I went to the darkest places that a mind can go and I've shed more tears than I thought
00:04:40.240
were possible. And I'm here today because so many of the people I let down, people close to me,
00:04:47.000
supporters, colleagues, people I've never even met, told me to stand back up and that despite
00:04:52.940
all of my faults, they still believed in me and they were still counting on me. And I realized
00:04:58.840
that hiding away and disappearing would be the one unforgivable sin. I am leaving now because of a
00:05:05.340
double standard. I'm leaving because I no longer want to be used as a bargaining chip. I'm leaving
00:05:11.200
because I didn't want to be peddled by papers and blogs and websites used by shameless operatives for
00:05:16.960
the dirtiest gutter politics that I've ever seen and the right-wing media to drive clicks and expand
00:05:23.160
their audience by distributing intimate photos of me taken without my knowledge, let alone my consent,
00:05:28.860
for the sexual entertainment of millions. I'm leaving because of a misogynistic culture that
00:05:34.840
gleefully consumed my naked pictures, capitalized on my sexuality, and enabled my abusive ex to continue
00:05:41.260
that abuse, this time with the entire country watching. I am leaving because of the thousands
00:05:47.460
of vile, threatening emails, calls, and texts that made me fear for my life and the lives of the people
00:05:53.420
that I care about. I'm leaving because for the sake of my community, my staff, my family, and myself,
00:06:00.040
I can't allow this to continue. Today, as my final act, I voted to move forward with the impeachment of
00:06:08.020
Donald Trump on behalf of the women of the United States of America. We will not stand down. We will
00:06:14.700
not be broken. We will not be silenced. We will rise, and we will make tomorrow better than today.
00:06:21.900
Now, let me ask you, did you notice what was missing from her list of reasons why she was
00:06:28.260
leaving? She's leaving because of misogyny. She's leaving because of double standards. She's leaving
00:06:33.860
because of her ex-husband. She's leaving because of political operatives. She's leaving because of
00:06:38.080
sexism and patriarchy and everything. But she forgot to mention that she's also leaving because she was
00:06:43.140
using her subordinates like sex toys. And that's one little detail that she seemed to have left out.
00:06:49.680
I'm sure unintentionally she just forgot to mention it. Of course, she's talking about double standards,
00:06:54.860
but the only double standard is the fact that she can get away with making herself out to be persecuted,
00:07:01.100
making herself the victim in this scenario, and the media just goes along with it. A man would not get
00:07:08.060
that benefit. So that's where the double standard is. In fact, the double standard is so strong in her favor
00:07:13.540
that the left now, for all intents and purposes, has canceled the Me Too movement for her sake.
00:07:19.240
The Me Too movement is canceled now. They threw it all to the side. The first time a woman really,
00:07:26.120
really, you know, came under scrutiny, they said, okay, never mind on the Me Too movement thing.
00:07:32.240
Forget it. Never mind. Because everything that they've been saying about consent
00:07:41.860
is now gone. All for the sake of defending this woman.
00:07:49.380
Because they have been saying all along, if you're in a position of power and someone is a subordinate,
00:07:56.500
they cannot consent because they're not on the same level of power and they're not going to feel
00:08:04.680
like they have the ability or like they're empowered to say no, so on and so forth. Those
00:08:11.680
are not my rules. That's not what I say. Now, my own personal feeling is that certainly if you are
00:08:20.380
a member of Congress, having a sexual relationship with a subordinate in that scenario is definitely
00:08:26.160
wrong and it is an ethical violation. Now, if we weren't talking about Congress and we weren't talking
00:08:32.300
about campaign funds and tax money and everything, and this was just in a private business and you had
00:08:40.740
someone in a sexual relationship with a subordinate, then it kind of depends. It's sort of
00:08:47.320
situationally based. I don't personally think that it's impossible for there to be consent or anything
00:08:54.900
like that or that it's automatically sexual assault or something. I don't personally think that, but
00:09:00.080
we're not talking about my standards. We're talking about the standards set by the left, by people
00:09:06.100
like Katie Hill, who have been very clear about this. They've been very clear that just because someone
00:09:12.640
says, yes, I want to do it, doesn't necessarily mean that it's real consent. They've said this over and
00:09:19.640
over and over again. And now I guess we're back to, well, hey, I mean, the other
00:09:24.800
woman wanted it, or she said she did, so that's it. That's all that matters.
00:09:31.440
Well, Me Too movement then. RIP, I guess. Well, while we're on the subject of detestable
00:09:38.760
Democrat women, which seems to be the theme here so far, let's check in with Wendy Ullman,
00:09:43.440
state representative in Pennsylvania. They were debating a bill in Pennsylvania that would require
00:09:49.000
hospitals and abortion clinics to respectfully dispose of the remains of unborn children.
00:09:56.640
Because as it stands right now, aborted children are tossed in medical waste dumpsters along with
00:10:03.400
used needles and other toxic trash. That's how the human remains are treated. This bill would seek to
00:10:10.240
grant just a little bit of dignity and humanity to these humans. But Ullman is worried about this and
00:10:17.480
about the implications. And she tried to express her concerns and it didn't go so well.
00:10:23.840
It refers specifically to the product of conception after fertilization, which covers an awful lot of
00:10:31.480
territory. I think we all understand the concept of the loss of a fetus, but we're also talking about
00:10:38.880
a woman who comes into a facility and is having cramps. And not to be concrete, an early miscarriage is
00:10:56.800
just some mess on a napkin. And I'm not sure people would agree that this is something that we want to
00:11:04.560
take to, take to, take to the point of ritual, uh, either cremation or interment.
00:11:11.200
Yes, a mess on a napkin. That's how she sees unborn life as a mess on a napkin.
00:11:20.000
And you could tell there right before she said that she paused noticeably and was searching for the right
00:11:28.600
words because she knew that what she was about to say, it was horrible. And so she was looking for
00:11:34.800
a different way of putting it. And this is what she came up with. So actually I wonder in her mind,
00:11:42.540
she obviously was originally going to say it a different way. She had option A in
00:11:47.760
her mind. She was about to say it. She stopped and then went with option B. So considering that this is
00:11:54.560
what she decided to say, I wonder what she originally was going to say before she settled on
00:11:58.940
this. Um, either way though, it's honest. I mean, this is how, as a
00:12:04.340
Democrat, she sees unborn life. And it makes sense from that point of
00:12:11.720
view, using that logic. An unborn baby is a clump of cells. When you're aborting a child, it's just a
00:12:18.380
clump of cells, basically a cancerous tumor you're getting rid of. And in that case, yeah,
00:12:23.800
if an unborn child that's aborted is a clump of cells, then I guess a miscarried child is a mess
00:12:28.680
on a napkin, as she puts it. Now, you know, we, we've had miscarriages. My wife's had miscarriages
00:12:35.100
and it's a very difficult thing to go through emotionally and, um, for the mother physically.
00:12:44.700
And in fact, this is what immediately came to mind for me when I was listening to this horrid
00:12:51.740
woman saying this, uh, one of the things that adds to the emotional difficulty of a
00:13:04.200
miscarriage is precisely that so many people have the attitude of Ullman here. The attitude that she's
00:13:12.400
expressing is exactly how a lot of people feel about it. And I, even if they're not so explicit
00:13:20.220
about it. You know, when you have a miscarriage,
00:13:26.620
so many people don't really view it as losing a child. And then the result is that a woman who's
00:13:33.280
going through this, she's already got the emotional burden of it. Um, but then on top
00:13:41.400
of it, she feels alone and isolated as she's mourning her miscarriage because society doesn't
00:13:48.140
see it as a big deal. And so she feels like she's got no one to turn to and nowhere to go for
00:13:54.280
empathy. Because society is saying basically what Wendy Ullman is saying, which, ah, it's just a mess
00:14:02.300
on a napkin. Who cares? It's just, it's evil and it's vile. But every time a Democrat takes off the mask
00:14:13.620
and really says what they actually think, uh, on these topics, even though it's horrible to hear,
00:14:24.960
I'm always grateful when they do, because I want people to, to see this. People need to see this,
00:14:30.840
that this is how the Democrat party views human life. It's important that we all see that. And then,
00:14:37.140
uh, the voters have to decide, do you want to support a political party that sees human life
00:14:43.880
this way? Do you agree with that? Is that how you see human life? You know, if you've ever had a
00:14:51.020
miscarriage or if you, someone you love has had a miscarriage, is that how you do, do you agree with
00:14:55.100
that way of looking at it? If not, then you have to really evaluate whether you can support a political
00:15:04.060
party that differs with you so dramatically in the way that it fundamentally views human life.
00:15:11.640
And that is a very fundamental issue, isn't it?
00:15:14.960
What, how we view human life? What is human life? Is it inherently important?
00:15:26.660
Democrat party says no. If you say yes, then it doesn't mean you have to be a Republican. It
00:15:32.420
just means that I think there's no way you could support these people.
00:15:37.920
Okay. I wanted to mention this case, speaking of vile and evil, um, reading now a little bit from
00:15:44.440
KTLA, an article on KTLA.com says the girlfriend of a Boston college student who died by suicide in May
00:15:54.200
repeatedly texted him to do so during their relationship. According to Massachusetts
00:15:59.080
prosecutors, um, who said this in announcing involuntary manslaughter charges against her.
00:16:05.420
Inyoung You, 21, tracked Alexander Urtula's location on May 20th and was present when he jumped from a
00:16:13.000
parking garage, only hours from graduation, authorities said. You, also a student at Boston College, was
00:16:18.680
physically, verbally, and psychologically abusive toward her boyfriend during their 18-month-long
00:16:22.720
relationship. Investigators looked through a trove of text messages the two exchanged in which You
00:16:27.760
allegedly tells Urtula, uh, who was 22 years old, to go kill himself or to go die,
00:16:35.000
and that he, his family, and the world would be better off without him, prosecutors said. The district
00:16:40.320
attorney said You is in her native South Korea and her office is cautiously optimistic that You
00:16:44.300
will return to the U.S. voluntarily. Uh, they're saying they're going to do what they can to get
00:16:48.180
her back. Um, skipping ahead a little bit. Prosecutors have described a pattern of abuse and manipulation
00:16:56.060
throughout the relationship in which You allegedly made demands and threats and exercised total control
00:17:00.740
over Urtula, both mentally and emotionally. She was aware of her boyfriend's depression, they contend. You
00:17:05.900
instructed Urtula to kill himself hundreds of times through the more than 47,000 text messages she
00:17:11.460
sent to him in two months. Now do the math on that.
00:17:22.740
You know, so that's about 60 days, 47,000 divided by 60.
00:17:27.860
How many text messages a day, how many text messages an hour? What does that add up to?
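Doing the arithmetic the host gestures at: 47,000 texts over roughly two months works out to hundreds of messages a day. A quick sketch, taking the two-month window as the "about 60 days" stated above:

```python
# Rough rate implied by "more than 47,000 text messages in two months".
texts = 47_000
days = 60  # "about 60 days," as stated above

per_day = texts / days
per_hour = per_day / 24

print(round(per_day))   # ~783 messages per day
print(round(per_hour))  # ~33 messages per hour
```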
00:17:35.000
um, Rollins said a bill is currently in front of a legislative committee that would make encouragement
00:17:40.080
or assistance of suicide a crime punishable by up to five years in prison. Now this,
00:17:46.060
um, okay. And they do bring this up. This, this will bring to mind, uh, the case of Michelle Carter
00:17:51.520
a few years ago, who was convicted of involuntary manslaughter for doing the exact same kind of
00:17:57.160
thing. Very similar, where she, through text messages over the course of many weeks, was trying to
00:18:03.520
encourage her boyfriend to kill himself. And, uh, and on, if I remember the case correctly on,
00:18:10.300
on the day, the night when he actually did it, she was, you know, he was texting with her and saying
00:18:16.660
that he was going to do it. And she was saying, yes, do it and encouraging. And then he did. And,
00:18:20.100
uh, she was convicted of manslaughter for that. Now, I think there are many different
00:18:26.200
directions you could go with this with analyzing something like this. But for me, I see this,
00:18:32.100
this tragic, disturbing case as a sort of clear boundary line for free speech.
00:18:39.900
We talk about what is free speech? What are the limits of, of free speech? When does speech become
00:18:44.360
something more than mere speech? And I think that here we see how that works because it's clear to me
00:18:52.440
that obviously this woman, this sick, twisted woman is not covered by free speech. She has committed a
00:18:59.120
crime. If the allegations are true, she has committed a crime and should be punished severely
00:19:02.960
for it. I think she'd be, should be punished much more than involuntary manslaughter. There was,
00:19:09.100
seemed to be a lot more intention behind it than that. Now, even though she didn't, as far as we know,
00:19:14.440
she didn't physically push him over the edge. She didn't physically kill him. She used
00:19:18.500
other methods. And those other methods may mean that her penalty is not as severe as it would have
00:19:23.680
been had she directly killed him. But there still has to be a penalty. The method that she used is
00:19:28.720
not acceptable in a civilized society and shouldn't be. And this is where free speech ends. When you're
00:19:33.760
using speech with the intention of directly and substantially and maliciously harming another
00:19:40.720
person, the malicious, and the malicious part of that is important. Because if you're, for example,
00:19:45.860
if you're calling out a politician for something that they've done or said, well, that might harm
00:19:54.560
them politically, might harm their career, but it's not malicious on your part. You're justified in what
00:20:00.880
you're doing. Malicious harm through speech is, is now in the case of a politician, that would be like
00:20:06.660
if you made up a lie about them to destroy them. Well, then that's not free speech. That's against the
00:20:10.760
law. And this is where the delineation is. And yes, there might be hypothetical cases where it's
00:20:16.580
harder to discern the nuances and distinguish between whether it's free speech or not. And you're
00:20:21.280
always going to have those hard cases. But in broad strokes, generally speaking, there is a clear
00:20:25.540
distinction here. And it is relatively easy to tell the difference, generally speaking, between these
00:20:31.200
things. Now, some people have, and this is my point, some people have tried to make free speech
00:20:35.820
into this very complicated, ambiguous thing. And the people who do that, it's, they're doing it for
00:20:43.940
a reason, because it benefits them. And usually these are people who really don't like free speech.
00:20:49.480
And, but rather than come out directly against it, they just try to make it into this ambiguous
00:20:53.640
concept that nobody can really understand. But really, it's not so complicated most of the time.
00:21:00.640
When you are, as this woman did, harassing and tormenting someone relentlessly with the intention
00:21:09.500
of causing significant harm to them, and that is clearly what you intend to do and why you're doing
00:21:16.000
it, that's not free speech. And that, obviously, to me, is not what the founders intended. When the
00:21:25.480
founders codified our First Amendment free speech rights, they didn't have stuff like this in mind.
00:21:32.140
I think we could be pretty sure about that. Now, on the other hand, when you give an opinion
00:21:38.900
that, let's say, hurts someone's feelings, when you're speaking honestly about your opinions on
00:21:44.820
whatever subject, and someone has their feelings hurt because of it, that is free speech. And there
00:21:50.400
actually isn't that much room for confusion here. So when we talk about, we get into the free speech
00:21:56.400
subject, and someone says, well, you know, what is an example of when it could be, when it should be
00:22:04.480
against the law for someone to just say something? Well, here we go. Here's an example that I think
00:22:11.840
most of us could agree with. And that's where the dividing line is between free speech and not free
00:22:18.280
speech. It's also interesting here, this was in, right, this was in Boston, right? And so now they're
00:22:27.620
talking about passing a law that will specifically make it illegal to do this kind of thing to, I guess,
00:22:37.440
encourage, okay, make encouragement or assistance of suicide a crime, punishable up to five years in
00:22:47.080
prison. Now, would this apply to so-called doctor-assisted suicide? It seems like you would
00:22:58.020
have to choose between the two. And anyone, and here's another angle we could explore with this,
00:23:04.480
that anyone who hears about the story of this disgusting woman and what she did, or Michelle
00:23:14.280
Carter, that kind of thing, if you hear about that, and you think that is despicable and horrible,
00:23:20.380
and that person needs to go to jail, which I think any civilized person, that's how you would feel about
00:23:24.160
it, then you should also be against assisted suicide in every circumstance, including doctor-assisted
00:23:34.280
suicide. Because I imagine, and there's kind of a
00:23:40.280
cognitive dissonance here, there are probably a lot of people who think that doctor-assisted
00:23:44.080
suicide is okay, but would hear about a case like this and think that's horrible, must be against the
00:23:49.440
law. Now, obviously, there is a difference between the cases that you could point to. There are several
00:23:56.680
differences. There's no doctor involved, and this was, you could say, manipulative,
00:24:08.200
and all of that. But the fact is, it's just a question in society,
00:24:15.320
if someone is depressed, you know, if someone is going through a difficult time, should we encourage
00:24:23.800
and help facilitate their suicide or not? And I would say not. The right and moral and ethical
00:24:35.420
and civilized thing to do is to help them, to get them treatment, and our message always has to
00:24:41.900
be, your life is worth living. And if that's going to be our message, if that's how we should
00:24:49.560
approach, you know, this issue, then I think you can't have so-called doctor-assisted
00:24:57.240
suicide, or this. I think both of them go out the window. And, you know, the people who
00:25:09.760
would say, well, it's different because you've got a doctor involved. Getting a doctor involved
00:25:14.280
in many ways makes it worse because it's a total perversion of medicine. Think about the Hippocratic
00:25:22.540
oath, do no harm. Medicine is supposed to be about treating disease, helping people, treating people,
00:25:32.260
curing people, making them healthier. That's what medicine is supposed to be.
00:25:40.340
There shouldn't be a time in medicine where we are directly killing people on purpose. That's not
00:25:46.200
medicine. All right, let's move on to emails. mattwalshow at gmail.com, mattwalshow at gmail.com.
00:25:57.180
Um, this is from, uh, let's see here from John says, hi, Matt, I like your show and your activism.
00:26:04.420
That said, I think your NCAA take is wrong. Basically, your point is that you're not pro-free
00:26:10.360
market if you think the rule that NCAA athletes can't use their name to make money is fair. However,
00:26:15.920
I think you forget the essence of the free market: freedom of contract. No one forced these athletes
00:26:20.640
to enter into a contract with such a clause on name and
00:26:27.280
likeness. On the contrary, athletes enter these contracts because they believe that even with such
00:26:31.580
clauses, the contract benefited them. They benefit with scholarships, fame, a chance to make it in the
00:26:36.800
pros, et cetera. Now you might think that the deal is unfair, that the NCAA
00:26:42.300
makes too much money and that the athletes make too little, but that's irrelevant. The athletes
00:26:46.640
in a free market environment use their freedom of contracts to enter into these agreements
00:26:50.100
because they believe that they end up better off with the agreement than without. They got what
00:26:54.940
they bargained for in a free market environment. To the contrary, if you pass a law or a court
00:26:58.900
decision saying that the athletes cannot contractually waive their rights to use their name
00:27:03.100
and likeness, you are going against free market the same way a minimum wage law would. This is exactly
00:27:08.120
like a minimum wage, uh, because you would be saying, no, even if you want to, you cannot legally go
00:27:13.060
below this threshold. But what if I'm starting to work and I'm willing to work
00:27:16.580
for $5 an hour? No, you can't. What if I'm willing to waive my right to use my name
00:27:20.700
to get into the NCAA and get a scholarship? No, you can't. It's the same thing. Of course,
00:27:25.700
from the free market perspective, it's totally fine if the NCAA decides on its own, or after
00:27:30.440
bargaining with the athletes, to let them use their name. Yeah. Well, that's what you said at the end
00:27:35.760
there is kind of the crux of it, because what we talked about a few days ago was that the NCAA was deciding
00:27:40.700
that they were going to allow athletes, and they didn't get into specifics yet as far
00:27:45.800
as I know. So there's a lot that has to be hashed out here, but the basic broad stroke here is that
00:27:49.540
they are going to allow athletes to profit off of their name and image. Um, and then you have the
00:27:54.420
government coming in and saying, or at least one politician coming in and saying he's going to
00:27:59.260
propose a law that would penalize students who take advantage of that and decide to profit off of
00:28:05.920
their name and image. So that I assume you would agree is anti-free market where now you've got a
00:28:12.260
contract between the NCAA and these athletes, and then you've got the government coming in and saying,
00:28:17.460
no, no, no, we don't like that. And so we're going to punish you for it. That obviously is not free
00:28:21.420
market. Um, also I would say that I have been pushing for this myself. I think that
00:28:33.420
the NCAA should allow this. That's not an anti-free-market view on my part.
00:28:39.960
You know, I never said that the NCAA necessarily didn't have the right to prevent
00:28:45.580
athletes from making money in this way. It wasn't a matter of, did they have the right to do it? I
00:28:50.920
didn't say they don't have the right to do it. I said it's not right. So they might have the right to make
00:28:57.440
these kinds of rules, preventing athletes from profiting off of their own name and image.
00:29:01.140
But I think it's wrong. And this is, this is a distinction that people struggle with a lot. I'm
00:29:06.620
not exactly sure why. Because this happens a lot where I'll say, okay, this and that, you know,
00:29:11.740
I think that such and such is wrong and you shouldn't do that. And then inevitably I'll get
00:29:17.140
the emails from people saying, oh, you're saying we shouldn't have the right to do that. No, that's
00:29:21.380
not what I'm saying. If I think that you shouldn't have the right to do it, then I'll say that.
00:29:26.200
That's a separate argument. But there are a lot of things that we have the right to do as individuals
00:29:33.160
or as organizations, as companies, whatever, that I still think we shouldn't do because it's not
00:29:41.280
right. It's unethical. It's immoral in my opinion. And so that's my argument.
00:29:44.740
Um, and, uh, the third point is that, you know, I don't think that your comparison
00:29:55.420
exactly works, because with the NCAA, this is this massive organization, um, controlling all of
00:30:09.280
these schools across the country, passing these rules from on high, telling athletes that they
00:30:17.160
can't go on their own, we're not even talking about making an income, saying that they
00:30:23.040
can't even go, in their own private life, and sell a jersey or something. Now, prior to this rule change
00:30:31.100
they're talking about, that was what they were saying. So it's hard for me not to see it that way. I don't know
00:30:36.600
how you could see that as anything but a violation of free market principles. I don't know how you
00:30:42.900
could say that that is consistent with the free market, because even in the example of Walmart
00:30:48.940
with the minimum wage that you're talking about, yeah, you enter into a contract with
00:30:54.300
Walmart about what your salary is going to be, but Walmart's not going to tell you that when you
00:30:59.360
leave and go home, you can't sell something on Craigslist. And if Walmart did try to pass a rule
00:31:09.740
like that, telling employees that even when they go home and are on their own time, they're not allowed
00:31:16.360
to go on Craigslist and sell something of theirs, I would say that that certainly isn't consistent with
00:31:22.440
the free market. And whether or not they should have the legal right to pass a rule
00:31:26.720
like that, well, that would be a discussion we could have, but there's certainly a question
00:31:31.940
about whether or not they would. This is from Leslie, who says: Hi, Matt, I do the grocery
00:31:37.180
shopping every week for my family. I have to go during school hours. So I don't have to drag around
00:31:41.140
little humans that hang off my buggy and dabble in petty thievery. I tend to go late mornings to try
00:31:46.580
to miss restocking so I can rush through and tackle my list. But lately there seems to be a recurring
00:31:51.220
issue of larger humans standing in the middle of the aisles, texting on their phones and looking annoyed
00:31:55.940
when I say excuse me to try to get around them. During your dictatorship, would you please summon
00:32:00.360
these people immediately to an American gladiator style demise with those big Q-tip looking bats and
00:32:06.080
a vat of alligators? Thank you in advance for your ruling. Leslie, um, you have identified one of the
00:32:13.520
great threats to human civilization that we face and not just in supermarkets, but everywhere. People
00:32:18.640
who stand in walking aisles, uh, you know, walking lanes, aisles, sidewalks, clogging up foot traffic,
00:32:24.980
people who walk too slow, people who stand in the walking path of those moving walkways at the
00:32:30.660
airport, where there's supposed to be one side where you stand and one side where you walk, but there are people who
00:32:33.820
stand in the walking aisle. This is an epidemic. I agree. It's chaos. As for the penalty
00:32:40.680
under my regime, I like your thought process. I like the creativity,
00:32:48.200
but here's my thought. Anyone who blocks an aisle or stands when they should be walking or walks too
00:32:53.580
slow will have their legs amputated, because if you aren't going to use them, then you lose them.
00:32:59.620
That's the basic principle here. And, uh, by the way, that principle will also apply to people who
00:33:04.840
don't make sufficient use of their brains as well. Use it or lose it is going to be the rule under my
00:33:10.620
regime. This is from Andrew, who says: Why do people think an impeachment is good
00:33:15.700
for Trump? I'm seeing, all over Twitter, mid-level conservative figures declare that what the House
00:33:20.500
is doing is an automatic win for Trump in 2020. Is that really the case? Well, the only way that
00:33:26.660
it's bad for Trump is if he actually gets thrown out of office, which isn't going to happen. So the
00:33:31.960
problem for Democrats is that, as far as I can tell, there really isn't anything for them
00:33:35.700
to gain from this politically speaking, the anti-Trump voters who love the impeachment stuff.
00:33:41.000
Well, they're going to vote for Democrats regardless. The Democrats didn't need to do this to win the
00:33:47.940
votes of the rabidly anti-Trump voters, because they're going to vote against Trump no matter what
00:33:53.220
the Democrats do. The issue, then, is how does it play with people who are in the middle
00:34:00.640
and people who are center-right? The people who are in the middle or center-right, who don't
00:34:06.860
really like Trump, are kind of sick of his whole act, and could be convinced to vote against
00:34:14.260
him, but are concerned about the extremism and the craziness on the left. Now the
00:34:23.920
Democrats need to win a good chunk of those voters. The question is, this impeachment pageant, is it
00:34:31.920
going to help them win those people or is it more likely to alienate those people? And I would say
00:34:36.640
almost certainly the latter. This is from Rebecca, who says: Matt, what are your thoughts on Q?
00:34:46.360
Well, Rebecca, when I read this email, I was trying to figure out
00:34:49.600
what you meant. You didn't stipulate what exactly you're referring to. So you could be
00:34:56.140
asking for my general thoughts on the letter Q. Um, and as far as that goes, I'm a fan of the letter.
00:35:01.460
I don't, I don't think it's a top five member of the alphabet, but it's not bad. Um, you could be
00:35:06.320
asking me about the whole QAnon thing. And as for that, I don't believe any of that because I'm not a
00:35:10.960
moron. Um, or you could be asking me about the Q source, which is the hypothetical source Matthew and
00:35:17.860
Luke used to write their gospels. And as the theory goes, Matthew and Luke based their gospel accounts
00:35:23.240
on Mark, but they also had a second source, which would have been a collection of Jesus's sayings, uh,
00:35:29.200
without any narrative. So sort of like the apocryphal gospel of Thomas. Um, and I'm going
00:35:35.740
to pretend that you meant to ask me about that because that is the most interesting Q related
00:35:39.880
thing that we could talk about in my opinion. And, uh, and no, I'm not a proponent of the Q theory.
00:35:44.660
I think inventing a hypothetical gospel that nobody has ever seen is sort of absurd and far too
00:35:52.380
complicated when there are simpler, more logical explanations. Now, Matthew, Mark, and Luke
00:36:04.100
are the synoptic gospels. Of course, we call them synoptic
00:36:04.100
because they're so similar. It's clear that they did use Mark as a source, and we know
00:36:09.920
that because they repeat Mark almost verbatim at many different places in their accounts. So there's
00:36:15.940
just no reasonable way to explain the similarities other than that they used Mark.
00:36:24.320
And besides, in the preamble to Luke's gospel, he says that he consulted other sources. And so it
00:36:29.980
makes sense that he would have consulted an earlier gospel like Mark, although he doesn't specifically
00:36:33.520
say Mark. The issue, though, and where people get this Q thing from, is that
00:36:38.140
while Matthew and Luke repeat Mark verbatim at various points, they also mirror each other verbatim
00:36:43.520
at other points, but not Mark. And so the question is, where did they get that almost identical content
00:36:49.960
that isn't in Mark? And so the theory is, there must be some other, third thing
00:36:54.540
they looked at, that they were both using independently without knowing that the other
00:36:59.020
one was using it. But my thing is, you know, why couldn't it have been like this?
00:37:06.020
Mark writes his gospel first, and, as tradition has it, Mark was
00:37:12.640
a follower of Peter and an associate of Peter, and so would have gotten a lot of his
00:37:19.320
material from Peter. And then Matthew writes his gospel and uses Mark and also his own recollections
00:37:25.900
as an eyewitness apostle. And then Luke writes third and he uses both Matthew and Mark and then
00:37:32.580
adds his own material that he gets from, uh, from his own sources. So there's no need for Q and it seems
00:37:40.600
to be simpler, and it makes more sense that way, in my view. All right, finally, from Tim.
00:37:46.380
And I just answered a question that you almost certainly probably were not trying to ask.
00:37:50.580
From Tim, who says: Hi, Matt, I don't believe in divorce, but have you considered it with your
00:37:55.440
refrigerator problem? Yeah, I've been talking about this on Twitter. My wife has... well,
00:38:02.600
it all started, I think, with the syrup. She started putting our syrup in the refrigerator.
00:38:07.340
Originally I didn't say anything, because I thought, okay, it's not a big deal. It's
00:38:12.220
just syrup. So when I found the syrup in the fridge, I would take it out, because you
00:38:15.560
don't need to put syrup in the fridge. Okay? Syrup will stay good forever. You know,
00:38:20.160
syrup will outlast the plastic that it's in. It will outlast the bottle that it's in. Syrup
00:38:26.340
will stay good for a million years. That's just science, as far as I know. Although I could be
00:38:32.020
wrong. Probably. So she started putting syrup in the fridge and I didn't say anything.
00:38:36.360
And, and then, and then she started putting peanut butter in the fridge.
00:38:41.480
And that's when I started making comments and saying, you know, we don't need to put
00:38:45.300
peanut butter and syrup in the fridge. It can just go in the pantry. But then she started putting
00:38:50.960
honey in the fridge too. Honey. I mean, honey usually is in a beehive,
00:38:57.260
which is not refrigerated. In fact, beehives are very warm inside. Bees keep hives warm,
00:39:03.720
even in the winter time, because with the vibration of their
00:39:07.680
bodies, they heat up the inside of the hive so that it's at like 98 degrees inside
00:39:11.540
the hive. I tried to explain this to my wife. She didn't care. So now we've got honey,
00:39:16.920
syrup, and peanut butter going in the fridge. And every time I see it, I take it out and put it in
00:39:20.920
the pantry. And then she puts it back in the fridge, and we've got chaos going on. Well, I
00:39:24.580
opened up the fridge last night and I found honey, syrup, peanut butter, and bananas in the fridge.
00:39:32.640
She is now putting bananas in the refrigerator. Bananas are tropical fruits. You get them in the
00:39:38.380
jungle. They don't need to be in a refrigerator. There's no refrigeration in the jungle.
00:39:44.300
And you know what happened? I, I tried to talk to my wife about this calmly.
00:39:47.980
And I said, you know, I think we have a problem here. I think you have a compulsion. You have a
00:39:55.960
refrigerator-related compulsion. We can get help for this.
00:40:01.160
It's okay. You know what she did? She just laughed.
00:40:06.480
Refrigerator abuse is not a laughing matter.
00:40:10.480
I don't know where it's going to end. Before long, all of our kitchen items are going to end up in the
00:40:19.060
refrigerator. She's going to start putting like the toaster in the refrigerator.
00:40:24.040
But you know, as I've been talking about this on Twitter, I've discovered
00:40:27.480
that a lot of people out there are dealing with similar problems in their homes, where they've
00:40:34.620
got one spouse or the other who thinks that everything should be refrigerated.
00:40:38.580
And so this is an unspoken issue in America, and I'm glad that I could bring this
00:40:44.300
conversation to the forefront, because we need to talk about it. A refrigerator
00:40:49.480
is an important and powerful tool. It has specific uses, and to abuse it in this way
00:40:56.700
is dangerous and offensive to me on a personal level. And we'll leave it there.
00:41:05.180
Thanks everybody for watching. Have a great weekend. Godspeed.
00:41:09.260
If you enjoyed this episode, don't forget to subscribe. And if you want to help spread the
00:41:16.060
word, please give us a five-star review and tell your friends to subscribe as well. We're available
00:41:19.960
on Apple podcasts, Spotify, wherever you listen to podcasts. Also be sure to check out the other
00:41:25.120
Daily Wire podcasts, including The Ben Shapiro Show, The Michael Knowles Show, and The Andrew Klavan Show.
00:41:29.980
Thanks for listening.
00:41:30.700
The Matt Walsh show is produced by Robert Sterling, associate producer, Alexia Garcia Del Rio,
00:41:36.000
executive producer, Jeremy Boring, senior producer, Jonathan Hay. Our supervising producer is Mathis
00:41:41.240
Glover, and our technical producer is Austin Stevens. Edited by Donovan Fowler. Audio is mixed by Mike
00:41:47.440
Coromina. The Matt Walsh show is a Daily Wire production, copyright Daily Wire 2019.
00:41:51.700
If you prefer facts over feelings, if you aren't offended by the brutal truth,
00:41:56.460
if you can still laugh at the nuttiness filling our national news cycle, well,
00:42:00.200
tune on into the Ben Shapiro show where you'll get a whole lot of that and much more. We'll see you there.