The Matt Walsh Show - May 03, 2019


Ep. 252 - The Great Purge Has Begun


Episode Stats

Length

35 minutes

Words per Minute

165.1

Word Count

5,861

Sentence Count

394

Misogynist Sentences

10

Hate Speech Sentences

11


Summary


Transcript

00:00:00.000 Today on The Matt Walsh Show, Facebook steps up its censorship efforts.
00:00:03.760 We'll talk about the great purge of supposedly dangerous right-wingers.
00:00:07.980 And we'll talk about both the ethical and legal implications.
00:00:11.320 Also, it is now white supremacist to be objective.
00:00:17.420 That's what we're being told.
00:00:18.840 We'll discuss how the left has made white supremacy into a completely meaningless concept.
00:00:23.540 And finally, Burger King has figured out a cure for the mental health problem in America.
00:00:30.000 And we'll talk about that as well today on The Matt Walsh Show.
00:00:40.840 So Facebook yesterday stepped up its censorship campaign in one giant purge.
00:00:47.800 They permanently banned a number of high-profile accounts.
00:00:51.240 The ban includes Milo Yiannopoulos, Laura Loomer, Paul Joseph Watson, along with Farrakhan and a few others.
00:00:58.960 But it goes beyond that, even, because Facebook banned any representation of these people, meaning you can't share content by them.
00:01:09.960 So no InfoWars content is allowed on Facebook at all anymore.
00:01:14.280 No Alex Jones content, presumably nothing produced by Milo or Watson either.
00:01:19.600 So in explaining this sudden purge of people who all coincidentally, except for Farrakhan, happen to exist on the right end of the spectrum,
00:01:37.040 a Facebook spokesperson said that these people were banned for, quote, engaging in violence or hate.
00:01:45.800 Facebook also said that they are dangerous.
00:01:48.040 And then there's a report on the BBC's website, which has more information.
00:01:52.960 It says, a spokesperson at Facebook said the ban will apply to all types of representation of the individuals on both Facebook and Instagram.
00:01:59.960 The firm said it would remove pages, groups, and accounts set up to represent them and would not allow the promotion of events when it knows the banned individual is participating.
00:02:09.760 So you can't even promote an event if one of these guys, or woman as well, Laura Loomer, is going to be there.
00:02:18.360 In an email, Facebook explained its rationale for banning the users.
00:02:21.800 Listen to this. It said Alex Jones had hosted on his program Gavin McInnes, leader of the Proud Boys, although I believe Gavin has stepped down from the Proud Boys.
00:02:34.800 Whose members are, and this is me reading now from the BBC, known for racist, anti-Muslim, and misogynistic rhetoric.
00:02:44.080 Mr. McInnes has been designated a hate figure by Facebook.
00:02:47.740 Facebook said this year Milo Yiannopoulos had publicly praised both Mr. McInnes and English Defence League founder Tommy Robinson, both banned from the network.
00:02:58.460 Laura Loomer also appeared with Mr. McInnes, and Facebook said she also praised another banned figure, Faith Goldy, a Canadian.
00:03:09.580 I almost said a Canadian Nation of Islam leader; I'm blending sentences together.
00:03:14.840 Next sentence: Nation of Islam leader Louis Farrakhan was banned for making several anti-Semitic remarks earlier this year.
00:03:21.260 Okay. I mean, this is somewhat bewildering. So Milo is banned, according to Facebook, because he spoke positively in public about Gavin McInnes.
00:03:37.000 So that's a violation of the rules now: if you express support in public, not even on Facebook but just anywhere in public, for someone that Facebook has targeted, then you'll be banned also.
00:03:52.840 That's what they're saying. These are the sins committed by these individuals that Facebook mentions.
00:03:58.520 But if these people really have violated the rules, then why are you banning them all on the same day? Why are you doing this big dramatic thing where you're frog-marching them in front of the cameras?
00:04:12.540 If you're just objectively enforcing the rules, whatever they may be, then wouldn't you have banned, you know, Paul when he violated those rules, whenever that was?
00:04:24.700 And Alex Jones, whenever he violated them. Now, Alex Jones was banned on Facebook like a year ago, right? But now it's going even further: if you're associated with him, his website, everything, no content associated with him is allowed on the site.
00:04:46.780 Well, you know, I think Facebook could have made an argument for banning Alex Jones back when he was promoting the insane Sandy Hook conspiracy theory, which led to death threats against the parents of murdered children.
00:05:03.340 And it was a conspiracy theory that was completely invented and believed exclusively by morons and brain-dead idiots. So just to be really clear about that: it was not only a stupid conspiracy theory but a really evil, despicable one, which led to, as I said, the targeting of parents whose children had just been murdered.
00:05:26.580 And now they have these whack jobs coming after them. Now look, that was five years ago, though. So while Alex Jones was promoting that conspiracy theory, which he did, if they had said, no, we're not allowing this, this is false information, it's dangerous to the parents who are now being targeted for violence, we're not going to allow it on our platform,
00:05:52.840 and we're going to ban you for promoting it. If they had done that, then we could say, okay, well, that seems like a consistent enforcement of the rules; it doesn't appear to be politically motivated. But they didn't, is the point.
00:06:08.640 They didn't do anything when this conspiracy theory was actually being promoted on Facebook. Five years later, retroactively, they decided to punish him for past sins.
00:06:20.700 And that to me seems to be a problem. So, you know, why did you wait five years? If you're going to ban Milo, why are you banning him now? Why did you wait until now to do it?
00:06:36.160 What has he done recently to earn this? Oh, that's right. He spoke in support of Gavin. I mean, he said something nice about Gavin McInnes.
00:06:43.220 Well, that makes sense, right? No, it doesn't. But even if that's the reason, why didn't you ban him whenever he said that?
00:06:53.760 I'm pretty sure, I assume, Milo has appeared in public with Gavin McInnes in the past.
00:06:59.180 So if that's a violation of the rules, why didn't you ban him as soon as he did it?
00:07:04.660 And, you know, Paul Joseph Watson, what has he ever done to earn a ban?
00:07:11.240 I mean, whatever it was, whenever it was, why not ban him when it happened?
00:07:17.180 Well, the answer here, of course, is that this is a PR stunt by Facebook.
00:07:20.760 And the reason why they don't give more specific reasons for the bans, other than this weird thing
00:07:25.880 about Gavin McInnes, is that they don't have specific reasons. This is political censorship.
00:07:31.680 They're saying that Paul Joseph Watson engages in violence or hatred. They don't give any examples
00:07:37.320 of it whatsoever because there are no examples. They don't have any examples.
00:07:43.020 This is political censorship, plain and simple. Yeah, they tossed Farrakhan in there because,
00:07:48.000 if they're going to do this big thing, they needed to have somebody who's not,
00:07:53.740 you know, on the, quote, far right. Although the media has tried to lump Farrakhan in
00:08:01.600 with the far right as a far-right figure, which of course is ridiculous. Farrakhan is a leftist,
00:08:07.120 but he's kind of his own weird thing too, right? The fact is Facebook doesn't target people
00:08:13.420 who are really associated with the actual far left. No, this is all focused on
00:08:19.820 one side. And I could go through a laundry list of far-left accounts on Facebook right now
00:08:28.120 that are spreading far-left content, hateful, extremist, whatever.
00:08:34.380 But let's just remember, there are many examples, but
00:08:38.700 let's remember one just from this past week, because we just talked about it recently.
00:08:44.220 And remember, we talked about this. Here's the picture again. That's an account,
00:08:49.240 a Facebook account run by a burlesque performer. And there she is, half naked, with a young girl
00:08:56.720 stuffing dollar bills into her underwear. Okay, that picture was posted on Facebook.
00:09:02.620 And the picture was taken down, but as far as I know, it was taken down by the person who posted
00:09:10.180 it, not by Facebook. And this account, which proudly depicts the exploitation and abuse of
00:09:16.720 children, is still on Facebook. So that, what you just saw there in that picture? Okay,
00:09:22.840 that's not dangerous, that's not extremist or whatever. No, that's okay.
00:09:29.300 And why is that? Because she's a burlesque performer who's, you know,
00:09:36.180 promoting public nudity and, you know, the sexualization of children. That is a far-left
00:09:42.240 agenda item, and so that's why Facebook leaves it alone. So there are two questions.
00:09:50.780 (Fix my microphone.) There are two questions that we have to ask here about this.
00:09:58.920 Number one is, is Facebook right in censoring these people? And the second is, does Facebook
00:10:07.640 have the right to censor them? And those are two different questions with not necessarily the same
00:10:13.280 answer, but whatever the answer is to number two, before we get to that, I think the answer to number
00:10:18.260 one is very clear that no, they are not right in doing it. And every conservative should be up in
00:10:24.860 arms about this, speaking out about it, because this is absolutely political censorship. Facebook
00:10:29.940 says that it will ban people who are dangerous and hateful. Yet the only litmus test apparently
00:10:35.780 for what's considered dangerous and hateful is ideological. So by their standards,
00:10:41.800 you are dangerous and hateful, or you're at least close to being dangerous and hateful if you're on the
00:10:47.260 right. So you can hope that they eat you last, you can hope that you get eaten last, but you
00:10:54.080 will get eaten if you're a conservative. I think that's becoming clear. The problem is with some
00:11:00.200 conservatives. As I've been watching the reaction from other so-called prominent conservatives,
00:11:06.920 I see some of them basically saying, well, I don't like Alex Jones,
00:11:12.840 I don't like Paul or Milo, so I'm fine with this. You know, I'm not going to say
00:11:17.960 anything about this because I don't like those particular guys. That is very short-sighted
00:11:22.740 thinking right there. Again, it would be entirely different in my view if Facebook had
00:11:35.180 clear rules, clear terms of service, and they enforced those rules equally. Now, that would be
00:11:42.200 different. If Facebook decided they were going to ban everyone who has extreme views, everyone who
00:11:48.160 could be called hateful, everyone who is radical and what have you, then, okay, the site
00:11:53.020 would be boring as hell in that case because all the interesting people would be gone, but at least
00:11:57.340 it would be consistent. And then Facebook would become a place kind of like LinkedIn or something,
00:12:04.920 just a bland sort of meeting place where basically no ideas are welcome.
00:12:09.200 And if that's what they decided they want to do, then, you know, I think there'd be no room
00:12:15.400 to complain. But when it claims to be a forum for the exchange of ideas
00:12:24.240 and claims to be not politically biased, but then it labels only those on one side as being
00:12:30.740 hateful and dangerous and extreme, and then bans them in one big PR stunt, smearing them
00:12:35.740 in the media in the process, that's a whole different ball game. So as for the second question,
00:12:35.740 do they have the right to do this? Well, I'll say one thing. I don't see how a company as powerful
00:12:44.300 as Facebook has the right to smear anyone as being dangerous and hateful without justification or
00:12:51.500 reason. Okay. That seems like libel to me, like defamation. And it has a very real effect on people's
00:12:57.840 lives. This is not just, you know, any old person saying, oh, you're hateful. It's
00:13:02.540 not just some Twitter trolls saying, oh, you're hateful and dangerous. When you've got one
00:13:07.020 of the most powerful companies in the world blacklisting you and then going to the media
00:13:11.980 and saying these are violent, hateful, dangerous people, that's going to affect your life.
00:13:18.160 That's going to affect your career. That's going to have a devastating effect on you. And not just
00:13:23.200 because you don't have a Facebook account anymore. I mean, you know, try getting a job when you have
00:13:28.920 been publicly smeared by one of the most powerful companies in
00:13:34.260 the world as dangerous and hateful. Think about how that's going to affect your
00:13:39.460 professional prospects. So if you're going to make those kinds of claims about someone,
00:13:45.900 I think you need to provide evidence. You need to provide justification, and just giving a couple of
00:13:51.740 examples of some of these people, you know, hanging out with Gavin McInnes, that's not enough.
00:13:57.820 Okay, that's not going to do it. Facebook does not have the right to defame anyone
00:14:04.020 any more than anyone else has the right to do it. If they have evidence to support the claim
00:14:09.680 that Paul Joseph Watson is dangerous and hateful and violent, then they should present it. If they
00:14:16.700 can't present it, then I think Paul has a case against them, because they're defaming him.
00:14:25.420 So that's the first thing. Now, as for their right to ban whoever they want, even on a politically
00:14:34.340 motivated ideological basis, well, that kind of hinges on what Facebook is exactly. And
00:14:41.500 that's the debate, right? And so you'll hear arguments about whether Facebook is just a
00:14:48.520 platform, or is it a publisher, or is it a public utility? If they're a utility like the phone
00:14:54.680 company, then no, they can't ban just anyone they want from using their platform. Are they
00:15:00.840 a utility? Well, you know, I can see the argument for it. After all, phones are a utility
00:15:05.780 and they're a communication tool. Facebook is a communication tool used by over a billion
00:15:11.860 people across the globe. On the other hand, you could argue that Facebook is really just an app
00:15:16.420 on your phone. So it's more like a channel on your television, not the cable provider itself,
00:15:22.480 thus not a utility. I tend to side with that point of view. Are they a publisher, though? Well,
00:15:28.800 if they're exercising this kind of editorial control, that would seem to make them a publisher.
00:15:35.320 The trouble is if they're a publisher, then they're going to be responsible for everything
00:15:39.460 that's posted on their site. And they don't want to be responsible for that. They don't want
00:15:43.960 to be responsible for all of it. But if they don't want to be responsible for it,
00:15:51.180 then they need to be just a platform, just a kind of benign, you know, stage that's provided,
00:15:58.400 upon which people can stand and express their views. But if they're just a platform, then they
00:16:06.280 can't exercise this kind of editorial control over it. If they are exercising the editorial control,
00:16:12.080 then that makes them a publisher. So the trouble, it seems to me, is that Facebook is dancing between all
00:16:17.580 of these different lines. One minute it's a publisher, the next it's a platform, the next
00:16:21.660 it's a utility. I think it has to be one thing or the other. It has to decide what it is.
00:16:27.340 And then it has to behave that way. For now, though, I will say that, sure,
00:16:34.020 they have the right to ban people, generally speaking, but they don't have the right to smear
00:16:40.360 anyone without basis. And putting rights aside and looking instead at what is right:
00:16:46.640 this is not right. What they're doing here is not right. It's not a war on extremism. It's not
00:16:52.680 an effort to stop hate. It's not any of that. This is political censorship, plain and simple.
00:16:58.780 That's what it is. And we need to be speaking up against it. All right. So a woman posted
00:17:06.380 this on Twitter, approvingly posted, I should add. It's a photo of a lecture slide from some
00:17:13.020 class, not sure where; I assume a college class. And anyway, look at this:
00:17:23.900 the slide lists what it calls characteristics of white supremacy.
00:17:29.700 And it lists, among other things, as a
00:17:38.340 characteristic of white supremacy, objectivity. And then, as you can see, other
00:17:43.220 kinds of banal or even positive qualities. Being objective, I would think, is a positive thing.
00:17:49.920 The idea of objectivity being a characteristic of white supremacy is apparently a thing now on
00:17:56.220 the left. This isn't the first time I've seen this. In fact, I had to go look it up
00:18:00.480 because I remembered seeing this in an article. It was on National Review a few months ago.
00:18:05.380 And just the first few sentences of the article say a course that will be taught
00:18:09.280 at Hobart and William Smith Colleges next year will teach students that
00:18:16.140 objectivity and meritocracy are examples of white mythologies and social constructs.
00:18:21.920 The description for the class says this course explores the history and ongoing manifestations
00:18:27.680 of white mythologies, longstanding, often implicit views about the place of white male
00:18:33.000 Euro-American subjects as the norm against which the peoples of the world are to be understood and
00:18:38.580 judged. The class is titled White Mythologies: Objectivity, Meritocracy, and Other
00:18:45.200 Social Constructions. Now, I'm not sure if this slide is from that particular class or not,
00:18:53.180 but either way, here's the point. White supremacy does exist in this world. There are
00:18:59.820 real white supremacists. There are real white racists out there. We have seen them. I don't
00:19:05.940 think there are a lot of them, comparatively speaking, but they do exist. They are out there.
00:19:12.000 And some of them are quite dangerous, legitimately so, which is what makes
00:19:21.180 it so unfortunate that the left has made itself incapable of fighting real white supremacy, because
00:19:27.560 it treats everything as white supremacy. This is the thing that for some reason leftists
00:19:34.260 fail to grasp. When you call everything white supremacy, when you call everything racist,
00:19:41.120 when you call everything sexist, when you call everything homophobic, on and on and on, then
00:19:46.540 you're not going to be left with any meaningful words to use, or meaningful labels to use, when
00:19:54.640 you actually encounter those things for real. If
00:19:59.680 you're a white supremacist for trying to be objective, and you're going to call someone
00:20:04.820 like that a white supremacist, then what are you going to say about the person who comes along
00:20:09.540 and actually says, I think white people are better than everyone else, and, you
00:20:14.560 know, people of other colors are inferior? You just used white supremacist on the
00:20:21.560 guy who was talking about objectivity. What are you going to say about this guy? Well, you're going
00:20:25.980 to call him a white supremacist too, but you have just basically let him off the hook, because you put
00:20:31.820 him in the same category as that other guy over there who was just making a benign statement
00:20:37.600 about objectivity. You have made it so that there's nothing meaningful you can
00:20:43.600 say against white supremacy, because you have turned everything into white supremacy.
00:20:49.000 And when everything is white supremacy, then nothing is white supremacy anymore, because the
00:20:53.540 word has no meaning. That's the problem. All right. Two other things to get to before we
00:20:59.520 read some emails. First, with no setup, just watch this.
00:21:08.520 I can't believe my student loan. I'm never moving out of my parents' home.
00:21:15.880 Just got ghosted. Should've known. Pretty sure I'll end up alone.
00:21:22.140 They say I'm too young to raise my baby girl. Take your opinions and suck it world.
00:21:29.140 All I ask is that you let me feel my way. All I ask is that you let me feel my way.
00:21:47.820 All I ask is that you let me feel my way.
00:21:52.920 Yeah, that's a fast food commercial.
00:22:08.960 And they're doing this because they want to promote mental health.
00:22:14.340 This is a campaign for mental health.
00:22:17.820 So they're renaming all of these meals according to certain moods, because it's going to promote mental health. Which, I have to say, you know, I was feeling really depressed.
00:22:30.580 But now that I can get a Pissed Meal at Burger King, everything is better.
00:22:35.940 You know, I don't even have to go to counseling.
00:22:38.180 Burger King has solved mental health, folks.
00:22:40.860 They've done it.
00:22:42.060 Big news.
00:22:42.840 Finally, speaking of mental health, I mentioned before that my wife has a condition.
00:22:51.520 She has a problem, really.
00:22:54.900 She's, as I mentioned before, she's addicted to decorative pillows.
00:22:58.480 And when I talked about this in the past, people, I think, thought that I was exaggerating to try to be funny.
00:23:08.180 Well, I want to show you something so that you realize I'm not exaggerating.
00:23:11.620 OK, look at this picture here.
00:23:14.440 Just look at it.
00:23:17.880 We're doing some spring cleaning.
00:23:19.620 You see those bags there.
00:23:20.540 And my wife is relocating her decorative pillow collection.
00:23:26.120 So those bags, all of those bags, are filled with decorative pillows.
00:23:30.880 All of those bags have pillows in them.
00:23:34.740 All right.
00:23:35.800 Decorative. Not even real pillows, not pillows that we can use. Decorative pillows.
00:23:40.660 And you see those bags?
00:23:42.320 Those are just the pillows that are not currently on display, because those are the non-spring-themed pillows.
00:23:48.120 Oh, yeah.
00:23:48.740 She has a pillow for every season.
00:23:50.460 So right now we've got the spring pillows out.
00:23:52.960 Those are the winter, fall and summer pillows.
00:23:55.940 So that's not even all the pillows.
00:23:57.520 Those are just, you know, three-fourths of the pillows.
00:24:04.300 If we traded in her pillow collection, we could buy a new car.
00:24:09.260 All I can say is, and this is what I was saying to her last night,
00:24:12.320 I can only hope that pillows appreciate in value over time, because we aren't going to have any other retirement savings,
00:24:18.120 with my wife spending forty-seven thousand dollars a month on pillows.
00:24:22.140 We're not going to have anything else.
00:24:24.260 It's just really a problem, guys.
00:24:26.740 So you see what I'm talking about.
00:24:28.400 You know, I'm not making this up.
00:24:31.120 Last week, my wife said she was going to the post office and I found her six hours later lying in the aisle at HomeGoods, passed out from pillow fever.
00:24:38.920 I mean, I caught her yesterday.
00:24:40.260 She was chopping up pillows and trying to snort them.
00:24:42.620 I mean, literally.
00:24:43.580 OK, those things didn't actually happen.
00:24:46.760 But they might happen if we don't get this woman to rehab.
00:24:49.920 Do they have rehab centers for suburban white women who are obsessed with HomeGoods?
00:24:53.060 Is that a thing?
00:24:53.900 Because that should be a thing if it's not already.
00:24:57.540 All I'm saying is, you know, she put those in trash bags.
00:25:00.000 And, you know, I'm not saying anything.
00:25:06.420 I just worry that those things are in trash bags, and there could be a terrible mix-up wherein someone accidentally thinks that those trash bags are filled with trash and puts them on the curb on trash day.
00:25:21.580 That could happen.
00:25:23.220 It'd be a terrible mistake.
00:25:24.860 It just could happen, though, accidentally.
00:25:29.100 Of course, then I would worry that my wife might literally stab me in the face.
00:25:32.160 But, you know, it might be worth it just to declutter a bit.
00:25:37.440 All right.
00:25:38.240 MattWalshShow at Gmail dot com.
00:25:39.740 MattWalshShow at Gmail dot com is the email address.
00:25:42.640 This is from, well, I'll keep this anonymous.
00:25:46.840 It says, Hi, Mr.
00:25:47.740 Walsh, I'm emailing for your advice on a moral dilemma
00:25:50.060 I find myself in. The guy I eat lunch with every day at work and I have become good friends. For a while,
00:25:54.740 we talked lightly about politics.
00:25:56.200 I always assumed he had similar views to mine based off what he said.
00:25:59.300 However, I was shocked the other day when he started trying to convince me of the anti-Semitic theory that the Jews are running the world.
00:26:04.940 At the time, I didn't know what to say.
00:26:07.920 So all I told him was that I think he's wrong, and that a large part of the success of Jewish people comes down to their values, which their religion inculcates in them.
00:26:16.440 However, I still can't shake this uncomfortable feeling I have around this guy because I believe him to be anti-Semitic.
00:26:22.100 Should I try to tell him further why that viewpoint is dangerous and wrong or should I just end communication with him?
00:26:26.760 Well, I don't think that we should immediately dump people to the curb and ostracize them for having views we disagree with, even if those are insane and hateful views, which anti-Semitism is.
00:26:38.800 So what I would do is I would continue to talk with him and try to show him the light a little bit, as you've been doing.
00:26:46.760 You know, maybe you don't know a lot about this guy yet, but sometimes people will have an idea in their head.
00:26:53.840 They'll be harboring this idea, but they don't say it out loud because they're rightly ashamed of it.
00:27:00.460 And then maybe one day they finally do say it out loud because they're kind of testing the idea.
00:27:06.200 They're seeing if it holds up to scrutiny.
00:27:09.760 And maybe that's what he was doing there, you know.
00:27:12.520 So I would give it some scrutiny.
00:27:14.240 Sometimes a person doesn't realize how bad their idea is, how stupid or detestable, until they've said it and they've kind of put it out there to be analyzed and argued against.
00:27:32.680 And then they realize.
00:27:35.180 So I would do him the kindness, do him the service, of scrutinizing that idea.
00:27:41.740 And if he's an honest guy, then maybe he'll see that the idea is stupid and he'll drop it.
00:27:47.780 It's possible, could happen.
00:27:49.840 And so you've helped him out in that case.
00:27:51.640 So I would just talk to him.
00:27:53.160 Ultimately, if you find out that he really is a committed, passionate, unmovable anti-Semite, then maybe find someone else to eat lunch with.
00:28:00.200 But I wouldn't assume that right off the bat.
00:28:02.320 I would try to maybe talk with him.
00:28:04.340 All right.
00:28:04.580 This is from Angela.
00:28:05.540 It says, hi, Matt.
00:28:06.100 The more I listen to your show, the more I enjoy it and respect you.
00:28:08.180 Keep telling the truth and searching for the truth, even if it's unpopular.
00:28:11.720 Two questions.
00:28:13.260 Number one, if you shaved your beard, do you think your children would cry and not recognize you?
00:28:18.140 Time for an experiment.
00:28:19.640 Number two, have you been stung by your bees yet?
00:28:23.020 I have not been completely clean shaven since my kids were born.
00:28:26.540 I don't remember the last time I took like a Bic razor to my entire face.
00:28:30.620 It's been, I don't know, probably was like high school was the last time.
00:28:33.460 Although I have in the past cut it down very short.
00:28:37.000 And my kids were disturbed by that.
00:28:38.740 In fact, my, my daughter thought that someone stole my beard in the middle of the night.
00:28:43.240 And she may have thought that because I told her that a gnome came in the middle of the night and stole my beard.
00:28:51.260 And now I have to go find it.
00:28:52.840 So I told her that and she believes it.
00:28:55.500 As for being stung by bees, yes, I have.
00:28:57.140 In fact, my first time out with the hive, I had the bee suit on.
00:29:01.020 I felt very confident.
00:29:02.360 You feel really powerful in a bee suit.
00:29:03.940 Like, you know, nothing can get you.
00:29:05.700 But then I forgot about my feet.
00:29:12.000 And so a bee crawled down in my shoe and stung my foot.
00:29:16.260 All right.
00:29:16.720 This is from Patrick.
00:29:17.700 I started reading this a few days ago and then I stopped, because I thought I had already answered it.
00:29:23.200 But then I was informed that I had not already answered it.
00:29:25.120 So here it goes. Patrick says, Future supreme overlord and ruthless dictator of the world,
00:29:31.000 I, your humble and future servant, have a question for you.
00:29:33.940 I listen to your show every day and notice you speak about the importance of religion.
00:29:37.360 I understand the importance of community and morals, but you don't need religion to have those things.
00:29:41.580 My question is, why is religion so important?
00:29:44.360 I'm an agnostic, former Catholic person.
00:29:46.260 I think I am a very moral person without religion.
00:29:48.500 Since I am not Catholic, I will undoubtedly be executed when you rise to power.
00:29:53.100 I consider it an honor and privilege to die by your hand.
00:29:55.700 I only ask you make it quick and painless.
00:29:57.600 Thanks.
00:29:57.980 Keep up the good work.
00:29:59.000 I really appreciate your attitude there, Patrick.
00:30:02.760 And I cannot promise the quick and painless part, but I do, again, appreciate the thought anyway.
00:30:09.440 As for morals and community without religion:
00:30:14.880 Well, are you sure I did not answer this?
00:30:18.400 I feel like I have; well, I probably just talk about this topic all the time.
00:30:21.200 Community?
00:30:21.720 Yeah.
00:30:21.840 You can have community without religion.
00:30:24.840 You know, many animals form communities.
00:30:32.900 Bees, like we've been talking about, have communities; ants have communities.
00:30:37.620 So yeah, you can have a community in a sense without a religion.
00:30:41.740 It might not be a very strong community.
00:30:45.560 It might not be a very meaningful community, but you can have that. As for morals, though:
00:30:50.380 Well, this is the important distinction here that I think sometimes
00:30:55.380 atheists and agnostics struggle to see. Nobody's claiming, at least I'm not claiming,
00:31:03.460 and I don't think any intelligent Christian claims, that as an atheist or agnostic, you
00:31:09.500 can't behave morally.
00:31:12.000 That's not what's being said.
00:31:13.840 You obviously can behave in a moral fashion as an atheist or agnostic.
00:31:19.060 Uh, I assume you've never killed anybody.
00:31:21.100 Hopefully you don't steal things and those sorts of things.
00:31:23.940 You know that it's wrong to do those things.
00:31:25.680 You don't do those things.
00:31:26.420 Fine.
00:31:26.620 But the question is: is there any basis, any objective basis, for moral action
00:31:36.520 on the atheistic worldview?
00:31:40.780 And then the follow-up question is: yeah, you seem to recognize what's right and wrong, and
00:31:49.000 you can recognize those things just as well as I can.
00:31:51.400 But where do you think you get that recognition from? If it's entirely a social
00:32:01.060 construct, if it's just something that we come up with as people, then why is it that
00:32:05.940 when you look across the world and you look at different civilizations, you find remarkably
00:32:10.340 similar moral systems that manifest themselves in starkly different ways.
00:32:18.400 But if you look all across the world, everyone agrees basically that murder is wrong.
00:32:22.580 Everyone basically agrees that rape is wrong.
00:32:24.780 People still do those things in large numbers, but they recognize that it's wrong to do it.
00:32:31.660 Which is why almost everywhere those actions are criminalized.
00:32:36.180 So how did that happen?
00:32:37.220 You know, that's interesting, isn't it?
00:32:39.840 And I think as an atheist, you have to have an explanation for that.
00:32:43.240 And the explanation, as I said, cannot be that it's all based on, you know, social
00:32:49.140 constructs, because then you would think you'd find dramatically different moral systems in
00:32:54.980 these different societies. But you don't find that; we all seem to agree.
00:33:00.480 Why is that?
00:33:01.420 Where did that come from?
00:33:02.360 And you can't say that it's all evolution either, because it seems to me that evolution
00:33:11.920 is all about survival of the fittest, right?
00:33:14.400 Evolution is dog eat dog, strongest survive.
00:33:18.180 I mean, that's the evolutionary way.
00:33:21.920 Whereas a lot of our moral ideas fly directly against evolution.
00:33:27.400 We think morally that, for instance, we're supposed to protect the weak and the
00:33:32.960 vulnerable. From a purely evolutionary standpoint, no, the weak
00:33:38.580 and vulnerable should die, because they're dragging the rest of us down and we want only
00:33:42.160 the strong to survive.
00:33:43.680 But our moral system has us go against that tide, against the grain there, as far
00:33:49.980 as that goes.
00:33:50.940 So again, evolution is not a good explanation.
00:33:55.360 So I don't think you can say that we evolved these moral ideas.
00:34:00.420 I don't think you can say that it's a social construct. So where did it come from?
00:34:07.240 The answer from those who believe in God is that this moral intuition
00:34:14.280 is endowed in us by God, and that God is the grounding of morality.
00:34:20.100 God is the foundation of all morality and it flows from him.
00:34:25.280 And so we, as his creatures and as rational creatures, get that recognition
00:34:31.240 from him.
00:34:31.740 We understand it because God has given it to us.
00:34:34.740 That's the theological answer to that problem.
00:34:39.560 You might not agree with it, but it is an answer anyway.
00:34:43.720 And it seems to me that atheists and agnostics don't have an answer.
00:34:48.320 So you can behave morally as an atheist.
00:34:53.120 I don't think you can come up with a coherent objective basis for that moral behavior without
00:35:03.080 religion.
00:35:04.060 And that's the distinction.
00:35:06.420 All right.
00:35:06.780 We'll leave it there.
00:35:07.420 Thanks for watching everybody.
00:35:08.280 Thanks for listening.
00:35:09.280 Godspeed.
00:35:09.620 Today on The Ben Shapiro Show, Facebook bars Milo Yiannopoulos, Louis Farrakhan, and Alex
00:35:27.360 Jones.
00:35:27.700 We'll talk about it.
00:35:28.460 That's today on The Ben Shapiro Show.