ManoWhisper
Relatable with Allie Beth Stuckey
June 12, 2019
Ep 124 | YouTube Crackdown
Episode Stats
Length
33 minutes
Words per Minute
188.4
Word Count
6,241
Sentence Count
380
Misogynist Sentences
2
Hate Speech Sentences
4
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo). Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny. Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
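For the curious, here is a minimal sketch of how per-sentence counts like the ones above could be produced with the two classification models named here, using the Hugging Face transformers pipeline API. The naive sentence splitting, the sample text, and the misogyny model's positive label name are assumptions for illustration; ManoWhisper's actual pipeline may differ.

```python
# Minimal sketch of a per-sentence classification pass, assuming the
# Hugging Face `transformers` pipeline API. The misogyny model's label
# name is an assumption -- check the model card before relying on it.
from transformers import pipeline

misogyny_clf = pipeline(
    "text-classification",
    model="MilaNLProc/bert-base-uncased-ear-misogyny",
)
hate_clf = pipeline(
    "text-classification",
    model="facebook/roberta-hate-speech-dynabench-r4-target",
)

# Naive sentence split on periods; a real pipeline would likely use a
# proper sentence segmenter over the Whisper transcript.
transcript = "Hey guys, welcome to Relatable. Happy Wednesday."
sentences = [s.strip() + "." for s in transcript.split(".") if s.strip()]

# Count sentences each classifier flags. The hate-speech model's labels
# are "hate"/"nothate"; "misogynist" is an assumed positive label.
misogyny_hits = sum(
    1 for s in sentences if misogyny_clf(s)[0]["label"] == "misogynist"
)
hate_hits = sum(1 for s in sentences if hate_clf(s)[0]["label"] == "hate")

print(f"Misogynist sentences: {misogyny_hits}")
print(f"Hate speech sentences: {hate_hits}")
```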
00:00:00.000
Hey guys, welcome to Relatable. Happy Wednesday. I hope everyone is having a wonderful week. For
00:00:06.620
those of you who were at YWLS this weekend, it was so wonderful to meet you. I love seeing you
00:00:13.780
guys in person and just hearing your stories and hearing how inspired you were by all of the
00:00:18.720
speakers. And those of you who do listen to my podcast, just hearing you say, you know,
00:00:24.080
what you get from this every week and what you appreciate about it. It just, I mean,
00:00:28.560
I've always loved my job. I've always loved doing this podcast. It's always meant a lot to me to hear
00:00:33.280
from you guys. But when I get to see you in person and hear these things in person, it just takes
00:00:38.680
everything to a whole other level of reminding me of why I do this. And everyone should have a why
00:00:44.820
behind what they do. Of course, the chief why is to, to glorify God and to obey him. But another huge
00:00:52.080
why behind what I do is you guys. And so just thank you for always being encouraging and for giving
00:00:57.520
feedback and for being such an engaging audience who is so smart, who always gives me really good
00:01:03.520
thoughts, really good ideas, really good critiques. And I just, I just love you guys. I think Relatable
00:01:08.340
listeners are the best listeners in the entire world. Some listeners, I'm sure, are, like, super boring,
00:01:14.020
don't know that much. They just listen to a person and they don't even really think about the things
00:01:18.580
that they're listening to. But that's not you guys. You guys are really thoughtful. And I don't know,
00:01:23.280
you're just, you're just my friends. And I just, I love the people that listen to this podcast. So
00:01:27.540
thank you for that. Okay. Today, what we're going to talk about is YouTube and this crackdown that's
00:01:33.280
been happening on more conservative viewpoints. Should we care about this? Should we be worried
00:01:39.360
about this? You probably know that censorship is something that we have been hearing about
00:01:44.540
nonstop, mostly from the conservative side. So we should ask ourselves, I mean, obviously you guys know
00:01:51.900
that I'm a conservative and you guys know that if this is the case, I probably would be a so-called
00:01:57.480
victim of this kind of censorship, but I want to be objective and stand back and say, okay,
00:02:02.380
is this being overblown? Is there really that much of a bias? And if there is a bias, does any of it
00:02:11.020
really matter? Well, I'm just going to go ahead and tell you my thoughts on it up front. I'm not going
00:02:17.120
to try to hide it from you. I would like to argue that yes, this matters very much. And I don't
00:02:21.520
actually think that it's overblown. Sometimes it is, but what we have seen devolve over the past
00:02:26.740
couple of weeks with YouTube and some conservative voices, I think should, uh, it should, it should
00:02:33.920
worry us and it should make us say, okay, this is not something that's being exaggerated. This is not
00:02:39.580
something that's overblown. This is something that we actually need to look out for really, no matter
00:02:44.360
what side of the aisle that you're on. Um, okay. So the thing that kind of spurred all of this,
00:02:49.860
that started yet another controversy between social media and conservative voices, it has to
00:02:56.600
do with Steven Crowder. Steven Crowder has an extremely, extremely popular podcast that is also
00:03:03.440
distributed by The Blaze, uh, or BlazeTV. Um, and he has a huge following on YouTube. I think he
00:03:11.020
has almost 4 million subscribers. His videos get hundreds of thousands of views. Every time he has
00:03:16.740
a podcast episode go out, uh, some of them get millions of views. He's just a really popular voice
00:03:23.220
in the conservative world. He's very unique because he's also a comedian. And so he's always kind of
00:03:28.580
pushing the limits on the things that you're like, Oh, can you, can you say that is, is that really
00:03:33.280
okay to say? And he just kind of goes there, which is, you know, different than what we do on this
00:03:38.420
podcast. He certainly says things, and I would say this to him, he certainly says things
00:03:42.860
that I would not say, that I would not approve of saying, that I probably don't think are that great
00:03:47.680
to say. He says them. And that is who his audience is. That's what his show is. That's how he's always
00:03:52.320
been. He has been in this conservative world for a long time. And there are hundreds of thousands,
00:03:57.700
millions of people who respect his voice and who value the platform that he has diligently
00:04:04.660
built, uh, built for a long time. However, someone who takes issue, someone who takes issue with his
00:04:10.620
platform is a guy by the name of Carlos Maza. I think that's how you pronounce his name. He and
00:04:19.080
Steven Crowder haven't always gotten along. So Carlos Maza works for Vox. Vox is a left-leaning, uh,
00:04:24.780
outlet, and, uh, Carlos makes videos for Vox on, you know, a variety of liberal topics. Well, Steven
00:04:33.340
Crowder has used his show or a segment of his show several times to refute the content of these
00:04:39.940
videos. Carlos Maza does not like this, or maybe it's Maza. I really don't know how to, I don't
00:04:46.080
know. I'll just go with Maza. Um, he doesn't like this, but that's, I, that's not totally fair to say
00:04:50.920
that he just doesn't like that Steven Crowder actually refutes the content
00:04:56.120
of his videos. He doesn't like how, uh, Steven Crowder actually refers to him. So he did this long
00:05:02.600
tweet thread, um, that ended up going, I guess you would say, semi-viral.
00:05:09.320
Now I'll just be totally honest with you. I wasn't able to read the tweet thread right away.
00:05:15.540
And so I've been kind of like out of the loop on this whole thing. And the reason why I wasn't able
00:05:20.840
to read the tweet thread that everyone on the conservative side, it seemed like was talking about
00:05:25.540
was because he apparently, he blocked me. I don't even know who this person is. Like,
00:05:29.460
I don't think that I've ever communicated with this person. I don't know. I don't remember ever
00:05:36.040
knowing his name, talking to him, interacting with him, but I was blocked. And so this tweet
00:05:40.660
thread that was going around, everyone was retweeting and commenting on. I was like,
00:05:44.540
well, I have, I have no idea what this is because this person, he goes by the,
00:05:48.260
the Twitter handle of @gaywonk. He, uh, blocked me a long time ago for reasons completely unknown to
00:05:55.400
me. So I just kind of ignored it for a while. So, well, let me say what the tweet thread was.
00:05:59.740
So the tweet thread was him saying, Hey, Steven Crowder makes these videos about me. And you know,
00:06:06.560
I have thick skin, but look, he keeps on commenting on my sexuality. He, he, he keeps on commenting on
00:06:12.100
the fact that I'm gay and, uh, the fact that I am Latino and I don't like it. It's harassment.
00:06:18.320
His followers have harassed me. His followers have targeted me and even tried to, uh, even tried to
00:06:25.900
dox me. And so he posted this like video compilation, this like cut up video compilation of Crowder
00:06:31.440
referring to Maza in a variety of ways that do highlight Maza's homosexuality. Um, mostly just
00:06:38.640
like referring to him as gay, but he also called him a gay Mexican. Oh, let me just say, if you have
00:06:44.800
kids in your car, I always try to say this, there are some things that maybe you might not want them
00:06:49.360
to hear. So just FYI, um, called him a gay sprite, called him a lispy queer. And again, these are not
00:06:56.700
phrases that we would use on this podcast or that I would use in my life, but stick with me as we talk
00:07:02.620
about the actual point of the story, which goes beyond what Crowder actually said. So he's calling
00:07:07.160
these names. Maza puts a compilation, uh, of the, of this name calling that Crowder did, uh, on
00:07:14.800
Twitter. The thread was really long, but in summary, he said, okay, this is harassment. Uh, and I have
00:07:22.060
been harassed by his followers. So why is YouTube allowing this person to have a platform? Doesn't
00:07:26.840
this go against, uh, their rules? Doesn't this violate their terms? Why are they allowing this
00:07:33.280
person, who is homophobic, he would say, to have such a growing channel on their platform? Uh,
00:07:39.420
now Crowder responded to this by saying, okay, hang on just a second. Whoa, whoa, whoa. Okay. So I,
00:07:45.560
this is him talking basically. I'm totally paraphrasing and summarizing basically saying, okay,
00:07:50.340
so I, I used some off-color jokes that offended you. Yes, I called you those names. But he's saying,
00:07:57.220
I never told my followers to harass you. I never encouraged doxing. Um, he said, I refuted the
00:08:02.420
content of your videos, and sure, I might've said some things that offended you, but I never harassed
00:08:08.440
you. I never encouraged anyone else to harass you either. Um, so after all of this kind of back and
00:08:14.680
forth, and of course you had conservatives all over Twitter, uh, defending Steven Crowder because
00:08:18.980
they know that Steven Crowder is a comedian. Comedians often push the limits. They often pick
00:08:24.740
on people. They rib people. They even say things that are offensive to groups of people. And Steven
00:08:29.900
Crowder has been politically incorrect for a really long time. So you had a lot of conservatives
00:08:35.900
defending him. Um, YouTube responded to Maza's thread on Twitter saying this, uh, our team spent the last
00:08:43.100
few days conducting an in-depth review of the videos flagged to us. So he had flagged the particular
00:08:48.060
videos, uh, that Steven Crowder, uh, what he would call, harassed him in. Um,
00:08:55.200
and they said, and while we found language that was clearly hurtful, the videos as posted don't
00:09:01.100
violate our policies. We've included more info below to explain this decision. As an open platform,
00:09:06.700
it's crucial for us to allow everyone from creators to journalists to late-night TV hosts
00:09:11.500
to express their opinions within the scope of our policies. Opinions can be deeply offensive,
00:09:16.420
but if they don't violate our policies, they'll remain on our site. Okay. So I was surprised by this,
00:09:21.820
honestly. And I think a lot of people were surprised by this on the left and the right.
00:09:26.780
Uh, to me, this is the correct stance to take. Uh, yes, he said some offensive things and they're
00:09:34.060
saying, okay, we understand that they were hurtful, but look, if, if we being YouTube, if we took down
00:09:39.620
every single opinion or every single comment that someone was offended by, we wouldn't have anyone on
00:09:44.420
our platform. There are people on the left and the right who say offensive things about groups.
00:09:48.960
And if they're not allowed to do that, are they even allowed to say anything of substance? Now,
00:09:53.900
of course you had people saying, well, why is homophobia an opinion? Why is that something
00:09:58.880
that should be protected? But you've got other people that say, okay, it was just gentle ribbing.
00:10:03.700
It was just making fun. You can't possibly say that that violates your terms. You can't possibly say
00:10:10.140
that that goes against your policies. But okay, I think I was pretty happy with this decision.
00:10:16.240
That's kind of a hands-off. We don't agree with what he said, but we're going to allow people
00:10:20.980
of different opinions to be on this platform because that's who we are. We are encouraging
00:10:24.880
an open dialogue as long as they're not targeting people, um, by calling for harassment or doxing
00:10:30.820
or violence or anything like that. I thought that it was a pretty good step for YouTube, but here's
00:10:35.880
the thing. The story does not end there. The story does not end there. So this completely sane,
00:10:41.980
I thought, and common-sense, stance, uh, by YouTube solicited an absolutely outrageous
00:10:50.160
outrage mob. I mean, people were upset by this. They could not believe that this is the stance
00:10:56.320
that you, that YouTube, uh, that YouTube made, especially in light of Pride Month and all of
00:11:03.000
this. People were saying this is a direct attack on the LGBTQ community that YouTube clearly doesn't
00:11:08.680
stand with this community. They said, you're not going to police offensive language. You're not
00:11:14.180
going to police what they call hate speech. You're not going to censor inappropriate jokes, they said.
00:11:20.240
So, as we would expect, and as we expected in the first
00:11:27.680
place YouTube to do, they bowed down to the outrage, to the leftist outrage, by the way. They did
00:11:32.140
not bow down to conservative outrage. They bowed down to leftist outrage. They changed their minds.
00:11:37.000
They said, okay, you're right. You guys, you are right. We are sorry. We are going to completely,
00:11:42.980
we're going to completely demonetize Crowder's channel. Now, if you don't know what that means,
00:11:47.440
that means that we're not going to allow ads to play on any of his, uh, videos, which means he
00:11:52.240
will no longer get paid the money that he used to get paid from YouTube. Now, like I said, Crowder
00:11:57.420
has almost 4 million subscribers on YouTube. Uh, his videos get hundreds of thousands and usually,
00:12:02.360
uh, you know, over a million views. So that, that means he probably gets paid pretty well
00:12:07.200
from YouTube. I don't know for sure, but I would guess that the monetization of his videos,
00:12:12.280
um, helps him out a lot. Uh, that's no longer going to happen because YouTube decided to kowtow
00:12:19.100
to people like Maza and all of the people that, uh, got really upset about their original decision.
00:12:24.600
But Maza actually still isn't happy about this. I saw a tweet that was screenshotted because,
00:12:29.180
you know, I can't see it because I'm blocked. Uh, he still isn't happy about this. He says that
00:12:33.040
demonetization doesn't actually work, that it doesn't go far enough. So he really wants,
00:12:38.160
he really wants Crowder and other people like him to be completely taken off the platform. Now,
00:12:43.320
it makes sense once you realize that Maza used to work for Media Matters, which is literally in
00:12:49.060
the business of deplatforming and demonizing conservative voices. This is just who he is.
00:12:54.920
This is a guy who has also called, um, for violence and harassment in person against
00:13:00.900
conservatives. Other tweets that I have seen that you can look up if you are not blocked,
00:13:04.980
he has called for milkshaking conservatives, which is like this thing that apparently people do is
00:13:10.460
like throwing milkshakes on like conservatives in the public sphere, uh, that you see out in everyday
00:13:16.900
life. Really solid stuff, really, really mature, solid stuff that probably makes, you know,
00:13:22.900
a lot of people in the middle say, you know what, I'm going to become a liberal now.
00:13:26.000
Now that I see that these, that these guys are throwing milkshakes on conservatives, you know,
00:13:31.460
they must have some pretty good ideas, some pretty good ideas that are probably, probably worth joining.
00:13:37.880
So that's, that's who this guy is. He also said that he wants conservatives to be terrified
00:13:43.220
to gather in public. So that's good. Good guy here. So he is out here saying all of that,
00:13:49.320
that literally in public, literally calling for harassment against conservatives in the public
00:13:54.680
sphere, throwing milkshakes on them. I, I promise you, if I am with my daughter and you throw a
00:13:59.360
milkshake on me, that's not going to end well for you. One, joke's on you: I love milkshakes too.
00:14:05.620
I promise you, I promise you, I promise you if you do anything to me when I am with my daughter,
00:14:11.960
I'm just saying, it's just not, it's not going to turn out well for you. It's not going to turn out
00:14:16.000
well for you. So that's who this guy is. He's a complete and total hypocrite. The bottom line is
00:14:20.400
that he just doesn't want conservative voices to have a platform. And this is a full-on assault.
00:14:25.320
I'm not saying that he shouldn't be offended by what Steven Crowder said. Look, if I were him,
00:14:29.600
if someone had a channel, a very popular channel, and they were talking badly about me on multiple
00:14:34.780
episodes, and they were calling me names that I didn't like, and their followers were coming to my
00:14:39.600
channel and harassing me and threatening to dox me, I would be really upset about that. I would.
00:14:44.600
So I do not blame this guy, even though I don't like him, even though I don't like his tactics,
00:14:49.120
even though I don't like his views, I don't blame this guy for being upset. I don't. I don't blame
00:14:54.080
this guy for not liking Crowder. I don't blame this guy for being, you know, maybe somewhat upset at
00:15:01.400
YouTube. Maybe I don't blame him for that. But if I were in his situation, would I be calling for the
00:15:08.020
systematic censorship of everyone who doesn't agree with me? Would I be calling for the complete
00:15:12.580
deplatforming and removal of someone who said, I don't know, something like, girls shouldn't have
00:15:18.220
podcasts? No, of course not. I would disagree with him. I wouldn't like it. I definitely would
00:15:24.280
report all of the users that tried to harass me or all of the users that tried to dox me. But if that
00:15:30.420
person weren't calling for direct harassment or calling for direct violence or calling
00:15:36.400
directly for doxing, well, then that's not really that person's fault, because
00:15:42.240
I'm in the public sphere. They can comment on my content if they want to. And that's just, you know,
00:15:47.440
that's part of the game. And again, I understand why this guy doesn't like Steven Crowder and the
00:15:52.180
things that he said. And I think that he's totally within his, he has every right to be offended by this,
00:15:58.060
totally justified. But that does not justify trying to shut someone down just because they said
00:16:04.420
something that offends you. So YouTube, I think I think I started to say something and then I went
00:16:12.500
away from it and now I'm coming back. So YouTube changed their mind, and they said, yes, OK. I said
00:16:18.420
this: YouTube changed their mind. They're going to demonetize him. Carlos is upset that they're not
00:16:24.340
completely deplatforming him. But they did come out with a statement, YouTube came out with
00:16:29.840
a statement that said, well, you know, we're actually going to crack down on this stuff. We
00:16:34.720
actually are going to crack down on this stuff. We're going to make sure extremist content doesn't
00:16:40.740
get seen as much. We're going to make sure that even the kinds of videos that could lead to so-called
00:16:46.560
extremist content don't show up as easily. So we're just going to make sure that that doesn't happen.
00:16:51.500
So, OK, here we go. Here we go. This is what happens when you try to give YouTube the power
00:16:56.840
and the authority to take down all content that offends you. When you say, YouTube, you don't
00:17:02.800
believe in the LGBT community or you don't support us or you don't like us, you hate us unless you
00:17:09.740
completely deplatform the people that we don't like. When you give YouTube that kind of power,
00:17:13.820
you are implicitly or explicitly, depending on how you look at it, giving them the power to censor
00:17:19.680
all ideas, all ideas that could possibly be offensive. And we saw that with this statement that they made
00:17:25.500
after all of this saying we're going to crack down on it. We're going to make sure that videos that
00:17:29.600
even could possibly lead to so-called extremist views, that we're going to make sure that they
00:17:34.840
don't come up as suggested videos or anything like that. So here we go. Here's the problem with this.
00:17:43.060
We cannot trust Google and YouTube to decide what is extremist, what is actually harmful and what is
00:17:50.380
not. I mean, am I an extremist? I've done multiple videos, probably, about biblical topics that people
00:17:56.180
are offended by. I've done podcast episodes about God's design for biblical marriage. Am I radical?
00:18:01.640
Am I offensive because of that? Am I hateful? Am I engaging in hate speech because I talk about the
00:18:05.920
Bible and I talk about things that non-Christians probably don't like? Am I radical? Am I one of those
00:18:11.400
people that they're going to have to demonetize, that they're going to have to deplatform to make sure
00:18:15.920
that they have a safe community? I mean, who's to say? Is Google to say whether or not a biblical
00:18:22.440
point of view is extremist? I guarantee you this does not stop with someone like Steven Crowder
00:18:27.560
making off-color jokes that can be offensive to a group of people. This doesn't stop with that.
00:18:32.380
This stops with, or this keeps going to, people like me, to people who talk about Christianity, to people
00:18:38.560
who talk about, you know, liking Donald Trump or being a conservative, that talk about being pro-life,
00:18:46.080
that even just engage in these conversations. Maybe they don't even hold the views themselves, but
00:18:50.560
maybe they're just open-minded. We've already seen that big tech doesn't really like open-mindedness
00:18:57.160
in the censorship that we've seen of people like, or not even censorship, but just kind of demonization
00:19:02.060
and dislike of people like Joe Rogan and Dave Rubin, who themselves aren't particularly conservative,
00:19:08.860
but engage with people who are. They just don't like that. Which leads to an interesting article that
00:19:15.720
I read in the New York Times, and it was called The Making of a YouTube Radical. The Making of a
00:19:23.980
YouTube Radical. So this writer for the New York Times thought that he just busted this, like, extremely,
00:19:30.580
extremely, extremely interesting case of this radicalized guy who was 26 years old. His name
00:19:36.340
is Caleb Cain. He recently, apparently, he says that he swore off the alt-right. So this article
00:19:43.780
looks at Caleb's journey. It's like this very intensive, interactive timeline of Caleb's journey
00:19:48.640
that YouTube took him on. Apparently he was on the left, or he didn't really know what he believed
00:19:54.480
politically, but then he started listening to conservative commentators, and then he got further and
00:19:59.440
further to the right to where he started listening to more alt-right commentators. And I agree. Some
00:20:04.420
of the commentators that he was listening to, they were alt-right. So he started going down these rabbit
00:20:11.720
holes, and he basically became this alt-right guy who bought into this. But then he started listening to
00:20:20.540
this leftist YouTube channel, and then he was saved, and he realized that, oh my gosh, all the stuff I was
00:20:26.900
listening to was bad. And so this article was basically saying that YouTube is set up in such
00:20:32.760
a way as to radicalize people. Because when you listen to, or you watch one video, it suggests other
00:20:38.660
videos that are similar. So you do go down these rabbit holes, and you can just get more and more
00:20:43.280
radicalized. Well, the reason why this was contentious is because the people that they included
00:20:49.460
in this kind of collage that they put up at the top of these alt-right views that the New York Times
00:20:56.820
seems, or says, radicalized people were people like Milton Friedman. Milton Friedman. So the
00:21:02.200
economist, okay? Ben Shapiro, Dave Rubin. I think there were some other more mainstream people. So
00:21:09.660
this is the problem. And this is something that we have seen multiple times. Obviously, the entire
00:21:15.040
article was biased, basically saying that conservatism, any kind of mainstream conservatism
00:21:20.480
or open-mindedness to conservatism, like what Dave Rubin exemplifies on his show, that's going
00:21:28.380
to radicalize people. And it's subtle, but it's a way of making people worried or making people scared
00:21:34.660
of listening to mainstream conservatives, because they don't want to be seen as radicals. If you are
00:21:39.140
someone who is in the middle, or you're just trying to decide what you believe, the more you see
00:21:43.940
someone like Ben Shapiro associated with the name alt-right or associated with far-right people,
00:21:49.920
the more you are going to classify him in your mind, unless you know better, as some kind of
00:21:55.260
extremist, some kind of bigot, some kind of person that you don't want to associate yourself with.
00:22:00.720
This has happened multiple times. I think it happened with the Washington Post recently. It's happened
00:22:05.360
multiple times with multiple outlets, this subtle association of very mainstream, thoughtful
00:22:12.200
conservatives with more far-right or alt-right figures that are not particularly thoughtful,
00:22:18.580
that are extremists, that do espouse views, that really don't have any similarities to conservatism
00:22:24.000
whatsoever, and just maybe like Donald Trump and aren't leftists themselves. But the media really
00:22:30.020
likes to clump all of these people together and say, if you buy into any of these ideologies or
00:22:36.160
philosophies or ideas, if you buy into any of these conversations, well, then you're going to be
00:22:41.560
radicalized too, and you don't want that because that's bad. That's immoral. And the New York Times,
00:22:47.000
of course, they came out and they said, oh, I'm sorry. Like, we didn't mean to include some of
00:22:50.720
these people in the collage. We'll take, we'll take, like, I think they said, like Ben Shapiro out.
00:22:55.080
This happened a couple of weeks ago, and the same thing happened to Ben Shapiro when they
00:22:58.640
included him. I think it was with Milo Yiannopoulos or someone and called him alt-right. They had to
00:23:04.900
change the description of him there. So they do that a couple of days later and they apologize.
00:23:10.680
But this has happened too many times for me to think that it's accidental. It's totally deliberate.
00:23:16.540
They want people to be scared to listen to Ben Shapiro. They want people to be scared to listen
00:23:22.180
to Dave Rubin, even to be scared to listen to someone like Joe Rogan, who is not a conservative,
00:23:27.840
by the way. But he does have people of a variety of viewpoints on his show. And to the left,
00:23:33.360
that's not something that they want. They don't think that you should entertain any idea or any
00:23:38.660
thought or any kind of reasoning that disagrees with them. They think the same thing about someone
00:23:43.620
like Jordan Peterson, who himself, I would not call a conservative. Now, he's against political
00:23:48.880
correctness. He is against forced speech, of course. Those didn't use to be conservative values
00:23:54.540
exclusively. But he's extremely thoughtful. There is nothing radical about what Jordan Peterson
00:24:01.940
teaches or talks about. And yet the media wants you to think that if you start entertaining any ideas
00:24:07.760
that are not far left, well, then you might become an extremist too. You might become a bigot too. You
00:24:13.480
might go down this dark and deep rabbit hole into the scary abyss of actually believing things that
00:24:19.400
MSNBC doesn't tell you. And that is why I think all of this is a problem. I do not. There are a lot of
00:24:26.800
voices, a lot of voices who consider themselves on the right, that I don't agree with, that I don't
00:24:32.780
like, that I actually think their ideas are really toxic to the public dialogue, that I wish that they
00:24:39.320
didn't try to espouse conservatism because what they're saying is harmful or it's not true or whatever
00:24:45.740
it is. And I think that the people that follow them and follow their ideas typically do end up in this
00:24:51.740
really weird and corrupt place. But what I don't want is for us to have these, uh, basically these online
00:25:00.120
police states where you have this leftist social justice group of elites saying what can be, uh, what can
00:25:07.840
be accepted as good speech and what can't based on completely biased views. That's what I don't want
00:25:14.380
because I am perceptive enough, and I think all of you are too, perceptive enough to realize, uh, that it's not
00:25:21.560
going to stop with the absolutely crazy views. I would rather it be up to individuals. I'd rather
00:25:27.680
it be up to listeners and to viewers to know what is true. And I understand, I understand that there
00:25:33.760
are a lot of people out there that are susceptible to fake news. And I do think that there has to be
00:25:38.660
some sort of standard for truth, some sort of standard. Like it does get really scary when you think
00:25:47.380
about all the technology that's out there that you can literally make someone look like they're saying
00:25:52.460
something in a video that they're not, or you can manipulate something, um, manipulate some sort of
00:25:58.900
footage to, uh, implicate someone for something that they didn't do. And all of that is very scary. I'm
00:26:05.820
not saying that these social media platforms don't have any responsibility whatsoever, especially when it
00:26:11.460
comes to directly hurting someone's reputation. But the more you get into that, the more you start
00:26:18.260
censoring that and policing that, uh, the more biased you become. There was that whole thing with
00:26:24.160
Nancy Pelosi and this compilation, uh, going out of her, this guy who just lives and works in the Bronx.
00:26:32.000
I think that's where he lives. Uh, he made this like mashed up video of Nancy Pelosi at a press conference
00:26:37.540
and slowed it down a little bit, apparently. So it sounded like she was drunk. It was circulating
00:26:42.040
on social media. I'm sure a lot of people believe that it was true. The president and his
00:26:47.060
administration, I think that's the same video they ended up sharing. It could have been a different
00:26:51.280
video, but there were people in, in Trump's administration who ended up sharing it. And Nancy
00:26:56.040
Pelosi was very upset. Hillary Clinton was very upset. You had a lot of people on the left,
00:27:01.100
very upset by this. And what did the Daily Beast do? The Daily Beast went and found this guy,
00:27:06.360
found exactly where he works, found what his name is, uh, found his criminal history and they
00:27:11.820
exposed it. Why? Because they don't believe that he should be free to do something like that. And
00:27:17.200
of course they called on Facebook to help them find this guy's identity. Facebook did. And they
00:27:22.000
said, well, Facebook, why didn't you take this video down? Mark Zuckerberg apparently called Nancy
00:27:26.860
Pelosi to apologize, all this crazy stuff. And so when you start getting into things like that,
00:27:33.020
when you can't even tell jokes anymore. If you remember that AOC video that I did, the
00:27:38.440
mashed up interview that I did back in July. So almost a year ago, you had the Washington Post,
00:27:44.600
you had Buzzfeed, you had all of these outlets reaching out to me saying, why did you purposely
00:27:49.100
deceive people? And why hasn't Facebook taken this down? Why hasn't Twitter taken this down? They fully
00:27:54.480
believe that, uh, these social media platforms should step in when there is a video that they don't
00:28:00.340
like, if it employs humor that they don't think is funny. I do think it's extremely dangerous when
00:28:05.320
social media platforms come in and they try to censor that. And so I don't know exactly where
00:28:11.540
the line is on these social media companies coming in and saying, okay, we don't allow blatantly false
00:28:18.180
material to be circulated on our platforms. Maybe that's not their job at all. Maybe it's not their job
00:28:23.160
at all. Maybe it's totally on us. Maybe it's completely on the individual, on the viewer to
00:28:29.260
know what's true and what's not. Maybe it's not on them at all. Because I do think that when they come
00:28:34.720
in and say, sorry, you can't do that. You can't do that interview with AOC because it makes her
00:28:39.940
look bad or whatever it is. I think that's extremely dangerous because you're not going to see them
00:28:45.980
doing the same thing to Stephen Colbert when he does it with the president. You're not going to see
00:28:49.800
them take down the videos of Jay Leno doing that. And we've already seen how their new round of
00:28:55.140
censorship when they said, oh, we're, we're going to make sure that we're cracking down on these
00:28:59.740
extremist views. We've already seen how that has backfired because apparently even, like,
00:29:06.220
Holocaust educational videos are being taken off YouTube. Holocaust educational videos that like
00:29:12.440
teachers are using and professors are using in their classrooms to make sure that their kids know
00:29:17.160
what the Holocaust is. They're literally erasing history, which there are very few things that are
00:29:22.200
more Orwellian than that if you have read 1984, but that's exactly what's going on. And so again,
00:29:28.580
when we give the power to the social media companies to tell us what is true, what is not,
00:29:36.220
what is hate speech, what is not, it just becomes extremely dangerous. The information that we have
00:29:43.720
access to becomes extremely narrow, uh, especially ideologically. Again, I think that if you're
00:29:50.620
inciting violence or you're looking to dox someone or you are inciting some kind of harassment or you
00:29:57.400
are spreading possibly an outright lie, maybe there is a place, um, there is certainly a place
00:30:03.880
in the instance of violence, but maybe there's a place in those other instances for the social media
00:30:09.020
companies to come in and say something. But look, they're a platform. At the very least, if they do
00:30:14.920
enforce rules like that, at the very least, they need to be ideologically neutral and they just aren't.
00:30:20.420
We know for a fact that Google isn't neutral. We saw the leaked video of them after the election,
00:30:25.100
crying their eyes out when Hillary Clinton lost. Like we know that they lean to the left. There is a reason
00:30:29.760
for conservatives to be worried. And in a free country where this is, uh, how we use our voices and this
00:30:35.800
is how we have public dialogue, I do think it's important for them to say, hey, we're not politically
00:30:40.960
biased. Here are our rules. We're going to enforce them evenly across the board. That's all I ask.
00:30:45.740
They are private companies, Facebook, YouTube, Twitter. They can have the rules that they want
00:30:51.760
to use. And I don't want them to be regulated. I don't want them to be regulated by the government.
00:30:55.400
They can totally enforce the rules that they want to enforce, but as platforms, rather than
00:31:01.600
publishers, platforms who claim to be neutral, who claim to be making fair and even decisions,
00:31:06.200
they should absolutely, um, take their political bias out of it. 100%. I don't anticipate that's
00:31:13.920
going to happen. So this is just a plug and I'm, I never plug this thing, but I never plug really
00:31:21.300
at all, unless I'm talking about our sponsors, like ExpressVPN, but I am going to say, I really do want
00:31:27.780
you guys to subscribe to BlazeTV. It's really important because you don't know when I'm going
00:31:33.020
to be kicked off YouTube. Like you don't know when I'm going to be kicked off iTunes. We've already
00:31:36.320
had a problem with Spotify. Um, so subscribe to BlazeTV if you can. I know it's an extra expense
00:31:42.400
every month. You can use promo code ALLIE. You can get $20 off, but that just ensures that you're
00:31:48.180
going to be able to listen to the content that you want to listen to because they're not going to be
00:31:52.180
able to censor us on BlazeTV. So you can go to blazetv.com/allie, and you can subscribe
00:31:57.680
there. And then you know that you're always going to get the content that you want to get,
00:32:01.740
no matter what. It's protected that way. Um, probably, as far as we know, uh, at least longer
00:32:07.500
than we will be on YouTube or these other platforms that we know have a bias. So yes,
00:32:14.140
all of this stuff matters. Yes. Censorship matters. Doesn't matter if you agree with the voice and what
00:32:18.740
they're saying. It all matters. And it is, it's not just a logical fallacy to say that it's a
00:32:23.940
slippery slope. We've already seen that it's a slippery slope and there is a, an ideological
00:32:27.760
motivation behind it to make sure that conservatives don't have a voice and that the only public dialogue
00:32:33.540
that exists is decidedly leftist. And I think that's something that we should care about.
00:32:37.880
Okay. We will be back here on Friday. I haven't decided what we're going to talk about. There's been
00:32:42.080
a lot that's gone on with the Southern Baptist Convention over this past week. There's a lot,
00:32:46.740
there's a lot of contention. And so I might talk about that on Friday. I haven't decided,
00:32:51.160
but if you guys do have suggestions, always feel free. You can message me on Instagram.
00:32:55.240
You can email me, allie@theconservativemillennialblog.com. Of course, if you love
00:32:59.840
this podcast, I would love your five-star review on iTunes. It helps me out a lot. Plus I read them
00:33:04.380
and I love hearing your words. Okay. I will see you guys here on Friday.