ManoWhisper
Relatable with Allie Beth Stuckey
September 08, 2022
Ep 675 | Want to Topple the Elites? Mock Them | Guest: Seth Dillon
Episode Stats
Length: 48 minutes
Words per Minute: 193.3
Word Count: 9,462
Sentence Count: 611
Misogynist Sentences: 17
Hate Speech Sentences: 16
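The pace figures above are simple ratios of the word, sentence, and duration counts. A minimal sketch of the arithmetic, assuming (the page does not document this) that words per minute is word count divided by episode duration:

```python
# Hypothetical reconstruction of the episode stats above; the exact
# formula ManoWhisper uses is an assumption.
word_count = 9_462
listed_wpm = 193.3  # the page's unrounded figure is 193.27954

# The listed rate implies an unrounded duration of about 48.9 minutes,
# which the page displays rounded as "48 minutes".
implied_minutes = word_count / listed_wpm
print(f"implied duration: {implied_minutes:.2f} min")           # ~48.95
print(f"words per minute: {word_count / implied_minutes:.1f}")  # 193.3
```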
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
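For readers curious how a summary like this could be produced, here is a minimal sketch that loads the checkpoint named above through the Hugging Face transformers pipeline. The placeholder text, chunking note, and generation settings are assumptions, not ManoWhisper's actual configuration:

```python
from transformers import pipeline

# Load the summarization checkpoint cited above.
summarizer = pipeline(
    "summarization",
    model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ",
)

# Placeholder text; in practice this would be the full episode transcript,
# chunked to fit the model's roughly 1024-token input window.
transcript = "Seth Dillon, CEO of The Babylon Bee, is here to discuss ..."

result = summarizer(transcript, truncation=True, max_length=150, min_length=10)
print(result[0]["summary_text"])
```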
Transcript
Transcript is generated with Whisper (turbo).
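A minimal sketch of how a timestamped transcript like the one below could be reproduced with the openai-whisper package and its turbo checkpoint; the audio path and output formatting are placeholders, not ManoWhisper's actual pipeline:

```python
import whisper

def hms(seconds: float) -> str:
    # Format seconds as HH:MM:SS.mmm, matching the timestamps below.
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{int(h):02d}:{int(m):02d}:{s:06.3f}"

# Load the "turbo" checkpoint named above.
model = whisper.load_model("turbo")

# "episode.mp3" is a placeholder path for the episode audio.
result = model.transcribe("episode.mp3")

# Each recognized segment carries start/end offsets and its text.
for seg in result["segments"]:
    print(hms(seg["start"]))
    print(seg["text"].strip())
```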
Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
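The misogynist and hate-speech sentence counts in the stats above come from running each transcript sentence through these two classifiers. A minimal sketch with the transformers pipeline; the positive-label strings are assumptions to check against each model card:

```python
from transformers import pipeline

# Load the two classifiers named above.
misogyny_clf = pipeline(
    "text-classification",
    model="MilaNLProc/bert-base-uncased-ear-misogyny",
)
hate_clf = pipeline(
    "text-classification",
    model="facebook/roberta-hate-speech-dynabench-r4-target",
)

# Placeholder input; in practice, the episode's 611 sentences.
sentences = [
    "You will love this conversation.",
    "We'll get into all of this and more.",
]

# The positive label names below are assumptions; consult the model cards
# for the exact strings each checkpoint emits.
misogynist = sum(r["label"] == "misogynist" for r in misogyny_clf(sentences))
hateful = sum(r["label"] == "hate" for r in hate_clf(sentences))

# The page reports 17 misogynist and 16 hate-speech sentences for this episode.
print(misogynist, hateful)
```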
00:00:00.000
Seth Dillon, CEO of The Babylon Bee, is here to discuss Libs of TikTok and other journalists
00:00:06.440
being punished for calling out children's hospitals for mutilating kids.
00:00:11.080
And we're also talking about why mocking really matters to a healthy society, as well as his
00:00:17.220
appearance on Joe Rogan's podcast.
00:00:19.900
We'll get into all of this and more.
00:00:21.500
You will love this conversation.
00:00:23.120
As always, this episode is brought to you by our friends at Good Ranchers.
00:00:26.780
That's American meat delivered right to your front door.
00:00:28.720
Go to GoodRanchers.com slash Allie.
00:00:30.740
That's GoodRanchers.com slash Allie.
00:00:41.980
All right, before we get started in that conversation, I wanted to play you a little clip from Seth's
00:00:48.640
interview with Joe Rogan when they were talking about abortion and Seth's pro-life position.
00:00:55.940
I thought he did a really good job, and I wanted to make sure that you saw and heard
00:01:00.180
it for yourself before I discuss it with Seth.
00:01:02.380
So here that is.
00:01:03.620
You don't have the right to tell my 14-year-old daughter she has to carry her rapist's baby.
00:01:07.980
You understand that?
00:01:08.620
To look that woman in the eye who was the borson, do you understand that?
00:01:11.960
That's a 14-year-old child.
00:01:13.560
If a 14-year-old child gets raped, you say that they have to carry that baby?
00:01:17.440
I don't think two wrongs make a right.
00:01:18.920
I don't think murder is an answer to – I don't think murder fixes a rape.
00:01:23.280
When we start talking about harmful misinformation and the types of things that are considered
00:01:26.560
– like that I say or that we tweet or the jokes that we make that are considered harmful
00:01:29.720
misinformation, I'm like, well, what about calling that baby a clump of cells?
00:01:33.740
I think that's harmful misinformation because then you're encouraging people to kill it like
00:01:37.420
it's nothing when it's actually a human life.
00:01:39.640
It's a developing human life.
00:01:41.000
I think abortion is health care the way that rape is lovemaking, if we want to use rape as
00:01:45.760
an example.
00:01:46.340
I think it's – I think they're opposites and it's like a – these are euphemisms
00:01:52.020
that we use.
00:01:52.700
You know, we use the word health care.
00:01:54.080
We're talking about a procedure that ends an innocent human life and we're calling it
00:01:57.720
health care.
00:01:58.880
That's like calling rape lovemaking.
00:02:01.420
And this is why it's such a human issue because I see what you're saying.
00:02:06.120
So our friend Seth held his own and he is here to discuss that.
00:02:10.900
So without further ado, here is Seth Dillon.
00:02:12.640
Seth, thanks so much for joining us again.
00:02:18.780
All right.
00:02:19.680
First, I want to hear about your experience on Joe Rogan.
00:02:22.620
Tell me what that was like.
00:02:25.980
Oh, man.
00:02:27.020
I don't know where to begin.
00:02:29.160
Were you – I mean, you didn't seem at all nervous or anything like that, but was it a
00:02:33.740
little intimidating?
00:02:34.740
I would feel intimidated.
00:02:37.380
Yeah, it was.
00:02:38.300
I mean, I had all my friends leading up to the whole thing talking about how this is
00:02:42.500
Joe Rogan.
00:02:43.100
There's nothing bigger than Joe Rogan.
00:02:45.180
And so they were kind of drilling it into my head that it was something to be nervous
00:02:48.500
about.
00:02:48.920
So that didn't help.
00:02:50.020
Yeah.
00:02:51.180
But we did spend some time at his studio, probably about 40 minutes before we got going
00:02:55.020
with the interview.
00:02:55.720
So we were talking and hanging out and he was giving us a tour and everything, which can – you
00:02:59.580
know, that helps to relax you a little bit, just getting to know the person you're about
00:03:02.200
to sit down with.
00:03:03.440
Yeah.
00:03:04.580
But, I mean, it's a very casual conversation, just like this, you know, just sitting here
00:03:10.080
talking to somebody.
00:03:11.300
Yeah.
00:03:11.520
And I feel like I know him because I've listened to him so much.
00:03:14.740
Right.
00:03:15.280
So I was definitely nervous when I got in the chair.
00:03:18.460
But, I mean, you're sipping on whiskey, so you calm down pretty quick.
00:03:21.400
Yeah.
00:03:21.700
I guess that's part of why he does that.
00:03:24.300
It is a very casual setting.
00:03:25.720
But, I mean, he will, like, he will really drill someone that he disagrees with, but he
00:03:32.080
also knows when to back off.
00:03:33.500
And that's what I noticed in his conversation with you about abortion.
00:03:36.720
I'm sure a lot of people have seen that clip now, that he was really pushing you for a
00:03:42.460
minute, especially when he made it personal, when he was talking about, so you're telling
00:03:45.780
me, my 14-year-old daughter, but you really held strong.
00:03:49.440
Like, how were you feeling during that exchange?
00:03:51.500
I had a sense when that whole conversation started, and I don't know if you saw the
00:03:56.380
whole thing, like, there were clips that were taken of it, but it went on for, like,
00:03:59.100
30 minutes.
00:03:59.900
Yeah.
00:04:00.440
And at one point, I finally said, you know, if you want to move on to something else, we
00:04:03.960
can.
00:04:04.760
You know, I was trying to give him an out so that we could change the subject and move
00:04:09.100
to something else.
00:04:09.620
Not because I don't want to talk about that subject anymore, but, you know, it was definitely
00:04:15.080
a, it was probably, it was the most notable exchange in the interview.
00:04:20.200
It's the one that got the most attention, and I think for good reason.
00:04:23.880
There are a lot of people on the right who will defend life, but not in all circumstances.
00:04:29.960
You know, they will be willing to make exceptions, and they are willing to try to compromise in
00:04:35.780
some cases, especially these extreme cases where you're talking about, like, a teen
00:04:40.240
rape victim, something tragic like that.
00:04:42.260
But, you know, I don't take that position.
00:04:47.400
You know, I think that if life is valuable in some circumstances, it's valuable in all.
00:04:51.840
It's not just valuable if the mother wants the baby.
00:04:53.880
It's valuable even if she doesn't want the baby.
00:04:56.300
It's valuable even if the circumstances that brought the life into existence were tragic
00:05:01.760
and evil.
00:05:04.020
Yeah.
00:05:04.180
Um, so, you know, it was, it was definitely tough when he made it personal, because Joe
00:05:08.160
does have a 14-year-old daughter.
00:05:09.840
So he was talking about how, you know, you have no right to tell my daughter.
00:05:13.720
And, uh, and it's not, you know, I, I did, um, I was very conscious of the fact that there's
00:05:18.500
going to be a lot of people watching that and listening to it.
00:05:20.800
And I wanted to just stay calm and reasonable and not get sucked into kind of the emotional
00:05:26.120
appeals.
00:05:26.560
And, uh, because that's really the point of bringing up that case, that's an emotional
00:05:30.740
case.
00:05:31.120
It's, it's going for that outlier, the really crazy circumstance and trying to get a wedge
00:05:36.560
in so that you can, you know, put the whole pro-choice argument through on the back of
00:05:41.160
it.
00:05:41.780
And, uh, and I tried not to get sucked into that and just stick to my guns and say, look,
00:05:46.180
you know, if it's wrong to intentionally kill an innocent human life, it's wrong even in
00:05:50.160
this case.
00:05:50.700
And that's not, it has nothing to do with my right to say that to your daughter.
00:05:54.500
It has everything to do with the right of every human being to live.
00:05:58.920
Right.
00:05:59.460
No, you did a great job.
00:06:00.700
And you could tell that he realized that he kind of needed to back off a little bit, that
00:06:06.080
you weren't going to back down and that you weren't going to kind of acquiesce or change
00:06:10.720
your definitions or narrow your definition of life.
00:06:13.700
And that's also what makes him really good is that he kind of like his podcast is kind of
00:06:19.820
like music.
00:06:20.500
Like there's those really tense moments where you're like, oh, I'm not sure if those
00:06:24.400
notes should be together.
00:06:25.680
And then it kind of eases into a melody.
00:06:28.600
And so he did a great job.
00:06:30.180
You did a great job.
00:06:31.480
Good job.
00:06:32.040
I can imagine that was like a little intimidating, but you didn't seem like you were off your
00:06:36.000
game at all.
00:06:36.960
It was.
00:06:37.420
I tell everybody, I'm just glad that I didn't like, uh, say something really stupid and
00:06:42.440
embarrassing, uh, or throw up or pass out.
00:06:45.340
Right.
00:06:45.520
I'm really glad that you didn't throw up or else that would have been like the viral
00:06:49.840
moment and no one would have seen you defend life.
00:06:52.620
It would have just been like Seth Dillon throws up all over Joe Rogan's podcast.
00:06:56.360
So good job.
00:06:57.000
Would have been very entertaining though.
00:06:58.440
It would have been very entertaining.
00:07:00.180
Um, what's the reaction? I know you've gotten a ton of support from the right, but
00:07:03.420
I've also seen some crazy messages and comments that you've gotten from other people who I
00:07:08.220
guess didn't like your argument.
00:07:09.400
It's, you know, I, I'm sure you get this all the time too.
00:07:13.720
It's, it's, you got the two extremes, you know, the, the unbelievable support and praise
00:07:19.880
and, uh, reinforcement and just the thank you, thank you, thank you for standing for
00:07:24.520
life.
00:07:24.700
I mean, some of the most beautiful messages that I've gotten were from, uh, people who,
00:07:30.020
um, their mother was raped and that's how they were conceived.
00:07:33.760
And they're like, look, thank you for standing for life.
00:07:36.360
You know, I wouldn't be here if my mother had aborted me.
00:07:38.560
Um, I've gotten a bunch of messages like that.
00:07:41.340
And then there's also the, you know, obviously the vitriol, the hatred, the, uh, the cursing
00:07:46.620
me out, the telling me they wish I was dead.
00:07:49.000
Um, one, one guy even actually threatening my family and saying that he would come after
00:07:53.260
my kids, you know, that had to be reported to police.
00:07:55.220
So, um, it's, it's a, it's one of those subjects, you know, this is, this is one of
00:07:59.800
the things that really divides people.
00:08:01.200
And it's, it's crazy because, and when I, when I posted that, I actually posted an image
00:08:05.400
of that threat that I got.
00:08:06.700
And I said, you know, all I said was we shouldn't kill babies.
00:08:10.760
Yeah.
00:08:11.160
And this is how they respond to that.
00:08:12.860
Yeah.
00:08:13.240
That is so telling, isn't it?
00:08:14.940
That's like my, my, my, my argument is not that women shouldn't have rights or, you know,
00:08:20.740
that women are less than men, or, uh, I'm not saying something egregious and outrageous.
00:08:24.820
I'm just saying we shouldn't kill babies.
00:08:26.360
That's it.
00:08:27.080
Keep babies alive.
00:08:28.040
So yeah, why is, why is that getting people so fired up?
00:08:31.700
Well, it's just like any other idol, I guess, when you go after someone's idol, no matter
00:08:36.800
how obvious your statement might be, that's going to make someone really angry when you
00:08:40.940
try to change or take away the thing that people worship, which I think in abortion is
00:08:45.920
not really necessarily worshiping dead babies, but worshiping the God of self.
00:08:50.460
And the God of self does demand, you know, killing and sacrificing good things on its altar.
00:08:56.960
So I guess that's what it is.
00:08:58.580
Like testing people's idols just makes them angry, which I guess is the same reason why
00:09:04.100
people are met with such vitriol.
00:09:06.600
When you say something like, Hey, I actually don't think that we should mutilate the genitalia
00:09:11.300
of children, something that seems like it would be really obvious, but has gotten people
00:09:16.040
kicked off Twitter.
00:09:17.340
And that's what I want to talk to you about next, the Libs of TikTok and everything.
00:09:21.980
Obviously, you know about all this and have been talking about this.
00:09:26.360
And that's why I want you to explain exactly what's happening.
00:09:28.880
She has been kicked off Twitter this most recent time for not just saying that it's wrong to
00:09:34.940
mutilate the genitals of children, but also showing that this is happening at hospitals,
00:09:39.780
right?
00:09:40.040
And now she's been suspended from Twitter for that.
00:09:43.400
Yeah, it's weird to figure out.
00:09:44.580
I mean, usually what happens, and you've had this experience yourself, I know, where
00:09:49.680
you'll tweet something and they'll want you to delete it, right?
00:09:52.340
And if you delete it, they'll let you back onto your account.
00:09:55.300
And in this case, they didn't flag any specific tweets and say that this tweet was a violation
00:10:01.600
and you need to delete it.
00:10:02.580
Instead of doing that, there just seemed to be this mass reporting that was happening of
00:10:07.900
the account in the wake of this report that we did about this children's hospital that
00:10:12.080
was admitting to, by the way, on a recording, performing hysterectomies on girls younger than
00:10:19.220
16.
00:10:19.800
Yeah, Children's National Hospital in D.C.
00:10:22.800
Yeah.
00:10:23.340
Yes.
00:10:23.820
Two different people at this hospital said that they do that.
00:10:26.600
The website said that they do that.
00:10:28.720
And so we reported that.
00:10:30.200
And of course, they immediately call it misinformation.
00:10:32.860
Well, how's it misinformation?
00:10:34.640
Is it false?
00:10:35.520
If it's false, it's the hospital that provided us misinformation, not us providing the public
00:10:41.140
with misinformation.
00:10:44.020
But anyway, yeah, the whole thing with Twitter locking up the account, I think
00:10:50.300
there was just a mass reporting of the account and somebody at Twitter looked at it and decided
00:10:55.280
that we needed to be put on notice.
00:10:56.860
And this is one of the ways they escalate.
00:10:58.500
You know, they have a process they go through.
00:11:00.760
They'll give you this seven day suspension before they give you the permanent suspension.
00:11:04.960
So sometimes that's the case.
00:11:06.800
Sometimes they'll just perma-ban you, but we expect the next will be a permanent ban.
00:11:22.040
And you tweeted because apparently so kind of in this whole realm of children's hospitals,
00:11:28.740
as you know, but maybe not everyone listening knows that there have been several journalists,
00:11:34.280
including Libs of TikTok, that have just posted on Twitter the publicly available
00:11:38.640
information of several children's hospitals, Boston Children's Hospital, Seattle Children's
00:11:43.040
Hospital, Kaiser Permanente in Oakland.
00:11:46.240
I mean, a lot of children's hospitals admitting that, hey, yeah, we're performing double mastectomies.
00:11:51.240
We are performing hysterectomies.
00:11:54.300
We are castrating young boys and girls who are confused about their gender.
00:11:59.560
People are being reported, as you said, for misinformation, even though, again, this is
00:12:03.820
verifiable information.
00:12:06.060
And the media's reaction has been not to, wow, I can't believe these barbaric atrocities
00:12:11.500
are happening to children.
00:12:12.480
But wow, I can't believe people are noticing and I can't believe people are talking about
00:12:16.660
it.
00:12:16.920
And so we should probably attack those people.
00:12:19.000
And then there was apparently a bomb threat to Boston Children's Hospital.
00:12:22.820
And everyone was blaming Libs of TikTok.
00:12:25.240
Matt Walsh, Billboard Chris, all of these other people.
00:12:28.060
So you had a tweet thread about that.
00:12:29.880
And so what's going on?
00:12:33.680
How are you involved in this?
00:12:35.180
And what exactly went down?
00:12:38.180
Well, this is their tactic for suppressing and silencing anybody that criticizes what
00:12:44.140
they don't want to be criticized.
00:12:47.160
And by the way, it's not even necessarily criticizing.
00:12:49.880
You can just simply take one of the flyers for their family-friendly drag shows and post
00:12:53.940
it.
00:12:54.160
And if it gets enough attention, they'll try to take you out, even if you didn't say anything
00:12:59.320
critical about it.
00:13:00.400
You didn't try to organize a protest.
00:13:02.060
You didn't condemn the practice.
00:13:04.280
You could just simply share this stuff and they'll treat it.
00:13:06.800
It's the weirdest thing, the way that they respond to it.
00:13:10.120
But yeah, I mean, with this Boston hospital, you have this bomb threat that was called in.
00:13:16.180
And, you know, I'm suspicious of it for the reason that
00:13:21.420
it falls right in line with, you know, their narrative so neatly.
00:13:24.860
They tend to do this.
00:13:26.060
You know, the left loves hoaxes that support their narrative.
00:13:29.160
They can't find the hate.
00:13:31.180
They have more demand for hate than is in supply.
00:13:33.900
And so they have to manufacture it somehow.
00:13:36.700
And I suspect that that's what's happened here, because Libs of TikTok and the
00:13:42.120
supporters of Libs of TikTok who are outraged by these practices are outraged for one reason.
00:13:47.920
And it's because of concern for children.
00:13:50.320
It's because of concern for children.
00:13:52.040
We're not going to incite people to threaten a hospital and our followers aren't going to
00:13:57.320
show up in or call in bomb threats at a hospital in the name of protecting children.
00:14:02.820
It's a children's hospital.
00:14:03.840
So that doesn't make any sense whatsoever.
00:14:06.460
Our concern is actually for children.
00:14:08.620
So I think that this was probably a deranged leftist who's trying to set the stage for saying
00:14:14.460
that we incited violence here because that's their argument.
00:14:17.080
Their argument is that any criticism of any of these things amounts to incitement to violence.
00:14:22.080
And I'm really curious why that standard doesn't apply to them, because they engage in all kinds
00:14:26.940
of targeted harassment against people that they don't like.
00:14:29.900
You know, Media Matters wrote a hit piece about you recently, right?
00:14:32.320
Um, does that not amount to incitement to violence against you?
00:14:36.020
They criticized you harshly.
00:14:37.640
You know, they, they, they said things that aren't true, but apparently that's just, you
00:14:44.800
know, transparent reporting.
00:14:46.820
Right.
00:14:47.320
Well, we haven't lied.
00:14:48.440
We haven't lied about family friendly drag shows.
00:14:50.520
And I, when I say family friendly, please note the scare quotes.
00:14:53.080
Quote unquote, yeah.
00:14:54.340
Quote unquote family.
00:14:55.360
There's nothing family friendly about that garbage.
00:14:57.360
Yeah.
00:14:57.760
Um, but yeah, they've, they, they lie.
00:15:00.680
We don't lie.
00:15:01.560
We just report the facts and they say that it's incitement to violence.
00:15:04.540
They call it stochastic terrorism.
00:15:06.260
Yeah.
00:15:06.460
Um, and so this Boston thing is just the latest, uh, where they're trying to get us wrapped
00:15:11.000
up in this and say that, you know, it's not just Libs of TikTok, but you mentioned
00:15:13.720
Matt Walsh and some others who had been really critical of this hospital and others.
00:15:18.520
Chris Rufo is in that group too.
00:15:20.640
And, uh, and they'll say that, you know, we're trying to incite violence again.
00:15:23.600
Look, we have a problem with the behavior.
00:15:25.720
We don't. The argument that we hate people, we hate trans people, we hate gay people,
00:15:30.600
and we want them to die.
00:15:32.380
It's so ridiculous.
00:15:33.680
We care about children and we don't want children to be castrated, mutilated, sterilized, drugged.
00:15:39.280
Uh, that's what we care about.
00:15:41.180
And we want to draw attention to where this is happening.
00:15:43.840
And everybody, every reasonable person should be outraged if it is in fact happening.
00:15:47.560
If it's wrong, I'll be happy that it's wrong.
00:15:49.800
If we, if we were wrong on that report, if these hospital staffers who admitted it were incorrect
00:15:53.000
and lied to us, that would be a great thing.
00:15:55.320
That would be a really good thing.
00:15:56.720
Let's hope for that.
00:15:57.620
Let's be outraged if it's actually true.
00:15:59.500
Right.
00:15:59.900
Exactly.
00:16:00.720
Did you see that story?
00:16:01.660
Speaking of like the double standards for violence and family friendly drag shows, did
00:16:06.620
you see that story out of Roanoke, Texas? They were having a family drag
00:16:11.040
show at some bar.
00:16:12.560
I mean, what a dystopian phrase, almost as dystopian as gender affirming hysterectomies, but they
00:16:18.980
were having this drag show and kids were there.
00:16:21.480
I mean, basically male strippers, you know, scantily clad with fake boobs on, dancing
00:16:28.200
for tips in front of kids, and self-proclaimed Antifa was standing in front and, you know,
00:16:36.420
black bloc with their guns, with their long guns.
00:16:40.760
They've got people up on the, up on the roofs looking for protesters, protecting this drag
00:16:48.180
show.
00:16:48.920
And they have signs that say like, keep Roanoke gay, whatever.
00:16:52.600
So this apparently is fine.
00:16:55.500
This apparently is courage.
00:16:57.400
And yet we hear the president of the United States, people on MSNBC consistently say, oh,
00:17:02.560
the right is instigating political violence.
00:17:05.580
The right is threatening violence if they don't get their way.
00:17:08.600
It's not even that that's just a little bit wrong.
00:17:10.900
It's that the exact opposite is true.
00:17:13.240
So what are we supposed, what are, what are we supposed to do?
00:17:16.620
Like with this double standard of not just reporting, but of justice?
00:17:20.720
Like, how do we push back against a system like that?
00:17:25.640
Well, I mean, pointing it out repeatedly and on the biggest platforms possible is, is crucial
00:17:34.500
and exposing it for what it is, is crucial.
00:17:37.220
I think reasonable people can see right through this nonsense that they are, in fact, doing
00:17:40.620
what they object to so strenuously.
00:17:43.780
I think the problem is that we're losing our ability to do that.
00:17:47.060
You know, it'll, it'll, it's only a matter of time before Libs of TikTok won't be on the
00:17:50.360
internet anymore.
00:17:51.380
I mean, maybe the Substack will last a little bit longer than the Twitter account.
00:17:55.180
Who knows how long Substack will last if someone goes after the payment processor behind
00:17:58.920
it, all of that, the hosting.
00:18:01.880
You know, so this is where the, this is where the fight is right now.
00:18:05.380
We are fighting for the right to object to this widespread, um, depravity targeted at children
00:18:13.940
that is targeted at corrupting our children.
00:18:17.840
And we're, we're barely hanging on to the right to even object to it, which is crazy.
00:18:24.120
You mentioned a moment ago too, that, you know, it's so dystopian talking about these
00:18:27.200
phrases like family friendly drag shows and gender affirming care, which is a terrible
00:18:32.480
euphemism.
00:18:32.940
One of the more evil euphemisms I've ever heard for something that's really atrocious.
00:18:36.440
Um, we, I feel like we've become almost numb to how crazy it is.
00:18:42.540
I know you and I both think that it's really crazy, but we get so used to talking about these
00:18:46.320
things.
00:18:46.760
You go back just a few years and these things would have been absolutely unfathomable by
00:18:52.160
everyone.
00:18:52.980
And we're, we're, we're now like kind of numb to it that it's just, it's happening every
00:18:56.840
day.
00:18:57.140
It's happening everywhere.
00:18:58.280
It wasn't that long ago that people were denying that there was such a thing as a family friendly
00:19:02.580
drag show.
00:19:03.120
Like you go back merely weeks and people were saying, no, that's not even happening.
00:19:06.780
It's happening everywhere.
00:19:08.020
It's all over the place.
00:19:09.080
And it's so common that we don't even find it as objectionable as we should.
00:19:12.680
Yeah.
00:19:13.260
And that's really scary to me.
00:19:14.760
That's what's scary to me is we need to maintain this sense of outrage that this is really,
00:19:19.340
really wrong.
00:19:20.200
And we have to continue to stand up and fight against it with every breath that we have
00:19:24.440
and for as long as we can until they shut us all up.
00:19:33.120
That's what the left does.
00:19:39.040
They simultaneously say it's not happening and it's good that it is, just like so-called
00:19:45.020
comprehensive sex education, CRT in schools.
00:19:48.640
That's definitely not happening.
00:19:50.500
And if it is, it's awesome that it's happening and you're a bigot for even, uh, for even bringing
00:19:56.380
it up.
00:19:56.920
So dumb.
00:19:57.720
Speaking of satire.
00:19:59.340
Yeah.
00:19:59.680
Yeah.
00:20:00.160
I know.
00:20:00.520
Which continues to make y'all's job at the Babylon Bee kind of difficult.
00:20:05.720
Um, so you talked about like, who knows how long Substack is going to be there?
00:20:09.680
Who knows how long any of our accounts are going to be on Twitter?
00:20:12.060
I'm just like waiting for the day that I get permanently banned like James Lindsay, which
00:20:16.100
still bums me out.
00:20:17.660
And I saw that Truth Social, according to Axios, um, they were kicked off Google Play, the Google
00:20:24.400
Play Store temporarily.
00:20:26.320
Um, now I'm not on Truth Social.
00:20:27.980
I don't know if you are, um, but they are, Google Play is saying, Google is saying that,
00:20:35.000
you know, they don't meet our standards.
00:20:36.320
We have to take them off.
00:20:37.680
Obviously I'm having a hard time believing that. A similar thing kind of happened to Parler,
00:20:41.420
I think in 2021.
00:20:43.420
And so it's like, where do we even go if the plot, if we can't even build platforms anymore?
00:20:50.020
I mean, what are you supposed to do?
00:20:52.240
Yeah, that's tough.
00:20:53.080
I mean, yeah, that's the, it's the content moderation that they're getting knocked for.
00:20:56.260
You know, they don't have enough, they don't have appropriate procedures in place, according
00:20:59.720
to Google for dealing with, um, content moderation and taking down content that Google doesn't want
00:21:06.680
in apps.
00:21:07.960
So, so it's ultimately, that's the problem.
00:21:10.320
You know, you, you've got Parler and, uh, and, and Truth Social and some of these other
00:21:14.320
apps that, that want to be able to be distributed widely through these app stores, but they're
00:21:19.900
really beholden to the app stores and the, and the terms that the app stores put in place
00:21:24.680
for what they moderate and what they don't moderate.
00:21:26.820
So it's not really up to them.
00:21:28.140
It's up to the, it's up to Google.
00:21:29.680
It's up to Apple.
00:21:31.120
And obviously these are big tech companies that have all the, you know, all these insane
00:21:37.420
progressive ideas, all this ideology that's being shoved down everyone's throats.
00:21:42.800
They are affirming it and they're building it into their terms that you have to affirm
00:21:48.100
it and you can't criticize it.
00:21:49.880
And so even if you want to have a free speech platform, you can't because you still have to
00:21:54.500
abide by their rules and you still have to do content moderation based on what they say
00:21:58.200
you need to moderate.
00:21:59.580
That's deeply problematic.
00:22:01.360
Deep.
00:22:01.580
It means that there is no public square where free speech exists anymore.
00:22:06.340
And, uh, and so something needs to be done to deal with that.
00:22:09.000
I'm optimistic, honestly, that something will be done to deal with that.
00:22:12.060
I just don't know when it's actually going to happen.
00:22:14.560
Um, you know, I, I don't know.
00:22:17.140
I don't know.
00:22:17.720
Look, you know, Elon Musk's promise to, to turn Twitter back into the de facto town square.
00:22:22.940
How's that going to work?
00:22:23.920
If Apple says, you know what?
00:22:25.040
Twitter can't be in our store anymore.
00:22:26.780
Yeah.
00:22:27.940
And I mean, what's the deal with that?
00:22:29.660
I don't know if you necessarily have any inside knowledge, but is that off the table
00:22:35.640
now?
00:22:35.980
I know that they're kind of in a lawsuit.
00:22:37.660
Is there any hope that Elon Musk is going to take over Twitter?
00:22:40.900
He's right here.
00:22:41.600
We hang out.
00:22:42.060
He's in my office.
00:22:42.720
Oh, hey, Elon.
00:22:44.040
What's up?
00:22:44.360
We're friends.
00:22:45.240
So, okay, cool.
00:22:46.220
What does he say?
00:22:47.300
I don't have, uh, inside information on this.
00:22:50.060
Um, if I did, I couldn't reveal it, but I, I don't know.
00:22:54.800
I still, I can't make up my mind on whether or not Musk is, uh, is really legitimately
00:22:59.780
trying to get out of the deal or if he's just trying to work out a better deal.
00:23:03.580
Um, you could make an argument, I think, uh, you could make an interesting argument for
00:23:08.260
either one of those.
00:23:09.200
And, uh, but I don't know.
00:23:11.160
I remain hopeful that it goes through.
00:23:12.720
I do hope that it goes through.
00:23:13.980
I think that it would be good if it was in his hands.
00:23:15.940
Yeah.
00:23:16.320
Definitely get it out of the hands.
00:23:17.620
Like Bill Maher said, Twitter does need a new sheriff.
00:23:20.220
Get it out of their hands and put it, put it into the hands of somebody who actually
00:23:24.340
values free speech.
00:23:25.600
And I know that, you know, Musk is by no means a, uh, uh, a conservative.
00:23:30.400
Um, but if he values free speech, then, then that's a good thing.
00:23:34.540
Yeah.
00:23:35.080
Um, but ultimately, you know, we can't, we can't depend on super wealthy, uh, uh, benevolent,
00:23:43.800
um, saviors to, to come in on a, on a horse and save us.
00:23:48.760
Um, there needs to be something that's done either through Congress or through the Supreme
00:23:53.340
Court or something.
00:23:54.140
There needs to be some kind of, uh, through the, you know, the appropriate channels where
00:23:57.880
we actually preserve the First Amendment, where, where speech, where people are actually
00:24:00.920
speaking and being heard.
00:24:02.360
Yeah.
00:24:02.580
And that is going to require political will.
00:24:05.220
I know we, as Republicans typically say, we don't want to rely on the government.
00:24:09.020
We don't want to rely on politicians, which I agree.
00:24:12.160
I would love to be in a position where we don't have to rely on people in power to advocate
00:24:17.920
on our behalf.
00:24:18.620
But I mean, Roe v. Wade was overturned because of the political will, not just of the people,
00:24:23.960
uh, in the state, but also the, the lawmakers in the state who made a law that then made
00:24:31.660
it to the Supreme Court.
00:24:32.760
So the same thing has to happen.
00:24:34.500
We have to elect representatives who have the political and moral will to not just do something
00:24:41.180
about censorship and ensure that free speech is actually preserved, but also a lot of the
00:24:46.400
moral atrocities that we're talking about, which is why I like Ron DeSantis.
00:24:50.560
I mean, he's willing to do everything that is constitutionally allowed, all the tools that
00:24:56.140
are constitutionally available to him, to use his power to push back against things, which
00:25:00.580
makes some conservatives uncomfortable.
00:25:02.220
I just feel like, I don't know.
00:25:04.560
I just think that that's what time it is.
00:25:06.260
I just think that that's the new era that we're in, that we have to kind of be comfortable
00:25:10.560
with politicians exercising power on behalf of good things.
00:25:14.920
Yeah.
00:25:16.580
And it's not the power of force where you're compelling somebody to believe something that
00:25:21.420
they don't want to believe or do something that they don't want to do.
00:25:24.780
It's the power.
00:25:26.340
Um, it's the power that's used to safeguard what's good and true and, uh, and preserve,
00:25:32.720
uh, freedom and rights.
00:25:35.580
And so, you know, it's obviously twisted and, and distorted when they criticize DeSantis,
00:25:40.780
you know, they, they criticize him as being a tyrant who wants to take away your rights.
00:25:43.860
Um, but everything that I see that he's done has been a fight for the preservation of freedom,
00:25:49.560
not its annihilation or its oppression.
00:25:52.660
So, um, I agree with you.
00:25:54.720
I mean, we do need, we do need leadership with a backbone.
00:25:57.440
We need leaders who see these problems for what they are, who aren't, who aren't numb to
00:26:01.620
them, who aren't willing to give an inch and are, and are willing to fight and say, look,
00:26:05.840
we have to use the power at our disposal, uh, to preserve the good, the true and our freedoms.
00:26:11.200
Um, and if we're not willing to do that, I mean, if, if anyone's not willing to do that,
00:26:15.240
then we got to vote them out and replace them with someone who is.
00:26:18.760
Yeah.
00:26:19.620
Yep.
00:26:20.080
I agree.
00:26:20.740
You recently tweeted, and you've said this in a variety of ways.
00:26:23.800
I think I've heard you say this.
00:26:25.500
Some people think we're improving morally by making fun of fewer things.
00:26:28.900
I think the opposite is true.
00:26:30.120
We're more depraved than ever because we're accepting and affirming what should be ridiculed
00:26:36.960
and rejected.
00:26:38.920
So I agree with you.
00:26:40.440
Some people would say, maybe people who identify as maybe the tone police, they would say, no,
00:26:45.880
we're not going to get anywhere by ridiculing people, by rejecting people, by making fun
00:26:51.260
of people, punching down, whatever it is.
00:26:54.020
We just need to, you know, engage with these people and engage with these issues.
00:26:57.920
Why do you think ridiculing these ideas is important for morality?
00:27:03.720
Well, yeah, I mean, that goes, I think you can make a moral case for mockery and I've
00:27:09.000
tried to make it, uh, over the last couple of years.
00:27:11.900
Um, I put it another way, the way that I said it on, on Rogan's show was the, the absurd has
00:27:18.160
only become sacred because it hasn't been sufficiently mocked.
00:27:22.020
And, um, I think that's really true.
00:27:24.440
You know, we're talking about these insane ideas, these things that are so unbelievable,
00:27:27.820
but you know, we've become kind of numb to them because they're so commonplace now.
00:27:31.280
Um, they're not just commonplace, they're sacred, they're untouchable.
00:27:36.560
You can't criticize them.
00:27:37.700
You can't joke about them.
00:27:38.740
Well, why is that?
00:27:39.680
Why is that?
00:27:40.120
Well, because we took them seriously and it's not, this is not about, this is not about attacking
00:27:45.560
people personally and making them feel bad about themselves and bullying them into believing
00:27:50.220
what we believe or something like that.
00:27:51.760
This is about, uh, examining bad ideas that are harmful, that can hurt people, that will
00:27:58.520
impact our kids and holding them up to scrutiny, uh, criticizing them even harshly, brutally
00:28:06.160
and ridiculing them.
00:28:07.640
Yes.
00:28:07.900
Ridiculing them, mocking them.
00:28:09.040
If it's an absurd, insane idea that would be harmful if it played out in our society, then
00:28:14.360
we should be mocking it to the sidelines so that it's never adopted, so that it never becomes
00:28:18.580
popular, so that, so that kids, young people see it for what it is and laugh it off instead
00:28:23.500
of taking it seriously.
00:28:24.980
Imagine if we had done that more effectively over the last several years.
00:28:28.240
I think that comedians, and I, you know, I was making this point with Rogan and he pushed
00:28:32.000
back on me a little bit.
00:28:32.880
We disagreed about it.
00:28:33.660
Yeah, I remember this part.
00:28:34.220
Yeah.
00:28:34.760
But yeah, I think comedians bear some of that responsibility, especially satirists, because
00:28:38.780
I put, um, like I was telling him, I put satirists in a different category than comedians.
00:28:42.820
You know, comedians, comedians tend to just make jokes for the sake of making jokes.
00:28:46.960
And oftentimes they're just really silly punchlines.
00:28:49.980
Um, they don't necessarily have any kind of moral concern behind them.
00:28:53.180
The satirist usually has a moral concern behind his jokes, though not always.
00:28:56.760
Some, some of our jokes are just pretty stupid and silly too.
00:28:59.520
Um, but when you're doing religious satire, especially, there is definitely moral concern.
00:29:04.320
Satire, religious satire is the marriage of wit and moral concern together and using humor
00:29:10.100
as the vehicle to put that through.
00:29:11.540
Um, and so, you know, when people, when people object to the tone, you get the tone police
00:29:16.640
or you get the people saying, oh, Jesus wouldn't make fun of others and he wouldn't mock and
00:29:20.780
ridicule them.
00:29:21.480
It's not about attacking people and making them feel bad.
00:29:24.060
It's about attacking these bad ideas.
00:29:25.920
We're not running around with a knife, trying to stab someone and hurt them.
00:29:29.880
We're more like, this is the analogy I use with Rogan.
00:29:32.980
We're more like the surgeon who's using a scalpel, trying to excise something bad before it kills
00:29:37.620
the host.
00:29:39.040
You know, it's like, it's like trying to cut out these social cancers.
00:29:42.040
Why do we have so many social cancers?
00:29:44.140
Well, in part, because we didn't do enough to push back on them, to ridicule them.
00:29:49.040
So, um, I, I honestly, I think, I think, you know, who knows, maybe I'll write a book on
00:29:53.160
that subject because I think it's an important topic.
00:29:55.040
I think that we need to do more to defend, not just the rightness of, of ridicule in certain
00:30:00.260
contexts, obviously.
00:30:01.660
Um, but the effectiveness of it and why we're suffering so much from, from being susceptible to,
00:30:06.500
especially in the, with the minds of young people, these really insane, harmful ideas
00:30:11.760
that are, that are, that are taking a huge toll.
00:30:14.400
If you just look at the stats on like teens right now, like how, how depressed teenagers
00:30:19.080
are, um, you've got so many things feeding into this.
00:30:22.020
Obviously there's the social media, all the, all of those trends, but we're, we are as a
00:30:26.520
society, we are purposefully and successfully confusing countless kids so that they don't even
00:30:33.320
know what and who they are and then the solutions that we provide to them irreversibly damage
00:30:38.540
them for life.
00:30:39.540
Right.
00:30:40.040
And that is something that we're not going to subject to ridicule?
00:30:42.960
Is there anything more deserving of ridicule than that?
00:30:44.820
And I do feel like it's the younger generation that sometimes, especially like the liberal
00:30:59.960
younger generation that is, uh, the most averse to that kind of mockery and that kind of derision
00:31:08.060
because of one thing that you just mentioned, like the pressure that comes not just with social
00:31:13.780
media, but also being in school to go along, to get along and to put your pronouns in your
00:31:18.740
profile and to say, yeah, trans women are women.
00:31:21.280
You're not, your brain isn't even allowed to question it because you are so scared of being
00:31:26.900
bullied or being excluded or being called a bigot.
00:31:31.220
I mean, that's been true since the beginning of time that teenagers don't want to be excluded.
00:31:35.180
You want to fit in.
00:31:35.960
It's an awkward time, but especially with not just peer pressure, but with pressure from
00:31:43.520
the media, pressure from politicians, pressure from the culture at large to all be thinking
00:31:48.840
one way.
00:31:49.660
I think a lot of them are probably afraid to laugh at the things that maybe they know
00:31:54.720
intuitively are ridiculous.
00:31:57.040
And that's how, I mean, that's how the thought police works.
00:31:59.100
That's how like this dystopian language that we're talking about works.
00:32:02.660
It works to limit even your range of thought, not just your range of language, but your range
00:32:08.360
of what your mind is even allowing itself to imagine.
00:32:12.960
And that is why I think mockery, that is why I think satire is so important because it gives
00:32:19.700
your mind permission in like a funny way and maybe a seemingly more permissible way to think
00:32:26.560
about the things that you have been told are forbidden.
00:32:30.300
That's one reason why I think it's so important.
00:32:31.840
That's one reason.
00:32:32.300
I also think that if you, you know, if somebody is, if somebody is telling you with a straight
00:32:36.740
face that two and two make five, and, and this is literally a person who's just abandoned
00:32:41.560
rationality on purpose, like they, they're not crazy.
00:32:45.260
They're just, they've bought into the importance of insisting that two and two makes five, even
00:32:51.220
though they know that it doesn't.
00:32:53.800
Reason doesn't work.
00:32:54.780
You can't appeal to that person's reason because they've abandoned their reason.
00:32:57.840
So what's the other tool that you can use?
00:32:59.760
If you don't have reason, you can at least ridicule it, make them look silly.
00:33:02.920
And, uh, and, and expose how ridiculous their, their way of thinking is to other people on
00:33:07.780
the sidelines.
00:33:08.540
You're not always engaging in these things.
00:33:10.600
And I know, you know, this because you're, you know, you're out there in the public sphere
00:33:13.720
and debating people on Twitter and debating people on your show.
00:33:16.740
You're not always going to change the mind of the person that you're talking to.
00:33:19.620
In fact, you'll rarely change the mind of the person that you're directly talking to,
00:33:22.920
but you're definitely going to be influencing the people on the sidelines who are listening in.
00:33:26.520
Because a lot of them haven't made up their minds and they're just waiting to see who
00:33:29.800
makes an actual case for something.
00:33:31.700
And if you can make something either, if you can either refute it or ridicule it and make
00:33:35.520
it look silly, then you can influence those people.
00:33:37.560
And it's not a cheat.
00:33:38.820
I mean, it's, it's, it's not like a, it's not like you're resorting to ad hominem to avoid
00:33:43.400
actually engaging the arguments.
00:33:45.040
These aren't arguments.
00:33:46.140
These are, these are just insane ideas.
00:33:48.780
So they are deserving of mockery.
00:33:50.020
And it's not even, it's, I would say it's not even ad hominem when it is, when it's
00:33:56.680
satire or when it's, I mean, sometimes humor, of course, can be a personal attack, but satire
00:34:02.860
typically isn't.
00:34:04.000
I mean, you're making an ironic point about something that happens to be true.
00:34:07.380
Like I had some people, Christian women typically, say that about my Elizabeth Warren
00:34:13.680
video where I pretended to go to the pregnancy center and say, this is so, you know, dangerous
00:34:17.740
and awful saying, you know, we're told to respect our leaders and this is disrespectful.
00:34:22.640
This is not Christ-like.
00:34:24.100
I'm like, well, it really had nothing to do with Elizabeth Warren herself.
00:34:28.120
It had to do with this very destructive and dangerous and damaging idea that the very centers
00:34:33.720
that are providing all of the resources that the left says that women need in order to choose
00:34:38.300
life, that Elizabeth Warren is demonizing them.
00:34:41.380
And of course it also, uh, played a double role of showing people what a pregnancy
00:34:47.680
center is actually like.
00:34:49.280
So I think that there is an absolutely necessary place for that. It wasn't to make Elizabeth Warren feel bad about
00:34:55.400
herself.
00:34:55.800
You know, it wasn't, you weren't attacking her personally to, to go after her.
00:35:00.080
There's a, there's a, there's another point that you're trying to make.
00:35:02.160
And that's usually where they, that's usually where they get it wrong is they try to go after
00:35:05.580
you, misunderstand your motives, and try to act like the most important thing in the world
00:35:11.000
is to not hurt anyone's feelings.
00:35:13.640
It's like, look, you know what?
00:35:14.980
If some feelings get hurt in the process of, of us, uh, protecting kids from being castrated,
00:35:23.300
mutilated, drugged, aborted, uh, so be it.
00:35:27.800
I mean, your feelings, uh, feelings don't matter nearly so much as children's, uh, bodies
00:35:32.980
and rights, um, uh, preserving and protecting them.
00:35:37.180
So, you know, and by the way, and this is an important point, too: the outrage, the feelings
00:35:43.220
that are hurt, it's 99% of the time fake.
00:35:46.700
Yeah.
00:35:47.240
They fake being upset.
00:35:48.860
They're pretending to be offended so that they can get you to apologize so that they can basically,
00:35:55.280
uh, get you to submit.
00:35:56.560
And, uh, and so this is a tactic.
00:35:59.420
It's a tool that they use.
00:36:00.600
And I tell people all the time, you know, never censor yourself and never apologize to
00:36:05.320
these people who you, who you don't actually know, apologize to someone that you know and
00:36:09.000
love who you actually genuinely offend when you do offend them, but don't apologize to
00:36:13.540
random people on the internet who are using fake outrage as a tool to bludgeon you into
00:36:17.760
submission and silence.
00:36:19.020
Exactly.
00:36:19.240
Never.
00:36:19.660
I beg you, never apologize to those people.
00:36:21.920
And never apologize.
00:36:24.180
And I'm talking about in this context, never apologize for that which you're not sorry
00:36:29.840
for.
00:36:30.340
If you say something that, uh, you know, someone calls you, who knows, racist or transphobic
00:36:36.160
for, but it's actually true.
00:36:37.780
It's actually true.
00:36:38.760
And you really did mean it.
00:36:40.160
And the only reason you're saying sorry is because people misunderstood you or took you
00:36:44.680
out of context or want to malign you.
00:36:46.920
You don't need to say you're sorry.
00:36:48.920
Exactly what you said.
00:36:50.020
Give a sincere apology to someone that you care about and who actually will take your
00:36:54.360
apology.
00:36:55.420
Do not apologize to the trolls online.
00:36:57.840
I remember there was this, um, this happened, uh, gosh, with this liberal woman.
00:37:02.200
Instagram is the worst place on earth.
00:37:03.860
But there is this woman who, she owns this like baby wrap company, baby carrier company.
00:37:10.820
And she was, she made this post, oh, I'm patenting this baby wrap.
00:37:16.220
This is, you know, amazing.
00:37:17.280
This was right after George Floyd,
00:37:19.140
when Instagram was worse than it usually is.
00:37:22.760
And all of these activists came to her page and said, you know, how dare you a white woman
00:37:28.600
try to patent this?
00:37:29.580
You know, African women have been using this type of like baby wrap for thousands of years,
00:37:35.720
indigenous women, blah, blah, blah.
00:37:37.120
So she did this whole apology where she was crying and she was saying like, I've wanted
00:37:43.020
to commit suicide over the past few days.
00:37:44.940
This has been so awful for me.
00:37:46.920
What did the commenters say?
00:37:48.680
Stop making this about you.
00:37:50.180
This isn't about you.
00:37:51.340
Stop trying to make us feel bad for you.
00:37:53.220
They didn't take her apology at all.
00:37:55.000
She then had to do an apology for her apology.
00:37:57.620
And she decided that she wasn't going to patent the wrap anymore, which is absolutely
00:38:02.160
ridiculous.
00:38:03.040
It had no correlation to reality whatsoever.
00:38:05.360
And it just goes to show that the apology doesn't matter.
00:38:09.460
If she would have just moved on and not said anything, those people would have gotten
00:38:13.260
bored.
00:38:13.760
They would have moved on to the next prey.
00:38:15.720
Of course.
00:38:16.240
That's what people need to understand.
00:38:17.700
The outrage will dissipate.
00:38:19.880
Just keep going.
00:38:21.340
Yeah.
00:38:21.700
Well, and it's not just that they don't really actually want your apology.
00:38:25.100
They want your submission.
00:38:26.200
Right.
00:38:26.400
Um, they want to make a point out of you.
00:38:29.640
Um, but you also, you also do more damage when you apologize, because what you're doing
00:38:35.640
is you're feeding the mob and strengthening it and encouraging them that this kind of behavior
00:38:40.620
works and that it gets them what they want.
00:38:43.120
And so what you, what you're doing is you're, you're setting other people up to be in the
00:38:48.220
same position that you were just in by encouraging that mob to go after others and try to get that
00:38:52.500
same kind of response that you gave them.
00:38:54.020
So, you know, I think we, I think we take the power away from them when we refuse to
00:38:58.640
apologize.
00:38:59.680
I wish there were more people who understood that because really, honestly, you can like
00:39:03.480
count them on one hand.
00:39:04.560
There's very few who get it.
00:39:06.300
Uh, even the ones that you would think and hope wouldn't apologize.
00:39:09.420
Sometimes if they have a big enough platform and they're worried about losing sponsors or
00:39:12.860
whatever, there's a lot of pressure on them and they cave, they cave and, and, uh, and they
00:39:17.140
only make the mob stronger.
00:39:29.520
The average person can help with that because there was another story that happened right
00:39:33.840
around the same time.
00:39:34.820
There is this, like, baby sleep trainer who has a huge social media presence.
00:39:39.800
And someone revealed that her family had given like $1,700 to Trump back in like 2018.
00:39:47.360
And so they took all of her proprietary videos that were behind a paywall.
00:39:51.180
They made them public so that she couldn't make money on them.
00:39:53.760
I mean, just trashed her.
00:39:55.240
She had friends with big accounts that unfollowed her, talking bad about her.
00:39:59.280
And at the time, all I did was post to my followers, Hey, um, you should message this woman
00:40:06.240
and just encourage her and just tell her that you support her and that you're thankful for
00:40:11.980
her.
00:40:12.460
And she ended up sending me this like really long, wonderful, and like very encouraging,
00:40:17.880
but also it wasn't necessary because I didn't do anything, letter just saying like, that
00:40:22.220
is what helped me not apologize.
00:40:24.200
All of the messages from people saying, thank you, stand strong.
00:40:29.180
It's okay.
00:40:29.700
I'll still support you.
00:40:30.900
I still follow you.
00:40:31.900
She was like, that is why I kept going and did not back down.
00:40:36.060
So she didn't apologize.
00:40:37.220
She came back to social media a couple of days later and said, I'm here to keep serving
00:40:41.240
families.
00:40:42.360
Thank you for those of you who are still here.
00:40:44.300
Never brought it up again.
00:40:45.720
She's doing fine.
00:40:47.220
So we can share the arrows, whether you're in the public eye or not, like you can share the
00:40:52.240
arrows with the people who are feeling the heat.
00:40:54.940
And instead of saying, hey, oh, I'm so glad that that's not me.
00:40:58.480
I'm glad that I'm not the one getting bullied right now.
00:41:01.560
You standing up and saying, yeah, me too.
00:41:03.760
I also hold that opinion.
00:41:05.320
I stand with her.
00:41:06.400
That makes a huge difference.
00:41:07.680
And if everyone did that, maybe we would all have the courage to not back down.
00:41:12.420
Yeah, I think that's a great point.
00:41:13.720
That's a really good point because you need something to counterbalance, to counteract the
00:41:17.880
insanity, the pressure that's coming to silence yourself or to apologize or whatever.
00:41:24.560
And that's you can't ask for that.
00:41:26.840
People have to willingly offer that up.
00:41:29.220
Yeah.
00:41:29.480
And so, yeah, I think it's a good point.
00:41:31.920
People need to be conscious of that.
00:41:33.420
Yeah.
00:41:33.980
All right.
00:41:34.740
Last question.
00:41:36.120
You had Elon Musk on your show and you guys talked a little bit about the Bible and
00:41:41.660
about Christianity.
00:41:42.400
Christianity and there was some pushback.
00:41:45.440
There is some pushback about like how you guys talked about Christianity and the gospel
00:41:49.800
and theology with Elon Musk.
00:41:51.960
Tell us a little bit about that, like what you kind of thought about the criticism and
00:41:56.600
how the conversation really went from your perspective.
00:42:01.460
Yeah, you know, this happens.
00:42:04.100
I think any time you do something like this and it has this much visibility, you can't make
00:42:09.620
everybody happy.
00:42:10.280
I think I think every time we publish a headline, somebody gets upset and offended by it.
00:42:14.980
Anytime we do an interview, somebody is upset and offended by something that we said or
00:42:18.460
did.
00:42:19.700
In this particular case, though, you know, we did this, we did kind of like
00:42:25.040
a mock altar call situation.
00:42:28.300
And, you know, some people weren't happy with that.
00:42:31.140
They actually wanted us to just present the gospel really straightforward to Musk.
00:42:34.360
And I think that maybe some of the misunderstanding there is that
00:42:40.340
from our perspective, you know, we didn't have him on our show so that we could pull out
00:42:45.000
our Bible and try to get him to convert on the show.
00:42:47.980
We had him on our show so we could hear him out.
00:42:50.840
And, you know, I think that we've done a good job so far of fostering and developing
00:42:55.220
a relationship with Musk and some good questions were raised in that discussion and there will
00:43:01.920
hopefully be future discussions.
00:43:03.400
I mean, we've continued to stay in touch with him and in contact with him since then.
00:43:06.740
So I think I'm personally one of these people who believes, when it comes
00:43:11.060
to evangelizing, sharing the gospel, apologetics:
00:43:15.880
There are a lot of intellectual objections that need to be cleared out of the way first.
00:43:19.860
There's a lot of relationship building that needs to happen.
00:43:22.280
I don't think that it's always advisable, necessary, or right to just dive straight
00:43:29.480
into a straightforward gospel presentation and try to get someone to pray the sinner's
00:43:33.880
prayer with you.
00:43:35.280
In some cases, you can really turn somebody off, and you could end up undermining the
00:43:41.520
relationship that's already there that would have given you a better opportunity down
00:43:45.540
the road.
00:43:46.100
So do I think that we handled everything perfectly in that interview?
00:43:49.460
No, of course not.
00:43:50.280
I would go back and change some of those things.
00:43:52.160
But it was also super last minute.
00:43:54.220
It's not like you guys had been preparing for that for three months.
00:43:56.720
It kind of just fell into your laps.
00:43:59.680
Yeah, it did, literally overnight.
00:44:01.600
So I think we did as good a job as we could do with it.
00:44:05.720
I don't really have regrets, you know, where I'm like, oh, I'm so sorry
00:44:08.420
we handled it a certain way.
00:44:10.180
I think that we did a great job of preserving the relationship
00:44:15.540
and establishing trust with him.
00:44:17.100
And I hope we'll continue to have more conversations with him.
00:44:19.200
We did talk with him offline for almost an hour after the interview was over.
00:44:21.980
So there was more conversation that happened off camera than you saw on camera
00:44:26.540
with that whole thing.
00:44:27.500
So, you know, people love to be critics, and that's fine.
00:44:32.200
They can have their opinions.
00:44:33.440
But, you know, there are plenty of conversations that I've had that I am sure people are like,
00:44:37.940
why weren't you clear on this?
00:44:39.280
Why didn't you share the gospel more?
00:44:40.860
Why didn't you correct them on this?
00:44:42.860
And I mean, it can be difficult in the moment.
00:44:45.440
Hindsight is 20/20.
00:44:47.620
I mean, I'm confident in the Lord's sovereignty and we don't have time to get into the debate
00:44:52.160
about our differences on salvation.
00:44:53.700
But I think we both agree that the Lord is sovereign, and I'm hopeful that he can use any seeds planted
00:44:59.860
in that conversation, water them, and give them growth.
00:45:03.200
Tactics by Greg Koukl is a great book.
00:45:05.580
And he talks about how not every conversation or every discussion or debate is necessarily
00:45:10.640
about winning, but about doing everything you can in that particular conversation,
00:45:17.320
in that interaction, to plant good seeds, make them start asking questions and thinking for
00:45:22.300
themselves.
00:45:23.520
You just don't know what the Lord can do with that.
00:45:25.620
So I do wish, in that conversation when he brought up that he thinks that Jesus
00:45:30.060
is a good moral teacher, for example, and he brought up some, you know, love your enemies,
00:45:34.780
you know, turn the other cheek, that kind of stuff, and do unto others as you would have
00:45:39.360
them do unto you.
00:45:39.840
The golden rule, right?
00:45:40.660
You know, he's citing some of these things and saying, you know, Jesus is a good moral teacher.
00:45:43.440
And I do wish in hindsight that I had pressed him on that to say, well, yeah, he also
00:45:48.740
claims to be God.
00:45:49.680
I mean, is he? And pull out the whole liar, lunatic, or Lord thing, the
00:45:53.480
trilemma.
00:45:54.160
If he made these outrageous claims about himself and he was either lying or
00:46:00.320
insane, then he certainly wasn't a good moral teacher, you know, so you've
00:46:06.020
got to figure out exactly who he was based on what he said.
00:46:08.740
And I think that, you know, the best explanation, from my perspective, is that he was who
00:46:13.980
he claimed to be.
00:46:14.960
And that's why I can call him a good moral teacher: because he was an honest
00:46:18.480
person who was very sane and happens to be the Son of God.
00:46:22.220
Musk would have to wrestle with that and try to decide, you know, how do I
00:46:25.900
account for the fact that I consider him a good moral teacher while he's out there claiming
00:46:28.840
to be God and saying that he can forgive sins?
00:46:31.360
You know, that doesn't make any sense.
00:46:33.380
The guy's a madman or a liar if he's not actually God.
00:46:36.680
So I would have pressed him on that, but you know, we didn't have time.
00:46:40.380
Yeah.
00:46:40.560
Next time.
00:46:41.400
Lord willing, there'll be another opportunity.
00:46:44.640
So, well, thank you so much.
00:46:46.280
Thanks for your insight.
00:46:47.220
Thanks for everything that you do.
00:46:49.140
One thing I want to ask: did you think, when you decided to start running the
00:46:54.200
Babylon Bee, that you would be as busy as you are, out front speaking at these conferences,
00:47:00.260
doing these interviews, with as much traction as you get on Twitter? Did you think
00:47:06.120
that you would kind of become a spokesperson, not just for satire, but really for a lot
00:47:10.600
of conservative values in general?
00:47:12.220
I mean, you are out there everywhere.
00:47:15.440
No, definitely not.
00:47:18.400
I thought I would be kind of behind the scenes running the business, and that we would
00:47:23.080
just keep making jokes on the internet.
00:47:24.560
But I think with all of the attacks on us, there needed to be a response.
00:47:28.400
You know, we needed to have our wits about us, and as much as we want to keep
00:47:32.380
things light and bring levity to the situation, there are things that we have to deal with
00:47:35.620
very seriously.
00:47:36.780
Yeah.
00:47:36.980
And I think there have been a lot of opportunities for me to go out there.
00:47:40.020
The conversation with Rogan was a really good one, to give, you know, a serious
00:47:46.060
take and perspective on these real issues that are impacting us.
00:47:49.160
We found ourselves in the middle of all this, even running a comedy site.
00:47:51.440
It's crazy.
00:47:51.940
You know, our speech has been under attack.
00:47:53.840
The truth has been under attack.
00:47:55.180
Rationality has been under attack.
00:47:56.280
Comedy has been under attack.
00:47:57.320
And we found ourselves on the front lines, defending all of these things.
00:48:00.900
And so, no, I never expected to find myself in the midst of a battle for the
00:48:05.660
preservation of freedom and the restoration of sanity.
00:48:07.620
That was not on my radar, but we're here and we're just doing our best
00:48:12.580
to speak the truth, to do it boldly, to use humor as much as possible.
00:48:17.320
And, you know, God will do the rest.
00:48:19.800
Yeah.
00:48:20.300
Well, good job.
00:48:21.440
Thanks for taking the time to come on.
00:48:22.860
I really appreciate it.
00:48:24.340
Thank you, Allie.
00:48:27.320
Thank you.