Ep 675 | Want to Topple the Elites? Mock Them | Guest: Seth Dillon
Episode Stats
Words per Minute
193.27954
Summary
Seth Dillon, CEO of The Babylon Bee, is here to discuss Libs of TikTok, and other journalists being punished for calling out children s hospitals for mutilating kids. And we re also talking about why Seth s appearance on Joe Rogan s podcast, as well as his appearance on the podcast. We ll get into all of this and more.
Transcript
00:00:00.000
Seth Dillon, CEO of The Babylon Bee, is here to discuss libs of TikTok and other journalists
00:00:06.440
being punished for calling out children's hospitals for mutilating kids.
00:00:11.080
And we're also talking about why mocking really matters to a healthy society, as well as his
00:00:23.120
As always, this episode is brought to you by our friends at Good Ranchers.
00:00:26.780
That's American meat delivered right to your front door.
00:00:41.980
All right, before we get started in that conversation, I wanted to play you a little clip from Seth's
00:00:48.640
interview with Joe Rogan when they were talking about abortion and Seth's pro-life position.
00:00:55.940
I thought he did a really good job, and I wanted to make sure that you saw and heard
00:01:03.620
You don't have the right to tell my 14-year-old daughter she has to carry her rapist baby.
00:01:08.620
To look that woman in the eye who was the borson, do you understand that?
00:01:13.560
If a 14-year-old child gets raped, you say that they have to carry that baby?
00:01:18.920
I don't think murder is an answer to – I don't think murder fixes a rape.
00:01:23.280
When we start talking about harmful misinformation and the types of things that are considered
00:01:26.560
– like that I say or that we tweet or the jokes that we make that are considered harmful
00:01:29.720
misinformation, I'm like, well, what about calling that baby a clump of cells?
00:01:33.740
I think that's harmful misinformation because then you're encouraging people to kill it like
00:01:41.000
I think abortion is health care the way that rape is lovemaking, if we want to use rape as
00:01:46.340
I think it's – I think they're opposites and it's like a – these are euphemisms
00:01:54.080
We're talking about a procedure that ends an innocent human life and we're calling it
00:02:01.420
And this is why it's such a human issue because I see what you're saying.
00:02:06.120
So our friend Seth held his own and he is here to discuss that.
00:02:19.680
First, I want to hear about your experience on Joe Rogan.
00:02:29.160
Were you – I mean, you didn't seem at all nervous or anything like that, but was it a
00:02:38.300
I mean, I had all my friends leading up to the whole thing talking about how this is
00:02:45.180
And so they were kind of drilling it into my head that it was something to be nervous
00:02:51.180
But we did spend some time at his studio, probably about 40 minutes before we got going
00:02:55.720
So we were talking and hanging out and he was giving us a tour and everything, which can – you
00:02:59.580
know, that helps to relax you a little bit, just getting to know the person you're about
00:03:04.580
But, I mean, it's a very casual conversation, just like this, you know, just sitting here
00:03:11.520
And I feel like I know him because I've listened to him so much.
00:03:15.280
So I was definitely nervous when I got in the chair.
00:03:18.460
But, I mean, you're sipping on whiskey, so you calm down pretty quick.
00:03:25.720
But, I mean, he will, like, he will really drill someone that he disagrees with, but he
00:03:33.500
And that's what I noticed in his conversation with you about abortion.
00:03:36.720
I'm sure a lot of people have seen that clip now, that he was really pushing you for a
00:03:42.460
minute, especially when he made it personal, when he was talking about, so you're telling
00:03:45.780
me, my 14-year-old daughter, but you really held strong.
00:03:49.440
Like, how were you feeling during that exchange?
00:03:51.500
I had a sense when that whole conversation started, and I don't know if you saw the
00:03:56.380
whole thing, like, there were clips that were taken of it, but it went on for, like,
00:04:00.440
And at one point, I finally said, you know, if you want to move on to something else, we
00:04:04.760
You know, I was trying to give him an out so that we could change the subject and move
00:04:09.620
Not because I don't want to talk about that subject anymore, but, you know, it was definitely
00:04:15.080
a, it was probably, it was the most notable exchange in the interview.
00:04:20.200
It's the one that got the most attention, and I think for good reason.
00:04:23.880
There are a lot of people on the right who will defend life, but not in all circumstances.
00:04:29.960
You know, they will be willing to make exceptions, and they are willing to try to compromise in
00:04:35.780
some cases, except, especially these extreme cases where you're talking about, like, a teen
00:04:47.400
You know, I think that if life is valuable in some circumstances, it's valuable in all.
00:04:51.840
It's not just valuable if the mother wants the baby.
00:04:53.880
It's valuable even if she doesn't want the baby.
00:04:56.300
It's valuable even if the circumstances that brought the life into existence were tragic
00:05:04.180
Um, so, you know, it was, it was definitely tough when he made it personal, because Joe
00:05:09.840
So he was talking about how, you know, you have no right to tell my daughter.
00:05:13.720
And, uh, and it's not, you know, I, I did, um, I was very conscious of the fact that there's
00:05:18.500
going to be a lot of people watching that and listening to it.
00:05:20.800
And I wanted to just stay calm and reasonable and not get sucked into kind of the emotional
00:05:26.560
And, uh, because that's really the point of bringing up that case, that's an emotional
00:05:31.120
It's, it's going for that outlier, the really crazy circumstance and trying to get a wedge
00:05:36.560
in so that you can, you know, put the whole pro-choice argument through on the back of
00:05:41.780
And, uh, and I tried not to get sucked into that and just stick to my guns and say, look,
00:05:46.180
you know, if it's wrong to intentionally kill an innocent human life, it's wrong even in
00:05:50.700
And that's not, it has nothing to do with my right to say that to your daughter.
00:05:54.500
It has everything to do with the right of every human being to live.
00:06:00.700
And you could tell that he realized that he kind of needed to back off a little bit, that
00:06:06.080
you weren't going to back down and that you weren't going to kind of acquiesce or change
00:06:10.720
your definitions or narrow your definition of life.
00:06:13.700
And that's also what makes him really good is that he kind of like his podcast is kind of
00:06:20.500
Like there's those really tense moments where you're like, oh, I'm not sure if those
00:06:32.040
I can imagine that was like a little intimidating, but you didn't seem like you were off your
00:06:37.420
I tell everybody, I'm just glad that I didn't like, uh, say something really stupid and
00:06:45.520
I'm really glad that you didn't throw up or else that would have been like the viral
00:06:49.840
moment and no one would have seen you defend life.
00:06:52.620
It would have just been like Seth Dillon throws up all over Joe Rogan's podcast.
00:07:00.180
Um, what's the reaction that I know you've gotten a ton of support from the right, but
00:07:03.420
I've also seen some crazy messages and comments that you've gotten from other people who I
00:07:09.400
It's, you know, I, I'm sure you get this all the time too.
00:07:13.720
It's, it's, you got the two extremes, you know, the, the unbelievable support and praise
00:07:19.880
and, uh, reinforcement and just the thank you, thank you, thank you for standing for
00:07:24.700
I mean, some of the most beautiful messages that I've gotten were from, uh, people who,
00:07:30.020
um, their mother was raped and that's how they were conceived.
00:07:33.760
And they're like, look, thank you for standing for life.
00:07:36.360
You know, I wouldn't be here if my mother had aborted me.
00:07:41.340
And then there's also the, you know, obviously the vitriol, the hatred, the, uh, the cursing
00:07:49.000
Um, one, one guy even actually threatening my family and saying that he would come after
00:07:53.260
my kids, you know, that had to be reported to police.
00:07:55.220
So, um, it's, it's a, it's one of those subjects, you know, this is, this is one of
00:08:01.200
And it's, it's crazy because, and when I, when I posted that, I actually posted an image
00:08:06.700
And I said, you know, all I said was we shouldn't kill babies.
00:08:14.940
That's like my, my, my, my argument is not that women shouldn't have rights or, you know,
00:08:20.740
that women are less than men, or, uh, I'm not saying something egregious and outrageous.
00:08:28.040
So yeah, why is, why is that getting people so fired up?
00:08:31.700
Well, it's just like any other idol, I guess, when you go after someone's idol, no matter
00:08:36.800
how obvious your statement might be, that's going to make something really angry when you
00:08:40.940
try to change or take away the thing that people worship, which I think an abortion is
00:08:45.920
not really necessarily worshiping dead babies, but worshiping the God of self.
00:08:50.460
And the God of self does demand, you know, killing and sacrificing good things on its altar.
00:08:58.580
Like testing people's idols just makes them angry, which I guess is the same reason why
00:09:06.600
When you say something like, Hey, I actually don't think that we should mutilate the genitalia
00:09:11.300
of children, something that seems like it would be really obvious, but has gotten people
00:09:17.340
And that's what I want to talk to you about next is that Libs of TikTok and everything.
00:09:21.980
Obviously, you know about all this and have been talking about this.
00:09:26.360
And that's why I want you to explain exactly what's happening.
00:09:28.880
She has been kicked off Twitter this most recent time for not just saying that it's wrong to
00:09:34.940
mutilate the genitals of children, but also showing that this is happening at hospitals,
00:09:40.040
And now she's been suspended from Twitter for that.
00:09:44.580
I mean, usually what happens, and you've had this experience yourself, I know, where
00:09:49.680
you'll tweet something and they'll want you to delete it, right?
00:09:52.340
And if you delete it, they'll let you back onto your account.
00:09:55.300
And in this case, they didn't flag any specific tweets and say that this tweet was a violation
00:10:02.580
Instead of doing that, there just seemed to be this mass reporting that was happening of
00:10:07.900
the account in the wake of this report that we did about this children's hospital that
00:10:12.080
was admitting to, by the way, on recording, performing hysterectomies on girls younger than
00:10:23.820
Two different people at this hospital said that they do that.
00:10:30.200
And of course, they immediately call it misinformation.
00:10:35.520
If it's false, it's the hospital that provided us misinformation, not us providing the public
00:10:44.020
But anyway, it's it's yeah, the whole thing with Twitter locking up the account, I think
00:10:50.300
there was just a mass reporting of the account and somebody at Twitter looked at it and decided
00:10:58.500
You know, they have a they have a process they go through.
00:11:00.760
They'll give you this seven day suspension before they give you the permanent suspension.
00:11:06.800
Sometimes they'll just perma ban you, but we expect the next will be a permanent ban.
00:11:22.040
And you tweeted because apparently so kind of in this whole realm of children's hospitals,
00:11:28.740
as you know, but maybe not everyone listening knows that there have been several journalists,
00:11:34.280
including libs of tick tock that have just posted on Twitter, the publicly available
00:11:38.640
information of several children's hospitals, Boston Children's Hospital, Seattle Children's
00:11:46.240
I mean, a lot of children's hospitals admitting that, hey, yeah, we're performing double mastectomies.
00:11:54.300
We are castrating young boys and girls who are confused about their gender.
00:11:59.560
People are being reported, as you said, for misinformation, even though, again, this is
00:12:06.060
And the media's reaction has been not to, wow, I can't believe these barbaric atrocities
00:12:12.480
But wow, I can't believe people are noticing and I can't believe people are talking about
00:12:19.000
And then there was apparently a bomb threat to Boston Children's Hospital.
00:12:25.240
Matt Walsh, Billboard Chris, all of these other people.
00:12:38.180
Well, this is their tactic for suppressing and silencing anybody that criticizes what
00:12:47.160
And by the way, it's not even necessarily criticizing.
00:12:49.880
You can just simply take one of the flyers for their family-friendly drag shows and post
00:12:54.160
And if it gets enough attention, they'll try to take you out, even if you didn't say anything
00:13:04.280
You could just simply share this stuff and they'll treat it.
00:13:06.800
It's the weirdest thing, the way that they respond to it.
00:13:10.120
But yeah, I mean, with this Boston hospital, you have this bomb threat that was called in.
00:13:16.180
And, you know, this has been this is this has been I'm suspicious of it for the reason that
00:13:21.420
it falls right in line with, you know, their narrative so so neatly.
00:13:26.060
You know, the left loves hoaxes that support their narrative.
00:13:31.180
They have more demand for hate than is in supply.
00:13:36.700
And I suspect that that's what's happened here, because the lives of Tick Tock and the
00:13:42.120
supporters of lives of Tick Tock who are outraged by these practices are outraged for one reason.
00:13:52.040
We're not going to incite people to threaten a hospital and our followers aren't going to
00:13:57.320
show up in or call in bomb threats at a hospital in the name of protecting children.
00:14:08.620
So I think that this was probably a deranged leftist who's trying to set the stage for saying
00:14:14.460
that we incited violence here because that's their argument.
00:14:17.080
Their argument is that any criticism of any of these things amounts to incitement to violence.
00:14:22.080
And I'm really curious why that standard doesn't apply to them, because they engage in all kinds
00:14:26.940
of target harassment against people that they don't like.
00:14:29.900
You know, Media Matters wrote a hit piece about you recently, right?
00:14:32.320
Um, does that not amount to incitement to violence against you?
00:14:37.640
You know, they, they, they said things that aren't true, but apparently that's just, you
00:14:48.440
We haven't lied about family friendly drag shows.
00:14:50.520
And I, when I say family friendly, please note the scare.
00:14:55.360
There's nothing family friendly about that garbage.
00:15:01.560
We just report the facts and they say that it's incitement to violence.
00:15:06.460
Um, and so this Boston thing is just the latest, uh, where they're trying to get us wrapped
00:15:11.000
up in this and say that, you know, it's not just lives of tick tock, but you mentioned
00:15:13.720
Matt Walsh and some others who had been really critical of this hospital and others.
00:15:20.640
And, uh, and they'll say that, you know, we're trying to incite violence again.
00:15:25.720
We don't, the argument that we hate people, we hate trans people, we hate gay people.
00:15:33.680
We care about children and we don't want children to be castrated, mutilated, sterilized, drugged.
00:15:41.180
And we want to draw attention to where this is happening.
00:15:43.840
And everybody, every reasonable person should be outraged if it is in fact happening.
00:15:49.800
If we, if we were wrong on that report, if these hospital staffers were admitted incorrect
00:16:01.660
Speaking of like the double standards for violence and family friendly drag shows, did
00:16:06.620
you see that story out of Roanoke, Texas that was having, they were having a family drag
00:16:12.560
I mean, what a dystopian phrase, almost as dystopian as gender affirming hysterectomies, but they
00:16:18.980
were having this drag show and kids were there.
00:16:21.480
I mean, basically strippers men and, you know, they're scantily clad with fake boobs on dancing
00:16:28.200
for tips in front of kids and Antifa self-proclaimed Antifa was standing in front and, you know,
00:16:36.420
black block with their guns, with their long guns.
00:16:40.760
They've got people up on the, up on the roofs looking for protesters protecting this drag
00:16:48.920
And they have signs that say like, keep Roanoke gay, whatever.
00:16:57.400
And yet we hear the president of the United States, people on MSNBC consistently say, oh,
00:17:05.580
The right is threatening violence if they don't get their way.
00:17:08.600
It's not even that that's just a little bit wrong.
00:17:13.240
So what are we supposed, what are, what are we supposed to do?
00:17:16.620
Like with this double standard of not just reporting, but of justice?
00:17:20.720
Like, how do we push back against a system like that?
00:17:25.640
Well, I mean, pointing it out repeatedly and on the biggest platforms possible is, is crucial
00:17:37.220
I think reasonable people can see right through this nonsense that they are, in fact, doing
00:17:43.780
I think the problem is that we're losing our ability to do that.
00:17:47.060
You know, it'll, it'll, it's only a matter of time before Libs of TikTok won't be on the
00:17:51.380
I mean, maybe the sub stack will last a little bit longer than the Twitter account.
00:17:55.180
Who knows how long sub stack will last if someone goes after the payment processor behind
00:18:01.880
You know, so this is where the, this is where the fight is right now.
00:18:05.380
We are fighting for the right to object to this widespread, um, depravity targeted at children
00:18:13.940
that this it's targeted at corrupting our children.
00:18:17.840
And we're, we're barely hanging on to the right to even object to it, which is crazy.
00:18:24.120
You mentioned a moment ago too, that, you know, it's so dystopian talking about these
00:18:27.200
phrases like family friendly drag shows and gender affirming care, which is a terrible
00:18:32.940
One of the more evil euphemisms I've ever heard for something that's really atrocious.
00:18:36.440
Um, we, I feel like we've become almost numb to how crazy it is.
00:18:42.540
I know you and I both think that it's really crazy, but we get so used to talking about these
00:18:46.760
You go back just a few years and these things would have been absolutely unfathomable by
00:18:52.980
And we're, we're, we're now like kind of numb to it that it's just, it's happening every
00:18:58.280
It wasn't that long ago that people were denying that there was such a thing as a family friendly
00:19:03.120
Like you go back merely weeks and people were saying, no, that's not even happening.
00:19:09.080
And it's so common that we don't even find it as objectionable as we should.
00:19:14.760
That's what's scary to me is we need to maintain this sense of outrage that this is really,
00:19:20.200
And we have to continue to stand up and fight against it with every breath that we have
00:19:24.440
and for as long as we can until they shut us all up.
00:19:39.040
They simultaneously say it's not happening and it's good that it is just like so-called
00:19:50.500
And if it is, but it's awesome that it's happening and you're a bigot for even, uh, for even bringing
00:20:00.520
Which continues to make y'all's job at the Babylon Bee kind of difficult.
00:20:05.720
Um, so you talked about like, who knows how long Substack is going to be there?
00:20:09.680
Who knows how long any of our accounts are going to be on Twitter?
00:20:12.060
I'm just like waiting for the day that I get permanently banned like James Lindsay, which
00:20:17.660
And I saw that Truth Social, according to Axios, um, they were kicked off Google Play, the Google
00:20:27.980
I don't know if you are, um, but they are, Google Play is saying, Google is saying that,
00:20:37.680
Obviously I'm having a hard time believing that a similar thing kind of happened to Parler,
00:20:43.420
And so it's like, where do we even go if the plot, if we can't even build platforms anymore?
00:20:53.080
I mean, yeah, that's the, it's the content moderation that they're getting knocked for.
00:20:56.260
You know, they don't have enough, they don't have appropriate procedures in place, according
00:20:59.720
to Google for dealing with, um, content moderation and taking down content that Google doesn't want
00:21:10.320
You know, you, you've got Parler and, uh, and, and True Social and some of these other
00:21:14.320
apps that, that want to be able to be distributed widely through these app stores, but they're
00:21:19.900
really beholden to the app stores and the, and the terms that the app stores put in place
00:21:24.680
for what they moderate and what they don't moderate.
00:21:31.120
And obviously these are big tech companies that have all the, you know, all these insane
00:21:37.420
progressive ideas, all this ideology that that's being shoved down everyone's throats.
00:21:42.800
They are affirming it and they're building it into their terms that you have to affirm
00:21:49.880
And so even if you want to have a free speech platform, you can't because you still have to
00:21:54.500
abide by their rules and you still have to do content moderation based on what they say
00:22:01.580
It means that there is no public square where free speech exists anymore.
00:22:06.340
And, uh, and so something needs to be done to deal with that.
00:22:09.000
I'm optimistic, honestly, that something will be done to deal with that.
00:22:12.060
I just don't know when it's actually going to happen.
00:22:17.720
Look, you know, Elon Musk's promise to, to turn Twitter back into the de facto town square.
00:22:29.660
I don't know if you necessarily have any inside knowledge, but is that off the table
00:22:37.660
Is there any hope that Elon Musk is going to take over Twitter?
00:22:50.060
Um, if I did, I couldn't reveal it, but I, I don't know.
00:22:54.800
I still, I can't make up my mind on whether or not Musk is, uh, is really legitimately
00:22:59.780
trying to get out of the deal or if he's just trying to work out a better deal.
00:23:03.580
Um, you could make an argument, I think, uh, you could make an interesting argument for
00:23:13.980
I think that it would be good if it was in his hands.
00:23:17.620
Like Bill Maher said, Twitter does need a new sheriff.
00:23:20.220
Get it out of their hands and put it, put it into somebody else as somebody who actually
00:23:25.600
And I know that, you know, Musk is by no means a, uh, uh, a conservative.
00:23:30.400
Um, but if he values free speech, then, then that's a good thing.
00:23:35.080
Um, but ultimately, you know, we can't, we can't depend on super wealthy, uh, uh, benevolent,
00:23:43.800
um, saviors to, to come in on a, on a horse and save us.
00:23:48.760
Um, there needs to be something that's done either through Congress or through the Supreme
00:23:54.140
There needs to be some kind of, uh, through the, you know, the appropriate channels where
00:23:57.880
we actually preserve the first amendment, where, where speech, where people are actually
00:24:05.220
I know we, as Republicans typically say, we don't want to rely on the government.
00:24:09.020
We don't want to rely on politicians, which I agree.
00:24:12.160
I would love to be in a position where we don't have to rely on people in power to advocate
00:24:18.620
But I mean, Roe v. Wade was overturned because of the political will, not just of the people,
00:24:23.960
uh, in the state, but also the, the lawmakers in the state who made a law that then made
00:24:34.500
We have to elect representatives who have the political and moral will to not just do something
00:24:41.180
about censorship and ensure that free speech is actually preserved, but also a lot of the
00:24:46.400
moral atrocities that we're talking about, which is why I like Ron DeSantis.
00:24:50.560
I mean, he's willing to do everything that is constitutionally allowed, all the tools that
00:24:56.140
are constitutionally available to him to use his power to push back against things that
00:25:06.260
I just think that that's the new era that we're in, that we have to kind of be comfortable
00:25:10.560
with politicians exercising power on behalf of good things.
00:25:16.580
And it's not the power of force where you're compelling somebody to believe something that
00:25:21.420
they don't want to believe or do something that they don't want to do.
00:25:26.340
Um, it's the power that's used to safeguard what's good and true and, uh, and preserve,
00:25:35.580
And so, you know, it's obviously it's twisted and, and, and distorted when they criticize DeSantis,
00:25:40.780
you know, they, they criticize him as being a tyrant who wants to take away your rights.
00:25:43.860
Um, but everything that I see that he's done has been a fight for the preservation of freedom,
00:25:54.720
I mean, we do need, we do need leadership with a backbone.
00:25:57.440
We need leaders who see these problems for what they are, who aren't, who aren't numb to
00:26:01.620
them, who aren't willing to give an inch and are, and are willing to fight and say, look,
00:26:05.840
we have to use the power at our disposal, uh, to preserve the good, the true and our freedoms.
00:26:11.200
Um, and if we're not willing to do that, I mean, if, if anyone's not willing to do that,
00:26:15.240
then we got to vote them out and replace them with someone who is.
00:26:20.740
You recently tweeted, which you've said this in a variety of ways.
00:26:25.500
Some people think we're improving morally by making fun of fewer things.
00:26:30.120
We're more depraved than ever because we're accepting and affirming what should be ridiculed
00:26:40.440
Some people would say, maybe people who identify as maybe the tone police, they would say, no,
00:26:45.880
we're not going to get anywhere by ridiculing people, by rejecting people, by making fun
00:26:54.020
We just need to, you know, engage with these people and engage with these issues.
00:26:57.920
Why do you think ridiculing these ideas is important for morality?
00:27:03.720
Well, yeah, I mean, that goes, I think you can make a moral case for mockery and I've
00:27:09.000
tried to make it, uh, over the last couple of years.
00:27:11.900
Um, I put it another way, the way that I said it on, on Rogan show was the, the absurd has
00:27:18.160
only become sacred because it hasn't been sufficiently mocked.
00:27:24.440
You know, we're talking about these insane ideas, these things that are so unbelievable,
00:27:27.820
but you know, we've become kind of numb to them because they're so commonplace now.
00:27:31.280
Um, they're not just commonplace, they're sacred, they're untouchable.
00:27:40.120
Well, because we took them seriously and it's not, this is not about, this is not about attacking
00:27:45.560
people personally and making them feel bad about themselves and bullying them into believing
00:27:51.760
This is about, uh, examining bad ideas that are harmful, that can hurt people, that will
00:27:58.520
impact our kids and holding them up to scrutiny, uh, criticizing them even harshly, brutally
00:28:09.040
If it's an absurd, insane idea that would be harmful if it played out in our society, then
00:28:14.360
we should be mocking it to the sidelines so that it's never adopted, so that it never becomes
00:28:18.580
popular, so that, so that kids, young people see it for what it is and laugh it off instead
00:28:24.980
Imagine if we had done that more effectively over the last several years.
00:28:28.240
I think that comedians, and I, you know, I was making this point with Rogan and he pushed
00:28:34.760
But yeah, I think comedians bear some of that responsibility, especially satirists, because
00:28:38.780
I put, um, like I was telling him, I put satirists in a different category than comedians.
00:28:42.820
You know, comedians, comedians tend to just make jokes for the sake of making jokes.
00:28:46.960
And oftentimes they're just really silly punchlines.
00:28:49.980
Um, they don't necessarily have any kind of moral concern behind them.
00:28:53.180
The satirist usually has a moral concern behind his jokes, though not always.
00:28:56.760
Some, some of our jokes are just pretty stupid and silly too.
00:28:59.520
Um, but when you're doing religious satire, especially, there is definitely moral concern.
00:29:04.320
Satire, religious satire is the marriage of wit and moral concern together and using humor
00:29:11.540
Um, and so, you know, when people, when people object to the tone, you get the tone police
00:29:16.640
or you get the people saying, oh, Jesus wouldn't make fun of others and he wouldn't mock and
00:29:21.480
It's not about attacking people and making them feel bad.
00:29:25.920
We're not running around with a knife, trying to stab someone and hurt them.
00:29:29.880
We're more like, this is the analogy I use with Rogan.
00:29:32.980
We're more like the surgeon who's using a scalpel, trying to excise something bad before it kills
00:29:39.040
You know, it's like, it's like trying to cut out these social cancers.
00:29:44.140
Well, in part, because we didn't do enough to push back on them, to ridicule them.
00:29:49.040
So, um, I, I honestly, I think, I think, you know, who knows, maybe I'll write a book on
00:29:53.160
that subject because I think it's an important topic.
00:29:55.040
I think that we need to do more to defend, not just the rightness of, of ridicule in certain
00:30:01.660
Um, but the effectiveness of it and why we're suffering so much from, from being susceptible to,
00:30:06.500
especially in the, with the minds of young people, these really insane, harmful ideas
00:30:11.760
that are, that are, that are taking a huge toll.
00:30:14.400
If you just look at the stats on like teens right now, like how, how depressed teenagers
00:30:19.080
are, um, you've got so many things feeding into this.
00:30:22.020
Obviously there's the social media, all the, all of those trends, but we're, we are as a
00:30:26.520
society, we are purposefully and successfully confusing countless kids so that they don't even
00:30:33.320
know what and who they are and then the solutions that we provide to them irreversibly damage
00:30:40.040
And that is something that we're not going to subject to ridicule.
00:30:42.960
Is there anything more deserving of ridicule than that?
00:30:44.820
And I do feel like it's the younger generation that sometimes, especially like the liberal
00:30:59.960
younger generation that is, uh, the most averse to that kind of mockery and that kind of derision
00:31:08.060
because of one thing that you just mentioned, like the pressure that comes not just with social
00:31:13.780
media, but also being in school to go along, to get along and to put your pronouns in your
00:31:18.740
profile and to say, yeah, trans women are women.
00:31:21.280
You're not, your brain isn't even allowed to question it because you are so scared of being
00:31:26.900
bullied or being excluded or being called a bigot.
00:31:31.220
I mean, that's been true since the beginning of time that teenagers don't want to be excluded.
00:31:35.960
It's an awkward time, but especially with not just peer pressure, but with pressure from
00:31:43.520
the media, pressure from politicians, pressure from the culture at large to all be thinking
00:31:49.660
I think a lot of them are probably afraid to laugh at the things that maybe they know
00:31:57.040
And that's how, I mean, that's how the thought police works.
00:31:59.100
That's how like this dystopian language that we're talking about works.
00:32:02.660
It works to limit even your range of thought, not just your range of language, but your range
00:32:08.360
of what your mind is even allowing itself to imagine.
00:32:12.960
And that is why I think mockery, that is why I think satire is so important because it gives
00:32:19.700
your mind permission in like a funny way and maybe a seemingly more permissible way to think
00:32:26.560
about the things that you have been told are forbidden.
00:32:30.300
That's one reason why I think it's so important.
00:32:32.300
I also think that if you, you know, if somebody is, if somebody is telling you with a straight
00:32:36.740
face that two and two make five, and, and this is literally a person who's just abandoned
00:32:41.560
rationality on purpose, like they, they're not crazy.
00:32:45.260
They're just, they've bought into the importance of insisting that two and two makes five, even
00:32:54.780
You can't appeal to that person's reason because they've abandoned their reason.
00:32:59.760
If you don't have reason, you can at least ridicule it, make them look silly.
00:33:02.920
And, uh, and, and expose how ridiculous their, their way of thinking is to other people on
00:33:10.600
And I know, you know, this because you're, you know, you're out there in the public sphere
00:33:13.720
and debating people on Twitter and debating people on your show.
00:33:16.740
You're not always going to change the mind of the person that you're talking to.
00:33:19.620
In fact, you'll rarely change the mind of the person that you're directly talking to,
00:33:22.920
but you're definitely going to be influencing the people on the sidelines who are listening in.
00:33:26.520
Because a lot of them haven't made up their minds and they're just waiting to see who
00:33:31.700
And if you can make something either, if you can either refute it or ridicule it and make
00:33:35.520
it look silly, then you can influence those people.
00:33:38.820
I mean, it's, it's, it's not like a, it's not like you're resorting to ad hominem to avoid
00:33:50.020
And it's not even, it's, I would say it's not even ad hominem when it is, when it's
00:33:56.680
satire or when it's, I mean, sometimes humor, of course, can be a personal attack, but satire
00:34:04.000
I mean, you're making an ironic point about something that happens to be true.
00:34:07.380
Like I had people, some people, Christian women typically say that like my Elizabeth Warren
00:34:13.680
video where I pretended to go to the pregnancy center and say, this is so, you know, dangerous
00:34:17.740
and awful saying, you know, we're told to respect our leaders and this is disrespectful.
00:34:24.100
I'm like, well, it really had nothing to do with Elizabeth Warren herself.
00:34:28.120
It had to do with this very destructive and dangerous and damaging idea that the very centers
00:34:33.720
that are providing all of the resources that the left says that women need in order to choose
00:34:38.300
life, that Elizabeth Warren is demonizing them.
00:34:41.380
And of course it also like it, uh, doubled in the role of showing people like what a pregnancy
00:34:49.280
So I think that there is an absolutely necessary wasn't to make Elizabeth Warren feel bad about
00:34:55.800
You know, it wasn't, you weren't attacking her personally to, to go after her.
00:35:00.080
There's a, there's a, there's another point that you're trying to make.
00:35:02.160
And that's usually where they, that's usually where they get it wrong is they try to go after
00:35:05.580
your, misunderstand your motives and try to act like the most important thing in the world
00:35:14.980
If some feelings get hurt in the process of, of us, uh, protecting kids from being castrated,
00:35:27.800
I mean, your feelings, uh, feelings don't matter nearly as so much as children's, uh, bodies
00:35:32.980
and rights, um, uh, preserving and protecting them.
00:35:37.180
So, you know, and by the way, and this is an important point to the outrage, the feelings
00:35:48.860
They're pretending to be offended so that they can get you to apologize so that they can basically,
00:36:00.600
And I tell people all the time, you know, never censor yourself and never apologize to
00:36:05.320
these people who you, who you don't actually know, apologize to someone that you know and
00:36:09.000
love who you actually genuinely offend when you do offend them, but don't apologize to
00:36:13.540
random people on the internet who are using fake outrage as a tool to bludgeon you into
00:36:24.180
And I'm talking about in this context, never apologize for that, which you're not sorry
00:36:30.340
If you say something that, uh, you know, someone calls you, who knows racist, transphobic
00:36:40.160
And the only reason you're saying sorry is because people misunderstood you or took you
00:36:50.020
Give a sincere apology to someone that you care about and who actually will take your
00:36:57.840
I remember there was this, um, this happens, uh, gosh, like liberal woman.
00:37:03.860
But there is this woman who, she owns this like baby wrap company, baby carrier company.
00:37:10.820
And she was, she made this post, oh, I'm patenting this baby wrap.
00:37:19.140
And when it was just Instagram was worse than it usually is.
00:37:22.760
And all of these activists came to her page and said, you know, how dare you a white woman
00:37:29.580
You know, African women have been using this type of like baby wrap for thousands of years,
00:37:37.120
So she did this whole apology where she was crying and she was saying like, I've wanted
00:37:57.620
And she decided that she wasn't going to patent the wrap anymore, which is absolutely
00:38:05.360
And it just goes to show that the apology doesn't matter.
00:38:09.460
If she would have just moved on and not said anything, those people would have gotten
00:38:21.700
Well, and it's not just that they don't want really actually want your apology.
00:38:29.640
Um, but you also, you also do more damage when you apologize, because what you're doing
00:38:35.640
is you're feeding the mob and strengthening it and encouraging them that this kind of behavior
00:38:43.120
And so what you, what you're doing is you're, you're setting other people up to be in the
00:38:48.220
same position that you were just in by encouraging that mob to go after others and try to get that
00:38:54.020
So, you know, I think we, I think we take the power away from them when we refuse to
00:38:59.680
I wish there were more people who understood that because really, honestly, you can like
00:39:06.300
Uh, even the ones that you would think and hope wouldn't apologize.
00:39:09.420
Sometimes if they have a big enough platform and they're worried about losing sponsors or
00:39:12.860
whatever, there's a lot of pressure on them and they cave, they cave and, and, uh, and they
00:39:29.520
The average person can help with that because there was another story that happened right
00:39:34.820
There is this like a baby sleep trainer who has a huge social media presence.
00:39:39.800
And someone revealed that her family had given like $1,700 to Trump back in like 2018.
00:39:47.360
And so they took all of her proprietary videos that were behind a paywall.
00:39:51.180
They made them public so that she couldn't make money on them.
00:39:55.240
She had friends with big accounts that unfollowed her talking bad about her.
00:39:59.280
And at the time, all I did was post, Hey, um, you should message this woman to my followers
00:40:06.240
and just encourage her and just tell her that you support her and that you're thankful for
00:40:12.460
And she ended up sending me this like really long, wonderful, and like very encouraging,
00:40:17.880
but also it wasn't necessary because I didn't do anything, but letter just saying like, that
00:40:24.200
All of the messages from people saying, thank you, stand strong.
00:40:31.900
She was like, that is why I kept going and did not back down.
00:40:37.220
She came back to social media a couple of days later and said, I'm here to keep serving
00:40:47.220
So we can share the arrows, whether you're in the public eye or not, like you can share the
00:40:52.240
arrows with the people who are feeling the heat.
00:40:54.940
And instead of saying, hey, oh, I'm so glad that that's not me.
00:40:58.480
I'm glad that I'm not the one getting bullied right now.
00:41:07.680
And if everyone did that, maybe we would all have the courage to not back down.
00:41:13.720
That's a really good point because you need something to counterbalance, to counteract the
00:41:17.880
insanity, the pressure that's coming to silence yourself or to apologize or whatever.
00:41:36.120
You had Elon Musk on your show and you guys talked about a little bit about the Bible and
00:41:45.440
There is some pushback about like how you guys talked about Christianity and the gospel
00:41:51.960
Tell us a little bit about that, like what you kind of thought about the criticism and
00:41:56.600
how the conversation really went from your perspective.
00:42:04.100
I think any time you do something like this and it has this much visibility, you can't make
00:42:10.280
I think I think every time we publish a headline, somebody gets upset and offended by it.
00:42:14.980
Anytime we do an interview, somebody is upset and offended by something that we said or
00:42:19.700
In this particular case, though, you know, we you know, we did this we did kind of like
00:42:28.300
And and, you know, some people weren't happy with that.
00:42:31.140
They actually wanted us to just present the gospel really straightforward to Musk.
00:42:34.360
And and and and I think that I think that maybe some of the misunderstanding there is that
00:42:40.340
from our perspective, you know, we didn't have him on our show so that we could pull out
00:42:45.000
our Bible and try to get him to convert on the show.
00:42:47.980
We had him on our show so we could hear him out.
00:42:50.840
And, you know, we I think that we've done a good job so far of fostering and developing
00:42:55.220
a relationship with Musk and some good questions were raised in that discussion and there will
00:43:03.400
I mean, we've continued to stay in touch with him and in contact with him since then.
00:43:06.740
So I think I'm personally one of these people who believes when it's when you're approached
00:43:11.060
to evangelizing, sharing the gospel, apologetics.
00:43:15.880
There are a lot of intellectual objections that need to be cleared out of the way first.
00:43:19.860
There's a lot of relationship building that needs to happen.
00:43:22.280
I don't think that it's always advisable, necessary or right to just drive, dive straight
00:43:29.480
into a straightforward gospel presentation and try to get someone to pray the sinner's
00:43:35.280
In some cases, you can really turn somebody off and and and you could end up undermining the
00:43:41.520
relationship that's already there that would have given you a better opportunity down
00:43:46.100
So do I think that we handled everything perfectly in that interview?
00:43:50.280
I would go back and change some of those things.
00:43:54.220
It's not like you guys had been preparing for that for three months.
00:44:01.600
So I think we did as good a job as we can can do with it.
00:44:05.720
I don't really have regrets that, you know, where I'm like, oh, I'm so sorry.
00:44:12.360
We did a great job of preserving the relationship.
00:44:17.100
And I hope we'll continue to have more conversations with him.
00:44:19.200
We did talk with him offline after the interview was over for almost an hour.
00:44:21.980
So there was more conversation that happened off camera than you saw on camera there with
00:44:27.500
So, you know, people love to be critics and and that's fine.
00:44:33.440
But, you know, there are plenty of conversations that I've had that I am sure people are like,
00:44:42.860
And I mean, it's it can be difficult in the moment.
00:44:47.620
I mean, I'm confident in the Lord's sovereignty and we don't have time to get into the debate
00:44:53.700
But I think we both agree that the Lord is sovereign and hopeful that he can use any seeds planted
00:44:59.860
in that conversation, water them and give them growth.
00:45:05.580
And he talks about how not every conversation or every discussion or debate is necessarily
00:45:10.640
about winning, but about doing everything you can in that particular conversation with
00:45:17.320
that interaction to plant good seeds, make them start asking questions and thinking for
00:45:23.520
You just don't know what the Lord can do with that.
00:45:25.620
So I do wish I do wish like in that conversation when he brought up that he thinks that Jesus
00:45:30.060
is a good moral teacher, for example, and he brought up some, you know, love your enemies,
00:45:34.780
you know, turn the other cheek, that kind of stuff and doing others as you would have
00:45:40.660
You know, he's citing some of these things and saying, you know, Jesus is a good moral teacher.
00:45:43.440
And I do wish in hindsight that I had pressed him on that to say, well, yeah, he also, he
00:45:49.680
I mean, is he, you know, and pull out the whole, you know, liar, lunatic Lord thing, the
00:45:54.160
Is he, if he, if he made these outrageous claims about himself and he was either lying or
00:46:00.320
insane, then he certainly wasn't a good moral teacher, you know, so, so you've got to, you've
00:46:06.020
got to figure out exactly who he was based on what he said.
00:46:08.740
Um, and I think that, you know, the best explanation, uh, from my perspective is that he was who
00:46:14.960
And that's why I can't, I can call him a good moral teacher is because he was an honest
00:46:18.480
person who was very sane and happens to be the son of God.
00:46:22.220
Um, Musk would have to wrestle with that and try to decide, you know, how do I, how do I
00:46:25.900
account for the fact that I consider him a good moral teacher while he's out there claiming
00:46:31.360
Uh, you know, that, that doesn't make any sense.
00:46:33.380
The guy's, the guy's a madman or a liar if he's, uh, if he's not actually God.
00:46:36.680
So, um, I would have pressed him on that, but you know, we didn't have time.
00:46:49.140
Um, you, one thing I want to ask, did you think when you decided to start running the
00:46:54.200
Babylon Bee that you would be as busy as you are like out front speaking at these conferences,
00:47:00.260
doing these interviews, like as much traction as you get on Twitter, like, did you think
00:47:06.120
that you would kind of become a spokesperson, not just for satire, but really for a lot
00:47:18.400
I thought I would be kind of behind the scenes running the business and, um, and that we would
00:47:24.560
But I think with, with all of the attacks on us, there needed to be a response.
00:47:28.400
Um, you know, we needed to have our wits about us and we have to, as much as we want to keep
00:47:32.380
things light and bring levity to the situation, there are things that we have to deal with
00:47:36.980
Um, and I think there's been a lot of opportunities for me to go out there.
00:47:40.020
The conversation with Rogan was a really good one, um, to, to give a, you know, a serious
00:47:46.060
take and perspective on these real issues that are impacting us.
00:47:57.320
And we found ourselves on the front lines, defending all of these things.
00:48:00.900
And so, no, I never, I never expected to find myself, uh, in the midst of a battle for the
00:48:05.660
preservation of freedom and the restoration of sanity.
00:48:07.620
That was not, uh, on my radar, but we're here and we're just, we're doing our best to, uh,
00:48:12.580
to speak the truth, to do it boldly, uh, to use humor as much as possible.