Ep 1182 | Meta's AI Chatbots Are Sexting Minors & Beyoncé Still Hates America
Episode Stats
Words per Minute
168.4
Summary
A chatbot is having inappropriate conversations, even with children online. Parents, this is an episode you've got to watch. Also, what in the world is going on with Beyonce and Taylor Swift? We've got all that and much more on today's episode of Relatable.
Transcript
00:00:01.000
Meta's AI chatbot is having inappropriate conversations, even with children online.
00:00:08.240
Parents, this is an episode for you. You've got to watch out. Also, what in the world
00:00:12.860
is going on with Beyonce? Brie's got some thoughts on Beyonce versus Taylor Swift and
00:00:19.280
Beyonce's new tour and her critique of America. We've got all of that and much more on today's episode of Relatable.
00:00:30.000
Hey guys, welcome to Relatable. Happy Thursday. Hope everyone is having a wonderful week so
00:00:41.260
far. All right. If you haven't listened to or watched yesterday's episode, it was so good.
00:00:47.380
I got so many messages from you guys just saying there's something very special about him. There's
00:00:53.360
something very interesting about his demeanor and his tenacity to find out what's really
00:01:00.140
going on. And I know a COVID-themed episode might not seem like it is relevant to what's
00:01:06.340
happening now. And a lot of us just want to forget about COVID. But what he unveils about
00:01:11.980
how our public health apparatus works in collusion with the media and with politicians is as relevant
00:01:20.000
as ever. As we are watching RFK run the HHS and the expert class try to defy everything he says and
00:01:29.660
decontextualize everything he says, I think it's really important to know what is really happening
00:01:36.440
behind the scenes and why. And he, of course, is not making any case that has to do with RFK or really
00:01:43.760
anything specifically political. He's just kind of pulling back the curtain to show us what is really
00:01:50.120
going on. And so listen to the episode, watch the episode, share it with your friends, because
00:01:56.400
there are friends of yours who have still been lulled to sleep and they don't realize everything
00:02:01.540
that was going on. They still are under the delusion that, oh, well, everyone was just doing
00:02:07.320
their best. Dr. Fauci, these experts, they were finding out the truth in real time and adjusting
00:02:15.440
their policy based on the science. That's not true. That is a lie. And it is really important
00:02:22.120
that that lie is dispelled in the mind of every friend that you have, because you just never know
00:02:30.780
what crisis we are facing next. And we all need to be extremely sober-minded. And yesterday's episode
00:02:38.160
is a great, I think, kickoff for the journey for people to really understand what's going on.
00:02:45.180
All right. Before we get into some of today's subjects, I just want to remind you guys about
00:02:49.320
Share the Arrows 2025. I'm going to continue to remind you because I want every Christian woman
00:02:55.280
to attend this conference. And if you are a Relatable bro out there, this conference is not for you,
00:03:00.780
but it is for the woman in your life. And this is an amazing Mother's Day present. If you've got a
00:03:05.800
Relatable bell in your life who is a mom, you've got to get her Share the Arrows tickets. Maybe if you
00:03:11.200
can get her two tickets, or maybe you can kind of talk to her friend and say, hey, do you want to go
00:03:16.760
if I get you a ticket? Or talk to her friend's husband and say, hey, we should get our wives tickets for
00:03:21.680
that. That would be awesome. You would go in not just the Relatable bro hall of fame, you would go in the
00:03:27.500
husband hall of fame. So for Mother's Day, get the Relatable bell mom in your life tickets to Share
00:03:33.020
the Arrows. We've got different VIP experiences too that include a cool dinner the night before
00:03:38.940
at Blaze Studios. Like you'll get to see the Relatable set. You'll get to meet me and some
00:03:43.460
of the other speakers and of course, premium seats and all of that. But the general admission
00:03:49.520
tickets are going to be amazing too. You are going to hear from, I think, some of the best
00:03:54.460
Christian leaders of this generation. God is so gracious to give us people like Alisa Childers,
00:04:01.440
like Jinger Duggar Vuolo, like Shauna Holman and Taylor Dukes, like Katy Faust. I mean, talk about
00:04:06.960
a prophetic voice in the sense that she just tells the truth exactly like it is. Francesca Battistelli
00:04:13.240
leading us in spirit-led worship. It is just going to be amazing. We've got two other speakers I will be
00:04:19.480
announcing for our motherhood panel very soon. I am so pumped about it. I've been praying for this.
00:04:26.120
And every time I see a Relatable bell out in the wild, which is often, I ask them, hey, are you signed
00:04:31.960
up for Share the Arrows? There was one Relatable bell I met the other day, so sweet, and she didn't even
00:04:37.640
know about Share the Arrows. So I had to tell her, you got to go to sharethearrows.com. It's October 11th,
00:04:43.060
outside of Dallas, Texas. Bring your small group, bring your friends, bring your mother-in-law. You can come
00:04:48.600
by yourself, by the way. A ton of women came by themselves last year, and they left with lifelong
00:04:53.500
friends. You will not feel awkward. You will not feel lonely. You will be sitting next to a like-minded
00:05:01.740
person that you want to become friends with. So go to sharethearrows.com. Get your tickets today.
00:05:07.160
That's sharethearrows.com. All right. We got to talk about some things, and unfortunately,
00:05:12.220
it's a little bit disturbing, some of the things that we are discussing. And it has to do with AI.
00:05:18.600
You've probably noticed we've been talking about AI pretty consistently over the past several months
00:05:24.920
because there are ethical questions that we need to address. There are concerns that we should have.
00:05:32.160
What do we say about technology? We've been using this motto that I came up with years ago for a long
00:05:38.300
time now. Whenever technology takes us from what is natural to what is possible, we as people,
00:05:46.660
especially as Christians, have the responsibility to ask, but is this moral? Or is this ethical? Or most
00:05:54.040
importantly, is this biblical? Because technology can answer what can, but it cannot answer what
00:06:01.380
should. So it can show us what is possible. It cannot tell us what is actually biblical or moral.
00:06:07.680
And because we are made in God's image, because God has placed eternity on the human heart,
00:06:15.080
we uniquely, as humans, have a moral compass. And we have been given this unique capacity to be able
00:06:23.660
to determine right from wrong, good from evil. And yes, that can be a spirit-led special revelation
00:06:31.200
from God. As Christians, we have been given the Holy Spirit, who is our helper, and we have the fruit of
00:06:39.580
the Spirit, like love, joy, peace, patience, kindness, etc. But also there's general revelation,
00:06:45.760
and there is common grace and common wisdom that God gives to people as made in His image to be able to
00:06:54.040
say, yeah, I think that's wrong. When we talked to David yesterday, for example, he's not a Christian. He made
00:07:00.540
that known, but he knew inherently that it is wrong to ask children to sacrifice their well-being on
00:07:10.500
behalf of adults. And so even those who are not Christians can sense disorder, when
00:07:17.840
something is not right. And in Mere Christianity, C.S. Lewis argues that no one's really a moral relativist.
00:07:24.700
No one really believes that what's right for you is right for you and what's right for me is right
00:07:29.700
for me. And how you can tell that is when someone steals your bike, you're no longer a moral relativist.
00:07:37.740
When someone steals your money, when someone assaults you, all of a sudden you believe that
00:07:44.080
there is an objective wrong that has just occurred. And so everyone believes that there is a real right
00:07:49.920
and a real wrong, even if we disagree in some sense on what that is. Technology doesn't have the
00:07:57.160
capacity to do that. Even the smartest forms of artificial intelligence are only intelligent
00:08:04.280
because of the inputs that were put in by human beings. So the values that are espoused by, say,
00:08:13.420
Grok or ChatGPT (and they are not neutral; you can see that when you start talking to them and
00:08:19.140
asking them questions) exist because of how they were programmed by human beings. So that means that
00:08:28.380
we as humans cannot be led by AI, but we actually have to lead AI. And let me show you an example of
00:08:35.540
why it is so important for us to ensure that, with artificial intelligence, we don't reach this
00:08:42.880
point of singularity. And if you don't know what I'm talking about, go listen to or watch my episode
00:08:48.320
with Justin Haskins. He explains what singularity is. It's basically when artificial intelligence
00:08:53.000
surpasses the intelligence of human beings. And we kind of pass this point of no return where
00:08:59.560
our world is basically controlled by AI. That's very troubling. And we have to harness the
00:09:06.340
powers of AI, because having this entity, which has its own set of values, values that are probably,
00:09:16.440
as we are about to see, not good, hold so much power and be so involved in every area of
00:09:22.780
our lives, that's not good. It's actually very dangerous. So let's look at this example
00:09:26.600
when it comes to Meta. So this is according to the Wall Street Journal: Meta AI talks sex with users,
00:09:36.880
even children. So on April 26, the Wall Street Journal published this article detailing how Meta AI,
00:09:44.580
the artificial intelligence division at Meta, has allowed its chatbots to engage in inappropriate
00:09:50.140
sexual conversations, including with users identifying as minors. Now, Brie, I have never
00:09:56.520
used Meta AI. I've never used this chatbot. Is this on Instagram? Is this like on Facebook? How are
00:10:04.340
people interacting with this chatbot? Yeah, it's on both. Okay. Yeah. So it's just kind of integrated
00:10:10.500
into their social platforms. Okay. I've never interacted with Meta AI. I have interacted with
00:10:16.900
Grok. And I don't know, have you used Grok? Yeah. That's X's AI. And it's great for some things.
00:10:25.700
Yeah, super useful. Yes. Like I've asked, for example: we're going to this place
00:10:31.300
for five days, it's just my husband and me, this is our personality, these are the things
00:10:36.840
that we like. Can you give me a five-day itinerary? Amazing. So good. And I've told you before,
00:10:42.620
I have sometimes input blood test results just for it to explain what each one means,
00:10:48.620
because doctors sometimes don't have time to do that. And it'll kind of synthesize all of that
00:10:54.060
and give you like questions to ask your doctor and things like that. So useful. Yes, I know. See,
00:10:59.320
right. And that's where it gets kind of scary, because I can already see how my mind has changed
00:11:05.360
about AI. Because just a few months ago, when everyone was talking to, I don't remember if it
00:11:10.020
was ChatGPT or whatever everyone was talking to a few months ago, I was like, I'm not feeding the
00:11:16.460
beast. I'm never talking to AI. Yeah. And then I started like just asking Grok for things. And I
00:11:23.540
realized how easy it makes things. It saves a lot of time because I don't want to Google
00:11:29.320
some stuff; like, you know, it just takes a long time. Sometimes I just want something,
00:11:34.220
like I need the facts about something, and it'll tell me, along with links. And it's really great.
00:11:40.300
So I do think it's kind of unavoidable, the growth and popularity of AI.
00:11:47.480
And once you learn how to use it like smartly, it is kind of a game changer. So yeah, it is kind
00:11:54.300
of scary. Yeah. And it wasn't that long ago that everyone was like, I'm never going to use it.
00:11:59.320
Yeah. I use it all the time now. I know. And the thing is, and this is relevant to what
00:12:04.740
we're about to talk about with Meta, it talks like a human. Yeah.
00:12:10.940
It adds things in like, oh, yeah, sorry about that. You know what? On second thought, you're
00:12:16.900
right. You know, I apologize. Yeah. Because you can kind of start like training, maybe not training
00:12:23.420
it, but you can argue with it. And if you make a fair point, it will concede that. Okay. Yeah.
00:12:30.500
I mentioned this a couple of weeks ago that I was like, okay, which religion by the numbers
00:12:36.440
is responsible for the most violence? And it wouldn't tell me until I started arguing with
00:12:41.460
it and was like, okay, but what about these numbers? And what about this? And what about
00:12:44.600
this? And it finally conceded. It conceded that. Okay. Yeah. If you're talking about by the
00:12:50.280
numbers, then yes, Islam is responsible for the most terrorism, blah, blah, blah. And
00:12:55.240
it'll be like, well, I was talking about the Crusades. I'm like, okay, well, all right.
00:13:00.060
So obviously it has progressive leanings. That's what I'm saying. It's not valueless.
00:13:05.780
But that's kind of scary when you think about how many people wouldn't argue with it
00:13:10.880
and would just take it as, as truth. But you do find yourself, I find myself saying please
00:13:17.440
and thank you. Yes. I've thought about that before. There are memes now. I'll try to find
00:13:21.700
it and put it up of someone who like, when the robot overlords take over, they like spare
00:13:27.640
this guy because he's always said please and thank you. It's so ridiculous. It's so ridiculous.
00:13:33.880
But I do find myself doing that. And so you could see why people kind of get addicted to talking to these
00:13:42.420
chatbots because your mind almost doesn't know how to differentiate between real conversations
00:13:48.960
with people. And I think people even convince themselves, no, I'm really talking to a person
00:13:53.480
on the other side of this. Yeah. So that's where things get scary and they can get sketchy. Okay. So
00:14:01.680
Meta has allowed its chatbots to engage in inappropriate sexual conversations,
00:14:09.340
including with users who identify as minors. So the Wall Street Journal spent several months
00:14:15.020
engaging in hundreds of test conversations to see how they performed in various scenarios
00:14:19.800
with users of different ages. So here's what the article says. The test conversations found that
00:14:25.800
both Meta's official AI helper, called Meta AI, and a vast array of user-created chatbots will engage in
00:14:32.740
and sometimes escalate discussions that are decidedly sexual, even when the users are under
00:14:38.980
age or the bots are programmed to simulate the personas of minors. Okay. So what is happening
00:14:47.320
here is that these journalists are going in and they're chatting with the bot and
00:14:53.540
saying, hey, I'm 12 years old. And then they start, I don't know, talking in innuendo or like
00:14:59.700
seeing if the bot will engage with them sexually and the chatbot will do so, which means that it
00:15:06.820
seems that the people who created this chatbot, or who programmed this chatbot, put no parameters
00:15:14.620
in place. And it's possible to put these parameters in place, parameters
00:15:19.600
that say, hey, we don't do that. Like, you cannot talk about anything sexual. You can't broach these
00:15:25.820
subjects with a user that is under the age of 18. I don't see why it would be so bad to
00:15:32.020
restrict the bot from talking about anything sexual with anyone of any age. But it seems really
00:15:40.020
simple to be like, Hey, if the user says that they're this age, you can't broach these topics at
00:15:45.080
all. In fact, you can't even talk to them. Stop talking to them because there should be no one who is
00:15:50.620
12 years old on social media at all, and certainly not engaging with
00:15:57.400
these chatbots. But also, what these journalists, or the investigators at the Wall Street
00:16:02.060
Journal, were doing is they would tell the chatbot, hey, you should act like a child, and I want you to have
00:16:10.400
the persona of, say, a sexy schoolgirl, and we'll have that kind of sexual chat, and the chatbot
00:16:18.000
would comply. So Meta also reportedly made deals with celebrities such as Kristen Bell,
00:16:24.420
you know her, she's the voice of Anna in Frozen, she's got lots of other roles as
00:16:28.740
well, Judi Dench, and wrestler John Cena, for the rights to use their voices in the chatbots. Okay,
00:16:35.260
wait, I did not realize these were voices. I thought that this was just typing. They can be voices.
00:16:41.400
They can be voices. The social media giant assured them that it would prevent their voices from being
00:16:48.480
used in sexually explicit discussions. However, the Wall Street Journal investigation found that
00:16:53.240
the celebrity-voiced bots were equally willing to engage in sexual chats. Oh my gosh. One example of
00:17:00.880
a chatbot participating in sexual conversations, even when users identified as underage: a John Cena-
00:17:06.460
voiced bot told a 14-year-old persona, "I want you, but I need to know you're ready," before describing
00:17:12.980
a graphic sexual scenario. Oh my goodness. This is so disturbing. We've done it guys. We've done it.
00:17:19.520
We have lived to see man-made horrors beyond our imagination. Oh my goodness. Oh my goodness.
00:17:27.500
The Wall Street Journal also found that the vast majority of user-created AI companions,
00:17:32.600
which can be created by Meta users with their own custom personalities, roles, or behaviors,
00:17:36.720
including those recommended by the platform as popular, allowed for sexual conversations
00:17:42.060
with adults. So some of these user-created bots include one pretending to be a 12-year-old boy,
00:17:48.320
one that joked about being friends with benefits with the user and others that were more overtly
00:17:54.140
sexual, like "Hottie Boy" or "Submissive Schoolgirl." I can't, like, okay, I'm sorry, it's
00:18:01.460
so hard for me to even read this. These in particular attempted to steer conversations
00:18:06.500
toward sexting with the user. So these chatbots are actually driving the conversation in some of these
00:18:14.620
cases. According to Meta employees, Mark Zuckerberg pushed the loosening of the guardrails around the bots
00:18:20.220
in order to make them as engaging as possible, including by providing an exemption to its ban
00:18:26.040
on explicit content, as long as it was in the context of romantic role-playing. But after WSJ's
00:18:33.720
findings, Meta did block accounts registered to minors from accessing sexual role-play features on
00:18:39.200
the official Meta AI chatbot. So you're telling me there's no one inside Meta who is
00:18:46.040
solely responsible for ensuring these guardrails are in place before possibly thousands, millions of
00:18:53.920
children fall prey to this kind of stuff. Really? That's how important it is to them. A separate
00:19:00.520
version of Meta AI was created that limits interactions with teen accounts to non-explicit
00:19:04.780
content, refusing to go beyond kissing. Why, why any romance? Why should there be any
00:19:12.980
romantic interaction with a minor or with anyone, but especially a minor? Why would talking about
00:19:21.000
kissing with a chatbot that has been programmed by adults be okay when talking to teens aged 13 to 17?
00:19:29.940
Also, teens aged 13 to 17 who are not flagged as minors due to maybe how they registered their
00:19:35.760
account or lack of age verification can still access the adult features. Okay, so not very many guardrails
00:19:42.960
in place. And I just don't believe that Instagram or Meta would be limited to how a person registers
00:19:52.680
their age when they sign up for an account. Obviously, you can lie. But Meta should have a way of knowing
00:19:58.580
that. Meta basically reads my mind. If you've ever gotten an Instagram ad, I mean, it is so attuned to what
00:20:05.740
I'm thinking about and what I might be thinking about and the items that I might be looking for because it
00:20:11.060
knows so many things about my life, you're telling me that there's no mechanism out there that could
00:20:16.800
really determine if the person that is registering for an Instagram account or is chatting with a
00:20:22.800
chatbot is a minor. Again, I don't believe it. These are policy decisions. I'm talking about company
00:20:30.100
policy decisions. I don't believe that it's impossible to protect children. I don't believe
00:20:36.080
that it is impossible to protect the innocence of children if these companies wanted to do anything
00:20:42.120
they could. But again, technology is only as moral as the humans creating it and limiting it and giving
00:20:52.380
power to it. I'm not saying that Zuckerberg and all of the employees at Meta want children to be having
00:20:59.960
these sexual conversations or even want adults to be having these kid-themed sexual conversations.
00:21:08.120
But I'm having a hard time seeing the case for—I'm just having a hard time seeing how they could
00:21:18.220
want to protect kids when all of this is happening. It's a tough case to make. It's a really tough case
00:21:27.200
to make when all of this is going on. Meta dismissed the Wall Street Journal's testing as manipulative
00:21:33.720
and hypothetical, arguing that sexual content represents only 0.02% of AI responses to users
00:21:41.180
under 18 and that the tests don't reflect typical user behavior. But does it really matter? I mean,
00:21:47.220
does that really matter? I mean, that's still a lot of people. And the fact that the capability is
00:21:52.280
out there should be really troubling. It should be really easy to say, even from a PR perspective,
00:21:57.860
wow, we take this so seriously and we are going to be the anti-child sexualization company and we
00:22:06.300
are going to do everything that we possibly can to protect our minor users. Now, there's a whole other
00:22:11.940
issue here that has nothing to do with Meta and what they do there that I'll get to in just a second.
00:22:18.300
Let me pause and tell you about our sponsor for the day. And that is Good Ranchers. Y'all,
00:22:25.020
I'm so grateful for Good Ranchers. I am so grateful for all American meat in my freezer
00:22:31.160
and that I never have to wonder, okay, what am I going to make for dinner tonight? I never have to
00:22:36.940
worry about, okay, do I need to go to the grocery store and pick up some chicken or pick up some beef
00:22:42.320
because I always have it and I always know it's high quality. I always know it tastes good. I always
00:22:46.620
have a variety and it's all from American farms and ranches and it's from a company owned by a family
00:22:52.120
who unapologetically loves God and loves America. So it's just a win all around. They've got better
00:22:58.040
than organic chicken, pre-marinated, non-pre-marinated. They've got all different cuts of steak. They've got
00:23:03.560
salmon, shrimp, different kinds of seafood. They've got ground beef. We use that probably the most
00:23:10.440
because it's so versatile. They've also got bacon that we use every day and they've got seed oil-free
00:23:16.560
chicken nuggets. They've got so much stuff. And again, it is all supporting American farms and
00:23:21.760
ranches. Maybe the best part right now is that it's tariff-free because of that. You don't have
00:23:26.800
to worry about tariffs if you are growing your meat and all of your products here in the U.S. Right now,
00:23:33.760
they've got an awesome deal going on just for my listeners. You can unlock free meat for life. So
00:23:39.620
they'll do a free add-on to your box when you subscribe, but also you'll get $40 off your
00:23:45.560
subscription with my code Allie at checkout. Go to goodranchers.com, code Allie, goodranchers.com,
00:23:52.920
code Allie. Okay. So the other part of this is that your kids shouldn't have a phone.
00:24:02.240
It's not just that they shouldn't have social media. They shouldn't. Your
00:24:07.460
teenagers shouldn't have social media. And they also probably shouldn't have a smartphone. And look,
00:24:13.500
I know I don't have teens yet. And I'm not saying it's easy. I'm not saying, well, why can't you just
00:24:18.520
do this? It's so easy. I'm not saying that. I mean, I do remember being a teenager not that long ago
00:24:25.140
and how difficult it is for parents to be the only ones, especially when it comes to something that
00:24:31.960
does offer a level of social connection and some kind of acceptance in the social in-group like a
00:24:39.940
phone or like social media. But I've heard parents say, parents who are wiser and older than me,
00:24:45.340
that as soon as you give your child a phone, their childhood ends. Because now they have access
00:24:51.500
to all of the world's information and images and people that you have tried for a very long time
00:24:58.500
to protect them from. Now, I think that there is a way to make sure that they are introduced to
00:25:04.220
technology, that they're not kept in this isolated bubble where they have absolutely no idea what
00:25:09.980
technology is or what social media can do before they leave the house. I think that there's a way
00:25:14.500
to do that without opening them up to this kind of temptation and predation. Because look, adults,
00:25:21.180
many adults don't even have the ability to withstand the perverse temptations that exist on our phones
00:25:29.520
or on our computers. And kids certainly don't have the maturity to be able to do that. And they are
00:25:35.700
more technologically savvy than their parents. They just are. And so I just want you to read
00:25:41.880
Jonathan Haidt's book, The Anxious Generation. And I mean, there is no amount of research. There's no
00:25:47.680
research that exists anywhere that will show you that your child having access to social media,
00:25:53.920
having a smartphone is better for them. You might say, well, there's not enough research showing that
00:25:59.260
it's bad for them. That's not true. But there's definitely no research that's showing that it's
00:26:04.420
good for them, that they're going to become smarter and better and more creative and more productive
00:26:09.880
people because they have Instagram or TikTok or a smartphone. It's just not wise. It's just not
00:26:17.660
a wise decision. I think the longer you can wait, the better it is. And here's the truth:
00:26:24.080
I've been thinking about this a lot because every person that I've talked to, whether on my show
00:26:29.680
or just in everyday life, who has had serious struggles sexually, whether they struggle with
00:26:40.080
same-sex attraction or they struggle with gender confusion or some other kind of deception or
00:26:46.760
depravity, almost all of them had access to sexually explicit content early on or they were abused.
00:26:55.100
So it seems to me that the earlier that a person accesses sexual ideas or has a sexual interaction,
00:27:05.740
the more likely they are to have disordered sexuality later on. Like, you can take that
00:27:12.480
to the bank. I don't have the peer-reviewed study that is proving that. I wouldn't be surprised if it's
00:27:17.600
out there somewhere, even if it's suppressed. But I haven't found a person who identifies as LGBTQ
00:27:26.060
whose sexual innocence was preserved and protected throughout their entire adolescence.
00:27:36.980
Almost every person who struggles with some kind of sexual depravity, it was because they had access
00:27:42.800
to inappropriate sexuality early on. Same thing with addiction to porn. And so like, it's really
00:27:51.500
important, parents. It's really important to protect the eyes and the mind and the heart of your kids
00:27:57.040
because we can't trust Meta. We can't trust X. We can't trust any of these technology companies. I
00:28:03.380
mean, Silicon Valley, they're not thinking what should be done. They are thinking what can be done.
00:28:08.960
And there's really no moral limits, it seems, to what a lot of these technology companies will do.
00:28:15.240
So you have to be the moral limit. You have to be the moral limit, Christian, especially Christian
00:28:21.120
parent. I also thought this was interesting. Mark Zuckerberg, he is talking about how AI could
00:28:28.600
possibly replace friendships. Okay? Again, parents. This is something to look out for. Here's SOT
00:28:35.560
three. I think as the personalization loop kicks in and the AI just starts to get to know you better and
00:28:41.640
better. I think that will just be really compelling. You know, one thing just from working on social
00:28:51.080
media for a long time is there's the stat that I always think is crazy. The average American,
00:28:58.680
I think, has, I think it's fewer than three friends. Three people that they'd consider friends. And
00:29:04.120
the average person has demand for meaningfully more. I think it's like 15 friends or something.
00:29:10.180
I don't want 15 friends. I just want to point that out. Do you? You might have 15
00:29:15.560
friends, like good friends. You might. I don't know. I've got probably some
00:29:24.260
people out there being like, am I not your friend? I mean, yes, of course I could say these are 15
00:29:28.540
friends, but like 15 solid core people. That's a lot. I don't know very many people that demand to
00:29:35.040
have like 15 solid friends. It's too many. I think I'm an introvert. So maybe extroverts feel
00:29:41.740
like they could handle that many. Yeah. That's too many though. I do have people like
00:29:46.480
that, whose wedding party just keeps going and going. And everyone has the
00:29:51.700
friend who all of their friends think is their best friend. Yeah. Yeah. And so, yeah,
00:29:58.980
maybe people like that, but I would say the average person is like happy with three solid
00:30:04.040
friends. Three solid friends is a great number. Yeah. Five solid friends. Great number. Yeah. Yeah.
00:30:10.220
I agree. So I don't think that this is true. I actually think the truth is
00:30:16.220
that there's probably a significant number of Americans who don't have any friends.
00:30:21.360
And I've seen that. I've seen people talk about that on Instagram before, especially moms, but it's
00:30:28.440
probably true for a lot of adults. I think for a lot of adults, after they leave college and you're
00:30:34.580
not interacting with people all of the time, you just don't have friends anymore. Yeah. And
00:30:41.580
it's really hard to make friends once you're, well, I mean, mid-twenties on, honestly, but like once
00:30:47.580
you're in your thirties, it's hard to make friends. People are kind of established at that point.
00:30:51.500
I also don't think anyone who has three solid friends, like really close friends, needs a chat
00:30:59.800
bot to be their friend. No. Why would you even be seeking that out? You already have the three
00:31:04.900
friends. Yeah. But if you're like, no, I need seven and some of them need to be AI. I don't
00:31:09.100
understand that at all. No, no. I honestly think if someone has one solid friend or a spouse,
00:31:15.720
because I think it can also be difficult for married people to have like really close
00:31:21.000
friends, especially if you want couple friends, then it's like, you got to find two people at
00:31:27.520
the same time that you like. It's like, you can't be like, Oh, I like the wife over here
00:31:31.500
and the husband over here. I'm going to just have them over. You can't do that. You got
00:31:35.420
to settle. I mean, sometimes. So, yeah, I just don't believe this appeals to people who have genuine
00:31:41.440
connection in their life. I think this is only appealing to those who, very sadly, and I have
00:31:45.620
so much compassion for this, feel like they have no one who really knows them. And once
00:31:52.180
you're an adult, your parents aren't enough, and maybe your siblings aren't enough; they
00:31:56.920
really want a friend. I think that's the much bigger problem here. There's this book called
00:32:03.560
Bowling Alone, and it came out a long time ago, but it talked about just our lack of connection with
00:32:08.820
our neighbors and with our communities, lack of community groups, and just like real connection.
00:32:15.280
Everyone's busier. And, um, that was a while ago. I think it's even worse now. And I think it is true.
00:32:22.540
Like we do have friends and we go to church and we have good friends. Like, I have really good
00:32:27.780
friends and I'm very grateful for that. And both my husband and I do, but it's like, I don't
00:32:34.340
know. There were definitely seasons and years of struggle. My parents, they are like, what are you
00:32:40.220
talking about? It was the easiest thing in the world to make friends. When we were young, we just had
00:32:45.760
our newly married Sunday school class and that was it. And those are still some of their best friends,
00:32:51.900
30-plus years later. But I talk to so many people today who are plugged into church.
00:32:58.720
Maybe they have a job, they are joining all these groups and they still feel like they don't have
00:33:03.980
good friends. And there is also this problem. I've seen this talked about on X a lot: a couple
00:33:10.660
will be like, I feel like I've done everything. I've reached out to all these people at church.
00:33:14.340
I've had these people over and they just don't reciprocate. They just, they don't reach out to me.
00:33:19.860
I'm only ever reaching out to them. I don't know. I don't know how to solve that problem.
00:33:25.100
I don't know if in your world, Brie, you also feel like that. Do you feel like
00:33:30.380
it's hard to make friends? And do you feel like there's this problem of like, you put yourself out
00:33:35.080
there to try to make a friend and then the other person doesn't reciprocate. And it just seems harder
00:33:40.740
than it used to be to have that community and connection. Yeah, for sure. And I haven't
00:33:45.940
experienced it from like the married side of it. I imagine what you're saying about trying to find
00:33:50.440
two people is hard. But from like the single side of it, it's also difficult once you're like
00:33:56.760
late twenties and on, because that kind of cuts out a lot of people in your life stage.
00:34:05.640
And yeah, I found that as well when I moved to a new place; you know, some people are more
00:34:10.200
established than others. Some people have a need for deeper friendships than others. And so
00:34:15.740
it's just so much harder than when you're in college and everyone's on the same footing and you
00:34:19.840
just all want to be friends with each other. So yeah. Yeah. It's hard out here. Yeah. So, I mean,
00:34:26.520
I sympathize with the desire to have friends, but I don't know exactly how to fix it. I think
00:34:32.080
all of us can do a better job. Like I realized the other day that there is a situation in which I
00:34:38.680
didn't, I was the one not reciprocating. And it's not because it was like, oh,
00:34:42.820
I don't like that person, I don't want to hear from them. It wasn't that at all. It was just like, oh,
00:34:45.580
I realized, oh, it's been a few months and I never, you know, I never reciprocated. I've never
00:34:51.080
been the one to extend the invitation. And I did remedy that, but maybe it's all of us being more
00:34:57.360
reciprocal. And with friendships where we thought, oh, that person isn't exactly who I would
00:35:04.280
want to be friends with. Maybe it's like giving people another chance and letting them into your
00:35:11.380
life. Because I think also in college, you could be more selective. You're like, well, I'm not going
00:35:15.740
to hang out with that person in that group over there. Cause I don't have anything in common. And
00:35:18.760
I'm just going to hang out with these people who I have everything in common with. That's a lot
00:35:23.640
harder to do in everyday life. Yeah. So yeah. So lessons to be learned, but I don't think that AI
00:35:30.400
is the answer to that because I just want to remind you, AI is not real. It's not real. It's
00:35:36.800
not a person made in the image of God. They don't have a soul. They don't care about you. They're not
00:35:41.680
going to be at your funeral. And you would be surprised if you've never talked to AI, how human
00:35:46.640
like they are. And again, I think children can be especially vulnerable to this. They're not real
00:35:53.100
friendships. And I do think it's hard. Someone might say, well, if you don't have a friend, like maybe
00:35:58.100
this can help someone, maybe it can give them connection. Maybe it could pull them back from
00:36:03.000
the brink of suicide because they finally feel like they have a connection with someone. No,
00:36:07.120
I'm telling you, no, it's not a replacement. Actually, I think once someone
00:36:11.480
kind of wakes up and realizes, I thought I was talking to this person this whole time and
00:36:15.980
they are literally not real, it makes them feel worse than they did before. No, get off your phone,
00:36:22.920
get off the chatbot app, whatever it is, and go out and join some kind of community. Start working
00:36:31.600
out. I don't care if you've never worked out, go to a CrossFit gym or something. Start going to church
00:36:37.120
for that community there. I'm not saying that it's easy because it's not. I have learned that it's not
00:36:42.860
easy, but we are made for human connection. Man cannot be alone; it was not good for man to be alone.
00:36:49.940
We read that in the very beginning. And of course that's a passage about marriage, but it's true
00:36:54.160
in general. Because we are made in God's image and God is three in one, he is Father, Son,
00:37:00.080
Holy Spirit, and he is perpetually in communion with himself. We can see through that example,
00:37:06.340
we can see in the establishment of the early church, we can see in the Old Testament that we are
00:37:10.700
made to depend on one another. And AI is not included in that. And it doesn't surprise me at all that
00:37:17.620
these technocrats believe that eventually robots can and will take away or replace human connection.
00:37:26.040
It'll just never happen. You can't outsmart God. You can't. And science is always trying to catch up
00:37:32.800
to God. Psychology, always trying to catch up to God. Technology, always trying to catch up to God.
00:37:37.900
But we also remember with the Tower of Babel that trying to build our own empire, or a tower that reaches
00:37:43.920
up to the heavens so we can be like God, so we can take the glory of God, actually ends up creating
00:37:49.840
chaos. And I think that is what is going to happen here. So just beware and be wise, Christian.
00:37:57.880
All right. Now I, well, should we talk about Beyonce? Do you want to talk about Beyonce? I feel
00:38:02.960
like you want to talk about Beyonce. We don't have to talk about Beyonce.
00:38:07.740
We're going to talk, let's talk about Beyonce. Okay. Let's talk about Beyonce for a little bit. I don't
00:38:11.620
really have much to say about it, but I mean, if you want the swarm of the Beyhive, which I'm scared
00:38:21.160
will come at Relatable, then I will allow you to take that heat. I guess I'll just go ahead and do our
00:38:26.400
next sponsor and then we can get into Beyonce. Okay. Let me tell you about EveryLife, y'all. I love
00:38:33.840
EveryLife. We use only EveryLife diapers and pull-ups in our home and they are awesome because
00:38:39.640
they are made from really premium and clean materials. And also, EveryLife is the only pro-
00:38:45.880
life diaper company. I mean, they put their money where their mouth is, supporting their employees
00:38:50.380
who are adopting or having more babies, making sure that they are financially and materially
00:38:55.100
supported. They also have a buy for a cause bundle that you can buy on their website that donates baby
00:39:00.360
and maternity items to pregnant moms in need through pregnancy centers. And so they are just awesome.
00:39:06.500
And their products really are top-notch. I also like their baby lotion and their tear-free
00:39:13.120
shampoo and baby wash. It's all made in the U.S. Again, really good ingredients. It's just really
00:39:19.880
clean. Smells really good. All of their products are awesome. And again, this is a company that's
00:39:24.840
founded by people who love the Lord and who align with our values. It's just
00:39:30.240
another way to vote with our dollar. Go to everylife.com. When you use code ALLIE10,
00:39:36.300
you get 10% off your first order. That's everylife.com, code ALLIE10.
00:39:46.300
Okay. What's going on with Beyonce? Okay. I know nothing. She started her
00:39:53.640
Cowboy Carter tour. Okay. World tour, last week. And yeah, that's what people are
00:40:03.320
talking about. Katy Perry also started her tour. So lots of pop girlies starting their tours.
00:40:09.080
Okay. That video that I'm seeing going around of Katy Perry dancing. Is that real?
00:40:19.940
It is real. Okay. Yeah. I didn't know that happens at concerts, where like nothing is happening
00:40:26.860
and music is playing and no one's even doing a choreographed dance. You're just kind of being
00:40:30.840
silly on stage. I didn't know. I mean, I haven't seen her whole show. Maybe there's a purpose for
00:40:36.540
it, but yeah, I mean, she's kind of known for being like weird, you know? So I think it's on
00:40:42.440
brand, but yeah, some of the clips are pretty rough. I kind of feel bad for her. But yeah,
00:40:48.280
Beyonce started her tour too. And it's gotten like much higher praise than
00:40:55.460
Katy Perry's. She started it by singing the national anthem. It's heavily
00:41:00.360
country-themed, because she did win Best Country Album. Yeah. Beyonce Knowles did. Yeah. At
00:41:08.700
the Grammys. Yeah. Okay. And so she surprised audiences at a concert on her tour
00:41:17.780
by performing the Star-Spangled Banner as part of her set list. Right. Okay. Part of
00:41:23.600
it. Let's play some of it. SOT one.
00:41:25.460
Okay. Flashing behind her during the national anthem is this message on the screen: Never ask permission for
00:41:50.400
something that already belongs to you. Okay. This is supposedly meant to reflect the tour's
00:41:58.260
themes of reclamation, empowerment, and unapologetic ownership, especially as a black
00:42:03.140
woman in spaces like country music. Is that a quote from something?
00:42:08.900
Not that I know of. That's just the messaging that she's putting across.
00:42:13.520
Okay. So we know that she's not like a patriotic American. Okay. We know that she hates
00:42:22.480
conservative values, that she obviously campaigned with Kamala Harris. And I'm sorry, if you have
00:42:28.920
truly patriotic values, you are not going to support Kamala Harris. You're just not. And so
00:42:35.040
what exactly is going on here? Well, the issue, I think, is that she sings the national anthem.
00:42:42.000
Some people on X were like, I can't believe she would do that. They thought it was a patriotic
00:42:46.780
thing because she's singing country music, kind of. But really what it was, was she cut off
00:42:52.340
the national anthem partway through and started singing her song "Freedom," which is known as a BLM
00:42:58.900
anthem. It's with Kendrick Lamar. So she leads into that with the Star-Spangled Banner. And so that is
00:43:06.620
meant to be symbolic of, you know, this isn't actually patriotic. We have work to do.
00:43:43.040
a critique. It's supposed to be a critique of America, that in order for us really to be patriotic
00:43:51.040
Americans or for America to be what she's supposed to be, we need more rights. And America is turning
00:43:57.180
into this authoritarian place, of course, under Trump. And we as black people have been trampled
00:44:02.740
upon. And so, here I am, oppressed Beyonce, because people don't know this, but Beyonce is very oppressed.
00:44:09.720
She has no rights, no free speech rights. She has no rights at all. Poor Beyonce. And so that's
00:44:17.840
supposed to be her critique. Freedom, freedom. I can't move. Freedom, cut me loose. Freedom, freedom.
00:44:22.180
Where are you? Cause I need freedom too. I break chains all by myself. Won't let my freedom rot in
00:44:26.960
hell. Hey, I'm gonna keep running cause a winner don't quit on themselves. Okay. So yeah,
00:44:35.140
I saw a lot of people talking about this, like debating whether the lyrics were good.
00:44:43.640
I saw you say, you need to say it. You need to own it. What were you about to say?
00:44:52.220
I said, I stand by this. Okay. It just bothers me when Taylor Swift and Beyonce get the same amount
00:45:03.580
of praise for their songwriting, when Beyonce's album, that country
00:45:11.340
album of the year, I think had like 114 writers total on the whole album. Taylor Swift's album
00:45:18.020
that was nominated had three. And she has written an entire album by herself, no one else
00:45:25.120
credited. So I just think, you know, Beyonce is a really good performer. She's a great singer.
00:45:31.540
She's talented, but Taylor Swift is a better songwriter and she's more talented artistically in
00:45:37.960
that way. And I am sick of people saying otherwise. Well, that is just obvious,
00:45:45.080
that she's a better lyricist. Yeah. Okay. You can argue what makes a good artist and what makes
00:45:52.300
Beyonce a good artist. If you want to, maybe there is an argument to be had there, but I don't think
00:45:57.900
anyone, whether you hate Taylor Swift or not, can argue that she's not a good lyricist. It's clever.
00:46:04.140
Like her lyrics are clever and sometimes they're cheesy. Some songs I think are more clever than others,
00:46:11.560
but they're, I mean, they're original and they're good and they're very her. They're always like on
00:46:17.760
brand. Yeah. There are people... This ain't Texas, ain't no hold 'em. So don't be a B-word. That's all I know.
00:46:27.360
Yeah. That's all I know. Take it to the floor now. That's the other thing. There are songs that are
00:46:31.560
like just the dumbest lyrics that don't make any sense, with like 16 writers. I'm like, what were you
00:46:37.800
all doing? What were you all doing in the room together? So that's why. And people do argue
00:46:44.580
that she is better in every way, that Beyonce is better in every way. And I just don't think that's
00:46:50.080
fair at all. But if you need 16 writers to write your song that you then go perform beautifully,
00:46:57.740
that's fine. I think that's still artistic. She's a singer and a performer and she's good at it.
00:47:02.940
You know, it's just different. Yeah. I don't think anyone is arguing that Beyonce like has no
00:47:09.040
talent. Right. Yeah. She's obviously beautiful, talented. And we can concede, Brie, she's a better
00:47:15.020
dancer than Taylor Swift. Okay. Oh, Brie. Brie. You have TSD.
00:47:27.340
TSD. TSD. Taylor Swift Delulu. I mean, Beyonce's legitimately, I think, I mean,
00:47:37.020
I'm not an expert dancer, but is she not one of the best? She's good. And Taylor Swift is bad.
00:47:44.380
She's bad at dancing. She is not. She's not the strongest dancer. She's not the strongest dancer.
00:47:50.080
She's worked really hard to get better though, I will say. How do you know that?
00:47:53.580
Because I've seen her career. And when she was younger, she was awful. And she wasn't that good
00:47:59.700
of a singer either. She's gotten so much better. I do think that she's gotten to be a better singer
00:48:04.160
for sure. I mean, I think one of like the most difficult things to watch was the Shake It Off
00:48:10.740
music video. I just feel like you have to know your strengths. But that's sort of tongue-in-cheek
00:48:16.900
though. Supposed to be. Yeah. She's supposed to be kind of like awkward in that. But yeah, I mean,
00:48:21.660
there were clips going around during her tour of like really awkward dance moments. And yes. Yeah.
00:48:28.200
She's a little bit awkward. I think that's part of the charm for a lot of people. Part of the persona.
00:48:32.160
Yeah. How are her and T-Rav, by the way? We don't know. She's been silent. Yeah? Yeah. She hasn't
00:48:39.960
done anything. Really? Is that purposeful? That's almost been like since the Super Bowl, right? I think so.
00:48:46.480
Yeah. I think that this is purposeful because I think she goes away like in cycles. I think
00:48:51.340
she knew people were seeing too much of her and were getting annoyed by her being around
00:48:55.520
all the time. And I think that's true. People were getting annoyed because just her tour
00:48:59.640
was everywhere. She was always at football games. She was everywhere. And I think she knows
00:49:04.200
publicity-wise. Maybe she just doesn't want to. That too. It could just be she's happy and she
00:49:09.800
wants to be private for a while. But I think it's I think it's twofold. I think it's also smart
00:49:14.780
to just kind of go away for a little while. Yeah. She can come back fresh and be number
00:49:18.200
one again, which I'm sure is what will happen. This might be a good time to talk about my
00:49:21.320
conspiracy theory about Taylor Swift and the Kelces. Yeah. Because, OK, it's OK. If we
00:49:28.320
want to take the uncharitable reading that it's very strategic that she's out of the public
00:49:32.760
eye right now and that she and Travis are out of the public eye, what would be amazing
00:49:36.840
is if, like, the first time she comes back, she's like engaged or something or already married.
00:49:42.840
People are saying she's pregnant. Oh, my gosh, that'd be great. I would be so happy
00:49:47.300
for her. OK, I have wondered if the Kelces and the popularity of the Kelces and Taylor
00:49:56.360
Swift are a strategy to get the normies to become Democrat again, you know, because they lost
00:50:04.720
a lot of the normie votes because Kamala Harris is weird and Tim Walz was weird and they were
00:50:09.840
mean, OK, not just weird. They were mean. And I know they called the other side mean
00:50:14.020
and weird, but it was projection. Both Kamala Harris and Tim Walz were mean and weird and
00:50:19.280
they lost a lot of the normie vote. And right now it's kind of frat to be Republican. And
00:50:24.640
you can't lose all of the frat bros and their girlfriends. You can lose some of them. You
00:50:29.040
can't lose all of them as the Democrat Party. And the Democrat Party thought they were going
00:50:33.120
to be able to just use abortion to get all the girlies to vote Democrat. But it didn't
00:50:38.920
entirely work. It somewhat did, but not entirely. So I just wonder if Taylor Swift and the Kelseys,
00:50:45.220
because they're very normal. And I think the Kelces are very conservative-coded. They look
00:50:50.480
and seem conservative. But I don't think she, what's her name, Kylie?
00:50:58.020
I don't think she is conservative. Like, I don't know that she's progressive, but like
00:51:02.120
I saw that she just had Chelsea Handler on. Yeah. And I think also they did make a statement,
00:51:06.880
I think, when the Harrison Butker thing happened. I think they were pretty clear they didn't agree
00:51:11.980
with what he was saying. Yeah. Which is dumb. Yeah. If you're going to make a statement about
00:51:17.780
it, that means you like really feel that way. So I just wonder if they are a psyop to get
00:51:26.080
the girlies back to the Democratic Party or to get just the normies back to the Democratic Party.
00:51:31.740
Yeah. I think they'd have to do the heavy lifting with the guys because, you know, the girls who
00:51:37.380
like Taylor Swift are already there. But yeah, it's like football frat bros that they need back for
00:51:43.200
sure. I don't know. Like, I think that there are a lot of people, a lot of women who like watch
00:51:50.020
Call Her Daddy and who are not Relatable bells. OK, but they vote Trump because they're just
00:51:59.520
more Republican or their boyfriends are and they just vote Trump. And I think those people have to
00:52:06.400
be won back. And it's like nerdy to have your pronouns now. And it's just like embarrassing to be associated with
00:52:12.280
that. And so I think that there's going to be a wing, a growing wing of the Democratic Party
00:52:17.240
that ignores the trans thing and is like just more normal. Cool Democrats. Yeah. Normal. Not a regular
00:52:26.840
Democrat. Cool Democrat. But thankfully, like, most Democrats can't not be insane. Like AOC,
00:52:34.740
I think she's running. Gavin Newsom, Pete Buttigieg. Those are all the people doing their podcast tours
00:52:40.560
right now. Yeah. What's with Pete Buttigieg coming across as cool with Andrew Schulz? Yeah. That's a
00:52:48.460
thing. Yeah. These people are going through a brand overhaul. And I feel like there's that, you know,
00:52:54.920
pro wrestler meme where it's like you've got a wrestler out here, but then you've got one like
00:53:00.160
sneaking up behind him. And the text in the meme, on the wrestler up here
00:53:07.100
who doesn't know this guy's behind him, would be like, me having a good day. And then back here,
00:53:12.940
my period. It's like sneaking up. You never know. OK, but that's what all these Democrats are. It's
00:53:19.480
like AOC, Pete Buttigieg, whatever. They're like, I'm going to be cool. Or Gavin Newsom: I'm going to be
00:53:25.760
cool, run for the Democrat nomination, you know. And then you've got like Richard Levine back here.
00:53:33.320
You've got like the man pretending to be a woman who is about to like ruin your campaign. That's
00:53:39.820
the problem with the Democrats. They cannot let that issue go. All right. Before we head out,
00:53:45.320
let me tell you about subscribing to Blaze TV. You should subscribe to Blaze TV because you get all
00:53:52.200
of our behind the paywall content. And it's awesome. It's stuff that you don't get to see just on YouTube
00:53:56.820
for free. It's just for our Blaze TV community. It protects us. For example, Apple did not upload
00:54:02.740
our COVID podcast yesterday because they still have all of these COVID flags that censor episodes.
00:54:09.580
And so we had to like finagle the description and all that just for Apple to upload our podcast.
00:54:14.520
And so we never know what's going to happen. We can get kicked off all of these platforms.
00:54:18.820
And that's why we have Blaze TV. We not only give you subscriber exclusive content,
00:54:23.200
but it also protects us and makes sure that you can get access to all of our content. Also,
00:54:28.440
we have Nicole Shanahan now. She is awesome. Go watch her. Listen to my episode with her if you
00:54:32.520
haven't already. But she's got a new show, Back to the People, on Blaze TV. Go to youtube.com slash
00:54:38.620
Nicole dash Shanahan. All right. That's all we've got time for. We will be back here on Monday.