Ep 502 | Facebook Whistleblower: Dem Operative or Hero? | Guest: Rachel Bovard
Episode Stats
Words per Minute
184.2
Summary
Rachel Bovard is a journalist who writes about big tech. We've had her on the podcast before, talking to her about the solutions that she proposes to the power that big tech holds in the lives of Americans and the influence they have on our elections.
Transcript
00:00:00.000
Hey guys, welcome to Relatable. Happy Thursday. Hope everyone has had a wonderful week so
00:00:14.640
far. Today we are talking to Rachel Bovard. She is a journalist. She writes about big
00:00:23.140
tech and we've had her on the podcast before talking to her about the solutions that she
00:00:29.520
proposes to the power that big tech holds in the lives of Americans and even the power that they
00:00:37.360
have to influence our elections and why that's a problem and where she thinks the federal government
00:00:42.980
should step in and try to curb some of the influence that they have. She's coming from a
00:00:48.560
conservative perspective though, whereas, as we will talk about today, Democrats who want more
00:00:55.700
authority over these technology companies seem to want to do so, not because they want to
00:01:03.660
protect the country from their influence, but because they actually want to try to control
00:01:08.760
the flow of information and the messages, especially the political messages, that the
00:01:14.620
American public is receiving. So Rachel Bovard is very nuanced on this issue. She is very well versed
00:01:20.460
on this issue. She is going to talk to us a little bit about that, but really what we're focusing on
00:01:25.800
is this Facebook whistleblower, or at least that's what she's being called. Her name is Frances Haugen.
00:01:31.700
She testified before the Senate Commerce, Science, and Transportation Committee's Subcommittee on Consumer Protection,
00:01:37.700
Product Safety, and Data Security. She worked for Facebook and she came out and said, you know,
00:01:43.860
Facebook is doing all of these bad things. They know that they're doing bad things. And also,
00:01:48.440
she believes that the government should step in and basically
00:01:53.600
regulate Facebook. There's some disagreement on the right and the left about whether or not she is
00:01:58.860
a hero or just kind of a partisan hack, because of some of the things that she said during
00:02:05.220
this hearing, like wanting to censor political information that she doesn't like, that's something
00:02:10.900
that she believes that Facebook should do. Obviously, people on the right don't like that,
00:02:15.840
but at the same time, she is pointing out that Facebook is knowingly devastating the mental health
00:02:24.140
and the safety of young children, especially young girls, and they're not doing anything
00:02:29.000
about it. And so it's important that we look at all sides of this and think about what some solutions
00:02:34.620
to this problem might be. And Rachel's going to do that for us. Now, you probably noticed a couple
00:02:42.880
days ago that Instagram and Facebook and WhatsApp, which was acquired by Facebook, were all down.
00:02:50.620
And I don't think we have all of the information about why that happened, but there are some questions
00:02:55.620
as to whether or not it had to do with all of this, because there have been internal documents
00:03:01.980
leaked by this so-called whistleblower (there's some disagreement about whether or not we
00:03:07.340
should be calling her that) that have shown, you know, a lot of the problems within Facebook and
00:03:16.840
the nefarious things that are going on there. So they're already going through that PR crisis. Then
00:03:21.480
things shut down to the point where apparently Facebook employees couldn't even get inside the
00:03:26.640
building; their keypads weren't working to get inside the building. And I don't think
00:03:33.200
we even know all of the reasons why. And so there's some questions about whether or not this is all
00:03:39.880
interconnected somehow. You know, I don't know, but Rachel is going to lend as much insight as she
00:03:48.140
possibly can. She is very, very clear. She's going to break all of this very complicated stuff down in
00:03:54.580
a way that makes sense. I don't know about y'all, but when I was trying to get on
00:04:00.760
Instagram the other day, I think it was on Monday after I recorded the episode with Jonathan Isaac,
00:04:08.080
I was going on Instagram to tell you, hey guys, like I just recorded that, and it wasn't
00:04:14.200
loading. And so I was like, you know what? They're coming after
00:04:18.080
Allie Beth Stuckey and they are not allowing me to post what I want to post on Instagram. This is
00:04:22.960
censorship. So I deleted the app and then I redownloaded it. I tried to open it back up and
00:04:28.600
it still wouldn't let me on. Took me a long time before I figured out, okay, I need to go on
00:04:33.220
Twitter. This is what everyone does. You go on Twitter and you search the words Facebook and
00:04:37.440
Instagram to see if other people are dealing with this. And they were, everyone was on Twitter and
00:04:42.880
on Telegram and things like that to talk about how unfortunately Instagram was down. And it was
00:04:49.460
a little weird. It reminded me, or it showed me, I guess, that I rely on Instagram
00:04:56.280
a lot to connect with you guys. And obviously there have been days where I've been off Instagram,
00:05:01.820
but it felt weird not being able to talk to y'all. I mean, I spend a pretty good chunk of my day
00:05:08.300
responding to messages from you guys on Instagram. Um, and if I haven't responded to you, I'm not
00:05:13.760
ignoring you. I can't get to all of them, but I do respond to people and repost the things that you
00:05:18.860
guys tag me in. And it felt really weird not being able to talk to y'all. I did not like it. And it
00:05:25.800
made me realize maybe I need to diversify my platforms, um, a little bit, but it also takes
00:05:31.280
a lot of energy to post on all of the different platforms. But that's probably something
00:05:38.360
that I'm going to have to do so I can stay connected to you guys, even when Instagram
00:05:41.820
shuts down. But it just showed me, wow, these platforms have so much power and play such a big
00:05:47.320
role in our lives. It really, really is important that we are paying attention to what they're doing
00:05:52.920
and what's going on behind the scenes. So that's why we're talking about this today, especially if
00:05:56.660
you are a parent, guys, if you are a parent, you need to listen to this. And we have to be so
00:06:01.680
cognizant of what our kids are consuming. As long as your kids are under your roof, like they are
00:06:07.380
under, um, your authority, you get to set the rules. If you feel like, oh dang, you know, if I took
00:06:13.480
away Snapchat from my 13-year-old or Instagram from my 14-year-old, who we know from these internal
00:06:18.820
Facebook documents are really, really negatively affected by the kind of content that they consume
00:06:24.860
on these platforms, if you're afraid that they might be mad at you or something, my humble advice
00:06:30.780
as someone who is not a mom of a teenager, but who was a teenager not too long ago, is that it is
00:06:37.100
worth it. It's worth the frustration. It might be worth the anger that they show towards you. You are
00:06:42.900
doing what God has called you to do, which is steward their hearts and their minds to the best of your
00:06:47.460
ability. You can't protect them from everything. Um, but you can do what you can while they're under
00:06:52.860
your roof to try to make sure that what they are consuming and what they are taking in is that which
00:06:58.740
is good and right and true. So that's just my encouragement before we get into this conversation
00:07:05.280
with Rachel. Rachel, thank you so much for joining us again. Okay. I just want you to break down this
00:07:16.800
whole Facebook whistleblower thing. Who is Frances Haugen? Where did she come from? What is she
00:07:23.600
saying? And what should we think about all of this? Well, thanks for having me back. And yes,
00:07:30.600
Frances Haugen burst onto the scene with a congressional hearing yesterday,
00:07:36.100
but we've been reading about her findings for the last several weeks, if anyone's been
00:07:41.320
following the Facebook Files that the Wall Street Journal has been putting out showing the fruits of
00:07:47.060
her research, basically this trove of documents of internal research that came from Facebook showing
00:07:52.800
that they know how their products harm children, specifically 13% of teenage girls tracing their
00:07:59.780
suicidal ideations back to Instagram, you know, the cartels and sex traffickers that use their platform.
00:08:05.480
Um, the fact that they know the Chinese government is using their platform to stalk Uyghur Muslims,
00:08:10.680
you know, who they are actively trying to wipe out in China. All of this has been
00:08:14.840
discussed in the Wall Street Journal. But yesterday we really saw the face of Frances Haugen. We saw
00:08:20.940
her on 60 Minutes on Sunday as well. So she's a former Facebook employee. She went to work for
00:08:24.740
Facebook in 2019. So, you know, you have to wonder, she knew what Facebook was doing at that point
00:08:29.160
in time, but went to work for them anyway. And she's released these documents in an attempt,
00:08:34.480
I think, to push Congress to implement some reforms. And now, if you listen to her testimony
00:08:39.420
yesterday, it was disturbing to a lot of us on the right because she didn't call for, you know,
00:08:44.300
breaking up Facebook or reducing the power Facebook has. She explicitly said, no, we need
00:08:49.800
a government misinformation agency. We need someone to regulate Facebook. I mean, what she was calling
00:08:54.580
for was basically overt government censorship of speech, you know, more entrenchment of these
00:08:59.600
platforms to go after, you know, January 6th insurrectionists, which we know because of
00:09:04.480
definition inflation on the Democratic side, that just means any conservative who they don't like.
00:09:09.480
So her proposed solutions, I think, are very suspect. But I do think the information she's
00:09:15.340
provided is useful and I think could provoke investigations on, you know, privacy, what Facebook
00:09:21.460
is doing with the information about your children, and even some information about their ad
00:09:25.980
practices. All of that should go to the regulating agencies for investigations. And I hope that's
00:09:30.760
what does happen. Yeah. So I guess that leads to my next question. What should be the solutions then?
00:09:36.900
Because we would agree that a lot of the problems that she lists are actual problems, but we don't
00:09:43.220
think that the answer is necessarily the government coming in and just regulating the heck out of
00:09:47.640
Facebook. And so what would you suggest we should do? So my first policy proposal has always
00:09:54.300
started with antitrust enforcement. One, because these are laws on the books, right? We don't have
00:09:58.860
to create, you know, this new regulatory structure to go after these companies. Antitrust laws are
00:10:03.380
supposed to protect the free market. And I think we've done a very poor job of enforcement around
00:10:08.460
the tech space. And my preferred solution for a lot of these speech problems is competition,
00:10:13.320
right? If you actually had a free market in tech, you could compete a lot of these concerns away.
00:10:17.960
And I think that takes away a lot of the speech concerns if you look at speech as downstream of
00:10:22.260
market power. And what I mean by that is think about Google right now. Google filters information
00:10:27.420
for 90% of America. And so what they choose to suppress, what they choose to amplify can literally
00:10:33.660
change the minds of, you know, a big proportion of the country. If that market power is broken,
00:10:38.640
I don't care what Google suppresses or amplifies if they're only doing it for 30% of the country,
00:10:43.480
right? The market for information is much more open and less of a cartel. So I think antitrust
00:10:50.180
can solve that problem. But I do think that there are other reforms that need to be made. You know,
00:10:54.520
you often hear Section 230 bandied about. Now you'll hear Democrats want to use it again for
00:10:59.520
speech control, for censorship. I do think it has to be done in a way that keeps Section 230 in place,
00:11:05.000
but brings it back to its original intent. And Justice Clarence Thomas has now written about this
00:11:10.100
twice. You know, he said, look, the lower courts have gotten this wrong. What was a porous immunity has
00:11:14.840
become a bulletproof one. You know, guys, the original intent here was so that, like, child
00:11:19.840
pornography didn't flourish on the platforms. You know, this was never supposed to be expanded
00:11:19.840
into a protection for sex trafficking on Facebook, which is what Facebook now claims Section 230
00:11:23.920
immunizes. It was never designed to, you know, engage in or protect overt political censorship,
00:11:34.760
which is what we're seeing now. So there needs to be a retraction, a return to the original intent of
00:11:39.280
that statute. Yeah. You know, and then I also think data privacy and data portability play a big role
00:11:43.920
here, too. And that's an area that Congress, frankly, hasn't even begun to explore. Yeah. Can
00:11:48.640
you just quickly summarize, and you kind of did, but in a very basic form, can you summarize what
00:11:55.340
Section 230 is supposed to be, what it was originally intended to do? So Section 230 was never even
00:12:03.560
actually a bill. It was an amendment to a much larger telecommunications package. And the title of
00:12:09.120
the amendment was the Internet Freedom and Family Empowerment Act. It was supposed to, again, give these companies
00:12:15.620
an incentive to take down sort of the smutty, you know, violent, harassing content that nobody wanted
00:12:21.200
to see. And to do that, it basically says, look, these platforms are not subject to liability for
00:12:27.780
what users post on their platform. Right. They're not publishers in that regard. They can't be sued for
00:12:32.360
what we say. And then it gives them a protection. It says, look, even if this is constitutionally protected
00:12:37.260
speech, we want you to be able to take it down. And it lists a whole host of criteria, you know, lewd,
00:12:42.460
lascivious, harassing, all this criteria. That's what it was designed to do. But unfortunately,
00:12:47.420
the lower courts have just expanded it so dramatically that, again, in the Texas
00:12:53.960
Supreme Court, Facebook recently argued against three moms who brought a case there. Their 14- and
00:13:00.980
15-year-old girls were trafficked into sex slavery on Instagram. And those moms said, hey, look,
00:13:06.300
Facebook knew this and did nothing. Facebook is responsible. And Facebook said, no, we're not.
00:13:10.100
Section 230 protects us. We're not responsible for this. And so that's how it's become so grossly
00:13:15.220
distorted that it's protecting Facebook even when they know sex traffickers are on their site and do
00:13:15.220
nothing about it. And so I do think that that needs to be brought back to bear because this is
00:13:21.200
just not sustainable. The size and scale of these platforms makes that kind of policy unsustainable.
00:13:30.080
They're protecting criminals and getting away with it.
00:13:31.780
Right. So it was supposed to say, hey, you have the power to take really bad
00:13:37.720
content off your platforms without being liable for the rest of the content that is on
00:13:43.080
your platform, which is good because that allows them to remove really bad content that we don't
00:13:49.000
want on there. But the thing that I'm thinking is that they very often don't remove really bad
00:13:54.500
content that's on there. Like you said, like we've seen several times, Instagram and
00:14:00.020
Facebook are a source of not just harassment and bullying and doxing that very often is never
00:14:08.860
held accountable and never removed. But also it is a vessel for trafficking and for grooming and for
00:14:19.540
the sexualization of children. And they don't seem to be using Section 230 to empower themselves to
00:14:30.660
remove that kind of content. It seems like they only invoke Section 230 when they don't want
00:14:37.040
responsibility to remove that content and also when they want justification for removing, say,
00:14:44.700
political content that they don't like in the name of misinformation. Is that part of the problem?
00:14:50.000
That's a huge part of the problem. And I think especially when you're talking about political
00:14:53.720
speech, you know, these platforms, the reality of them now, and this was not the reality when Section
00:14:59.780
230 was passed or when Facebook began, but the reality of them now is that, you know, Facebook and
00:15:05.700
even Twitter to some extent, these are how candidates now reach their constituents or voters. You know,
00:15:11.880
this is how, we saw this with President Trump, right, this is how candidates and elected
00:15:16.480
officials talk to voters, talk to potential voters. And when you remove the ability of,
00:15:22.220
you know, one party or one candidate in a primary to access that forum, that does have a political
00:15:28.680
impact at this point. And that's a reality that our laws haven't grappled with yet. You even saw
00:15:33.940
the RNC, the Republican National Committee, brought a complaint to the Federal Election Commission after
00:15:39.720
Twitter suppressed circulation of the story about Hunter Biden from the New York Post. The RNC went to the FEC and said,
00:15:44.980
look, this is a campaign violation. This is an in-kind donation to Joe Biden, you know, to suppress this
00:15:51.920
critical story at this critical time. And even the FEC said, look, it's not a violation, because we think Twitter is
00:15:57.700
a publisher, we think Twitter is a media outlet. You know, they weren't trying to, you know, sway the
00:16:02.300
debate in any way. They have a First Amendment right to do this. And that just flies in the face,
00:16:06.920
I think, of reality at this point. But I think it speaks to the fact that our laws simply haven't
00:16:11.440
caught up to what these platforms actually are right now, which is sort of key avenues of,
00:16:15.700
you know, speech, political speech, commerce, information flow. And we have laws
00:16:21.140
that just don't, I think, accurately reflect that reality at this point.
00:16:24.520
So Democrats want to regulate these social media companies in the name of trying to police
00:16:38.420
misinformation to protect the public from what they deem misinformation, which we know is probably
00:16:43.420
just an Orwellian descriptor for, like you said, political speech that they don't like, opinions that
00:16:48.800
they don't agree with. Do you think that they are using Haugen's testimony? And I don't
00:16:56.960
want to get too conspiratorial, but do you think that this was pre-planned, that they kind of hoisted
00:17:02.100
her up to bring forth all of these issues that the American public really cares about? Okay, you care
00:17:07.680
about if Facebook knows that it could be helping drive young girls to the brink of suicide, it's making
00:17:13.700
their body image issues worse. It's a source of trafficking and grooming for young children.
00:17:20.780
Like, has she been purposely platformed by the side that wants to regulate these social media
00:17:26.260
companies? Is that or is or is that just how she's being incidentally used?
00:17:32.740
I do think that, you know, there is a little bit of an overt effort by Democrats here to use her
00:17:38.920
testimony to favor their own solutions, right? Which is this, like, major government crackdown on
00:17:43.900
social media through a very top-down, heavy-handed approach, not in the way that Republicans
00:17:48.860
would like it, which, again, is to let speech flourish, let the free market solve these problems.
00:17:53.660
I do think that, you know, she is impeccably groomed and, you know, well-spoken, and suddenly,
00:18:00.560
right, all of a sudden, you know, she's leaking documents to the Wall Street Journal, then ends up in a 60
00:18:05.000
Minutes interview, then in a very, very quick turnaround has a Senate hearing, you know, all
00:18:10.460
while working with a PR firm. There was clear pre-planning that went into
00:18:16.280
place here. But I think, you know, the right can very legitimately criticize
00:18:21.380
her policy solutions, because she's not an expert; they don't have to
00:18:25.940
listen to Frances Haugen. But what they should not ignore is the information she's produced, which
00:18:31.420
there are some very damning indictments of what Facebook is up to. And I think if we ignore that
00:18:36.620
simply because we've, you know, discredited Frances Haugen, then, you know, we aren't doing our job
00:18:42.100
and actually saying what are our solutions for holding Facebook accountable, because what she's
00:18:46.120
produced, exclusive of, you know, her motivations, is still important in this debate.
00:18:51.700
What do you think the motivations are for Facebook and Instagram (obviously,
00:18:55.960
Instagram is an entity of Facebook), and, you know, Google and YouTube (again, YouTube is an entity
00:19:02.400
of Google), even Twitter, not to do a more thorough job of protecting their young users from the kind of
00:19:12.500
material that we know is damaging them, not just psychologically, but also physically, if they are
00:19:17.500
groomed into some kind of abuse. I mean, obviously, they make a huge effort to try to censor any
00:19:24.700
information about, say, COVID therapeutics, or people's opinions about masks and things like
00:19:30.960
that. They have certainly mobilized a large team of people to take down that kind of information.
00:19:38.460
Why aren't they quite as motivated to censor the material that they know is damaging the public?
00:19:48.660
Well, I think the really difficult thing to grapple with about that is that it's their business
00:19:53.340
model, right? And what's very clear in these documents is how critical it is
00:19:59.740
for Facebook, and by extension Instagram, to attract those young users. You know, there
00:20:06.200
were lines in those documents about how Facebook has teams of people that want to figure out how to,
00:20:15.300
I don't know. And I'm concerned, actually, about what it means. Like, I would like more information
00:20:20.220
about what those teams are doing to, quote, leverage the play dates of children. But they
00:20:24.760
want more clicks on Facebook Messenger, right? Because they have a Messenger Kids app,
00:20:29.780
basically. And they want to figure out how kids playing together can somehow mean more clicks and
00:20:35.860
more eyeballs on Facebook. Because that is their business model. The more users they can attract,
00:20:40.120
the more data they can collect, and that feeds their very lucrative, highly targeted ad business,
00:20:45.700
which is what funds Facebook. That's how Facebook makes money, right? And so there's far less of an
00:20:51.580
incentive, I think, to police for, you know, things that could begin to put
00:20:58.880
kids down a very bad track. And that's what's happening, right? We know this. And there's a lot
00:21:04.100
of comparisons these days, you know, among people who defend these companies, who say, well, this is,
00:21:09.180
you know, this is like the crusade of the 90s to get rid of video games, because video games are going
00:21:14.500
to make us all violent. But again, I don't think that comparison grapples with what social media
00:21:17.900
really is and how ubiquitous it is. It's everywhere. Yeah, you know, and as soon as a
00:21:22.900
kid has a smartphone, they are on these apps. And it's very difficult, I think, to police that,
00:21:32.300
even if you're a helicopter parent, right, even if you know exactly what your kid is doing at all
00:21:32.300
times, you know, they aren't in your control at all times. And the internet is everywhere.
00:21:36.880
Yeah, it's certainly not the same thing. One, because, well, I would actually argue that
00:21:44.040
anything that we put into our mind can have an effect on our thoughts, and therefore our
00:21:48.380
behavior. So that goes for video games, that goes for the things that are online. But it's even more
00:21:55.160
dangerous when it comes to social media, because they're not just playing a game that is, you know,
00:21:59.740
closed, they're actually connecting to real-life people that could do them harm. Now, I'm not on
00:22:05.400
Snapchat, but the things that people send me, they screenshot like the explore page or whatever it is
00:22:11.140
that has different news stories. I mean, there are 11-year-old kids that are on Snapchat. And gosh,
00:22:18.520
parents, if you're listening, and your preteen is on Snapchat, change that ASAP. But the stories they
00:22:25.540
have are like, you know, different sex positions, and how to have safe anal sex. And here's how to get
00:22:32.280
an abortion without your parents' permission. I mean, that stuff is on Snapchat. And it's not just,
00:22:36.980
okay, if you're 18 and over, these are the kinds of stories that it will show you. They don't care. I
00:22:42.960
don't know what the intent is behind sexualizing kids that age. I don't even really want to think
00:22:48.440
about it. But the fact of the matter is, it exists. And I don't think kids have the brain development
00:22:55.760
to be able to filter out that kind of information. And they also just don't have the discernment
00:23:02.980
quite yet to say, maybe I shouldn't send this kind of picture, because then it's going to be
00:23:07.540
spread. Or maybe this person I'm talking to isn't trustworthy. I mean, there are all kinds of
00:23:12.320
problems that do obviously primarily rest on the shoulders of the parents that are helping make
00:23:18.080
decisions for their kids. But you would like to think that there are at least a few parents at
00:23:24.180
these companies that would sympathize with the concerns of the American public. And I don't know,
00:23:30.000
it just doesn't seem like there are. Well, you have to think about it. You know,
00:23:33.600
on the right, we always prize the profit motive, right? We're like, this is what drives the economy
00:23:37.300
forward. And that's true. But these companies aren't distinguishing between children and
00:23:42.300
just flat-out consumers. They only care about the consumer. Because again, this is how
00:23:47.380
they make money. This is why they are doing what they're doing. You know, you saw Instagram trying to create Instagram
00:23:51.560
Kids, and this is what Facebook was doing with the full knowledge of what can happen to
00:23:57.400
kids on their website. Right. They don't distinguish between child safety necessarily,
00:24:01.920
you know, and what could be a future consumer. And you see this from Google as well. I mean,
00:24:05.460
there's a reason Google hands out free Chromebooks in every school it can. They want to addict your
00:24:09.420
child early because that's their next generation of users. So, you know, I don't think
00:24:14.680
we can trust these companies to look out for child welfare. They've proven again and again
00:24:17.960
that they frankly don't prioritize it. And even when they're punished for it,
00:24:22.140
they barely respond by changing anything. Yep. My last question is, do you have hope that
00:24:28.460
this kind of new crop of conservatives that seems to be coming up, or at least they're running for
00:24:33.680
office and some of them are already in Congress, obviously Josh Hawley has been on this beat for a
00:24:38.280
while. But then you've got people running like Blake Masters, JD Vance, who say that they really care
00:24:44.640
about, you know, the dominance of big tech and the effect that it's having on society.
00:24:50.920
Do you have hope that there are some Congress people, hopefully some younger Congress people
00:24:56.480
who are waking up to this kind of thing, that they understand the threat and that they're actually
00:25:00.640
going to do something about it, at least on the right? I am encouraged, I think, you know,
00:25:06.640
not just by the sloganeering and the rhetoric, but by the fact that, you know, people like JD Vance,
00:25:12.100
you know, are pretty sophisticated. Blake Masters actually kind of has experience in Silicon Valley.
00:25:16.400
They know how these companies work. And I think that's a lot of what's missing, you know, from
00:25:21.340
the current crop of policymakers. They're just not technologically savvy.
00:25:26.180
Well, Blumenthal, oh my gosh, I'm sure you saw, and I don't know if the audience saw,
00:25:31.440
he said something like, will you commit to ending Finsta? And Finsta is like a, you know,
00:25:37.460
a slang term for a fake Instagram or like a friend Instagram. Oh gosh. So that's
00:25:42.240
a huge problem. They just don't understand. Well, and, you know, I think this younger generation
00:25:46.780
of lawmakers, they have young kids, right? They have kids that grew up in this environment in a
00:25:52.180
way that I didn't, and perhaps you didn't either. You know, I'm in the elder generation
00:25:56.040
of millennials who didn't have Facebook, right, until like the end of college. Yeah. So I think
00:26:00.140
they are much more aware and, I think, fluent in what the problem is. And then I think in addition
00:26:05.740
to that, this newer generation is skeptical, far more skeptical, I think, of concentrations of power
00:26:11.300
outside of the government. And that I think for a long time is what's been missing on the right.
00:26:15.680
You know, we very rightly look at the government and say, this is a big threat and, you know,
00:26:19.300
we should be wary about what the government is up to. But in certain cases, and I think in this
00:26:23.160
case with big tech, you have an unprecedented
00:26:27.820
accumulation of power and control here that deserves just as much scrutiny and skepticism. And I think
00:26:33.280
the younger generation of lawmakers is far more savvy in that regard and fearless,
00:26:37.260
I think, too, about calling it to account. Yep. Well, I'm hopeful. Obviously, we don't know the
00:26:43.540
results of those elections. We can hope. I do think it helps to have some young blood, to have a new
00:26:48.360
understanding, and also to feel like you have a vested interest in these companies changing because
00:26:54.780
you've got those young kids that are being exposed to that. And there's really
00:27:00.520
only so much that parents can do to protect their kids from that. They go to a friend's house,
00:27:05.080
they go to school. I mean, this really does affect everyone. And it can very seriously shape the
00:27:12.220
minds of an entire generation for the worse. So I'm thankful for the work that you do to try to
00:27:18.420
inform people about what's going on. And maybe there are some good things that will come out of this
00:27:24.900
quote unquote whistleblower testimony that aren't just, you know, a government overhaul of these
00:27:33.180
companies. That's the hope. Yeah, that's the hope. Well, thank you very much. Where can
00:27:38.480
people find you and follow you? So you can find me on Twitter, @rachelbovard, and all my work is
00:27:44.380
at CPI.org as well. Thank you so much, Rachel. Thank you so much.
00:27:48.980
Okay, guys, thanks so much for listening this week. This was a big week. On Monday, we had NBA player
00:28:00.180
Jonathan Isaac, talking about his decision not to get the vaccine and how he has the courage to stand
00:28:06.460
up for controversial opinions nowadays, like vaccine choice. But also, last year, he was the
00:28:15.460
only player on the Orlando Magic to stand up for the national anthem and not wear a Black Lives Matter
00:28:20.100
shirt. When he was asked the reason why, he shared the gospel. I mean, the guy is solid. I really
00:28:26.260
appreciated him coming on and kind of telling us his reasoning and sharing his story. If you have not
00:28:31.440
heard that conversation, definitely go listen to it or watch it on YouTube. We had our 500th episode
00:28:36.420
on Tuesday and just the messages and the comments and the reshares that you guys put out there and just
00:28:43.200
the encouragement and the kind words that you gave to me and what Relatable has meant to you.
00:28:48.120
It just means so much. Thank you, guys. I'm so thankful to be able to do this. If you do love
00:28:53.460
this show, please leave a five-star review on Apple. Tell us why you love it. Just a couple sentences is
00:28:58.480
great. That would mean a whole lot. Monday, we've got Christopher Rufo coming on and he has really led
00:29:05.440
the charge against critical race theory in schools. And you guys might have seen the story of the
00:29:10.900
Department of Justice led by Merrick Garland, Biden's Department of Justice. They are now
00:29:15.760
mobilizing the FBI against parents that they say are threatening, you know, school board members or
00:29:22.800
public school teachers. Now, of course, any threats of violence are not OK, not right. But what the fear
00:29:29.320
is and the very justified fear is that the DOJ is going to actually be weaponized against parents who
00:29:37.340
just raise their concerns, and that this is more of an intimidation tactic than anything else to try
00:29:43.200
to silence justifiably concerned parents. So make sure you tune into that. That'll be a big episode.
00:29:49.800
And I hope you guys have a great weekend. I will see you on Monday.