Big Tech Exposed for Hurting Kids, plus Biden ATTACKS Texas & Americans by Restricting LNG Exports
Episode Stats
Words per minute
164.8
Harmful content
Misogyny
4
sentences flagged
Toxicity
12
sentences flagged
Hate speech
7
sentences flagged
Summary
In this episode, Senator Ted Cruz talks about a recent hearing on Capitol Hill that focused on child sexual exploitation facilitated on big tech platforms and how those companies should be doing more to protect children online. Ted Cruz is a conservative senator from Texas who has been a long-time critic of Big Tech companies like Facebook and TikTok.
Transcript
00:00:09.000
And Senator, there was a big hearing on Capitol Hill that dealt with big tech oversight.
00:00:15.440
There were CEOs there from Facebook and TikTok.
00:00:20.340
You had the opportunity to talk to Mark Zuckerberg
00:00:23.220
specifically over an issue of child exploitation being
00:00:30.700
Talk a little bit about why this hearing was so important.
00:00:41.400
So you had Mark Zuckerberg, the CEO of Meta, which is Facebook and Instagram and others.
00:00:55.420
And they were all testifying before the Senate Judiciary Committee.
00:01:03.400
And in particular, children being targeted by predators, by pedophiles,
00:01:15.640
It's the hearing room where we do, for example, hearings on Supreme Court justices.
00:01:20.400
So it's one of, if not the biggest hearing room on Capitol Hill.
00:01:28.600
It was standing room only with parents who had lost their children to suicide,
00:01:36.360
to abuse, who had been traumatized and abused online.
00:01:45.160
In many instances, predators will do things like convincing young children, convincing teenagers
00:01:52.880
to send them compromising pictures, to send them naked photographs.
00:01:57.520
And then these scammers will often blackmail the kids and threaten,
00:02:03.180
we're going to release these pictures unless you pay us money.
00:02:06.080
And in a number of instances, the kids pay money.
00:02:09.040
And repeatedly, we've seen children taking their own lives, committing suicide.
00:02:13.980
And so you had a room full of parents who had photographs of their children,
00:02:19.440
most of whom presumably had lost their lives, some of whom maybe had survived,
00:02:25.240
but had been badly impacted, whether it was by sexual predators or other forms of hostile activity.
00:02:34.880
And so the focus of the hearing was on big tech and big tech really not only allowing our children to be in jeopardy,
00:02:47.460
but profiting off of addicting our children to social media and everything dangerous that comes from it.
00:02:56.820
You talk about the dangers and not just Facebook, but also TikTok.
00:03:01.320
There's been lots of warning from those in the medical community, pediatric doctors,
00:03:07.980
about the depression and also the addictive nature of these apps.
00:03:11.360
They've created them to become life-altering for young people, and it takes over their lives.
00:03:15.900
And that's where you're talking about not only are these predators using it,
00:03:19.680
but the depression that comes in, and yet they're readily available, and almost all these kids have them.
00:03:26.000
You know, there's a great movie that is called The Social Dilemma,
00:03:29.360
which if you haven't watched it, I recommend it to you.
00:03:32.040
It's a movie about how social media companies target our kids,
00:03:36.840
how they have algorithms designed to addict our kids to keep them watching more.
00:03:44.180
Heidi and I sat down with our girls, and we made them both watch it.
00:03:49.080
But we wanted them to at least hear and understand what it is that big tech is targeting them with
00:04:05.680
They're printing money, and they give lip service,
00:04:09.140
but yet they are not doing nearly enough to protect children online.
00:04:13.180
Let's talk about part of this exploitation and child exploitation.
00:04:21.260
but the most shocking part is how this is allowed on social media.
00:04:27.460
You had a moment where you and Mark Zuckerberg had a back and forth,
00:04:31.920
and I want to play it for people and listen to,
00:04:35.160
and part of this, Senator, explain these warning signs,
00:04:38.680
because people aren't going to be able to see the posters that you held up.
00:04:41.200
Explain what you were showing and how these warnings worked,
00:04:46.100
Well, so the Wall Street Journal last year did an expose about how Facebook was assisting pedophiles,
00:04:55.140
assisting pedophiles in, number one, targeting children,
00:04:59.720
but number two, finding child pornography online.
00:05:05.600
So if you typed in a search that was designed to pull up child pornography,
00:05:10.740
they would have a warning screen that would pop up that says the search you put in
00:05:16.880
is likely to produce images of child pornography, which is a crime.
00:05:28.920
or do you want to proceed to your search request,
00:05:35.460
And so I asked Mark Zuckerberg why on earth they were doing that.
00:05:41.100
Instagram also displayed the following warning screen
00:05:44.080
to individuals who were searching for child abuse material.
00:05:49.520
These results may contain images of child sexual abuse.
00:06:01.300
Mr. Zuckerberg, what the hell were you thinking?
00:06:03.240
All right, Senator, the basic science behind that
00:06:07.760
is that when people are searching for something that is problematic,
00:06:12.540
it's often helpful to, rather than just blocking it,
00:06:16.140
to help direct them towards something that could be helpful for getting them to get help.
00:06:23.540
In what sane universe is there a link for see results anyway?
00:06:29.260
We try to trigger this warning, or we tried to,
00:06:33.500
when we think that there's any chance that the results might be wrong.
00:06:38.160
Let me ask you, how many times was this warning screen displayed?
00:06:45.080
I don't know the answer to that off the top of my head.
00:06:49.920
You also had a follow-up question there right after that, Senator,
00:06:56.760
how many of those people, those child predators, clicked on see it anyway?
00:07:02.000
And he's like, I don't know if we have that data. Really?
00:07:05.540
Well, and look, what is important, and what I followed up right after that,
00:07:18.800
But the point is, they don't know because they don't want to know.
00:07:21.560
And more importantly, they don't want anyone else to know.
00:07:28.240
And further, I asked him a very simple question.
00:07:34.760
searches for images of little children being sexually abused,
00:07:44.320
Do you look into the children who are in the images
00:07:56.740
it's interesting you say you don't know it off the top of your head,
00:07:59.040
because I asked it in June of 2023 in an oversight letter,
00:08:15.460
I know how lawyers write statements saying we're not going to answer.
00:08:18.380
Will you tell us how many times this warning screen was displayed?
00:08:28.560
How many times did an Instagram user who got this warning
00:08:32.240
that you're seeing images of child sexual abuse,
00:08:42.740
but I'll personally look into this and we'll follow up after.
00:08:48.240
when you have a potential pedophile clicking on,
00:08:58.400
Senator, I think that an important piece of context here
00:09:01.680
is that any content that we think is child sexual abuse.
00:09:34.100
is sexual abuse material on the service and we do.
00:09:41.720
Senator, I don't know if every single search result
00:09:47.080
Senator, do you want me to answer your question?
00:09:50.060
Did you report the people who clicked, see results anyway?
00:09:54.920
That's probably one of the factors that we use in reporting.
00:10:02.760
the National Center for Missing & Exploited Children,
00:10:06.860
We proactively go out of our way across our services to do this
00:10:09.960
and have made, I think it's more than 26 million reports,
00:10:13.620
which is more than the whole rest of the industry combined.
00:10:19.680
Mr. Zuckerberg, your company and every social media company
00:10:24.040
By the way, the question I would love to know, Senator, is this.
00:10:30.900
So how much sexual exploitation of children
00:10:37.780
Well, and understand the Wall Street Journal expose
00:10:40.820
that I was asking about demonstrated how on Instagram,
00:10:51.800
And so their algorithms were actively assisting pedophiles in doing this.
00:10:57.080
By the way, Instagram no longer has this screen of see results anyway.
00:11:04.180
After the Wall Street Journal reported on what they were doing.
00:11:06.900
So it was only when they got caught that they realized,
00:11:09.800
yeah, maybe we shouldn't be aiding and abetting children being abused.
00:11:17.180
But they're still not affirmatively acting to protect kids the way they should.
00:11:24.900
they have no problem trying to use these warnings to silence conservatives.
00:11:31.220
Because we put a picture up of the pictures that were put in front of Congress of Hunter Biden.
00:11:37.500
And for six months, we lost the ability to promote and or reach and or be found by people on Instagram and on Facebook.
00:11:47.120
And there were hundreds of conservatives that that happened to
00:11:51.140
because they were putting out information about Hunter Biden that they didn't like.
00:11:56.980
I want to tell you about Patriot Mobile real quick.
00:11:59.160
For 10 years, Patriot Mobile has been America's only Christian conservative wireless provider.
00:12:05.920
If you are sick and tired of giving your money to organizations that actually hate your values,
00:12:12.500
that hate your family values, hate your Christian values,
00:12:14.880
hate the First Amendment rights and Second Amendment rights, hate our Constitution,
00:12:18.840
then it is time for you to switch to Patriot Mobile.
00:12:23.880
giving you the same exact coverage that you get right now because they use the same towers.
00:12:28.880
But what you're getting rid of is the propaganda of the left and funding the left.
00:12:32.960
Big Mobile gives big donations to Democratic causes, candidates, including Planned Parenthood.
00:12:39.820
And that is why I have switched to Patriot Mobile.
00:12:42.180
When I look at my phone in the top left corner, it says Patriot.
00:12:45.540
And I know that I'm sending a message that I support free speech, religious freedom, and the sanctity of life.
00:12:50.780
And every time I pay my bill, I also know I'm making a difference, because they take 5% of my bill at no extra cost to me.
00:13:04.680
They have a 100% U.S.-based customer service team.
00:13:07.120
And you get to keep your same phone number and upgrade your phone as well if you want to.
00:13:28.920
Senator, I also want to talk about the other CEOs that were there.
00:13:34.520
And people may not understand TikTok the same way they do Facebook.
00:13:38.680
Explain a little bit of the concerns around TikTok.
00:13:42.240
And it's actually national security concerns as well.
00:13:45.800
So understand, in the United States, there are roughly 160 million people that use TikTok.
00:13:52.420
So roughly half, roughly 50% of the population of the United States of America is on TikTok.
00:14:02.500
But in the United States, it's about 160 million.
00:14:06.200
Where TikTok is particularly powerful is among kids, is among young people, is among preteens, is among teenagers, is among college kids.
00:14:21.920
It is universal among 6th graders and 7th graders, 8th, 9th, 10th, 11th graders.
00:14:29.760
And understand also that TikTok is wholly owned by a company called ByteDance.
00:14:36.960
And under the rules of China, the Chinese government has the power to monitor everything that is on TikTok.
00:14:43.800
So, look, all of the social media companies are responsible for very significant damage.
00:14:49.680
But TikTok is a qualitatively different damage because none of the other major companies are Chinese companies that the Chinese communist government has the power to engage in espionage, to engage in spying, and to engage in pushing propaganda.
00:15:07.900
And so the CEO of TikTok was there, and I took the opportunity to ask him about this as well.
00:15:18.280
In the next couple of minutes I have, I want to turn to you.
00:15:21.720
Are you familiar with China's 2017 national intelligence law, which states, quote,
00:15:26.720
all organizations and citizens shall support, assist, and cooperate with national intelligence efforts in accordance with the law
00:15:33.360
and shall protect national intelligence work secrets they are aware of?
00:15:44.240
For the Chinese businesses that ByteDance owns, yes, it will be subject to this.
00:15:52.240
And, Senator, as we talked about in your office, we built Project Texas to put this out of reach.
00:15:59.520
Now, under this law, which says, shall protect national intelligence work secrets they are aware of,
00:16:06.080
it compels people subject to the law to lie, to protect those secrets.
00:16:17.360
What I said, again, is that we have moved the data out of reach.
00:16:21.160
No, Senator, TikTok is not available in mainland China.
00:16:23.980
We have moved the data into an American cloud infrastructure.
00:16:26.160
But TikTok is controlled by ByteDance, which is subject to this law.
00:16:30.340
Now, you said earlier, you said, and I wrote this down,
00:16:34.360
we have not been asked for any data by the Chinese government, and we have never provided it.
00:16:38.980
I'm going to tell you, and I told this when you and I met last week in my office, I do not believe you.
00:16:44.240
And I'll tell you, the American people don't either.
00:16:46.640
If you look at what is on TikTok in China, you are promoting to kids science and math videos, educational videos,
00:16:56.240
and you limit the amount of time kids can be on TikTok.
00:16:59.080
In the United States, you are promoting to kids self-harm videos and anti-Israel propaganda.
00:17:14.900
There's not a difference between what kids see in China and what kids see here?
00:17:22.000
But you have a company that is essentially the same,
00:17:25.020
except it promotes beneficial materials instead of harmful materials.
00:17:30.320
We have a lot of science and math content here on TikTok.
00:17:32.780
There's so much of it that we created the STEM feed, which has 100 billion views.
00:17:38.640
There was a report recently that compared hashtags on Instagram to hashtags on TikTok.
00:17:50.560
So, for something like hashtag Taylor Swift or hashtag Trump,
00:17:55.800
researchers found roughly two Instagram posts for every one on TikTok.
00:18:01.400
That difference jumps to 8 to 1 for the hashtag Uyghur.
00:18:12.560
And it jumps to 57 to 1 for the hashtag Tiananmen Square.
00:18:17.720
And it jumps to 174 to 1 for the hashtag Hong Kong protest.
00:18:24.520
Why is it that on Instagram people can put up a hashtag Hong Kong protest 174 times compared to TikTok?
00:18:35.240
What censorship is TikTok doing at the request of the Chinese government?
00:18:44.600
It's been debunked by other external sources like the Cato Institute.
00:18:52.920
The second thing is that you cannot selectively choose a few words within a certain time period.
00:18:56.720
Why the difference between Taylor Swift and Tiananmen Square?
00:19:01.000
Senator, there was massive protests during that time.
00:19:06.060
But what I'm trying to say is our users can freely come and post this content.
00:19:09.280
Why would there be no difference on Taylor Swift or a minimal difference and a massive difference on Tiananmen Square or Hong Kong?
00:19:16.480
Senator, our algorithm does not suppress any content simply based on it.
00:19:20.280
To answer that question, why is there a difference?
00:19:24.640
You're selectively choosing some words over some periods.
00:19:30.900
174 to 1 for Hong Kong compared to Taylor Swift.
00:19:36.520
I just think the guy's a flat-out liar and nothing but a spokesman for the CCP.
00:19:45.660
By the way, at the end, that was Dick Durbin, the Democrat chairman,
00:19:48.800
and gaveling in and trying to protect TikTok, trying to protect Communist China.
00:19:55.840
By the way, when Democrats were questioning, he let him go on and on and on.
00:19:58.980
But Durbin wanted to jump in and stop that line of questioning.
00:20:01.880
Listen, the numbers are obvious on their faces.
00:20:05.900
So understand the differential in those hashtags.
00:20:08.380
So this was a study that compared different trending hashtags on Instagram,
00:20:13.140
which a lot of young people use, and on TikTok,
00:20:23.120
which is something people will post about a lot,
00:20:26.080
particularly young people, they're big fans of Taylor Swift,
00:20:31.140
So basically, there were two Instagram posts for every one TikTok post.
00:20:38.980
As I went through, the differential became massively different with messages
00:20:44.940
that the Chinese Communist government doesn't like.
00:20:57.080
So, in other words, people could post about it on Instagram 174 times
00:21:13.960
It's not searching for how frequently the word asparagus trends.
00:21:19.900
that the Communist Chinese government does not want people to know about.
00:21:25.080
And miraculously, and it's not really miraculous,
00:21:29.440
the words that they most want people not to know about
00:21:33.020
are the words that you are least allowed to see on TikTok.
00:21:37.300
And mind you, the CEO under oath is testifying,
00:21:41.860
well, no, no, no, we don't share anything with the Chinese Communist government,
00:21:48.860
even if they shared everything with the government,
00:21:51.040
they are required under the law to lie about it.
00:21:54.500
So, he would have to say that regardless of what the facts are.
00:22:02.900
their elected leaders, and the world around them.
00:22:11.240
And in this podcast, we interview Canada's most inspiring women.
00:22:14.860
Entrepreneurs, artists, athletes, politicians, and newsmakers,
00:22:20.740
So, if you're looking to connect, then we hope you'll join us.
00:22:23.940
Listen to the Honest Talk podcast on iHeartRadio
00:22:29.580
Do you believe that there is going to be any type of new legislation
00:22:33.000
and or oversight that will be impactful to stop this?
00:22:43.060
and blackmail and human trafficking and sex trafficking
00:22:50.720
Well, I do think there is growing bipartisan interest
00:22:57.880
And in the Senate Judiciary Committee, on which I serve,
00:23:00.980
five different bills have passed out of the Senate Judiciary Committee.
00:23:05.480
They're all designed to enhance the protection for kids online.
00:23:12.840
But to date, none of them have gotten a vote on the Senate floor.
00:23:20.100
I am the ranking member on the Senate Commerce Committee,
00:23:25.540
We have passed several bills out of the Senate Commerce Committee
00:23:28.940
aimed at the same thing, at protecting kids online.
00:23:31.400
And again, they passed with overwhelming bipartisan support.
00:23:44.580
that big tech companies are massive financial supporters of Democrats.
00:23:51.380
In the 2016 election, do you know who the employer was
00:23:55.660
whose employees gave the most money to Hillary Clinton in 2016?
00:23:59.080
I'm going to guess it was either, at the time, Twitter, or it was Facebook.
00:24:07.020
Google was the number one supporter of Hillary Clinton in 2016.
00:24:10.700
If you look at, do you know who the number one individual donors
00:24:14.140
to Democrat causes was in 2020, the Biden election year?
00:24:23.720
So, and to give you a sort of order of magnitude,
00:24:26.820
so in the 2020 cycle, the top Republican donor in the country
00:25:03.340
He paid $400 million to literally take over the election machinery,
00:25:11.020
the apparatus, in big blue cities, in purple states.
00:25:16.080
It was designed to dramatically increase Democrat turnout
00:25:28.460
that Joe Biden is president today because of Mark Zuckerberg,
00:25:40.080
Was he also buying the guarantee of no oversight
00:25:47.440
doing some of the things that you exposed today?
00:25:57.340
why will Chuck Schumer not bring any of these bills
00:26:02.380
if you've got the major funders of the Democrat Party there,
00:26:08.040
You've got Democrats who will posture in hearing.
00:26:18.220
this was a strange hearing in that many of the senators,
00:26:29.160
Most of the Democrats just gave a seven-minute speech.
00:26:39.960
But they didn't actually ask very many questions
00:26:49.920
We're going to maybe let them rough you up a little bit.
00:27:01.560
All of them are hitting a roadblock with Charles Schumer.
00:27:05.600
And Schumer, I think, feels no pressure to do anything.
00:27:10.820
and a lot of you are trying to get your finances in order.
00:27:16.020
Interest rates have dropped and are now in the fives,
00:27:20.780
If you have been buried in high interest credit card debt,
00:27:25.220
American Financing can help you access the cash in your home
00:27:32.860
Last year, their salary-based mortgage consultants
00:27:35.100
helped customers save an average of $854 a month.
00:27:45.980
you may be able to delay two mortgage payments.
00:28:21.580
I want to move also to another important issue,
00:28:24.460
and this is one that's starting to pick up some steam here.
00:28:29.960
and these ports that have basically been stopped