#213 — The Worst Epidemic
Episode Stats
Length
2 hours and 14 minutes
Words per Minute
156.55424
Summary
In this episode, I speak with Gabriel Dance about the global epidemic of child sexual abuse. We discuss the misleading concept of child pornography, the failure of governments and tech companies to grapple with the problem, the trade-off between online privacy and protecting children, the National Center for Missing and Exploited Children, the difficulty in assessing the scope of the problem, PhotoDNA and other tools, the parts played by specific tech companies, the ethics of encryption, sextortion, the culture of pedophiles, and other topics. Again, this episode is not a barrel of laughs, but it's an important conversation, and it's yet another PSA, so not paywalled. If you want to support the work I'm doing here, you can subscribe at samharris.org. And now I bring you Gabriel Dance, who is the Deputy Investigations Editor at The New York Times. Before working at the Times, Gabriel helped launch the criminal justice news site The Marshall Project, where he focused on the death penalty and prison and policing. And before that, he was the Interactive Editor for The Guardian, where he was part of the group of journalists who won the 2014 Pulitzer Prize for coverage of the widespread surveillance by the NSA. This is an episode that many of you will find difficult to listen to, but if you work in tech, I think you have a moral responsibility to listen to it.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast, this is Sam Harris.
00:00:24.880
Okay, the long-awaited episode on the most depressing topic on earth, child sexual abuse, otherwise
00:00:34.860
known as child pornography, in the form of its public consumption.
00:00:39.860
As many of you know, I've delayed the release of this episode for several months, it just
00:00:50.080
When is the right time to talk about this, really?
00:00:52.640
In the tech space, it was probably 20 years ago.
00:00:58.260
Anyway, this is an episode that many of you will find difficult to listen to, understandably.
00:01:06.880
If you work in tech, I think you have a moral responsibility to listen to it.
00:01:13.260
If you work at a company like Facebook, or AWS, or Dropbox, or Zoom, or any company that
00:01:21.760
facilitates the spread of so-called child pornography, you really have a responsibility to listen
00:01:30.660
to this conversation and figure out how you can help solve this problem.
00:01:36.500
As you'll hear, we've gone from a world where pedophiles were exchanging Polaroids in parking
00:01:43.000
lots to a world in which there is an absolute deluge of imagery that provides a photographic
00:01:51.100
and increasingly video record of the rape of children.
00:01:56.140
And as you'll hear, the tech companies have been terrible at addressing this problem, and law
00:02:03.800
enforcement is completely under-resourced and ineffectual here.
00:02:10.000
Now, as I said, I recorded this conversation some months ago. As an indication of how long
00:02:15.060
ago, when Zoom came up in the conversation, I felt the need to define it as a video conferencing
00:02:21.320
tool used by businesses. I've since cut that. But everything we discuss is all too current.
00:02:29.580
In fact, the problem has only gotten worse under the COVID pandemic, because the children
00:02:36.780
being abused are more often than not at home with their abusers, and the people who consume
00:02:44.460
this material are at home with much less to do. So both the supply side and demand side of this
00:02:53.040
problem have increased. I will add a short afterword to mention a few things that the government is
00:03:01.180
now doing, but nothing of real substance has changed, to my knowledge. Today I'm speaking with
00:03:08.640
Gabriel Dance. Gabriel is the Deputy Investigations Editor at the New York Times,
00:03:14.460
where he works with a small team investigating technology, from the topic at hand, online sexual
00:03:22.080
abuse imagery, to the companies that trade and sell our data, and this business model that's
00:03:27.420
increasingly known as surveillance capitalism. Before working at the Times, Gabriel helped launch
00:03:33.040
the criminal justice news site The Marshall Project, where he focused on the death penalty and
00:03:39.240
prison and policing. And before that, he was the interactive editor for The Guardian,
00:03:44.460
where he was part of the group of journalists who won the 2014 Pulitzer Prize for coverage of the
00:03:49.720
widespread secret surveillance by the NSA. In this episode, I speak with Gabriel about the global
00:03:55.960
epidemic of child sexual abuse. We discuss the misleading concept of child pornography, the failure
00:04:03.000
of governments and tech companies to grapple with the problem, the trade-off between online privacy and
00:04:08.460
protecting children, the National Center for Missing and Exploited Children, the difficulty in assessing the
00:04:14.800
scope of the problem, photo DNA and other tools, the parts played by specific tech companies, the ethics of
00:04:23.120
encryption, sextortion, the culture of pedophiles, and other topics. Again, this episode is not a barrel of laughs,
00:04:32.340
but it's an important conversation, and it's yet another PSA, so not paywalled. If you want to support
00:04:40.980
the work I'm doing here, you can subscribe at samharris.org. And now I bring you Gabriel Dance.
00:04:48.200
I am here with Gabriel Dance. Gabe, thanks for joining me.
00:04:59.940
So, undoubtedly, in the intro to this, I will have prepared people to not listen to the podcast if
00:05:11.140
Right. But I guess I should just reiterate here that we're going to speak about probably the most
00:05:18.640
depressing topic I can think of. The real gravity of it tends to be concealed by the terms we use
00:05:25.800
to describe it. So, we're going to talk about, quote, child pornography and the exploitation of
00:05:32.120
children. And yet these phrases can conjure images for people that are not, that don't really get at
00:05:40.360
what's going on here. Because they can remind people of things like, you know, they're teenagers
00:05:46.460
who get into the porn industry before their 18th birthday, right? And that gets found out. Or,
00:05:52.300
you know, teenagers send naked photos or videos of themselves to one another, or even with strangers
00:05:58.920
online, and these images get out. And, you know, all of that gets binned into this category of child
00:06:04.720
pornography. But at the bottom of this morass that you and I are going into, we're talking about the
00:06:12.160
rape and torture of young children, either by family members, or caregivers, or by people who have
00:06:19.120
abducted them. And obviously, I'm going to want to know from you just what the scale of this problem
00:06:24.720
actually is. But then we're talking about a vast audience of people who are willing to pay to watch
00:06:32.240
these children raped and tortured. Because they find the rape and torture of children to be the
00:06:38.340
sexiest thing in the world, apparently. So psychologically and socially, we're just in a
00:06:43.820
horror movie here. And people need to understand that's where we're going. And, you know, you can pull
00:06:50.180
the ripcord now if you don't want to go there with us. So that's a fairly grave introduction to you,
00:06:56.620
Gabe. But you have been covering this topic for the New York Times in a series of long and very
00:07:04.060
disturbing articles. So, you know, welcome to the podcast. And thank you for doing the work you're
00:07:09.500
doing. Because, I mean, but for your articles, I really, again, would have just the vaguest notion
00:07:14.840
of what appears to be going on in our world and even in our own neighborhoods. So thank you for
00:07:22.580
lifting the lid on this horror show because it can't be especially fun to be doing this work.
00:07:28.340
Well, thank you. Thank you for having me on. And I really appreciate you starting with an introduction
00:07:34.780
that discusses the terminology around this horrible, horrible crime. We, I also didn't know anything
00:07:46.320
about what I came into this calling child pornography. And our investigation started in
00:07:54.160
February of 2019. And it was pretty quick that we learned, I investigated this with a colleague of
00:08:05.340
mine, Michael Keller, primarily. And it was pretty quick that we learned the proper terminology
00:08:11.080
used by people in the industry and law enforcement is child sexual abuse material.
00:08:19.620
And I think for the purposes of this podcast, it'll be easier to refer to this as CSAM, which is the
00:08:28.240
easier way of referring to it and not constantly using what I think is the inaccurate and inelegant term,
00:08:36.700
child pornography. Maybe let's just linger on the terminology for another minute or so, because
00:08:42.520
it really is one of those terms that really just reliably misleads people. So another example of this
00:08:49.260
is people talk about male circumcision and female circumcision, right? As though the term circumcision were
00:08:56.900
interchangeable in those phrases, right? And so that's, so this is a social issue that is
00:09:02.740
is being obfuscated by, by some common words. And so, I mean, just to, to give people a sense of,
00:09:11.420
of what should be obvious, but strangely isn't, let's just consider how different this is from normal
00:09:17.300
pornography, because there's a lot that could trouble us and perhaps should trouble us about
00:09:23.120
normal pornography. It's, you can ask questions like, how did these women in particular find
00:09:29.520
themselves in a situation where they're performing sex on camera? I mean, are they getting paid?
00:09:35.160
How much are they getting paid? Are some of them actually not getting paid and being exploited or
00:09:39.700
even coerced? Are they private videos that, that were meant to be kept private that just got leaked?
00:09:46.000
Is there some backstory of suffering that would make the average person feel terrible about watching
00:09:51.740
what purports to be a, um, a video of consenting adults having sex? So, I mean, these are totally
00:09:58.120
reasonable questions to ask, but it's also understandable that most people don't really
00:10:03.240
think about these things when they're watching normal adult pornography, because human suffering
00:10:10.860
isn't being directly shown on the video. I mean, even if it's edgy porn, I mean, who knows,
00:10:17.200
it could be horrendous stuff out there that I can't imagine, but normal pornography, even edgy
00:10:23.120
pornography is within its frame, it seems to be the work of consenting adults doing something they
00:10:30.820
want to do for whatever reason. But anything involving kids does not function by that logic at all,
00:10:36.400
right? So any image or video of an adult having sex with a five-year-old is simply the record of a crime,
00:10:44.880
right? Just full stop. And it is obviously a crime to anyone watching it. And yet, you know,
00:10:53.460
as appalling as it is that these crimes occur, it's almost more appalling that there's a vast market
00:11:00.500
for them. I mean, I'm prepared to believe that, you know, one guy in a million is going to abuse
00:11:06.420
his stepchild, right? But the idea that there are millions and millions of people with whom this
00:11:14.500
person could find an online dialogue and sell them the video record of this abuse. I mean,
00:11:21.560
that's just completely shocking to me. And the scale of it is completely shocking as you report.
00:11:26.840
So let's just, let's talk about the nature of the problem. What is, what's going on and how much of it
00:11:34.680
is out there? Well, I think you're absolutely right to draw a distinction between what we call
00:11:41.340
adult pornography and what the misnomer child pornography and what you said several times
00:11:48.680
hit the nail on the head, which is consent. I mean, these are, as you said, children. I mean,
00:11:57.600
even if we take away and we can come back and speak about self-produced material by teenagers,
00:12:03.060
maybe, or 17 year olds who might be engaging in sexual acts on film before turning the legal age.
00:12:12.360
These are not what we are discussing in the majority of our reporting. We are talking about
00:12:17.900
prepubescent acts of sexual crimes against children. There is no consent. They are unable to consent.
00:12:26.640
And there's no, there's no illusion of consent. You have to get your head around the depravity
00:12:33.420
of the audience here. And again, I mean, this is going to sound very judgmental, you know,
00:12:38.460
let's bracket for a moment, some kind of compassionate and rational understanding of pedophilia
00:12:44.060
that we might want to arrive at. It should be obvious that no one chooses to be a pedophile
00:12:49.480
or, you know, anyone who finds this imagery titillating. But there's one example in one
00:12:56.180
of your articles that this is not an especially lurid description of the crime, but just the
00:13:01.580
details give you a sense of how insane all this is. So this is lifted from one of your articles.
00:13:08.320
In a recent case, an offender filmed himself drugging the juice boxes of neighborhood children
00:13:13.380
before tricking them into drinking the mix. He then filmed himself as he sexually abused
00:13:18.240
unconscious children. So that's part of the titillating material for this audience. The
00:13:24.720
imagery, the video of this guy putting whatever narcotic he used into juice boxes and feeding
00:13:32.680
it to unsuspecting children and then performing sex acts on them. The criminality of this and
00:13:39.700
the evil of it is absolutely on the surface. The details are mind boggling.
00:13:46.100
There are definitely, I mean, a variety of extremely depraved things that may come up
00:13:52.960
in our discussion that I have learned are extremely hard for people to hear. They were hard for me to
00:14:00.540
even begin to comprehend when I was learning these things from law enforcement, from survivors,
00:14:06.060
from child advocates. I mean, the one you described was actually an example given by
00:14:13.740
special agent Flint Waters, who at the time was a criminal investigator for the state of Wyoming.
00:14:20.100
He was appearing before Congress when he was describing that video. And that was in 2007,
00:14:29.340
actually, before this crime had exploded in the way that it has. I mean,
00:14:38.320
for reference, in 2007, and I'm sure we'll get more into the total numbers, but there were fewer than
00:14:46.820
100,000 reports of online child sexual abuse material. In 2019, we just published a story on
00:14:57.460
this past Friday. In 2019, there were almost 17 million reports. So the explosion in content being
00:15:09.160
found is staggering. And to talk a little bit, I mean, the examples are all horrendous, hard to hear,
00:15:18.820
harder to imagine, nothing you want to think about or read about. But just to kind of take it to
00:15:25.460
the extent that we've learned is what's going on, there's also an active community engaged in committing
00:15:33.960
sexual crimes against what they call, the criminals, pre-verbal children, which is to say children who
00:15:42.300
cannot speak yet. And that means, obviously, usually children younger than two, younger than one,
00:15:50.520
instances of children days and months old, being molested, raped, being raped, filmed being raped.
00:15:59.520
And it is truly beyond shocking. And as we started to speak with the people who regularly engage with
00:16:08.940
this content, their lives are forever changed. Anybody who deals with this issue cannot get it out of their
00:16:18.900
minds. And it really speaks to why it has become such an interesting issue when it comes to law
00:16:26.600
enforcement and the Department of Justice and tech companies and these very interesting new privacy
00:16:34.240
issues and some of the other things that arise, that naturally come out of this subject.
00:16:39.500
Yeah. So I want to talk about the scale of the problem insofar as you understand it and how
00:16:44.820
ineffectual the government and the tech companies have been thus far dealing with it. But just to
00:16:50.660
talk about your experience for a moment, how do you go about reporting on this? And in the course of
00:16:57.840
reporting, are you exposed to any of this material? Or do you actually, can you do all your reporting
00:17:03.420
without feeling the same kind of psychological contamination that the law enforcement people you
00:17:09.960
speak with experience? Great question. And one that also we had no idea going into this. So
00:17:18.060
it might be helpful if I talk a little bit about how we stumbled into this subject and then how we
00:17:24.900
learned how to report on it. So I've been working here at the Times investigating tech companies for
00:17:31.820
several years now. And that has been everything from bots and fake followers, Cambridge Analytica,
00:17:40.080
data deals that Facebook has struck. So I've been immersed in this field along with several colleagues
00:17:45.720
where these mammoth companies are tracking all sorts of things about you, web pages you like,
00:17:51.700
who your friends are, where you are, using this data to target you, things I know that you've discussed
00:17:57.260
at length with many of the people you've had on the show. But still, I felt often both in conversations
00:18:04.400
with editors here, as well as people outside the building, that I was having difficulty gaining
00:18:11.240
traction on issues surrounding privacy online and how important and how high the stakes are.
00:18:19.980
And so I started asking the small team I work with questions about, you know, what kind of actual
00:18:26.000
harm can we show? Because many people would argue that, whether it be Facebook or any other company
00:18:31.140
violating our privacy by sharing our data with another company or selling our data or whoever
00:18:36.700
might be doing what with our information, many would argue that the harm is in violating our privacy.
00:18:44.020
But that is still an abstract concept for many. And especially sometimes in a place like a news agency,
00:18:52.260
harm, when I'm working with people like Megan Twohey and Jodi Kantor, who are investigating Harvey Weinstein and
00:19:00.600
crimes against women, and there's tangible harm there. And there's harm for some of my colleagues
00:19:07.080
investigating what's going on in Myanmar and the Facebook disinformation and people dying from that.
00:19:13.580
I mean, there's harm there. And so gaining traction around online privacy and harm was something that
00:19:20.040
I was looking for, what topic is really going to bring this to a point where people can start having,
00:19:28.020
it was like a fast forward, right? I wanted to short circuit this conversation about privacy online
00:19:32.880
to a point where we could actually begin discussing it in a way that has very, very real harm and
00:19:44.600
But here you're talking about the flip side of this, which is our commitment to maintaining privacy at all costs,
00:19:52.680
if we ever, you know, achieve that, you know, the full encryption of Facebook Messenger, for instance.
00:19:59.180
One of the knock-on effects of that will be to make these crimes more or less undiscoverable.
00:20:07.280
Absolutely. Absolutely. And we'll come back, I'm sure, to encryption and some of the potential,
00:20:13.160
I don't know, solutions for people's privacy and very, very high stake decisions for children
00:20:21.560
suffering this abuse and people trying to rid the internet of this type of material. So you're right,
00:20:28.420
I was coming at it from a privacy side. But I also knew that it was more complicated than that.
00:20:35.840
Yeah. And so we wanted to figure out where does this privacy line start actually,
00:20:42.620
like, where's the rubber meet the road? And one of the ideas was what we were calling at the time
00:20:47.840
child pornography. And that was not only because of the privacy thing, but we were also talking about
00:20:52.800
what has technology done over the last 10 years that has completely changed the world. And one of
00:21:00.220
those things is the ability to create and share imagery. I mean, look at Facebook, look at Instagram,
00:21:06.000
all of these different types of social platforms and other things that have YouTube that have spun up.
00:21:12.400
I mean, so much more images and videos are being created and shared and stored, etc. That we,
00:21:20.780
it was just a hunch. I mean, what's going on with child pornography? And A, nobody wants to talk about
00:21:26.320
it. So as an investigative reporter, that is actually helpful when you encounter a subject
00:21:32.480
that really nobody wants to touch. But the second thing that happened, the second thing that happened
00:21:37.160
that also... I want to return to that point, though. I don't want to derail you, but we have to
00:21:43.160
return to why people don't want to talk about this and the consequences of that. Absolutely. Absolutely.
00:21:50.120
But the second thing that came in, which actually, in its own way, interestingly ties back to the
00:21:55.400
encryption discussion and everything, is the New York Times has a tip line that I actually helped set
00:22:00.960
up in 2016. And this tip line has multiple ways people can send us information. Some of those ways are
00:22:08.500
encrypted. Some of those ways are regular emails. Some of them are through the paper mail. And we
00:22:15.480
received a tip from a man. And I believe it just came in over email. I don't think he was concerned
00:22:21.600
with protecting his own identity. And this tip said, look, I was on Bing, Microsoft Bing search engine.
00:22:28.740
And I was looking up bullet weights. So literally the weight of bullets, which I understand are measured in
00:22:37.680
grains. And I'm not going to say the specific term he was looking up, or that he was actually looking
00:22:45.020
at bullets, but a certain weight of bullet. And he said, you know, I typed this in and all of a sudden
00:22:50.660
I'm seeing images of children being sexually assaulted. And I've reported this to Bing and it's been
00:23:02.260
days or weeks. And they're still there. And I don't know what to do. So I'm telling you.
00:23:08.160
And so we had already been thinking about this issue. And here in mid-February, we get this tip.
00:23:14.780
And I asked my colleague, luckily, as the small team leader of this technology investigations team,
00:23:22.980
I'm sometimes able to pass a tip on and ask one of my fellow reporters to try to run it down.
00:23:29.260
And so in this instance, I was happy to do that. And in this instance, it was Mike Keller. And I said,
00:23:35.360
Mike, check it out. You know, do me a favor and check it out. So Mike writes me back maybe half an
00:23:42.080
hour later and says, yeah, I typed in the exact terminology that the guy sent in. And I only looked
00:23:49.960
at it for half a second, but there were definitely very disturbing images that came up. And so we were
00:23:58.580
shocked, first of all. But second of all, we immediately reached out to our head legal counsel
00:24:04.860
at the New York Times. And there's a lot of benefits for work in the New York Times. But one of really
00:24:11.000
the best things is that we have excellent legal representation in-house. In this case, it's David
00:24:18.520
McCraw, who's relatively famous in his own right for his dealings, both with President Trump, as well
00:24:24.380
as many other people, Harvey Weinstein, et cetera. And David says, look, it is extremely important
00:24:32.040
that you both understand that there is no journalistic privilege when it comes to child
00:24:36.600
pornography. And he sent us the statute and he sent us some news stories where reporters had in fact
00:24:44.120
gotten in trouble for reporting on this subject. And so what we had to do immediately, because Mike had
00:24:50.080
in fact seen images, is report those images to the FBI and the National Center for Missing and Exploited
00:24:57.560
Children. Because not many people know this, I don't think, but it is one of the only crimes,
00:25:05.240
if not the only crime, that you have to report if you see. I mean, you don't have to report a murder
00:25:12.800
if you see it. But if you see an image of child sexual abuse, you have to report it or you are
00:25:20.440
breaking the law. And that stands for everybody. So we filed a report with the National Center and we
00:25:27.380
filed a report with the FBI. And we then began embarking on this investigation, knowing, first of
00:25:35.760
all that, A, we did, of course, we did not want to see any of this material. But B, if we did see it,
00:25:42.980
we had to report it. And along the way, we even received emails from the FBI saying, hey, reminder,
00:25:52.780
you're not allowed to collect this information. You're not allowed this material. You're not allowed
00:25:57.880
to look at this material. You are, there is nothing you can do around this that is legal, which really
00:26:03.760
did cause a complicated reporting process. That's interesting. It somehow seems less than
00:26:10.860
optimal, but it's also understandable. I don't know. I mean, what do you think? Do you think
00:26:14.980
they have the dial set to the right position there or should, or would there be some better way to
00:26:21.000
facilitate your work, or whatever role you as a journalist can play in solving this problem?
00:26:28.540
I mean, I think probably it's in the right spot, to be honest. I think that while it was difficult,
00:26:36.540
it was, I mean, there are, we reviewed hundreds and thousands of court documents. And these court
00:26:43.560
documents include search warrants and complaints and a variety of other things. And when you have a
00:26:51.160
search warrant, so when an investigator, let's say, based on a tip from the National Center or based on
00:26:58.260
investigative undercover work, discovers somebody with this type of material on their system,
00:27:06.280
they file a search warrant. And when they file the search warrant, they have to describe probable
00:27:11.740
cause. And this probable cause nearly always is descriptions of a handful of the photos and videos.
00:27:20.000
And speaking, I've been speaking with a variety of both advocates and people involved. And while I
00:27:26.860
have been personally lucky enough to have never, ever seen one of these images or videos, I've read
00:27:32.880
descriptions of hundreds, if not more than a thousand. And it is a terrible, terrible, terrible thing to
00:27:40.580
read. And some people have said reading it is worse than seeing it. Now, I don't know. And I can't make
00:27:45.860
that comparison. But I don't feel like I would gain much in the reporting process by actually seeing
00:27:53.800
these things. I mean, just as you, you've only read them in our reports. And I'm sure that's even
00:28:03.640
more than enough for you to understand the gravity. And so I don't see what would be helpful in my being
00:28:10.960
able to see them in any kind of journalistic privilege. And I think that would also likely
00:28:19.900
Yeah, I guess the only analogy I can think of is the ISIS videos, you know, the decapitation videos
00:28:25.860
and the other records of their crimes, which, you know, journalists have watched and anyone can watch.
00:28:31.120
And, you know, I've spent a lot of time, as you might know, railing about the problem of jihadism.
00:28:35.580
And, you know, I'm just aware that I, to know, you know, how bad ISIS was, I'm reliant on people
00:28:45.080
who are paying, you know, firsthand attention to their crimes. You know, someone like Graeme Wood
00:28:50.160
over at The Atlantic is actually watching these videos and confirming that they're as bad as is
00:28:55.320
rumored. So I don't have to. And so essentially, you have the cops doing that work for you. It seems,
00:29:02.400
I can't imagine the information is getting lost or corrupted there, given that there's so much of
00:29:07.840
it. But it just would be odd if someone like Graeme, in the process of, you know, writing his
00:29:14.300
articles on Abu Bakr al-Baghdadi and ISIS and our misadventures in the Middle East, had to, at every
00:29:21.620
turn, worry that he could be thrown in jail for having discovered an ISIS video online. That seems like
00:29:29.000
an extra piece of overhead that he doesn't need to do his work. Yeah, it was, I mean, it was nerve
00:29:33.980
wracking. It was uncomfortable. And again, I mean, we did every single bit of our reporting consultation
00:29:40.180
with our lawyers. And we were also in close contact with the FBI, the Department of Justice,
00:29:47.720
you know, local law enforcement throughout the country, international agencies dealing with this.
00:29:53.680
So that doesn't provide any cover, certainly. But I was hoping, or I hope that it raised flags
00:30:00.840
everywhere to say like, you know, because I was Googling some pretty crazy terms at points, trying
00:30:07.420
to learn about this issue. And I mean, if you Google just child pornography on Google, literally search it
00:30:16.360
on Google, they will return messages telling you that this is an illegal thing to look for, providing
00:30:23.960
resources if you're inclined to look at this type of material. I mean, there is active messaging around
00:30:31.560
people looking for this type of imagery. So I wanted to make sure I didn't end up on some list
00:30:38.240
somewhere, which I, which I, I hope I'm not on. But basically, we wanted to make kind of as much
00:30:44.920
noise as we could as investigative reporters, we're not trying to tip other people off that
00:30:48.520
we're doing this story. But so that law enforcement knew that we were actually trying to engage in this
00:30:54.860
in a real journalistic way. And, and that there wasn't any sort of anything else going on.
00:31:02.040
Okay, so what do we know about the scale of the problem? You know, you explain that at one point
00:31:07.760
in 2007, we had 100,000 reports. I'll remind people, 2007 was, was a time when we were all online.
00:31:16.500
That's not 1997. You know, 2007 is certainly well into the period where the internet has subsumed
00:31:23.740
all of our lives. So that you have 100,000 reports then, and now we're up to 18, 19 million reports.
00:31:31.660
But how much of this is just more looking and finding the ambient level of abuse that was always
00:31:37.360
there? Or how much do we know about the growth of the problem? Because it seems like the judging
00:31:44.440
from your articles, the, the reporting around the issue is increasing something like exponentially,
00:31:51.440
whereas the number of arrests, and the amount of resources being put towards solving the problem
00:31:56.280
are more or less flat, which is a terrible juxtaposition there. So what do we know about how big a problem
00:32:02.920
this is? Well, it's the right question. And unfortunately, I don't think I'm going to have
00:32:07.080
a completely satisfying answer. And part of that, everything around this subject in some way goes
00:32:15.220
back to the fact that nobody wants to talk about this subject. And so there isn't a lot of transparency
00:32:21.900
for a variety of reasons, whether or not it's the federal government not keeping the records and
00:32:28.040
reports that they should be, whether it's the lack of transparency from the National Center,
00:32:32.860
which is responsible for the collection and serves as the clearinghouse for this type of imagery,
00:32:38.220
or a variety of other things. So, so I'm not going to have, I can't answer your question completely,
00:32:42.900
but I can give us some idea. And so the tip line, the cyber tip line is run by the National Center for
00:32:50.780
Missing and Exploited Children, commonly referred to as NCMEC. And so NCMEC started the tip line
00:32:57.900
in 1998 when people started becoming aware of kind of what you were saying, like 97, 98, people are
00:33:05.080
coming online and law enforcement and Congress, other leaders are realizing that child sexual abuse
00:33:15.380
imagery is also coming online. And the internet was the biggest boon to child sexual abuse imagery
00:33:22.160
since the Polaroid camera. And so let's just spell that out for people that can't do the, um, the
00:33:30.500
psychological math so quickly there. So that the significance of the Polaroid camera was that
00:33:34.800
you didn't have to figure out how to get your film developed by a stranger anymore. You could just
00:33:41.760
develop it yourself. And that took a lot of friction out of the system of documenting the abuse of
00:33:48.300
children because unless you had a darkroom, it was kind of mysterious how people
00:33:54.740
could produce a ton of this material in the first place. Right. And, and, you know, according to the
00:33:59.680
law enforcement we've spoken with in the eighties and nineties, I mean, they were pretty comfortable.
00:34:05.100
I mean, eighties and early nineties before the advent of the internet, they were, they were pretty
00:34:09.480
comfortable saying that they were, they had a good handle on this problem and they were actually like
00:34:13.640
stomping it out. I mean, child pornography, child sexual abuse material used to really be the domain
00:34:19.260
of, uh, law enforcement in the U.S. Postal Service because that is how it was traded. It was traded in
00:34:25.960
parking lots. It was mailed. And that is how the majority of it was caught and detected. But with the
00:34:33.660
advent of the internet, and this is, again, this is even before digital cameras for the most part,
00:34:40.000
I mean, certainly cell phones. So they opened this tip line in 1998. In 1998, they received 3,000,
00:34:48.560
just over 3000 reports of what the legal term is child pornography, which is also why it's a bit
00:34:55.040
confusing when, when talking terminology, most of the laws refer to it as child pornography. So there's
00:35:02.820
just over 3,000 in 1998. By 2006, 2007, we're at 83,000 reports, 85,000. And then something happens
00:35:18.500
and nobody can say with certainty, but the numbers start exploding with the invention of the smartphone.
00:35:29.100
The iPhone's introduced in 2008, a bunch of other phones also start to be produced that have
00:35:37.180
high quality cameras, broadband connections. And so by about 2015, actually 2014, we break a million for
00:35:49.480
the first time. And it's a big jump. 2013, there's less than half a million reports. 2014, that number
00:35:56.640
doubles. 2015, that number quadruples, we're over 4 million reports. And by 2018, we're at 18 and a half
00:36:06.700
million reports. So the numbers are growing exponentially. And but there's something we need
00:36:15.300
to tease apart here, which is there are reports to the National Center. And the vast majority of these
00:36:21.260
reports, more than 99% come from what they call electronic service providers: Facebook, Twitter,
00:36:29.340
Google, etc. But each report can contain a number of files. So this is not a one to one. So when
00:36:39.140
there's 18 and a half million reports in 2018, that does not mean there was 18 million pieces
00:36:45.660
of content found. In fact, there were 45 million pieces of content found in 2018. And it was about
00:36:53.400
split between images and videos. And we'll certainly come back to the discussion of videos
00:36:58.340
because there's something startling going on there. But the numbers that we just published a few days
00:37:04.100
ago, for the 2019 numbers, really start to tease apart these differences between reports and numbers
00:37:11.300
of files. So in 2019, for the first time in over a decade, the number of actual reports went down.
00:37:21.620
So the number of reports received by the National Center in 2019 was just shy of 17 million. So we're
00:37:29.520
looking at a drop of about one and a half million. And we can talk about why that happened in a minute.
00:37:35.880
But the number of files reported in 2019 was just under 70 million. So we've gone from 45 million in 2018
00:37:47.320
to 70 million in 2019. And again, as recently as 2014, that number was less than 3 million.
00:37:56.620
So I want to talk about why this is so difficult to even focus on and what explains the failure of
00:38:05.600
our response thus far. But I don't want to lose the thing you just flagged. What's the distinction
00:38:11.320
between still images and videos that you wanted to draw there?
00:38:16.140
The thing that we've seen, so the rise in all photos and videos detected, and we should very much
00:38:23.340
get to that, which is the fact that these are only known images and videos that they are detecting.
00:38:28.320
The systems they have to catch this content are trained to match only images and videos that have
00:38:36.740
been identified previously as illegal material. So we're not talking about new material almost
00:38:43.220
whatsoever. This is, in near completeness, previously seen images and videos.
00:38:48.880
But to speak specifically to videos, the technology for detecting video child sexual abuse is nascent
00:39:01.220
compared to image abuse. And for that reason, they've detected as recently as 2017, there were
00:39:11.420
only three and a half million videos reported to the National Center as compared to 20 million
00:39:17.200
images. Last year, there were 41 million videos reported as compared to 22 million. No, I'm sorry,
00:39:27.660
27 million images. So I know these are a lot of numbers, but what we're seeing is videos are
00:39:32.940
exploding. Well, the number of videos detected. And that's almost wholly due to Facebook. And Facebook
00:39:42.400
started scanning aggressively for videos in late 2017. And by 2019, they were responsible for by far the
00:39:55.880
majority of video reports. I think they were responsible for 38 million of the 41 million videos
00:40:01.880
reported. So the numbers are rising. The reports we'll come back to in a second, but the numbers of files
00:40:08.820
and videos are rising. But as to your initial question, what does this tell us about A, how much
00:40:15.560
content is online, and B, how much is being produced? It tells us nothing about either of those for a few
00:40:24.980
reasons. Not nothing, but it paints a very incomplete picture. The first reason is, as I said, they're only
00:40:31.000
detecting previously identified imagery, which means they're not detecting anything that is being newly
00:40:38.340
created. That process is a very slow process to get added to these lists of previously identified
00:40:45.120
imagery. It is because of funding issues and a variety of other things. The list of previously
00:40:52.380
identified imagery is growing very slowly. But the number of recirculating images and videos
00:40:59.320
is as high as ever. So we don't know a lot about how much new content is being produced.
00:41:06.360
And we also don't know, because of that, we don't know if this problem is, as you said,
00:41:13.460
always been there. And we're just finding it because more companies are actively looking for it,
00:41:19.360
or if it's actually growing. Now, conversations with law enforcement amongst others say that the
00:41:27.060
problem is growing. And even common sense, as I said, with cell phones, broadband, cloud storage,
00:41:33.660
social media. I mean, the internet is built to share videos and content and files. There's platforms
00:41:40.400
that are billion-dollar platforms completely dedicated to this. The fact that we don't know
00:41:48.040
exactly how much is out there is evident in Facebook being responsible for 90% or so of all
00:41:58.480
reports. And other companies, we're not sure exactly. The whole industry, certainly before our
00:42:06.180
reporting, and still to a certain extent, was very cloaked in secrecy. And people were happy for that to
00:42:14.020
be the case, because nobody wanted to ask. Yeah, well, I want to talk about what the tech companies
00:42:19.620
are doing and not doing. But one wonders whether Facebook is responsible for much of the problem,
00:42:28.840
or just given their scale, given that they've got 3 billion people on the platform, and given the
00:42:34.860
sophistication of their tools that allow them to find the problem to the degree that they do that,
00:42:40.800
it's hard to know whether we're just penalizing them for looking and discovering the enormity of
00:42:47.080
the problem on their side. But you would have a similar problem anywhere else you looked if you
00:42:52.200
deployed the tools on any of these other platforms, whether it's Dropbox or Tumblr or any of these other
00:42:58.520
companies you mentioned in your articles. That's totally right. And I actually want to make sure
00:43:03.720
that it's clear that I don't think Facebook should be penalized for having the highest number of
00:43:10.580
reports. I mean, there's a lot of nuance around the number of reports. And for example, we
00:43:18.300
fought tooth and nail with the National Center for them to disclose the number of reports by company in
00:43:24.380
2018. And they would not do it. And none of the other tech companies would disclose it either.
00:43:32.100
All we knew was that there were 18 and a half million, nearly 18 and a half million reports.
00:43:36.820
We didn't know who they came from. We didn't know what companies were detecting imagery versus video.
00:43:42.580
We don't know when they were scanning for that. And there was a variety of reasons for that. But the
00:43:46.800
biggest reason, there were two biggest reasons. One is the National Center for Missing and Exploited
00:43:51.740
Children is actually a private nonprofit. And that has come under judicial review. And we can talk
00:43:59.240
about that more later if we want. But what that provides them is we are not able to file Freedom
00:44:05.320
of Information Act requests to receive information from them. So even though they're sitting on the
00:44:11.740
canonical database of child sexual abuse reports, that's a federal crime, it's an extremely
00:44:21.180
important statistic that in most instances, we would file a Freedom of Information request and be able
00:44:27.300
to learn some of the information around that big number. But we cannot file a Freedom of
00:44:34.380
Information request to NCMEC, and they would not tell us. So that was the number one
00:44:40.800
challenge. And then none of the other tech companies would tell us either. And finally, we had a source
00:44:47.180
who I can't disclose, come to us and say, Look, you would not believe the number that is
00:44:57.000
coming from Facebook. And, you know, long story short, we found out that the number just from Facebook
00:45:03.740
Messenger was 12 million. After we reported that number, the federal government had a conference or a
00:45:10.440
presentation, they said that it was in total 16 million from all parts of Facebook. And at first at
00:45:17.240
first blush, you think, damn, I mean, Facebook is absolutely riddled with this content. Now, let me be
00:45:27.600
clear, any company online that has images or videos is infested with this content. That is just the case.
00:45:38.420
So Facebook does not stand alone in having this issue. The very interesting part about those numbers is that
00:45:48.280
they actually reflect Facebook taking an extremely aggressive stance in looking for this imagery. I mean, they're
00:45:56.660
scanning every photo that gets uploaded. Since late 2017, they're scanning every video that gets uploaded. And
00:46:03.300
they're aggressively reporting it to the National Center. So those very high numbers actually reflect a very
00:46:09.640
conscientious effort to find and remove this imagery. I mean, we spoke with Alex Stamos, who is a former, I
00:46:18.940
think, chief security officer for Facebook. He also held the same position at Yahoo. And I mean, he said that
00:46:27.520
if this, if other companies were reporting the same way that Facebook was reporting, we wouldn't have,
00:46:34.240
you know, 16 million reports last year, we'd have 50 or 100 million. So Facebook actually, and when we
00:46:42.820
can come back to Facebook Messenger, because that's where things get interesting with Facebook. But I think
00:46:47.980
by any measure, Facebook is actually an industry leader when it comes to finding and reporting this content.
00:46:54.520
Right. I know that people hearing this are going to feel, once they absorb the horror of it, they
00:47:04.420
will feel somewhat powerless to do anything to help solve the problem. And so one question I was going
00:47:10.820
to ask you at the end is, you know, are there any nonprofits that you recommend we support who are
00:47:17.200
working on the front lines of this? But, and so you just said something somewhat equivocal about the
00:47:23.540
National Center, which is really at the center of this, and they're a nonprofit. What, what do you
00:47:28.100
recommend people do here? I mean, is there, should we be giving money to the National Center for Missing
00:47:34.540
and Exploited Children? Or is, is there some better option for people who want to help here?
00:47:40.420
Sure. It's a, it's a good question. We, you know, I'm generally not in the business of,
00:47:45.620
it's lucky as a reporter. A lot of my problem is, or a lot of my job is pointing out problems and
00:47:52.640
not necessarily finding the solution to them. But I do think the National Center is full of great
00:47:59.980
people. Really? I mean, you can't work on this and not be a compassionate person. This is a labor of
00:48:10.280
love that these people are doing. That said, there are definite issues. I mean, the fact that it is a, it has
00:48:17.740
this, you know, quasi governmental status that has come up, you know, Justice Gorsuch, when he was a judge
00:48:26.540
in the 10th Circuit, ruled that the National Center was in fact, a part of the government. They get 75% of their
00:48:33.880
funding in general from the federal government. That's about $30 million a year. But at the same
00:48:40.580
time, they are absolutely overwhelmed. I mean, this, this problem is overwhelming the National Center.
00:48:46.980
It's overwhelming law enforcement. It's overwhelming a lot of tech companies. So, so, you know, while it's
00:48:53.380
complicated, I do think that their heart absolutely is in the right place and their efforts are in the
00:48:59.120
right place. They're just behind. They're really, they're, they're behind. Now you could give money
00:49:04.660
to them and that would be good. There are other nonprofits that also are doing great work. The
00:49:09.700
Canadian Center for Child Protection, who we reported on and who is one of the leaders in starting this
00:49:18.020
idea of what they call naming and shaming tech companies. Because of the, the cloak of silence that's
00:49:24.020
been around this, because we haven't been able to hear what are you actually doing to combat this
00:49:28.800
problem. The Canadian Center has taken the lead in trying to push that process forward there. You
00:49:34.800
could donate money there. You can donate money to Thorn, which is also a nonprofit that is developing
00:49:41.400
software for smaller companies, which is a challenge. If you're a smaller company, building these kind of
00:49:46.720
systems to scan and detect content is expensive. And they're sometimes unable to do that.
00:49:53.200
Why wouldn't there be an open source effort to develop this technology that anyone could use? I
00:50:02.700
mean, why would there be any proprietary angle on this at all? Why wouldn't Google or Palantir
00:50:08.780
or, you know, Facebook just break off some of their expertise and say, here are the tools. This is how you
00:50:15.960
find child pornography in your databases, you know, use them freely.
00:50:20.340
Right. Again, a great question. Now, part of what's going on is, and look, Google sits on
00:50:28.600
NCMEC's board. Facebook sits on NCMEC's board. NCMEC gets in-kind donations from Palantir. They've
00:50:34.720
essentially depended in large part on Palantir and Google to upgrade their systems over the past few
00:50:40.180
years, even though they, you know, have a not insignificant sum of money coming in from the federal
00:50:46.100
government. But the detection system most commonly used is something called photo DNA.
00:50:54.280
So photo DNA was invented in 2009, which many experts would say is at least five years too late
00:51:00.280
when they knew what the problem was. But all the same, invented in 2009, it was a partnership between
00:51:06.320
Microsoft and a person named Dr. Hany Farid, who was at Dartmouth at the time, now at Berkeley.
00:51:12.360
And it is proprietary, and we'll talk about that in a second. But basically what it is,
00:51:19.180
is it's a fuzzy image matching. And by fuzzy image matching, I mean, many of your listeners
00:51:24.860
who I know are adept at technology, you can take what are called cryptographic hashes of any type of
00:51:31.720
file. And a cryptographic hash will shoot out a string of characters, and that string of characters is
00:51:37.660
unique to that file. And if any small thing changes, that cryptographic hash will change.
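A minimal sketch of the exact-match approach Gabriel is describing, using Python's standard hashlib. The hash function and the sample bytes are arbitrary choices for illustration, not anything specific to NCMEC's or any company's systems:

```python
# Illustrative only: an exact cryptographic hash identifies a file, but the
# digest changes completely the moment the file is altered in any way.
import hashlib

original = b"example image bytes"
recompressed = b"example image bytes "  # one added byte stands in for re-saving or cropping

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(recompressed).hexdigest())
# The two digests share nothing in common, which is why an exact hash list
# only catches byte-for-byte identical copies of known material.
```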
00:51:44.600
And so for a while they were using cryptographic hashes to try to match known images of child
00:51:50.420
sexual abuse. The challenge became that the people who are trading this type of imagery often are also
00:51:57.360
pretty technological and literate in different technologies. And so they knew that even if they
00:52:05.040
saved the image at a different type of compression, or if they cropped it even slightly,
00:52:12.160
that that cryptographic hash would no longer match. So photo DNA was the solution to this.
00:52:18.040
And photo DNA is, again, they call it a fingerprint, you can call it a fuzzy match. But basically,
00:52:23.480
it takes into account a lot of these minor changes that can be made to an image. So even if you change
00:52:29.580
the color a little bit, you crop it a little bit, you write on it maybe a little bit, it's still going
00:52:34.780
to catch that image. That was invented in 2009. Now, the question of why it isn't open source is a good
00:52:40.640
question. They would say that it would give people who are looking to dodge or manipulate the
00:52:50.240
system access to the algorithm, which would then allow them to find out how to do that.
00:52:55.300
Hmm. I don't know enough to say whether that's for sure the case or not. For example, Facebook last
00:53:02.360
year released an open source algorithm for video detection. Now, a couple weeks ago, I asked some
00:53:09.200
cryptologists, why would Facebook do that? And they said, well, it's probably not that good of
00:53:13.920
an algorithm, to be honest. Dr. Farid will tell you that photo DNA is not some kind of top secret,
00:53:20.460
incredibly complex thing, but they still do keep it under wraps. Now, Microsoft, who owns photo DNA,
00:53:28.120
will license that to most companies from what we understand, if they ask. Now, there's been some
00:53:34.960
movement around that lately that complicates things. But for the most part, Facebook has a license for
00:53:41.520
photo DNA. Google has a license for photo DNA. All of the big companies have licenses for photo DNA,
00:53:47.480
and they use it on their system so that they can all share this list of what they call hashes,
00:53:54.320
a hash list, in between themselves, where they fingerprint photos and take a look at it.
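To illustrate the fuzzy-matching idea described above, here is a toy perceptual fingerprint (an average hash compared by Hamming distance), where a small edit such as a slight brightness change still produces a near-identical signature. This is a generic sketch and not PhotoDNA, whose actual algorithm is proprietary; the 8x8 grid and the pixel values are made up for the example:

```python
# Toy "fuzzy fingerprint": reduce an image to a 64-bit signature, then compare
# signatures by how many bits differ. Small edits leave the distance small.
def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def hamming_distance(a, b):
    return bin(a ^ b).count("1")

known = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]  # a "known" image
edited = [[min(255, v + 6) for v in row] for row in known]       # slightly brightened copy

print(hamming_distance(average_hash(known), average_hash(edited)))  # small distance -> likely match
```

In a real deployment the comparison would be against a shared list of fingerprints of previously identified material, with a distance threshold tuned to tolerate recompression and minor crops.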
00:54:00.620
Now, that technology is being developed, unfortunately, with video, which I mentioned previously. There is
00:54:07.040
no standard, and that has been confusing to us. It remains confusing to us. The National Center has said
00:54:13.200
that they would prefer there's a video standard just the same way there is an imagery standard.
00:54:17.020
But there is no video standard. So Google has their own hash fuzzy fingerprint system for video.
00:54:24.460
Facebook has their own system. Microsoft actually evolved photo DNA to have their own system.
00:54:30.880
And the government uses a different, law enforcement uses a different system. So now all of a sudden you
00:54:36.280
have this issue of a bunch of different proprietary technologies generating different types of
00:54:43.180
fingerprints that are incompatible and no master list of these fingerprints. So there's really a rabbit
00:54:48.940
hole to go down, which is not uncommon to technology as a whole. But again, in this instance, the ramifications
00:54:59.520
Well, there's the effort of the tech companies and the effort of government. And there's something
00:55:07.660
mysterious here around how noncommittal people have been towards solving this problem. Because,
00:55:16.120
you know, as you say, in at least one of your articles, the US government has approved $60 million a year
00:55:22.720
for this problem, which is really a drop in the bucket. I mean, it's just that on its face is not
00:55:28.140
enough. But they don't even spend that amount every year. They set aside that amount of money,
00:55:35.040
but they spend something like half of it. And this is just totally mystifying to me. I mean,
00:55:40.480
if ever there were a nonpartisan issue where, you know, you could get Christian conservatives on the
00:55:47.160
one hand and progressives on the far left on the other to be equally animated about, it's got to be this.
00:55:55.840
So the issue is, let's figure out how to prevent kids from being raped and tortured for money. And let's
00:56:03.560
figure out how to hold people accountable who do this, and who traffic in this imagery. And yet, it seems that
00:56:12.440
even the money that's reserved to fight this problem isn't being deployed. How do you understand
00:56:19.120
that? It's hard to explain. But I do think that this is perhaps the right time to talk about people
00:56:26.440
just not wanting to discuss the issue. Yeah. So in 2008, people knew this was a problem. As I said,
00:56:34.040
that testimony that you had quoted earlier from Flint Waters, where he's talking about the man who gave
00:56:39.480
juice boxes to the children and then raped them. That was 12 years ago. That was 12 years ago, as of last year,
00:56:45.600
so 13 years ago now, 2007. In 2007, everybody knew this was a huge problem. And so a bill was put on the
00:56:56.240
floor by Joe Biden, Debbie Wasserman Schultz, bipartisan. I believe Cornyn was involved either at that time or at
00:57:03.700
least by 2012. As you say, it was a bipartisan issue. I think it passed unanimously. It was called
00:57:10.440
the 2008 Protect Our Children Act. And it wasn't until a few, probably like a month into our reporting
00:57:17.120
that we realized that there was legislation in order to confront this issue. And the more we dug into that
00:57:25.600
legislation, what we saw is it was pretty good. It really foresaw a lot of the issues.
00:57:31.400
But then what we saw, which was really disappointing, to put it mildly, was that many, most of the major
00:57:43.700
aspects of the legislation had not been fulfilled by the federal government. So there were three main
00:57:51.600
provisions that were not followed through on. The first, and perhaps the most consequential,
00:57:59.240
is the one you discussed, which is Congress allocated only half of the $60 million that the bill
00:58:08.460
appropriated to fight this. That money is supposed to go directly to state and local law enforcement
00:58:14.440
in order that they can deal with this problem. And we haven't even spoken about them, but the short of
00:58:21.160
it is they're completely overwhelmed. They're having to do total triage. Many of them, that means they
00:58:26.720
focus only on infants and toddlers, leaving the rest of the cases unexamined. That's true with
00:58:32.920
the FBI. That's true in LA. So you have these, they're called Internet Crimes Against Children
00:58:38.400
Task Forces, ICACs. All these ICACs are begging for money. The money has been appropriated. For the last 10
00:58:47.040
years, it stayed almost wholly at $30 million of the $60 million. I might be using appropriated wrong,
00:58:52.960
might be authorized. I'm not sure what the term is. But basically, they're allowed to give up to $60
00:58:57.440
million. They're only given $30 million. We found another thing, that the Justice Department is
00:59:03.700
supposed to produce biennial, every two years, reports on this problem.
00:59:10.880
Now, these reports are supposed to have several pieces of information. They're supposed to compile
00:59:16.260
data about how many reports, where the reports are coming from, in order that we have an idea of the
00:59:22.160
scope of this problem. And they're supposed to set some goals to eliminate it. Well, only two of what
00:59:29.320
should now be seven reports have been produced. And finally, they were supposed to have an executive
00:59:36.940
level appointee, at least by 2012, when the bill was reauthorized for the first time. There's supposed
00:59:42.300
to be like an executive level appointee, essentially a quarterback, who's in charge of this issue. That
00:59:48.520
position has never been filled with an executive-level person. It's been a series of short-term
00:59:54.380
appointees leading the efforts. And so it was stunning to see that they had foreseen this problem.
01:00:02.520
And they had actually set up a pretty good law meant to address it. And the only reason that we can
01:00:09.360
think of that these things were not followed through on is people were very happy to put the law in place
01:00:16.560
and then turn their backs. I can only chalk that up to people just not wanting
01:00:24.140
to pay any mind to this issue after feeling like they dealt with it.
01:00:29.640
It is truly mysterious. I mean, I don't know. Again, what we're talking about is a source of suffering
01:00:38.620
that is as significant as any we can think of happening in our own neighborhoods, right? This
01:00:50.400
is not happening in some distant place in a culture very unlike your own for which you, you know, the
01:00:57.740
normal levers of empathy are harder to pull, right? This is happening to, if not your kids, your neighbor's
01:01:05.560
kids and some guy down the block is paying to watch it. And it's all being facilitated by technology that
01:01:15.980
is producing more wealth than any other sector on earth, right? So you're talking about the richest
01:01:23.540
companies whose wealth is scaling in a way that normal businesses never do. And the money is not
01:01:31.920
being allocated to solve this problem. It's just, we need something like a Manhattan project on this
01:01:38.040
where all the tech companies get together and realize this is not something the government is
01:01:43.120
especially good at. Look at those Facebook hearings. And, you know, you have a bunch of geezers up there
01:01:49.640
trying to figure out what Facebook is while also trying to hold Zuckerberg to account for having broken
01:01:56.120
our democracy. And it's just a completely fatuous exercise, right? So clearly we need the best and
01:02:05.520
brightest to break off 1% of their bandwidth and wealth and figure out how to solve this problem.
01:02:14.260
Because what seems to be happening based on your reporting, correct me if I'm wrong, is that
01:02:19.420
there are troubling signs that tech is moving in the opposite direction. They're creating
01:02:27.000
technology, based on other concerns, that will make the problem harder to discover. And then
01:02:33.900
the example of this that you've written about is that Facebook is planning to fully encrypt
01:02:40.200
Facebook Messenger, which is one channel where a lot of this material streams. And if you do that,
01:02:47.740
well then Facebook will be able to take the position that, you know, Apple has taken around
01:02:53.660
unlocking its iPhone, right? Like we can't unlock the phone because not even we can get into your
01:02:59.640
iPhone. So if that person's phone is filled with evidence of crimes against children, well,
01:03:05.920
it really can't be our problem. We've built the technology so that it will never become our
01:03:10.400
problem. And there are many people who are understandably part of a cult of privacy now,
01:03:17.740
that have so fetishized the smartphone in particular and other channels of information
01:03:23.940
as, you know, sacrosanct and have to be kept forever beyond the prying eyes of government,
01:03:30.520
no matter how warranted the search warrant is, that a lot of people will line up to say, yeah,
01:03:36.440
I really don't care what might be in the Facebook Messenger streams of others or on
01:03:44.820
another person's iPhone. I do not want it to be the case that you can ever get into my iPhone
01:03:53.240
or my encrypted messaging. And I don't know how you feel about that. I mean, I think I've heard the
01:04:00.040
arguments specifically with the case of the iPhone. Frankly, my intuitions have been kind of
01:04:06.180
knocked around there such that I actually don't have a settled opinion on it. But I'm pretty sure that
01:04:12.280
if you tell me that, you know, there's somebody who we know is raping and torturing children and
01:04:17.800
we have the evidence on his iPhone, but we can't open it, 99% of my brain says, okay,
01:04:25.320
that's unacceptable. No one has an absolute right to privacy under those conditions. Let's figure out
01:04:30.900
how to open the iPhone. But many people will disagree there for reasons that, you know, in another mood,
01:04:36.680
I can sort of dimly understand. But, you know, for the purposes of this conversation, those reasons
01:04:41.820
seem sociopathic to me. How do you view the role of tech here and our looming privacy
01:04:48.920
concerns? Right. Well, it's, it's interesting to hear somebody such as yourself, who I know has
01:04:54.900
a lot of experience with many of these issues, not child sexual abuse, but privacy technology and
01:05:03.900
the tech companies. But let me go back to a few things you said, and then I'll address the
01:05:07.980
encryption bit. We were shocked to find out how many people actually are engaged or looking at this
01:05:15.340
type of material. Just one statistic or one quote I can actually give you is we were speaking with a
01:05:20.100
guy, Lieutenant John Pizarro, who's a task force commander in New Jersey dealing with this type of
01:05:25.660
content. So Lieutenant Pizarro says, look, guys, you got 9 million people in the state of New Jersey.
01:05:34.020
Based upon statistics, we can probably arrest 400,000.
01:05:38.220
Okay. So he's just saying that 5% of people look at child pornography online. Is that, I mean,
01:05:47.140
Okay. So, I mean, that just seems impossible, right?
01:05:52.220
I mean, you've struck on part of the challenge of reporting on it. You know,
01:05:55.700
like, nobody's going to tell me they look at this stuff. I actually did have
01:05:59.440
a series of encrypted chats with somebody who ran some websites that did have this material,
01:06:05.380
but figuring out how many people look at it or don't is very difficult for a reporter.
01:06:11.020
But law enforcement, and there is an agenda with law enforcement, we'll get to that when we talk
01:06:16.600
about encryption. But what they say is 3 to 5% of any random population will be looking at this
01:06:25.100
material. And that's not all pedophiles. And in fact, a large number of those people are not
01:06:31.380
pedophiles. And that's one of the issues with having this kind of content even available is that
01:06:37.280
many of the child advocates will say, you know, you spoke a little bit about adult
01:06:42.660
pornography earlier and the wide range of adult pornography and just the insane prevalence of
01:06:49.500
pornography. I mean, certainly when you and I were growing up, I didn't have access to
01:06:54.480
pornography. Now pornography is everywhere. And just like everything on the internet, it's driven
01:07:00.820
more and more extreme. There are more and more types of classifications, whether it's BDSM
01:07:07.080
or teen pornography or any of these types of things. And, you know, again, according to
01:07:11.980
interviews and law enforcement and specialists we've spoken with, they say that this will drive
01:07:16.860
people towards child sexual abuse. So that's where I just wanted to start. But as you noted, you know,
01:07:23.120
there are a lot of people involved; it seems to be a much larger problem than we previously knew.
01:07:29.400
Now, second, with the tech companies: why haven't they done
01:07:33.280
something? So again, I have to initially come back to the fact that nobody was really telling
01:07:38.600
them they had to, because nobody wanted to deal with it. Nobody wanted to talk about it. Nobody was
01:07:43.140
asking them questions about it. I mean, I know there's been articles written about this in the past
01:07:47.640
several years, but there has not been an investigation such as ours in probably a decade or so.
01:07:54.660
It's a very, very easy subject to look away from. But in the course of my reporting, I did go back
01:08:02.060
years and found employees, former employees at Twitter, former employees at Snapchat, former
01:08:07.640
employees at Facebook, because those are the people who had insight in, let's say, 2012, 13, 14,
01:08:13.820
when the problems started really getting big. And from every single person at every one of those
01:08:19.560
companies, I heard the same thing, which is that the teams responsible for dealing with that
01:08:25.160
material, which are generally called trust and safety teams, are totally underfunded and basically
01:08:31.840
ignored. So, an example: one former Twitter employee told me that in 2013, when Vine, which was Twitter's
01:08:42.180
short-lived video product, like six-second, eight-second video tweets, was live, there were gigabytes
01:08:50.560
of child sexual abuse videos appearing on Vine, and they were appearing more quickly than this one
01:08:56.580
person, the only person charged with this, could take them down. So the idea that this is a new
01:09:03.320
problem is totally absurd. All the companies have known about it for a long time, but they've been
01:09:10.060
happy to not answer questions about it. There's one sentence, I think it appeared in a few of your
01:09:16.440
articles, that I read and reread, and I'll just read it
01:09:24.440
here, and you'll have to explain this to me. I'm just quoting one of your articles: Police records and
01:09:31.320
emails, as well as interviews with nearly three dozen law enforcement officials, show that some tech
01:09:36.760
companies can take weeks or months to respond to questions from authorities, if they respond at
01:09:42.660
all. Now, to my eye, that sentence doesn't make any fucking sense, right? I mean, how is it that the FBI
01:09:51.560
could be calling Tumblr, or Facebook, or Dropbox, or AWS, or any other platform, saying, listen,
01:10:00.720
we've got crimes in progress being documented on your servers? Toddlers are being raped, and this
01:10:08.920
information is being spread to the world through your servers. Call us back. How is it that the cops
01:10:17.520
aren't showing up with guns kicking in the door, getting a response, if they don't get a timely response?
01:10:25.020
I mean, part of the problem, Sam, is that there's too much. The cops are overwhelmed.
01:10:30.720
So what's often occurring, we found, is that there are so many reports coming in to a local
01:10:38.200
task force that they have to spend, A, a significant portion of their time triaging these reports,
01:10:48.120
trying to find, I mean, the number one thing they want to do is identify if there is a child in imminent
01:10:53.540
harm. Because again, a lot of this material is recirculated material. These are children who are
01:11:00.620
abused 10 or 15 years ago, who have been since rescued, saved, and the images are being found
01:11:08.600
and reported, but there's no imminent harm. So they're going through and they're triaging. And
01:11:12.340
again, we're talking tens of thousands or sometimes hundreds of thousands of reports for a task force.
01:11:18.100
So what we found was occurring, which I agree is incredibly disturbing, is the law as it stands now,
01:11:25.780
and there has been a bill that has been introduced subsequent to our reporting to address this.
01:11:29.840
But as it stands now, the law says that tech companies, as soon as they become aware of this
01:11:36.920
content, must report it. So first of all, tech companies are not legally obligated to look for
01:11:42.260
this content. And there are real and difficult to manage Fourth Amendment issues around that. But
01:11:48.580
putting those aside, tech companies are not legally obligated to look for this content, but they are
01:11:53.020
legally obligated to report it as soon as they know about it. So they will report this content and
01:12:00.000
then they are only required to store it, whether it's the imagery or anything about the report, for
01:12:06.800
90 days. After 90 days, they have to get rid of the imagery. There's no way
01:12:13.220
they can keep a hold of the imagery, which leads to significant challenges for training things like
01:12:17.680
artificial intelligence classifiers. Okay, so they have to get rid of this. And that in itself
01:12:22.980
poses a challenge. And second, a lot of the time, because law enforcement is so overwhelmed, because
01:12:28.900
there's so much content, because they're having to figure out what's an actual threat. By the time they
01:12:34.040
go to the tech company, many times the company has already gotten rid of the information. And so it's a dead end.
01:12:40.660
You have certain companies like Snapchat, where their whole business model is based on getting rid of
01:12:45.580
any kind of logs or data or anything around imagery. So there were several instances
01:12:52.720
where law enforcement would go to Snapchat, and they wouldn't have it. We found cases where Tumblr,
01:12:58.440
for a certain period of time, I think in 2016, was in fact informing the people on whom they found this content
01:13:05.440
that they had been reported. And we talked with law enforcement who said that that gave criminals
01:13:11.900
ample opportunity to delete the evidence and destroy devices. So yeah, it's absolutely nuts. And it's
01:13:22.260
because of the overwhelming amount of content, and the inability of the National Center, whose technology is
01:13:30.180
literally 20 years old, to properly vet and filter and triage the content. Some of it they do;
01:13:39.300
they try. Some of that then falls on local law enforcement, who again is overwhelmed. And by the
01:13:45.560
time they get to some of these things, it's often gone. So a bill was introduced in December that would
01:13:52.460
double the amount of time, at least I believe, that the companies are required to hold on to this
01:13:57.500
information. So that's one positive step, I think. But I want to get back really quick to the idea of
01:14:04.120
why tech companies maybe aren't inclined to deal with this issue the way I think most of us would
01:14:11.860
expect. So think about these trust and safety teams, right? Their job is to identify pieces of
01:14:19.500
content and users to remove from the platform. Now, I'm sure you know that the way that many of these
01:14:28.400
companies, if not all of these companies, certainly the public ones, and the ones who are looking for
01:14:32.700
funding, report their success by number of users: daily active users, monthly active users,
01:14:39.980
whatever it is. So you have this team within your organization whose job it is to remove users and
01:14:46.860
to flag users. And yes, I think it's very easy for all of us to say, well, no shit, and we're better
01:14:54.780
off without them. But, I mean, unfortunately, what our reporting found, and again, this is across several
01:15:00.220
different organizations, several different people, this is not one anecdote, this is not two anecdotes,
01:15:05.340
this is several people saying, we were underfunded, it wasn't a priority. And as I think we've seen
01:15:13.600
with Facebook in recent years, as well as other companies, until it becomes a public issue, a public
01:15:19.980
problem for them, they're not inclined to put any resources towards anything that is not in some way
01:15:26.040
driving the bottom line. And so that brings us to encryption.
01:15:30.560
Before we go there, let me just kind of send a message to the audience here, because I know that many people
01:15:37.080
who work at these companies listen to the podcast. In fact, I know many of the people who started
01:15:42.600
these companies, I can reach out and talk to many of the principal people here. So I know many of you
01:15:50.700
are listening. At every level in these companies, you have to figure this out. The fact that this is
01:15:58.620
the status quo, that so little attention and so few resources are going to solving this problem,
01:16:04.780
when the problem itself is being materially facilitated by your companies, is unacceptable, right? I mean,
01:16:11.620
the problem couldn't exist at this scale, anything like this scale, but for the infrastructure you
01:16:18.920
have built, and upon which you are making vast wealth, right? It's just, it's completely understandable
01:16:26.920
that it's a daunting task, right? But if you're working for these companies, and you're spending all
01:16:34.260
your time trying to increase their profit, and spending no time at all, I mean, like, when was
01:16:42.420
the last time you, as an employee at Twitter, or Tumblr, or Facebook, or AWS, or Dropbox, any of these
01:16:52.840
companies, thought about the problem we're now talking about? Please do something. You know
01:16:59.560
better than I do what you might do to make noise within your company about this, but prioritize
01:17:04.980
this. Google lets its employees spend some significant percentage of time just thinking
01:17:11.060
about problems that interest them. Well, become interested in this one. We're going to look back
01:17:15.880
on this period. I mean, of all the things that are going to seem crazy in retrospect, the deformities
01:17:22.320
in our politics, and in culture at large, born of our just not figuring out how to navigate our use
01:17:29.060
of these tools. You know, the fact that we spend half of our lives ignoring our friends and families
01:17:34.260
because we're looking at what's happening to our reputations on Twitter, because we've put this,
01:17:39.900
you know, slot machine in our pockets, and take it out 150 times a day, right? All of that is going to
01:17:45.960
seem insane, and once we correct for it and find some psychological balance, we'll be better for it.
01:17:52.820
But nothing will seem more insane than the fact that we did not address this problem in a timely way.
01:18:00.540
So, you know, with that PSA to my friends in tech, back to you, Gabe. What do you have to say about the
01:18:07.500
prospects of encryption and related issues? Right. Well, let me follow up on that just so I can give
01:18:13.500
your listeners a little bit of information as to what companies are doing what. And this won't be
01:18:19.620
a full rundown, but just so people know. So, there's a pretty big distinction between somebody
01:18:25.440
like Facebook, who's scanning every single thing, they're a social media company, they're doing it
01:18:29.300
aggressively, and places like cloud storage, okay? So, Dropbox, Google Drive, they tend to have very
01:18:37.760
similar policies. And those policies are, they don't scan anything you upload. They're only going to scan
01:18:44.020
a file when you share that file. That's their policy. Now, that's an interesting policy, but in our
01:18:51.280
reporting, we found that people easily circumvent that policy. They do that by sharing logons, they do that
01:18:57.520
by leaving public folders open and available. So they've chosen these, and these are all what the companies would say
01:19:03.660
are privacy-based policies. So, Dropbox and Google only scan on share. Now, let me tell you a little
01:19:10.660
bit about these numbers, which were first released only to the New York Times, for 2019. Dropbox only filed
01:19:17.260
5,000 reports last year in 2019. Now, while we were doing our reporting in 2019, we said to Dropbox,
01:19:25.000
do you scan images? Do you scan videos? And after weeks of them saying, well, we can't tell you that,
01:19:31.200
we won't tell you that for whatever reason. At one point in, I believe, July of 2019, they said,
01:19:37.620
scanning videos is not a priority. Not a priority for us. We don't feel that videos are the medium
01:19:44.380
of choice necessarily, and that's not a priority. By the time we published our article, literally in
01:19:50.180
the days before, Dropbox said, oh, oh, we're scanning video now. Okay, so they start scanning video,
01:19:56.080
let's say in the last quarter of 2019. What the numbers show is that of the 5,000 reports Dropbox
01:20:03.360
filed to the National Center, there were over 250,000 files. 5,000 reports, 250,000 files.
01:20:11.540
The majority of those files were video. Okay, so Dropbox starts scanning for video,
01:20:16.580
they start finding a lot of video. Amazon, Amazon's cloud services handle millions of uploads and
01:20:23.500
downloads every second. Millions every second. They don't scan at all. They scan for no images,
01:20:31.320
they scan for no videos. Last year, they reported zero images, zero videos. You know, we could go on,
01:20:38.480
those are some of the bigger ones. You have Apple, Apple cannot scan their messages app,
01:20:43.480
and they elect not to scan iCloud. So once again, their cloud storage, they don't scan their cloud
01:20:48.160
storage. Now, I've gone back to them. Some of these companies are starting to do it. I think that
01:20:54.980
there's nothing like, you know, the exposure of a company to motivate them to begin doing this.
01:21:01.920
But there are certainly things they can be doing. And they will tell you that they do dedicate a
01:21:07.140
significant amount of resources. But let me address that as well. So Microsoft, who, again,
01:21:14.080
sponsored and invented PhotoDNA, the image-scanning technology, has long been seen as a leader
01:21:20.420
in this field. And remember, this all started with a tip from a user saying that they were able to find
01:21:26.440
child sexual abuse on Microsoft. So my colleague, Michael Keller, both of us have computer science
01:21:32.740
backgrounds. He wrote a program. And this computer program used what's called a headless browser,
01:21:38.800
which means you can't actually see the browser. And he programmed this headless browser to go on
01:21:46.540
Bing, to go on Yahoo, to go on DuckDuckGo, and to go on Google, and search for child sexual abuse
01:21:52.520
imagery using terms that we both knew were related to child sexual abuse, as well as some others
01:21:58.920
that were sent to us as tips. And the program, I mean, again, we had it very heavily vetted by our
01:22:06.000
lawyers. It even blocked the images from ever being loaded, just so you know. So not only could
01:22:10.800
we not even see a browser window, but the images were stopped from ever loading. But what we did is
01:22:16.880
we took the URLs that were returned from these image searches. And we sent those URLs to Microsoft's
01:22:24.480
own PhotoDNA cloud service. So essentially, this is a cloud service that we signed up for with
01:22:31.120
Microsoft, saying very clearly, we're New York Times journalists, we're reporting on this issue.
01:22:35.940
And we'd like access to your API to check for images of child sexual abuse. They gave us access
01:22:42.120
to the API. We wrote a computer program, it searched Microsoft's Bing using those terms, we then sent the resulting
01:22:50.380
URLs to Microsoft's PhotoDNA, and found dozens of images. Dozens of images. This is a trillion-dollar
01:23:00.540
tech company. So not only that, we found dozens of images, and that's before we just cut it off.
01:23:05.860
I mean, again, with letters coming in from the FBI saying, be careful. And we weren't trying to make
01:23:11.400
a massive collection or prove that there are millions. We found 75 before we were like, okay,
01:23:18.240
there's plenty here. So then what we did is we went and told Microsoft. We said, this is what we did.
01:23:23.900
This is exactly what we did. These are the search terms we used. They said it was something akin to
01:23:29.320
a bug, a problem. Okay. Three weeks later, we did it again. And we found them all over again. We found
01:23:36.220
different ones. We found more. So the idea that these tech companies cannot find this material, I mean,
01:23:44.440
they should be able to do this themselves, obviously, when two journalists at the New York
01:23:48.920
Times can do that. And this isn't just Microsoft, right? It was also
01:23:54.580
found on Yahoo and DuckDuckGo. Now, both of those are powered by Microsoft's search engine. So the fault
01:24:01.300
lies largely with Microsoft. We did not find any on Google. So that says two things. One, Microsoft is
01:24:07.980
not realizing that their own system is indexing and serving up imagery that its own technologies can
01:24:14.380
identify. And two, it's doable. You can stop this. Google's done it. However, Google did it in their
01:24:20.780
search. And I'm not saying it's impossible to find it. Again, we didn't do some kind of exhaustive
01:24:25.580
search, but it wasn't turning up on Google. So there is some extremely uneven commitment to this issue.
01:24:33.860
And also there's this, the issue we flagged in discussing Facebook a while back, where if you
01:24:39.860
don't look, you don't have any bad news to report. If Facebook looks, they find, you know, 16 million
01:24:47.060
instances of the problem. AWS doesn't look, and they don't pay a significant price for not looking.
01:24:55.920
The not looking has to become salient and an instance of terrible PR for anyone to be incentivized
01:25:03.160
to look, it sounds like, beyond actually caring about this issue.
01:25:06.000
Right. Well, now we're running up against exactly what you described earlier, which is
01:25:09.860
the privacy advocates and the encryption absolutists, essentially.
01:25:15.200
Right. And let me start this part of the conversation by saying, I'm a reporter. I don't
01:25:22.380
offer my opinion on exactly how this problem should be solved. My point is that this is the clearest
01:25:31.500
example of where privacy has stark and terrible consequences for a group of people. Okay. But
01:25:43.120
that said, you're right. Amazon, Apple, they seem to pay very little price for filing almost zero
01:25:52.240
reports of child sexual abuse. And meanwhile, Facebook gets a bunch of initially negative headlines
01:25:58.900
for filing an enormous amount. Now, as we've discussed, those numbers are actually indicative
01:26:05.240
of them doing a very good job. But as you said, in March of last year, Mark Zuckerberg announced
01:26:13.520
plans to encrypt Facebook Messenger. Now, let me put some context around Facebook Messenger and just
01:26:22.100
how commonly it's used to share images of child sexual abuse. In 2018, of the 18 million, a little
01:26:32.540
more than 18 million reports made to the National Center, nearly 12 million of those, about 65%,
01:26:38.560
two out of three, were from Facebook Messenger. Right. In 2019, Facebook Messenger was responsible
01:26:47.020
for even more. 72% of all reports made to the National Center. So, I mean, whenever I tell people
01:26:55.060
these facts, the response is almost always, who are these idiots that are trading child sexual abuse
01:27:02.120
on Facebook? I don't know the answer to that, but there's lots of them. Now, if Facebook encrypts
01:27:09.660
Messenger, which again, Mark Zuckerberg has said they're going to do, they will almost completely
01:27:17.040
lose the ability to do any kind of automatic image detection, which is what
01:27:21.700
everybody fundamentally relies on to do this. And while they will say that they're going to use other
01:27:26.940
signals, the experts and people I've talked to anticipate that there will be a nearly 100% decrease
01:27:33.540
in reports from Messenger. You know, maybe they'll be able to use some of these other types of
01:27:39.720
indicators, which I would actually encourage them to be using anyway. Maybe they are. These are what
01:27:44.500
they call signals: things like messages that are sent from one person to many
01:27:50.060
people, or adults messaging children, indicators that, again, I think they should hopefully be using
01:27:56.280
anyway. But they plan to encrypt Messenger. And Jay Sullivan, the product management director
01:28:03.540
for messaging privacy at Facebook, in the fall of
01:28:09.860
last year, in prepared remarks, said, you know, private messaging, ephemeral stories and small
01:28:14.980
groups are by far the fastest growing areas of online communication. And so by saying that,
01:28:20.720
what he's saying is that this is what our users want. Our users want encrypted messaging, our users
01:28:25.660
want privacy, our users want everybody to stay out of their living room, to use an analogy that
01:28:32.880
they often use. But the truth is, and people are really terrified by this, that if they encrypt it,
01:28:41.220
not only are they not going to be able to see CSAM, they're not going to be able to see all the other
01:28:45.660
kinds of crime and grooming and sextortion and everything else that is occurring all the time
01:28:53.280
on their platform. So obviously, there's a serious conversation that has to be had around tech's role
01:29:01.420
in this and the incentives and this cult of privacy and its consequences. I mean, that's
01:29:07.520
its own ongoing topic of conversation that we're certainly not going to exhaust here. I guess
01:29:15.580
the best use of our remaining time is just to give as clear a picture as we can of the urgency and
01:29:22.660
scope of this problem. I mean, because again, when you give me a factoid of the sort you did
01:29:29.500
from New Jersey. So you have a law enforcement official in New Jersey who says, you know, we've got
01:29:33.960
9 million people in the state of New Jersey. And based upon our statistics, we could probably arrest
01:29:40.060
400,000 of them, right? These are 400,000 people who he imagines have looked at this material in some
01:29:49.980
way online. Now, whether they saw it inadvertently, whether some of them are New York Times reporters
01:29:54.540
doing their research, discount all of that, it appears that there's an extraordinary number of people
01:30:00.120
who seek this material out because this is what gets them off, right? These are not research
01:30:09.020
papers. And we have a culture. I mean, what do we know about the culture of pedophiles and all of the
01:30:18.080
elaborate machinations they go through to not get caught producing this material, trading in it, viewing
01:30:25.340
it? First of all, how do predators get access to kids that they abuse in the making of these videos? I mean, yes,
01:30:32.740
apparently there are truly evil parents and step-parents and caregivers, but how much of
01:30:41.840
this is a story of abductions and other horrific details? What do we know about what's happening,
01:30:50.240
you know, among the culture of people who produce and consume this content?
01:30:55.580
Sure. I don't want to go on about encryption too much, but let me just raise
01:31:00.860
a few things that I think would be helpful to the conversation, especially with your audience.
01:31:06.180
Because often when you come to the idea of encryption, it's this position of either yes
01:31:11.900
encryption or no encryption, right? Either yes encryption, or somehow there's going to be a back
01:31:16.760
door into encryption. And I will say that I do feel that the government, and this is one of the
01:31:21.080
challenges of being a reporter, I do think the government is using our reporting and using the
01:31:27.140
issue of child sexual abuse as kind of the new terrorism. Now, like a week after we
01:31:34.900
put out our first report, Attorney General William Barr held an event at the Department of Justice,
01:31:41.380
and the event was entirely about how encryption is enabling child sexual abuse and how they need a
01:31:48.300
back door into encryption because of this. Now, what that event did not discuss at all were the
01:31:55.400
multiple other failures of the federal government in dealing with this issue. So I do feel like there
01:32:01.420
is some disingenuous behavior, not only on their part but also on the part of others;
01:32:10.220
this is becoming a weaponized topic around encryption. Well, this is so grim because if ever
01:32:16.840
there were an attorney general who did not inspire confidence for his ethical and political intuitions
01:32:22.660
and his commitment to protecting civil liberties, it's William Barr. So, yeah, I mean, that just makes
01:32:31.280
me think that the answer has to come more from tech than from government, at least government at this
01:32:38.100
moment, right? I mean, obviously government has to be involved because we're talking about crime in the
01:32:42.920
end. But yeah, it's easy to see how fans of Edward Snowden and, you know, everyone else who wouldn't
01:32:50.660
trust the current administration as far as they can be thrown will just say this is completely
01:32:56.500
unworkable. You can't let these people in because they obviously can't be trusted to protect civil
01:33:03.760
liberties. Right. And even Snowden has weighed in on this series specifically, saying he thought one
01:33:10.840
of the stories we wrote was particularly credulous toward law enforcement and toward this argument against encryption.
01:33:16.460
But you're right. I do think there are things to be done and we'll focus on Facebook solely because
01:33:21.180
of this Messenger example, right? Now, I think one of the most compelling things I've heard from people
01:33:28.020
who are really willing to engage on this issue is that maybe encryption should not be deployed
01:33:36.100
on all types of platforms. So, for example, Facebook is a platform where children are at a
01:33:42.460
distinct disadvantage to adults, not only for all the reasons that children are always at a distinct
01:33:48.180
disadvantage. They're younger. They haven't learned as much. They don't have as much life experience.
01:33:52.580
But literally, I found dozens at least, and that's far from an exhaustive search, of examples of adults
01:34:00.580
going on Facebook, creating profiles that say they're 13 or 14, befriending other children,
01:34:07.980
getting that child to send them an image of what is child sexual abuse, whether it's self-generated or
01:34:14.160
not, coercing them into it, sometimes by sending that child other images of child sexual abuse that
01:34:20.860
they've collected. And then as soon as the child sends them an image, they'll say, actually,
01:34:27.100
I'm a 25-year-old guy. And if you don't send me more images, I'm going to post this image on your
01:34:33.740
Facebook wall and tell your parents. So then you have a 12 or 13-year-old, I guess you're not supposed
01:34:40.740
to be under 13 on Facebook, although we know those rules get bent and broken all the time. Now you have
01:34:45.600
a 12-year-old, a 13-year-old saying, holy shit, I don't know what to do. I'm going to send them more
01:34:51.980
photos. I'm so terrified if they tell my parents, if they post that on my wall. This happens all the
01:34:58.880
time. It's called sextortion. It's one of the biggest issues coming up. So now you have this
01:35:04.000
platform where adults and children can interact, with the adults at a distinct advantage over the children. Children
01:35:09.220
are discoverable, despite the fact that Facebook says you can't search for them in certain ways,
01:35:13.260
which is true. There's still plenty of ways for an adult to find children on Facebook and message
01:35:19.840
them and coerce them. And Facebook knows it's a huge problem. We're not starting from a place where
01:35:25.320
they don't know it's a problem. They know it's a huge problem. Now, at the same time, Facebook has
01:35:30.540
an encrypted messaging service, WhatsApp. And if we look at the number of reports to the National
01:35:36.120
Center from WhatsApp versus Facebook, of course, it's not even close. WhatsApp isn't even a fraction
01:35:41.460
of a percent of the reports that Facebook sends. But that said, Facebook could,
01:35:46.260
there's one hypothesis, one possibility, not something I'm advocating necessarily,
01:35:52.240
but an interesting thought. Facebook could direct people, say, look, Messenger is not encrypted.
01:35:58.400
We're not encrypting it. These are not encrypted messages. Law enforcement has access to these
01:36:02.960
messages. Shout it from the hills. Everybody knows. If you want to have an encrypted chat,
01:36:10.200
we own a company. WhatsApp, go use WhatsApp. We'll kick you right over to WhatsApp. Use WhatsApp.
01:36:15.560
Now, that would make it substantially harder to coerce children. Because at that point,
01:36:23.220
what you have to do, in order to even have WhatsApp, you have to have a phone number.
01:36:27.420
So the child has to have a phone. The child has to have WhatsApp on it. And WhatsApp,
01:36:32.360
as opposed to Facebook, doesn't have the same sort of discoverability issue. You can't just go on
01:36:38.540
WhatsApp and start finding children, right? Certainly not in the same way you can on Facebook.
01:36:44.200
So maybe there should be more discussion around what types of platforms should be encrypted.
01:36:52.260
On what types of platforms are children at a distinct disadvantage? What do I believe? I believe privacy
01:36:58.880
is a fundamental human right. So I absolutely believe that there should be encrypted messaging.
01:37:04.400
But has this course of reporting shaken to my core my sense of how that should happen? Absolutely. And has it caused
01:37:15.680
me to say, like, wow, how do we have a technology such as encryption, which by definition cannot have a
01:37:24.600
backdoor, and still protect children? And what I find to be counterproductive when I start having
01:37:31.920
these discussions, Sam, is that privacy absolutists, which is a term I use for them,
01:37:40.080
who have been thinking about this issue for years, they will immediately chastise me. I mean,
01:37:45.260
I tweeted out this idea, probably in October, after we had started thinking about it and developing it,
01:37:50.840
my colleague and I, and I'm sure we're not, you know, the first people to think about this. But I
01:37:55.340
said, you know, shouldn't there be a discussion around what platforms should be encrypted?
01:37:59.200
And I was attacked. I was attacked by people who said, I've been thinking about this problem for 20,
01:38:05.980
30 years. I've analyzed it from every single point of view. I've run every scenario down and every
01:38:13.080
single one ends in encrypt everything. Now, I don't know if that's the right answer. I don't know what
01:38:18.820
the right answer is. But what I do know is that at this point in time, when the general public is
01:38:25.780
just starting to become aware of privacy implications online, just starting to understand
01:38:32.540
what it means to have private messages, what it means to have social media platforms, what it means
01:38:38.640
to broadcast your life to everybody or choose not to, the worst thing you can do is come in and tell
01:38:47.020
people they're idiots for thinking about these things out loud. And so I would just like to offer
01:38:53.680
that message to a community that I very much respect and enjoy communicating with. Like, again,
01:39:01.300
I helped start the New York Times tip line. We implemented SecureDrop technology, which is
01:39:06.840
encrypted technology. You can send us messages on WhatsApp and Signal. You can get to us in untraceable
01:39:13.020
ways. I very, very much understand the importance and value of encryption and private communications.
01:39:20.520
But I do think there is room to discuss where those are implemented, how those are implemented.
01:39:26.640
Should those be implemented when you know there is a problem? Should those be implemented when you know
01:39:31.880
children are at a distinct disadvantage? And still, the answer might be yes. I don't know the answer.
01:39:37.540
But I would just say that now is the time to help people start having these conversations.
01:39:45.220
And if you've already run every scenario and you know the answer, well, help people get there
01:39:49.580
in a way that's constructive because the other ways are going to drive people away from it.
01:39:56.140
I should echo a few of those thoughts in that, you know, I am also very concerned about privacy
01:40:00.080
and I would be the first to recoil from the prospect of someone like, you know, A.G. Barr having any more
01:40:08.760
oversight of our lives than he currently has, right? So it's just, it's easy to see how an ability to pry
01:40:16.700
into our private lives can be misused by an ideological government. And yet, we're talking about more than that.
01:40:25.760
We're talking about the fact that at this moment, you have some sadistic lunatic mistreating children
01:40:34.100
who, for whatever reason, he has access to and shooting his exploits on an iPhone, uploading it to AWS,
01:40:46.120
you know, posting some of this stuff, you know, on Tumblr or wherever else. And Apple built the phone
01:40:53.040
and Amazon runs the server and Tumblr just got acquired by Automattic. And you have people
01:41:00.220
getting fantastically wealthy on this technology. And this technology is what is enabling
01:41:07.000
the lone maniac in his basement to accomplish all of this. If he just had a Polaroid camera, yes,
01:41:17.080
he could, you know, walk that single photo in a brown paper bag to the parking lot of a shopping mall
01:41:24.640
and trade it for $100 with some other nefarious stranger and risk getting caught that way.
01:41:32.500
But these companies have built the tools to bring all of this to scale. So presumably, people are making
01:41:38.760
a fair amount of money trading this material now. And they're managing to essentially groom a very large
01:41:48.160
audience of vulnerable people. I mean, you have to imagine that the audience for this
01:41:53.500
is growing the way the audience for kind of weird adult porn is also apparently growing because people
01:42:01.200
are getting numbed, you know, to various stimuli, right? People are, you have access to every possible image
01:42:08.700
online. And certain people are vulnerable to just needing more and more extreme images to even find them
01:42:16.660
to be salient, right? So I would imagine at the periphery, if we're talking about, you know, 400,000 people in
01:42:22.560
New Jersey downloading this stuff, not every one of those people 30 years ago would have been trying to find who
01:42:30.960
they could exchange Polaroids with in a parking lot. And so this is a kind of a cultural contamination,
01:42:38.640
again, much of which is redounding to the bottom line of these companies that are getting fantastically
01:42:44.520
wealthy from, you know, the use of their bandwidth for these purposes. So you can't be a privacy
01:42:49.700
absolutist here. We have to figure out how to change the incentive structure around all this so that the
01:42:55.720
companies themselves find some way to make this much harder to do and to make it much more likely
01:43:02.060
that someone will get caught for doing it. Yeah. And I know that I do want to speak about
01:43:08.320
pedophiles and education and finish up there, but we haven't even discussed a few things. When you're
01:43:13.560
talking about what people could use their 10% or 20% time for, or just what these extremely bright people
01:43:20.680
who work at these companies could do, right? There's a part of one of our stories that's almost too terrible
01:43:26.100
to talk about, about live streaming. And this live streaming is going on on Zoom, where there are
01:43:32.400
over a dozen men sitting around watching another man rape a child and cheering him on. And the only
01:43:40.880
reason, the only reason I was able to report this case out is because a Canadian undercover officer
01:43:47.560
happened to be sitting in the Zoom chat room because it was known that this was a problem on
01:43:53.140
Zoom, and recorded it. And the police the next day went and saved that child, right? So that's a
01:44:00.820
wonderful story. But the fact is when that case went to trial, which is kind of unbelievable that it
01:44:06.720
went to trial, but it did go to trial. What the prosecutor said, the federal prosecutor, a man named
01:44:11.500
Austin Berry, he said that the offenders know that live streams are harder to detect and that they
01:44:18.960
leave no record. And then, quote, that's why they go to Zoom. It's the Netflix of child pornography.
01:44:26.660
So there are things we haven't even discussed, like live streaming and new content and the classifiers that
01:44:32.720
need to be built for that. It's hard, you know, like this is a hard, complicated,
01:44:38.940
technical task. And what people are absolutely going to respond to
01:44:45.260
is the idea of walking into a surveillance state, which I know you've had multiple conversations about.
01:44:50.040
And that is why we did the reporting on this subject. That is why we did it, because it brings
01:44:54.620
these questions to a head: how do we deal with this? Now, the answer for how we deal
01:45:00.940
with this right now, as far as I'm concerned, is education. So I was in Amsterdam. I was actually
01:45:09.500
in the Hague in the Netherlands late last year doing some reporting because this is an international
01:45:14.860
problem. And some of the laws in the Netherlands make it more common for this type of
01:45:21.400
material to be stored on servers in that country. But that said, while I was there, I ran into some law
01:45:27.340
enforcement, some international law enforcement. And I ran into a Finnish, basically, Homeland
01:45:32.560
Security agent. And we were having a couple of drinks and I was talking to him. I told him what
01:45:38.200
I was there for. He didn't believe me for a while. He thought I was a Russian spy. That was interesting.
01:45:43.300
Once I had finally convinced him that no, I'm actually a reporter and I'm reporting on this subject.
01:45:47.780
He told me that he was actually there for a conference on that subject, which I knew; I was there
01:45:53.780
for the same reason. And he said he had two small children. And I said, all right, man, you know,
01:46:00.560
so what do we do? Like, what is the answer to this? And he said, the only thing you can do is educate
01:46:07.440
your children. Like you need to sit down with your children. You need to explain to them that
01:46:12.660
they don't know who they're talking to online, that they cannot assume that that's another child,
01:46:18.220
that they should not be sending images of themselves. They should not be live streaming images of
01:46:22.960
themselves. And even more importantly, if they do, they should feel totally comfortable
01:46:30.160
knowing that you will protect them and support them, even if they've made that mistake.
01:46:36.080
That as of right now, there is no tech solution to this problem. And honestly, one is not coming in the near
01:46:42.040
future. We also didn't get the opportunity to talk very much about survivors. And that is, I mean,
01:46:47.940
absolutely heartbreaking. I remember I spoke with a 17-year-old girl who had been raped by her father
01:46:53.560
who invited another man to rape her when she was seven years old, videotaped it and put it online.
01:46:59.720
The confusion this young woman feels, that her father would do something like that. And then she lost her father.
01:47:06.080
And not many people know this, but people who have been identified in these images and videos,
01:47:10.520
every time their image or video is found on an offender's computer or cloud drive or whatever,
01:47:16.400
they get a notice from the FBI. And this notice is to allow them a chance for financial compensation
01:47:24.340
during the trial, which is often a few thousand dollars maybe. But they get these hundreds of
01:47:30.780
notices a year. So all year round, sometimes every day, their parents, or often a lawyer because their parents
01:47:38.580
cannot bear it, get these notices saying that it's been found again, it's been found again,
01:47:44.140
it's been found again. So it's really important as we talk about the technology companies and the
01:47:49.860
efforts that need to be made and everything, that we remember the first generation of children who have been
01:47:57.060
sexually abused and now have to deal with that every day of their lives, with constant reminders.
01:48:05.200
And this isn't a reminder that, like, you were once physically assaulted, say in a fist fight,
01:48:12.040
and that video is online, which I'm sure would be terrible. This is the knowledge that you being
01:48:18.140
sexually assaulted as a child is serving the sexual pleasure of deviants online. It's just,
01:48:26.620
I mean, I came out of that interview and I was devastated. And so it's very important.
01:48:32.600
We keep the survivors in mind because it's not just the child that it's happening to when it's
01:48:39.160
happening. It's, again, a whole generation now of children who are growing up with this every day. I mean,
01:48:47.240
they change their appearance because people try to find them later into the future.
01:48:51.120
They can't speak out about it because it is such a stigmatized thing. It's just,
01:48:58.440
it's an unbelievable pain. And so, yes, we hope law enforcement and the tech companies act,
01:49:04.940
and there are huge battles ahead, whether it be encryption, whether it be building classifiers
01:49:09.440
that can detect new content, whether it be trying to figure out how to stop children and young adults
01:49:16.460
from sending photos of themselves that can then be weaponized against them. But whatever happens with all of those things,
01:49:21.360
fundamentally for the next few years at the very least, the onus is on the parents to do
01:49:28.500
a better job educating their children, to realize that when their children are playing Minecraft
01:49:33.600
or Fortnite, there are adults on there trying to coerce them; that no matter what platform your
01:49:41.060
child is on, the unfortunate truth is there are monsters on that platform trying to do terrible
01:49:48.040
things. And so while the tech companies, I really hope, figure out how to deal with this,
01:49:54.380
it is on the parents to educate themselves and their children on how to be aware and avoid these
01:50:02.040
problems. Yeah, although that really only addresses a peripheral problem here. The sexploitation thing
01:50:10.980
is a problem, I'll grant you. And obviously any parent should communicate with their kids around this,
01:50:17.320
you know, don't send images of yourself, realize you could be talking to an adult. Obviously
01:50:21.640
don't agree to meet people in the physical world based on these online contacts and all that. But
01:50:27.600
apart from the problem of possible stranger abduction being facilitated by that kind of online
01:50:34.900
communication, that doesn't really bring us to the bottom of this hellscape, which is the sort of
01:50:43.180
thing you described happening on Zoom, right, where you have an adult who for whatever reason has access
01:50:49.720
to a child who is then raping that child to produce video for people who have happily gathered to consume
01:51:00.640
it, right? So there's a culture of this. You're right. You're right. It doesn't address that. But I don't
01:51:06.000
want to stray from the idea that, A, a large part of the community feels that any of these types of
01:51:12.100
videos and images, the more that circulate, whether they're coerced or self-produced, the more it drives
01:51:17.300
people down that hole that we just discussed to more and more extreme content. And B, there are
01:51:22.900
several examples, many examples, of children. Again, it's almost impossible for me to even imagine
01:51:28.580
being 11 years old, having made the mistake of sending an image of my genitals to somebody
01:51:35.980
thinking they were 12, and then having them say that they're going to tell everybody in my whole
01:51:40.720
world that I've done this, and not only that, show them. That has resulted in children being coerced,
01:51:47.140
it is very common, to bring in their siblings. I mean, the way it spreads... and then to
01:51:54.580
actually sexually abuse their own siblings, which then leads to more extortion. So I completely agree
01:52:00.600
with you. It does not solve the dark, dark, dark, depraved things that we mentioned quite a bit in our
01:52:07.420
articles. But sexploitation and sextortion are the fastest-growing category of child sexual abuse images
01:52:15.540
online. So it is in no way a panacea, but it is one opportunity to help stem the problem.
01:52:22.360
Hmm. So take me back to the Zoom case. What was revealed about it? Were the people who were
01:52:29.860
watching also arrested, or was it just the perpetrator? No, they were arrested too. So let me, I have
01:52:36.600
it in front of me here. So what happened was, there was a man in Pennsylvania, and it was not the
01:52:43.780
first time this had happened, in fact. The victim, I believe it was his nephew, honestly, was a
01:52:50.100
six-year-old boy. And there were, I think, more than a dozen men. And these men were
01:52:55.740
from all around the world, speaking to the point of what we're talking about, the technology. They
01:53:00.240
were all around the world. Now, I think a dozen or so were in the United States, but all around the
01:53:04.000
world. And what was this? Was this monetized in some way? I mean, how does... No, no. In fact, a lot of
01:53:10.100
it's not monetized. Once you get to the dark web, sometimes it can be monetized in a variety of ways,
01:53:16.440
but that's actually one of the ways that they've helped shut down other types of
01:53:22.020
dark market crimes, like drugs and some of those other things that are traded on the dark market,
01:53:28.600
which is by tracing Bitcoin transactions. Even those can be traced to a certain
01:53:33.740
extent. So there are certain types of things that go on, like Bitcoin mining, where they leverage other
01:53:40.920
people's computers to do. They do sell some of this stuff online, but actually what we found,
01:53:46.680
certainly on the open web and on the platforms we've been talking about, is a much greater
01:53:51.000
predilection to just share it with one another, to share and stockpile. So they create these huge,
01:53:57.680
huge stockpiles, often stored on their cloud storage. I mean, we found cases where there are
01:54:02.740
millions of files on their cloud storage. And these people, I mean, it is truly horrendous. The men
01:54:10.440
are sitting around, it's almost always men. They have a rule. In
01:54:16.960
these streams, it's known that they have to have their webcams on, because in their minds, a police
01:54:22.960
officer would never sit there with their webcam on. So the rule is cams on. So they're
01:54:29.120
all sitting there. They're rooting this man on while he rapes the boy. They're masturbating as
01:54:33.840
they do it. The detective, I think it's Detective Constable Janelle Blacketer, with the
01:54:41.100
Toronto Police Department, was in the room. She recorded the stream that night. She sent the file
01:54:47.980
to Special Agent Austin Barrier of Homeland Security Investigations. They then subpoenaed Zoom,
01:54:55.940
which was very compliant. I mean, when the companies learn about this, they almost always are very quick
01:55:01.620
to react. Zoom sent him information. It turned out the man was in Pennsylvania. The man's name is William
01:55:07.540
Byers Augusta. He's 20 years old. The next day, Homeland Security shows up and is able to identify the
01:55:14.960
setting based on the video footage they've seen. They identified certain objects that were also in
01:55:19.480
the room and saved the six-year-old boy. Fourteen men from multiple states have been arrested and sent to
01:55:27.320
prison. And Mr. Augusta received a sentence of up to 90 years in prison.
01:55:34.800
Okay. So it worked in that case, but... It didn't work because the tech companies caught it.
01:55:40.460
Right. It worked because law enforcement caught it. Exactly. And I mean, first of all,
01:55:45.020
just to say something to the privacy absolutists here, I mean, wouldn't you as a user of Zoom
01:55:50.920
be willing to let your conversations be searchable if you knew you could prevent this sort of thing
01:56:00.700
from happening, or you could actually bring the people who are doing this sort of thing
01:56:05.020
to justice? For me, it's just, it would be trivially easy to agree to that in the terms of service.
01:56:12.240
It's like, of course. I just don't understand how you get there. What is it that you're doing in
01:56:17.280
your life that makes you think absolute privacy under any conceivable scenario is important to you?
01:56:27.340
What are you doing on Zoom that you can't imagine the government or the tech company ever being able
01:56:35.480
to search it, even just algorithmically, to vet its content? It's a religion. It's a fetish of some
01:56:43.440
kind. First of all, this kind of privacy is nothing that human beings have ever had
01:56:48.380
a right to. I mean, there's no place in the real world where you've ever done
01:56:54.520
anything or said anything that has given you an absolute right to privacy. It's physically
01:57:01.160
impossible, right? There's no room in your house that could hold all your secrets and never be unlocked
01:57:07.780
by a third party, no matter what you had done in the world, right? And yet, somehow in digital
01:57:13.360
space, some of us have convinced ourselves that we need these rooms, right? And again, for the
01:57:20.660
purposes of this conversation, I've completely lost touch with the ethical intuitions that suggest we need them.
01:57:28.460
And that's the reason we're doing this reporting, because, again,
01:57:33.780
it was a shortcut to the conversations that I think need to be had around privacy on the internet,
01:57:41.120
around, you know, should companies be scanning people's photos? Should companies be scanning
01:57:46.560
people's videos? Should they be doing natural language processing to detect grooming?
01:57:52.180
Should they be doing all these things? Like, let's have these conversations. And as you're saying,
01:57:56.940
should Zoom, at what point does your expectation of privacy go away? Like, so you're in a room
01:58:02.660
with 16 other people around the world. Is there an expectation of privacy up until 30 people,
01:58:09.720
up until 50 people? At what point? And again, I'm sure that people are going to attack
01:58:18.200
me for even raising these questions, but they're honest questions about at what point these things
01:58:25.300
start to affect people in ways that are, in fact, detrimental, if that is indeed the case.
01:58:32.660
But I think we need to move a little bit beyond that conversation. Yes, there's some harm in
01:58:38.540
Facebook, let's say, giving our likes to Cambridge Analytica. But there's far, far greater harm,
01:58:45.740
I think we'd all agree, in people being able to trade child sexual abuse material under the cloak
01:58:52.200
of encryption. So let's have that conversation.
01:58:54.360
One other question here, which trips a lot of ethical intuitions one way or the other,
01:59:00.060
what do you think about the prospect of allowing entirely fictional production of similar material,
01:59:08.840
you know, animated child pornography or the CGI version of it, such that it could answer to this
01:59:16.880
apparent appetite in many people without being at all derived from the actual victimization of
01:59:26.080
children? I mean, at the moment, I'm assuming all of that material is just as illegal as anything
01:59:31.860
else that is a real record of a crime. Is anyone arguing that if we could only produce this stuff
01:59:37.040
fictionally, the real problem would be at least diminished?
01:59:41.380
There are people arguing that. I'm not going to say the name of the company because
01:59:45.820
I think that is very questionable. In the United States, I believe even drawings
01:59:54.480
or other depicted imagery of this kind are illegal. But I think this gets to a very interesting point. And I want to talk
02:00:01.160
specifically about pedophiles. And so before we did this reporting, and even in our first story,
02:00:07.200
as soon as we published, we got lambasted by several people saying that we had used
02:00:11.280
the term pedophile inappropriately. So to speak specifically, pedophiles are people who are
02:00:17.520
attracted, sexually attracted, to children. There is a whole other group of people who look at child
02:00:23.680
sexual abuse imagery. These are people who are not necessarily attracted to children. They are
02:00:28.700
extremists. They are wandering down rabbit holes. They are addicts. And they are a huge part of the
02:00:35.320
problem. But let's speak about pedophiles, because when I'm talking with child
02:00:40.740
advocates and some of the other people, I grapple with this problem the same way
02:00:44.960
that you're starting to grapple with it or have been grappling with it, which is, holy shit, what do
02:00:49.620
we do? I think you have to think about attacking it from all angles, right? And that also means
02:00:55.140
dealing with the people whose natural attraction is to children. I do want to say, I mean, sympathy is
02:01:02.360
probably the only word I have for it. There is a group of people, and I'm not going to get into
02:01:07.480
citing any of the studies. As soon as you cite a study, you have a million people telling you why
02:01:11.600
that study was garbage. But there is a group of people who, when they go through puberty,
02:01:18.200
begin to realize that they remain attracted to children of a certain age. And that is the very,
02:01:25.620
very common report of a true pedophile: they turn 12, 13, 14, and as they continue to grow
02:01:33.840
older, as they go through puberty, they realize something is wrong. They realize they are still
02:01:38.840
attracted to children. So how do you deal with that? Now, first of all, according to some of these
02:01:43.980
studies, you then have a few years, right, where this child, this young adult, now knows that they
02:01:51.160
have some sort of issue. And so that's an opportunity. That's an opportunity to intervene if we can find
02:01:57.740
a way to do that. And the second thing that I often think about, and this is a bit tangential
02:02:04.360
to what you're saying. I don't know. It's a good question, and I've put the
02:02:08.800
same question to these people: should there be imagery that would help satisfy this? I mean,
02:02:15.080
imagine the different types of implications once we get to virtual reality.
02:02:20.740
I don't know. I think it's worth talking to scientists, talking to people who study this,
02:02:25.360
to see if that would stop them from offending. If it would, then perhaps that's a path worth
02:02:31.680
walking down. I think other people would say that that would simply drive the number of people
02:02:38.780
interested in this higher. Now, if they're still not actually assaulting children, that's a good
02:02:44.300
thing. But I could see the argument, and perhaps a study would show that it would drive them to
02:02:50.160
actually want to do this in real life. I'm not really sure. But what I do think adds another layer
02:02:55.240
of complexity is this. What I just told you is that a bunch of these men
02:03:00.840
who were arrested for watching another man assault somebody on Zoom received very lengthy
02:03:07.600
sentences. I mean, they're getting sentences of 30, 40, 50 years for simply consuming and trading
02:03:14.840
this material. And I don't say 'simply' to suggest that it's not serious. I just mean they're not actually
02:03:20.440
doing the abuse. Now, I will get jumped on for that as well. Yes, they are re-victimizing
02:03:25.960
the person in the imagery. But the point is, they are not the person physically abusing the child.
02:03:31.680
And they're getting prison sentences of 30, 40, 50 years. Previously,
02:03:37.620
I helped start something called the Marshall Project, which is a criminal justice website.
02:03:41.060
And we dealt a lot with this idea of crime, punishment, and rehabilitation.
02:03:45.560
I do not know if a true pedophile, somebody who's truly attracted to children, is going to be any less
02:03:56.480
attracted to a child or any less able to constrain themselves from doing this type of thing when they
02:04:04.580
get out of prison 30 years later. And in fact, the sentencing is all over the map, whether
02:04:10.760
it's at the state or federal level. Some of the survivors we spoke with had somebody,
02:04:15.480
remember, they get these notices, who went to prison because he had
02:04:20.100
their imagery on his computer, got out, and went to prison again. And again, their imagery was found on his computer.
02:04:28.100
So I don't know exactly how to honestly help people who have attractions to children.
02:04:35.820
Because if it were you or me, and I think about the people I'm attracted to, there's nothing I do
02:04:43.960
to be attracted to that person. And I don't think there's anything I could do to not be attracted to
02:04:49.540
some of the people I'm attracted to. I mean, this is an instinct, it's natural, it's whatever it is.
02:04:54.980
And I do feel sympathy for people who, for whatever reason, are attracted to children. And I see that
02:05:03.220
as an opportunity to somehow get in front of the issue at that point. And whether it's with
02:05:10.640
animated 3D models, virtual reality, whatever it might be that helps them live as normal a life as
02:05:19.060
possible. And with the number one, absolute goal of never harming a child, I think those options
02:05:26.020
should be explored. Yeah, well, I think, again, we have to differentiate pedophilia from
02:05:33.180
some of the rest of what we're talking about, because pedophilia is a very unhappy
02:05:39.100
sexual orientation, essentially, right? It's one whose implications pitch you into something
02:05:44.960
that's illegal, non-consensual, and, you know, therefore non-actionable if you're an ethical
02:05:52.080
person, right? So you didn't pick who you're attracted to. As far as I know, the research
02:05:56.900
on this suggests that it's partially genetic, but I think it also has to do
02:06:02.380
with what happens to babies in utero. You know, I think it's developmental, but
02:06:08.780
obviously we don't understand exactly how someone becomes a pedophile. But we should understand that
02:06:13.520
they didn't make themselves. So they're profoundly unlucky on some level
02:06:19.160
to find that their sexual attraction never matures into being attracted to adults who could
02:06:25.800
consent to have sex with them. And yet that doesn't fully capture or even explain the picture
02:06:35.740
we're seeing when you describe something like that Zoom atrocity, where you have people who know that
02:06:42.340
they're watching a child getting raped and they're happy to do this. I mean, that's analogous
02:06:49.440
to a heterosexual man, a man who's attracted to women,
02:06:57.320
being invited to watch a woman being raped on camera, you know, in a Zoom session. What sort of
02:07:04.340
heterosexual man is a part of that project, right? That's the culture of unethical pedophiles that
02:07:14.560
has to exist for this whole problem to exist. You have to know that what you're doing
02:07:22.900
is facilitating, motivating, enabling the mistreatment, and in many cases,
02:07:31.000
torture is not the wrong word for it, the torture of children. That's where we can be far more
02:07:36.200
judgmental of what's happening here. I don't know if you have anything to say about that.
02:07:40.100
I mean, absolutely. No, absolutely. I don't mean in any way, by saying that
02:07:44.760
I have sympathy for somebody who is born a pedophile...
02:07:47.620
Yeah, I wasn't taking it that way. Yeah. I share your sympathy.
02:07:49.940
And I totally agree that even if somebody is born a pedophile, there is no room to trade or
02:07:58.020
share or actually abuse a child. I am deeply sorry if that is the position you find yourself in. If you are a pedophile,
02:08:07.120
I am sorry. But I still feel extremely strongly: there is absolutely no circumstance in which it is ever
02:08:16.080
okay, whether you film it or not, to abuse a child. There is no consent. It's right where we started.
02:08:23.200
There is no consent. There is no opportunity for this child to agree. I mean,
02:08:29.400
some of these people, and whether they're pedophiles or just terrible people, not to say that those are
02:08:35.300
the same thing, some of them will bend over backwards to say
02:08:39.300
that the children like it, that this is what loving a child means. You
02:08:45.960
wouldn't imagine the number of times that some of these people told me,
02:08:49.960
if you could only see it, you would see how much they enjoy it. To that, I say, you're doing
02:08:56.740
terrible things and you need to be punished for them. Right. And we need to figure out a system.
02:09:03.300
Well, clearly, the people who are saying that sort of thing, and that's why I have these questions
02:09:07.980
around the culture of this, anyone who's saying, listen, we pedophiles are a happy
02:09:15.120
lot and we treat children well, and if you go back to ancient Greece, this was a norm, right? You know,
02:09:21.780
presumably Plato was doing this to the boys on the block and no one minded, so, you know, get over
02:09:28.000
yourselves, 21st-century people. Presumably, even these people can't say with a straight face that,
02:09:35.640
as you report in one of these articles, you know, an infant being anally raped is enjoying
02:09:41.700
this. Right. I mean, it's just like, there's no way. So I put the question to you:
02:09:47.780
are there pedophiles who are acknowledging that part of this picture is every
02:09:54.220
bit as horrific as we think it is, and then pointing to some other part of the picture that
02:09:58.640
they consider benign or are they not making those concessions? I mean, the one who I spoke with
02:10:05.060
most extensively insists that the children enjoy it. And the only distinction I could start to get
02:10:17.120
them to draw is prepubescent versus postpubescent. I mean, I said, okay, let's leave aside the postpubescent,
02:10:25.080
even though it's still incredibly wrong to take advantage of any child. Leave those
02:10:32.400
people aside. How can you say that these prepubescent children are consciously making
02:10:39.120
the decision and understand the ramifications and even further enjoy this activity? And I mean,
02:10:49.180
if there's such a thing as privacy absolutists, there are child sexual abuser absolutists. And actually,
02:10:55.600
Sam, it's a big part of the culture. It's similar to many other internet cultures where they radicalize
02:11:01.040
one another. That's what's going on in that Zoom room. They're
02:11:04.780
radicalizing one another. They're trying to normalize their behavior. They're trying to
02:11:10.680
share it amongst other people in order to make themselves feel like it's more normal. And when
02:11:16.180
I was speaking with this person, he finally came to understand that there was no way in hell I was
02:11:21.020
going to look at any of this type of imagery, and that all I was trying to do, honestly, all I was
02:11:25.440
trying to do, was find out more information about how he was managing to keep his site up and running.
02:11:29.660
Listening to his belief system, unfortunately, happened to come along with that bit of
02:11:36.320
reporting. But there are people who fundamentally are telling themselves that this is an okay thing.
02:11:43.040
Well, Gabe, we have gotten deep into the darkness together and I just want to thank you for taking
02:11:49.880
the time to educate me and our listeners. And, again, anyone out there who has even a semblance of a
02:11:58.780
privileged position with respect to this, working in tech, having a brother or sister who works in tech:
02:12:06.880
Please start putting your shoulder to the wheel here and figure out how to make this a prominent
02:12:12.360
problem that will be emphatically solved at some point in the near future. Because clearly,
02:12:19.320
if we don't have the technology that can solve it today, that's coming. And if we incentivize
02:12:24.900
ourselves to produce it, we'll do so and we can get the policy right. But clearly, what we have now is
02:12:30.720
something bordering on a moral catastrophe. So again, Gabe, thank you for all your hard work on this.
02:12:38.560
Thank you so much, Sam. I'm sincerely grateful for the opportunity to discuss it with you.
02:12:42.940
Well, as I said at the top, this conversation is a few months old, and I've gone back and asked
02:13:03.120
Gabriel if anything has happened in the meantime. The biggest update was actually the results of
02:13:10.160
all the New York Times coverage Gabriel produced. Apparently, there are two additional bills that
02:13:16.080
have been introduced in Congress. The first was introduced by Lindsey Graham and Richard Blumenthal
02:13:21.380
and it's called the EARN IT Act. And if it passes in its current form, companies will lose their
02:13:28.260
Section 230 protections when it comes to child pornography. The second bill was introduced by Ron Wyden
02:13:37.140
and it seeks $5 billion in funding, which would be amazing for law enforcement and others who are on
02:13:45.220
the front lines. And I believe that funding would be over 10 years. So this is a hopeful sign. Once again,
02:13:52.740
thank you to Gabriel and his colleagues for doing so much work here. They certainly brought this problem
02:13:58.020
to my attention. And now I brought it to yours. Thanks for listening.