The Culture War #49 - Adult Content Sites Target Kids, Exposing Porn Industry Corruption
Episode Stats
Length
2 hours and 2 minutes
Words per Minute
192
Summary
In this episode, we're joined by Arden Young and Eric Cochran of Sound Investigations, both formerly of Project Veritas, to talk about the growing problem of children having access to pornography online, and how the porn industry is making money off of it. They talk about what it means, why it's a problem, and what we can do about it. Timestamps: 3:00 - Porn is available to pretty much all of us at any age; 7:30 - Pornhub is all over the internet; 8:15 - How Pornhub makes money; 9:20 - The porn industry doesn't care about morals; 17:10 - What are we to do with porn?; 18:00 - Pornhub's strategy; 19:30 - What do we need to know?; 23:10 - What are you going to do to protect your kids from porn?; 33:30 - What does Pornhub do with consent?
Transcript
00:00:00.000
Have you heard of anything more chilling than frozen beef?
00:00:04.160
Until November 3rd, get an always-fresh, never-frozen Dave Single from Wendy's for only $4.
00:00:09.600
Nothing scary about that. Taxes extra at participating Wendy's until November 3rd.
00:00:18.800
I'm William Woodhams, CEO of the British-born sportsbook Fitzdares.
00:00:22.220
We've been over in Ontario for over a year now.
00:00:27.040
It's impossible to outsmart Canadians on the ice.
00:00:29.520
So at Fitzdares, the world's oldest bookmaker, play with us on anything, anything except hockey.
00:00:35.000
With our world-class casino and 150 years of experience, you're in great hands.
00:00:45.460
One of the issues we talk about quite a bit on a variety of shows on the internet,
00:00:49.900
on Timcast IRL and the morning show is the ubiquity of adult content online
00:00:59.520
And a debate that we've had before is, you know, in the real world, in meatspace, as we call it,
00:01:04.740
if a kid were to wander down the street, they might see a pizza place, they might see a bookstore,
00:01:09.860
they might see a music venue, and they might see, I don't know, a gentleman's club or an adult bookstore.
00:01:20.200
And if that kid walks up to the door and knocks on it, they're going to say,
00:01:24.040
And if a kid tried sneaking in, they'd say, ID, please.
00:01:26.700
And sometimes kids do sneak in to adult bookstores or they have fake IDs.
00:01:30.160
But for the most part, we create barriers to prevent minors from getting access to certain materials.
00:01:35.040
For the same reason, you know, we don't allow people to smoke at a certain age or get tattoos.
00:01:38.880
But now because of the internet, it's kind of actually, it's just at the window.
00:01:42.880
Now you have people actually arguing that there's nothing we can do about it anyway,
00:01:51.240
There are many more libertarian individuals who argue,
00:01:53.320
we can't require identification for these sites because then you're going to have
00:01:56.820
the government and companies saying everyone has to have an ID to go online.
00:02:01.260
I don't know that the answer though is just let children wander the streets figuratively of the
00:02:07.280
internet and get access to the most extreme and psychotic adult content imaginable,
00:02:13.520
which could include even outside of adult content, videos of murder, rape, beheadings,
00:02:22.520
And so I think there's questions that we need to ask about this.
00:02:24.340
So we're gonna have a conversation about this and more, more so what these industries are
00:02:29.260
knowingly doing to make money and how they don't actually care.
00:02:32.880
So we're being joined by a couple individuals who actually investigated this.
00:02:40.460
I'm an investigative journalist under Sound Investigations and I recently went undercover
00:02:51.560
Before that, I was the sources manager at Project Veritas with James O'Keefe.
00:02:55.920
And yeah, founded Sound Investigations, hired Arden.
00:02:58.700
We went undercover and, you know, did the James O'Keefe style to Pornhub.
00:03:04.960
He's created more people who are going and exposing a lot of this corruption.
00:03:08.680
The crazy thing is, in this investigation you guys published, you've got someone working
00:03:16.300
And they're saying outright, like, they don't care about morals, they're making money.
00:03:21.160
And why don't you guys tell us what's going on and we'll get into the philosophy and the
00:03:25.620
So I think, total, we recorded around a dozen employees of Pornhub or its parent company.
00:03:33.620
And we got them admitting to all sorts of illicit, illegal and scandalous practices, really.
00:03:42.100
Lack of age and consent verification all the way across the board.
00:03:46.380
Not reporting to law enforcement when they see underage videos being uploaded.
00:03:53.220
Purposefully advertising to pedophiles and young teens.
00:03:59.500
Also, purposefully marketing gay, trans, and bi content to historically straight viewers,
00:04:07.040
which, of course, they track a viewer's life path.
00:04:09.260
In that video, that one's really shocking because the individual expresses their intention
00:04:14.720
to, like, queerify or something, to, like, try and skew the preferences of straight men
00:04:23.960
So basically, these people are doing some kind of reverse conversion therapy or just outright
00:04:33.240
How does it come to be that you guys decided to start looking into Pornhub or, you know...
00:04:39.260
So, you know, back in 2020, we were both working at Project Veritas at the time, and an article
00:04:47.860
came out in the New York Times called The Children of Pornhub.
00:04:52.580
He did a big article, a big op-ed interviewing victims of sexual abuse who had their abuse
00:05:04.000
And this article was one of the first big blows to the internet porn distributors.
00:05:15.820
It owns basically all the big porn sites you can think of and all the big porn studios
00:05:25.600
It caused Pornhub to have to delete 80% of their videos.
00:05:30.520
They had to remove downloads, free downloads from their website.
00:05:36.860
And I was like, nobody's really done undercover journalism against the porn industry when they're
00:05:44.860
clearly doing these illegal things and just nobody's bothered to do it.
00:05:51.580
And then early last year, I called Arden and said, hey, I have this idea.
00:05:59.640
And then in a few months, these guys, they're based in Canada.
00:06:05.180
So I don't think they knew who James O'Keefe or Project Veritas was.
00:06:10.140
They weren't on guard like Google employees might be to think of undercover journalists.
00:06:21.580
One guy said, yeah, I wouldn't be able to defend this in court.
00:06:24.620
Human traffickers are using our website to make money.
00:07:00.640
Pornhub itself is either the eighth or ninth most popular website in the entire world.
00:07:06.100
Uh, and that doesn't include the, this company owns, uh, like almost all the porn sites you
00:07:20.700
So, you know, the U.S. government has kind of limited things.
00:07:24.080
A lot of people just kind of have to go after them in civil courts.
00:07:29.720
I, I had the, uh, the privilege of meeting Kim Dotcom in New Zealand, and his story is very similar.
00:07:39.980
That was, what, 10 years ago? 12 years ago? Longer than that,
00:07:42.700
probably. There was a file upload locker website.
00:07:48.040
You go to Megaupload, you say, I want to store files online.
00:07:51.900
People were uploading movies and TV shows that anyone could watch.
00:07:58.160
You wanted to watch a bootleg and you'd search for it.
00:08:00.200
Megaupload would pop up and you'd get a link and you'd watch it.
00:08:12.780
He's not an American, did not run an American company.
00:08:15.580
And the United States went after him in New Zealand because of piracy.
00:08:21.600
What Kim says is when they would contact him and say like, Hey, this is infringement.
00:08:32.160
They decided just to destroy his life and target him.
00:08:34.900
So knowing that story, having gone down there and interviewed him and let's assume the worst
00:08:42.060
He was gleefully supporting copyright infringement.
00:08:47.200
But you mean to tell me with Pornhub, they've got underage victims, trafficking victims.
00:08:53.060
And the US government's like, oh, geez, oh, no, our hands are tied.
00:08:57.080
And that's a really good point because Pornhub has, you know, a lot of people say they take
00:09:04.640
copyright infringement more seriously than actual abuse because there's this fingerprint
00:09:10.240
technology that they can apply to the videos on Pornhub.
00:09:14.460
Any, you know, major tech website has it where if you upload a 30 second clip of the Lion King,
00:09:21.920
And so there's this technology being applied to videos of like major popular porn stars so
00:09:33.140
But when it comes to victims and underage victims, they're very, very, very slow to apply
00:09:40.960
And the key is that they knowingly do this and they profit off it because, for example,
00:09:46.420
there was a Senate hearing this week grilling, like, Mark Zuckerberg and, you know, TikTok
00:09:53.700
and Snapchat about things that could potentially be problems and children using their websites.
00:10:00.280
And, you know, people have different opinions on that.
00:10:03.180
But certainly they're not trying to make money off of exploiting children and copyright infringement,
00:10:11.760
But that was the key was in these undercover investigations.
00:10:15.080
The employees admit it's not just a hypothetical problem.
00:10:18.560
It's not just that human traffickers could use our website.
00:10:24.560
They admit that they do it and they don't care because it makes a lot of money.
00:10:28.480
One guy told Arden on the undercover tape he was so concerned about it that he and another
00:10:35.600
employee went to the chief legal officer and the chief product officer and told them like
00:10:42.880
this is an issue, like these guys are making a lot of money and they told him to shut the
00:10:46.500
F up because it makes too much money and we don't want to hear about it.
00:10:51.720
I can respect if like somebody was a whistleblower and came out and said, I can't believe this.
00:10:57.200
But I cannot respect someone who decides to keep quiet, keep facilitating child trafficking.
00:11:10.280
I mean, the fact that you guys captured as much as you did, I mean, these people are
00:11:13.660
basically bragging, or not even, I mean, maybe they're venting.
00:11:16.980
Maybe they're hoping for some kind of, like, it's a mixture of some bragging, some
00:11:23.640
venting. Like, if I just tell people, maybe it'll make it all feel OK, maybe they'll tell me it's OK.
00:11:27.680
And I think, you know, James O'Keefe mentioned this.
00:11:33.620
Like, man, the stuff we learned about what was going on, and it's not just Pornhub, this is an important
00:11:37.600
thing. A lot of people have said, you know, Pornhub gets the brunt of it.
00:11:47.740
I bet OnlyFans is going to completely supplant Pornhub in the future.
00:11:55.780
Like you said, like a lot of people are like, why did they just get the brunt of it?
00:12:00.160
But we, you know, Sound Investigations, we're a new venture.
00:12:04.620
We want to investigate more, without saying too much.
00:12:04.620
We have plans to go after more of these criminal organizations and figure out, you
00:12:14.740
know, what they're doing criminally, how they're affecting the culture, that kind of thing.
00:12:18.840
I'm actually surprised the New York Times kicked off this story.
00:12:22.860
I mean, it was just that egregious.
00:12:26.000
Um, yeah, like, it's not a political issue. People, uh, even people in the porn
00:12:33.020
industry have said, like, there have to be regulations.
00:12:36.140
Because, like you said, in the 70s, you know, kids couldn't just walk into sex shops.
00:12:44.220
You know, we're talking about just completely unfiltered, uh, websites.
00:12:50.260
Um, now, in the past few months, more and more states are passing
00:12:55.520
ID laws to require IDs to access these websites, similar to how you have
00:13:02.500
to show an ID to go into a sex shop or to buy alcohol online.
00:13:06.480
Um, but actually two of the guys that we recorded are getting subpoenaed in a, um, in a big civil
00:13:16.940
It's a child sex trafficking case out of Alabama where a bunch of victims of child sex trafficking
00:13:23.280
that was monetized on Pornhub are suing the company.
00:13:32.200
The one guy was like, yeah, I wouldn't be able to defend this in court.
00:13:36.700
And he even admitted to me on camera that they have corporate attorneys that coach them
00:13:48.500
I'm, I'm just so sick of all of this, you know, cause we've been talking about the, you
00:13:52.840
know, uh, the big news recently has been like border stuff.
00:13:56.200
And I keep hearing excuses for the people facilitating all of that kind of, uh, trafficking.
00:14:02.640
I mean, some of the people that are being brought across the border in the United States are,
00:14:05.280
are sex trafficking victims, but it's just a lot of people want to make excuses because
00:14:10.240
someone's wearing a badge, you know, these, these CBP guys, well, you know, they're working
00:14:15.400
They're putting them on buses for the smugglers, but they were told to do it.
00:14:21.480
So the bigger picture, the reason I bring this up is I, you've got, you've got these
00:14:31.640
And, you know, between Montreal and Eastern Europe, it's between like 1,000 and 3,000.
00:14:37.560
And I would say maybe 500 to 800 of those are in Montreal.
00:14:47.400
At the very least, what needs to happen with all of their employees is subpoena, criminal
00:14:56.320
There is a criminal investigation out of the Eastern District of New York where Pornhub admitted
00:15:01.460
to profiting off of sex trafficking, but they just got a slap on the wrist.
00:15:05.500
Basically, they got put on like a three-year probationary period, where if any new sex trafficking
00:15:11.620
material is found on the site, then the criminal charges become permanent.
00:15:16.940
It's like, I don't care if you work in the mailroom.
00:15:19.320
If you knew this stuff was going on, we got a pair of cuffs waiting for you, but nothing
00:15:27.880
It's kind of the problem of a, of a giant company.
00:15:30.700
Um, but I think, like Arden said, you know, the Eastern District
00:15:39.280
They have a three-year probation, but I think now is when they might be vulnerable.
00:15:45.920
We need to get more stories on them and, uh, and then continue, um, you know, getting those
00:15:55.840
And these ID laws that are passing across the U.S. are really, really impacting them.
00:16:03.320
Um, Pornhub decided to comply with the ID laws in Louisiana and they reported after that
00:16:10.440
an 80% decrease in traffic to the site out of Louisiana.
00:16:14.200
So compliance with age verification actually decreases traffic by 80%, which is why they just
00:16:21.760
decided to completely block access to everyone in states where more of these laws are passing.
00:16:26.220
Well, there are a couple of issues there. For YouTube,
00:16:30.840
If you make a video on YouTube and they determine it to be age restricted content, the only way
00:16:36.520
to watch it is to log in and that shatters your traffic, not because you're making porn.
00:16:41.800
Like I could make a video where it's like, look at this, you know, violent video that
00:16:47.180
And then nobody watches it because they won't recommend it.
00:16:50.740
And if you click on it, it says, please log in to view this video.
00:16:57.400
So Pornhub is basically like, people just want to go to the site and watch.
00:17:03.420
They should legally have to have those barriers.
00:17:05.560
And if that means they operate only at 20%, well, congratulations.
00:17:10.360
But what, so break down for me what the law does specifically in, in, in these states with
00:17:14.720
Yeah, it's a little bit different in each state, but most of them are based on the Louisiana law.
00:17:22.500
Um, I'm not, you know, an expert in all the legalese, but it does require that
00:17:28.440
if more than two-thirds of the content that your
00:17:34.780
site hosts is, uh, pornographic, then you need to
00:17:44.340
verify somebody's ID through a third-party vendor.
00:17:53.340
It's like, okay, so you can, you can have, you can have porn.
00:17:56.920
Well, I think the idea is to prevent like Twitter and social media.
00:18:01.560
I definitely understand the idea, because it's like, again, what do we get into?
00:18:08.340
Like the one-offs, like where, yeah, somebody could have uploaded, you know, porn to some
00:18:13.680
part of the site, but the site isn't really for porn.
00:18:16.180
Um, I think it's to prevent like social media from getting caught up in that.
00:18:26.640
It's where everybody's talking, everybody's hanging out and you can choose different neighborhoods,
00:18:30.660
little pockets where people are talking and retweeting and stuff like this.
00:18:37.700
And there's an adult bookstore, but the windows are blocked out.
00:18:46.340
Twitter should have to do the exact same thing.
00:18:53.060
If you're on any social media platform, so that, you know, people have debated this,
00:18:57.400
like if you require an ID for Twitter, then it damages people who want to speak
00:19:04.200
And I'm like, yeah, you can walk downtown and you don't need an ID.
00:19:08.840
It's your right to privacy; even in a public space, they can see your face and
00:19:13.480
But if you want to go into a porn shop or you want to buy cigarettes or booze, now we're
00:19:20.000
Twitter, Facebook, Instagram, Snapchat, whatever.
00:19:23.060
If somebody posts porn available for anyone to see, I don't see a difference between
00:19:28.680
that and walking around downtown, holding a big poster of porn.
00:19:32.040
You're going to get, it's, it's obscenity laws.
00:19:35.120
They come, they take it down and say, we're going to arrest you.
00:19:37.680
I mean, honestly, the cops may come down, destroy the sign and say, don't do that again.
00:19:46.680
Actually, it was one of our staff, in our internal communications platform, where a guy
00:20:01.280
And a woman saw him through the window and called the cops.
00:20:09.360
Another woman claimed she saw him standing in the doorway.
00:20:12.300
He made eye contact with her and like moaned or something, which I really don't believe.
00:20:16.820
And he ended up getting convicted of a misdemeanor for being naked in his own house.
00:20:21.640
He appealed it, risking going to jail and then ended up winning.
00:20:30.340
Like, it's funny to me that this guy got criminally charged in the
00:20:34.420
US, and these kinds of people are outright saying they know what they're doing.
00:20:41.540
If I think that, uh, you're like, if somebody is in public putting up flyers with adult graphic
00:20:50.880
We should not tolerate it on, on X or Instagram or any other platform.
00:20:54.140
And like you said, there's laws in the books for this already in the physical space.
00:20:57.380
Um, but how does it not apply online? It does apply, it does.
00:21:04.060
And, um, and so, you know, now we've got like more and more laws about digital space, but
00:21:08.740
you know, in reality, they could enforce the existing laws.
00:21:12.780
Um, the other thing is that they need to enforce laws for age verification for the
00:21:18.300
people being uploaded, because of so many victims. You know, this is the original thing
00:21:24.660
the, the New York Times talked about, and these employees, uh, talked about
00:21:28.780
in the, in the undercover investigations: the complete lack of verification of sex trafficking victims.
00:21:36.920
Um, again, you know, back to the seventies: in pornos, you have to have age
00:21:44.460
verification and consent verification forms for adult, uh, film stars.
00:21:49.120
But now, you know, Pornhub, you can, you can upload content.
00:21:53.100
You can upload, uh, anything. You know, what's crazy is, uh, OnlyFans.
00:21:57.580
Cause there was one instance where this young girl, she was 17.
00:22:09.620
And then everyone kind of asked the question, she's underage in those videos.
00:22:14.480
Those videos, it was like, she turns 18 and within minutes
00:22:17.300
She's got all these videos up and they're like, no way she filmed all that in a few minutes.
00:22:23.400
And she can just say, no, I filmed them when I was 18 or whatever.
00:22:29.600
I mean, there's, there's, there's questions there on how you handle that, uh, legally
00:22:33.560
But I guarantee if they went onto her computer, they would see, you know, date created, and
00:22:38.240
they would see that she was underage when she filmed these videos and then uploaded herself
00:22:43.240
I kind of think we probably got to go to a full-scale regulatory model of
00:22:50.540
Meaning you got to get a permit from the government before you can upload pornographic content.
00:22:55.860
I think that's basically how it, how it existed 50 years ago and really how it existed before
00:23:00.320
the mass tube sites existed, uh, going back to the nineties, uh, you know, I think that
00:23:07.620
that's really how it was. You have to have, it's called Section 2257 of the child, uh, protection
00:23:14.980
And you have to have forms that show this model is here.
00:23:25.460
If you're, if you're like a studio, you have to have that.
00:23:28.600
But if you were just some guys, you know, some group, some
00:23:35.340
fly-by-night studio uploading to a tube site like Pornhub, um, then they don't ask for any of
00:23:43.500
Like if you're a creator, you have to like submit an ID or something.
00:23:52.960
You need to, um, verify with your face and your ID next to your face.
00:23:58.900
Their website says you have to be at least 18 years old to create an account, to access
00:24:07.340
You can't even sign up unless you have that. But I also think there's
00:24:15.300
The ubiquity of porn and where we're at as a society.
00:24:19.880
Libertarian argument is, you know, let people do what they want to do, but I think it's causing
00:24:23.940
severe depression, it's causing psychological issues in young people.
00:24:27.960
And the challenge then becomes, do we decide to cross a moral line and say, we get it.
00:24:36.440
These people are adults, but we've, we crossed the Rubicon on how this is screwing people's
00:24:42.240
And I don't know, what do you guys think about that?
00:24:45.560
Well, I don't even have to speak about what I think, because one
00:24:49.420
of the guys in the undercover tapes who works at Pornhub and worked there for over 11 years,
00:24:55.880
he was the seventh employee at Pornhub, literally says that porn's not healthy.
00:25:10.300
He actually brings up Jordan Peterson and completely mirrors his view on porn.
00:25:15.480
The guy's a product manager at Pornhub and hates the, and hates the product he makes.
00:25:20.700
If you're a, if you're a crack dealer, you don't do your own product.
00:25:24.760
Like every day you're slinging crack, the crackhead walks up to you and they're all messed up
00:25:28.740
and you're like, I ain't touching that stuff, but you can have it.
00:25:37.140
And I think it brings up a question of, um, do people actually like porn or are they addicted
00:25:44.480
or, when they were children, did they get addicted?
00:25:49.160
Um, and you know, are they just, are they just drug dealers?
00:25:53.300
I feel like, uh, you know, mostly men, but women too, they get excited, right?
00:26:01.980
And what happens is you make it snap-of-a-finger easy to satisfy that urge through
00:26:10.100
the internet and you get a fork in the road, go to the bar, try and meet some women or
00:26:19.080
I suppose for women, a lot of them, it was, like, 50 Shades of Grey or whatever, but they'll,
00:26:23.320
they'll satisfy that in some other way, which does just enough in, you know, their dopamine
00:26:27.900
or whatever, whichever receptor is related to that, that they just abandon, you know,
00:26:35.940
It's kind of crazy to think about if you go back before the era of porn in any capacity,
00:26:39.680
if a man and a woman were feeling randy or whatever, you better go negotiate with another
00:26:47.080
Like guys got to go to a woman and be like, I need to convince you to get me some.
00:27:04.500
And one of the things about the Texas age verification law is that it also states
00:27:10.240
that any pornography website has to include a disclaimer about the social and health implications
00:27:21.480
And, unfortunately, uh, there was an injunction against it, but, uh, Paxton
00:27:27.720
has been fighting it, and the injunction has been lifted while
00:27:34.460
So he's doing a great job of, of pushing that law.
00:27:37.420
What was it, he said he was fighting the injunction?
00:27:42.980
Unfortunately, some court, uh, sided with, um, there's a group called the Free Speech
00:27:49.000
Coalition, uh, which is really just, like, Pornhub's legal arm.
00:27:52.960
Um, and they never fight for free speech for anybody else except for pornography companies.
00:27:58.460
And, um, uh, and so they, they've been fighting all of these in courts.
00:28:04.500
The only place they've succeeded is in a court in, um, in Texas.
00:28:09.320
Uh, so they had an injunction against the law, but Ken Paxton has been appealing the injunction
00:28:14.600
and has gotten a lift on the injunction for now while the appeal is still going on.
00:28:19.560
We had, uh, someone on this show talking about the books that are in, in grade schools, like
00:28:27.540
And, uh, she's just like, well, I'm not for censorship.
00:28:35.940
When we, when we say we're against censorship in a colloquial context, we're referring to
00:28:40.240
Someone has a political opinion you disagree with?
00:28:41.900
They should be allowed to express their opinion.
00:28:45.820
But when it comes to censorship, I'll say this outright to her.
00:28:53.020
Like adult graphic content being put in schools for kids?
00:28:59.760
And nobody should, but they try to, they play this trick.
00:29:05.320
Censorship is about, like, political ideology, not showing kids porn, you creep.
00:29:11.440
I wonder what the next generation ends up looking like when not only is it ubiquitous, but what
00:29:17.920
you guys uncover is employees, they know how bad it is.
00:29:21.820
And with smiles on their faces, like I'm gonna keep doing it.
00:29:25.020
And, and we're seeing more and more reports coming out.
00:29:27.920
I think there was one out of the UK recently on how a growing number of sexual crimes against
00:29:33.660
children are actually committed by other children.
00:29:35.660
And it calls into question, like, what has online pornography done to this young generation?
00:29:43.220
I was exposed to pornography, unfortunately, by another child at a very young age, and I
00:29:49.100
Luckily, I, you know, I didn't get addicted or anything like that, but it bothered me for
00:29:55.200
I can't imagine how many kids nowadays are being exposed to things like that.
00:30:00.160
It's crazy because if you really think before the internet, you know, it's the, the trope
00:30:05.860
in movies of the kid being like, yo, I stole my dad's nudie collection.
00:30:09.140
Let's go down by the river and read it or whatever.
00:30:11.300
It's like, I don't know, I guess it's what kids did or maybe the movies claimed, but even
00:30:14.760
before magazines or whatever, this, this concept did not exist.
00:30:18.020
And seemingly overnight, every child growing up that's online has in schools, there's nothing
00:30:27.420
I don't think a functioning society can tolerate the ubiquity of, I'm not just
00:30:38.240
You know, I think the older generation, when they hear porn, they imagine like something
00:30:43.200
from the seventies where a guy's like, Hey babe, I'm the pizza delivery man.
00:30:45.900
And she's like, why don't you come in and I'll pay you.
00:30:49.020
The videos that are online now are the wildest, craziest things you can imagine earmuffs for
00:30:55.640
Cause I'm, I'm just going to come out and I'll be light with it.
00:31:00.640
I'll leave Jake, you got out of this one, but there are videos of like animals, like kids
00:31:09.640
I, I don't even want to mention, but I think let's just say, well, we'll stop at animals,
00:31:15.640
videos of people and animals, kids can find that stuff.
00:31:20.320
And I think, like I was saying, a lot of these adults, when they're talking about, when you,
00:31:25.720
when you talk about porn, they're imagining like a playboy, like a hustler, like a nudie
00:31:31.740
Videos of just like seven people with weird objects swinging from ceiling fans and a dog
00:31:42.440
And these kids are watching it and I'm, I'm like, man, their, their brains are going to
00:31:50.360
And it's like, um, it's like Dylan Rice said in the one video. He's a, um,
00:31:57.820
He was a script writer for, for a lot of the studios at MindGeek, the owner of Pornhub.
00:32:03.340
Um, and he talks about the way that they try to introduce more and more extreme
00:32:12.480
He said, he says, quote unquote, straight men, you know, we had these sites dedicated
00:32:17.120
to straight men, but then we see, hey, can we introduce bi content?
00:32:23.080
How can we do something that's more counter-cultural?
00:32:26.140
How can, you know, and they talk about pushing the envelope. And he even says to
00:32:30.980
me, uh, you know, do you know what the main audience is for Trans Angels?
00:32:35.720
It's one of their paid subscription sites and it's, it's all trans people.
00:32:39.340
And it's, it's mainly female presenting trans people.
00:32:43.680
And he said, our main buyers for, for that site are straight men.
00:32:50.960
I think, you know, like he says, see if you can convert somebody.
00:32:54.440
So I, you know, I can understand that where it's like a one component of what you guys
00:32:59.780
found was he's saying, we want to introduce queer content to start shifting the perspectives.
00:33:04.160
And so I would just put it like, if you're a guy and your whole life, you've been watching
00:33:11.580
And then one day someone says, here's a guy and you go, I like that.
00:33:16.680
And so it's a, it's a question of, yes, we can acknowledge they are trying to manipulate
00:33:24.880
But I'm kind of just like, I think those people were just gay.
00:33:32.180
However, I, you know, unfortunately did visit trans angels.
00:33:35.880
Uh, and from the waist up, a lot of these actors, uh, really do look like natural women.
00:33:44.860
And I, if I didn't know it was trans angels for some of them, I would have been like,
00:33:51.360
And, and so, uh, they, there's like funny memes where you'll see what looks like breasts
00:33:57.100
and then the camera zooms out and it's a fat guy's ass with like a bra on it.
00:34:01.140
And they're making the point of like, in your mind, you see what looks like cleavage.
00:34:08.680
And then it zooms out and like, you were actually getting off on like a fat guy.
00:34:14.040
But I could understand if you keep feeding content to these people and you inch them
00:34:20.460
incrementally towards shifting their view, that's, I guess, the argument they're making.
00:34:25.680
I kind of feel like though, if you're a, I'm sorry, if you're a straight guy and they show
00:34:30.300
you what looks like a woman and you're like, yeah, what a hot chick.
00:34:32.700
And then it pans down and you go, yeah, I'm okay with that.
00:34:38.000
But I think that means you are to some degree like bisexual or gay.
00:34:43.540
And if you are, I don't do whatever you want to do.
00:34:45.320
I don't guess like within, within, you know, don't hurt anybody.
00:34:48.040
But I think that just means they were gay dudes to begin with.
00:34:52.000
And I think that's probably the case in many cases.
00:34:55.020
I think what Dylan Rice is saying is that there with pornography, there's an excessive like
00:35:02.000
need for more, need for more to more extreme content.
00:35:06.380
You know, eventually you're just not into straight content anymore.
00:35:12.740
And so you like, they believe conversion therapy is real and works.
00:35:17.100
Well, I think they, they need to find, you know, eventually somebody only buys so many,
00:35:27.480
So like, yeah, he calls straight content vanilla content though.
00:35:31.960
But if you look at, you know, hold on, sorry, like all straight content.
00:35:37.700
He just says like, oh, you know, the pretty blonde girl with big boobs with the guy.
00:35:43.140
What if she's like swinging from a ceiling fan and they're wearing like parachutes or something?
00:35:46.560
Well, it might be more theatrical, but, um, I don't, I don't know, but what, what else
00:35:54.700
He said, uh, well, he's, he's talking about like, they need, you know, he says like, we're,
00:36:03.720
Like we need to push more and more content on people because again, they bought that,
00:36:10.480
you know, he says like, I think it's browsers is for straight guys.
00:36:20.840
Uh, the, the owner of Pornhub owns all these sites and how do we then like get them to
00:36:27.540
buy more subscriptions, to buy more videos to our other sites?
00:36:31.600
Well, like, you know, eventually they just run out of this is vanilla content.
00:36:37.500
So I think there may be like some serious scientific data in here, which could massively backfire
00:36:45.260
If this is true, I'm not going to, I'm not a scientist.
00:36:47.380
I'm not going to pretend, uh, I know exactly what happens to these guys, but if he's saying
00:36:52.100
in their pursuit of selling more content, they have found that you can introduce homosexual
00:36:59.660
content to what, what are straight, perceivably straight men, and they slowly start buying it.
00:37:06.840
If they have the data on that, that proves conversion therapy is a real thing.
00:37:09.840
And yeah, it's not even only like gay, trans, bi content.
00:37:15.700
It's also the incest, the step family category that's becoming increasingly popular, uh, that
00:37:23.260
one of the employees told me he was really concerned about it, having real world effects,
00:37:28.380
making people, viewers actually act out these fantasies or try to with family and it causing
00:37:35.160
actual world issues, not to mention the teen category is, uh, you know, one of the most
00:37:43.660
And now they claim, oh, this is 18 and 19 year olds, but we have young, very young looking
00:37:50.340
people in school outfits and things like that with teachers.
00:37:55.000
And it's, um, it's very obvious what they're trying to push.
00:37:58.400
Dylan admitted that they purposefully will cast actors who appear to be around 15 in order
00:38:05.800
to appeal to pedophiles, which they can easily turn into whales or big spenders.
00:38:13.720
I just, I just ban it all, make it all illegal.
00:38:17.960
Just the, everything we're hearing is just so nightmarish, but like the, the liberty minded
00:38:22.460
part of me is like, just regulate it and stop the dark stuff.
00:38:26.160
I just don't know if you can, like, what's the solution if, if it's available, even if
00:38:31.880
you're 18, even if you do ID verification, the, the idea that they're, they're peddling
00:38:37.600
this addictive substance as it were, and they're making you more.
00:38:41.520
And like there's the, it starts with vanilla and then every day you're using it.
00:38:47.520
They're sending you something and they're making you go crazy.
00:38:56.160
I mean, I guess I view it as like prostitution's always been around, oldest profession.
00:39:03.480
And, and then, you know, we, we enter the, the modern era of like videotapes and it's
00:39:12.120
still like, then we have, you know, we have red light districts or we have, you know, we
00:39:18.420
have studios, we have laws that require age verification, consent forms.
00:39:30.060
We, we hit the, the, the 21st century and then mass tube sites happen where user generated
00:39:40.220
And all of a sudden, like nobody's willing to regulate those industries either because
00:39:44.320
they don't know how, or because these companies suddenly made a lot of money.
00:39:53.140
The good news is the creation of porn with actors and studios and all that stuff is on
00:40:02.100
And I believe will likely cease to exist within a few years to be replaced by AI for real.
00:40:13.320
There's already, like I covered one of these stories, the video they can generate with AI.
00:40:22.020
There's a woman and she, there's a video of her at the beach and she's like shaking her hips
00:40:30.360
And they're making like 30 grand a month off of this.
00:40:33.360
So for what purpose would someone need to pay a woman when they can just sign up for a website
00:40:45.920
Where that goes is crazier because we're separating, you know, these people you've exposed who are
00:40:52.480
We know what we're doing and why we're doing it.
00:40:57.260
A website will just be like, you're allowed to generate AI content so long as it's not illegal.
00:41:01.400
And then all of a sudden you're going to get an individual.
00:41:06.060
They're going to go onto the website and they're going to type in, make this for me.
00:41:09.040
They are going to pursue in random crazy directions.
00:41:12.180
It's not going to be that some guy working at Pornhub is like, let's see what happens
00:41:15.900
if we send this straight guy, like gay content.
00:41:24.380
Everyone's going to have a weird personalized hyper porn experience if this is not regulated
00:41:31.420
And we already see where this is going with like the Taylor Swift stuff on X where they
00:41:36.620
just do these videos they posted of Taylor Swift.
00:41:43.360
I don't even know if there's a solution to that.
00:41:46.980
I mean, how do you regulate that at that point?
00:41:49.300
And, you know, how does the regulation ensure that you don't stifle AI development in general?
00:41:56.960
And it calls into question because there are more reports of AI child sexual abuse material
00:42:04.360
being generated in, you know, a variety of different ways.
00:42:07.220
But then the argument is, well, that it's not a real child being abused.
00:42:12.340
So one of the one of the questions as it pertains to that was there are adult women 20 or 30
00:42:21.700
And so you've got these people, these creepos arguing.
00:42:29.020
There was a show about a woman who suffered some hormonal disorder, which prevented her
00:42:35.060
from aging beyond what appears to be like eight years old.
00:42:37.840
And so she's like in her mid 20s, wants to have a relationship, but looks like a child.
00:42:47.400
And I'm like, I genuinely feel for that poor woman.
00:42:52.080
But any guy who is going for that, I'm sorry, like you're attracted to a child.
00:43:01.160
But so in that capacity, there have been people made these arguments.
00:43:03.640
They're like, oh, yeah, this is this this woman in this video.
00:43:06.980
Like you were mentioning, they try to make them look young.
00:43:11.160
Even though she looks 15 and she's wearing a school girl, school girl uniform.
00:43:14.580
Like the questions around legality and morality, this is super, super difficult.
00:43:19.200
Do you make illegal any kind of porn that depicts situations of minors, regardless of
00:43:27.760
In which case, if someone creates AI generated content and the female or male in it appear
00:43:35.680
to be underage, but there are adults who look like they're underage, too.
00:43:41.740
Unless we just say the circumstances around it.
00:43:48.600
I think the libertarians would freak out and be like, you can't just tell people they
00:43:56.020
I mean, I guess right now it's like a theoretical problem, you know.
00:44:00.840
Well, AI generated porn is all over the Internet already.
00:44:27.180
And they were deep faking her face onto adult women's bodies in porn videos.
00:44:33.820
And the mom found out this is going on like she's a minor.
00:44:42.000
Yeah, her face on the body, but the body is an adult woman.
00:44:44.800
So there's a new act being introduced called the Protect Act, and it clearly defines, you
00:44:51.020
know, expansive ways that people can be sexually exploited online, including through AI pornography.
00:44:59.820
And I think something like that could more clearly define in the law since the law is
00:45:13.840
So this is actually just this story is from two weeks ago.
00:45:17.540
17-year-old Marvel star and Dancing with the Stars performer Xochitl Gomez.
00:45:21.400
I'm probably pronouncing that wrong because it's pronounced.
00:45:31.140
Spoke out about finding non-consensual sexually explicit deep fakes with her face on social media
00:45:35.140
and not being able to get the material taken down.
00:45:44.920
Like, I'm not saying we will literally end up doing that.
00:45:52.240
OK, now let's negotiate on where the lines are that we're going to allow.
00:45:54.680
Yeah, I mean, on a personal level, I totally agree on on a realistic level.
00:45:59.480
My brain kind of goes to in a similar way how public opinion has changed so much over
00:46:08.720
Why not try to educate the public about the true societal and health implications of what
00:46:16.180
Well, you know, I'm really proud of millennials and Gen Z because they don't drink soda anymore.
00:46:33.100
My doctor tells me you should put lemon juice in water and drink that because in 2014,
00:46:37.720
I got a kidney stone, probably, I don't know, drinking soda and Gatorade or who knows what.
00:46:41.980
And so my doctor's like lemon juice and club soda.
00:46:51.020
Like everywhere I go, everyone's got Spindrift and I love it.
00:46:53.900
It's club soda with a little bit of fruit juice in it.
00:47:00.520
Our generation is destroying the soda industry.
00:47:04.080
And I'm like, how did we do something so awesome?
00:47:06.220
Like there's 50 grams of sugar in one can of soda.
00:47:09.780
And that's supposedly like 100% of your daily sugar intake.
00:47:14.360
And we've all kind of realized this is a bad thing.
00:47:17.500
I wonder if there's a similar cultural zeitgeist that could emerge where we're like, yeah,
00:47:23.700
Let's let's not because of government force, but because of a cultural like movement.
00:47:29.460
People just say, I'm not going to go anywhere near that stuff.
00:47:34.280
We even have celebrities like Billie Eilish talking about how messed up pornography made
00:47:44.940
She was talking about how she just wanted to watch the more next extreme thing.
00:47:54.340
And I personally was really surprised that Billie Eilish talked about that.
00:47:59.540
And I wasn't expecting a celebrity to come out against the adult industry because it's
00:48:05.200
the cool thing to side with the adult industry, right?
00:48:08.280
Especially with OnlyFans, you've got a lot of these.
00:48:10.880
This is the crazy thing about the porn industry is, man, there are people online that are so
00:48:18.500
If you even say something like, hey, this is bad.
00:48:21.040
They will try their best to mock and ridicule you into being scared to speak out against
00:48:26.340
You know, Jordan Peterson comes out and he's like, porn is bad.
00:48:29.740
And everybody's like, you know, he's got a point here.
00:48:32.020
And then instantly you get these people being like, ha, I'm a really cool young guy and
00:48:38.140
Like this stuff's clearly destroying people's brains, like melting their faces.
00:48:42.800
But I was, I mean, that's kind of crazy actually that Billie Eilish was one to speak
00:48:50.160
And so the expectation is that the majority of people who are addicted and going to these
00:48:58.340
I think the gap is kind of closing because I think someone, I don't know how trustworthy
00:49:04.920
of a survey it was, but they reported like 25% of females now regularly use pornography.
00:49:13.980
You know, I mean, 50 shades of gray is like people consider that to be porn for women.
00:49:23.400
I mean, do you guys like how much have you researched into like the psychology and the
00:49:37.080
What, you know, what study was skewed in some way.
00:49:41.260
I think it's very clear based on like real world observation what it does to someone.
00:49:46.460
And as a woman growing up, it was always very clear to me who, what men were regular porn
00:49:57.920
And I, like, I genuinely went out of my way to avoid those people in dating situations
00:50:03.160
because I thought it was extremely unattractive.
00:50:08.020
I bet that the people that like these porn websites can see, I'd be willing to bet that
00:50:14.020
men are consistent viewers of pornography and women as individuals, you'll have spikes
00:50:27.280
Women are like, okay, this day it's like a huge increase.
00:50:31.080
If, yeah, if you put like all men and all women up against each other, you'd probably
00:50:35.360
But if you look at individuals, guys are probably like consistently watching on these days and
00:50:39.680
women, it's like in bigger waves, men are shorter waves or whatever.
00:50:43.480
But I feel like the future, uh, as we go in this direction, if we don't do something about
00:50:48.760
it now, I, I, I think like the, the, just the AI VR deep fake stuff will result in the
00:51:01.740
You know, we see these sci-fi movies where they do like, uh, surrogacy.
00:51:07.620
They do like pod babies and there'll be like this idea of having a baby that the old fashioned
00:51:18.340
Where the guy's like, uh, uh, it's not necessarily like everyone's genetically engineered, but
00:51:24.480
And then he has to like lie about identity or whatever.
00:51:27.540
The reason, the way, the reason we'll get to that world is not because women want to
00:51:32.440
And they're like, I have no time to carry a baby.
00:51:34.880
It's going to be because men and women will not be sexually attracted to each other.
00:51:39.540
Like they're going to, they're going to plug their brains into their neural link.
00:51:43.780
And there's going to be like a dragon, a polar bear, you know, robots.
00:51:49.600
And they're going to be like, this is the only thing that works for me.
00:51:51.860
And then when they're in the real world, like let's have kids.
00:51:54.120
They're going to be like, okay, I'll go into my pod, you into yours.
00:51:56.480
And then we will, you know, robo inseminator, whatever the, whatever the, whatever the
00:52:01.060
F like people were talking about how, um, they sell insemination kits online and stuff
00:52:12.620
Like, it's the weirdest thing to me when someone posted this in like one of our super chats
00:52:16.600
that women can go to like target and buy applicators where the dude puts his, you know,
00:52:24.200
she gets some stuff from the dude and then she takes and then she goes and then she does
00:52:27.600
And I'm just like, I mean, it's an easier way to do it, you know what I mean?
00:52:30.860
Like, but they don't want to, I guess, or it's like some weird things happening where
00:52:36.160
And I wonder if outside of just like, we're talking about porn addiction and all that
00:52:40.200
regular familial relationships will be eradicated by this.
00:52:44.680
I mean, I think as a culture, we are seeing short term pleasure being prioritized over long
00:52:50.860
term fulfillment and it shows itself in a lot of different things, but especially within
00:52:59.720
I do think online porn has a lot to do with that.
00:53:03.280
I wonder if this is the apocalypse because they say, I'm half kidding, but they say, you
00:53:09.860
There are comedians and not even comedians, people have said that like every action a man
00:53:14.360
will take is related to trying to have sex or something like this, which I don't think
00:53:20.460
But the argument is like, why would a man want to be successful?
00:53:23.360
And if you look at the evolutionary psychology of it, the reason why a guy wants to be the
00:53:27.200
best hunter or in modern era, the, why he wants to be the best name in pro billiards or
00:53:37.320
And behind every action is the, the, you know, if you're a successful, powerful male who can
00:53:43.280
peacock successfully for the female, you will get the mate.
00:53:46.640
And so what happens to a world where nobody wants it or they can computer generate it?
00:53:51.680
Just end up like sitting on your couch all day eating Cheetos and having this beautiful
00:53:55.100
woman tell you everything you want to hear, but it's fake.
00:54:02.640
It does seem like there's a political component too, though.
00:54:06.840
So I'll ask you guys if you, if, you know, in this regard, you mentioned this story, one
00:54:12.180
of the big components of the story that we should dive more into is the guy saying we're, you
00:54:16.060
know, like queerifying or whatever, disrupting the norms.
00:54:19.700
Did you find a political element with some of these employees?
00:54:29.080
Um, I think it's, I think, I think Dylan Rice even said in the undercover tape, like we
00:54:36.320
don't have any, you know, uh, cultural bias or anything.
00:54:40.660
We're just trying to figure out, you know, what content pushes what.
00:54:44.660
I do think it's one of those things where their bias is just natural and it does reflect
00:54:51.100
I don't think you're going to be hired at a company like Pornhub if you're not, or if
00:54:57.280
you don't at least know how to, um, express pro adult industry viewpoints.
00:55:03.280
And it tends to be, uh, you know, like more progressive people, I guess, but you get a
00:55:10.280
mix of people from different political backgrounds, but it's just people.
00:55:18.640
I think the percentage of conservatives working in porn is 0%.
00:55:24.440
I mean, true, true conservatives, probably not.
00:55:27.040
I think there's a, probably a lot of like libertarian types and progressives and, um,
00:55:32.960
anybody who claims to be a conservative, like you go to a porn industry and they're like,
00:55:47.020
There was like some famous moment at the libertarian debates where they were arguing over whether
00:55:52.400
And when some, when I think it was Gary Johnson was like, no, he got booed.
00:55:58.800
And the argument is the market will decide, but parents should figure out the government
00:56:02.460
And I'm like, I mean, I get it, but there's gotta be some limits, right?
00:56:06.720
But yeah, I mean the industry, the adult entertainment industry as a whole is predatory.
00:56:13.080
And a lot of people don't realize like the owner of Pornhub is a former defense attorney
00:56:20.860
He defended, yes, child pornographers, child sex predators, child sexual abusers.
00:56:27.440
And he even, you know, spoke at this big attorney's conference, coaching other defense attorneys
00:56:34.440
of child sex predators, how to get shorter sentences for their clients.
00:56:41.060
He was looked at as one of the top defense attorneys for this type of subject matter.
00:56:47.800
Do you guys consider yourselves conservative or right wing?
00:56:52.520
And I wonder, like, is that a motivating factor for why it's like, we need to figure out what's
00:56:58.900
I think a large part was, I just saw nobody else going after it.
00:57:03.300
Like, we had these skills that we learned from James and other friends at Project Veritas.
00:57:13.280
I talked to other people who were kind of in the anti-exploitation arena.
00:57:17.560
And everybody was like, yeah, nobody's really gone undercover in talking to Pornhub employees
00:57:25.060
And we knew what a touchy subject pornography is to people all across the board, no matter
00:57:31.340
And I think we just both wanted to, like, push the red button, like push the button you're
00:57:37.600
I think in this case, it needed to be investigated.
00:57:41.280
I mean, I mean, holy crap, the stuff that these guys are saying to you.
00:57:44.000
So we're talking about human and child sex trafficking and they're like, yep, and it
00:57:52.220
It's, this stuff was a conspiracy theory before.
00:57:58.240
And now it's like, I got questions about how these guys can operate a website that has child
00:58:04.240
The worst that happened was they had to stop doing the thing.
00:58:10.040
And, you know, if they aren't caught continuing to profit off of sex trafficking, then their
00:58:16.740
criminal charges are dropped completely, which is crazy.
00:58:20.540
Can you guys, are you willing or able to talk about the techniques involved in the undercover
00:58:25.900
investigation or will that, like, blow the operation up?
00:58:39.180
We look up who works for the company and we see, we try to get as many names as possible,
00:58:48.980
And we really just use all publicly available information we can find on each person to find
00:59:03.280
Well, before you say it, I'll just tell you, my understanding is it's like Tinder.
00:59:07.780
You go on and you swipe until you find someone, but that's, that's like casting a wide net.
00:59:16.760
First was a one guy, he, he was making a phlebotomy app.
00:59:21.700
Uh, that's like phlebotomy is where you're like a nurse who, who draws blood.
00:59:26.780
And I don't really know why, why was he making a phlebotomy app?
00:59:30.560
They have things where you can like put the phone over the arm or like put a light on your
00:59:35.060
arm and it shows the veins or something like that.
00:59:38.080
I mean, his app was like for scheduling patients and things like that.
00:59:46.760
Cause he's not in the medical industry or anything.
00:59:49.080
He's in the porn industry, but this was like his side hustle that he was doing with a friend.
00:59:53.380
And, and so Arden was like, all right, I'll be a phlebotomist and like, uh, messages him
00:59:59.020
or something says that she's a nurse and she's doing some contract work and she's in town.
01:00:05.160
And, uh, and then, yeah, I got a couple of meetings out of that, you know, pretending to be
01:00:10.840
a phlebotomist, which he's never been in real life.
01:00:13.640
Like I was just hoping for no research on phlebotomy.
01:00:16.840
I mean, but this guy's not gonna know anything about it either.
01:00:19.340
And you, and, and, and, you know, if you're good, you can always just, he'll be like,
01:00:23.020
so when you've done insert technique, you'd be like, oh, we don't even do that anymore.
01:00:27.760
Well, the new, and you just make it up and he's gonna be like, oh, okay, I guess.
01:00:33.400
But like the, oh man, the, the, the trope with like project Veritas and these undercover investigations,
01:00:39.720
it's just like some guy meets some woman off Tinder and she's like, let's go grab drinks.
01:00:45.920
And then he's like, I know what will impress this woman.
01:00:50.120
Let me tell you about the crimes I've committed.
01:01:02.480
Like you message a guy claiming to be a phlebotomist.
01:01:08.160
And then you're like, tell me about the child trafficking you're involved in.
01:01:13.900
I just, I, you know, I asked him a bunch of questions that I am truly genuinely curious
01:01:19.780
And so, uh, that was all coming from a very truthful place.
01:01:23.420
He's like, I know what's going to get me laid telling my date about this child trafficking.
01:01:28.880
There's, there's this short book, very, very short book called.
01:01:36.760
Uh, and it talks about how to get information from short-term relationships with people.
01:01:47.640
People should read it, uh, and then go do this stuff themselves.
01:01:50.840
Like in real life, even just to improve relationships with people.
01:01:53.500
It's about listening to people, knowing what they want.
01:01:58.720
So the first thing you have to do is pretend to be a phlebotomist, but, but, but like, what
01:02:08.960
That was crazy because I made an account on a dating app was swiping in Montreal where their
01:02:18.440
And one of the first profiles I came across was of Mike Farley, who's the 11 year employee,
01:02:31.620
It didn't say a company name or anything, but I decided to just ask him where he worked.
01:02:47.640
I mean, there, there's millions of, how many million people live in Montreal?
01:03:09.660
I, I spent all day, you know, chatting with people, but within the first two hours, you know,
01:03:15.460
And I just say, um, Hey, what kind of tech do you do?
01:03:19.700
And he said, I'm the product manager at Pornhub.
01:03:23.220
And I was like, are you scared that one day you'll come into this moment where it's like,
01:03:28.900
you know, you're under the, you're, you're, you're, you're doing the sting operation.
01:03:32.620
He's giving you information and then, you know, he, he discovers it and you're like
01:03:37.740
outside in the rain and he's like, I trusted you.
01:03:43.220
Like, is there a risk of you actually falling for somebody?
01:03:50.720
Um, and also we definitely limit the amount of meetings and what we do at the meetings
01:04:06.000
I don't know the details on it, but there was like a guy who I think he was like FBI or something.
01:04:10.000
And he infiltrated eco terrorists and he ended up cause they're allowed to sleep with their
01:04:17.060
On the job, they're allowed to have sex with the people they're investigating.
01:04:21.020
And he ended up falling in love with her and then refused to testify against her.
01:04:31.220
Cause that definitely for us, like would totally cross the line.
01:04:38.840
I mean, I, I'm, I could be totally wrong about this.
01:04:43.420
Like if you're, if you're going undercover and you're going into a gang, like the trope
01:04:47.560
in the movies is there'll be like, all right, prove you're not a cop, do the drug.
01:04:53.240
And it's, and then, and then afterwards they're like sneezing and spitting.
01:04:56.960
And it's like, no, if you don't, they're going to know you're, you're a cop.
01:04:59.820
Like you, you're not doing this unless you're, you're in or whatever.
01:05:06.260
Law enforcement can do a lot of things we can't do.
01:05:09.580
We operate within the law and more, a lot more, but dating a guy is within the law.
01:05:13.560
It's just like, it's kind of dirty to go that far with your, Oh yeah.
01:05:20.400
Plus it's kind of like, it's kind of, I guess in the instance of a, of a fed meeting an eco terrorist.
01:05:27.680
It's like, okay, like, what's the worst thing you're dealing with?
01:05:30.840
They might smoke too much weed or whatever, but we're not like, if the woman genuinely is
01:05:34.320
not blowing up train tracks, like then there's like, what's the real problem?
01:05:37.520
In this instance, I suppose you're meeting a guy who's bragging about, you know, facilitating
01:05:41.200
It's like, I'm pretty sure we're not going to be into that or someone like that.
01:05:45.660
But, uh, but you know, what, what else is there?
01:05:48.600
You message this guy, he hits you up on Bumble and then you just ask him for a date or what?
01:05:53.260
And, and usually try to wait till I'm the one asked.
01:05:56.440
So it's not like weirdly pushy for a girl, you know, it's kind of crazy how simple it all
01:06:06.680
And then you have like pinhole cameras or like, how does that work?
01:06:12.340
Um, but one of them, you know, it looks like, it looks like a phone, which is kind of obvious,
01:06:32.240
It'll very much look like a real phone, but you can't make calls from it.
01:06:45.360
And then, yeah, it's just like the camera is a tiny little hole right there.
01:06:48.460
So basically, if someone puts their phone under the table, pointing at me like that, I got
01:07:06.100
You know, I know James has been looking for more.
01:07:08.280
I mean, we're still using the stuff that we've used at Project Veritas for, oh yeah, we've
01:07:13.720
still looked, we're still looking, you know, we're still using the stuff that we've used
01:07:40.340
And like the screen pops up, but it's fake icons.
01:07:46.920
If anybody ever goes on a date with you and they bought a Windows phone, you should be asking
01:07:54.320
But if you lock and press the button on the side, it actually looks like an iPhone screen.
01:08:03.540
There's a, James O'Keefe just did this undercover video of an employee of the White House.
01:08:14.660
I mean, this guy says Michelle Obama does not want to run.
01:08:19.100
They've actually had discussions about replacing Kamala Harris.
01:08:21.920
They acknowledge Joe Biden's in serious mental decline, though not clinically diagnosed.
01:08:26.780
But a lot of people are questioning the ethics of James O'Keefe went on a date with a guy
01:08:33.760
And this guy is is just explaining what's going on behind the scenes.
01:08:39.160
I think that knowledge that in the White House office top employees know Biden is unfit.
01:08:47.700
These things are important for the American people to know, but it doesn't rise to the
01:08:52.780
It's just someone should have told us for political reasons.
01:08:56.940
And now people are questioning the ethics of I'll say this.
01:09:04.000
But like, you're a cybersecurity expert sitting in front of James O'Keefe.
01:09:09.860
Like, don't go on dates and spill the beans about all the inner workings of the White House
01:09:15.960
So the information gained from it, the guy's failure at security and his willingness to
01:09:21.140
give up all this information, I don't feel bad enough.
01:09:24.960
I'm like, no, I think James got some good work there.
01:09:30.180
And yeah, you know, and you know, with these with undercover things in general, I think
01:09:34.700
so long as the intention of the journalist is to expose the corruption that's going on
01:09:40.980
and not to humiliate the person that they're investigating, I don't think that should ever
01:09:52.640
You know, I gotta be honest, it's one of the best videos James has ever done.
01:09:58.720
But I think the public right to know that within the White House, they're concerned
01:10:02.580
about Biden's mental fitness is is it's too important for this country.
01:10:06.160
I know it's not a legal criminal activity, but I think it's the benefit to the public.
01:10:11.520
I think that guy should have publicly stated, hey, guys, we are seriously concerned about
01:10:19.120
That being said, when James says and he pulls his glasses off, what are you doing in a
01:10:38.080
You know, he like turns to the B-roll guy, takes off his glasses.
01:10:47.040
I wonder if this guy just didn't know who James was, though.
01:10:54.380
And this is why I have no sympathy for this guy who fell for this or
01:11:00.120
whatever, if you're working cybersecurity and you don't have an assessment of security and
01:11:06.120
information threats or anything like that, to the extent that you would sit down with
01:11:10.180
the preeminent investigative journalist in this country.
01:11:16.660
Like, you know, if you walk into a lion's den trying to pet a lion, I'm going to blame you.
01:11:25.740
But dude, there was a video of, like, a guy who jumps into a gorilla exhibit, and the gorilla goes after him.
01:11:31.680
And I'm like, well, yeah, I don't want that to happen.
01:11:35.900
But like, what are you going to do if someone chooses to do that?
01:11:38.300
You know, like in the DNC, they have posters, they have trainings showing James's face,
01:11:43.600
showing everybody that James has ever met with at any point, anybody who was ever friends with him.
01:11:50.140
They even have tutorials on, like, what kinds of cameras are used.
01:11:53.800
And I remember talking with someone who worked at Planned Parenthood who said they have trainings
01:11:59.020
to spot undercover cameras because of Project Veritas.
01:12:05.700
There was one point where, uh, I tweeted something, there's a story that Veritas put out and I
01:12:11.560
tweeted something like, I didn't think it was that big of a deal.
01:12:13.980
And then James responded with something like, we know, and a picture of me from an undercover camera.
01:12:20.200
And there were people who were like, because this is years old, it's like 10 years old.
01:12:24.000
And there are people like, James, why are you secretly recording Tim?
01:12:27.640
Like, bro, you could turn the camera on and film literally everything I say.
01:12:31.000
I say everything I'm thinking, three or four hours a day. Like, film me.
01:12:35.760
The only, the only concern I have for like someone coming in here and secretly recording
01:12:39.180
is if it involves someone's health and safety and privacy, like someone got cancer.
01:12:45.520
Like, that's none of our business, not the public's right to know.
01:12:47.700
But when it comes to these guys at Pornhub, when it comes to Planned Parenthood or whatever,
01:12:53.080
and the DNC, and they're like, we can't let anyone find out what's really going on.
01:12:56.640
I'm like, if you come to me and you say, I am concerned that you may leak some private
01:13:04.860
information about a person, which is not the public's right to know.
01:13:07.400
I can respect being concerned about that privacy wise.
01:13:10.520
But if a political organization is saying, don't let them film you, it's because they
01:13:14.100
know that their internal workings, they're doing bad things.
01:13:20.540
If this was like, we're a technology company, be wary of people who want to steal and spy, that would be different.
01:13:26.120
My response to you guys, or to James O'Keefe, is: if you were filming me secretly, I'd be fine.
01:13:32.080
Cause it'll just be, if you show a video to all your people like,
01:13:37.000
aha, we caught Tim talking about the gold standard.
01:13:41.660
Or like he wants, he wants regulation on porn or something.
01:13:47.420
And, and there was a time like in the late 1800s, early 1900s when undercover journalism
01:13:52.920
was more commonly accepted as legitimate journalism.
01:14:03.440
A journalist went into an asylum, pretended to be insane, and documented the abuse of the patients there.
01:14:09.980
And we had other popular undercover journalists all over the world.
01:14:14.780
And then, you know, the journalistic award ceremonies stopped awarding undercover journalism.
01:14:22.600
So people really stopped doing it cause they knew they wouldn't be rewarded.
01:14:25.680
I'm sure James has told you this story, but, uh, what was the grocery store
01:14:30.280
chain? Lion something... yeah, Food Lion.
01:14:34.980
Um, there was a CBS station that investigated them undercover, and then Food Lion sued them.
01:14:42.800
And then Food Lion ended up winning, but only like $1 or something.
01:14:46.500
But the lawsuits, the lawfare, just ended up scaring away lots of mainstream
01:14:52.640
journalists from ever doing actual investigative or undercover journalism.
01:14:56.040
There was, uh, some story that came out from Veritas (rest in peace, Veritas, by the way).
01:15:01.100
And it was, you know, an undercover story exposes this, and the media runs full speed with it.
01:15:11.240
And then, like a month later, you got Channel 4,
01:15:13.780
I think it was, in the UK, with an undercover video of anti-immigration activists.
01:15:23.120
So I'll give you another example; that's the easiest one for me.
01:15:29.340
But look at when Vice Media embedded with the white nationalists in Charlottesville; they were celebrated.
01:15:39.500
When any anti-establishment journalist does quite literally the same thing, it's condemned.
01:15:45.480
And Vice did this whole undercover exposé on TripAdvisor and how TripAdvisor is rigged.
01:15:51.940
But when it's about subject matter that really matters, suddenly undercover journalism is
01:15:59.660
So when these journalists try to expose the inner workings of anti-immigration activists,
01:16:06.060
I should say anti-illegal immigration activists, they're celebrated because they oppose the regime,
01:16:14.420
And if you're, say, Sound Investigations, or formerly Project Veritas, or O'Keefe Media Group,
01:16:19.640
you are actively against the corruption of the machine and they don't want that exposed.
01:16:27.740
And it is remarkable, the cognitive dissonance that exists among the people willing to support one but not the other.
01:16:31.900
I think the challenge many people need to come to terms with is if there is a human being
01:16:37.780
who can look at two, like, identical circumstances, but one is good for them and one is bad for them.
01:16:46.400
And so they argue, like, okay, we've got corruption in two instances, but this corruption benefits us, so it's fine.
01:16:53.360
That's what we are dealing with on a large scale.
01:16:55.980
Like, have you guys gotten any attacks in the media over this stuff?
01:17:01.220
I mean, they're all sponsored by the adult industry.
01:17:04.580
We have, like, Adult Video News writing hit pieces and stuff like that.
01:17:09.760
But we've gotten several legal threats from Pornhub's attorneys.
01:17:13.440
And it's really funny because, you know, they're these mad, heated letters where it's like, you lied
01:17:19.560
about your identity and contacted one of our employees and met with him under false pretenses
01:17:27.040
And we're just like, yeah, that's undercover journalism.
01:17:33.460
It's like, yeah, that's what we're exposing on your website.
01:17:40.220
I think their exact words were something like, you non-consensually recorded and uploaded the video.
01:17:53.440
I respond with, like, that's quite literally what we exposed you doing.
01:17:57.760
Like, your employee was bragging about how you do that and don't care.
01:18:03.840
You're like, hey, you're doing something that's not nearly as bad as the thing that we're exposing you for.
01:18:09.460
You know, and they're demanding we take down our videos that we recorded and all these types
01:18:16.600
of threats and all the while saying that they're huge supporters of free speech, but
01:18:23.800
When it's us, they don't care about our free speech.
01:18:29.420
Was this what kicked off the creation of Sound Investigations?
01:18:35.080
No, I left Project Veritas back in the summer of 2021, I think.
01:18:41.860
And then I was originally in tech, so I went back into tech for a little bit.
01:18:45.640
And then I was developing this idea because, like I said, I kind of had the idea already.
01:18:55.880
And then like December of 2022, I remember we were at the Turning Point event and I was
01:19:06.320
talking with Arden and she was like, do you need like a sidekick for any project?
01:19:15.180
And then I called her up at the beginning of the last year.
01:19:24.060
Like, we are a nonprofit, but we have not solicited any donations from anyone.
01:19:30.080
Eric has completely self-funded this just because he's so passionate about this subject matter.
01:19:37.860
And he knows how significantly online pornography and abuse has affected our current culture.
01:19:46.060
Our videos are being cited as evidence in trafficking cases against the company.
01:19:52.100
I mean, these are the reasons why we're doing what we do.
01:19:54.620
Yeah, we saw this as an opportunity to get real results.
01:19:56.720
Like, the best stories we ever did at Project Veritas were the stories that got real results.
01:20:00.200
One of my favorite stories was this one in Texas where our field ops manager got this lady to expose this entire voter fraud ring, and the lady ended up going to jail.
01:20:15.440
The ones that get real results get companies that are doing wrong sued, and get people to go to jail.
01:20:24.320
And so I was like, this has so much impact.
01:20:29.980
It will have cultural effects, sure, but it'll also have real tangible effects.
01:20:37.780
I'd be willing to bet that if you met with literally anybody, they would expose corruption in their industry no matter what it was.
01:20:44.980
Yeah, I don't care too much about the corruption of like a grocery store.
01:20:48.400
Although, to be fair, Food Lion, there was something there.
01:20:50.300
I mean, like, if there's a guy and it's like, sometimes we change the date on the meat to be one day later.
01:21:00.100
But I'd be willing to bet that if you met with anybody, like, I was like, I work at a water reclamation plant in the city.
01:21:08.260
He'd be like, well, you know, sometimes we don't actually clean the water at all.
01:21:13.820
You know, at Project Veritas, part of our interview process, when we were bringing on a new undercover journalist, was: go out and get a local story.
01:21:34.420
They would be like, yeah, there's this guy who's been peeing in the fountain at the mall every day.
01:21:45.520
The cop like told the undercover journalist there's been a guy peeing in the fountain every day or something.
01:21:52.440
The story that I got was an employee at a restaurant at the mall, and he said the manager is always making the employees hide the fact that they have a mouse infestation.
01:22:04.660
So when the health inspector comes, the manager is like, everyone, shut up.
01:22:10.220
We should have had like a separate branch of like Project Veritas local or something, because, yeah, like you said, you could people can do this locally in their own communities.
01:22:18.800
And like and the great thing about local news is that sometimes it's easier, depending on where you live, to get like lawmakers to change things versus like federal regulators, all that.
01:22:29.480
You might be able to contact like your council member about your undercover investigation and get some change.
01:22:36.560
So, yeah, I mean, local stories are great, too.
01:22:40.600
I mean, imagine if you did that and then you hand that off to a local news outlet like, hey, check out.
01:22:45.120
The problem is the local news outlet might pass because they're an advertiser.
01:22:49.160
And it was a large restaurant chain, so I don't know.
01:22:55.800
No, I don't think we ever released any local things.
01:22:58.840
Are you guys... I mean, your faces are on camera now.
01:23:02.520
Is there a risk that you're not going to be able to keep going undercover? Like, Veritas had to rotate undercover reporters, right?
01:23:12.360
And part of the reason why I am doing public interviews is just because we're such a small team.
01:23:18.200
And we were like, well, who's going to do the public interviews?
01:23:23.340
So yeah, I mean, and it's very powerful to have the undercover journalist.
01:23:28.760
So, you know, Arden can be like, I'm the one who recorded this.
01:23:44.220
You know, James O'Keefe puts on glasses, goes into the middle of D.C.
01:23:48.360
and gets the guy to talk about all the things in D.C.
01:23:51.640
That's the crazy thing that nobody walked up to him like, James, I'm a big fan.
01:23:56.220
But yeah, no, I'm surprised. James and I were hanging out not long ago.
01:24:00.760
And, man, people came up to us like, whoa, it's James O'Keefe, Tim Pool, like, we're big fans.
01:24:11.400
I mean, he sprayed his hair a little bit, put on glasses, and, yeah, could just go into D.C.
01:24:17.760
So, yeah, I mean, not to say too much, but we have some plans.
01:24:34.940
And because like I've said, we've had some real impact with just the two of us on a shoestring budget.
01:24:42.300
So, like, we have big plans for what we could do with more money, more people, more industries.
01:24:49.640
There are a lot more companies in the U.S., like you said, you named some sites, too, that we're interested in.
01:24:58.420
You know, a lot of people ask all the time, like, how do you get started?
01:25:04.700
And I'm just like, hearing this, there is a deep underbelly of corruption and malfeasance that exists basically everywhere.
01:25:12.980
And we need real journalists who are willing to do journalism.
01:25:18.060
It's funny because, you know, we're watching the media industry collapse over the past week.
01:25:22.400
I don't know if you guys saw these stories, all these layoffs.
01:25:24.080
Wall Street Journal just laid off people in their Washington bureau, which is crazy.
01:25:28.340
And you get these journalists, I do air quotes here, who are saying things like, without journalism, there's going to be a lot of corruption.
01:25:36.880
Like, you guys are the corruption, but you're right about the corruption.
01:25:40.620
There's probably some dude who works at City Hall in your town, and there's something he's not telling you.
01:25:48.500
There's certainly reasons sometimes to withhold information.
01:25:51.640
If, you know, someone came out and said, like, we've discovered this problem, we want to make sure we can solve the problem, otherwise it can be exploited, for example.
01:26:01.520
There was this famous story of something called DNS cache poisoning.
01:26:05.940
This is a flaw that was found in the domain name servers of the internet that, if it was exploited, could destroy the internet, like, overnight.
01:26:16.420
I don't know the full details, because I'm not super into cybersecurity.
01:26:20.000
But people I know who worked on it said, once we discovered it, we didn't tell anybody.
01:26:24.740
We went to the big companies and the government and said, fix it now.
01:26:28.320
If they had publicly disclosed it, somebody would have hit it, and then it would break.
01:26:33.400
But after they solved the problem, then they disclosed it. So I can understand that.
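The flaw being described matches the well-known 2008 DNS cache-poisoning bug. A toy sketch, assuming only that resolvers of that era matched replies on a 16-bit transaction ID (an illustrative model, not the actual attack), shows why forging answers was a winnable race:

```python
import random

# Toy model of pre-2008 DNS cache poisoning (illustrative assumptions only;
# real resolvers, and the Kaminsky attack specifically, involve far more
# detail). A resolver tags each query with a 16-bit transaction ID and
# caches the first reply whose ID matches; an off-path attacker floods
# forged replies with guessed IDs, racing the real answer.

TXID_BITS = 16

def resolver_accepts(query_txid, forged_txids):
    """True if any forged reply's guessed ID matches the query's ID."""
    return query_txid in forged_txids

def poison_probability(forged_per_race, races, seed=0):
    """Monte Carlo estimate of how often a poisoning race is won."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(races):
        txid = rng.getrandbits(TXID_BITS)       # resolver's secret ID
        guesses = [rng.getrandbits(TXID_BITS)   # attacker's forged IDs
                   for _ in range(forged_per_race)]
        if resolver_accepts(txid, guesses):
            wins += 1
    return wins / races
```

With only 65,536 possible IDs, an attacker who can repeat the race keeps getting chances; the deployed fix added source-port randomization on top, multiplying the space an off-path attacker has to guess.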
01:26:36.500
But there are people who are in government who are like, I just found out this guy has been embezzling public funds.
01:26:42.900
If the public finds out, it will be disruptive to the city, and people will revolt, and there will be a recall, and it will be a catastrophe.
01:26:55.300
All someone has to do is record it. I do it legally, of course, because there are one-party, two-party, and all-party consent states I've got to watch out for.
01:27:03.060
Now, this smartphone HD hidden camera DVR thing you mentioned is $500.
01:27:15.220
There's ones that look like pens you could tuck in your shirt or whatever.
01:27:18.440
And, yeah, anyone can do it as long as you check all your boxes legally.
01:27:26.720
I was an actress in Hollywood throughout my teens and young adulthood.
01:27:34.520
The most popular one would probably be Modern Family.
01:27:45.280
The moment you start exposing pedos, they're like, well, that goes against their core values.
01:28:00.580
In that capacity, though, the big challenge is going to be legal issues.
01:28:03.440
And if you guys are a small operation, like, how do you navigate that?
01:28:07.140
I mean, we have a good Canadian lawyer because these operations took place in Canada.
01:28:12.040
So we wanted to cross all the T's, dot all the I's.
01:28:15.500
Canada, fortunately, is very friendly to undercover journalism, maybe surprisingly.
01:28:25.860
So, you know, we were clear that we wanted to, you know, make sure there's no accidental
01:28:30.560
defamation, anything like that, any, you know, small things.
01:28:35.560
Um, but yeah, like you said, you know, it's not, it's not cheap, but it's a lot cheaper
01:28:41.080
than making some mistake and then ruining an entire operation, you know.
01:28:44.620
Illinois is one of the craziest states when it comes to recording, because it's an all-party consent state.
01:28:52.660
Because, you know, the story of the Chicago Sun-Times, do you know this about the Mirage tavern?
01:28:57.740
Uh, the Chicago Sun-Times, they bought a bar.
01:29:01.920
Uh, this was, I forget, I think in the seventies.
01:29:08.360
And then they recorded all these local regulators.
01:29:16.180
And I think that's probably one of the reasons for Illinois now being like, no
01:29:23.340
undercover journalism, never going to happen again.
01:29:26.200
Cause, uh, they got a lot of people in trouble.
01:29:32.320
And they just had cameras and microphones everywhere.
01:29:34.820
And now, if you do that there, they will end your life.
01:29:40.680
But there's certain things I wonder, like if you were to hang up a sign that maybe looked
01:29:47.120
kind of tongue in cheek, that was like, smile, you're on camera.
01:29:50.500
Would that eliminate their, you know, expectation of privacy?
01:29:59.520
He had to consult with counsel and some of that stuff.
01:30:03.420
I never knew this. In 1978, the Sun-Times and the Better Government Association, they
01:30:10.580
broke the story in a 25-part series documenting the abuses and crimes committed at the tavern,
01:30:14.600
which was shaken down repeatedly by state and local officials.
01:30:18.380
It was initially nominated for the Pulitzer Prize for general reporting, but the board
01:30:22.180
decided not to award the Sun-Times/BGA collaboration after editor Ben Bradlee
01:30:27.640
of the Washington Post led an attack on the grounds.
01:30:30.160
The reporters used undercover reporting, a form of deception, to report the story.
01:30:35.940
And that's a huge reason why there's not a whole ton of undercover journalism anymore
01:30:41.720
is because journalists got scared to even do it.
01:30:50.160
I'd like to imagine that there was at one point like real journalism in this country,
01:30:53.460
but I just don't really see it; it's small at this point.
01:30:57.560
Um, they say the Pulitzer Prize is the highest award in journalism, but a CIA assassination is a close second.
01:31:05.220
If you're actually about to break some major stories for the benefit of the public, you
01:31:09.560
might get really depressed real soon, and then your car brakes fail.
01:31:21.200
Do you remember the guy who was exposing the CIA?
01:31:24.480
And then they said he killed himself by shooting himself twice in the head.
01:31:31.940
Where do you guys... what are your plans?
01:31:34.000
Like, where do you see yourselves going?
01:31:36.140
And then like, obviously with, with Veritas kind of just imploding.
01:31:40.680
I'm wondering, like, there's two of you now; at least James O'Keefe is still going.
01:32:03.760
We want to continue exposing exploitation in the adult industry.
01:32:07.880
Again, nobody else is doing it.
01:32:12.420
So we're not, you know, we're not stepping on anybody's toes.
01:32:17.000
I think it's one of the biggest issues, uh, in the world right now.
01:32:20.780
Um, a lot of these companies are outside of the U.S.
01:32:26.160
So that adds some complication, but we still want to do it.
01:32:29.540
So, yeah, the plan is really, we're kind of in fundraising mode while still doing operations.
01:32:41.560
Um, and, uh, when we get to fundraising, we're going to bring on more people.
01:32:48.660
We're going to expose more companies and, uh, and we're going to get more tangible changes.
01:32:57.600
We have some people who've done really good work.
01:33:01.420
I wanted to ask about like potential subject matter you're interested in, but I don't
01:33:04.540
know if that would just compromise your ability to do it. Go ahead and ask.
01:33:07.380
Well, like, what kind of subjects are you guys looking to go after next?
01:33:10.640
I mean, I think, as mentioned, obviously exploitation.
01:33:14.880
Online sexual exploitation is of big interest to us.
01:33:20.140
Yeah, I think the abortion issue is another big issue, especially in this
01:33:28.680
election cycle. And, I mean, a lot of my friends have been locked up for
01:33:35.680
peaceful pro-life speech, uh, by this administration.
01:33:44.800
I'm wondering, have there been any? I know Steven Crowder did one.
01:33:50.040
And I don't know if you'd call it a comedy bit or undercover journalism.
01:33:56.180
He went to a Planned Parenthood or something pretending to be a trans woman, saying that
01:34:02.260
I think this was a long time or this was a few years ago.
01:34:05.260
He said that he was a trans woman who was pregnant or something.
01:34:12.600
So it's, you know, is it a comedy bit?
01:34:16.780
And the point he made was that if a man takes a pregnancy test and it
01:34:21.800
comes up positive, he likely has cancer; it's an indication of cancer.
01:34:25.400
And they were like, we're not going anywhere near this because you know, the, the gender
01:34:29.900
ideology issue is so touchy. But I'm wondering, has there been any? Am I missing this?
01:34:37.360
Like undercover reporting into these clinics that are giving kids hormones and drugs.
01:34:41.300
You know, we had, um, on Timcast IRL, we had a detransitioner who said that she went
01:34:46.340
into a Planned Parenthood and within minutes, they said, here's the maximum dosage of testosterone.
01:34:52.400
So I'm wondering if there's anything big on that.
01:34:55.340
I mean, you know, back at Project Veritas, there was some good reporting about these clinics.
01:34:59.840
I think it was called, like, the Too Young investigation.
01:35:03.040
They had like their name for it, but it was these clinics providing, um, hormones and offering
01:35:09.240
surgeries for underage patients without parental consent.
01:35:14.900
And not only that, but I think there was, gosh, a lot of really egregious stuff.
01:35:22.740
There was even one doctor who was like, yeah, you know, we okay these surgeries.
01:35:29.660
And maybe it's only like an immediate satisfaction kind of thing for them, and they'll regret it later.
01:35:42.340
Like this letter comes out accusing James of these ridiculous stories, like stealing a
01:35:49.020
I can't believe it's just too stupid of a story.
01:35:57.180
And then Arden, she was just joining my team at the time.
01:36:08.140
It was, uh, did it seem like there was any merit to what was being claimed?
01:36:11.960
I mean, like I said, I wasn't there, but James is a very good friend of mine.
01:36:18.480
I mean, I'll just say, like, it was an excuse to remove
01:36:24.380
a leader. Like, unfortunately, there's lots of jealousy to go around.
01:36:27.880
And, I mean, you know, I have friends on both sides of the issue, great people on both sides.
01:36:37.020
And, I mean, the reality of it is Project Veritas was one of the most consequential news
01:36:42.480
organizations of our generation and possibly in this country ever.
01:36:50.800
I mean, the work that Veritas did was massive, especially the Amy Robach video where
01:36:56.980
she's like, Epstein, we got him, Clinton, all that stuff.
01:37:08.580
The fact that James is now still doing his work, just put out a massive
01:37:15.340
video with political insider knowledge.
01:37:18.080
I don't, I don't see whatever the effort was as being tremendously successful, but it's
01:37:22.960
kind of crazy that Veritas was destroyed in the way it was.
01:37:25.000
And, and the question is like, how does this happen?
01:37:27.020
The thing is, a lot of people want to take out James, you know, like, I think the federal government does.
01:37:33.340
I mean, he and I were raided by the FBI, you know about that, back in
01:37:38.820
2021. And I think the idea is, in some way, they need to remove him, put him out of commission.
01:37:52.460
I think they're kind of waiting to see if he's down for the count, but he's not.
01:38:00.360
So he's always going to keep on; they're going to have to literally put him in prison to stop him.
01:38:09.960
So I guess I have a question about the expansion of all this
01:38:15.380
undercover journalism, because there's been criticisms of me in the past.
01:38:21.040
I'd go on the ground to these various protests and I would have my phone in my hand, plainly
01:38:25.760
visible for everybody to see, streaming 24/7.
01:38:27.980
And there, the argument was, it was good that we got a window into all these things that
01:38:32.600
are happening, but as more people adopt this practice, privacy starts to get eroded.
01:38:39.340
I mean, do we want to live in a world where everything we say is going to be secretly
01:38:42.260
recorded by someone or should we just accept it?
01:38:48.480
I think it's like you said, I think people, uh, there are things that the public has a right
01:38:53.580
Like, the job of a journalist is not to embarrass somebody
01:38:57.560
by exposing personal secrets, or trade secrets, or health information.
01:39:12.340
Well, health information, it's rough.
01:39:14.520
I mean, Joe Biden's health information is the public's right to know.
01:39:18.400
So, you know, having someone from the white house say, yeah, he's in mental decline.
01:39:23.300
I think that's less specific than, like, oh, he has this personal embarrassing thing.
01:39:29.900
Well, I think that the issue here is he's the president.
01:39:33.260
I mean, his whole position has to be taken into account.
01:39:37.100
And, yeah, what's newsworthy for one person might be totally irrelevant for a different person.
01:39:47.720
So for me, doing Timcast IRL, for instance, you get these leftists that
01:39:53.640
will make a video about a tweet I made and their justification is, oh, but he's got millions
01:40:00.960
And so at what point is it that everyone starts secretly recording?
01:40:05.200
And then do we find ourselves in this, you know, hypervigilant state always where we're
01:40:09.840
scared someone's going to publish something. You know, like,
01:40:14.300
I go to the doctor and the doctor's like, oh, you've got a kidney stone or whatever.
01:40:18.100
And it's like, okay, I'd prefer it if people didn't know that, but I honestly don't care.
01:40:21.680
I'm literally saying it, but like, let's say there's somebody who's like, I'd prefer
01:40:23.840
no one knew that. Someone else might be like, it's really important that this guy who runs
01:40:27.380
his big show, people know that he's sick or something and they decide it's newsworthy.
01:40:32.060
And more to the point, it's not a question of them being right or wrong.
01:40:34.620
It's a question of the expansion of this technology.
01:40:37.640
Do we create the panopticon ourselves, which we are like sort of already doing, you know,
01:40:43.240
like once, once everybody had a phone camera, right?
01:40:52.100
I don't think there's putting, you know, the genie back in the bottle with that.
01:41:05.580
Like, I think there's just no going back from that.
01:41:12.240
Might as well expose some real stuff while we're at it.
01:41:15.900
Maybe the reality is just that we're going to get recorded. I suppose the scarier thing is outside
01:41:20.940
of the idea that whatever it is you do is probably recorded.
01:41:26.660
But the scary thing now is how AI can generate video.
01:41:29.720
And so we had, what video was it recently?
01:41:38.060
And you guys saw this one, the Kari Lake video, with the guy shaking her down.
01:41:41.620
And the first question a lot of people ask is, is this fake?
01:41:48.280
Jeff DeWitt said that it was selectively edited.
01:41:53.900
Now they can just be like, ah, it's a deep fake.
01:41:58.620
Someone could go on a date with you, undercover film you, but then AI-generate parts of it.
01:42:10.600
It's an increasing problem for undercover journalism.
01:42:12.920
We've got to do undercover journalism while people still believe that it's real.
01:42:17.680
And a lot of people suggested to me when I was producing the tapes, like,
01:42:22.320
oh, you should, you should cut out more of the background audio.
01:42:25.720
You need to like clean up the audio more in these tapes.
01:42:28.620
And I was like, no, no, I want it to be as raw as possible.
01:42:34.360
And they love saying we edited the audio or it sounds fake.
01:42:37.260
Of course we still got accused of that, but I love that argument.
01:42:40.260
Cause it's like, okay, please explain the scenario where this person would be making these admissions
01:42:48.800
Like there's, he's saying full sentences where he's like talking about Pornhub specifically,
01:42:55.060
talking about trafficking, rape, sexual abuse specifically.
01:42:59.320
And there's, like, no universe in which I could have taken this out of context.
01:43:05.540
Well, that's the selective editing argument.
01:43:09.840
Uh, we saw this with the Kyle Rittenhouse case.
01:43:11.940
They tried using CGI images to convict this kid.
01:43:14.720
So the way it works is they have a photo on, I think it was like an iPad, and they zoom in.
01:43:24.880
The way digital zoom works on these apps when you zoom in is that an algorithm creates pixels that aren't in the original image.
01:43:32.340
So they zoom in and say, see, look, here's what the image shows.
01:43:39.340
You need to actually show the static image in its standard form and then ask people if they can see it.
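The point about digital zoom inventing pixels can be made concrete. A minimal bilinear upscaler in pure Python, a sketch of the kind of interpolation photo apps apply (real apps use fancier resampling), shows that zoomed-in pixel values are computed rather than captured:

```python
# Toy illustration of why digital zoom "creates" pixels: upscaling
# interpolates new values that the camera never captured.
# A tiny grayscale image is represented as a list of lists.

def bilinear_upscale(img, factor):
    """Upscale a 2-D grayscale image by `factor` with bilinear interpolation."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * factor):
        sy = min(y / factor, h - 1)            # map output row to source
        y0, y1 = int(sy), min(int(sy) + 1, h - 1)
        fy = sy - y0
        row = []
        for x in range(w * factor):
            sx = min(x / factor, w - 1)        # map output column to source
            x0, x1 = int(sx), min(int(sx) + 1, w - 1)
            fx = sx - x0
            # Blend the four surrounding source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# A 2x2 image containing only the values 0 and 100...
small = [[0, 100],
         [100, 0]]
big = bilinear_upscale(small, 4)
# ...upscales to an 8x8 image containing in-between values (e.g. 25.0, 50.0)
# that exist nowhere in the source: the algorithm invented them.
```

That is exactly the evidentiary problem: the "detail" a zoomed view appears to reveal can be an artifact of the resampling algorithm, not of the scene.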
01:43:44.940
So we're already in this period where it could go one of two ways.
01:43:49.560
Someone could secretly record another person, or CGI audio of them.
01:43:57.920
I mean, I could not believe it as I'm watching this trial.
01:44:00.640
Well, these guys are trying to use computer generated images to convict Rittenhouse.
01:44:07.640
And the judge put it on the defense to prove otherwise.
01:44:11.700
So this means someone could go to court and say, here's a recording of you.
01:44:17.900
And here's you saying all of these things you did that are illegal.
01:44:26.840
Your expert says, see this all here that proves it's fake.
01:44:29.880
Then their expert stands up and says, no, that's actually a totally normal thing.
01:44:40.220
Just like the implications for undercover journalism.
01:44:42.940
I mean, it's easier with video undercover journalism than just audio, of course,
01:44:48.060
to prove that it's real, to look more real.
01:44:50.380
But in 10 years, like, it might just be over.
01:44:55.540
It might just be very simple to just make fake videos of people.
01:44:59.740
Cause, uh, I was asked by Joe Rogan about that.
01:45:02.240
Uh, this was like a couple of years ago, last time I was on a show and he's like, are you
01:45:05.500
I was like, no, I'm like, context, uh, is, is going to prove these things.
01:45:12.400
People are going to be like, I don't believe that they're gonna be more resilient to it.
01:45:14.560
And now based on what we've seen now, I'm terrified because it's not just about the
01:45:21.660
You know, one thing I've pointed out, the scariest thing is going to be when someone
01:45:26.500
will record you guys and it'll be a real conversation.
01:45:29.500
And you'll say something like, obviously we would never do anything illegal.
01:45:32.980
We have great lawyers and we're working really, really hard to make sure everything
01:45:37.640
And then all they have to do is get rid of "il," or, or add "il."
01:45:42.960
And so you go, you know, obviously we're doing things that are illegal, but we have great
01:45:47.880
And then what they'll do is you'll get sued or whatever.
01:45:52.840
They'll, they'll say, you know, did you have this conversation with this person?
01:46:03.520
And then they add two letters to a word you said.
01:46:06.860
And so on the video, you're not going to notice, you can't track mouth movements that closely.
01:46:12.400
They're like, here's you admitting you commit crimes.
01:46:15.100
But you just told us the conversation happened.
01:46:20.140
You've admitted it, you've confessed, have a nice day.
01:46:26.220
I mean, because even then the argument is going to be, you need an expert who can debunk
01:46:38.340
Like the jury is not going to have the expertise to know whose expert is right.
01:46:42.900
And it's just going to be good luck convincing people.
01:46:45.020
It is kind of crazy if you think about it though.
01:46:46.280
If you go back before any of this forensic evidence and video, what were juries like back
01:46:57.620
A lot of people probably went to jail that were innocent.
01:46:59.560
But, but outside of that too, I think that the issue of AI is going to be the, the, the
01:47:05.540
I think that'll be like a big, big crisis for us.
01:47:08.760
People are going to choose to just go into these universes.
01:47:12.480
They're going to make whatever titillating content they can make.
01:47:16.780
And I think what's going to happen is, you know, going back, like way back to the earlier
01:47:21.840
conversation, literally about what you guys exposed, you have no willpower to regulate
01:47:29.160
what is clearly criminal, no law enforcement willpower to arrest people who are clearly
01:47:36.480
And I mean, you go five miles over the speed limit, you get pulled over and the cop yells
01:47:41.120
You are an employee at a massive child trafficking organization.
01:47:44.140
And they say, well, you know, I don't have to do here.
01:47:46.560
So as we move forward with all this technology, I, I, I can appreciate the investigative journalism.
01:47:52.180
I can see that there are people upset about it, but it looks like there's going to be
01:47:54.080
a massive split where most people are just like, no, they're just in for it.
01:47:59.940
And everything's kind of going to, going to break down, I guess.
01:48:06.980
Well, maybe, I mean, you know, I'll put it this way.
01:48:10.740
Turn of the century, 1900s, they were writing about how the cities would be overrun with horse manure.
01:48:17.980
There's going to be piles of horse crap as more and more people are born and inhabit the
01:48:27.500
And there's going to be mounds of manure that they can't get rid of.
01:48:36.940
I mean, you know, Pornhub, their leadership even acknowledged that we are, you know, the
01:48:46.440
We are currently winning with all of these laws passing, the age verification laws, with
01:48:52.020
them being forced to change their upload policies throughout, you know, various
01:49:00.420
So I really do think positive things are happening.
01:49:03.800
I think more and more people are realizing how harmful pornography is on so many levels.
01:49:09.920
I get DMs all the time from people saying, I was an addict or I'm still addicted, but
01:49:16.980
So I really do believe good things are happening and we'll just, our job is to just keep exposing
01:49:24.940
I feel like, uh, do you guys remember, what was that, what was that law they tried passing?
01:49:36.060
And there was this big push online where they were like, if you allow this, they're going
01:49:40.200
And I feel like, you know, now in hindsight and looking at all this stuff, I'm not, I'm
01:49:43.600
not going to comment directly on those laws because it's been too long and they're specific,
01:49:46.640
but I definitely feel like when it comes to this stuff, there is a manipulative element to it.
01:49:51.060
Oh no, our rights, you better not ban pornography and child trafficking.
01:49:57.460
And I'm kind of like, at this point, uh, you should.
01:50:01.080
And specifically, Aylo, Pornhub's parent company, they try to classify themselves as just a tech platform.
01:50:07.780
So they try to group themselves in with all of these tech giants that have immunity for user-generated content.
01:50:15.540
And so that's the argument they're trying to get away with.
01:50:18.300
Are they, are they basically arguing, well, we don't make any content?
01:50:22.480
For, for Pornhub specifically, but really they can't, they can't correctly use that argument
01:50:28.460
because they're a porn giant that does everything from write to shoot, to produce, to, yes, to
01:50:36.500
And even Dylan said, he's a senior script writer there.
01:50:39.380
He says, we even write some of the amateur stuff that you see being uploaded.
01:50:44.200
We work with these people and we actually write it for them.
01:50:47.880
So a lot of it is manipulated by this huge parent company itself.
01:50:53.540
These class action lawsuits they've tried, where our, our work has been cited, and they've,
01:50:58.200
uh, tried to claim Section 230 immunity, it failed every time.
01:51:07.060
I mean, you know, like, like Arden said, they, uh, they try to say, oh, we're just a platform.
01:51:23.780
And there's a big, uh, I don't know the case number offhand, but there's a big, uh, class
01:51:27.440
action federal lawsuit in California where they tried to do it.
01:51:33.080
It's, it's a bunch of victims sued the parent company because, uh, Pornhub had this content
01:51:39.540
partner that they worked with called Girls Do Porn.
01:51:42.320
And they were one of the top performing content partners on the site made millions of dollars,
01:51:47.020
but essentially how they procured these girls in these videos is, um, through coercion through
01:51:54.500
You could see in some of these videos that the doors for the rooms that they're in were
01:52:00.700
Um, many of these girls were contacted while under age and groomed so that they would show
01:52:06.140
up to this location to film a video on their 18th birthday.
01:52:10.940
Um, so there was a lot of really horrible illegal stuff going on.
01:52:19.240
And so a bunch of these, uh, women sued the parent company.
01:52:24.420
For those that aren't familiar, just to briefly address it.
01:52:27.340
Section 230 of the Telecommunications, of the Communications...
01:52:29.960
I think it's the Communications Decency Act. It provides broad immunity to websites over the
01:52:38.360
And so it was supposed to be, it's, it gets started because there's like a website, they
01:52:43.300
publish an article and I think it was the Wolf of Wall Street.
01:52:45.600
They, uh, they, they wrote, they wrote something about him and someone commented something false.
01:52:50.080
And so I could be wrong, but they got sued for defamation and they were like, we did not
01:52:57.080
And so this law gets passed saying, okay, okay, you can't be held responsible for what other
01:53:00.720
people put on your site, but now it's turned into the companies have broad immunity.
01:53:06.440
When they censor political ideas they don't like, they can basically violate their own
01:53:12.580
They can allow graphic content and just say, don't look at me.
01:53:17.680
But that's why it's fascinating to hear that in some of the, some of these cases, it failed
01:53:22.320
In all of the cases, actually, every single, every single civil lawsuit that they've tried
01:53:37.500
A lot of people have called for outright getting rid of section 230, which I think is a mistake.
01:53:42.380
I just think the issue has to be, uh, like you, you mentioned Pornhub, they make the stuff
01:53:48.420
they're deeply involved in it, but I, I don't actually even respect like Twitter when they,
01:53:56.340
when they have porn on their website, like nah, you, you, you, like, if you went into,
01:54:00.080
imagine going to like a, a, a Sav-On, a Jewel-Osco, an Albertsons, uh, a Safeway.
01:54:06.080
I'm just naming as many regional grocery stores as I can.
01:54:08.600
And, uh, you're walking around with your kids or, you know, with your friends, friends
01:54:11.640
and family and you know, there's a coffee section and you're grabbing it.
01:54:14.260
Then you walk over, there's a porn section, just hardcore graphic stuff all down the aisle.
01:54:23.620
No, I think we should have some kind of like, I don't know, regulation or something to stop
01:54:29.540
And it's illegal to, uh, market, uh, porn to children.
01:54:33.700
Again, like a lot of these things are like things that already exist for the physical world.
01:54:39.080
And they're just not, they're just not being enforced in the digital space.
01:54:43.220
And so now we're like having these States do the ID laws and some other things.
01:54:47.360
And, uh, like to clarify that, like this applies to the digital space as well.
01:54:52.960
I think one thing that we're seeing a lot with, uh, schools and these books, people on
01:55:01.000
But before elaborating specifically on like, you know, for those who don't know the
01:55:04.120
background, I'm curious if you guys have seen anything in your work that is a connection
01:55:08.740
in any way between what they're pushing on kids and the porn industry.
01:55:12.200
I mean, arguably some of the admissions, we had a couple of different employees expressing
01:55:21.640
just like their personal opinions, but we thought it was still significant that they
01:55:25.960
had positive views about, um, 12 year olds watching trans porn in order to find themselves
01:55:35.480
And one person even remarked that a kid could find their kink on men.com, which is also one of their sites.
01:55:42.740
But that's basically the argument these people have for, uh, putting the graphic material
01:55:47.340
That it's, that it's education rather than pornography.
01:55:53.440
If, if you've got people who work there who are saying that they want to keep sending more
01:55:58.120
and more extreme content to guys, and eventually a guy who's straight, who's watching straight
01:56:01.700
porn, they can send trans or gay content to, and they start shifting in that direction.
01:56:07.260
That's them outright admitting they convert and groom people.
01:56:10.060
So when they then talk about kids and they're doing the same thing, that is an admission
01:56:14.420
that they, that they know when they get these books in these schools, they are grooming kids
01:56:20.400
to adopt patterns of behavior or, or, you know, proclivities or whatever.
01:56:25.660
And I feel like the pipeline ends up in, in whether someone's doing porn or not, they're
01:56:32.860
And they're hyper-sexualized, you know, that's, uh, that's a large part of it.
01:56:36.380
I wonder, like, I, I genuinely don't know if there's like, is there an end game to just
01:56:45.260
Is it to stop people from having kids because they're too messed up to be attracted to each
01:56:50.720
So I think the incentives just line up right now where there's still a lot of money to
01:56:58.480
That's so like, you know, uh, and like, what are people interested in?
01:57:05.840
I think 65% of internet activity is porn related.
01:57:14.280
But I, like I was saying earlier, it will, the, the, the production of porn is over in
01:57:22.300
Like there was a, a, a year ago I noticed on Instagram, this is why I absolutely despise
01:57:29.880
Um, look when I'm on Instagram, you know what I watch?
01:57:36.020
It's, it's winter now and I'm getting tons of snowboarding and skiing.
01:57:39.140
And so I'm just watching, I was watching Sean White do like a backflip.
01:57:43.720
Him and like his friends are just cruising and doing flips and I watch poker videos and
01:57:50.880
It's like, but they jam like these kind of like quote unquote Insta models in my feed
01:57:58.980
I'm like, dude, I got a girlfriend, I'm not interested in, in this.
01:58:03.280
I'm like, I, I don't want to get my feed inundated with a, with a bunch of women showing
01:58:09.060
I just want to watch some guy win a poker hand.
01:58:15.400
They keep showing this stuff because I think the reality is the algorithm shows what people click on.
01:58:21.540
And so it doesn't matter who you are, what you do.
01:58:25.700
And we think this is, so it's probably the highest click-through content.
01:58:29.500
This is why on YouTube in the early days, all the thumbnails were women in bikinis and
01:58:33.880
But I noticed this a year ago, these photos were fake women and it was obviously AI women.
01:58:41.080
And so, you know, I go to the, the explore tab and I get a video of a guy's rollerblade
01:58:47.900
And then you'll see in, in the grid, there's a woman.
01:58:59.900
And we're hearing stories about them making like 30 grand a month.
01:59:02.760
This is the big story that some guys, so you don't got to worry about your daughters.
01:59:09.640
But the brains of, of like young people in society are going to be like in what, 20 years,
01:59:16.040
the next generation, there's going to be a woman and she's going to go to like, she's
01:59:19.420
going to meet a guy and she's going to be like, I have no physical attraction to you
01:59:21.680
whatsoever because I'm into aliens and, you know, stone golems and just weird insert
01:59:38.540
Cause there's no way they're going to be able to procreate.
01:59:43.140
But at any rate, as we're getting close to wrapping up, is there anything you guys want to
01:59:59.420
Is there, is it like, what can we, what can we look for in the future though?
02:00:02.800
Is there, is there ways you guys are hiring people looking for support or what's going
02:00:07.300
More, more undercover videos in the future and hope to be hiring soon.
02:00:15.320
We love to hire new journalists and get more stories.
02:00:21.920
I'm Arden underscore young underscore on Twitter.
02:00:26.520
And sound investigations is, uh, sound investig.
02:00:35.100
Uh, I think it's really cool that, you know, basically we used to have this great industry
02:00:39.720
of undercover journalism and it was considered to be something good historically.
02:00:43.360
Then, uh, the story of the Mirage Tavern in Chicago surprised me, that was back in the seventies.
02:00:47.860
They were like, not about it, but it's cool that, uh, you know, after whatever happens
02:00:54.620
I remember seeing the video and I'm like, this is very Veritas-esque.
02:00:57.480
So it would be cool to see more people start to work to, uh, expose corruption.
02:01:02.320
And I think, you know, locally finding out, uh, local corruption and malfeasance is going
02:01:09.700
So I appreciate you guys coming and hanging out.
02:01:12.860
Uh, everybody who is listening, make sure you subscribe to Tenet Media.
02:01:16.080
The culture war show is, uh, every Friday at 10 AM till noon.
02:01:19.700
And we've got more shows to come conflict, controversy, debates, and cool subject matter.
02:01:24.300
The next show we'll have is going to be Timcast IRL over at youtube.com slash Timcast IRL
02:01:39.920
Have you heard of anything more chilling than frozen beef? Until November 3rd,
02:01:45.380
get an always fresh, never frozen Dave's Single from Wendy's for only $4.
02:01:51.380
Taxes extra. Participating Wendy's, until November 3rd.
02:02:00.100
My name is William Woodhams and I'm the CEO of the British-born sportsbook, Fitzdares.
02:02:05.060
We've been over in Ontario for well over a year now and have loved every second of it,
02:02:12.480
Let me tell you, us Brits simply can't get our heads around hockey.
02:02:16.180
It is so confusing to us and it is impossible to outsmart Canadians on the ice.
02:02:21.440
That's why at Fitzdares, the world's oldest bookmaker, you can play with us on anything,
02:02:25.640
anything you want, cricket, tennis, just not hockey.
02:02:30.460
Plus, with our world-class casino and over 150 years of experience, you're in great hands.
02:02:36.940
So, you've got to stop pucking around and go to fitzdares.ca.