The Culture War - Tim Pool - February 02, 2024


The Culture War #49 - Adult Content Sites Target Kids, Exposing Porn Industry Corruption


Episode Stats

Length

2 hours and 2 minutes

Words per Minute

191.94582

Word Count

23,576

Sentence Count

2,024

Misogynist Sentences

51

Hate Speech Sentences

42


Summary

In this episode, we're joined by Arden Young and Eric Cochran of Sound Investigations, both formerly of Project Veritas, to talk about the growing problem of children having access to pornography online and how the porn industry knowingly makes money off of it. They discuss their undercover investigation into Pornhub and its parent company Aylo, what it means, why it's a problem, and what we can do about it.


Transcript

00:00:00.000 Have you heard of anything more chilling than frozen beef?
00:00:04.160 Until November 3rd, get an always-fresh, never-frozen Dave's Single from Wendy's for only $4.
00:00:09.600 Nothing scary about that. Taxes extra at participating Wendy's until November 3rd.
00:00:13.680 Terms and conditions apply.
00:00:15.120 I hate hockey. Seriously, I can't stand it.
00:00:18.800 I'm William Woodhams, CEO of the British-born sportsbook Fitzdares.
00:00:22.220 We've been over in Ontario for over a year now.
00:00:24.960 We haven't made a loonie from hockey.
00:00:27.040 It's impossible to outsmart Canadians on the ice.
00:00:29.520 So at Fitzdares, the world's oldest bookmaker, play with us on anything, anything except hockey.
00:00:35.000 With our world-class casino and 150 years of experience, you're in great hands.
00:00:39.400 So stop pucking around. Go to fitzdares.ca.
00:00:42.460 19+, Ontario only. Please play responsibly.
00:00:45.460 One of the issues we talk about quite a bit on a variety of shows on the internet,
00:00:49.900 on Timcast IRL and The Morning Show is the ubiquity of adult content online
00:00:56.500 and how children have immediate access to it.
00:00:59.520 And a debate that we've had before is, you know, in the real world, in meatspace, as we call it,
00:01:04.740 if a kid were to wander down the street, they might see a pizza place, they might see a bookstore,
00:01:09.860 they might see a music venue, and they might see, I don't know, a gentleman's club or an adult bookstore.
00:01:15.900 The windows are blocked out.
00:01:18.340 18 up only, it says on the door.
00:01:20.200 And if that kid walks up to the door and knocks on it, they're going to say,
00:01:22.480 kid, you can't come in here. Are you nuts?
00:01:24.040 And if a kid tried sneaking in, they'd say, ID, please.
00:01:26.700 And sometimes kids do sneak in to adult bookstores or they have fake IDs.
00:01:30.160 But for the most part, we create barriers to prevent minors from getting access to certain materials.
00:01:35.040 For the same reason, you know, we don't allow people to smoke at a certain age or get tattoos.
00:01:38.880 But now because of the internet, it's kind of actually, it's just at the window.
00:01:42.880 Now you have people actually arguing that there's nothing we can do about it anyway,
00:01:47.060 so just let them have it.
00:01:49.160 I think that's particularly insane.
00:01:51.240 There are many more libertarian individuals who argue,
00:01:53.320 we can't require identification for these sites because then you're going to have
00:01:56.820 the government and companies saying everyone has to have an ID to go online.
00:02:01.260 I don't know that the answer though is just let children wander the streets figuratively of the
00:02:07.280 internet and get access to the most extreme and psychotic adult content imaginable,
00:02:13.520 which could include even outside of adult content, videos of murder, rape, beheadings,
00:02:20.560 just horrifying things.
00:02:22.520 And so I think there's questions that we need to ask about this.
00:02:24.340 So we're gonna have a conversation about this and more, more so what these industries are
00:02:29.260 knowingly doing to make money and how they don't actually care.
00:02:32.880 So we're being joined by a couple individuals who actually investigated this.
00:02:36.660 Would you guys like to introduce yourselves?
00:02:38.340 Yeah.
00:02:38.700 Hey, my name is Arden Young.
00:02:40.460 I'm an investigative journalist under Sound Investigations and I recently went undercover
00:02:45.620 to investigate Pornhub.
00:02:47.320 Right on.
00:02:47.960 And sir?
00:02:48.540 I'm Eric Cochran.
00:02:49.540 I'm the founder of Sound Investigations.
00:02:51.560 Before that, I was the sources manager at Project Veritas with James O'Keefe.
00:02:55.920 And yeah, founded Sound Investigations, hired Arden.
00:02:58.700 We went undercover and, you know, did the James O'Keefe style to Pornhub.
00:03:03.180 Man, what has James done?
00:03:04.960 He's created more people who are going and exposing a lot of this corruption.
00:03:08.680 The crazy thing is, in this investigation you guys published, you've got someone working
00:03:14.920 for Pornhub, I believe.
00:03:16.300 And they're saying outright, like, they don't care about morals, they're making money.
00:03:20.240 But let's start from the beginning.
00:03:21.160 And why don't you guys tell us what's going on and we'll get into the philosophy and the
00:03:24.340 morals.
00:03:25.380 Yeah.
00:03:25.620 So I think in total, we recorded around a dozen employees of Pornhub or its parent company.
00:03:31.900 It's called Aylo.
00:03:33.620 And we got them admitting to all sorts of illicit, illegal and scandalous practices, really.
00:03:42.100 Lack of age and consent verification all the way across the board.
00:03:46.380 Not reporting to law enforcement when they see underage videos being uploaded.
00:03:53.220 Purposefully advertising to pedophiles and young teens.
00:03:57.000 And what else?
00:03:59.500 Also, purposefully marketing gay, trans, and bi content to historically straight viewers,
00:04:07.040 which, of course, they track a viewer's life path.
00:04:09.260 In that video, that one's really shocking because the individual expresses their intention
00:04:14.720 to, like, queerify or something, to, like, try and skew the preferences of straight men
00:04:20.220 towards...
00:04:20.960 Yeah, the word convert was used.
00:04:23.080 Convert.
00:04:23.960 So basically, these people are doing some kind of reverse conversion therapy or just outright
00:04:29.640 conversion therapy, I guess.
00:04:31.980 So, well, let's start from the beginning.
00:04:33.240 How does it come to be that you guys decided to start looking into Pornhub or, you know...
00:04:38.180 Yeah.
00:04:39.260 So, you know, back in 2020, we were both working at Project Veritas at the time, and an article
00:04:47.860 came out in the New York Times called The Children of Pornhub.
00:04:50.780 It was by Nicholas Kristof.
00:04:52.580 He did a big article, a big op-ed interviewing victims of sexual abuse who had their abuse
00:04:59.560 monetized by criminals on Pornhub.
00:05:04.000 And this article was one of the first big blows to the internet porn distributors.
00:05:10.600 The owner of Pornhub is...
00:05:12.600 It used to be called MindGeek.
00:05:14.920 Now it's called Aylo.
00:05:15.820 It owns basically all the big porn sites you can think of and all the big porn studios
00:05:21.460 in North America.
00:05:23.120 And it was a...
00:05:23.860 This article was a huge blow.
00:05:25.600 It caused Pornhub to have to delete 80% of their videos.
00:05:30.520 They had to remove downloads, free downloads from their website.
00:05:36.860 And I was like, nobody's really done undercover journalism against the porn industry when they're
00:05:44.860 clearly doing these illegal things and just nobody's bothered to do it.
00:05:48.700 So I kind of ruminated on that idea.
00:05:51.580 And then early last year, I called Arden and said, hey, I have this idea.
00:05:56.680 I'll put up the money.
00:05:57.720 Let's go do this.
00:05:59.640 And then in a few months, these guys, they're based in Canada.
00:06:05.180 So I don't think they knew who James O'Keefe or Project Veritas was.
00:06:08.760 They didn't see it coming.
00:06:10.140 They weren't on guard like Google employees might be to think of undercover journalists.
00:06:16.160 And they just opened right up.
00:06:19.860 You know, they started spitting out.
00:06:21.580 One guy said, yeah, I wouldn't be able to defend this in court.
00:06:24.180 Rapist.
00:06:24.620 Human traffickers are using our website to make money.
00:06:27.880 Children were on the site.
00:06:29.400 Like underage women.
00:06:31.440 Videos of children being abused.
00:06:33.600 Yes.
00:06:34.180 Right.
00:06:34.400 It's so crazy that this was a mainstream.
00:06:37.320 I mean, it was a meme.
00:06:39.700 Like, that's how popular Pornhub was.
00:06:42.320 And to a certain degree, still is.
00:06:43.620 And they had underage victims on the site.
00:06:47.420 I know one of the issues was that they could.
00:06:49.200 There was one big.
00:06:49.860 I think it might have been this story.
00:06:51.000 Victims trying to get the content removed.
00:06:52.580 And Pornhub was like, nah.
00:06:54.040 Right.
00:06:54.300 How are there not criminal charges?
00:06:55.520 How are these people not in prison?
00:06:56.980 It's it's again, they're a huge company.
00:06:59.520 They are a monopoly.
00:07:00.640 They Pornhub itself is just either the eighth or ninth most popular website in the entire world.
00:07:06.100 Uh, and that doesn't include the, this company owns, uh, like almost all the porn sites you
00:07:13.980 think of, uh, their, their parent company.
00:07:16.200 And so they have a lot of power.
00:07:18.840 Uh, they're based in Montreal.
00:07:20.700 So, you know, the U S government has kind of limited things.
00:07:24.080 A lot of people just kind of have to go after them in civil courts.
00:07:27.200 I don't buy it.
00:07:28.220 Yeah.
00:07:28.500 No, no, no.
00:07:29.720 I, I had the, uh, the privilege of meeting Kim Dotcom in New Zealand and his story very
00:07:35.920 famously with Megaupload.
00:07:37.380 Do you guys know the Megaupload story?
00:07:39.160 Not too much.
00:07:39.980 His was 10 years ago, 12 years ago, longer than that,
00:07:42.700 probably. There was a file upload locker website.
00:07:45.640 What that means is it's like Google Drive.
00:07:48.040 You go to Megaupload, you say, I want to store files online.
00:07:51.340 Yeah.
00:07:51.900 People were uploading movies and TV shows that anyone could watch.
00:07:56.140 So new movie came out.
00:07:58.160 You wanted to watch a bootleg and you'd search for it.
00:08:00.200 Megaupload would pop up and you'd get a link and you'd watch it.
00:08:03.500 The, uh, the, uh, law enforcement.
00:08:06.400 I just want to keep, I'll keep it very light.
00:08:08.380 Raided his home in New Zealand.
00:08:10.860 He's never set foot in this country.
00:08:12.780 He's not an American, did not run an American company.
00:08:15.580 And the United States went after him in New Zealand because of piracy.
00:08:20.000 And here's where it's here.
00:08:20.960 Here's the worst part.
00:08:21.600 What Kim says is when they would contact him and say like, Hey, this is infringement.
00:08:25.940 You gotta take it down.
00:08:26.520 He'd say, you got it.
00:08:27.500 Send us the link.
00:08:28.040 We'll remove it all.
00:08:29.160 Right.
00:08:29.280 But so many people were using it.
00:08:30.760 It was an avalanche.
00:08:32.160 They decided just to destroy his life and target him.
00:08:34.900 So knowing that story, having gone down there and interviewed him and let's assume the worst
00:08:39.840 case scenario with Kim Dotcom.
00:08:42.060 He was gleefully supporting copyright infringement.
00:08:45.900 Well, they got them boys.
00:08:47.200 But you mean to tell me with Pornhub, they've got underage victims, trafficking victims.
00:08:53.060 And the US government's like, oh, geez, oh, no, our hands are tied.
00:08:55.680 They're Canadian.
00:08:56.540 I don't buy it.
00:08:57.080 And that's a really good point because Pornhub has, you know, a lot of people say they take
00:09:04.640 copyright infringement more seriously than actual abuse because there's this fingerprint
00:09:10.240 technology that they can apply to the videos on Pornhub.
00:09:14.460 Any, you know, major tech website has it where if you upload a 30 second clip of the Lion King,
00:09:19.380 like it's just not going to happen.
00:09:21.100 It won't be allowed.
00:09:21.920 And so there's this technology being applied to videos of like major popular porn stars so
00:09:31.460 that their stuff doesn't get pirated.
00:09:33.140 But when it comes to victims and underage victims, they're very, very, very slow to apply
00:09:39.640 that kind of technology.
00:09:40.960 And the key is that they knowingly do this and they profit off it because, for example,
00:09:46.420 there was a Senate hearing this week grilling like Mark Zuckerberg and and, you know, TikTok
00:09:53.700 and Snapchat about things that could potentially be problems and children using their websites.
00:10:00.280 And, you know, people have different opinions on that.
00:10:03.180 But certainly they're not trying to make money off of exploiting children and copyright infringement,
00:10:10.540 et cetera, et cetera.
00:10:11.760 But that was the key was in these undercover investigations.
00:10:15.080 The employees admit it's not just a hypothetical problem.
00:10:18.560 It's not just that human traffickers could use our website.
00:10:21.060 It's that they know they do.
00:10:23.080 They know that rapists do it.
00:10:24.560 They admit that they do it and they don't care because it makes a lot of money.
00:10:28.480 One guy told Arden on the undercover tape he was so concerned about it that he and another
00:10:35.600 employee went to the chief legal officer and the chief product officer and told them like
00:10:42.880 this is an issue, like these guys are making a lot of money and they told him to shut the
00:10:46.500 F up because it makes too much money and we don't want to hear about it.
00:10:51.720 I can respect if like somebody was a whistleblower and came out and said, I can't believe this.
00:10:56.320 I can't work here.
00:10:57.200 But I cannot respect someone who decides to keep quiet, keep facilitating child trafficking.
00:11:03.760 It makes me I want to see them all in prison.
00:11:06.700 Yeah.
00:11:06.880 Like they should they should be arrested.
00:11:08.140 They should be criminally charged.
00:11:09.080 They should be investigated.
00:11:10.280 I mean, the fact that you guys captured as much as you did, I mean, these people are
00:11:13.660 basically bragging, or not even, I mean, maybe they're, they're venting.
00:11:16.980 Maybe they're hoping for some kind of, like, it's a mixture of some more bragging, some
00:11:20.460 more venting.
00:11:21.580 Maybe they're hoping for some absolution.
00:11:23.640 If I just tell people, maybe it'll make it all feel maybe they'll tell me it's OK.
00:11:27.680 And I think, you know, James O'Keefe mentioned this.
00:11:29.380 Why is it that so many people spill the beans?
00:11:30.920 I think they have guilty consciences.
00:11:32.540 Absolutely.
00:11:33.100 But I'm sorry.
00:11:33.620 Like, man, the stuff we learned about what was going on, and it's not just Pornhub too, this is an important
00:11:37.600 thing. A lot of people have said, you know, Pornhub gets the brunt of it.
00:11:41.020 Well, they're the biggest.
00:11:41.980 But there are a lot of sites that do this.
00:11:44.060 I mean, I have questions about OnlyFans.
00:11:46.000 Yeah.
00:11:46.520 And that's getting crazy, too.
00:11:47.740 I bet OnlyFans is going to completely supplant Pornhub in the future.
00:11:51.180 Yeah.
00:11:51.620 Well, it may.
00:11:52.880 Yeah.
00:11:53.340 You know, we we definitely.
00:11:55.460 Yeah.
00:11:55.780 Like you said, like a lot of people are like, why did they just get the brunt of it?
00:11:58.580 Because they're the biggest.
00:12:00.160 But we, you know, sound investigations were a new venture.
00:12:03.420 We want to do more.
00:12:04.620 We want to investigate more without without saying too much.
00:12:08.980 We have plans to to to go after more of these criminal organizations and figure out, you
00:12:14.740 know, what they're doing criminally, how they're affecting the culture, that kind of thing.
00:12:18.840 I'm actually surprised the New York Times kicked off this story.
00:12:22.060 Yeah.
00:12:22.480 Yeah.
00:12:22.860 I mean, it was just that egregious that that that.
00:12:26.000 Um, that, yeah, like it's not like a political issue that people, uh, even people in the porn
00:12:33.020 industry have said, like, there have to be regulations.
00:12:36.140 Because like you said, in the 70s, you know, kids can't just walk into sex shops.
00:12:41.560 They can't just go rent, uh, pornos.
00:12:44.220 You know, we're talking about just completely unfiltered, uh, websites.
00:12:50.260 Um, now we have, uh, in the past, in the past few months, more and more states are passing
00:12:55.520 ID laws to get, uh, to require IDs to, uh, to access websites similar to how you have
00:13:02.500 to show an ID to go into a sex shop or to buy alcohol online.
00:13:06.480 Um, but actually two of the guys that we recorded are getting subpoenaed in a, um, in a big civil
00:13:14.600 action, um, lawsuit.
00:13:16.920 Yeah.
00:13:16.940 It's a child sex trafficking case out of Alabama where a bunch of victims of child sex trafficking
00:13:23.280 that was monetized on Pornhub are suing the company.
00:13:27.120 Wow.
00:13:27.660 So these, but these are Canadians.
00:13:29.180 Right.
00:13:29.720 So they, yeah, they are getting subpoenaed.
00:13:32.200 The one guy was like, yeah, I wouldn't be able to defend this in court.
00:13:35.100 This wouldn't hold up in court.
00:13:36.280 Yeah.
00:13:36.700 And he even admitted to me on camera that they have corporate attorneys that coach them
00:13:41.880 to hide information during depositions.
00:13:44.480 And so I hope that that's used in court.
00:13:48.500 I'm, I'm just so sick of all of this, you know, cause we've been talking about the, you
00:13:52.840 know, uh, the big news recently has been like border stuff.
00:13:55.720 Yeah.
00:13:56.200 And I keep hearing excuses for the people facilitating all of that kind of, uh, trafficking.
00:14:01.320 And of course there's an overlap.
00:14:02.640 I mean, some of the people that are being brought across the border in the United States are,
00:14:05.280 are sex trafficking victims, but it's just a lot of people want to make excuses because
00:14:10.240 someone's wearing a badge, you know, these, these CBP guys, well, you know, they're working
00:14:14.160 with the cartels.
00:14:14.840 They're bringing them in.
00:14:15.400 They're putting them on buses for the smugglers, but they were told to do it.
00:14:18.600 And I'm like, not interested, man.
00:14:21.480 So the bigger picture, the reason I bring this up is I, you've got, you've got these
00:14:25.300 big companies.
00:14:26.320 How many employees does Pornhub have?
00:14:29.360 It's difficult to say.
00:14:31.160 Yeah.
00:14:31.640 And, you know, between Montreal and Eastern Europe, it's between like 1,000 and 3,000.
00:14:35.780 Yeah.
00:14:35.960 I would say about 2000.
00:14:37.560 And I would say about maybe 500 to 800 of those are in Montreal.
00:14:41.920 Yeah.
00:14:43.400 500 to 800 in Montreal.
00:14:44.560 Okay.
00:14:44.760 So let's just start there.
00:14:45.760 They should all be in prison.
00:14:47.400 At the very least, what needs to happen with all of their employees is subpoena, criminal
00:14:53.960 investigation, lawyer up, baby.
00:14:55.580 Yeah.
00:14:56.320 There is a criminal investigation out of the Eastern District of New York where Pornhub admitted
00:15:01.460 to profiting off of sex trafficking, but they just got a slap on the wrist.
00:15:05.500 Basically, they got put on like a three-year probationary period, where if any new sex trafficking
00:15:11.620 material is found on the site, then the criminal charges become permanent.
00:15:16.220 Wow.
00:15:16.940 It's like, I don't care if you work in the mailroom.
00:15:19.320 If you knew this stuff was going on, we got a pair of cuffs waiting for you, but nothing
00:15:24.400 ever gets done.
00:15:27.440 Yeah.
00:15:27.880 It's kind of the problem of a, of a giant company.
00:15:30.700 Um, but I think, I think that, uh, like, like Arden said, you know, the Eastern District
00:15:36.580 of New York, they did the slap on the wrist.
00:15:39.280 They have a three-year probation, but I think now is when they might be vulnerable.
00:15:43.200 I think now we need to get more admissions.
00:15:45.920 We need to get more stories on them and, uh, and then continue, um, you know, getting those
00:15:52.680 up into, into the, uh, the regulators.
00:15:55.740 Yeah.
00:15:55.840 And these ID laws that are passing across the U.S., they're really, really actually impacting
00:16:01.720 Pornhub in a negative way.
00:16:03.320 Um, Pornhub decided to comply with the ID laws in Louisiana and they reported after that
00:16:10.440 an 80% decrease in traffic to the site out of Louisiana.
00:16:14.200 So compliance with age verification actually decreases traffic by 80%, which is why they just
00:16:21.760 decided to completely block access to everyone in those states as more of these laws pass.
00:16:26.220 Well, it's, it's, it's a, it's a, there's a couple of issues there for YouTube.
00:16:30.840 If you make a video on YouTube and they determine it to be age restricted content, the only way
00:16:36.520 to watch it is to log in and that shatters your traffic, not because you're making porn.
00:16:41.800 Like I could make a video where it's like, look at this, you know, violent video that
00:16:45.840 happened in New York.
00:16:46.480 Oh, it's breaking news.
00:16:47.180 And then nobody watches it because they won't recommend it.
00:16:50.740 And if you click on it, it says, please log in to view this video.
00:16:53.400 And people just go, I don't know, whatever.
00:16:54.660 They just don't do it.
00:16:55.680 Right.
00:16:55.860 People don't want to log in.
00:16:57.400 So Pornhub is basically like, people just want to go to the site and watch.
00:17:00.880 They don't want any barriers for them.
00:17:02.340 Well, that's too bad.
00:17:03.420 They should legally have to have those barriers.
00:17:05.560 And if that means they operate only at 20%, well, congratulations.
00:17:08.620 That's your market cap.
00:17:10.360 But what, so break down for me what the law does specifically in, in, in these states with
00:17:14.340 these IDs.
00:17:14.720 Yeah, the, it's a little bit different and most of them are based in Louisiana law.
00:17:20.300 Uh, Texas is a little bit different.
00:17:22.500 Um, I'm not, I'm not, you know, an expert in all the legalese, but it does require that
00:17:28.440 if your site hosts more than two, if more than two thirds of the content that your, your
00:17:34.780 site hosts is, uh, pornographic, then you need to require, um, ID to, to, you need to
00:17:44.340 require somebody's ID through a third party vendor.
00:17:47.060 That's, that's, that's still kind of insane.
00:17:48.900 To, to access it.
00:17:50.500 That's crazy.
00:17:52.120 There's, there's a threshold.
00:17:53.340 It's like, okay, so you can, you can have, you can have porn.
00:17:56.640 Yeah.
00:17:56.920 Well, I think the idea is to prevent like Twitter and social media.
00:18:01.560 I, I, I definitely understand the, the idea because it's like, again, what do we get into?
00:18:08.340 Like the, the one-offs like where, yeah, somebody could have uploaded, you know, porn to some
00:18:13.680 of the site, but the site isn't really for porn.
00:18:16.180 Um, I think it's to prevent like social media from getting caught up in that.
00:18:20.000 Yeah.
00:18:20.180 I think like.
00:18:20.720 They should be caught up in that.
00:18:22.380 Perhaps.
00:18:23.020 I mean, imagine Twitter as downtown, right?
00:18:26.640 It's where everybody's talking, everybody's hanging out and you can choose different neighborhoods,
00:18:30.660 little pockets where people are talking and retweeting and stuff like this.
00:18:33.400 Kids can walk around downtown.
00:18:34.780 Okay.
00:18:34.900 We, we all agree with that.
00:18:35.720 They can go to the butcher shop, whatever.
00:18:37.700 And there's an adult bookstore, but the windows are blocked out.
00:18:40.500 Nobody can see inside.
00:18:41.460 Nobody.
00:18:41.920 It doesn't matter how old you are.
00:18:43.220 You got to knock on the door.
00:18:44.160 They'll open it up and they'll say ID, please.
00:18:46.340 Twitter should have to do the exact same thing.
00:18:49.060 For, for like it's pornographic sections.
00:18:51.360 Exactly.
00:18:51.840 Yes.
00:18:52.180 That makes sense.
00:18:52.820 Yeah.
00:18:53.060 If you're on any social media platform, so that, you know, people have debated this,
00:18:57.400 like if you require an ID for Twitter, then politics has damaged people who want to speak
00:19:02.560 anonymously whistleblowers, for instance.
00:19:04.200 And I'm like, yeah, you can walk downtown and you don't need an ID.
00:19:07.560 You don't got to say who you are.
00:19:08.840 It's your, your, your right to privacy, even in a public space, they can see your face and
00:19:12.640 all that.
00:19:13.480 But if you want to go into a porn shop or you want to buy cigarettes or booze, now we're
00:19:18.160 going to need an ID from you.
00:19:18.940 So here's how I look at it.
00:19:20.000 Twitter, Facebook, Instagram, Snapchat, whatever.
00:19:23.060 If somebody posts porn available for anyone to see, I don't see a difference between
00:19:28.680 that and walking around downtown, holding a big poster of porn.
00:19:32.040 You're going to get, it's, it's obscenity laws.
00:19:33.780 I agree.
00:19:34.120 You get arrested.
00:19:34.920 Yeah.
00:19:35.120 They come, they take it down and say, we're going to, we're going to arrest you.
00:19:37.680 I mean, honestly, the cops may come down, destroy the sign and say, don't do that again.
00:19:41.700 That's probably what happened.
00:19:42.840 A lot of circumstances.
00:19:44.000 There was this story.
00:19:45.120 Someone sent me.
00:19:46.680 Actually, it was one of our staff in a, in our internal communications platform where a guy
00:19:51.360 was naked in his own home, walking around.
00:19:54.360 It wasn't, he was a trucker.
00:19:56.360 There's a shared apartment.
00:19:57.880 It was empty.
00:19:58.320 He was by himself.
00:19:59.140 So he gets up to start packing and he's naked.
00:20:01.280 And a woman saw him through the window and called the cops.
00:20:03.820 The cops come and knock on the door.
00:20:05.500 And he's like, what?
00:20:07.320 And they were like, seems normal to us.
00:20:08.600 Nothing seems out of the ordinary.
00:20:09.360 Another woman claimed she saw him standing in the doorway.
00:20:12.300 He made eye contact with her and like moaned or something, which I really don't believe.
00:20:16.820 And he ended up getting convicted of a misdemeanor for being naked in his own house.
00:20:21.640 He appealed it, risking going to jail and then ended up winning.
00:20:25.380 It's like, bro, I'm in my own house.
00:20:26.800 Don't look through my windows.
00:20:27.620 Like that's crazy.
00:20:29.380 But anyway, I digress.
00:20:30.340 Like, it's funny to me that this guy got criminally charged in the
00:20:34.420 US and these kinds of people outright saying they know what they're doing.
00:20:38.000 Get away with it.
00:20:39.400 And back to the main point.
00:20:41.540 If I think that, uh, you're like, if somebody is in public putting up flyers with adult graphic
00:20:49.240 content, we wouldn't tolerate it.
00:20:50.880 We should not tolerate it on, on X or Instagram or any other platform.
00:20:54.000 Right.
00:20:54.140 And like you said, there's laws in the books for this already in the physical space.
00:20:57.380 Um, and, but how does it not apply? It does apply, it does.
00:21:01.280 It's just that it's not being enforced.
00:21:03.560 Yeah.
00:21:04.060 And, um, and so, you know, now we've got like more and more laws about digital space, but
00:21:08.740 you know, in reality, they could enforce the existing laws.
00:21:12.780 Um, the other thing is that they need to enforce laws that for, for age verification for the
00:21:18.300 people being uploaded because of so many victims that, you know, this is the original
00:21:22.440 thing back to the really egregious stuff.
00:21:24.660 Uh, you know, the, the New York times talked about in these employees, uh, talked about
00:21:28.780 in the, in the undercover investigations is complete unverification of, of sex trafficking
00:21:35.260 victims, rape victims.
00:21:36.920 Um, again, in, in, you know, back to the seventies in pornos, you have to, you have to have age
00:21:44.460 verification and consent verification forms for adult, uh, film stars.
00:21:49.120 But now, you know, Pornhub, you can, you can upload content.
00:21:53.100 You can, uh, of, of anything, you know, what's crazy is, uh, OnlyFans.
00:21:57.580 Cause there was one instance where this woman, or this young girl, she was 17.
00:22:04.560 She had, she had a bunch of followers.
00:22:06.500 The moment she turned 18, she had porn.
00:22:09.620 And then everyone kind of asked the question, she's underage in those videos.
00:22:14.480 Those videos were like, she turns 18 within minutes.
00:22:17.300 She's got all these videos up and they're like, no way she filmed all that in a few minutes.
00:22:20.100 Those are videos of an underage girl.
00:22:23.400 And she can just say, no, I filmed them when I was 18 or whatever.
00:22:28.200 I don't think so.
00:22:29.600 I mean, there's, there's, there's questions there on how you handle that, uh, legally
00:22:33.040 and morally.
00:22:33.560 But I guarantee if they went onto her computer, they would see, you know, date created and
00:22:38.240 they would see that she was underage when she filmed these videos and then uploaded herself
00:22:41.880 underage.
00:22:43.240 I, I, I, I, I kind of think we, we probably got to go to a full scale regulatory model of
00:22:49.920 the porn industry.
00:22:50.540 Meaning you got to get a permit from the government before you can upload pornographic content.
00:22:54.760 I don't, I don't understand.
00:22:55.860 I think that's basically how it, how it existed 50 years ago and really how it existed before
00:23:00.320 the mass tube sites existed, uh, going back to the nineties, uh, you know, I think that
00:23:07.620 that's really, you did, you have to have, it's called section 2257 of the child, uh, protection
00:23:14.320 act.
00:23:14.980 And you have to have forms that show this model is here.
00:23:20.520 We have their IDs on record.
00:23:22.080 Here's their signed consent forms.
00:23:23.740 You have to have all that.
00:23:25.460 If you're, if you're like a studio, you have to have that.
00:23:28.600 But if you were just some, some guys, you know, some group uploading some fly by the
00:23:35.340 night studio, uploading to a tube site like Pornhub, um, then they don't ask for any of
00:23:40.880 that.
00:23:41.500 I think OnlyFans requires ID, right?
00:23:43.500 Like if you're a creator, you have to like submit an ID or something.
00:23:47.160 They require a bank account at the very least.
00:23:49.780 Yes, they actually do.
00:23:50.840 They do.
00:23:51.440 Uh, they do require an ID.
00:23:52.960 You need to, um, you need to verify with, with your face and your, your ID next to your face.
00:23:57.860 Yeah.
00:23:58.320 Yeah.
00:23:58.700 Yeah.
00:23:58.900 Their website says you have to be at least 18 years old to create an account, to access
00:24:02.200 the platform as a fan or a creator.
00:24:04.280 And so I can respect that, right?
00:24:07.340 You can't even sign up unless you, you, you have that, uh, in, but I also think there's
00:24:11.420 probably something just, look, it's bad.
00:24:15.300 The ubiquity of porn and where we're at as a society.
00:24:18.020 I don't think it's healthy.
00:24:19.880 Libertarian argument is, you know, let people do what they want to do, but I think it's causing
00:24:23.940 severe depression, it's causing psychological issues in young people.
00:24:27.960 And the challenge then becomes, do we decide to cross a moral line and say, we get it.
00:24:34.680 You like it.
00:24:36.440 These people are adults, but we've, we crossed the Rubicon on how this is screwing people's
00:24:41.260 brains up substantially.
00:24:42.240 And I don't know, what do you guys think about that?
00:24:44.240 Having, having seen what you've seen?
00:24:45.560 Well, I don't think we like, I don't even have to speak about what I think because one
00:24:49.420 of the guys in the undercover tapes who works at Pornhub and worked there for over 11 years,
00:24:55.880 he was the seventh employee at Pornhub literally says that porn's not healthy.
00:25:01.600 It's addictive.
00:25:02.440 It's a drug.
00:25:03.400 We don't know the health implications of it.
00:25:05.580 We don't know any of the implications of it.
00:25:08.360 Uh, it's damaging to relationships.
00:25:10.300 He actually brings up Jordan Peterson and completely mirrors his view on porn.
00:25:14.320 Wow.
00:25:14.900 Yeah.
00:25:15.480 The guy's a product manager at Pornhub and hates the, and hates the product he makes.
00:25:19.640 I mean, come on.
00:25:20.700 If you're a, if you're a crack dealer, you don't do your own product.
00:25:23.240 You know how bad it is.
00:25:24.300 Yeah.
00:25:24.760 Like every day you're slinging crack, the crackhead walks up to you and they're all messed up
00:25:28.740 and you're like, I ain't touching that stuff, but you can have it.
00:25:31.520 Man, people are just evil.
00:25:33.440 I don't know.
00:25:33.900 I'm sorry.
00:25:35.360 That's, that's, that's, that's just nasty.
00:25:37.140 And I think it brings up a question of, um, do people actually like porn or are they addicted
00:25:44.480 or when they were children, did they, did they get addicted?
00:25:49.160 Um, and you know, are they just, are they just drug dealers?
00:25:53.300 I feel like, uh, you know, mostly men, but women too, they, they get excited, right?
00:26:00.420 You know, humans want to procreate.
00:26:01.980 And that what happens is you make it snap of a finger easy to satisfy that urge through
00:26:10.100 the internet and you get a fork in the road, go to the bar, try and meet some women or
00:26:15.900 men, or go online and look at pictures.
00:26:19.080 I suppose for women, a lot of them, it was like 50 Shades of Grey or whatever, but they'll,
00:26:23.320 they'll satisfy that in some other way, which does just enough in, you know, their dopamine
00:26:27.900 or whatever, whichever receptor is related to that, that they just abandon, you know,
00:26:33.160 pursuing relationships.
00:26:35.940 It's kind of crazy to think about if you go back before the era of porn in any capacity,
00:26:39.680 if a man and a woman were feeling randy or whatever, you better go negotiate with another
00:26:46.080 person.
00:26:47.080 Like guys got to go to a woman and be like, I need to convince you to get me some.
00:26:51.160 Now they're just like, I can pay for it.
00:26:53.120 I can get it from anybody on the internet.
00:26:54.400 I can, it's not the same, but it's enough.
00:26:57.400 And now relationships are just on fire.
00:26:59.440 Like social order is, is in chaos.
00:27:02.000 Birth rates are declining.
00:27:03.920 Yeah.
00:27:04.500 And, and one of the things about the Texas age verification law is that it also states
00:27:10.240 that any pornography website has to include a disclaimer about the social and health implications
00:27:17.680 of porn before proceeding.
00:27:19.720 Basically just like tobacco products.
00:27:21.480 They, uh, and, and unfortunately, uh, there was an injunction against it, but, uh, Paxton
00:27:27.720 has been fighting it and the, the junction has, uh, the injunction has been lifted while
00:27:32.140 the appeal is ongoing.
00:27:32.980 Yes.
00:27:33.360 Yeah.
00:27:33.480 In Texas.
00:27:33.980 Yeah.
00:27:34.460 So he's doing a great job of, of pushing that law.
00:27:36.800 Oh, okay.
00:27:37.260 Okay.
00:27:37.420 What was the, he, he, he said he was fighting the injunction for him.
00:27:39.940 Sorry.
00:27:40.340 There was an injunction against the law.
00:27:42.080 Oh, yeah.
00:27:42.980 Unfortunately, some court, uh, sided with, um, uh, there's a group called the Free Speech
00:27:49.000 Coalition, uh, which is really just like Pornhub's legal arm.
00:27:52.960 Um, and they never fight for free speech for anybody else except for pornography companies.
00:27:58.460 And, um, uh, and so they, they've been fighting all of these in courts.
00:28:04.500 The only place they've succeeded is in a court in, um, in Texas.
00:28:09.320 Uh, so they had an injunction against the law, but Ken Paxton has been appealing the injunction
00:28:14.600 and has gotten a lift on the injunction for now while the appeal is still going on.
00:28:19.560 We had, uh, someone on this show talking about the books that are in, in grade schools, like
00:28:25.520 genderqueer and this book is gay.
00:28:27.540 And, uh, she's just like, well, I'm not for censorship.
00:28:30.260 So I'm for free speech, blah, blah.
00:28:32.040 And I'm like, yeah, I'm for censorship.
00:28:33.540 And I've, I've, I've never argued otherwise.
00:28:35.380 Right.
00:28:35.940 When we, when we say we're against censorship in a colloquial context, we're referring to
00:28:39.600 politics.
00:28:40.240 Does someone have a political opinion to disagree with?
00:28:41.900 They should be allowed to express their opinion.
00:28:43.440 The ability to, to dissent.
00:28:45.160 Right.
00:28:45.620 Right.
00:28:45.820 But when it comes to censorship, I'll say this outright to her.
00:28:49.560 And I'm like, yeah, absolutely.
00:28:50.900 I want these books censored.
00:28:51.880 Are you, is that a joke?
00:28:53.020 Like adult graphic content being put in schools for kids?
00:28:56.060 Not interested.
00:28:57.100 Censor that in two seconds.
00:28:58.320 I got no problem saying that.
00:28:59.760 And nobody should, but they try to, they play this trick.
00:29:02.220 Like I thought you were for free speech.
00:29:04.480 It's like, yeah.
00:29:05.140 Yeah.
00:29:05.320 When it comes to like political ideology, not showing kids porn, you creep.
00:29:09.720 That's where we're at though.
00:29:11.440 I wonder what the next generation ends up looking like when not only is it ubiquitous, but what
00:29:17.920 you guys uncover is employees, they know how bad it is.
00:29:21.820 And with smiles on their faces, like I'm gonna keep doing it.
00:29:24.440 Yeah.
00:29:25.020 And, and we're seeing more and more reports coming out.
00:29:27.920 I think there was one out of the UK recently or a growing number of sexual crimes against
00:29:33.660 children are actually committed by other children.
00:29:35.660 And it calls into question, like, what has online pornography done to this young generation?
00:29:43.220 I was exposed to pornography, unfortunately, by another child at a very young age, and I
00:29:47.560 never forgot it.
00:29:49.100 Luckily, I, you know, I didn't get addicted or anything like that, but it bothered me for
00:29:52.520 a very long time.
00:29:53.460 It's not something you forget.
00:29:55.200 I can't imagine how many kids nowadays are being exposed to things like that.
00:30:00.160 It's crazy because if you really think before the internet, you know, it's the, the trope
00:30:05.860 in movies of the kid being like, yo, I stole my dad's nudie collection.
00:30:09.140 Let's go down by the river and read it or whatever.
00:30:11.300 It's like, I don't know, I guess it's what kids did or maybe the movies claimed, but even
00:30:14.760 before magazines or whatever, this, this concept did not exist.
00:30:18.020 And seemingly overnight, every child growing up that's online has in schools, there's nothing
00:30:24.340 they can do like in the immediate to block it.
00:30:27.420 We, we, we, I don't think a functioning society can tolerate the ubiquity of, I'm not just
00:30:33.420 talking about porn, like these videos go nuts.
00:30:38.240 You know, I think the older generation, when they hear porn, they imagine like something
00:30:43.200 from the seventies where a guy's like, Hey babe, I'm the pizza delivery man.
00:30:45.900 And she's like, why don't you come in and I'll pay you.
00:30:48.160 Ha ha dude.
00:30:49.020 The videos that are online now are the wildest, craziest things you can imagine earmuffs for
00:30:54.640 your kids.
00:30:55.640 Cause I'm, I'm just going to come out and I'll be light with it.
00:30:58.640 We all know.
00:31:00.640 I'll leave Jake, you got out of this one, but there are videos of like animals, like kids
00:31:06.220 get access to the weirdest things.
00:31:08.640 Yeah.
00:31:09.640 I, I don't even want to mention, but I think let's just say, well, we'll stop at animals,
00:31:15.640 videos of people and animals, kids can find that stuff.
00:31:18.400 It's, it's, it's messing them up.
00:31:20.320 And I think, like I was saying, a lot of these adults, when they're talking about, when you,
00:31:25.720 when you talk about porn, they're imagining like a playboy, like a hustler, like a nudie
00:31:28.780 magazine or whatever.
00:31:29.520 It's like, dude, you have no idea.
00:31:31.740 Videos of just like seven people with weird objects swinging from ceiling fans and a dog
00:31:39.280 comes in and it's just, this stuff is insane.
00:31:42.440 And these kids are watching it and I'm, I'm like, man, their, their brains are going to
00:31:48.320 be completely fried.
00:31:49.900 Yeah.
00:31:50.360 And it's like, um, it's like Dylan Rice said in, in the one video, he's a, he's a, um,
00:31:56.400 Senior script writer.
00:31:57.680 Yeah.
00:31:57.820 He was a script writer for, for a lot of the studios at MindGeek, the owner of Pornhub.
00:32:02.560 Porn studios.
00:32:03.160 Yeah.
00:32:03.340 Um, and, and he talks about, um, the way that they, they try to introduce more and more extreme
00:32:11.720 content.
00:32:12.480 He said, he says, quote unquote, straight men, you know, we had these sites dedicated
00:32:17.120 to straight men, but then we see, Hey, can we introduce by content?
00:32:21.020 Can we introduce some queer content in here?
00:32:23.080 How can we do something that's more counter-cultural?
00:32:26.140 How can, you know, and, and they talk about push the envelope and he, and he even says to
00:32:30.980 me, uh, you know, do you know what the main audience is for trans angels?
00:32:35.720 It's one of their paid subscription sites and it's, it's all trans people.
00:32:39.340 And it's, it's mainly female presenting trans people.
00:32:43.680 And he said, our main buyers for, for that site are straight men.
00:32:48.500 Well, they're clearly not straight men.
00:32:50.480 Yeah.
00:32:50.960 I think, you know, like he says, see if you can convert somebody.
00:32:54.220 Yeah.
00:32:54.440 So I, you know, I can understand that where it's like a one component of what you guys
00:32:59.780 found was he's saying, we want to introduce queer content to start shifting the perspectives.
00:33:04.160 And so I would just put it like, if you're a guy and your whole life, you've been watching
00:33:10.820 like women.
00:33:11.580 And then one day someone says, here's a guy and you go, I like that.
00:33:15.200 Well, you're bi or you're gay.
00:33:16.680 And so it's a, it's a question of, yes, we can acknowledge they are trying to manipulate
00:33:22.420 people and convert them or whatever.
00:33:24.880 But I'm kind of just like, I think those people were just gay.
00:33:29.040 You know what I mean?
00:33:29.700 Like, I think that could be the case.
00:33:31.360 It's possible.
00:33:32.180 However, I, you know, unfortunately did visit trans angels.
00:33:35.880 Uh, and from the waist up, a lot of these actors, uh, really do look like natural women.
00:33:44.860 And I, if I didn't know it was trans angels for some of them, I would have been like,
00:33:49.300 well, that's a woman.
00:33:50.500 Right.
00:33:50.980 Right.
00:33:51.360 And, and so, uh, they, there's like funny memes where you'll see what looks like breasts
00:33:57.100 and then the camera zooms out and it's a fat guy's ass with like a bra on it.
00:34:01.140 And they're making the point of like, in your mind, you see what looks like cleavage.
00:34:05.900 A guy's imagining a woman and big boobs.
00:34:08.680 And then it zooms out and like, you were actually getting off on like a fat guy.
00:34:12.440 And they're like, ah, like it's messed up.
00:34:14.040 But I could understand if you keep feeding content to these people and you inch them
00:34:20.460 incrementally towards shifting their view, that's, I guess, the argument they're making.
00:34:25.680 I kind of feel like though, if you're a, I'm sorry, if you're a straight guy and they show
00:34:30.300 you what looks like a woman and you're like, yeah, what a hot chick.
00:34:32.700 And then it pans down and you go, yeah, I'm okay with that.
00:34:35.000 Well, you're gay.
00:34:35.800 There's nothing.
00:34:36.280 I don't got any beef with that.
00:34:38.000 But I think that means you are to some degree like bisexual or gay.
00:34:41.800 And by all means, that's fine.
00:34:45.320 And if you are, I don't care, do whatever you want to do.
00:34:48.040 I guess, like, within, within, you know, don't hurt anybody.
00:34:48.040 But I think that just means they were gay dudes to begin with.
00:34:51.460 Yeah.
00:34:52.000 And I think that's probably the case in many cases.
00:34:55.020 I think what Dylan Rice is saying is that there with pornography, there's an excessive like
00:35:02.000 need for more, need for more to more extreme content.
00:35:06.240 Yeah.
00:35:06.380 You know, eventually you're just not into straight content anymore.
00:35:10.720 It doesn't get you off anymore.
00:35:12.740 And so you like, they believe conversion therapy is real and works.
00:35:17.100 Well, I think they, they need to find, you know, eventually somebody only buys so many,
00:35:22.660 you know, they buy all your straight content.
00:35:25.480 Now, what else do they buy?
00:35:27.480 So like, yeah, he calls straight content vanilla content though.
00:35:31.960 But if you look at, you know, hold on, sorry, like all straight content.
00:35:37.700 He just says like, oh, you know, the pretty blonde girl with big boobs with the guy.
00:35:41.920 That's, that's vanilla.
00:35:43.140 What if she's like swinging from a ceiling fan and they're wearing like parachutes or something?
00:35:46.120 Is that vanilla?
00:35:46.560 Well, it might be more theatrical, but, um, I don't, I don't know, but what, what else
00:35:54.080 did he say?
00:35:54.700 He said, uh, well, he's, he's talking about like, they need, you know, he says like, we're,
00:36:02.520 it's, it's all about the money.
00:36:03.720 Like we need to push more and more content on people because again, they bought that,
00:36:10.480 you know, he says like, I think it's browsers is for straight guys.
00:36:14.160 So like Kings, yeah, reality Kings.
00:36:17.520 So like, these are all ALO owned sites.
00:36:19.600 Yeah, right.
00:36:20.840 Uh, the, the owner of Pornhub owns all these sites and how do we then like get them to
00:36:27.540 buy more subscriptions, to buy more videos to our other sites?
00:36:31.600 Well, like, you know, eventually they just run out of this is vanilla content.
00:36:36.560 This is actually really interesting.
00:36:37.500 So I think there may be like some serious scientific data in here, which could massively backfire
00:36:43.400 on leftist ideology.
00:36:45.260 If this is true, I'm not going to, I'm not a scientist.
00:36:47.380 I'm not going to pretend, uh, I know exactly what happens to these guys, but if he's saying
00:36:52.100 in their pursuit of selling more content, they have found that you can introduce homosexual
00:36:59.660 content to what, what are straight, perceivably straight men, and they slowly start buying it.
00:37:06.840 If they have the data on that, that proves conversion therapy is a real thing.
00:37:09.840 And yeah, it's not even only like gay, trans, bi content.
00:37:15.700 It's also the incest, the step family category that's becoming increasingly popular, uh, that
00:37:23.260 one of the employees told me he was really concerned about it, having real world effects,
00:37:28.380 making people, viewers actually act out these fantasies or try to with family and it causing
00:37:35.160 actual world issues, not to mention the teen category is, uh, you know, one of the most
00:37:41.980 popular categories.
00:37:43.660 And now they claim, oh, this is 18 and 19 year olds, but we have young, very young looking
00:37:50.340 people in school outfits and things like that with teachers.
00:37:55.000 And it's, um, it's very obvious what they're trying to push.
00:37:58.400 Dylan admitted that they purposefully will cast actors who appear to be around 15 in order
00:38:05.800 to appeal to pedophiles, which they can easily turn into whales or big spenders.
00:38:11.180 And it also appeals to young teens.
00:38:13.720 I just, I just ban it all, make it all illegal.
00:38:16.840 You can't do it anymore.
00:38:17.640 I don't know.
00:38:17.960 Just the, everything we're hearing is just so nightmarish, but like the, the liberty minded
00:38:22.460 part of me is like, just regulate it and stop the dark stuff.
00:38:26.160 I just don't know if you can, like, what's the solution if, if it's available, even if
00:38:31.880 you're 18, even if you do ID verification, the, the idea that they're, they're peddling
00:38:37.600 this addictive substance as it were, and they're making you more.
00:38:41.520 And like there's the, it starts with vanilla and then every day you're using it.
00:38:47.520 They're sending you something and they're making you go crazy.
00:38:51.000 Essentially.
00:38:52.580 I don't know.
00:38:53.460 What do we do?
00:38:53.960 Do we just say no more?
00:38:55.020 We, you can't do it.
00:38:56.160 I mean, I guess I view it as like prostitution's always been around, oldest profession.
00:39:03.480 And, and then, you know, we, we enter the, the modern era of like videotapes and it's
00:39:12.120 still like, then we have, you know, we have red light districts or we have, you know, we
00:39:18.420 have studios, we have laws that require age verification, consent forms.
00:39:23.380 Um, you know, this stuff is regulated.
00:39:26.440 And then we hit the aughts era.
00:39:30.060 We, we hit the, the, the 21st century and then mass tube sites happen where user generated
00:39:37.060 content, it just becomes a free for all.
00:39:40.220 And all of a sudden, like nobody's willing to regulate those industries either because
00:39:44.320 they don't know how, or because these companies suddenly made a lot of money.
00:39:48.840 They have a lot of capital.
00:39:50.280 They have a lot of power.
00:39:51.560 Well, I got good news and bad news.
00:39:53.140 The good news is the creation of porn with actors and studios and all that stuff is on
00:40:01.360 the way out.
00:40:02.100 And I believe will likely cease to exist within a few years to be replaced by AI for real.
00:40:10.520 Like OnlyFans, you're done.
00:40:13.320 There's already, like I covered one of these stories, the video they can generate with AI.
00:40:20.300 There's like, she talks to you.
00:40:22.020 There's a woman and she, there's a video suffer at the beach and she's like shaking her hips
00:40:25.880 and she's smiling and winking.
00:40:26.920 And she's like, Hey guys, what's up?
00:40:28.820 And it's totally AI generated.
00:40:30.360 And they're making like 30 grand a month off of this.
00:40:33.360 So for what purpose would someone need to pay a woman when they can just sign up for a website
00:40:41.220 where they can type in, this is what I want.
00:40:43.220 And AI generate someone to do this for them.
00:40:45.920 Where that goes is crazier because we're separating, you know, these people you've exposed who are
00:40:51.240 like, we know what's going on.
00:40:52.480 We know what we're doing and why we're doing it.
00:40:53.980 And we don't care.
00:40:54.680 We make money.
00:40:55.440 They're out of the equation now.
00:40:57.260 A website will just be like, you're allowed to generate AI content so long as it's legal.
00:41:01.400 And then all of a sudden you're going to get an individual.
00:41:06.060 They're going to go onto the website and they're going to type in, make this for me.
00:41:09.040 They are going to pursue in random crazy directions.
00:41:12.180 It's not going to be that some guy working at Pornhub is like, let's see what happens
00:41:15.900 if we send this straight guy, like gay content.
00:41:18.240 It's going to be a dude just going wherever.
00:41:21.900 I think it's going to get absolutely insane.
00:41:24.380 Everyone's going to have a weird personalized hyper porn experience if this is not regulated
00:41:30.400 in some way.
00:41:31.420 And we already see where this is going with like the Taylor Swift stuff on X where they
00:41:36.620 just do these videos they posted of Taylor Swift.
00:41:38.860 That's crazy.
00:41:39.860 The photos.
00:41:40.720 Sorry.
00:41:41.640 That's where we're headed.
00:41:43.360 I don't even know if there's a solution to that.
00:41:45.620 I don't know.
00:41:46.500 Yeah.
00:41:46.980 I mean, how do you regulate that at that point?
00:41:49.300 And, you know, how does the regulation ensure that you don't stifle AI development in general?
00:41:56.960 And it calls into question because there are more reports of AI child sexual abuse material
00:42:04.360 being generated in, you know, a variety of different ways.
00:42:07.220 But then the argument is, well, that it's not a real child being abused.
00:42:12.340 So one of the one of the questions as it pertains to that was there are adult women 20 or 30
00:42:19.900 years old who look like children.
00:42:21.700 And so you've got these people, these creepos arguing.
00:42:26.460 I mean, this is actually a big moral question.
00:42:29.020 There was a show about a woman who suffered some hormonal disorder, which prevented her
00:42:35.060 from aging beyond what appears to be like eight years old.
00:42:37.840 And so she's like in her mid 20s, wants to have a relationship, but looks like a child.
00:42:44.660 Yeah.
00:42:45.260 And it's.
00:42:45.800 And sounds like one.
00:42:46.660 And sounds like one.
00:42:47.400 And I'm like, I genuinely feel for that poor woman.
00:42:52.080 But any guy who is going for that, I'm sorry, like you're attracted to a child.
00:42:58.600 Yeah.
00:42:58.680 Huge concern.
00:43:00.140 Yeah.
00:43:00.420 And that sucks.
00:43:01.160 But so in that capacity, there have been people made these arguments.
00:43:03.640 They're like, oh, yeah, this is this this woman in this video.
00:43:06.980 Like you were mentioning, they try to make them look young.
00:43:09.040 Oh, she's 20 years old, though.
00:43:10.400 So it's totally fine.
00:43:11.160 Even though she looks 15 and she's wearing a schoolgirl uniform.
00:43:14.580 Like the questions around legality and morality, this is super, super difficult.
00:43:19.200 Do you make illegal any kind of porn that depicts situations of minors, regardless of
00:43:26.360 the actor or the person involved?
00:43:27.760 In which case, if someone creates AI generated content and the female or male in it appear
00:43:35.680 to be underage, but there are adults who look like they're underage, too.
00:43:39.680 Like, how do you regulate that?
00:43:41.740 Unless we just say the circumstances around it.
00:43:45.720 I don't know.
00:43:47.300 I don't know.
00:43:48.120 Ban it all.
00:43:48.600 I think the libertarians would freak out and be like, you can't just tell people they
00:43:52.720 can't have this stuff.
00:43:54.000 There's, you know.
00:43:55.580 Yeah.
00:43:56.020 I mean, I guess right now it's like a theoretical problem, you know.
00:43:59.480 It's so like.
00:44:00.840 Well, AI generated porn is all over the Internet already.
00:44:06.360 That's true.
00:44:07.140 Yeah.
00:44:07.520 Yeah.
00:44:08.000 No, it's concerning.
00:44:08.840 So we talked about this.
00:44:11.860 So how do you pronounce it?
00:44:13.080 Sochi.
00:44:13.600 Sochi.
00:44:14.720 The what's her name?
00:44:17.080 The woman from Dr. Strange.
00:44:19.780 Oh, I don't know.
00:44:20.320 Let me pull it up.
00:44:21.600 Dr. Strange.
00:44:22.500 Xochitl Gomez.
00:44:23.700 I don't know how you pronounce it.
00:44:24.680 Xochitl.
00:44:25.700 She's underage.
00:44:27.180 And they were deep faking her face onto adult women's bodies in porn videos.
00:44:33.820 And the mom found out this is going on like she's a minor.
00:44:38.560 What do you what do you do?
00:44:40.780 Like this is getting crazy.
00:44:42.000 Yeah, her face on the body, but the body is an adult woman.
00:44:44.800 So there's a new act being introduced called the PROTECT Act, and it clearly defines, you
00:44:51.020 know, expansive ways that people can be sexually exploited online, including through AI pornography.
00:44:59.820 And I think something like that could more clearly define in the law since the law is
00:45:05.300 so far behind technology.
00:45:07.120 The U.S. Senate just reintroduced that.
00:45:08.900 Yeah.
00:45:09.120 So hopefully that passes.
00:45:13.680 Yeah.
00:45:13.840 So this is actually just this story is from two weeks ago.
00:45:17.540 17-year-old Marvel star, Dancing with the Stars performer Xochitl Gomez.
00:45:21.400 I'm probably pronouncing that wrong because it's pronounced.
00:45:23.800 It's, it's spelled, you know, Xochitl.
00:45:26.960 Sorry.
00:45:27.260 I think it's Xochitl.
00:45:30.220 Is that what it is?
00:45:31.140 Spoke out about finding non-consensual sexually explicit deepfakes with her face on social media
00:45:35.140 and not being able to get the material taken down.
00:45:36.780 But most importantly is she's 17.
00:45:39.260 Yeah.
00:45:39.980 But this is the.
00:45:42.520 You know, I default to ban it all.
00:45:44.920 Like, I'm not saying you we will literally end up doing that.
00:45:47.120 I'm going to go big ask on this one.
00:45:48.900 Art of the deal, Donald Trump.
00:45:50.240 I want it all banned.
00:45:51.520 It's all illegal.
00:45:52.240 OK, now let's negotiate on where the lines are that we're going to allow.
00:45:59.480 Yeah, I mean, on a personal level, I totally agree. On a realistic level,
00:46:05.800 my brain kind of goes, in a similar way, to how public opinion has changed so much over
00:46:05.800 the years surrounding cigarettes.
00:46:08.720 Why not try to educate the public about the true societal and health implications of what
00:46:14.600 pornography does to you?
00:46:16.180 Well, you know, I'm really proud of millennials and Gen Z because they don't drink soda anymore.
00:46:22.640 Soda sales are rapidly declining.
00:46:24.840 And everybody likes drinking Spindrift.
00:46:28.080 This is the craziest thing.
00:46:29.900 It's like a weird thing that happened.
00:46:31.460 So.
00:46:33.100 My doctor tells me you should put lemon juice in water and drink that because in 2014,
00:46:37.720 I got a kidney stone, probably, I don't know, drinking soda and Gatorade or who knows what.
00:46:41.980 And so my doctor's like lemon juice and club soda.
00:46:45.480 Drink that.
00:46:46.120 You're good to go.
00:46:46.860 And I'm like, OK.
00:46:48.320 And now everybody drinks this stuff.
00:46:51.020 Like everywhere I go, everyone's got Spindrift and I love it.
00:46:53.900 It's club soda with a little bit of fruit juice in it.
00:46:56.080 I'm like very low sugar, almost none.
00:46:58.720 And this is what soda should be.
00:47:00.520 Our generation is destroying the soda industry.
00:47:04.080 And I'm like, how did we do something so awesome?
00:47:06.220 Like there's 50 grams of sugar in one can of soda.
00:47:09.780 And that's supposedly like 100% of your daily sugar intake.
00:47:14.360 And we've all kind of realized this is a bad thing.
00:47:16.540 We're getting rid of it.
00:47:17.500 I wonder if there's a similar cultural zeitgeist that could emerge where we're like, yeah,
00:47:22.100 this stuff's melting our brains.
00:47:23.700 Let's let's not because of government force, but because of a cultural like movement.
00:47:29.460 People just say, I'm not going to go anywhere near that stuff.
00:47:32.020 It's really bad for you.
00:47:33.340 You know?
00:47:33.960 Yeah.
00:47:34.280 We even have celebrities like Billie Eilish talking about how messed up pornography made
00:47:42.660 her feel and made her brain.
00:47:44.940 She was talking about how she just wanted to watch the next more extreme thing.
00:47:50.380 She said it was really disturbing.
00:47:51.840 She didn't like where it was bringing her.
00:47:53.240 So she just had to stop.
00:47:54.340 And I personally was really surprised that Billie Eilish talked about that.
00:47:59.540 And I wasn't expecting a celebrity to come out against the adult industry because it's
00:48:05.200 the cool thing to side with the adult industry, right?
00:48:07.760 Yeah.
00:48:08.280 Especially with OnlyFans, you've got a lot of these.
00:48:10.880 This is the crazy thing about the porn industry is, man, there are people online that are so
00:48:17.700 into it.
00:48:18.500 If you even say something like, hey, this is bad.
00:48:20.520 You shouldn't do it.
00:48:21.040 They will try their best to mock and ridicule you into being scared to speak out against
00:48:25.700 it.
00:48:26.260 Yeah.
00:48:26.340 You know, Jordan Peterson comes out and he's like, porn is bad.
00:48:28.560 Don't do it.
00:48:29.020 Clean your room.
00:48:29.740 And everybody's like, you know, he's got a point here.
00:48:32.020 And then instantly you get these people being like, ha, I'm a really cool young guy and
00:48:35.720 you're stupid.
00:48:36.760 And I'm like, I don't care what you think.
00:48:38.140 Like this stuff's clearly destroying people's brains, like melting their faces.
00:48:42.800 But I was, I mean, that's kind of crazy actually that Billie Eilish was the one to speak up.
00:48:46.540 I didn't expect that.
00:48:47.900 Mostly because Billie's female.
00:48:50.160 And so the expectation is that the majority of people who are addicted and going to these
00:48:55.360 websites are dudes.
00:48:56.380 Right.
00:48:56.780 But is that true?
00:48:58.340 I think the gap is kind of closing because I think someone, I don't know how trustworthy
00:49:04.920 of a survey it was, but they reported like 25% of females now regularly use pornography.
00:49:11.360 And that was surprising.
00:49:13.180 25.
00:49:13.980 You know, I mean, Fifty Shades of Grey is like, people consider that to be porn for women.
00:49:19.560 Right.
00:49:20.020 Like erotica.
00:49:21.220 Yeah.
00:49:21.720 Yeah.
00:49:22.120 I don't know why that is.
00:49:23.400 I mean, do you guys like how much have you researched into like the psychology and the
00:49:28.020 behavioral stuff behind it?
00:49:29.360 Like, no, I don't know.
00:49:31.740 Not a whole lot.
00:49:32.520 And there's a lot of conflicting things.
00:49:34.280 You don't know exactly what's right.
00:49:37.080 What, you know, what study was skewed in some way.
00:49:41.260 I think it's very clear based on like real world observation what it does to someone.
00:49:46.460 And as a woman growing up, it was always very clear to me which men were regular porn
00:49:56.120 viewers and addicted to porn.
00:49:57.920 And I, like, I genuinely went out of my way to avoid those people in dating situations
00:50:03.160 because I thought it was extremely unattractive.
00:50:05.580 You know what?
00:50:06.420 You know what I'd be willing to bet?
00:50:08.020 I bet that the people that run these porn websites can see, I'd be willing to bet that
00:50:14.020 men are consistent viewers of pornography and women, as individuals, you'll have spikes
00:50:19.420 throughout the month.
00:50:21.420 Right.
00:50:21.780 That's probably true.
00:50:22.520 Yeah.
00:50:22.700 Because our hormonal cycles are so different.
00:50:25.420 Whereas guys are just like on 24 seven.
00:50:27.280 Women are like, okay, this day it's like a huge increase.
00:50:29.980 Like two days out of the month.
00:50:31.080 If, yeah, if you put like all men and all women up against each other, you'd probably
00:50:34.160 see a straight line.
00:50:35.360 But if you look at individuals, guys are probably like consistently watching on these days and
00:50:39.680 women, it's like in bigger waves, men are shorter waves or whatever.
00:50:43.180 Right.
00:50:43.480 But I feel like the future, uh, as we go in this direction, if we don't do something about
00:50:48.760 it now, I, I, I think like the, the, just the AI VR deep fake stuff will result in the
00:50:59.400 end of human breeding in the normal way.
00:51:01.740 You know, we see these sci-fi movies where they do like, uh, surrogacy.
00:51:07.620 They do like pod babies and there'll be like this idea of having a baby that the old fashioned
00:51:12.260 way.
00:51:12.440 What was that, um, brave new world.
00:51:14.580 Is that what they do in the brave new world?
00:51:15.860 What was that movie?
00:51:17.100 Gattaca?
00:51:17.880 Yeah.
00:51:18.340 Where the guy's like, uh, uh, it's not necessarily like everyone's genetically engineered, but
00:51:22.700 he's, he's not.
00:51:23.800 So he's inferior.
00:51:24.480 And then he has to like lie about identity or whatever.
00:51:27.540 The reason, the way, the reason we'll get to that world is not because women want to
00:51:31.820 be girl bosses.
00:51:32.440 And they're like, I have no time to carry a baby.
00:51:34.080 I'm the CEO of a company.
00:51:34.880 It's going to be because men and women will not be sexually attracted to each other.
00:51:39.540 Like they're going to, they're going to plug their brains into their neural link.
00:51:43.780 And there's going to be like a dragon, a polar bear, you know, robots.
00:51:49.600 And they're going to be like, this is the only thing that works for me.
00:51:51.860 And then when they're in the real world, like let's have kids.
00:51:54.120 They're going to be like, okay, I'll go into my pod, you into yours.
00:51:56.480 And then we will, you know, robo-inseminate or whatever the, whatever the
00:52:01.060 F. Like, people were talking about how, um, they sell insemination kits online and stuff
00:52:07.040 because people, they're not doing it anymore.
00:52:10.120 Women still want to get pregnant.
00:52:11.420 And so, but what is it?
00:52:12.620 Like, it's the weirdest thing to me when someone posted this in like one of our super chats
00:52:16.600 that women can go to, like, Target and buy applicators where the dude puts his, you know,
00:52:24.200 she gets some stuff from the dude and then she takes it and then she goes and then she does
00:52:27.180 the rest.
00:52:27.600 And I'm just like, I mean, it's an easier way to do it, you know what I mean?
00:52:30.860 Like, but they don't want to, I guess, or it's like some weird things happening where
00:52:34.480 there's no relationship forming.
00:52:36.160 And I wonder if outside of just like, we're talking about porn addiction and all that
00:52:40.200 regular familial relationships will be eradicated by this.
00:52:44.480 Yeah.
00:52:44.680 I mean, I think as a culture, we are seeing short term pleasure being prioritized over long
00:52:50.860 term fulfillment and it shows itself in a lot of different things, but especially within
00:52:56.780 relationships.
00:52:57.800 And I think it's really sad to see.
00:52:59.720 I do think online porn has a lot to do with that.
00:53:02.260 Yeah.
00:53:03.280 I wonder if this is the apocalypse because they say, I'm half kidding, but they say, you
00:53:07.660 know, the world revolves around sex.
00:53:09.860 There are comedians and not even comedians, people have said that like every action a man
00:53:14.360 will take is related to trying to have sex or something like this, which I don't think
00:53:19.280 is true.
00:53:20.460 But the argument is like, why would a man want to be successful?
00:53:23.360 And if you look at the evolutionary psychology of it, the reason why a guy wants to be the
00:53:27.200 best hunter or, in the modern era, why he wants to be the best name in pro billiards or
00:53:32.320 whatever is the status, the success.
00:53:35.340 And it leads to you getting what you want.
00:53:37.320 And behind every action is the, the, you know, if you're a successful, powerful male who can
00:53:43.280 peacock successfully for the female, you will get the mate.
00:53:46.640 And so what happens to a world where nobody wants it or they can computer generate it?
00:53:51.680 Just end up like sitting on your couch all day eating Cheetos and having this beautiful
00:53:55.100 woman tell you everything you want to hear, but it's fake.
00:53:56.760 It's a robot.
00:53:59.000 It's terrifying.
00:54:00.340 Yeah.
00:54:01.240 Well, I don't know what the answer is.
00:54:02.640 It does seem like there's a political component too, though.
00:54:06.840 So I'll ask you guys if you, if, you know, in this regard, you mentioned this story, one
00:54:12.180 of the big components of the story that we should dive more into is the guy saying we're, you
00:54:19.700 know, like queerifying or whatever, disrupting the norms.
00:54:19.700 Did you find a political element with some of these employees?
00:54:23.240 Like it's activism to change society.
00:54:27.520 I don't think so.
00:54:28.760 Really?
00:54:29.080 Um, I think it's, I think, I think Dylan Rice even said in the undercover tape, like we
00:54:36.320 don't have any, you know, uh, cultural bias or anything.
00:54:40.660 We're just trying to figure out, you know, what content pushes what.
00:54:44.660 I do think it's one of those things where their bias is just natural and it does reflect
00:54:50.600 in their work.
00:54:51.100 I don't think you're going to be hired at a company like Pornhub if you're not, or if
00:54:57.280 you don't at least know how to, um, express pro adult industry viewpoints.
00:55:02.500 Yeah.
00:55:02.960 Yeah.
00:55:03.280 And it tends to be, uh, you know, like more progressive people, I guess, but you get a
00:55:10.280 mix of people from different political backgrounds, but it's just people.
00:55:14.460 I feel okay with, um, with internet.
00:55:18.640 I think the percentage of conservatives working in porn is 0%.
00:55:22.980 Yeah.
00:55:24.440 I mean, true, true conservatives, probably not.
00:55:27.040 I think there's a, probably a lot of like libertarian types and progressives and, um,
00:55:32.960 anybody who claims to be a conservative, like you go to a porn industry and they're like,
00:55:36.580 well, I'm actually a conservative.
00:55:37.480 No, you're not.
00:55:38.100 Yeah.
00:55:38.500 You're just saying that.
00:55:39.620 Yeah.
00:55:39.940 They're like nothing conservative about porn.
00:55:42.160 Right.
00:55:42.700 0%.
00:55:43.580 Libertarians, you probably get a lot of it.
00:55:46.300 Yeah.
00:55:47.020 There was like some famous moment at the libertarian debates where they were arguing over whether
00:55:50.800 they can sell heroin to children.
00:55:52.400 And when someone, I think it was Gary Johnson, was like, no, he got booed.
00:55:56.480 Oh man.
00:55:57.980 What?
00:56:01.700 And the argument is the market will decide, but parents should figure it out, the government
00:56:01.700 shouldn't be involved.
00:56:02.460 And I'm like, I mean, I get it, but there's gotta be some limits, right?
00:56:06.720 But yeah, I mean the industry, the adult entertainment industry as a whole is predatory.
00:56:13.080 And a lot of people don't realize like the owner of Pornhub is a former defense attorney
00:56:18.740 of child sex predators.
00:56:20.860 He defended, yes, child pornographers, child sex predators, child sexual abusers.
00:56:27.440 And he even, you know, spoke at this big attorney's conference, coaching other defense attorneys
00:56:34.440 of child sex predators, how to get shorter sentences for their clients.
00:56:38.340 I mean, this was his life.
00:56:39.780 This was how he was known.
00:56:41.060 He was looked at as one of the top defense attorneys for this type of subject matter.
00:56:46.180 And now he owns Pornhub.
00:56:47.800 Do you guys consider yourselves conservative or right wing?
00:56:50.980 Yeah, definitely.
00:56:52.520 And I wonder, like, is that a motivating factor for why it's like, we need to figure out what's
00:56:55.500 going on with the porn industry?
00:56:57.480 I think it's part of it.
00:56:58.900 I think a large part was, I just saw nobody else going after it.
00:57:03.300 Like, we had these skills that we learned from James and other friends at Project Veritas.
00:57:09.260 And nobody, nobody had ever gone.
00:57:13.280 I talked to other people who were kind of in the anti-exploitation arena.
00:57:17.560 And everybody was like, yeah, nobody's really gone undercover in talking to Pornhub employees
00:57:24.160 before.
00:57:25.060 And we knew what a touchy subject pornography is to people all across the board, no matter
00:57:30.120 what you believe.
00:57:31.340 And I think we just both wanted to, like, push the red button, like push the button you're
00:57:36.140 not supposed to push.
00:57:37.600 I think in this case, it needed to be investigated.
00:57:41.280 I mean, I mean, holy crap, the stuff that these guys are saying to you.
00:57:44.000 So we're talking about human and child sex trafficking and they're like, yep, and it
00:57:48.360 makes money and they're not in prison.
00:57:50.800 That's insane to me.
00:57:52.220 It's, this stuff was a conspiracy theory before.
00:57:54.780 Epstein Island was like, oh, you're crazy.
00:57:57.040 These people are not involved in this.
00:57:58.240 And now it's like, I got questions about how these guys can operate a website that has child
00:58:02.280 trafficking on it and they're not in prison.
00:58:04.240 The worst that happened was they had to stop doing the thing.
00:58:08.540 Wow, man.
00:58:09.280 Yeah.
00:58:10.040 And, you know, if they aren't caught continuing to profit off of sex trafficking, then their
00:58:16.740 criminal charges are dropped completely, which is crazy.
00:58:20.540 Can you guys, are you willing or able to talk about the techniques involved in the undercover
00:58:25.900 investigation or will that, like, blow the operation up?
00:58:29.180 I mean, to some degree.
00:58:30.480 I mean, we love that part.
00:58:31.700 We can say some, yeah, I mean, that's fine.
00:58:33.940 Undercover stuff.
00:58:34.440 All right.
00:58:34.660 So how does it start?
00:58:35.420 What do you guys do and how does this begin?
00:58:36.660 The internet.
00:58:39.180 We look up who works for the company and we see, we try to get as many names as possible,
00:58:46.220 details about their role, what they do there.
00:58:48.980 And we really just use all publicly available information we can find on each person to find
00:58:55.700 creative ways to make contact with them.
00:58:58.240 Well, what's one creative way?
00:58:59.980 Can you give us an example?
00:59:01.160 Yeah.
00:59:02.000 Talk about Sean.
00:59:03.280 Well, before you say it, I'll just tell you, my understanding is it's like Tinder.
00:59:07.780 You go on and you swipe until you find someone, but that's, that's like casting a wide net.
00:59:13.160 That's like finding a needle in a haystack.
00:59:14.760 Yeah.
00:59:15.400 Interesting thing about that.
00:59:16.760 First was this one guy, he, he was making a phlebotomy app.
00:59:21.700 Uh, that's like phlebotomy is where you're like a nurse who, who draws blood.
00:59:26.780 And I don't really know why, why was he making a phlebotomy app?
00:59:29.360 Oh, I've, I've seen these.
00:59:30.560 They have things where you can like put the phone over the arm or like put a light on your
00:59:35.060 arm and it shows the veins or something like that.
00:59:37.520 Oh yeah.
00:59:38.080 I mean, his app was like for scheduling patients and things like that.
00:59:40.840 Oh, okay.
00:59:40.980 That's much more boring.
00:59:41.740 Yeah.
00:59:42.140 It was very, you know, um, just yeah.
00:59:45.820 Random for him.
00:59:46.760 Cause he's not in the medical industry or anything.
00:59:48.820 Yeah.
00:59:49.080 He's in the porn industry, but this was like his side hustle that he was doing with a friend.
00:59:52.960 Yeah.
00:59:53.380 And, and so Arden was like, all right, I'll be a phlebotomist and like, uh, messages him
00:59:59.020 or something says that she's a nurse and she's doing some contract work and she's in town.
01:00:03.760 She really likes the app.
01:00:05.160 And, uh, and then, yeah, I got a couple of meetings out of that, you know, pretending to be
01:00:10.840 a phlebotomist, which she's never been in real life.
01:00:13.060 So hold on.
01:00:13.640 Like I was just hoping for no research on phlebotomy.
01:00:16.840 I mean, but this guy's not gonna know anything about it either.
01:00:18.940 Yeah.
01:00:19.340 And you, and, and, and, you know, if you're good, you can always just, he'll be like,
01:00:23.020 so when you've done insert technique, you'd be like, oh, we don't even do that anymore.
01:00:26.360 Like that's an old school technique.
01:00:27.760 Well, the new, and you just make it up and he's gonna be like, oh, okay, I guess.
01:00:31.080 I don't want to bore you.
01:00:32.480 Yeah.
01:00:33.400 But like the, oh man, the, the, the trope with like project Veritas and these undercover investigations,
01:00:39.720 it's just like some guy meets some woman off Tinder and she's like, let's go grab drinks.
01:00:45.920 And then he's like, I know what will impress this woman.
01:00:48.320 Corporate malfeasance I'm involved in.
01:00:50.120 Let me tell you about the crimes I've committed.
01:00:51.720 And it's like, wow, tell me more.
01:00:52.960 And how do they just fall for this?
01:00:54.500 And that, that is kind of how it goes.
01:00:56.640 Yeah.
01:00:57.620 Um, yeah, we did a couple of dating app ploys.
01:01:01.620 Wait, wait, wait, hold on.
01:01:02.480 Like you message a guy claiming to be a phlebotomist.
01:01:06.220 Yeah.
01:01:06.460 Interested in phlebotomy.
01:01:07.860 Yeah.
01:01:08.160 And then you're like, tell me about the child trafficking you're involved in.
01:01:10.500 He's like, sure.
01:01:11.880 People like talking.
01:01:13.160 Essentially.
01:01:13.900 I just, I, you know, I asked him a bunch of questions that I am truly genuinely curious
01:01:19.280 about.
01:01:19.780 And so, uh, that was all coming from a very truthful place.
01:01:23.420 He's like, I know what's going to get me laid telling my date about this child trafficking.
01:01:27.600 That's crazy.
01:01:28.880 There's, there's this short book, very, very short book called
01:01:32.460 It's Not All About Me.
01:01:33.880 It's by this former FBI special agent.
01:01:36.760 Uh, and it talks about how to get information from short-term relationships with people.
01:01:42.160 Highly recommend people read it.
01:01:43.520 It'll take like two hours, an hour and a half.
01:01:45.480 It's a little tiny booklet.
01:01:46.880 Teeny tiny.
01:01:47.640 People should read it, uh, and then go do this stuff themselves.
01:01:50.840 Like in real life, even just to improve relationships with people.
01:01:53.500 It's about listening to people, knowing what they want.
01:01:56.320 Uh, and it's, it's very, very effective.
01:01:58.500 Yeah.
01:01:58.720 So the first thing you have to do is pretend to be a phlebotomist, but, but, but like, what
01:02:05.480 else could you explain?
01:02:06.080 You mentioned there was dating app stuff.
01:02:07.460 Is it a dating app stuff?
01:02:08.960 That was crazy because I made an account on a dating app was swiping in Montreal where their
01:02:17.580 headquarters are.
01:02:18.440 And one of the first profiles I came across was of Mike Farley, who's an 11-year employee
01:02:24.080 and was the seventh guy ever to be hired.
01:02:26.140 Did you super like?
01:02:27.760 So his profile, all it said was tech.
01:02:31.620 It didn't say a company name or anything, but I decided to just ask him where he worked.
01:02:37.840 And the first thing he said was Pornhub.
01:02:39.820 But that's crazy.
01:02:40.440 So you, you're, was it Tinder?
01:02:43.200 Uh, no.
01:02:44.580 Bumble, I think.
01:02:45.080 Bumble.
01:02:45.500 Yeah.
01:02:45.660 So that, that's kind of crazy to me.
01:02:47.640 I mean, there, there's millions of, how many million people live in Montreal?
01:02:50.800 Yeah.
01:02:51.000 Millions of people.
01:02:51.400 And you open it.
01:02:52.160 And there's like 800 people.
01:02:53.140 And you recognized him.
01:02:54.620 No, no, no, I didn't.
01:02:55.960 I just had, he, I saw his photo.
01:02:58.900 He just said.
01:02:59.900 Tech.
01:03:00.300 Product management at tech.
01:03:02.420 Oh, so you're just like chatting up anybody.
01:03:05.300 Sometimes.
01:03:06.180 Yeah.
01:03:06.780 I decided to just like try as much as I could.
01:03:09.660 I, I spent all day, you know, chatting with people, but within the first two hours, you know,
01:03:13.680 I swipe right, right on this guy.
01:03:15.460 And I just say, um, Hey, what kind of tech do you do?
01:03:19.700 And he said, I'm the product manager at Pornhub.
01:03:23.220 And I was like, are you scared that one day you'll come into this moment where it's like,
01:03:28.900 you know, you're under the, you're, you're, you're, you're doing the sting operation.
01:03:32.620 He's giving you information and then, you know, he, he discovers it and you're like
01:03:37.740 outside in the rain and he's like, I trusted you.
01:03:41.240 I loved you.
01:03:41.940 And you're like, I'm sorry.
01:03:43.220 Like, is there a risk of you actually falling for somebody?
01:03:45.860 Uh, no, no, I, I think I'm safe from that.
01:03:50.720 Um, and also we definitely limit the amount of meetings and what we do at the meetings
01:03:58.420 in order for no lines to be crossed.
01:04:01.340 There are, uh, stories in law enforcement.
01:04:04.400 Uh, this is an old story.
01:04:06.000 I don't know the details on it, but there was like a guy who I think he was like FBI or something.
01:04:10.000 And he infiltrated eco terrorists and he ended up cause they're allowed to sleep with their
01:04:14.640 targets.
01:04:15.640 Absolutely.
01:04:16.640 Yeah.
01:04:16.800 Yeah.
01:04:16.920 Yeah.
01:04:17.060 On the job, they're allowed to have sex with the people they're investigating.
01:04:20.320 What?
01:04:21.020 And he ended up falling in love with her and then refused to testify against her.
01:04:25.220 And they're like, this is your job.
01:04:27.040 You're going in there.
01:04:27.560 And he's like, Nope.
01:04:28.460 I wonder what like the ethic.
01:04:30.960 Yeah.
01:04:31.220 Cause that definitely for us, like would totally cross the line.
01:04:35.560 We're not like, yeah, no.
01:04:37.360 Yeah.
01:04:37.480 Law enforcement, something wild.
01:04:38.840 I mean, I, I'm, I could be totally wrong about this.
01:04:40.940 My understanding is they can do drugs too.
01:04:42.720 Hmm.
01:04:43.420 Like if you're, if you're going undercover and you're going into a gang, like the trope
01:04:47.560 in the movies is there'll be like, all right, prove you're not a cop, do the drug.
01:04:51.320 And then they'll be like, I'll do the drug.
01:04:52.660 And they do.
01:04:53.240 And it's, and then, and then afterwards they're like sneezing and spitting.
01:04:55.720 Like, I can't believe I had to do the drug.
01:04:56.960 And it's like, no, if you don't, they're going to know you're, you're a cop.
01:04:59.820 Like you, you're not doing this unless you're, you're in or whatever.
01:05:03.400 Yeah.
01:05:04.340 Yeah.
01:05:05.200 Police.
01:05:05.800 Yeah.
01:05:06.260 Law enforcement can do a lot of things we can't do.
01:05:08.500 Right.
01:05:09.000 Well, I mean,
01:05:09.580 We operate within the law and more, a lot more, but dating a guy is within the law.
01:05:13.560 It's just like, it's kind of dirty to go that far with your, Oh yeah.
01:05:18.440 Yeah.
01:05:19.160 Yeah.
01:05:19.540 For sure.
01:05:20.400 Plus it's kind of like, it's kind of, I guess in the instance of a, of a fed meeting an eco
01:05:25.520 activist, they're concerned about terrorism.
01:05:27.680 It's like, okay, like, what's the worst thing you're dealing with?
01:05:30.840 They might smoke too much weed or whatever, but we're not like, if the woman genuinely is
01:05:34.320 not blowing up train tracks, like then there's like, what's the real problem?
01:05:37.520 In this instance, I suppose you're meeting a guy who's bragging about, you know, facilitating
01:05:40.560 child trafficking.
01:05:41.200 It's like, I'm pretty sure we're not going to be into that or someone like that.
01:05:44.880 Yeah.
01:05:45.660 But, uh, but you know, what, what else is there?
01:05:47.400 Like, how does this take place?
01:05:48.600 You message this guy, he hits you up on Bumble and then you just ask him for a date or what?
01:05:52.940 Yeah.
01:05:53.260 And, and usually try to wait till I'm the one asked.
01:05:56.440 So it's not like weirdly pushy for a girl, you know, it's kind of crazy how simple it all
01:06:01.900 sounds.
01:06:02.620 Yeah, really?
01:06:03.040 No, it genuinely is very simple.
01:06:05.660 Yeah.
01:06:06.300 Yeah.
01:06:06.680 And then you have like pinhole cameras or like, how does that work?
01:06:09.720 Yeah.
01:06:09.980 I have a lot of different kinds of cameras.
01:06:12.340 Um, but one of them, you know, it looks like, it looks like a phone, which is kind of obvious,
01:06:18.560 but it has the camera that's actually.
01:06:21.280 In the top of it?
01:06:21.860 On the top of it.
01:06:22.760 So if you put it like that, it films upward.
01:06:25.560 Is it a phone though?
01:06:26.920 No, it's not a real phone.
01:06:28.340 Wait, what?
01:06:28.800 So it looks like a phone.
01:06:29.720 It just looks like a phone.
01:06:30.080 It would just have like a lock screen.
01:06:31.760 Yeah.
01:06:31.940 Yeah.
01:06:32.240 It'll very much look like a real phone, but you can't make calls from it.
01:06:35.700 But when, does it have apps or anything on it?
01:06:38.120 No, no.
01:06:39.040 Yeah.
01:06:39.620 It's just all like fake icons.
01:06:42.080 Really?
01:06:42.920 But like digital screen.
01:06:44.340 Yeah.
01:06:44.600 Digital screen.
01:06:45.360 And then, yeah, it's just like the camera is a tiny little hole right there.
01:06:48.460 So basically, if someone puts their phone under the table, pointing at me like that, I got
01:06:52.420 to go, hey, you get that phone out of there.
01:06:53.720 Yeah, honestly.
01:06:54.700 Yeah, you got to look for the little lens.
01:06:56.420 Yeah.
01:06:56.640 Yeah.
01:06:57.200 Wow.
01:06:58.020 Any other cool tech?
01:07:00.520 It's not very advanced, to be honest.
01:07:02.140 It really hasn't.
01:07:03.340 We're looking for more.
01:07:04.400 We're looking for better.
01:07:05.360 Yeah.
01:07:06.100 You know, I know James has been looking for more.
01:07:08.280 I mean, we're still using the stuff that we've used at Project Veritas for, oh yeah, we've
01:07:13.720 still looked, we're still looking, you know, we're still using the stuff that we've used
01:07:18.840 at Project Veritas for five years or whatever.
01:07:19.580 Can I Google that?
01:07:20.320 How do I Google that?
01:07:20.920 Fake camera phone?
01:07:22.840 Yeah, what would you Google?
01:07:23.440 Fake phone, camera?
01:07:25.840 LawMate cell phone, maybe?
01:07:27.860 LawMate?
01:07:28.720 Yeah.
01:07:28.960 Trying to look up LawMate?
01:07:29.780 LawMate cell phone cam.
01:07:31.800 Yeah.
01:07:33.120 Oh yeah, look at this.
01:07:35.660 Oh, that's super cool.
01:07:37.300 Yeah, you found it?
01:07:38.120 Yeah.
01:07:38.420 I think I did.
01:07:39.180 Yeah, that's it.
01:07:40.340 And like the screen pops up, but it's fake icons.
01:07:43.200 Yo, that's a Windows phone.
01:07:44.400 Who's going to fall for that?
01:07:45.760 Actually, so that's like.
01:07:46.920 If anybody ever goes on a date with you and they bought a Windows phone, you should be asking
01:07:49.740 questions regardless.
01:07:51.920 Yeah, that's like the functional screen.
01:07:54.320 But if you lock and press the button on the side, it actually looks like an iPhone screen.
01:08:01.480 Like a lock screen.
01:08:02.500 Yeah.
01:08:03.540 There's a, James O'Keefe just did this undercover video of an employee of the
01:08:11.000 executive office of the White House.
01:08:12.180 Yeah.
01:08:12.600 Yeah.
01:08:12.960 And I think it's actually a massive scoop.
01:08:14.660 I mean, this guy says Michelle Obama does not want to run.
01:08:18.900 Yeah.
01:08:19.100 They've actually had discussions about replacing Kamala Harris.
01:08:21.920 They acknowledge Joe Biden's in serious mental decline, though not clinically diagnosed.
01:08:26.780 But a lot of people are questioning the ethics of James O'Keefe went on a date with a guy
01:08:32.000 who's not committing any crimes.
01:08:33.760 And this guy is is just explaining what's going on behind the scenes.
01:08:39.160 I think that knowledge that in the White House office top employees know Biden is unfit.
01:08:46.320 Kamala Harris is unpopular, can't win.
01:08:47.700 These things are important for the American people to know, but it doesn't rise to the
01:08:51.280 level of criminal activity.
01:08:52.780 It's just someone should have told us for political reasons.
01:08:56.940 And now people are questioning the ethics of I'll say this.
01:09:01.080 I feel bad for this guy.
01:09:02.680 He's on a date with James.
01:09:04.000 But like, you're a cybersecurity expert sitting in front of James O'Keefe.
01:09:08.120 You probably shouldn't be talking.
01:09:09.300 Yeah.
01:09:09.860 Like, don't go on dates and spill the beans about all the inner workings of the White House
01:09:14.900 office.
01:09:15.960 So the information gained from it, the guy's failure at security and his willingness to
01:09:21.140 give up all this information, I don't feel bad enough.
01:09:23.820 I'm not going to be like, that was wrong.
01:09:24.960 I'm like, no, I think James got some good work there.
01:09:27.180 But part of me does feel bad.
01:09:28.180 This doofy guy was going on a date.
01:09:30.180 And yeah, you know, and you know, with these with undercover things in general, I think
01:09:34.700 so long as the intention of the journalist is to expose the corruption that's going on
01:09:40.980 and not to humiliate the person that they're investigating, I don't think that should ever
01:09:45.980 be the intent.
01:09:48.200 James really humiliated this guy.
01:09:49.880 Oh, I'm sure he was embarrassed.
01:09:51.860 I didn't see the second part.
01:09:52.640 You know, I gotta be honest, it's one of the best videos James has ever done.
01:09:56.860 I do kind of feel bad for the guy.
01:09:57.960 I get it.
01:09:58.720 But I think the public right to know that within the White House, they're concerned
01:10:02.580 about Biden's mental fitness is, it's too important for this country.
01:10:06.160 I know it's not a legal criminal activity, but I think it's the benefit to the public.
01:10:11.520 I think that guy should have publicly stated, hey, guys, we are seriously concerned about
01:10:16.680 Biden's mental fitness.
01:10:18.760 Yeah.
01:10:19.120 That being said, when James says and he pulls his glasses off, what are you doing in a
01:10:24.100 meeting with James O'Keefe?
01:10:25.000 And the guy's like, oh, I've heard of you.
01:10:26.240 It's like, oh, man.
01:10:29.000 Wow.
01:10:30.140 It was good television.
01:10:31.500 Yeah.
01:10:31.800 I really like he had a good B-roll shot.
01:10:34.080 I don't know who was doing his B-roll.
01:10:35.740 They picked the camera.
01:10:36.360 It was like, yeah, really good.
01:10:38.080 You know, he like turns to the B-roll guy, takes off his glasses.
01:10:41.540 I'm James O'Keefe.
01:10:43.080 So good.
01:10:44.580 Like Clark Kent.
01:10:45.660 Yeah.
01:10:46.120 Yeah.
01:10:46.440 Oh, man.
01:10:47.040 I wonder if this guy just didn't know who James was, though.
01:10:50.600 Like he may have heard of Veritas.
01:10:52.420 But James also brings up a really good point.
01:10:54.380 And this is why I'm not going to, I have no sympathy for this guy who fell for this or
01:11:00.120 whatever, if you're working cybersecurity and you don't have an assessment of security and
01:11:06.120 information threats or anything like that, to the extent that you would sit down with
01:11:10.180 the most, the preeminent investigative journalist in this country.
01:11:14.340 Yeah.
01:11:14.920 I'm sorry.
01:11:15.340 I got no sympathy for you.
01:11:16.660 Like if, if, you know, if you walk into a lion's den trying to pet a lion, I'm going
01:11:23.540 to be like, we'll try and get you out.
01:11:25.740 But dude, you, there was a video of like a guy jumps into a gorilla exhibit and the gorilla
01:11:30.760 just like beat him to death.
01:11:31.680 And I'm like, well, yeah, I don't want that to happen.
01:11:35.360 You know what I mean?
01:11:35.900 But like, what are you going to do if someone chooses to do that?
01:11:38.300 You know, like in the DNC, they have posters, they have trainings showing James's face,
01:11:43.600 showing everybody that James has ever met with at any point, anybody who was ever friends
01:11:48.800 with anybody at Project Veritas.
01:11:50.140 They even have tutorials on like what kind of cameras that are used.
01:11:53.800 And I remember talking with someone who worked at Planned Parenthood who said they have trainings
01:11:59.020 to spot undercover cameras because of Project Veritas.
01:12:02.020 Because they're evil.
01:12:02.900 I'm sorry.
01:12:04.200 Because they're evil.
01:12:05.500 Right.
01:12:05.700 There was one point where, uh, I tweeted something, there's a story that Veritas put out and I
01:12:11.560 tweeted something like, I didn't think it was that big of a deal.
01:12:13.980 And then James responded something with like, we know, and a picture of me from an undercover
01:12:18.680 camera.
01:12:19.420 Why?
01:12:19.980 Yeah.
01:12:20.200 And there were people who were like, because this is, this is years, it's like 10 years
01:12:23.220 ago or eight years ago.
01:12:24.000 And there are people like, James, why are you secretly recording Tim?
01:12:26.220 He's cool.
01:12:26.640 And I'm like, I don't care.
01:12:27.640 Like, bro, you could turn the camera on and film literally everything I say.
01:12:31.000 I say everything that I'm thinking three or four hours a day, like, I don't care, film me,
01:12:35.400 whatever.
01:12:35.760 The only, the only concern I have for like someone coming in here and secretly recording
01:12:39.180 is if there's someone's health and safety and privacy, like someone got cancer.
01:12:44.520 You're like, well, dude, come on.
01:12:45.520 Like that's not, none of our business, not public right to know.
01:12:47.380 Right.
01:12:47.700 But when it comes to these guys at Pornhub, when it comes to Planned Parenthood or whatever,
01:12:53.080 and the DNC, and they're like, we can't let anyone find out what's really going on.
01:12:56.640 I'm like, if you come to me and you say, I am concerned that you may leak some private
01:13:04.860 information about a person, which is not public right to know.
01:13:07.400 I can respect being concerned about that privacy wise.
01:13:10.520 But if a political organization is saying, don't let them film you, it's because they
01:13:14.100 know that their internal workings, they're doing bad things.
01:13:17.420 Right.
01:13:18.300 Like the trade secrets I get.
01:13:20.180 Yep.
01:13:20.540 If this was like, we're a technology company, be wary of people who want to steal and spy
01:13:23.720 on us.
01:13:24.120 That's not James O'Keefe.
01:13:25.260 Right.
01:13:26.120 My response to you guys or James O'Keefe is if you were filming me secretly, I'd be
01:13:30.280 like, oh, I hope you get a million hits on it.
01:13:32.080 Cause it'll be just, if you show a video to all your people like,
01:13:37.000 aha, we caught Tim talking about the gold standard.
01:13:39.260 I'd be like, oh, okay.
01:13:40.300 You know, all right.
01:13:41.220 Yeah.
01:13:41.660 Or like he wants, he wants regulation on porn or something.
01:13:44.360 And the libertarians are mad.
01:13:45.200 I'd be like, well, okay.
01:13:45.900 You know, I do say that.
01:13:47.420 And, and there was a time like in the late 1800s, early 1900s when undercover journalism
01:13:52.920 was more commonly accepted as legitimate journalism.
01:13:57.540 Nellie.
01:13:57.780 And it went out of trend.
01:13:59.080 Nellie Bly.
01:13:59.680 Nellie Bly.
01:14:00.740 Famously.
01:14:01.360 She went to that mental institution.
01:14:03.440 And pretended to be insane and, and documented their abuse of the patients there.
01:14:08.240 And it was a huge story.
01:14:09.980 And we had, uh, other popular undercover journalists all over the world.
01:14:14.780 And then, um, you know, the journalistic award ceremonies stopped awarding undercover
01:14:21.200 journalists awards.
01:14:22.600 So people really stopped doing it cause they knew they wouldn't be rewarded.
01:14:25.680 I'm sure James has told you this story, but, uh, what, what was the, the grocery store
01:14:30.280 chain, uh, lion something, uh, yeah, Food Lion.
01:14:34.980 Um, there was a CBS station that investigated them undercover and then Food Lion sued them.
01:14:42.800 And then Food Lion ended up winning, but only like $1 or something.
01:14:46.500 But it just, the, the, the lawsuits lawfare just ended up scaring away lots of mainstream
01:14:52.640 journalists from ever doing actual investigative or undercover journalism.
01:14:56.040 There was, uh, some story that came out from Veritas rest in peace Veritas, by the way.
01:15:01.100 And, uh, it was, you know, undercover story exposes this and the media runs full, full speed.
01:15:07.540 It's edited.
01:15:08.600 It's, it's manipulative.
01:15:09.960 This undercover stuff is dirty.
01:15:11.240 And then like a month later, you got channel four.
01:15:13.780 I think it was in the UK undercover video of anti-immigration activists.
01:15:17.860 And it was the greatest journalism ever done.
01:15:19.920 This is what's wild.
01:15:21.360 These, these people are evil.
01:15:23.120 So I'll give you another explanation, another example of just, that's the easiest one for
01:15:27.680 us as it pertains to undercover journalism.
01:15:29.340 But look at when Vice Media embedded with the white nationalists in Charlottesville, they
01:15:34.620 win every award.
01:15:35.340 They get all these accolades and awards.
01:15:37.020 Everyone's like, wow, such great work.
01:15:39.500 When any anti-establishment journalist does quite literally the same thing.
01:15:42.940 They say you're friends.
01:15:44.280 Yeah, no, it's true.
01:15:45.480 And Vice did this whole undercover exposé on TripAdvisor and how TripAdvisor is rigged and it
01:15:51.060 got really popular.
01:15:51.940 But when it's about subject matter that really matters, suddenly undercover journalism is
01:15:57.100 the devil.
01:15:58.000 If you're, if you're a part of the regime.
01:15:59.660 So when these journalists try to expose the inner workings of anti-immigration activists,
01:16:06.060 I should say anti-illegal immigration activists, they're celebrated because they oppose the regime,
01:16:13.360 they oppose the establishment.
01:16:14.420 And if you're, say, Sound Investigations or, formerly, Project Veritas or O'Keefe Media Group,
01:16:19.640 you are actively against the corruption of the machine and they don't want that exposed.
01:16:25.800 So you're bad.
01:16:27.740 And it is remarkable the cognitive dissonance that exists among the people willing to support
01:16:31.600 this.
01:16:31.900 I think the challenge many people need to come to terms with is if there is a human being
01:16:37.780 who can look at two specific, like, identical circumstances, but one is good for them and
01:16:45.580 one is bad for them.
01:16:46.400 And so they argue, like, okay, we've got corruption in two instances, but this corruption benefits
01:16:51.940 me, therefore it's good.
01:16:53.360 That's what we are dealing with on a large scale.
01:16:55.980 Like, I wouldn't, have you guys gotten any attacks in the media over this stuff?
01:17:00.400 A couple.
01:17:01.220 I mean, they're all sponsored by the adult industry.
01:17:04.580 We have, like, adult video network writing hit pieces and stuff like that.
01:17:09.760 But we've gotten several legal threats from Pornhub's attorneys.
01:17:13.440 And it's really funny because, you know, they're mad, heated letters where it's like, you lied
01:17:19.560 about your identity and contacted one of our employees and met with him under false pretenses
01:17:24.200 and recorded him without his consent.
01:17:27.040 And we're just like, yeah, that's undercover journalism.
01:17:30.460 Like, it's not a crime.
01:17:31.460 It's also a little bit ironic.
01:17:32.180 Recorded without your consent.
01:17:33.460 It's like, yeah, that's what we're exposing on your website.
01:17:36.540 Exactly.
01:17:37.400 Oh, wow.
01:17:37.920 That's really fascinating, too.
01:17:38.960 Yeah.
01:17:39.320 It's like, yeah.
01:17:40.220 I think their exact words were something like, you non-consensually recorded and uploaded
01:17:45.540 videos of our employees.
01:17:47.640 And it was like, um, this is so tone deaf.
01:17:50.320 It's not even funny.
01:17:51.320 Did you respond?
01:17:53.180 No.
01:17:53.440 I'd respond with, like, that's quite literally what we exposed you doing.
01:17:57.280 Yeah.
01:17:57.760 Like, your employee was bragging about how you do that and don't care.
01:18:01.560 So why do you care that we did?
01:18:03.840 You're like, hey, you're doing something that's not nearly as bad as the thing that we're
01:18:07.600 doing.
01:18:08.900 I know.
01:18:09.460 You know, and they're demanding we take down our videos that we recorded and all these types
01:18:16.600 of threats and all the while saying that they're huge supporters of free speech, but
01:18:22.040 only when it comes to the adult industry.
01:18:23.800 When it's us, they don't care about our free speech.
01:18:25.600 They want us to delete all our videos.
01:18:27.540 How did you guys come to it?
01:18:29.420 Was this what kicked off the creation of Sound Investigations?
01:18:32.080 I mean, you were working for Veritas.
01:18:33.600 What happened with that?
01:18:34.340 Why'd you leave?
01:18:35.080 No, I left Project Veritas back in the summer of 2021, I think.
01:18:41.860 And then I was originally in tech, so I went back into tech for a little bit.
01:18:45.640 And then and then I kind of was developing this idea because, like I said, I kind of had the
01:18:54.180 idea back in the Project Veritas days.
01:18:55.880 And then like December of 2022, I remember we were at the Turning Point event and I was
01:19:06.320 talking with Arden and she was like, do you need like a sidekick for any project?
01:19:09.880 And I was like, I kind of had an idea.
01:19:11.960 Hold on.
01:19:12.760 Let me like get some money together.
01:19:14.300 Let me think about something.
01:19:15.180 And then I called her up at the beginning of the last year.
01:19:17.660 I was like, all right, let's let's go do this.
01:19:19.680 Here's the idea.
01:19:20.560 Do you want in?
01:19:21.580 Yeah.
01:19:21.700 And this is like a total passion project.
01:19:24.060 Like we are a nonprofit, but we've we have not solicited any donations from anyone.
01:19:30.080 Eric is completely self-funded this just because he's so passionate about this subject matter.
01:19:37.860 And he knows how significantly online pornography and abuse has affected our current culture.
01:19:46.060 Our videos are being cited as evidence in trafficking cases against the company.
01:19:52.100 I mean, these are the reasons why we're doing what we do.
01:19:54.620 Yeah, we saw this as an opportunity to get real results.
01:19:56.720 Like the best stories we ever did at Project Veritas was the stories that got real results.
01:20:00.200 One of my favorite stories was this one in Texas where our field ops manager got this lady to expose this entire voter fraud ring and the lady ended up going to jail.
01:20:14.420 And those are our best stories.
01:20:15.440 The ones that get real results, that get companies, you know, that are doing wrong to get sued, that get the people, you know, the people to go to jail.
01:20:24.320 And so I was like, this has so much impact.
01:20:29.980 It will have cultural effects, sure, but it'll also have, like, real tangible effects.
01:20:34.920 And that's what we're seeing now.
01:20:36.440 You know, I got to be honest.
01:20:37.780 I'd be willing to bet that if you met with literally anybody, they would expose corruption in their industry no matter what it was.
01:20:44.980 Yeah, I don't care too much about the corruption of like a grocery store.
01:20:48.100 Right.
01:20:48.400 Although, to be fair, Food Lion, there was something there.
01:20:50.300 I mean, like, if there's a guy and it's like, sometimes we change the date on the meat to be one day later.
01:20:56.220 Yeah.
01:20:56.520 Well, OK, you know, I mean.
01:20:57.460 It's local news.
01:20:58.320 It's bad.
01:20:59.180 Yeah, but.
01:21:00.100 But I'd be willing to bet that if you met with anybody and like, I was like, I work with a water reclamation plant in the city.
01:21:06.720 And you'll be like, oh, yeah, how's that go?
01:21:08.260 He'd be like, well, you know, sometimes you don't actually clean the water at all.
01:21:10.520 You're drinking feces.
01:21:11.840 I bet you'd find that stuff.
01:21:13.560 Yeah.
01:21:13.820 You know, at Project Veritas, part of our interview process, when we were bringing on a new undercover journalist, was to go out, go out and get a local story.
01:21:26.740 Go, go wear a camera.
01:21:28.620 Go talk.
01:21:29.080 Go to the mall.
01:21:29.780 Go to a restaurant and get a story.
01:21:31.920 And the best journalist would.
01:21:34.420 They would be like, yeah, there's this guy who's been peeing in the fountain at the mall every day.
01:21:39.840 And like and.
01:21:41.280 Is that a true story, though?
01:21:42.820 Yeah, that was really one.
01:21:44.220 It was at the local mall.
01:21:45.520 The cop like told the undercover journalist there's been a guy peeing in the fountain every day or something.
01:21:51.000 Yeah.
01:21:51.460 What was yours?
01:21:52.440 The story that I got was an employee at a restaurant at the mall, and he said the manager is always making the employees hide the fact that they have a mice infestation.
01:22:04.660 So when the health inspector comes, the manager is like, everyone, shut up.
01:22:08.580 Wow.
01:22:09.000 We really should have released this.
01:22:10.220 We should have had like a separate branch of like Project Veritas local or something, because, yeah, like you said, you could people can do this locally in their own communities.
01:22:18.800 And like and the great thing about local news is that sometimes it's easier, depending on where you live, to get like lawmakers to change things versus like federal regulators, all that.
01:22:29.480 You might be able to contact like your council member about your undercover investigation and get some change.
01:22:36.560 So, yeah, I mean, local stories are great, too.
01:22:39.560 Wow, that's crazy.
01:22:40.600 I mean, imagine if you did that and then you hand that off to a local news outlet like, hey, check out.
01:22:45.120 The problem is the local news outlet might be like they're they're an advertiser.
01:22:48.100 Sorry, have a nice day.
01:22:49.140 Yeah.
01:22:49.160 And there was a it was a large restaurant chain, so I don't know.
01:22:53.500 Did this ever get released?
01:22:55.800 No, I don't think we ever released any local things.
01:22:58.840 Is there are you guys like I mean, your faces are on camera now.
01:23:02.520 Is there a risk that you're not going to be able to like Veritas had to rotate undercover reporters, right?
01:23:08.840 Yeah, they had a team.
01:23:10.240 Um, I mean, yeah, there's a risk.
01:23:12.360 And and part of the reason why I am doing public interviews is just because we're such a small team.
01:23:18.200 And and we were like, well, who's going to do the public interviews?
01:23:21.800 And Eric said, not it.
01:23:23.340 So yeah, I mean, and it's very powerful to have the undercover journalist.
01:23:28.760 So, you know, Arden can be like, I'm the one who recorded this.
01:23:31.880 It's not like some faceless.
01:23:32.940 Yeah, it creates more authority.
01:23:35.340 I can speak to it.
01:23:36.300 I know the ins and outs of what I recorded.
01:23:38.840 So I think it just kind of makes sense.
01:23:41.880 And I can still go undercover.
01:23:44.220 You know, James O'Keefe puts on glasses, goes into the middle of D.C.
01:23:48.360 and gets the guy to talk about all the things in D.C.
01:23:51.640 That's the crazy thing that nobody walked up to him like, James, I'm a big fan.
01:23:54.080 Yeah, you would think so.
01:23:55.660 That's crazy.
01:23:56.220 But yeah, no, I'm surprised. James and I were hanging out not long ago.
01:24:00.760 So, man, people come up to us like, whoa, it's James O'Keefe, Tim Pool, like we're big fans.
01:24:06.280 How did he do that?
01:24:06.980 This is amazing.
01:24:07.880 He colored his hair and put on glasses.
01:24:10.220 He didn't even get a haircut.
01:24:11.400 I mean, he sprayed his hair a little bit, put on glasses, and yeah, can just go into a D.C.
01:24:15.920 restaurant.
01:24:16.580 Yeah.
01:24:16.820 That's wild.
01:24:17.760 So, yeah, I mean, yeah, not to say too much, but we had some plans.
01:24:23.400 And yeah, we'd also like to expand the team.
01:24:26.600 We'd like to bring on more people.
01:24:29.000 We need money for that.
01:24:29.660 Yeah, exactly.
01:24:31.020 So, we're working on raising some money.
01:24:33.540 We'll figure something out.
01:24:34.940 And because like I've said, we've had some real impact with just the two of us on a shoestring budget.
01:24:42.300 So, like, we have big plans for what we could do with more money, more people, more industries.
01:24:49.640 There are a lot more companies in the U.S., like you said, you named some sites, too, that we're interested in.
01:24:58.420 You know, there's a lot of people who ask all the time, like, how do you get started?
01:25:03.180 How do you work in journalism?
01:25:04.700 And I'm just like, hearing this, there is a deep underbelly of corruption and malfeasance that exists basically everywhere.
01:25:12.580 Yeah.
01:25:12.980 And we need real journalists who are willing to do journalism.
01:25:18.060 It's funny because, you know, we're watching the media industry collapse over the past week.
01:25:22.400 I don't know if you guys saw these stories, all these layoffs.
01:25:24.080 Wall Street Journal just laid off people in their Washington bureau, which is crazy.
01:25:26.980 It's an election year.
01:25:28.040 Yeah.
01:25:28.340 And you get these journalists, I do air quotes here, who are saying things like, without journalism, there's going to be a lot of corruption.
01:25:34.780 You know, you need us.
01:25:35.780 And I'm like, not you.
01:25:36.880 Like, you guys are the corruption, but you're right about the corruption.
01:25:40.620 There's probably some dude who works at City Hall in your town, and there's something he's not telling you.
01:25:46.580 The public should know.
01:25:48.500 There's certainly reasons sometimes to withhold information.
01:25:50.860 I think that's absolutely true.
01:25:51.640 If, you know, someone came out and said, like, we've discovered this problem, we want to make sure we can solve the problem, otherwise it can be exploited, for example.
01:26:01.520 There was this famous story of something called DNS cache poisoning.
01:26:05.940 This is a flaw that was found in the domain name servers of the internet, that if it was exploited, it could destroy the internet, like, overnight.
01:26:16.420 I don't know the full details, because I'm not super into cybersecurity.
01:26:20.000 But people I know who worked on it said, once we discovered it, we didn't tell anybody.
01:26:24.740 We went to the big companies and the government and said, fix it now.
01:26:27.300 I understand that.
01:26:28.320 If they had publicly disclosed it, somebody would have hit it, and then it would have broken.
01:26:33.400 But after they solved the problem, then they could disclose it, so I can understand that.
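For background on why that class of flaw was handled so quietly (this is added context, not something spelled out in the conversation): the well-known 2008 DNS cache poisoning attack worked because an off-path attacker only had to guess a resolver's 16-bit transaction ID to get a forged answer cached, and the coordinated fix was to also randomize the query's source port so the guess space becomes enormous. A minimal back-of-the-envelope sketch in Python, with illustrative numbers:

```python
# Toy numbers only: a resolver that checks nothing but the 16-bit DNS
# transaction ID can be fooled by an attacker who guesses that ID before the
# real answer arrives; randomizing the source port too (the 2008 emergency
# fix) makes the guess space roughly 2^16 times larger.
TXID_SPACE = 2 ** 16           # DNS transaction IDs are 16 bits
PORT_SPACE = 2 ** 16 - 1024    # rough count of usable ephemeral source ports

def spoof_success_probability(forged_replies: int, randomize_port: bool) -> float:
    """Chance that one burst of forged replies matches the pending query."""
    guess_space = TXID_SPACE * (PORT_SPACE if randomize_port else 1)
    return min(1.0, forged_replies / guess_space)

if __name__ == "__main__":
    burst = 200  # forged replies an attacker might land before the real answer
    print("txid only       :", spoof_success_probability(burst, randomize_port=False))
    print("txid + src port :", spoof_success_probability(burst, randomize_port=True))
```

With the transaction ID alone, a burst of a few hundred forged packets has a noticeable per-query success rate; with port randomization it drops to roughly one in twenty million per attempt, which is why the patch-first, disclose-later approach described above made sense.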
01:26:36.500 But there are people who are in government who are like, I just found out this guy has been embezzling public funds.
01:26:42.900 If the public finds out, it will be disruptive to the city, and people will revolt, and there will be a recall, and it will be chaos.
01:26:48.400 Let's just say nothing.
01:26:49.740 Right.
01:26:50.000 And that's how the corruption persists.
01:26:51.680 Right.
01:26:52.360 So it could be a town of 500 people.
01:26:55.300 All someone has to do is, and I'd do it legally, of course, because there are one-party, two-party, and all-party consent states I'd have to watch out for.
01:27:00.980 Yeah.
01:27:01.480 But anybody could do this.
01:27:03.060 Now, this smartphone HD hidden camera DVR thing you mentioned is $500.
01:27:07.820 Yeah.
01:27:08.120 Not the easiest thing in the world to get.
01:27:09.900 But even a regular cell phone can do this.
01:27:11.980 Regular cell phone.
01:27:13.160 There's cheaper options on Amazon, too.
01:27:15.220 There's ones that look like pens you could tuck in your shirt or whatever.
01:27:18.440 And, yeah, anyone can do it as long as you check all your boxes legally.
01:27:23.160 You know, I don't have a journalism degree.
01:27:24.920 I didn't go to school for journalism.
01:27:26.720 I was an actress in Hollywood throughout my teens and young adulthood.
01:27:31.440 Wow.
01:27:31.820 Were you anything anyone would know?
01:27:32.960 We would know.
01:27:33.620 Yeah.
01:27:34.520 The most popular one would probably be Modern Family.
01:27:37.000 I did a couple.
01:27:37.500 Oh, wow.
01:27:38.100 Yeah.
01:27:38.840 Speaking roles.
01:27:39.780 Yeah.
01:27:40.140 Yeah.
01:27:40.440 Oh, wow.
01:27:40.820 Yeah.
01:27:40.980 I play Luke's little friend slash.
01:27:43.520 And now you're out because, like.
01:27:44.980 Yeah.
01:27:45.280 The moment you start exposing pedos, they're like, well, that goes against their core values.
01:27:48.540 So.
01:27:49.780 Yeah.
01:27:50.180 That industry, at least.
01:27:51.720 No, definitely not.
01:27:53.180 Not as bad anymore.
01:27:55.460 Wow.
01:27:56.660 Yeah.
01:27:56.880 But I think the main point.
01:27:58.560 Anybody could get started.
01:27:59.940 So I'm curious.
01:28:00.580 In that capacity, though, the big challenge is going to be legal issues.
01:28:03.440 And if you guys are a small operation, like, how do you navigate that?
01:28:06.740 Yeah.
01:28:07.140 I mean, we have a good Canadian lawyer because these operations took place in Canada.
01:28:12.040 So we wanted to cross all the T's, dot all the I's.
01:28:15.500 Canada, fortunately, is very friendly to undercover journalism, maybe surprisingly.
01:28:19.120 Wow.
01:28:19.520 Across the board, it's one party consent.
01:28:22.260 Whole country.
01:28:23.480 Yeah.
01:28:24.800 All of Canada.
01:28:25.860 So, you know, we were clear that we wanted to, you know, make sure there's no accidental
01:28:30.560 defamation, anything like that, any, you know, small things.
01:28:33.580 So we have really good counsel in Canada.
01:28:35.560 Um, but yeah, like you said, you know, it's not, it's not cheap, but it's a lot cheaper
01:28:41.080 than making some mistake and then ruining an entire operation, you know.
01:28:44.620 Illinois is one of the craziest states when it comes to, uh, recording because it's a
01:28:48.820 mafia state.
01:28:49.600 Exactly.
01:28:50.040 And they're super evil.
01:28:51.360 So you can't record.
01:28:52.660 Because you know, the story of the Chicago Sun-Times, do you know this about the Mirage
01:28:57.120 bar?
01:28:57.740 Uh, the Chicago Sun-Times set, they, they bought a bar.
01:29:01.920 Uh, this was, I forget, I think in the seventies, they bought a bar.
01:29:05.560 Yeah.
01:29:06.020 And they called it the Mirage, I think.
01:29:07.760 Yeah.
01:29:08.360 And, and then they recorded like all these local regulators.
01:29:11.080 They rigged it with hidden cameras.
01:29:12.900 Yeah.
01:29:13.080 Yeah.
01:29:13.360 They set up.
01:29:13.880 Wow.
01:29:14.120 That's such a cool idea.
01:29:15.820 Yeah.
01:29:16.180 And, uh, I think that's probably one of the, the reasons for now Illinois being like, no
01:29:23.340 undercover journalism, never going to happen again.
01:29:26.200 Cause, uh, they got a lot of people in trouble.
01:29:28.220 So I got to set up a political lobbyist club.
01:29:31.960 Yes.
01:29:32.320 And just have cameras and microphones everywhere.
01:29:34.820 And now then you're going to get, they will, they will end your life.
01:29:38.660 Yeah.
01:29:38.940 They might kill you for that.
01:29:40.680 But there's certain things I wonder, like if you were to hang up a sign that maybe looked
01:29:47.120 kind of tongue in cheek, that was like, smile, you're on camera.
01:29:50.500 Would that eliminate their, uh, you know, like.
01:29:55.900 Yeah.
01:29:56.580 Illinois has some untested things.
01:29:59.520 You'd have to consult with counsel on some of that stuff.
01:30:02.540 Yo, this story is crazy.
01:30:03.420 I never knew this. In 1978, the Sun-Times and the Better Government Association, uh, they, uh,
01:30:10.580 broke the story in a 25-part series documenting the abuses and crimes committed at the tavern,
01:30:14.600 which was shaken down repeatedly by state and local officials.
01:30:18.380 It was initially nominated for the Pulitzer Prize for general reporting, but the board
01:30:22.180 decided not to, uh, award the Sun-Times-BGA collaboration after editor Ben Bradlee
01:30:27.640 of the Washington Post led an attack on the grounds that
01:30:30.160 the reporters used undercover reporting, a form of deception, to report the story.
01:30:34.220 Damn.
01:30:34.760 These people are evil.
01:30:35.940 And that's a huge reason why there's not a whole ton of undercover journalism anymore
01:30:41.720 is because journalists got scared to even do it.
01:30:45.500 That's Washington post, man.
01:30:47.480 Wow.
01:30:47.940 In the seventies, even I'm surprised.
01:30:50.160 I'd like to imagine that there was at one point like real journalism in this country,
01:30:53.460 but I just don't really, it's, it's small at this point.
01:30:57.000 You know what it is?
01:30:57.560 Um, they say the Pulitzer prize is the highest award in journalism, but a CIA assassination
01:31:02.400 is the highest award in journalism.
01:31:04.520 True.
01:31:04.980 Yeah.
01:31:05.220 If you're actually about to break some major stories for the benefit of the public, you
01:31:09.560 might, uh, you might get really depressed real soon and then your car brakes fail.
01:31:14.300 Mm-hmm.
01:31:14.840 And that's where we find ourselves.
01:31:16.400 Or you shoot yourself to death twice.
01:31:19.180 Yeah.
01:31:19.460 That was the famous story.
01:31:20.260 What's that guy's name?
01:31:21.200 Do you remember the guy who was exposing the CIA?
01:31:24.480 And then they said he killed himself by shooting himself twice in the head.
01:31:27.360 Yeah.
01:31:27.760 It was like a real story.
01:31:28.840 I don't remember his name.
01:31:29.880 It's just.
01:31:31.400 Yeah.
01:31:31.940 Where do you guys, uh, what are your plans?
01:31:34.000 Like, where do you see yourselves going?
01:31:35.400 Where are you going?
01:31:36.140 And then like, obviously with, with Veritas kind of just imploding.
01:31:40.680 I'm wondering like, there's, there's two of you now, at least like James O'Keefe is still
01:31:45.300 doing his work.
01:31:45.960 O'Keefe Media Group.
01:31:46.740 Now there's sound investigations.
01:31:48.220 It's going to get bigger.
01:31:49.500 Is it a good thing?
01:31:50.320 I don't know what's happening.
01:31:52.780 Yeah, we, we hope so.
01:31:54.940 Uh, we have a lot of ideas.
01:31:56.580 We're still out in the field.
01:31:57.820 Uh, I'll say that much.
01:31:59.340 Um, we, um, we have a few avenues.
01:32:03.760 We want to continue exposing exploitation in the adult industry.
01:32:07.880 There's, uh, again, uh, nobody else is doing it.
01:32:12.420 So we're not, you know, we're not stepping on anybody's toes.
01:32:15.300 And I think it's very important work.
01:32:17.000 I think it's one of the biggest issues, uh, in the world right now.
01:32:20.780 Um, a lot of these companies are outside of the
01:32:25.280 United States.
01:32:26.160 So that adds some complication, but we still want to do it.
01:32:29.540 Um, uh, so yeah, the plan is really, we're kind of in fundraising mode while still doing
01:32:37.000 a few things on the ground.
01:32:38.600 Uh, we still got a few operations going on.
01:32:41.560 Um, and, uh, when we get to fundraising, we're going to bring on more people.
01:32:47.220 We're going to go to these countries.
01:32:48.660 We're going to expose more companies and, uh, and we're going to get more tangible changes.
01:32:54.220 You got to recruit more people.
01:32:55.760 Definitely.
01:32:56.340 Yeah.
01:32:56.540 We have some people in mind.
01:32:57.600 We have some, we have some, uh, people who've, who've, uh, done really good work.
01:33:01.420 I wanted to ask about like potential subject matter you're interested in, but I don't
01:33:04.540 know if that would just compromise your ability to go ahead and ask.
01:33:07.380 Well, like what, what, what, what kind of subjects are you guys looking to go after next?
01:33:10.640 I mean, I think obviously mentioned exploitation.
01:33:13.360 There's still so much more there.
01:33:14.880 Online sexual exploitation is of big interest to us.
01:33:19.640 What else?
01:33:20.140 Yeah, I think there's, um, uh, the abortion issue is another big issue, especially in this
01:33:28.680 election cycle and, uh, a lot of, I mean, a lot of my friends have been locked up for
01:33:35.680 peaceful pro-life speech, uh, by this administration.
01:33:39.140 Yeah.
01:33:39.240 It's a big story that's happening right now.
01:33:40.380 Like 11 people.
01:33:41.280 Yeah.
01:33:41.740 Yeah.
01:33:42.240 Uh, like those are crazy.
01:33:44.200 Yeah.
01:33:44.800 Um, I'm, I'm wondering, have there been any, uh, I know Steven Crowder did.
01:33:50.040 And I don't know if you'd call it a comedy bit or undercover journalism because he went,
01:33:55.680 what, what did he do?
01:33:56.180 He went to a planned parenthood or something pretending to be a trans woman saying that
01:34:00.580 was, I didn't see that.
01:34:02.260 I think this was a long time or this was a few years ago.
01:34:04.280 I think I remember.
01:34:04.960 Yeah.
01:34:05.140 Yeah.
01:34:05.260 He, uh, he, he said that he was a trans woman who was pregnant or something.
01:34:09.620 Yeah.
01:34:10.140 And, uh, and they just went along with it.
01:34:12.600 So it's, it's, you know, is it comedy?
01:34:15.840 It's undercover journalism.
01:34:16.780 And the point he made was that if a man takes a pregnancy test and it
01:34:21.800 comes up positive, he likely has cancer, or it's an indication of cancer.
01:34:25.400 And they were like, we're not going anywhere near this because you know, the, the gender
01:34:29.900 ideology issue is so touchy, but I'm wondering, has there been any, uh, am I missing this?
01:34:37.360 Like undercover reporting into these clinics that are giving kids hormones and drugs.
01:34:41.300 You know, we had, um, on Timcast IRL, we had a detransitioner who said that she went
01:34:46.340 into a Planned Parenthood and within minutes, they said, here's the maximum dosage of testosterone
01:34:51.500 you can have.
01:34:52.400 So I'm wondering if there's anything big on that.
01:34:55.340 I mean, you know, back at project Veritas, there was some good reporting about these clinics.
01:34:59.840 I think it was called, like, the Too Young investigation.
01:35:03.040 They had like their name for it, but it was these clinics providing, um, hormones and offering
01:35:09.240 surgeries for underage patients without parent consent.
01:35:14.900 And, um, and not only that, but I think there was, gosh, there was a lot of like really egregious
01:35:22.040 admissions.
01:35:22.740 There was even one doctor who was like, yeah, you know, we give, we okay these surgeries and
01:35:28.040 these treatments for 14 year olds.
01:35:29.660 And maybe it's only like an immediate satisfaction kind of thing for them and they'll regret it
01:35:34.200 later.
01:35:34.460 But at least they're happy in the moment.
01:35:36.180 So you were at Veritas too?
01:35:37.960 Used to be.
01:35:38.540 Yeah.
01:35:38.780 What can we ask?
01:35:40.080 Can I ask like what happened?
01:35:41.580 It's crazy.
01:35:42.340 Like this letter comes out accusing James of these ridiculous stories, like stealing a
01:35:47.280 pregnant woman's sandwich.
01:35:48.020 And I'm just like, I'm sorry.
01:35:49.020 I can't believe it's just too stupid of a story.
01:35:51.460 It was so messy and complicated.
01:35:53.640 Yeah.
01:35:53.660 I mean, I wasn't there.
01:35:55.060 Uh, I, I left back in 2021.
01:35:57.180 Uh, and then Arden, uh, she was like joining my team at the time.
01:36:03.400 So it wasn't really part of it.
01:36:05.200 Yeah.
01:36:05.220 I kind of got out at an opportune time.
01:36:07.600 Yeah.
01:36:08.140 It was, uh, did it seem like there was any merit to what was being claimed?
01:36:11.960 I mean, I, like, like I said, I wasn't there, but James is a very good friend of mine.
01:36:16.540 And like, that's what I mean.
01:36:17.200 Like you guys have worked with James.
01:36:18.480 I mean, yeah, I'll just say, like, it was an excuse to remove
01:36:24.380 a leader, and, unfortunately, there's lots of jealousy to go around.
01:36:27.880 And, uh, I mean, you know, I have friends on both sides of the issue, great people,
01:36:32.600 talented people, hard workers.
01:36:34.060 I just decided to stay hands-off.
01:36:37.020 And I mean, the reality of it is project Veritas was one of the most consequential news
01:36:42.480 organizations of our generation and possibly in this country ever.
01:36:48.440 Uh, I'm not trying to say it is the biggest.
01:36:50.800 I mean, the work that Veritas did was massive, especially the, uh, Amy Roback, uh, video where
01:36:56.980 she's like, Epstein, we got him, Clinton, all that stuff.
01:37:00.260 And, uh, you know, Veritas drops that.
01:37:02.060 That's massive.
01:37:02.740 Yeah.
01:37:03.200 And then it gets destroyed.
01:37:05.000 And for what reason?
01:37:06.700 I mean, I actually think it's beneficial.
01:37:08.580 The, uh, uh, the fact that James is now still doing his work, just put out a, a, a massive
01:37:15.340 video, uh, uh, um, with political insider knowledge.
01:37:18.080 I don't, I don't see whatever the effort was as being tremendously successful, but it's
01:37:22.960 kind of crazy that Veritas was destroyed in the way it was.
01:37:25.000 And, and the question is like, how does this happen?
01:37:27.020 The thing is a lot of people want to take out James, you know, like I think the federal
01:37:31.380 government has been after him.
01:37:33.340 I mean, he and, he and I were raided by the FBI, uh, you, you know, about that back in
01:37:38.820 2021 and, um, and I think the idea is in some way they need to remove him, put him out of
01:37:47.740 the picture.
01:37:48.940 Um, and I think this might be part of it.
01:37:52.460 I think, uh, I think they're kind of waiting to see, uh, if he's down for the count, but
01:37:58.140 the thing is, James is a leader and a builder.
01:38:00.360 So he's always going to keep on like, they're going to have to literally put him in prison
01:38:06.280 to stop him from building a new organization.
01:38:09.460 Yeah.
01:38:09.960 So I guess I have a question about the, um, expansion of all this
01:38:15.380 undercover journalism, because there have been criticisms of me in the past.
01:38:19.600 Because I would go live stream.
01:38:21.040 I'd go on the ground to these various protests and I would have my phone in my hand, plainly
01:38:25.760 visible for everybody to see, streaming 24/7.
01:38:27.980 And there, the argument was, it was good that we got a window into all these things that
01:38:32.600 are happening, but as more people adopt this practice, privacy starts to get eroded.
01:38:37.800 And so there is a fear.
01:38:39.340 I mean, do we want to live in a world where everything we say is going to be secretly
01:38:42.260 recorded by someone or should we just accept it?
01:38:48.480 I think it's like you said, I think people, uh, there are things that the public has a right
01:38:53.100 to know.
01:38:53.580 The job of a journalist is not to embarrass somebody
01:38:57.560 by exposing personal secrets or, um, trade secrets or health information, or just
01:39:08.860 things that are personal.
01:39:10.160 It's not gossip.
01:39:11.140 It's not meant to.
01:39:12.180 Right.
01:39:12.340 Well, health information, it's, it's rough.
01:39:14.520 I mean, Joe Biden's health information, the public has a right to know.
01:39:17.340 Absolutely.
01:39:18.400 So, you know, having someone from the white house say, yeah, he's in mental decline.
01:39:21.720 We know.
01:39:23.300 I think that's, that's, that's less specific than like, oh, he has this personal embarrassing,
01:39:28.620 like health condition.
01:39:29.900 I like, well, I think that the issue here is he's the president.
01:39:32.860 Yeah.
01:39:33.260 I mean, his whole position has to be taken into account.
01:39:37.100 And, um, yeah, what's newsworthy for one person might be totally irrelevant if a different
01:39:45.140 person were saying the same thing.
01:39:46.560 Well, there's the challenge, right?
01:39:47.720 So for, uh, for me doing Timcast IRL, for instance, you get these leftists that
01:39:53.640 will make a video about a tweet I made and their justification is, oh, but he's got millions
01:39:57.940 of followers.
01:39:58.560 We absolutely have to address this issue.
01:39:59.660 And I'm like, okay, I get that.
01:40:00.960 And so at what point is it, everyone starts secretly recording.
01:40:05.200 And then do we find ourselves in this, you know, hypervigilant state always where we're
01:40:09.840 scared someone's going to publish something, you know. It may not be, like,
01:40:14.300 like I go to the doctor and the doctor's like, oh, you've got a kidney stone or whatever.
01:40:18.100 And it's like, okay, I'd prefer it if people didn't know that, but I honestly don't care.
01:40:21.680 I'm literally saying it. But let's say there's somebody who's like, I'd prefer
01:40:23.840 no one knew that, and someone else might be like, it's really important that people know this guy who runs
01:40:27.380 this big show is sick or something, and they decide it's newsworthy.
01:40:30.560 And then someone else agrees.
01:40:32.060 And more to the point, it's not a question of them being right or wrong.
01:40:34.620 It's a question of the expansion of this technology.
01:40:37.640 Do we create the panopticon ourselves, which we are like sort of already doing, you know,
01:40:43.240 like once, once everybody had a phone camera, right?
01:40:46.100 We're now in a world.
01:40:47.340 Yeah.
01:40:47.600 Everything you do is filmed 24/7.
01:40:48.940 There's nothing you can do.
01:40:49.480 Welcome to the, the, the brave new world.
01:40:51.680 Right.
01:40:52.100 I don't think there's a putting, you know, the, the genie back in the bottle with that
01:40:56.300 one.
01:40:56.800 Like, yeah, everybody's got a cell phone.
01:41:00.460 Google was recording you all the time.
01:41:02.500 Every company has got big data on, on you.
01:41:05.580 Um, like, I think, I think there's just no going back from that.
01:41:09.960 Probably anyway, you just, yeah, I agree.
01:41:12.240 Might as well expose some real stuff while we're at it.
01:41:15.660 Yeah.
01:41:15.900 Maybe, maybe the reality is just, we're going to get, I suppose the scarier thing is outside
01:41:20.940 of the idea that whatever it is you do is probably recorded.
01:41:24.140 I mean, satellites, who knows?
01:41:26.660 But the scary thing now is how AI can generate video.
01:41:29.720 So, and so we had, uh, what, what video was it recently?
01:41:34.380 It was, um, it was audio with Carrie Lake.
01:41:38.060 And, uh, you guys saw this, you see this one, the Carrie Lake and the guy shaking her down.
01:41:41.620 And the first question a lot of people ask is, is this fake?
01:41:44.620 Was it made by a deep fake?
01:41:46.680 And the guy, what's his name?
01:41:48.280 Jeff DeWitt said that it was selectively edited.
01:41:50.300 That's what they love to say.
01:41:51.440 It was edited.
01:41:52.300 It's not real.
01:41:53.040 Oh, they took it out.
01:41:53.900 Now they can just be like, ah, it's a deep fake.
01:41:55.680 It's not real.
01:41:55.900 I know.
01:41:56.580 Yeah.
01:41:56.960 So at what point is it just going to be like,
01:41:58.620 someone will, someone could go on a date with you, undercover film you, but then AI generate
01:42:06.420 whatever it is they want you to say.
01:42:07.600 And what's, how do you, how do you prove it?
01:42:09.180 How do you, what do you do?
01:42:10.360 Yeah.
01:42:10.600 It's an increasing problem for undercover journalism.
01:42:12.920 We got to do, we got to do undercover journalism while people would still believe that it's real.
01:42:17.680 Um, and, and a lot of people suggested to me when I was producing the, the tapes, like,
01:42:22.320 oh, you should, you should cut out more of the background audio.
01:42:25.720 You need to like clean up the audio more in these tapes.
01:42:28.620 And I was like, no, no, I want it to be as raw as possible.
01:42:32.200 Cause I don't want anybody accusing us.
01:42:34.360 And I love saying we edited the audio or it sounds fake.
01:42:37.260 Of course we still got accused of that, but I love that argument.
01:42:40.260 Cause it's like, okay, please explain the scenario where this person would be making these admissions
01:42:47.900 or these statements.
01:42:48.800 Like there's, he's saying full sentences where he's like talking about Pornhub specifically,
01:42:55.060 talking about trafficking, rape, sexual abuse specifically.
01:42:59.320 And there's like no universe in where I could have taken this out of context.
01:43:05.060 Yeah.
01:43:05.540 Well, that's a selective argument, uh, selective editing argument.
01:43:08.520 I'm wondering what happens.
01:43:09.840 Uh, we saw this with the Kyle Rittenhouse case.
01:43:11.940 They tried using CGI images to convict this kid.
01:43:14.720 So the way it works is they have a photo, uh, on, I think it was like an iPad and they zoom
01:43:20.480 in.
01:43:21.300 There is no such thing as zoom in.
01:43:23.440 It is not real.
01:43:24.660 Right.
01:43:24.880 The way digital zoom works on these apps when you zoom in is that an algorithm creates pixels
01:43:30.440 that it thinks are there.
01:43:32.340 So when they zoom in and say, see, look, here's what the image shows.
01:43:36.040 You're like, no, no, no, no.
01:43:37.160 That, that zoomed image is manipulated.
01:43:39.340 You need to actually show the static image in its standard form and then ask people if
01:43:44.060 they can see it.
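To make the pixel-interpolation point concrete (an added illustration, not evidence from the trial; it assumes the Pillow imaging library is available): digital zoom is just upscaling, and the resampler fills the new grid with values the algorithm computes rather than values the camera recorded.

```python
# Minimal sketch, assuming Pillow: "zooming" digitally is just upscaling, and
# the resampler invents in-between pixel values that the sensor never captured.
from PIL import Image

def digital_zoom(img: Image.Image, factor: int) -> Image.Image:
    # Image.resize uses bicubic resampling by default for grayscale/RGB images,
    # which is the kind of interpolation a viewer app applies when you pinch-zoom.
    w, h = img.size
    return img.resize((w * factor, h * factor))

if __name__ == "__main__":
    original = Image.new("L", (4, 4))
    original.putdata([0, 255] * 8)              # tiny test image: only two pixel values
    zoomed = digital_zoom(original, 4)

    print(sorted(set(original.getdata())))      # [0, 255], what the "camera" captured
    print(len(set(zoomed.getdata())))           # many interpolated values the algorithm created
```

The zoomed copy ends up containing gray levels that appear nowhere in the source image, which is the sense in which a zoomed-in frame is an interpretation of the data rather than the raw capture.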
01:43:44.940 So we're already in this period where it could go one of two ways.
01:43:49.560 Someone could secretly record another person, CGI audio of them.
01:43:55.600 And then what are you going to do?
01:43:56.880 It was nuts.
01:43:57.920 I mean, I could not believe it as I'm watching this trial.
01:44:00.640 Well, these guys are trying to use computer generated images to convict Rittenhouse.
01:44:04.220 And the judge is like, I'll allow it.
01:44:05.800 And I'm like, that's crazy.
01:44:07.480 Yeah.
01:44:07.640 And the, and the, and he put it on the defense to prove otherwise.
01:44:11.080 Yeah.
01:44:11.700 So this means someone could go to court and say, here's a recording of, you know, of
01:44:15.720 you guys that we got.
01:44:17.900 And here's you saying all of these things you did that are illegal.
01:44:20.080 And then you go, that's not a real recording.
01:44:21.640 And they'll say, prove it.
01:44:22.800 Now what happens?
01:44:23.900 You guys bring in your expert.
01:44:25.660 They bring in their expert.
01:44:26.840 Your expert says, see, this here proves it's fake.
01:44:29.880 Then their expert stands up and says, no, that's actually a totally normal thing.
01:44:33.120 Good luck, jury.
01:44:33.800 Figure it out.
01:44:34.660 Right.
01:44:36.940 Yeah.
01:44:37.380 Welcome to the brave new world.
01:44:38.680 This is going to be tough.
01:44:39.760 Yeah.
01:44:40.220 Just like the implications for undercover journalism.
01:44:42.940 I mean, it's easier with video undercover journalism than, than just audio, of course,
01:44:48.060 to prove that it's real, to, to look more real.
01:44:50.380 Cause, but in 10 years, like it might just be over.
01:44:55.080 Yeah.
01:44:55.540 It might just be very simple to just make fake videos of people.
01:44:58.940 It's funny.
01:44:59.740 Cause, uh, I was asked by Joe Rogan about that.
01:45:02.240 Uh, this was like a couple of years ago, last time I was on a show and he's like, are you
01:45:04.660 worried about deep fakes?
01:45:05.500 I was like, no, I'm like, context, uh, is, is going to prove these things.
01:45:11.360 Like someone makes a fake video.
01:45:12.400 People are going to be like, I don't believe that they're gonna be more resilient to it.
01:45:14.560 And now based on what we've seen now, I'm terrified because it's not just about the
01:45:20.380 fact that someone could make something fake.
01:45:21.660 You know, one thing I've pointed out, the scariest thing is going to be when someone
01:45:26.500 will record you guys and it'll be a real conversation.
01:45:29.500 And you'll say something like, obviously we would never do anything illegal.
01:45:32.980 We have great lawyers and we're working really, really hard to make sure everything
01:45:36.060 is done properly.
01:45:37.640 And then all they have to do is get rid of the "il" or, or add the "il."
01:45:42.320 Sorry.
01:45:42.720 Right.
01:45:42.960 And so you go, you know, obviously we're doing things that are illegal, but we have great
01:45:45.980 lawyers.
01:45:46.380 So we're doing it right.
01:45:47.880 And then what they'll do is you'll get sued or whatever.
01:45:52.840 They'll, they'll say, you know, did you have this conversation with this person?
01:45:57.100 Say yes.
01:45:58.060 Did you talk about these issues?
01:45:59.380 Yes.
01:46:00.220 Did you say that you have good lawyers?
01:46:02.100 Yes.
01:46:02.420 Okay.
01:46:02.860 Play the tape.
01:46:03.520 And then they add two letters to a word you said.
01:46:06.860 And so on the video, you're not going to notice you can't track mouth movements that
01:46:10.260 perfectly.
01:46:11.280 And then what do you do?
01:46:12.400 They're like, here's you admitting you commit crimes.
01:46:14.380 You know, that's fake.
01:46:15.100 But you just told us the conversation happened.
01:46:16.700 You just told us you had said these things.
01:46:18.860 Here it is.
01:46:19.540 Everyone can hear it.
01:46:20.140 You've admitted, you've confessed, have a nice day.
01:46:22.600 Bang.
01:46:23.080 That's scary.
01:46:24.140 Yeah.
01:46:24.840 So I don't know how we deal with it.
01:46:26.220 I mean, because even then the argument is going to be, you need an expert who can debunk
01:46:31.200 the fake videos.
01:46:32.740 Yeah.
01:46:33.800 But, but then you get an expert who bunks.
01:46:37.880 Yeah.
01:46:38.340 Like the jury is not going to have the expertise to know whose expert is right.
01:46:42.600 And yeah.
01:46:42.900 And it's just going to be good luck convincing people.
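One narrow mitigation that comes up in this space, and it is my addition here rather than anything the guests describe, is establishing provenance for raw footage the moment it is captured: hash the untouched file and log or timestamp that digest somewhere a third party can check later, so an altered copy fails the comparison. A minimal sketch with the Python standard library and hypothetical file names:

```python
# Provenance sketch: hash the raw recording right after capture so a later
# copy can be checked against it. File names and contents are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """SHA-256 digest of a file, read in chunks so large video/audio files are fine."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Hypothetical stand-ins for a raw recording and a copy someone later presents.
    raw = Path("raw_interview.bin")
    raw.write_bytes(b"pretend this is the untouched recording")
    logged_digest = sha256_of(raw)            # log or timestamp this right after capture

    presented = Path("disputed_copy.bin")
    presented.write_bytes(b"pretend this is the untouched recording")  # identical copy
    print(sha256_of(presented) == logged_digest)  # True only if byte-for-byte identical
```

This only proves that a presented file matches whatever was logged at capture time; it cannot prove the capture itself was honest, and legitimate re-encoding also changes the digest, so it is a partial answer at best to the scenario described above.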
01:46:45.020 It is kind of crazy if you think about it though.
01:46:46.280 If you go back before any of this forensic evidence and video, what were juries like back
01:46:50.300 then?
01:46:51.380 A guy is like, he did it.
01:46:52.460 I saw him do it.
01:46:53.300 No, he didn't.
01:46:53.860 I didn't see him do it.
01:46:54.560 It's like, who do you believe?
01:46:56.400 Yeah.
01:46:57.040 Good luck, man.
01:46:57.620 A lot of people probably went to jail that were innocent.
01:46:59.560 But, but outside of that too, I think that the issue of AI is going to be the, the, the
01:47:03.500 personal universe.
01:47:05.540 I think that'll be like a big, big crisis for us.
01:47:08.220 Yeah.
01:47:08.760 People are going to choose to just go into these universes.
01:47:12.480 They're going to make whatever titillating content they can make.
01:47:15.660 They're going to lock themselves away.
01:47:16.780 And I think what's going to happen is, you know, going back, like way back to the earlier
01:47:21.840 conversation, literally about what you guys exposed, there's no willpower to regulate
01:47:29.160 what is clearly criminal, no law enforcement willpower to arrest people who are clearly
01:47:33.660 committing crimes.
01:47:34.580 And they just stand there.
01:47:36.480 And I mean, you go five miles over the speed limit, you get pulled over and the cop yells
01:47:40.360 at you.
01:47:41.120 You are an employee at a massive child trafficking organization.
01:47:44.140 And they say, well, you know, there's nothing I can do here.
01:47:46.560 So as we move forward with all this technology, I, I, I can appreciate the investigative journalism.
01:47:52.180 I can see that there are people upset about it, but it looks like there's going to be
01:47:54.080 a massive split where most people are just like, no, they're just in for it.
01:47:58.980 They're going to let it happen.
01:47:59.940 And everything's kind of going to, going to break down, I guess.
01:48:04.580 That's a bleak picture.
01:48:06.980 Well, maybe, I mean, you know, I'll put it this way.
01:48:10.740 Turn of the century, 1900s, they were writing about how the cities would be overrun with horse
01:48:14.880 manure and that that would be the end of New York.
01:48:17.980 There's going to be piles of horse crap as more and more people are born and inhabit the
01:48:23.300 city.
01:48:24.460 Density increases.
01:48:25.840 Horses are going to be crapping like crazy.
01:48:27.500 And there's going to be mounds of manure that they can't get rid of.
01:48:29.500 Then they invented the car.
01:48:31.060 Yeah.
01:48:31.460 And everyone was like, never saw that coming.
01:48:33.420 There's something we're not thinking of.
01:48:34.940 Yeah, I hope.
01:48:35.620 Yeah.
01:48:36.080 Yeah.
01:48:36.320 I hope so too.
01:48:36.940 I mean, you know, Pornhub, their leadership even acknowledged that we, you know, the
01:48:44.160 pro-regulation side of the argument,
01:48:46.440 are currently winning, with all of these laws passing, the age verification laws, with
01:48:52.020 them being forced to change their upload policies through, you know, the various
01:48:56.800 public scandals that they've been accused of.
01:49:00.420 So I really do think positive things are happening.
01:49:03.800 I think more and more people are realizing how harmful pornography is on so many levels.
01:49:09.920 I get DMs all the time from people saying, I was an addict or I'm still addicted, but
01:49:14.580 I'm trying to quit.
01:49:15.420 Thank you so much.
01:49:16.980 So I really do believe good things are happening and we'll just, our job is to just keep exposing
01:49:23.260 and doing it.
01:49:24.000 And that's all we can do.
01:49:24.940 I feel like, uh, do you guys remember, what was that, what was that law they tried passing?
01:49:30.320 What was it?
01:49:30.940 Back page or whatever.
01:49:32.300 Yeah.
01:49:32.920 Yeah.
01:49:33.420 They shut that down.
01:49:34.460 FOSTA-SESTA.
01:49:35.260 Yeah.
01:49:35.620 Yeah.
01:49:35.780 Yeah.
01:49:36.060 And there was this big push online where they were like, if you allow this, they're going
01:49:38.880 to censor free speech.
01:49:40.200 And I feel like, you know, now in hindsight and looking at all this stuff, I'm not, I'm
01:49:43.600 not going to comment directly on those laws because it's been too long and they're specific,
01:49:46.640 but I definitely feel like when it comes to this stuff, there is a manipulative.
01:49:51.060 Oh no, our rights, you better not ban pornography and child trafficking.
01:49:57.460 And I'm kind of like, at this point, uh, you should.
01:50:00.480 Yeah.
01:50:01.080 And specifically, Aylo, Pornhub's parent company, they try to classify themselves as just a tech
01:50:07.340 company.
01:50:07.780 So they try to group themselves in with all of these tech giants that have immunity for
01:50:12.880 what's hosted on the site under section 230.
01:50:15.540 And so that's the argument they're trying to get away with.
01:50:18.300 Are they, are they basically arguing, well, we don't make any content.
01:50:21.900 Or YouTube.
01:50:22.300 Yeah.
01:50:22.480 For, for Pornhub specifically, but really they can't, they can't correctly use that argument
01:50:28.460 because they're a porn giant that does everything from write to shoot, to produce, to, yes, to
01:50:35.120 distribute porn.
01:50:35.780 They do everything.
01:50:36.500 And even Dylan said, he's a senior script writer there.
01:50:39.380 He says, we even write some of the amateur stuff that you see being uploaded.
01:50:44.200 We work with these people and we actually write it for them.
01:50:47.480 Wow.
01:50:47.880 So a lot of it is manipulated by this huge parent company itself.
01:50:53.380 Man.
01:50:53.540 These class action lawsuits they've tried that, that our, our work has been cited and they've,
01:50:58.200 uh, tried to claim section 230 immunity failed every time.
01:51:01.460 Really?
01:51:01.860 Yeah.
01:51:02.300 Well, elaborate.
01:51:03.860 What's that?
01:51:04.320 Give me, give me, can you talk about it?
01:51:05.600 Cause that's big.
01:51:06.020 That's big.
01:51:06.660 Yeah.
01:51:07.060 I mean, you know, like, like Arden said, they, uh, they try to say, oh, we're just a platform.
01:51:11.660 We're like YouTube.
01:51:12.860 Was there like a specific case?
01:51:14.820 Um, yeah.
01:51:15.500 The, there's the GDP case.
01:51:17.340 Yeah.
01:51:17.840 Uh, well, there's the GDP case.
01:51:19.320 That's the criminal charges as well.
01:51:22.420 But yeah, there was a civil case.
01:51:22.420 Section 230 did not protect them.
01:51:23.340 Yes.
01:51:23.780 And there's a big, uh, I don't know the case number offhand, but there's a big, uh, class
01:51:27.440 action federal lawsuit in California where they tried to do it.
01:51:31.060 Um, and failed there.
01:51:33.080 It's, it's a bunch of victims sued the parent company because, uh, Pornhub had this content
01:51:39.540 partner that they worked with called girls do porn.
01:51:42.320 And they were one of the top performing content partners on the site made millions of dollars,
01:51:47.020 but essentially how they procured these girls in these videos is, um, through coercion through
01:51:52.980 sometimes force through trafficking.
01:51:54.500 You could see in some of these videos that the doors for the rooms that they're in were
01:51:58.740 barricaded.
01:51:59.480 The girls couldn't get out.
01:52:00.700 Um, many of these girls were contacted while under age and groomed so that they would show
01:52:06.140 up to this location to film a video on their 18th birthday.
01:52:10.940 Um, so there was a lot of really horrible illegal stuff going on.
01:52:15.500 Many of the videos were violent.
01:52:17.220 Um, you see blood in some of them.
01:52:19.240 And so a bunch of these, uh, women sued the parent company.
01:52:23.500 Wow.
01:52:24.420 For those that aren't familiar, just to briefly address it.
01:52:27.340 Section 230 of the, I think it's the Communications Decency Act,
01:52:29.960 provides broad immunity to websites over the
01:52:35.480 content posted by other people.
01:52:38.360 And so it was supposed to be, it gets started because there's, like, a website, they
01:52:43.300 publish an article, and I think it was about the Wolf of Wall Street.
01:52:45.600 They, uh, they wrote something about him and someone commented something false.
01:52:50.080 And so I could be wrong, but they got sued for defamation and they were like, we did not
01:52:54.700 write that.
01:52:55.280 That was a comment from somebody else.
01:52:57.080 And so this law gets passed saying, okay, okay, you can't be held responsible for what other
01:53:00.720 people put on your site, but now it's turned into the companies have broad immunity.
01:53:06.440 When they censor political ideas, they don't like, they can basically violate their own
01:53:11.240 terms, the spirit of their terms.
01:53:12.580 They can allow graphic content and just say, don't look at me.
01:53:16.740 I didn't do it.
01:53:17.680 But that's why it's fascinating to hear that in some of the, some of these cases, it failed
01:53:21.560 on them.
01:53:22.320 In all of the cases, actually, every single civil lawsuit where they've tried
01:53:26.780 to play that in, it's failed.
01:53:28.560 Yeah.
01:53:29.280 That's wild.
01:53:30.180 So it's, it's not working for Pornhub.
01:53:32.300 But it's crazy that it works for politics.
01:53:34.180 Right.
01:53:34.900 Yeah.
01:53:35.240 I mean, it's working for everybody else.
01:53:36.680 I suppose it's a good thing.
01:53:37.500 A lot of people have called for outright getting rid of section 230, which I think is a mistake.
01:53:42.380 I just think the issue has to be, uh, like you mentioned, Pornhub, they make the stuff,
01:53:48.420 they're deeply involved in it, but I don't actually even respect, like, Twitter when they,
01:53:56.340 when they have porn on their website. Like, nah. If you went into,
01:54:00.080 imagine going to, like, a Sav-On, a Jewel-Osco, an Albertsons, uh, a Safeway.
01:54:06.080 I'm just naming as many regional grocery stores as I can.
01:54:08.600 And, uh, you're walking around with your kids or, you know, with your friends, friends
01:54:11.640 and family and you know, there's a coffee section and you're grabbing it.
01:54:14.260 Then you walk over, there's a porn section, just hardcore graphic stuff all down the aisle.
01:54:18.320 Like we would not tolerate that.
01:54:19.760 Right.
01:54:19.960 And that's what we have right now.
01:54:21.480 And they try to use section 230 to justify it.
01:54:23.620 No, I think we should have some kind of like, I don't know, regulation or something to stop
01:54:28.160 that.
01:54:29.120 Yeah.
01:54:29.540 And it's illegal to, uh, market, uh, porn to children.
01:54:33.700 Again, like a lot of these things are like things that already exist for the physical
01:54:38.700 space.
01:54:39.080 And they're just not, they're just not being enforced in the digital space.
01:54:43.220 And so now we're like having these States do the ID laws and some other things.
01:54:47.360 And, uh, like to clarify that, like this applies to the digital space as well.
01:54:52.960 I think one thing that we're seeing a lot with, uh, schools and these books is what people on
01:54:57.020 the right would call grooming. And absolutely.
01:55:00.520 Yeah.
01:55:01.000 But before elaborating specifically on, like, you know, for those who don't know the
01:55:04.120 background, I'm curious if you guys have seen anything in your work that is a connection
01:55:08.740 in any way between what they're pushing on kids and the porn industry.
01:55:12.200 I mean, arguably some of the admissions, we had a couple of different employees expressing
01:55:21.640 just like their personal opinions, but we thought it was still significant that they
01:55:25.960 had positive views about, um, 12 year olds watching trans porn in order to find themselves
01:55:32.120 and, uh, explore their sexualities.
01:55:35.480 And one person even remarked that a kid could find their kink on men.com, which is also one
01:55:41.060 of their sites.
01:55:42.740 But that's basically the argument these people have for, uh, putting the graphic material
01:55:46.340 in schools for children.
01:55:47.340 That it's, that it's education rather than pornography.
01:55:51.680 So let's combine these things.
01:55:53.440 If, if you've got people who work there who are saying that they want to keep sending more
01:55:58.120 and more extreme content to guys, and eventually a guy who's straight, who's watching straight
01:56:01.700 porn, they can send trans or gay content to, and they start shifting in that direction.
01:56:07.260 That's them outright admitting they convert and groom people.
01:56:10.060 So when they then talk about kids and they're doing the same thing, that is an admission
01:56:14.420 that they, that they know when they get these books in these schools, they are grooming kids
01:56:20.400 to adopt patterns of behavior or, or, you know, proclivities or whatever.
01:56:25.660 And I feel like the pipeline ends up in, in whether someone's doing porn or not, they're
01:56:31.140 going to be in that world.
01:56:32.660 Yeah.
01:56:32.860 And they're hyper-sexualized, you know, that's, uh, that's a large part of it.
01:56:36.380 I wonder, like, I, I genuinely don't know if there's like, is there an end game to just
01:56:41.420 shattering brains?
01:56:43.640 What, what, what, what is the end result?
01:56:45.260 Is it to stop people from having kids because they're too messed up to be attracted to each
01:56:48.520 other?
01:56:49.480 Maybe on some level.
01:56:50.720 So I think the incentives just line up right now where there's still a lot of money to
01:56:56.560 be made on the internet.
01:56:58.480 That's so like, you know, uh, and like, what are people interested in?
01:57:03.580 Uh, they're interested in porn.
01:57:05.520 Yeah.
01:57:05.840 I think 65% of internet activity is porn related.
01:57:10.460 Man.
01:57:11.360 It's the, it's the easiest.
01:57:12.700 It's the wild west.
01:57:13.480 Yeah.
01:57:14.280 But I, like I was saying earlier, it will, the, the, the production of porn is over in
01:57:19.840 the physical sense.
01:57:21.100 Seriously.
01:57:22.300 Like there was a, a, a year ago I noticed on Instagram, this is why I absolutely despise
01:57:28.980 Instagram.
01:57:29.880 Um, look when I'm on Instagram, you know what I watch?
01:57:32.600 I watch action sports insert season, right?
01:57:36.020 It's, it's winter now and I'm getting tons of snowboarding and skiing.
01:57:39.140 And so I'm just watching, I was watching Sean White do like a backflip.
01:57:42.160 I'm like, that's cool.
01:57:42.900 Like, wow.
01:57:43.720 Him and like his friends are just cruising and doing flips and I watch poker videos and
01:57:48.880 uh, that's basically it.
01:57:50.880 It's like, but they jam, like, these kind of, quote-unquote, Insta models in my feed
01:57:57.580 all the time.
01:57:57.920 I don't look at them.
01:57:58.980 I'm like, dude, I'm a, I got a girlfriend not interested in, in this.
01:58:03.280 I'm like, I, I don't want to get my feed inundated with a, with a bunch of women showing
01:58:07.760 off their asses or whatever.
01:58:09.060 I just want to watch some guy win a poker hand.
01:58:10.900 Oh, he's got seven-deuce.
01:58:11.980 Then he bluffed them.
01:58:12.640 Ha ha.
01:58:12.880 How fun.
01:58:13.580 Oh, this guy's doing a backflip.
01:58:14.640 I've had a good time.
01:58:15.400 They keep showing this stuff because I think the reality is the algorithm shows people click
01:58:20.520 it no matter what.
01:58:21.300 Yeah.
01:58:21.540 And so it doesn't matter who you are, what you do.
01:58:24.060 Here's, here's something for you.
01:58:25.700 And we think this is, so it's probably the highest click through content.
01:58:29.500 This is why on YouTube in the early days, all the thumbnails were women in bikinis and
01:58:33.340 stuff like this.
01:58:33.880 But I noticed this a year ago, these photos were fake women and it was obviously AI women.
01:58:41.080 And so, you know, I go to the, the explore tab and I get a video of a guy's rollerblade
01:58:45.240 and he does a double backflip.
01:58:46.180 And I'm like, wow, scroll down.
01:58:47.900 And then you'll see in, in the grid, there's a woman.
01:58:51.040 And I'm like, that's CGI.
01:58:53.360 That is an AI generated image.
01:58:55.240 That's crazy.
01:58:56.420 Now you can't tell at all.
01:58:58.480 Like only a little bit.
01:58:59.900 And we're hearing stories about them making like 30 grand a month.
01:59:02.760 That's the big story, that some guys are. So you don't got to worry about your daughters.
01:59:07.440 They can't do porn anymore.
01:59:08.640 That's done.
01:59:09.640 But the brains of, of like young people in society are going to be like in what, 20 years,
01:59:16.040 the next generation, there's going to be a woman and she's going to go to like, she's
01:59:19.420 going to meet a guy and she's going to be like, I have no physical attraction to you
01:59:21.680 whatsoever because I'm into aliens and, you know, stone golems and just weird insert
01:59:28.820 crazy stuff that doesn't exist.
01:59:30.240 And the guy's going to say, I agree.
01:59:31.460 I also have no interest in you.
01:59:32.440 I'm into dolphins and whales.
01:59:34.640 And it's like, they're not, you want a family?
01:59:37.180 Okay.
01:59:37.600 Buy the insemination kit.
01:59:38.540 Cause there's no way they're going to be able to procreate.
01:59:40.700 Maybe that's the plan, I guess.
01:59:42.220 I don't know.
01:59:43.140 But at any rate, as we're getting close to wrapping up, is there anything you guys want to
01:59:46.200 mention or shout out before we close out?
01:59:48.080 No, go to soundinvestigations.com.
01:59:51.340 You can see all of our videos there.
01:59:53.660 Follow Arden on Twitter, Arden Young.
01:59:57.200 Yeah.
01:59:57.580 Thank you for having us on.
01:59:58.580 Thank you so much.
01:59:59.420 Is there, is it like, what can we, what can we look for in the future though?
02:00:01.740 What can people expect?
02:00:02.800 Is there, is there ways you guys are hiring people looking for support or what's going
02:00:06.640 on?
02:00:07.300 More, more undercover videos in the future and hope to be hiring soon.
02:00:13.080 So yeah, definitely watch out for that.
02:00:15.320 We'd love to hire new journalists and get more stories.
02:00:18.960 Right on.
02:00:19.420 What's your, uh, you got a Twitter X account?
02:00:21.640 Yeah.
02:00:21.920 I'm Arden underscore young underscore on Twitter.
02:00:26.140 Right on.
02:00:26.520 And Sound Investigations is, uh, soundinvestig.
02:00:30.280 It ends at the G because that's 13 characters.
02:00:31.880 Right on.
02:00:35.100 Uh, I think it's really cool that, you know, basically we used to have this great industry
02:00:39.720 of undercover journalism and it was considered to be something good historically.
02:00:43.360 Then again, the story of the Mirage Tavern in Chicago surprised me, that back in the seventies
02:00:47.860 they were, like, not about it. But it's cool that, uh, you know, after whatever happened
02:00:51.340 with Veritas, James is still here.
02:00:53.600 You guys popped up.
02:00:54.620 I remember seeing the video and I'm like, this is very Veritas esque.
02:00:57.480 So it would be cool to see more people start to work to, uh, expose corruption.
02:01:01.840 Yeah.
02:01:02.320 And I think, you know, locally finding out, uh, local corruption and malfeasance is going
02:01:09.160 to be huge.
02:01:09.700 So I appreciate you guys coming and hanging out.
02:01:11.700 It's been a blast.
02:01:12.720 Yeah.
02:01:12.860 Uh, everybody who is listening, make sure you subscribe to Tenet Media.
02:01:16.080 The culture war show is, uh, every Friday at 10 AM till noon.
02:01:19.700 And we've got more shows to come conflict, controversy, debates, and cool subject matter.
02:01:24.300 The next show we'll have is going to be Timcast IRL over at youtube.com slash Timcast IRL
02:01:28.780 8 PM.
02:01:29.340 Thanks for hanging out.
02:01:30.040 We'll see you all then.