The Jordan B. Peterson Podcast


460. AI, Internet Scams, and the Balance of Freedom | Chris Olson


Summary

In this episode, we speak with Chris Olson, CEO of the Media Trust Company, which works with large corporations to protect their digital assets. We discuss the perils of criminal activity in the online world, including online pornography, phishing, and other types of online crime; how corporations and governments can work together to combat these perils; and what they can do to prevent them from happening in the first place. And, of course, we talk about cybersecurity, the umbrella term for the broad array of issues involved in protecting the digital assets of corporations, governments, and the general public from criminal harm.


Transcript

00:00:00.000 BetOnline has one of the largest offerings and betting odds in the world.
00:00:03.420 Beyond traditional sports, BetOnline gives you the option to bet on political events
00:00:06.700 like the outcome of the presidential election, whether Hunter Biden serves jail time before
00:00:10.460 2025, or who's going to be the next Republican speaker.
00:00:14.080 Political betting allows you to wager on real-world events outside the realm of sports.
00:00:18.140 Or if you're a diehard sports fan, BetOnline makes sports betting more accessible and convenient
00:00:22.420 than ever before.
00:00:23.660 With just a few clicks, you can place bets on your favorite teams or events from the
00:00:26.740 comfort of your own home.
00:00:27.540 BetOnline prides themselves with their higher-than-average betting limits of up to $25,000, and you
00:00:32.760 can increase your wagering amount by contacting their player services desk by phone or email.
00:00:37.660 So whether you're watching your favorite team or the news surrounding the upcoming election,
00:00:41.020 why not spice things up with a friendly wager at BetOnline?
00:00:43.980 Go to BetOnline.ag to place your bets.
00:00:46.320 Use promo code DAILYWIRE to get a 50% sign-up bonus of up to $250.
00:00:50.700 That's BetOnline.ag.
00:00:52.420 Use promo code DAILYWIRE.
00:00:54.100 BetOnline.
00:00:54.680 The options are endless.
00:00:57.540 Hello, everybody.
00:01:11.260 Today, I have the opportunity to speak with Chris Olson, who's CEO of the Media Trust Company.
00:01:18.380 His company occupies the forefront of attempts to make the online world a safer
00:01:29.800 place.
00:01:30.180 He mostly works with corporations to do that, mostly to protect their digital assets.
00:01:35.100 But I was interested in a more broad-ranging conversation discussing the dangers of online
00:01:42.620 criminality in general.
00:01:44.700 A substantial proportion of online interaction is criminal.
00:01:49.740 That's particularly true if you include pornography within that purview, because porn itself constitutes
00:01:55.600 about 20% to 25% of internet traffic.
00:01:58.840 But there's all sorts of criminal activity as well.
00:02:01.020 And so Chris and I talked about, for example, the people who are most vulnerable to criminal
00:02:06.080 activity, which includes elderly people who are particularly susceptible to romance scams
00:02:12.460 initiated on dating websites, but then undertaken off those sites, and also to phishing scams
00:02:22.640 on their devices that indicate, for example, that something's gone wrong with the device and
00:02:26.980 that it needs to be repaired in a manner that also places them in the hands of criminals.
00:02:31.020 The sick and infirm are often targeted with false medical offers.
00:02:36.700 17-year-old men are targeted with offers for illicit drug purchase.
00:02:43.340 And juvenile girls, 14, 13, that age, who are interested in modeling careers, for example, are
00:02:50.040 frequently targeted by human traffickers.
00:02:52.200 This is a major problem.
00:02:53.980 The vast majority of elderly people are targeted by criminals on a regular basis.
00:02:57.860 They're very well identified demographically.
00:03:01.660 They know their ages.
00:03:02.820 They know where they live.
00:03:04.220 They know a lot about their online usage habits.
00:03:07.680 And they have personal details of the sort that can be gleaned as a consequence of continual
00:03:13.460 interaction with the online world.
00:03:15.540 And so I talked to Chris about all of that and about how we might conceptualize this as a
00:03:22.580 society when we're deciding to bring order to what is really the borderless, the ultimate borderless
00:03:30.080 Wild West community, and that's the hyper-connected and possibly increasingly pathological
00:03:37.980 online world.
00:03:40.620 Join us for that.
00:03:42.960 Well, hello, Mr. Olson.
00:03:44.220 Thank you for agreeing to do this.
00:03:45.740 We met at the presidential prayer breakfast not so long ago, and we had an engaging conversation
00:03:51.680 about the online world and its perils.
00:03:55.020 And I thought it would be extremely interesting for me and hopefully for everyone else to engage
00:04:02.880 in a serious conversation about, well, the spread of general criminality and misbehavior
00:04:08.380 online.
00:04:09.400 And so do you want to maybe start by telling people what you do, and then we'll delve more
00:04:15.160 deeply into the general problem?
00:04:17.300 Great.
00:04:17.620 Yes.
00:04:17.960 And thank you, Jordan.
00:04:18.960 Thanks for having me.
00:04:20.240 I'm the CEO and founder of the Media Trust Company, not intended to be an oxymoron.
00:04:25.520 Our primary job is to help big tech and digital media companies not cause harm when they monetize
00:04:32.640 audiences and when they target digital content.
00:04:35.300 So let's delve into the domains of possible harm.
00:04:39.960 So you're working with large companies.
00:04:42.940 Can you give us who, like, what sort of companies do you work with?
00:04:46.640 And then maybe you could delineate for us the potential domains of harm.
00:04:53.160 Yeah.
00:04:53.340 So I work with companies that own digital assets that people visit.
00:04:58.060 And I think maybe to set a quick premise, cybersecurity is a mature industry designed
00:05:04.480 to monetize the CISO, the chief information security officer, generally protecting machines.
00:05:11.320 So there's a mindset geared to making sure that the digital asset is not harming servers,
00:05:18.440 the company or government data.
00:05:20.900 Our difference is that we're helping companies that are in digital.
00:05:24.160 So think big media companies, we're helping them protect from harming consumers, which
00:05:29.220 is the difference between digital crime, which is going to target people, and cybersecurity,
00:05:34.260 which is generally targeting corporates and governments and machines.
00:05:37.760 So now, does your work involve protection of the companies themselves also against online
00:05:46.120 criminal activity, or is it mostly aimed at stopping the companies themselves from, what
00:05:53.820 would you say, mostly, I suppose, inadvertently harming their consumers in pursuit of their
00:06:00.460 enterprise and their monetization?
00:06:02.660 Yeah.
00:06:02.960 So the great question, and I think that's where the heart of the matter is.
00:06:06.840 So our primary job is to watch the makeup of what targets digital citizens' devices.
00:06:13.240 The internet is made up of roughly 80% third-party code.
00:06:18.060 And what that means is when a consumer is visiting a news website, when they're checking
00:06:22.160 sports scores, when they're visiting social media, the predominance of activity that's
00:06:27.380 running on their machine is coming from companies that are not the owner of the website or the
00:06:33.340 mobile app that they're visiting.
00:06:35.320 That third-party code is where this mystery begins.
00:06:39.200 So who actually controls the impact on the consumer when they're visiting an asset that
00:06:46.040 is mostly made up of source code and content coming from other companies?
00:06:49.780 So our job is to look at that third-party content, discern what is good and bad based on company
00:06:55.620 policies, based on what might be harming the consumer, and then informing those companies what
00:07:01.180 is violating and how they can go about stopping that.
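To make the third-party problem concrete, here is a minimal sketch, assuming Python with the requests and BeautifulSoup libraries, of how one might enumerate the external domains a single page pulls code and content from and compare them against a policy allow-list. The allow-list entries and the URL are hypothetical placeholders, and a static fetch like this only sees the first layer; the real supply chain Olson describes unfolds only after scripts execute in a browser.

```python
# Illustrative sketch only (not the Media Trust's actual tooling): list the third-party
# domains a page asks the browser to load, and flag anything outside a policy allow-list.
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

ALLOWED_THIRD_PARTIES = {"cdn.example-analytics.com", "ads.example-exchange.com"}  # hypothetical policy

def third_party_report(page_url: str) -> dict:
    first_party = urlparse(page_url).hostname
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Collect every external resource the page embeds or executes.
    sources = set()
    for tag, attr in (("script", "src"), ("iframe", "src"), ("img", "src"), ("link", "href")):
        for node in soup.find_all(tag):
            src = node.get(attr)
            if src and src.startswith(("http://", "https://")):
                sources.add(urlparse(src).hostname)

    third_parties = {host for host in sources if host and host != first_party}
    return {
        "first_party": first_party,
        "third_parties": sorted(third_parties),
        "policy_violations": sorted(third_parties - ALLOWED_THIRD_PARTIES),
    }

if __name__ == "__main__":
    print(third_party_report("https://www.example-news-site.com/"))  # hypothetical URL
```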
00:07:04.300 What sort of third-party code concerns might they face or have they faced?
00:07:11.220 What are the specifics that you're looking for?
00:07:14.240 Maybe you could also provide us with some of the more egregious examples of the kinds of
00:07:19.360 things that you're identifying, ferreting out, and attempting to stop.
00:07:25.440 Yeah.
00:07:25.600 So I think putting any digital company into the conversation is critical.
00:07:31.720 So we're talking about tech support scams and romance scams targeting seniors.
00:07:37.080 That is an epidemic.
00:07:39.260 If you're a senior and you're on the internet on a regular basis, you're being attacked,
00:07:43.300 if not daily, certainly every week.
00:07:46.580 That is now a cultural phenomenon.
00:07:48.260 There's movies being produced about the phenomenon of seniors being targeted and attacked online.
00:07:54.040 It's teens.
00:07:55.920 So a 17-year-old male is being bombarded with information on how to buy opioids or other
00:08:01.500 drugs and having them shipped to their house.
00:08:03.520 If you're a 14-year-old female and you're interested in modeling, you're being approached by human
00:08:08.420 traffickers.
00:08:09.660 The sick and infirm are frantically searching the internet for cures.
00:08:14.020 While that's happening, they're having their life savings stolen.
00:08:16.780 So our job is to watch that third-party content and code, which is often advertising.
00:08:23.100 It's basically a real estate play on what keeps the consumer active on the digital asset
00:08:28.700 to find that problem and then give it back to the company.
00:08:34.420 I can jump in quickly in how we go about doing that.
00:08:37.520 So we become a synthetic persona.
00:08:41.080 We've been doing this for not quite two decades, but getting on 19 years.
00:08:45.060 We have physical machines in more than 120 countries.
00:08:49.420 We know how to look like a senior citizen, a teenager, someone with an illness.
00:08:54.400 And then we're rendering digital assets as those personas, acting more or less as a honeypot
00:08:59.540 to attract the problem that's coming through the digital supply chain, which runs on our
00:09:06.260 devices.
00:09:06.860 And I think that's going to be a key part of this conversation as we go.
00:09:09.820 Most of that action is happening with us.
00:09:14.080 And so it's difficult for tech companies and media companies to understand fully what's
00:09:17.620 happening to us.
00:09:19.060 That's the point of their monetization, right?
00:09:21.540 That moment in time.
00:09:22.760 So our job is to detect these problems and then help them make that go away.
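A hedged sketch of the synthetic-persona idea described above, using the Playwright browser-automation library: render a page with a handful of persona-like surface signals and log which third-party hosts deliver content during the session. The persona attributes and URL are hypothetical; a real deployment would also seed cookies, history, and ad-tech identifiers so the ecosystem genuinely classifies the session as, say, an elderly user in a particular region.

```python
# Minimal persona-rendering sketch, assuming Playwright. Only a few surface signals are
# varied here; the point is simply to capture the supply chain that fires for this visitor.
from urllib.parse import urlparse
from playwright.sync_api import sync_playwright

PERSONA = {  # hypothetical persona attributes
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "locale": "en-US",
    "geolocation": {"latitude": 27.99, "longitude": -82.45},  # e.g. a Florida retirement area
}

def render_as_persona(url: str) -> set[str]:
    seen_hosts: set[str] = set()
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        context = browser.new_context(
            user_agent=PERSONA["user_agent"],
            locale=PERSONA["locale"],
            geolocation=PERSONA["geolocation"],
            permissions=["geolocation"],
        )
        page = context.new_page()
        # Log every network request the page triggers, i.e. the supply chain in action.
        page.on("request", lambda req: seen_hosts.add(urlparse(req.url).hostname or ""))
        page.goto(url, wait_until="networkidle")
        browser.close()
    return seen_hosts - {urlparse(url).hostname}

if __name__ == "__main__":
    hosts = render_as_persona("https://www.example-news-site.com/")  # hypothetical URL
    print(f"{len(hosts)} third-party hosts touched this session:")
    for host in sorted(hosts):
        print("  ", host)
```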
00:09:27.880 Right.
00:09:28.120 Okay, so you set yourself up as a replica of the potential target of the scams, and then
00:09:37.100 you can deliver the information that you gather about how someone in that vulnerable position
00:09:43.000 might be interacting with the company's services in question to keep the criminals at bay.
00:09:48.480 Let's go through these different categories of vulnerability to crime that you described.
00:09:55.640 I suspect there's stories of great interest there.
00:09:58.540 So you started with scams directed at seniors.
00:10:01.540 So I've had people in my own family targeted by online scammers who were, in fact, quite successful
00:10:10.140 at making away with a good proportion of their life savings in one case.
00:10:14.260 And I know that seniors in particular who grew up in an environment of high trust, especially
00:10:23.780 with regards to corporate entities, they're not particularly technologically savvy.
00:10:30.440 They're trusting.
00:10:32.100 And then you have the additional complication, of course, in the case of particularly elderly
00:10:38.320 seniors, that their cognitive faculties aren't necessarily all that they once were.
00:10:44.260 And they're often lonely and isolated, too.
00:10:47.140 And so that makes them very straightforward targets for especially people who worm into
00:10:53.100 their confidence.
00:10:55.100 You talked about, was it romance scams on the senior side?
00:10:59.460 It is romance scams on the senior side.
00:11:01.420 Okay, so lay all that out.
00:11:03.540 Tell us some stories and describe to everybody exactly what they would see and how this operates.
00:11:08.600 Okay, so a senior is joining a dating website, just as a teenager or someone in middle age
00:11:16.140 would do.
00:11:16.660 They're looking for romance.
00:11:18.980 There are people on the other side of that that are collecting data on that senior, potentially
00:11:25.380 interacting with them.
00:11:27.020 Once they get enough information on that particular senior, they're going to start to find them
00:11:32.060 in other ways.
00:11:32.900 Send me emails and information.
00:11:34.740 Let's move off the dating site.
00:11:36.820 They're going to start calling them on the phone.
00:11:39.940 As that starts to evolve, it's that information collection, getting them to do certain things
00:11:45.420 online that sucks them deeper and deeper in.
00:11:47.540 From that moment forward, they become very much wed and emotionally oriented towards that person
00:11:53.320 that they're involved with.
00:11:54.820 And the theft goes from there.
00:11:58.400 Right.
00:11:58.900 So you go on a dating website as, say, someone in your 70s.
00:12:05.980 You're lonely and looking for companionship.
00:12:09.620 There are scam artists that are on those dating websites as well who must have what?
00:12:19.300 I suspect they probably have keywords and profiling information that enables them to zero in on
00:12:24.700 people who are likely targets.
00:12:28.960 Do you know how sophisticated is that?
00:12:31.000 Like, do you think that the criminals who are engaged in this activity, how good is their
00:12:35.260 ability to profile?
00:12:36.320 Do you think they can identify such things as early signs of cognitive degeneration?
00:12:41.260 I think this is organized crime and they have their own algorithms and processes to identify
00:12:46.380 people.
00:12:47.700 I also, to your earlier point, people believe what they see on computers.
00:12:52.740 They're following what's being provided to them, which makes them relatively easy marks.
00:12:57.820 So once that process starts, they're reeling them in.
00:13:01.680 If they lose a fish, that's no problem because they're going after so many in any given day.
00:13:07.180 They also have infrastructure in local markets to go deal with people personally.
00:13:12.820 So this is a very large criminal organization that has a lot of horsepower to identify and
00:13:18.640 then attack.
00:13:19.940 Right.
00:13:20.340 Okay.
00:13:20.760 So do you have any sense?
00:13:23.300 See, I hadn't thought about the full implications of that.
00:13:26.040 So obviously, if you were a psychopathic scam artist, posing as a false participant on a dating
00:13:37.160 website would be extremely, potentially extremely fertile ground, not only for seniors who could
00:13:43.680 be scammed out of their savings, but you also mentioned, let's say, younger people who are
00:13:50.080 on the website who might be useful in terms of human trafficking operations.
00:13:57.260 So do you have any sense, for example, of the proportion of participants on a given dating
00:14:05.660 platform that are actually criminals or psychopaths in disguise?
00:14:10.080 Because here, let me give you an example.
00:14:12.160 You undoubtedly know about this, but there was a website, can't remember the name of it,
00:14:18.380 unfortunately.
00:14:20.040 I believe it was Canadian, that was set up some years ago to facilitate illicit affairs.
00:14:27.040 And they enrolled thousands of people, all of whose data was eventually leaked, much of that,
00:14:35.520 to great scandal.
00:14:36.420 For example, the notion was to match people who were married secretly with other people
00:14:41.840 who were married to have illicit affairs.
00:14:44.540 They got an awful lot of men on the website and almost no women.
00:14:49.260 And so they created tens of thousands, if I remember correctly, fake profiles of women
00:14:55.100 to continue to entice the men to maintain what I believe was a monthly fee for the service.
00:15:02.660 Um, Ashley Madison, it was called, right?
00:15:06.080 And so, so obviously.
00:15:10.080 Going online without ExpressVPN is like not paying attention to the safety demonstration
00:15:14.500 on a flight.
00:15:15.400 Most of the time, you'll probably be fine.
00:15:17.600 But what if one day that weird yellow mask drops down from overhead and you have no idea
00:15:22.560 what to do?
00:15:23.200 In our hyper-connected world, your digital privacy isn't just a luxury.
00:15:27.000 It's a fundamental right.
00:15:28.340 Every time you connect to an unsecured network in a cafe, hotel, or airport, you're essentially
00:15:33.320 broadcasting your personal information to anyone with the technical know-how to intercept
00:15:37.500 it.
00:15:37.820 And let's be clear, it doesn't take a genius hacker to do this.
00:15:41.020 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access your passwords,
00:15:46.140 bank logins, and credit card details.
00:15:48.400 Now, you might think, what's the big deal?
00:15:50.500 Who'd want my data anyway?
00:15:52.040 Well, on the dark web, your personal information could fetch up to $1,000.
00:15:55.900 That's right, there's a whole underground economy built on stolen identities.
00:16:00.740 Enter ExpressVPN.
00:16:02.480 It's like a digital fortress, creating an encrypted tunnel between your device and the
00:16:06.500 internet.
00:16:07.180 Their encryption is so robust that it would take a hacker with a supercomputer over a
00:16:11.380 billion years to crack it.
00:16:12.800 But don't let its power fool you.
00:16:14.540 ExpressVPN is incredibly user-friendly.
00:16:16.960 With just one click, you're protected across all your devices.
00:16:20.000 Phones, laptops, tablets, you name it.
00:16:22.100 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
00:16:26.320 It gives me peace of mind knowing that my research, communications, and personal data
00:16:30.460 are shielded from prying eyes.
00:16:32.320 Secure your online data today by visiting expressvpn.com slash Jordan.
00:16:37.060 That's E-X-P-R-E-S-S-V-P-N dot com slash Jordan, and you can get an extra three months free.
00:16:43.560 ExpressVPN.com slash Jordan.
00:16:45.320 A dating website would be wonderful hunting grounds for any kind of predator.
00:16:55.280 And so, do you have any sense of what proportion of the people who are participating on online
00:17:00.960 dating sites are actually predators, criminals?
00:17:04.040 I don't know what percentage of the participants on the sites are predators.
00:17:10.180 But where we come in in our expertise is that everyone that is visiting is giving information
00:17:17.000 into sort of the digital ecosystem.
00:17:20.740 And so, the issue from there is that they're then able to be targeted wherever they go online.
00:17:26.220 So, there's information that's being collected from the site that they're visiting.
00:17:30.540 That is then moving out into the ecosystem so that wherever they go, they're being pulled
00:17:35.260 back and targeted.
00:17:36.000 In an example like in Ashley Madison, a criminal may be able to get the digital information about
00:17:41.880 the people whose data was stolen, come back to them six months later, coming from another
00:17:47.320 website via email or SMS text, and then press the attack at that stage.
00:17:53.620 For us, in becoming a digital persona, our job is to look like someone based on the information
00:18:01.380 that sites have collected about them.
00:18:03.460 So, we look like an 85-year-old grandmother living in a senior community.
00:18:09.080 When you become that type of profile, no matter who else is engaging with you online, the algorithm
00:18:14.680 and the content that is going to be served to you is coming from criminals regardless of
00:18:20.520 their activity on that particular site that you're visiting.
00:18:23.400 It's simply based on who you are.
00:18:25.980 So, artificial intelligence has been in use in digital media and in targeting
00:18:32.400 people since 2010 or 2011.
00:18:34.640 So, the initial use case was collecting data on us.
00:18:38.960 That was the key initial step for AI utilization.
00:18:43.280 The second step was then turning that around and targeting people better, right?
00:18:48.220 So, AI was first used to collect information, right, make things interesting behind the scenes
00:18:54.420 for people.
00:18:55.440 Second, creating better audience segments, which enable that targeting.
00:19:00.020 This third phase that's happening today, you see ChatGPT and the LLMs being used in regular
00:19:06.880 use.
00:19:07.320 The third big stage is writing content on our devices on the fly.
00:19:12.840 So, regardless of where the criminal actor is, regardless of how they're moving into the
00:19:18.340 ecosystem and what initial buying point, they're able to find that person, write content on the
00:19:25.220 fly that's particularly tailored to what the digital ecosystem knows about them to create
00:19:30.480 the situation where they then respond and the criminal activity can occur.
00:19:33.880 Right, and so, what that implies as well then, I suppose, is that we're going to see very
00:19:42.560 sophisticated LLM criminals, right, who will be able to, this is the logical conclusion of
00:19:51.380 what you're laying out, is that they'll be able to engage.
00:19:56.660 Huh, so I just saw a video, it's gone viral, it was released about three weeks ago, that portrayed
00:20:05.460 the newest version of ChatGPT, and it's a version that can see you through the video camera on your
00:20:16.560 phone and can interact with you very much like a person.
00:20:21.300 So, they had this ChatGPT device interacting with a kind of unkempt, nerdy sort of engineer
00:20:33.260 character who was preparing for an interview, a job interview, and the ChatGPT system was coaching
00:20:41.200 him on his appearance and his presentation.
00:20:44.100 And I think they used Scarlett Johansson's voice for the ChatGPT bot.
00:20:51.340 It was very, very flirtatious, very intelligent, extremely perceptive, and was paying attention
00:20:59.180 to this engineer who was preparing his interview like a, what would you say, like the girlfriend
00:21:08.480 of his dreams would, if he had someone who was paying more attention to him than anyone had
00:21:13.380 ever paid attention to him in his life.
00:21:16.140 And so, I can imagine a system like that set up to be an optimal criminal, especially if
00:21:22.420 it was also fed all sorts of information about that person's wants and likes.
00:21:28.260 So, let's delve into that a little bit.
00:21:31.180 How much of a digital footprint do you suppose, like how well are each of us now replicated online
00:21:38.620 as a consequence of the criminal or corporate aggregation of our online behavior?
00:21:45.960 So, the typical senior, for example, how much information would be commonly available
00:21:51.220 to criminal types about, well, the typical senior, the typical person, typical 14-year-old
00:21:57.840 for that matter?
00:21:59.000 Right.
00:21:59.380 The majority of their prior activity that they've engaged in online.
00:22:04.320 So, corporate digital data companies, their job is to know as much about
00:22:12.080 us as possible, and then to target us with information to maximize profit, right?
00:22:17.100 That's the core goal.
00:22:19.120 Criminals have access to that data, and they're leveraging it just like a big brand advertiser
00:22:24.740 would.
00:22:25.400 So, they know it's a grandmother, and they're going to put in something that only runs on the
00:22:29.780 grandmother's device, which makes it very, very difficult for big tech and digital media
00:22:33.620 companies to see the problem before it occurs.
00:22:37.180 I think another thing that's really important to understand is this is our most open border,
00:22:42.700 right?
00:22:42.880 So, we've got an idea of national sovereignty.
00:22:46.520 There's lots of discussion on whether or not our southern border is as secure as it should
00:22:52.140 be.
00:22:52.360 Our actual devices, our cell phones, our televisions, our personal computers are open to source code
00:22:59.040 and information coming from any country, any person at any time, and typically resolved
00:23:05.000 to the highest bidder.
00:23:07.220 Right, right.
00:23:08.480 So, the digital world, the virtual world, is it a lawless frontier?
00:23:16.480 I mean, I guess one of the problems is, it's like, if I'm targeted by a criminal gang in
00:23:20.940 Nigeria, what the hell can I do about that?
00:23:24.580 I mean, the case I mentioned to you of my relative who was scammed out of a good proportion
00:23:30.760 of their life savings, that gang was operating in Eastern Europe.
00:23:36.480 We could more or less identify who they were, but there was really nothing that could be done
00:23:42.540 about it.
00:23:42.980 I mean, these are people who are operating, well, out of any physical proximity, but also
00:23:48.680 even out of, hypothetically, the jurisdiction of, well, say, lawmakers in Canada, police services
00:23:56.420 in Canada.
00:23:57.380 And so, how lawless, how is it, how should we be conceptualizing the status of law in the
00:24:08.200 online and virtual world?
00:24:10.860 Yeah, and I think this is where the major rub is.
00:24:15.700 So, I'm going to walk back and talk about cybersecurity as an industry first.
00:24:20.180 So, cybersecurity is relatively mature.
00:24:22.760 It is now geared to monetizing the chief security officer, the chief information security officer.
00:24:29.380 What that means, it's providing products and services designed to protect what they are
00:24:34.660 paid to hold dear, which is the corporate asset, so the machines and the data for the corporation.
00:24:41.420 If you're part of the government, which is where we're going to go in the conversation,
00:24:45.460 then your job as a CIO or a CISO is to protect government machines.
00:24:50.260 Governments will tell you that they're protecting you, right?
00:24:52.840 They're protecting you from digital harm.
00:24:54.260 What that means today is they're protecting your data on the DMV website.
00:24:59.720 That's basically the beginning and the end of cybersecurity and digital protection.
00:25:04.180 There's legislation which is occurring coming from attorneys general, from state legislatures,
00:25:10.540 from the federal government in the U.S. to a degree.
00:25:13.740 Other countries seem to be further ahead.
00:25:15.440 But seeking to protect people from data collection, and that's your GDPR in Europe.
00:25:22.520 Many states in the United States are putting some rules in place around what corporations
00:25:27.640 can collect, what they can do with the data.
00:25:30.740 The predominant use case is to provide a consumer with an opt-out mechanism.
00:25:35.080 Most consumers say, okay, I want to read the content.
00:25:37.700 They're not doing a whole lot with the opt-out compliance.
00:25:40.520 So, that's not been a big help to your typical consumer.
00:25:45.440 But it's really the mindset that's the problem and the mindset of corporate and government
00:25:50.360 that is at issue.
00:25:52.040 And so, governments need to tactically engage on a 24-7 basis with digital crime in the same
00:25:59.180 way that they're policing the street.
00:26:01.780 So, the metaphor would look like this.
00:26:04.220 If grandmothers were walking down the street and being mugged or attacked at the rate that
00:26:09.320 they're getting hit online, you would have the National Guard policing every street in America.
00:26:15.440 So, the government needs to take a step forward.
00:26:17.860 When I say the governments, that is, governments need to take a step forward and do a better
00:26:23.100 job at policing people tactically.
00:26:25.580 And that does not mean that they're going after big tech or digital media companies.
00:26:29.940 It means that they're protecting people with the mindset that they're going to go ahead
00:26:34.640 and cooperate with the digital ecosystem to do a better job to reduce overall crime.
00:26:40.700 Right.
00:26:41.720 So, your point appears to be that we have mechanisms in place, like the ones that are offered by
00:26:49.620 your company, that protect the corporations against the liability that they would be laden
00:26:59.780 with if the data on their servers was compromised.
00:27:04.280 But that is by no means the same thing as having a police force that's accessible to people,
00:27:11.500 individual people, who are actually the victims of criminal activity.
00:27:14.700 Those aren't the same things at all.
00:27:16.320 It's like armed guards at a safe in a bank compared to police on the street that are designed,
00:27:22.060 that are there to protect ordinary people or who can be called.
00:27:24.800 Is that, have I got that about right?
00:27:26.780 Yes.
00:27:27.060 And digital crime is crime.
00:27:30.020 So, when you're stealing grandmother's money, that is theft.
00:27:35.080 We don't need a lot of new laws.
00:27:36.800 What we need to do is actively engage with the digital ecosystem to try to get
00:27:41.800 in front of the problem to reduce overall numbers of attacks, which reduces the number
00:27:47.200 of victims.
00:27:48.200 And to date, when we think about digital safety, it's predominantly education and then increasing
00:27:54.740 support for victims.
00:27:56.920 Victims are post-attack.
00:27:58.520 They've already had their money stolen.
00:28:01.160 Getting in front of that is the key.
00:28:03.260 We've got to start to reduce digital harm.
00:28:05.780 I've been doing this for a good number of years.
00:28:08.040 And the end of that conversation does reside with local and state governments.
00:28:12.640 And ultimately, the federal government is going to have, in the United States, is going to
00:28:16.140 have to find resources to actively protect beyond having discussions about legislating
00:28:21.040 data control or social media as a problem.
00:28:25.040 Okay.
00:28:25.400 So, I'm trying to wrestle with how this is possible, even in principle.
00:28:29.380 So, now, you said that, for example, what your company does, and we'll get back into
00:28:34.640 that, is produce virtual victims, in a sense, false virtual victims, so that you can attract
00:28:40.640 the criminals, so that you can see what they're doing.
00:28:42.680 So, I presume that you can report on what you find to the company so that they can decrease
00:28:48.000 the susceptibility they have to exploitation by these bad actors.
00:28:53.320 But that's not the same thing as actually tracking down the criminals and holding them
00:28:58.440 responsible for their predatory activity.
00:29:01.500 And I'm curious about what you think about how that's possible, even in principle, is,
00:29:06.800 first of all, these criminals tend to be, or can easily be, acting at a great distance
00:29:12.980 in jurisdictions where they're not likely to be held accountable in any case, even by
00:29:18.560 the authorities, or maybe they're even the authorities themselves.
00:29:22.080 But also, as you pointed out, more and more, it's possible for the criminal activity to
00:29:29.260 be occurring on the local machine.
00:29:32.400 And so, that makes it even more undetectable.
00:29:36.760 So, I don't see, I can't understand easily, you obviously are in a much better position
00:29:43.740 to comment on this, how even in principle, there can be such a thing as, let's say, an
00:29:49.760 effective digital police force.
00:29:51.720 Like, even if you find the activity that someone's engaged in, and you can bring that to a halt by
00:29:59.260 changing the way the data is handled, that doesn't mean you've identified the criminals
00:30:03.620 or held them accountable.
00:30:06.080 So, what, if anything, I can't understand how that can proceed even in principle.
00:30:13.020 Starting a business can be tough, but thanks to Shopify, running your online storefront is
00:30:17.620 easier than ever.
00:30:19.000 Shopify is the global commerce platform that helps you sell at every stage of your business.
00:30:23.280 From the launch your online shop stage, all the way to the did we just hit a million
00:30:27.000 orders stage, Shopify is here to help you grow.
00:30:30.420 Our marketing team uses Shopify every day to sell our merchandise, and we love how easy
00:30:34.560 it is to add more items, ship products, and track conversions.
00:30:38.340 With Shopify, customize your online store to your style with flexible templates and powerful
00:30:42.920 tools, alongside an endless list of integrations and third-party apps like on-demand printing,
00:30:48.160 accounting, and chatbots.
00:30:49.880 Shopify helps you turn browsers into buyers with the internet's best converting checkout, up
00:30:54.340 to 36% better compared to other leading e-commerce platforms.
00:30:58.260 No matter how big you want to grow, Shopify gives you everything you need to take control
00:31:02.100 and take your business to the next level.
00:31:04.660 Sign up for a $1 per month trial period at shopify.com slash jbp, all lowercase.
00:31:10.700 Go to shopify.com slash jbp now to grow your business, no matter what stage you're in.
00:31:15.940 That's shopify.com slash jbp.
00:31:18.300 So, the digital ecosystem is made up of a supply chain, just like every other industry.
00:31:27.200 There are various steps that a piece of content is going to go through before it winds up on
00:31:32.380 your phone.
00:31:33.340 So, it's running through a number of different companies, different cloud solutions, different
00:31:38.700 servers that put content out.
00:31:41.040 Okay, they're intermediaries.
00:31:42.380 And so, a relationship between those digital police with the governments and those entities
00:31:48.040 on a tactical basis is really the first step.
00:31:51.820 Seeing crime and then reporting that back up the chain so that it can be stopped higher and
00:31:57.700 higher up towards ultimately the initiation point of where that content is delivered.
00:32:03.460 So, it seems fantastic, but it is possible.
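One way to picture the "report it back up the chain" step: if a persona scan has logged the sequence of hops that delivered a scam creative, each intermediary can be paired with the hop it handed off, producing an escalation list that walks from the landing page back toward the publisher. The chain below is invented for illustration; real chains are longer and noisier.

```python
# Hedged sketch of escalating a malicious delivery chain back up the supply chain.
from urllib.parse import urlparse

observed_chain = [  # hypothetical hops: publisher -> exchange -> reseller -> scam landing page
    "https://www.example-news-site.com/article",
    "https://ads.example-exchange.com/bid?slot=300x250",
    "https://cdn.example-reseller.net/creative/12345.js",
    "https://tech-support-alert.example-scam.top/call-now",
]

def escalation_report(chain: list[str]) -> list[dict]:
    """Pair each intermediary with the next hop it delivered, walking toward the origin."""
    report = []
    for handed_by, delivered in zip(chain, chain[1:]):
        report.append({
            "notify": urlparse(handed_by).hostname,
            "about_hop": urlparse(delivered).hostname,
        })
    # Reverse so the report reads from the scam landing page back toward the publisher.
    return list(reversed(report))

for entry in escalation_report(observed_chain):
    print(f"notify {entry['notify']}: you delivered content from {entry['about_hop']}")
```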
00:32:07.320 Well, the criminals need to have, they need to use intermediary processes in order to get
00:32:13.440 access to the local devices.
00:32:16.260 And so, you're saying that, I believe that, that those intermediary agencies could be enticed,
00:32:25.520 forced, compelled, invited to make it much more difficult for the criminals to utilize their
00:32:31.280 services.
00:32:31.800 I guess that's it, and that might actually be effective.
00:32:34.420 But does that aid in the identification of the actual
00:32:40.040 criminals themselves?
00:32:41.580 Because, I mean, that's the advantage of the justice system, right?
00:32:44.100 Is you actually get your hands on the criminal at some point.
00:32:47.920 Yes, and I think ultimately, it does.
00:32:51.120 So, you have to start and you have to start to build the information about where it's coming
00:32:56.240 from.
00:32:56.960 You then have to cooperate with the private entities.
00:33:00.420 Our digital streets are managed and made up of private companies.
00:33:04.140 It's not a government-run internet.
00:33:06.200 All of the information that's fed to us, at least in Western society, is coming from these
00:33:10.240 private companies.
00:33:11.720 And so, I think rather than having an antagonistic relationship between governments and private
00:33:16.580 companies where they're trying to legislate to put them into a position, that may be appropriate
00:33:21.560 for certain rules and regulations.
00:33:23.280 It may be appropriate to raise the age of accessing social media from 13 to 16 or 18.
00:33:29.440 And that is a proper place for the government to be legislating.
00:33:33.620 On the other hand, an eye towards reducing crime is critical.
00:33:38.500 And the ethical and moral mindset among all of the parties, and that's governments through
00:33:43.660 our corporations, has to be solely on protecting people.
00:33:49.280 And I think that's something that is significantly missing.
00:33:51.980 It's missing in the legislation.
00:33:54.280 It's missing in cybersecurity.
00:33:56.360 It's not something that we've engaged in as a society.
00:33:59.360 So, there are a few countries, and I think even a few states in the U.S., that are looking
00:34:06.300 at a broader whole-of-society approach.
00:34:08.920 That whole-of-society approach is a mimicking of how the internet and the digital ecosystem
00:34:14.740 works, which is certainly a whole-of-society activity, right?
00:34:18.940 So, it is the thing that influences and affects all of us every single moment of every single
00:34:23.920 day.
00:34:24.860 Engaging in that, looking across the impact of society and doing better via cooperation
00:34:29.920 is a critical, critical next step.
00:34:33.060 How often do you think the typical elderly person in the United States, say, is being
00:34:40.400 successfully—no, is being first communicated with by criminal agents?
00:34:47.120 And then, how often successfully communicated with?
00:34:52.780 What's the scope of the problem?
00:34:55.120 The scope is, if you're a senior citizen, in particular, if you're a female senior citizen,
00:35:00.960 roughly 78 to about 85 years old, we see that 2.5 to 3% of every single page impression
00:35:08.920 or app view is attempting to target you with some form of crime or influence that's going
00:35:15.880 to move you towards crime.
00:35:17.620 So, it is highly, highly significant.
00:35:22.660 In some ways, looking at—this is shooting fish in a barrel to make a dent.
00:35:26.780 So, you're concerned that the legal system isn't going to be able to find the criminals.
00:35:31.360 There is so much to detect and stop and so much room to turn them off quickly, right?
00:35:37.580 That we can gain a significant reduction in digital crime by working together in considering society
00:35:45.260 as a whole instead of the different pockets and how can we legislate or how can we try
00:35:49.540 to move a private company to do better on their own.
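For a rough sense of what a 2.5 to 3% per-impression rate means day to day, here is a back-of-the-envelope calculation that, purely for illustration, treats impressions as independent and assumes a few hypothetical daily browsing volumes; real exposure is not independent, so treat it as an order-of-magnitude sketch, not a measurement.

```python
# Order-of-magnitude arithmetic on the 2.5-3% per-impression figure quoted above.
def chance_of_at_least_one_attack(rate_per_impression: float, impressions: int) -> float:
    return 1 - (1 - rate_per_impression) ** impressions

for rate in (0.025, 0.03):
    for daily_views in (20, 50, 100):  # hypothetical browsing volumes
        p = chance_of_at_least_one_attack(rate, daily_views)
        print(f"rate {rate:.1%}, {daily_views} views/day -> {p:.0%} chance of >=1 attack attempt per day")
```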
00:35:51.860 Okay, so let's—now let's—okay, so we talked a little bit about the danger that's posed by one form of con game
00:36:01.180 in relationship to potential criminal victims, and that was senior romance scams.
00:36:07.400 What are the other primary dangers that are posed to seniors?
00:36:12.000 And then let's go through your list.
00:36:13.740 You talked about 17-year-olds who are being sold online access to drugs.
00:36:17.860 That includes now, by the way, a burgeoning market in under-the-table hormonal treatments
00:36:27.820 for kids who've had induced gender dysphoria.
00:36:32.880 So, you talked about seniors, 17-year-olds who are being marketed illicit drugs,
00:36:38.480 14-year-olds who are being enticed into, let's say, modeling, and people who are sick and infirm.
00:36:43.720 So, those are four major categories.
00:36:45.580 Let's start with the seniors again.
00:36:47.280 And apart from romance scams, what are the most common forms of criminal incursion that you see?
00:36:52.240 The most common form is the tech support or upgrade scam.
00:36:58.100 And essentially, the internet knows that you are a senior.
00:37:01.900 When you're going to a website that you and I would visit,
00:37:04.900 instead of having a nice relationship with that site and reading the content
00:37:08.300 and then moving on to something else,
00:37:10.220 you're getting a pop-up or some form of information that's telling you
00:37:14.300 there's something wrong with your computer.
00:37:16.080 You either need to call a phone number or you need to click a button,
00:37:19.720 which then moves you down to something else that is more significant.
00:37:24.680 This is happening millions and millions and millions of times per day.
00:37:29.080 And it is something that we can all do something about.
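As a purely illustrative heuristic, and not a production detector, the shape of the tech-support scam described above (an alarming message paired with a phone number to call) can be flagged with a couple of regular expressions; real systems combine many more signals and run against the rendered creative rather than raw text.

```python
# Hedged toy heuristic: flag text that pairs an urgency phrase with a call-this-number prompt.
import re

URGENCY = re.compile(r"(virus detected|computer (is )?infected|call (now|immediately)|"
                     r"your (account|device) (has been )?(locked|compromised))", re.I)
PHONE = re.compile(r"\b(\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b")

def looks_like_tech_support_scam(text: str) -> bool:
    return bool(URGENCY.search(text)) and bool(PHONE.search(text))

sample = "WARNING: Virus detected on your device. Call now: (888) 555-0134 to restore access."
print(looks_like_tech_support_scam(sample))                     # True
print(looks_like_tech_support_scam("Your order has shipped."))  # False
```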
00:37:34.000 Attempting to educate seniors to try to not listen to the computer when it's telling you to do something is not working.
00:37:39.620 No, well, no wonder.
00:37:42.520 I mean, look, to manage that, it's so sophisticated because, you know,
00:37:47.960 once you've worked with computers for 20 years, especially if you grew up with them,
00:37:52.320 you know when your computer, your phone is telling you something that's actually valid and when it isn't.
00:37:58.960 A lot of these criminal notifications, they don't even look right.
00:38:03.460 They look kind of amateurish.
00:38:05.360 They don't have the same aesthetic that you'd expect if it was a genuine communication from your phone.
00:38:11.280 But man, you have to know the ecosystem to be able to distinguish that kind of message from the typical thing your phone or any website might ask you to do.
00:38:22.840 And educating seniors, it's not just a matter of describing to them that this might happen.
00:38:28.460 They would have to be tech-savvy cell phone users.
00:38:33.120 And it's hard enough to do that if you're young, you know, much less if you're outside that whole technological revolution.
00:38:40.080 So I can't see the educational approach working.
00:38:43.820 The criminals are just going to outrun that as fast as it happens.
00:38:47.100 So, yeah, so that's pretty.
00:38:48.660 So 3%, eh?
00:38:50.640 That's a lot.
00:38:51.400 That's about what you'd expect.
00:38:53.880 Yeah, it is highly significant.
00:38:56.660 And I think getting in front of this problem requires cooperation with states, moving that tactically to have the idea of a police force looking at digital.
00:39:07.320 And I think one of the things that both sides, whether it's private companies or states, needs to wrap their head around is that there's going to be a cooperative motion to do better with people in mind.
00:39:20.120 Yeah.
00:39:20.560 All right, so let's move to the other categories of likely victim.
00:39:25.040 So you talked about romance scams and also computer upgrade, repair, and error scams for seniors.
00:39:33.200 Is there any other domains where seniors are particularly susceptible?
00:39:37.780 Also, I think what I'd put into context is a lot of the data collection that results in people getting phone calls with a voice copy of their grandchild, right?
00:39:49.440 Which then ultimately is going to result in a scam.
00:39:51.980 It is that digital connection that is the leading point that drives the ability to commit those types of crimes.
00:40:02.080 The ability to marry their grandson or granddaughter's voice with their digital persona and then finding a phone number that they can use to call them.
00:40:11.660 So there's a lot of action happening just in our daily interactions that's ultimately being moved out into the ecosystem that we have to take a look at.
00:40:19.620 Right, well, and then you're going to have that deepfake problem too where those systems that use your grandchild's voice will actually be able to have a conversation with you targeted to you in that voice in real time.
00:40:38.680 And we're probably no more than, well, the technical capacity for that's already there.
00:40:44.320 I imagine we're no more than about a year away from widespread adoption of exactly that tactic.
00:40:49.840 So, you know, I've been talking to some lawmakers in Washington about such things, about protection of digital identity.
00:40:55.760 And one of the notions I've been toying with, maybe you can tell me what you think about this, is that the production of a deepfake, the theft of someone's digital identity,
00:41:08.680 to be used to impersonate them, should be a crime that's equivalent in severity to kidnapping.
00:41:17.720 That's what it looks like to me.
00:41:19.640 You know, because if I can use my daughter's, if I can use your daughter's voice in a real-time conversation to scam you out of your life savings,
00:41:27.720 it's really not much different than me holding her at gunpoint and forcing her to do the same thing.
00:41:32.660 And so, I don't know, like, if you've given some consideration to severity of crime or even classification,
00:41:38.960 but theft of digital identity looks to me something very much like kidnapping.
00:41:43.800 What do you, like, any thoughts about that?
00:41:46.620 Yeah, for me, I would simplify it a little bit.
00:41:49.480 Using Section 230 or the First Amendment to try to claim that the use of our personal identity to do something online when it's a crime doesn't make sense.
00:42:03.660 So, if it's being used, we want to simplify this first.
00:42:07.280 We don't need a broad, broad-based rule on identity necessarily before.
00:42:12.680 We simply state that if someone's using this for a crime, it's a crime, and that that is going to be prosecuted if you're caught and detected,
00:42:22.940 which then goes back to actually catching and detecting that.
00:42:27.220 The way that that ultimately—
00:42:28.240 That uses the preexistent legal framework and doesn't require much of a move.
00:42:33.580 But I'm concerned that the criminals will just be able to circumvent that as the technology develops.
00:42:42.400 And that was why I was thinking about something that might be a deeper and more universal approach.
00:42:48.040 I know it's harder to implement legislatively, but that was the thinking behind it anyway, so.
00:42:55.320 Hey, everyone.
00:42:56.340 Real quick before you skip, I want to talk to you about something serious and important.
00:42:59.980 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:43:06.920 We know how isolating and overwhelming these conditions can be,
00:43:10.220 and we wanted to take a moment to reach out to those listening who may be struggling.
00:43:14.240 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:43:21.960 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:43:28.780 If you're suffering, please know you are not alone.
00:43:32.600 There's hope, and there's a path to feeling better.
00:43:35.800 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:43:41.540 Let this be the first step towards the brighter future you deserve.
00:43:45.140 Yeah, for us, there is a path that leverages that content to bring it to the device.
00:43:56.060 And I think understanding that mechanism and how it's brought forward versus looking at the content,
00:44:01.480 and I'll give you an example of what's happening in political advertising as we speak.
00:44:05.760 Understanding the pathway for how that content is delivered is ultimately how we get back to the criminal or the entity that's using that to perpetrate the crime.
00:44:17.260 The actual creation of the content is incredibly difficult to stop.
00:44:21.460 It's when it moves out to our devices that it becomes something that we need to be really paying attention to.
00:44:27.960 So in political advertising up to October of this past year, our customers asked us to flag the presence of AI source code.
00:44:38.280 So the idea there was they didn't want to be caught holding the bag of being caught being the server of AI-generated political content.
00:44:47.240 By October, we essentially stopped using that policy because greater than 50% of the content that we were scanning had some form of AI.
00:45:07.180 It may have been to make the sun a little more yellow, the ocean a little bit more blue.
00:45:11.540 But using that as a flag to understand what's being delivered out, once you get over 50%, you're looking at more than you're not looking at.
00:45:21.780 That's not a good automated method to execute on digital safety.
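A small sketch of why a binary "contains AI source code" flag stops being useful as a review policy: once the flag fires on the majority of creatives, it no longer separates inventory that needs review from inventory that can be served. The creative list below is invented for illustration.

```python
# Toy illustration of a flag-rate check over scanned ad creatives (data is invented).
creatives = [  # hypothetical scan results: (creative_id, has_ai_markers)
    ("cr-001", True), ("cr-002", True), ("cr-003", False),
    ("cr-004", True), ("cr-005", True), ("cr-006", False),
]

flag_rate = sum(has_ai for _, has_ai in creatives) / len(creatives)
print(f"flag rate: {flag_rate:.0%}")

if flag_rate > 0.5:
    # The flag now covers more inventory than it excludes, so reviewing every flagged
    # item cannot scale; a narrower signal (e.g. deepfake detection on the rendered
    # media) is needed rather than "any AI anywhere in the source code."
    print("policy retired: 'any AI' flag no longer discriminates")
else:
    print("queue for review:", [cid for cid, has_ai in creatives if has_ai])
```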
00:45:26.000 So as we move forward, we have a reasonably sophisticated model to detect deep fakes, very much still in a test mode, but it's starting to pay some dividends.
00:45:38.220 And unquestionably, what we see is using the idea of deep fakes to create fear is significantly greater than the use of deep fakes.
00:45:49.000 Now, that's limited to a political advertising conversation.
00:45:52.000 We're not seeing a lot of deep fake serving in information or certainly not in the paid content side.
00:45:58.060 But the idea of fearing what's being delivered to the consumer is very much becoming part of a mainstream conversation.
00:46:07.840 Yeah, well, wasn't there some insistence from the White House itself in the last couple of weeks that some of the claims that the Republicans were making with regards to Biden were a consequence of deep fake audio?
00:46:24.180 So, not video, I don't think, but audio, if I got that right, does that story ring a bell?
00:46:30.060 And I think where we are at this stage in technology is very likely there is plenty of deep fake audio happening around the candidates.
00:46:37.700 So whether you're Donald Trump or Joe Biden or even local political campaigns, it's really that straightforward.
00:46:45.760 I think on the video side, there are going to be people working on it left and right.
00:46:49.900 I think it's the idea of using that as a weapon to sow some form of confusion among the populace.
00:46:57.400 Some doubt.
00:46:57.980 Some doubt is going to be dramatically more valuable than the actual utilization of deep fakes to move society.
00:47:03.940 Oh, you do, eh?
00:47:05.500 So you do think that even if the technology develops to the point where it's easy to use.
00:47:11.260 So you think that it'll be weaponization of the doubt that's sowed by the fact that such things exist.
00:47:17.740 And we've been watching this for a very, very long time.
00:47:21.700 And our perspective is coming at this from a digital crime and a safety in content.
00:47:28.780 Safety in content typically means don't run adult content in front of children.
00:47:33.200 Don't serve weapons in New York State.
00:47:36.200 They're not going to like that.
00:47:37.960 Don't have a couple walking down the beach in Saudi Arabia, right?
00:47:40.700 Their ministry of media is going to be very unhappy with the digital company that's bringing that kind of content in.
00:47:47.880 Eye of the beholder, safe content, drugs and alcohol, right?
00:47:51.360 Targeting the wrong kinds of people.
00:47:54.320 So we look at this from a lens of how do you find and remove things from the ecosystem?
00:48:01.040 If we continue down the path that we're on today, most people won't trust what they see.
00:48:06.840 And so we're discussing education.
00:48:09.920 They're going to self-evolve to a point where so much of the information that's being fed to them is just going to be disbelieved because it's going to be safer to not go down that path.
00:48:21.500 I'm wondering if live events, for example, are going to become once again extremely compelling and popular because they'll be the only events that you'll actually be able to trust.
00:48:34.180 I think so.
00:48:36.280 I think it's also critical that we find a way to get a handle on kind of the anti-news and get back to
00:48:45.660 the entities promoting trust in journalism. That is a very meaningful conversation and it is something that we need to try to get back to.
00:48:55.560 It's much less expensive to have automation or create something that's going to create some kind of situation where people continue to click.
00:49:03.920 That's a terrible relationship with the digital ecosystem.
00:49:07.560 It's not good for people to have that in their hand.
00:49:10.740 And, you know, with the place where digital crime is today, if you're a senior citizen, your relationship is often net negative with the internet, right?
00:49:21.400 You may want to stick to calling your kids on voiceover IP where you can see their face.
00:49:26.280 Lots of different ways to do that in video calling.
00:49:28.900 But doing other things on the internet, including things as simple as email, it may be more dangerous to engage than any benefit, you know, that you're going to get back.
00:49:39.640 And I think as we move closer to that moment in time, this is where we all need to be picking up and focusing on digital safety, focusing on the consumer.
00:49:47.480 However, I think corporates are going to have to engage on that.
00:49:52.580 Okay, okay.
00:49:53.220 So let me ask you a question about that because one of the things I've been thinking about is that a big part of this problem is that way too much of what you can do on the net is free, free.
00:50:06.920 Now, the problem with free is that, let's take Twitter, for example.
00:50:11.960 Well, if it's free, then it's 20% psychopaths and 30% bots because there's no barrier to entry.
00:50:21.100 And so maybe there's a rule like this: wherever the discourse is free, the psychopaths will eventually come to dominate, and maybe quite rapidly.
00:50:33.500 The psychopaths and the exploiters, because there's no barrier to entry and there's no consequence for misbehavior.
00:50:39.240 So, like, we're putting together a social media platform at the moment that's part of an online university and our subscription price will be something between $30 and $50 a month, which is not inexpensive.
00:50:54.020 Although compared to going to university, it's virtually free.
00:50:56.980 You know, and we've been concerned about that to some degree because it's comparatively expensive for, like, a social media network.
00:51:05.820 But possibly the advantage is that it would keep the criminal players at a minimum, right?
00:51:13.100 Because it seems to me that as you increase the cost of accessing people, you decrease people's ability to do, well, low-cost, you know, multi-person monitoring of the sort that casts a wide net and that costs no money.
00:51:30.320 Right. So, have you, what are your thoughts about the fact that so much of this online pathology is proliferating because when we have free access to a service, so to speak, the criminals also have free access to us?
00:51:47.440 Am I barking up the wrong tree or does that seem, does it mean that the internet is going to become more siloed and more private because of that?
00:51:55.180 I think it's going to go in two ways. So, one, you will find safety in how much money you spend. And that's already true.
00:52:03.840 So, when there are paywalls within even large news sites, the deeper you go into the paywall, the higher the cost to reach the consumer, right?
00:52:13.700 Not just the cost coming from the consumer, but even through advertising and other content producers; and the higher that cost, the lower the activity of the criminal, because it's more expensive for them to do business.
00:52:22.880 So, that is true. That's been true.
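One way to make the paywall point concrete is to treat a scam campaign as a simple cost-versus-return calculation: the criminal keeps buying impressions only while the expected return exceeds the cost of reaching people. The numbers and names in this Python sketch are purely hypothetical assumptions, not figures from this conversation, but they show how raising the cost per impression can flip a campaign from profitable to unprofitable.

    # Illustrative sketch only: hypothetical numbers showing how a higher cost to
    # reach users (for example, inventory behind a paywall) can push a scam
    # campaign below break-even. None of these figures come from the conversation.

    def campaign_profit(cpm_usd, impressions, victim_rate, revenue_per_victim_usd):
        # cpm_usd: cost to serve 1,000 impressions on this inventory
        # victim_rate: fraction of impressions that convert into a successful scam
        cost = cpm_usd * impressions / 1000
        revenue = impressions * victim_rate * revenue_per_victim_usd
        return revenue - cost

    # The same hypothetical campaign on cheap open inventory vs. pricier paywalled inventory.
    open_web = campaign_profit(cpm_usd=1.50, impressions=1_000_000,
                               victim_rate=0.00002, revenue_per_victim_usd=200)
    paywalled = campaign_profit(cpm_usd=25.00, impressions=1_000_000,
                                victim_rate=0.00002, revenue_per_victim_usd=200)

    print(f"open web:  {open_web:+,.0f} USD")   # +2,500: worth running
    print(f"paywalled: {paywalled:+,.0f} USD")  # -21,000: not worth running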
00:52:25.000 Right. Okay. Okay.
00:52:26.080 True throughout. I think the other requirement, because we're very acclimated to having free content, is that the entire supply chain is going to have to engage.
00:52:36.780 So, when you think through who is responsible for the last mile of content that's able to reach our devices inside of our home, right?
00:52:45.080 Is that the big telcos? Is that the companies that are giving us Wi-Fi and bringing data into our houses?
00:52:52.860 Right now, they're putting their hands up and saying it's not our job to understand what happens to you on your device.
00:53:00.280 If anything, there's a data requirement that says we're not allowed to know or we're not allowed to keep track of where you go and what comes onto your device.
00:53:09.500 There's a big difference between monitoring where we go online and what is delivered into our device.
00:53:17.520 And this is missing from the conversation.
00:53:20.640 Privacy is critically important and privacy is about how we engage in our activities on the internet.
00:53:26.900 The other side of that is what happens after the data about us is collected.
00:53:32.400 And that piece is not something that is necessarily private.
00:53:36.220 It should not be broadcast what is delivered to us.
00:53:38.520 But someone needs to understand and have some control over what is actually brought in based on the data that is collected.
00:53:47.080 And that is a whole of society, meaning all of the companies, all of the entities that are part of this ultimate transaction to put that piece of content on our phone and our laptop and our TV need to get involved in better protecting people.
00:54:00.240 One of the primary issues is there are so many events, trillions of events per day on all of our devices, that even when you have paywalls, the problem is so huge that you can always find access to people's machines until we get together and do something better about it.
00:54:18.560 Okay, okay, okay.
00:54:19.920 So paywalls in some ways are a partial solution, but they're, okay, so that's useful to know.
00:54:26.500 Now, do you have, I want to ask you a specific question, then we'll go back to classes of people who are being targeted by criminals.
00:54:35.700 I want to continue walking through your list.
00:54:38.200 Do you have specific legislative suggestions that you believe would be helpful at the federal or state level?
00:54:46.040 And do you, are you in contact with people as much as you'd like to be who are legislators, who are interested in contemplating such legislative moves?
00:54:59.260 I went, the reason I'm asking is because I went to Washington probably the same time I met you, and I was talking to a variety of lawmakers on the Hill there who are interested in digital identity security, but, you know, it isn't obvious that they know what to do because it's, well, it's complicated, you might say.
00:55:21.420 It's extremely complicated, and I think the big tech companies are in some ways in the same boat.
00:55:27.040 So do you already have access to people sufficiently as far as you're concerned who are drawing on your expertise to determine how to make the digital realm a less criminally rife place?
00:55:41.620 I would always like more access.
00:55:43.620 What I find is that the state governments are really where the action is.
00:55:52.520 And when I say that, it's because they're closer to people, right?
00:55:52.520 So the federal government is quite far away from, you know, a grandmother or someone in high school.
00:55:59.980 The state governments know the people who run the hospitals.
00:56:02.500 They know people at senior communities.
00:56:04.680 They understand what's happening on the ground.
00:56:07.640 They're also much closer to, if not managing, the overall police forces, right?
00:56:12.660 That may be down at the county level or in other types of districts, but they understand a day-to-day police force.
00:56:20.000 So I think what we're seeking is to influence states to take tactical action.
00:56:27.900 And if that requires legislation, what that would be is putting funds forward to protect people from digital crime the same way that they're policing, or helping to police, crime against people in their homes, walking down the street, on our highways, in our banks, right?
00:56:46.480 The typical type of crime.
00:56:48.920 We're 20 years in from data collection, data targeting, third-party code, kind of dominating our relationship with our devices.
00:56:57.180 It is the one piece that governments really haven't started to work on a whole lot.
00:57:03.540 The United Kingdom, on the other hand, has three different agencies that have been given the authority to tactically and actively engage with the digital ecosystem.
00:57:13.400 So those are the companies that make up the cloud, that serve advertising and serve content, that build websites and e-commerce systems.
00:57:20.040 They're finding problems, and then they're engaging tactically with that digital supply chain to turn off attacks.
00:57:26.800 It's the beginning, right?
00:57:28.180 Are they doing that in a manner that's analogous to the approach that you're taking, the creation of these virtual victims and the analysis of—
00:57:37.280 I think mostly it's receiving feedback from people that are being targeted and getting enough information about those to then move it upstream.
00:57:48.160 Legislation that would say that what is served to a synthetic persona in a particular local geography counts as a crime, that would be a big leap for governments to take.
00:57:58.000 That would be very, very useful in the ability to go out and actually prosecute.
00:58:02.880 But I think that's going to be a very, very difficult solution.
00:58:06.580 I think the problem must be addressed in cooperation with big tech and digital media, because for a police force in a local market, content is targeted locally.
00:58:18.120 It's geofenced, right?
00:58:19.280 So something is going to be served into the state of Tennessee differently than it is served into New York State.
00:58:24.540 As that information is gathered, it should be given to those who can turn off attacks quickly, which is crime reduction, and then ultimately everyone should be working together so that, if there's certainty that there is a crime, and the companies that are part of the supply chain have information on the actual criminal, they're sharing that in a way that, one, doesn't get them in trouble for sharing the information.
00:58:46.700 But two, they're collectively moving upstream to that demand source that's bringing the content to our device.
00:58:53.280 I think that becomes a natural flow at some point in the future.
00:58:56.600 The faster we get there, the better.
00:58:58.760 And I want to make sure that I'm making this clear, that's not about protecting a machine, that's about protecting the person at the other end of the machine, and keeping that mindset is critical.
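The geofencing point lends itself to a small sketch: the same campaign is observed from decoy profiles pinned to different locales, and a campaign that delivers scam content into one geography but not others gets flagged for escalation to the supply chain. The function and field names below are hypothetical placeholders, not the Media Trust's actual tooling.

    # Hypothetical sketch: compare what one campaign delivers to decoy profiles in
    # different geographies and flag geo-targeted scams for upstream escalation.
    # scan_delivered_content() is a placeholder for whatever instrumentation
    # actually records the content served to a decoy device in a given locale.
    from collections import defaultdict

    def scan_delivered_content(campaign_id, geo):
        # Placeholder: return dicts like {"url": ..., "is_scam": True/False}
        # describing content the campaign served to a decoy located in `geo`.
        raise NotImplementedError

    def flag_geo_targeted_scams(campaign_id, geos):
        scam_hits = defaultdict(list)
        for geo in geos:
            for item in scan_delivered_content(campaign_id, geo):
                if item.get("is_scam"):
                    scam_hits[geo].append(item["url"])
        # A campaign that looks clean in some states but serves scams into others
        # is a strong geofencing signal worth passing upstream to the supply chain.
        return {
            "campaign": campaign_id,
            "scam_geos": dict(scam_hits),
            "escalate": bool(scam_hits),
        }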
00:59:10.920 Right, right.
00:59:11.940 Okay, so let's go back to your list of victims.
00:59:14.380 So you were talking about, you mentioned 17-year-old males who are being offered the opportunity to buy drugs online.
00:59:22.120 So tell us about that market.
00:59:23.740 I don't know much about that at all.
00:59:25.780 And do you have some sense of how widespread, first of all, if the 17-year-old is being targeted to purchase illicit drugs, are they being put in touch with people who actually do supply the illicit drugs?
00:59:40.500 Like, has the drug marketing enterprise been well-established online?
00:59:45.060 And can you flesh that out?
00:59:46.540 What does that look like?
00:59:48.000 Yeah, so this is a place where the biggest tech and digital media companies have done a very good job removing that from digital advertising and targeted content on their pipes.
00:59:58.120 But that is still something that's happening every single day and actually growing, predominantly through social media channels or interactions between the buyer and the person who's going to end up selling the drugs.
01:00:11.060 And the person could be in any country.
01:00:12.780 The drugs are coming through the mail, or it's leading to going to the streets and making a purchase.
01:00:18.500 But what I can give you, if I'm getting these numbers right, is roughly 2,000 deaths from fentanyl or similar drugs in the Commonwealth of Virginia in 2023.
01:00:30.360 And the belief is that greater than 50% of those drug transactions began online.
01:00:36.180 So it is a predominant location for the targeting of people to buy, informing people that the drugs are available, and then ultimately making the sale.
01:00:46.140 Okay, okay, and they break down the demographics in the same way, making the presumption that males of a particular age are in all likelihood the most likely targets.
01:00:58.840 I wonder what other demographic information would be relevant if you were trying to target the typical drug seeker.
01:01:09.340 I don't know.
01:01:10.340 Okay, so then you talked about 14-year-old girls who are being targeted by human traffickers.
01:01:17.440 And you mentioned something about modeling.
01:01:22.120 Yeah, so if there's going to be a core profile, a low-hanging fruit profile for human traffickers,
01:01:29.400 young females that are interested in things like modeling, fashion, presenting themselves out on social media,
01:01:37.660 are going to be a high, high target.
01:01:41.300 Often it's going to be people who are finding their way to become friends with them
01:01:46.360 and using that information psychologically to build a relationship.
01:01:50.140 But there's also the algorithms in place that enable entities to put code on device,
01:01:55.440 which allows them to continue to track them and find them off-platform, off-social media as well.
01:02:00.200 Right, and what's the scope of that problem?
01:02:04.460 Like, you said that it's something in the neighborhood of 3% of the interactions that the typical elderly woman
01:02:12.260 between, you said, I think, 78 and 85, something like that,
01:02:16.100 3% of the interactions they're having online are facilitated by criminals.
01:02:21.280 What's the typical situation for a 14-year-old girl who's been doing a fair bit of fashion shopping online?
01:02:28.420 That is incredibly difficult.
01:02:30.340 We don't have that data.
01:02:31.600 Okay, okay, okay.
01:02:32.680 Incredibly difficult to find, but it's something that's happening on a routine, daily basis.
01:02:38.460 Okay, so it's something for people to be aware of if they have adolescent girls who are interacting online.
01:02:45.180 Yeah, yes, I imagine they're targeted in all sorts of ways.
01:02:48.640 And then you mentioned people who are looking for medical information online,
01:02:52.660 who are sick and infirm,
01:02:54.040 and then obviously in a position to be targeted by scammers in consequence of that.
01:02:59.180 So, and if you want to put it into context of frequency,
01:03:04.540 senior citizens are the highest targeted,
01:03:07.060 and then the next highest targeted segment are going to be those searching for some form of medical solution to a problem.
01:03:13.140 Oh, yeah.
01:03:13.680 Okay, so they're number two.
01:03:15.120 Okay.
01:03:15.740 And that is a heavy desperation moment.
01:03:19.840 They're also often traveling back and forth to health facilities,
01:03:23.400 whether they're in a hospital directly or they're moving back and forth between doctor's offices.
01:03:28.000 That information is made available.
01:03:29.780 You can buy lists of people who visit, you know, health facilities on a routine basis.
01:03:33.880 And then they're very easy.
01:03:35.740 They're desperate.
01:03:36.680 And so they're very easy to suck into a problem.
01:03:40.000 It really ranges from stealing money, so having access to bank accounts,
01:03:45.620 to phishing attacks where you're suggesting that they become part of a program.
01:03:49.540 You're gathering more and more information on them to then do future attacks,
01:03:54.060 to selling scam products.
01:03:55.560 So one of the great phenomena in digital media during COVID,
01:04:00.420 especially in the first maybe nine, ten months,
01:04:03.020 was targeting seniors and then people with any form of illness,
01:04:06.340 explaining to them how COVID is going to do something very, very bad to you,
01:04:10.380 buy this product now.
01:04:12.400 And those were scams.
01:04:13.440 So the product rarely showed up.
01:04:14.980 It certainly wasn't very, very useful.
01:04:17.420 Those may be kind of low, you know, a low-level problem on a per-crime basis.
01:04:22.780 But when you look at it across society, the impact is spectacularly huge.
01:04:28.160 Okay.
01:04:28.600 Okay.
01:04:29.540 Why is the impact spectacularly huge if you look at it in that manner?
01:04:35.960 Well, the numbers add up, right?
01:04:38.520 So they're spending more and more and more money, which is a big, big issue.
01:04:42.320 But it also feeds the mindset of,
01:04:46.860 the computer is going to tell me something.
01:04:49.040 It's going to create some sort of concern within me.
01:04:52.260 If they weren't looking at the computer,
01:04:54.000 it never would have occurred to them to look for the problem in the first place.
01:04:58.220 And so in addition to stealing our money, it's stealing our time.
01:05:01.560 And it's creating a great sense of fear that people are then living with
01:05:07.320 and kind of walking around all day wondering,
01:05:09.100 my computer told me this thing.
01:05:10.860 I'm very concerned about it.
01:05:12.860 It's continuing to feed more information.
01:05:15.060 The more you click, the more afraid you become,
01:05:17.720 which becomes a very, very big impact on society.
01:05:21.140 So, I don't know if you know this, but it's an interesting fact.
01:05:26.620 It's an extremely interesting fact in my estimation.
01:05:31.160 Do you know that sex itself evolved to deal with parasites?
01:05:36.760 I did not.
01:05:38.980 Okay.
01:05:39.460 So here's the idea.
01:05:42.720 I mean, I don't know if there's,
01:05:44.600 there are very few truths that are more fundamental than this one.
01:05:47.500 So, parasites are typically simpler than their hosts.
01:05:55.400 So they can breed faster.
01:06:02.660 And what that means is that in an arms race between host and parasites,
01:06:02.660 the parasites can win because they breed faster.
01:06:05.780 So they can evolve faster.
01:06:07.180 So, sex evolved to confuse the parasites.
01:06:12.360 Imagine that the best way for your genes to replicate themselves
01:06:16.540 would be for you to breed parthenogenetically.
01:06:21.140 You just clone yourself.
01:06:24.280 There's no reason for a sexual partner.
01:06:26.420 When you have a sexual partner, half your genes are left behind.
01:06:29.620 That's a big cost to pay if the goal is gene propagation.
01:06:35.420 The parasite problem is so immense that sexually reproducing creatures,
01:06:39.980 and that's the bulk of creatures that there are,
01:06:42.320 sexually reproducing creatures are willing to sacrifice half their genes
01:06:46.400 to mix up their physiology
01:06:51.260 so that parasites can't be transmitted perfectly from generation to generation.
01:06:56.840 So the parasite problem is so immense
01:06:59.960 that it caused the evolution of sex,
01:07:02.880 and creatures will sacrifice half their genes to prevent it.
01:07:07.080 So what that implies,
01:07:08.680 like we have this whole new digital ecosystem,
01:07:11.600 right, which is a biological revolution for all intents and purposes.
01:07:15.580 It's a whole new level of reality.
01:07:17.220 And the parasite problem is very likely to be overwhelming.
01:07:22.600 I mean, we have police forces,
01:07:25.000 we have laws,
01:07:26.000 we have prisons to deal with parasites in their human form.
01:07:30.200 But now we have a whole new ecosystem
01:07:32.900 that is amenable to the invasion of the parasites,
01:07:36.540 and they are coming like mad,
01:07:38.480 I mean, in all sorts of forms.
01:07:41.140 I mean, we don't even know how extensive the problem is to some degree
01:07:44.800 because there's not just the criminals that you talk about.
01:07:48.440 They're bad enough, or they're bad.
01:07:50.900 But we also have the online troll types
01:07:54.140 who use social media to spread derision
01:07:56.920 and to play sadistic tricks and games
01:07:59.580 and to manipulate for attention.
01:08:01.880 And we know that they're sadistic, psychopathic,
01:08:05.240 Machiavellian, and narcissistic
01:08:06.980 because the psychological data is already in.
01:08:09.460 They fall into the parasite category.
01:08:11.520 And we also have all that quasi-criminal activity like pornography.
01:08:16.820 And so it's certainly possible
01:08:18.820 that if the Internet, in some sense,
01:08:22.240 is a new ecosystem full of new life forms,
01:08:24.700 that it could be swamped by the parasites and taken out.
01:08:28.520 That's what you'd predict from a biological perspective,
01:08:32.300 looking at the history of life.
01:08:34.520 And so this is an unbelievably deep and profound problem.
01:08:38.400 See, I kind of think this is one of the main dangers
01:08:42.380 of this untrammeled online criminality
01:08:44.780 is that societies themselves tend to
01:08:47.400 undergo revolutionary collapse
01:08:51.620 when the parasites get the upper hand.
01:08:54.680 And it's definitely the case
01:08:55.920 that by allowing the unregulated flourishing
01:09:00.900 of parasitical criminals online
01:09:03.920 that we really risk destabilizing our whole society.
01:09:08.400 Because when those sorts of people become successful,
01:09:12.620 that's very bad news for everyone else.
01:09:15.200 It doesn't take that many of them
01:09:16.640 to really cause trouble.
01:09:19.280 So anyways, that's a bit of a segue into...
01:09:21.640 Well, it's pretty fascinating.
01:09:24.680 A couple quick points here.
01:09:26.380 So one,
01:09:28.480 the primary concern for the entities in digital
01:09:32.080 is on their content
01:09:34.160 versus the consumer.
01:09:37.660 So there's content adjacency.
01:09:40.060 The largest flag we have
01:09:41.780 for content that's brought by third parties
01:09:44.540 that's going to run on someone else's content
01:09:46.660 is the Israel-Hamas conflict.
01:09:49.300 The reason for that is less about
01:09:51.220 having a person get upset
01:09:52.760 than it is for having a large brand
01:09:54.980 like Coca-Cola or Procter & Gamble
01:09:56.780 have other content
01:09:58.240 that's going to run near that Israel-Hamas context.
01:10:03.300 Right, just in vicinity.
01:10:04.820 Yeah, and that is worrying about pixels
01:10:08.880 or perhaps the name of a corporation
01:10:10.600 more than the impact on the grandmother, right?
01:10:13.100 Who's going to be hit
01:10:13.760 in the next impression with the crime.
01:10:16.080 And so we're still in a nascent spot
01:10:18.900 within the tech infrastructure
01:10:20.800 where those who would provide the capital
01:10:22.880 to provide us with all of those free services
01:10:25.240 are dominating the conversation.
01:10:27.100 That's part of why a government needs to step in
01:10:29.240 and say we're going to focus on crime.
01:10:32.840 What that also does,
01:10:34.000 getting back to a parasitic evolution,
01:10:36.540 what's the sacrifice
01:10:37.700 that big tech, digital media,
01:10:40.100 and the corporates, the brands,
01:10:41.420 are going to make
01:10:41.980 in order to protect grandmothers?
01:10:45.340 Right now, the bigger concern
01:10:48.120 is about what might be fake
01:10:50.420 because it's wasting a penny
01:10:52.540 or a fraction of a penny
01:10:53.820 when a pixel is delivered to an end device.
01:10:57.100 The spend is about monetizing
01:10:59.380 each individual nanosecond and pixel
01:11:02.880 that's going to run in front of us,
01:11:04.880 versus protecting the consumer.
01:11:07.640 And I think this is an incredibly myopic viewpoint.
01:11:11.220 Digital safety for the brand
01:11:12.640 is about making sure
01:11:13.760 the picture of their product
01:11:15.060 is in a happy location
01:11:16.620 while grandmothers are losing bank accounts.
01:11:19.080 And I think that evolution
01:11:20.560 is going to require a sacrifice.
01:11:22.340 I think the companies
01:11:23.600 that engage in digital safety
01:11:25.520 and many big tech
01:11:26.680 and digital media companies
01:11:27.600 go way out of their way
01:11:29.060 to do a good job protecting people.
01:11:32.180 Ultimately, they're going to win
01:11:33.140 because the relationship with us
01:11:35.140 is going to be so significantly better
01:11:37.340 and protected and trusted
01:11:38.740 that they're just going to wind up
01:11:41.120 interfacing with us
01:11:42.720 better than those
01:11:44.400 who are trying to protect their own.
01:11:46.260 Right, right, right, right, right.
01:11:48.240 Well, that's an optimist.
01:11:49.320 Well, that makes sense to me.
01:11:50.440 That's an optimistic view
01:11:51.720 because, I mean, fundamentally,
01:11:54.300 what makes companies wealthy
01:11:58.160 reliably over the long run
01:12:00.120 is the bond of trust
01:12:02.520 that they have with their customers.
01:12:05.560 Right, that's what brand,
01:12:07.100 that's really what a brand's worth is
01:12:09.380 in the final analysis.
01:12:10.440 I mean, Disney was worth a fortune
01:12:11.860 as a brand because everybody trusted
01:12:13.880 both their products
01:12:16.160 and the intent behind them.
01:12:17.680 And so that's a very hard thing to build up,
01:12:20.620 but it is the basis of wealth.
01:12:22.240 I mean, trust is the basis of wealth.
01:12:25.040 And so it's interesting
01:12:26.960 to contemplate the fact
01:12:28.360 that that means that it might be
01:12:29.820 in the best interests
01:12:30.640 of the large online companies
01:12:33.800 to ensure the safety of the people
01:12:35.800 rather than the safety of their products,
01:12:38.480 the safety of the people
01:12:39.380 who are using their services.
01:12:42.180 That's an interesting thing to think about.
01:12:43.860 Okay, so let me,
01:12:45.180 maybe we can,
01:12:47.680 close at least this part of the discussion
01:12:49.860 with a bit of a further investigation
01:12:51.880 into these virtual personas
01:12:54.940 that you're creating
01:12:56.060 that work as the false targets
01:13:00.700 of criminal activity.
01:13:02.900 Tell me about them
01:13:04.000 and tell me how many of them,
01:13:05.740 approximately, if you can.
01:13:07.300 I don't want to interfere
01:13:08.200 with any trade secrets,
01:13:09.300 but, like, how many,
01:13:12.700 what kind of volume
01:13:13.860 of false personas
01:13:16.780 are you producing
01:13:18.220 to attract criminal activity?
01:13:20.280 And is that something
01:13:21.400 that can be increasingly AI-mediated?
01:13:24.260 Yes.
01:13:24.760 So it is,
01:13:25.980 we use manual processes,
01:13:28.000 but we also use AI
01:13:29.980 and continuous scanning
01:13:31.460 of digital assets
01:13:32.920 to keep those profiles active.
01:13:35.040 So our job
01:13:36.800 isn't so much
01:13:37.740 to become a grandmother
01:13:39.240 to the world,
01:13:40.100 it's to have certain components
01:13:41.440 that enable big tech
01:13:43.620 or ad serving
01:13:45.320 or content delivery
01:13:46.560 to perceive us to be that.
01:13:48.880 And so we're really
01:13:50.480 kind of gaming back
01:13:51.900 to the system
01:13:53.680 to find those objects
01:13:55.620 or those persona
01:13:57.000 kind of classifications
01:13:58.200 on device,
01:13:59.520 whether that's actual phones
01:14:00.640 or televisions
01:14:02.020 or actual computers,
01:14:04.940 and then to run the content
01:14:06.520 with as much of that
01:14:08.240 as possible.
01:14:09.040 So we're running
01:14:09.880 millions of combinations
01:14:11.720 of potential consumers.
01:14:14.840 Some of them
01:14:15.400 are many, many profiles
01:14:16.900 at the same time
01:14:17.840 because it's not going
01:14:19.940 to discern
01:14:20.380 between different activities
01:14:22.120 as long as you have
01:14:22.860 something that leans
01:14:23.680 towards what they're looking for.
01:14:25.520 But then what gets
01:14:26.600 very interesting
01:14:28.100 is that a predominance
01:14:30.140 of the content
01:14:30.860 is an auction model.
01:14:33.160 And so you have to fit
01:14:34.100 within price points
01:14:35.160 of what the criminals
01:14:36.400 are trying to attract as well,
01:14:38.880 which is not always
01:14:40.020 people with a lot of money.
01:14:41.940 It's everyone
01:14:42.760 in the ecosystem.
01:14:44.900 And so we're very much
01:14:46.320 becoming a very nuanced
01:14:47.880 set of personas.
01:14:49.680 Millions of these,
01:14:50.500 a very, very critical component
01:14:51.880 is geography, right?
01:14:53.860 So they're going to target
01:14:55.380 a specific town differently.
01:14:57.340 I don't know if you've had
01:14:58.260 any offers from your government
01:15:00.100 to buy you solar panels.
01:15:01.520 There aren't actually
01:15:02.280 a lot of government programs
01:15:03.300 that are going to pay
01:15:03.840 for your solar panels.
01:15:05.560 Those are typically
01:15:06.120 some forms of scams
01:15:07.660 and they're directed
01:15:08.820 at a local market.
01:15:10.720 What's very interesting
01:15:12.060 right now is that
01:15:13.040 rather than using AI
01:15:14.480 to design content
01:15:15.720 to pull you in better,
01:15:17.900 we're seeing more and more
01:15:18.980 of similar content
01:15:20.180 designed so that it's harder
01:15:21.800 to pull it down
01:15:22.600 once it's made bad, right?
01:15:24.800 So they'll make 30 copies
01:15:25.920 where there used to be
01:15:26.640 one or two.
01:15:27.240 So you can pull down 15, 20
01:15:29.260 and there's still going
01:15:30.260 to be 10 or 15 left.
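A minimal sketch of the decoy-persona idea being described here: combinations of age, interest, and geography signals are enumerated so that targeting systems classify the decoy device as a real consumer, and whatever content is delivered to each persona is recorded and scanned. The attribute lists and function names are simplified assumptions for illustration, not the actual profiles or tooling in use.

    # Hypothetical sketch: enumerate decoy persona combinations (age band, interest,
    # geography) so ad targeting treats the device as a real consumer, then scan
    # whatever is delivered to each persona. All attribute values are illustrative.
    import itertools

    AGE_BANDS = ["14-17", "18-24", "65-74", "75-84"]
    INTERESTS = ["fashion", "medical-info", "retirement-finance"]
    GEOS      = ["Nashville, TN", "Buffalo, NY", "Richmond, VA"]

    def build_personas(limit=None):
        # Yield persona dicts covering the cross product of the signal lists.
        for i, (age, interest, geo) in enumerate(itertools.product(AGE_BANDS, INTERESTS, GEOS)):
            if limit is not None and i >= limit:
                return
            yield {"id": f"persona-{i:05d}", "age_band": age, "interest": interest, "geo": geo}

    def scan_with_persona(persona):
        # Placeholder for driving a real device or browser profile that carries the
        # persona's signals and recording every piece of content delivered to it.
        delivered = []  # in a real system: ads, landing pages, scripts, and so on
        return {"persona": persona["id"],
                "suspicious": [d for d in delivered if d.get("is_scam")]}

    for persona in build_personas(limit=5):
        report = scan_with_persona(persona)
        print(report["persona"], persona["geo"], len(report["suspicious"]), "suspicious items")

As described above, in practice these profiles would also have to fall within the auction price points that criminal campaigns bid for, which is why the persona set has to be nuanced rather than simply affluent-looking.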
01:15:32.140 Right, right, right, right.
01:15:34.780 So what a strange world, eh?
01:15:36.180 Where we have the proliferation
01:15:39.000 of AI-enabled victims,
01:15:44.780 victim decoys
01:15:46.020 to decoy AI-enhanced
01:15:48.940 online criminals from preying.
01:15:52.020 Yes, AI is used for safety
01:15:54.640 to defend us from AI.
01:15:56.900 We have hit that moment.
01:16:00.700 Yeah, well, so then, you know,
01:16:02.360 I was just trying
01:16:03.160 to contemplate briefly
01:16:04.320 what sort of evolutionary
01:16:05.420 arms race that produces, right?
01:16:09.440 Hyper victims
01:16:10.300 and super criminals,
01:16:11.900 something like that.
01:16:13.180 Jesus, weird.
01:16:14.740 Well, you know,
01:16:15.660 there is some worry
01:16:16.780 that ultimately
01:16:17.700 it's a horsepower game, right?
01:16:19.740 So when it's AI versus AI,
01:16:21.500 the more computer horsepower
01:16:22.520 you have,
01:16:23.740 the more likely it is
01:16:25.300 that your AI is going to win.
01:16:26.900 And at the Media Trust,
01:16:30.340 our job is to make
01:16:31.520 the digital ecosystem safer
01:16:32.880 for people.
01:16:34.660 We're not all that concerned
01:16:36.480 about one AI beating another AI,
01:16:39.220 unless that's in context
01:16:40.920 of having a grandmother,
01:16:42.880 not lose her bank account, right?
01:16:44.380 That is the core gist
01:16:46.420 of how we look at it,
01:16:47.280 which is different
01:16:48.180 than an enamored relationship
01:16:50.180 with technology
01:16:51.060 and seeking technology solutions
01:16:53.420 for a technical problem.
01:16:54.700 This is a human issue.
01:16:57.360 And with that,
01:16:58.180 the personas are human reflections
01:17:00.260 back into the delivery of content.
01:17:02.840 It's not about the machine.
01:17:05.160 How are you feeling
01:17:05.960 about your chances of control,
01:17:10.760 or our chances for that matter,
01:17:12.480 of control over online criminality?
01:17:14.480 And how successful
01:17:17.280 do you believe you are
01:17:19.940 in your attempts
01:17:20.900 to stay on top of
01:17:23.640 and ahead of
01:17:25.440 the criminal activity
01:17:26.960 that you're trying to fight?
01:17:28.580 For our customers
01:17:29.820 that prioritize digital safety,
01:17:32.320 the vast majority
01:17:35.600 of what might run through
01:17:36.920 to attack someone
01:17:37.920 is being detected
01:17:39.400 and removed.
01:17:40.820 They need to have
01:17:41.860 the appropriate mindset.
01:17:43.340 They need to be willing
01:17:44.040 to go up to the demand source
01:17:46.380 to remove
01:17:47.100 bad activity
01:17:48.640 that's going to be coming down.
01:17:50.120 You don't just want
01:17:50.780 to play whack-a-mole.
01:17:51.780 You have to engage
01:17:52.660 in that next step.
01:17:55.280 Those that do
01:17:56.140 are very successful
01:17:57.700 and create safe environments.
01:17:59.760 It is not possible
01:18:00.620 to make this go away.
01:18:02.100 The pipes,
01:18:02.720 the way that the internet works,
01:18:04.080 the way the data targeting works,
01:18:05.400 it's just not something
01:18:06.400 you can eliminate entirely.
01:18:08.140 But there are companies
01:18:09.160 that are in front of this
01:18:10.760 that will withhold
01:18:11.960 millions of dollars
01:18:13.460 in revenue
01:18:13.980 at any given moment
01:18:14.920 to prevent
01:18:15.900 the possibility
01:18:17.060 of targeting something
01:18:18.780 and having something bad happen.
01:18:21.360 But there are a lot of companies
01:18:22.680 that are not willing
01:18:23.760 to go that far.
01:18:25.040 I think right now
01:18:26.540 in some of the bigger companies,
01:18:28.560 we see
01:18:29.120 a lot of risk
01:18:31.160 around this question of
01:18:32.300 who's going to win
01:18:33.140 the ChatGPT race,
01:18:34.600 who's going to win
01:18:35.220 the LLM race.
01:18:36.740 There is so much at stake
01:18:38.840 in that from a competitive
01:18:40.960 and revenue perspective.
01:18:43.180 The companies
01:18:44.100 that can monetize
01:18:45.660 that the best
01:18:46.420 are going to start
01:18:47.920 to leap forward.
01:18:49.500 When you're looking
01:18:50.460 at the world
01:18:51.000 from a how does
01:18:51.700 my technology win
01:18:53.060 versus how do I safely
01:18:54.700 get my technology
01:18:55.540 to do the things
01:18:56.320 that I want,
01:18:57.220 that's when you start
01:18:57.840 to run a lot of risk.
01:18:58.900 We're in a risk-on
01:19:00.780 phase in digital
01:19:01.740 right now.
01:19:03.240 Right.
01:19:03.340 But your earlier claim
01:19:05.380 I think
01:19:05.780 which is worth
01:19:06.520 returning to
01:19:07.280 was that
01:19:07.920 over any reasonable
01:19:10.220 period of time,
01:19:11.840 there's the rub,
01:19:13.220 the companies
01:19:13.860 that do what's necessary
01:19:15.520 to ensure
01:19:16.340 the trust, as you say,
01:19:21.520 to ensure
01:19:23.560 that their users
01:19:25.820 can trust
01:19:26.660 the interactions
01:19:27.740 with them
01:19:28.220 are going to be
01:19:28.800 the ones
01:19:29.180 that are arguably
01:19:30.720 best positioned
01:19:31.560 to maintain
01:19:33.820 their economic
01:19:34.480 advantage
01:19:35.020 in the years
01:19:35.520 to come.
01:19:36.640 And I think
01:19:37.260 yes,
01:19:39.500 and those
01:19:40.240 that are willing
01:19:40.720 to engage
01:19:41.420 with governments
01:19:42.380 to do a better job
01:19:44.580 to ultimately
01:19:45.080 find the bad actors
01:19:46.540 and take them down,
01:19:48.640 they're going to be
01:19:49.460 a big part
01:19:50.220 of making the ecosystem
01:19:51.020 better
01:19:51.520 rather than insulating
01:19:52.960 and hiding behind
01:19:53.780 this sort of risk
01:19:54.600 legal regime
01:19:55.660 that's going to not
01:19:56.920 want to bring data
01:19:57.580 forward to clean up
01:19:58.440 the ecosystem.
01:19:59.640 Okay.
01:20:00.140 Okay.
01:20:00.480 Well, for everybody
01:20:02.240 watching and listening,
01:20:04.420 I'm going to
01:20:05.380 continue my discussion
01:20:07.060 with Chris Olson
01:20:07.880 on the Daily Wire
01:20:09.040 side of the interview
01:20:10.980 where I'm going
01:20:12.500 to find out
01:20:13.300 more about,
01:20:14.520 well,
01:20:14.900 how he built
01:20:17.020 his company
01:20:17.600 and how his interest
01:20:18.760 in prevention,
01:20:20.860 understanding
01:20:21.360 and preventing
01:20:21.900 online crime
01:20:22.600 developed
01:20:23.040 and also
01:20:23.600 what his plans
01:20:25.100 for the future are.
01:20:26.740 And so,
01:20:27.100 if those of you
01:20:27.640 who are watching
01:20:28.280 and listening
01:20:28.780 are inclined
01:20:29.880 to join us
01:20:30.640 on the Daily Wire
01:20:31.280 side,
01:20:31.660 that would be
01:20:32.240 much appreciated.
01:20:33.920 Thank you to everybody
01:20:34.760 who is watching
01:20:35.540 and listening
01:20:35.960 for your time
01:20:36.520 and attention
01:20:36.980 and thank you
01:20:37.980 very much,
01:20:38.460 Mr. Olson,
01:20:39.060 for, well,
01:20:40.720 fleshing out
01:20:41.320 our understanding
01:20:42.080 of the perils
01:20:43.760 and possibilities
01:20:45.180 that await us
01:20:47.600 as the internet
01:20:48.820 rolls forward
01:20:49.800 at an ever-increasing rate,
01:20:51.340 and also for,
01:20:52.460 I would say,
01:20:52.900 alerting everybody
01:20:53.660 who's watching
01:20:54.360 and listening
01:20:54.820 to the,
01:20:55.560 what would you say,
01:20:57.340 the particular points
01:20:58.300 of access
01:20:58.900 that the online criminals
01:21:00.820 have at the moment
01:21:01.960 when we're in
01:21:04.720 our most vulnerable
01:21:05.600 states,
01:21:07.800 sick,
01:21:08.400 young,
01:21:10.200 seeking,
01:21:11.340 old,
01:21:11.980 all of those things
01:21:13.040 because we all
01:21:14.380 have people,
01:21:15.140 we all know people
01:21:15.860 who are in those categories
01:21:16.980 and are looking
01:21:19.340 for ways to protect them
01:21:20.600 against the people
01:21:21.380 that you're also
01:21:22.100 trying to protect us from.
01:21:23.520 So,
01:21:24.060 thank you very much
01:21:24.740 for that.
01:21:25.800 Thank you.
01:21:26.200 Thanks for having me.
01:21:28.160 You bet.
01:21:29.060 You bet.
01:21:29.840 And again,
01:21:30.440 thanks to everybody
01:21:31.500 who's watching and listening,
01:21:32.220 to the film crew
01:21:33.260 down here
01:21:33.960 in Chile today
01:21:35.640 in Santiago.
01:21:36.600 Thank you very much
01:21:37.260 for your help today, guys,
01:21:38.420 and to the Daily Wire people
01:21:39.500 for making this conversation
01:21:40.740 possible.
01:21:41.660 That's much appreciated.
01:21:43.120 Thanks very much,
01:21:44.400 Mr. Olson.
01:21:45.120 Good to talk to you.
01:21:46.500 Thank you.