In this episode, we speak with Chris Olson, CEO of The Media Trust, a company that works with large corporations to protect their digital assets. We discuss the perils of criminal activity in the online world, including online pornography, phishing, and other types of online crime, how corporations and governments can work together to combat these perils, and what they can do to prevent them in the first place. And, of course, we talk about cybersecurity, the broad umbrella term covering the issues involved in protecting the digital assets of corporations, governments, and the general public from criminal harm.
00:38:05.360They don't have the same aesthetic that you'd expect if it was a genuine communication from your phone.
00:38:11.280But man, you have to know the ecosystem to be able to distinguish that kind of message from the typical thing your phone or any website might ask you to do.
00:38:22.840And educating seniors, it's not just a matter of describing to them that this might happen.
00:38:28.460They would have to be tech-savvy cell phone users.
00:38:33.120And it's hard enough to do that if you're young, you know, much less if you're outside that whole technological revolution.
00:38:40.080So I can't see the educational approach succeeding on its own.
00:38:43.820The criminals are just going to outrun that as fast as it happens.
00:38:56.660And I think getting in front of this problem requires cooperation with the states, moving tactically toward the idea of a police force focused on the digital realm.
00:39:07.320And I think one of the things that both sides, whether it's private companies or states, need to wrap their heads around is that there's going to be a cooperative effort to do better with people in mind.
00:39:20.560All right, so let's move to the other categories of likely victim.
00:39:25.040So you talked about romance scams and also computer upgrade, repair, and error scams for seniors.
00:39:33.200Are there any other domains where seniors are particularly susceptible?
00:39:37.780Also, I think what I'd put into context is a lot of the data collection that results in people getting phone calls with a voice copy of their grandchild, right?
00:39:49.440Which then ultimately is going to result in a scam.
00:39:51.980It is that digital connection that is the leading point that drives the ability to commit those types of crimes.
00:40:02.080It's the ability to marry their grandson's or granddaughter's voice with their digital persona, and then to find a phone number they can use to call them.
00:40:11.660So there's a lot of action happening just in our daily interactions that's ultimately being moved out into the ecosystem that we have to take a look at.
00:40:19.620Right, well, and then you're going to have that deepfake problem too where those systems that use your grandchild's voice will actually be able to have a conversation with you targeted to you in that voice in real time.
00:40:38.680And we're probably no more than, well, the technical capacity for that's already there.
00:40:44.320I imagine we're no more than about a year away from widespread adoption of exactly that tactic.
00:40:49.840So, you know, I've been talking to some lawmakers in Washington about such things, about protection of digital identity.
00:40:55.760And one of the notions I've been toying with, maybe you can tell me what you think about this, is that the production of a deepfake, the theft of someone's digital identity,
00:41:08.680to be used to impersonate them, should be a crime that's equivalent in severity to kidnapping.
00:41:19.640You know, because if I can use your daughter's voice in a real-time conversation to scam you out of your life savings,
00:41:27.720it's really not much different than me holding her at gunpoint and forcing her to do the same thing.
00:41:32.660And so, I don't know, like, if you've given some consideration to severity of crime or even classification,
00:41:38.960but theft of digital identity looks to me something very much like kidnapping.
00:41:43.800What do you, like, any thoughts about that?
00:41:46.620Yeah, for me, I would simplify it a little bit.
00:41:49.480Using Section 230 or the First Amendment to try to shield the use of our personal identity online, when it's being used to commit a crime, doesn't make sense.
00:42:03.660So, if it's being used, we want to simplify this first.
00:42:07.280We don't necessarily need a broad-based rule on identity first.
00:42:12.680We simply state that if someone's using this for a crime, it's a crime, and that that is going to be prosecuted if you're caught and detected,
00:42:22.940which then goes back to actually catching and detecting that.
00:42:56.340Real quick before you skip, I want to talk to you about something serious and important.
00:42:59.980Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:43:06.920We know how isolating and overwhelming these conditions can be,
00:43:10.220and we wanted to take a moment to reach out to those listening who may be struggling.
00:43:14.240With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:43:21.960He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:43:28.780If you're suffering, please know you are not alone.
00:43:32.600There's hope, and there's a path to feeling better.
00:43:35.800Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:43:41.540Let this be the first step towards the brighter future you deserve.
00:43:45.140Yeah, for us, there is a path that leverages that content to bring it to the device.
00:43:56.060And I think understanding that mechanism and how it's brought forward versus looking at the content,
00:44:01.480and I'll give you an example of what's happening in political advertising as we speak.
00:44:05.760Understanding the pathway for how that content is delivered is ultimately how we get back to the criminal or the entity that's using that to perpetrate the crime.
00:44:17.260The actual creation of the content is incredibly difficult to stop.
00:44:21.460It's when it moves out to our devices that it becomes something that we need to be really paying attention to.
00:44:27.960So in political advertising up to October of this past year, our customers asked us to flag the presence of AI source code.
00:44:38.280So the idea there was they didn't want to be caught holding the bag as the server of AI-generated political content.
00:44:47.240By October, we essentially stopped using that policy because greater than 50% of the content we were scanning had some form of AI.
00:45:07.180It may have been to make the sun a little more yellow, the ocean a little bit more blue.
00:45:11.540But using that as a flag to understand what's being delivered out, once you get over 50%, you're looking at more than you're not looking at.
00:45:21.780That's not a good automated method to execute on digital safety.
00:45:26.000So as we move forward, we have a reasonably sophisticated model to detect deep fakes, very much still in a test mode, but it's starting to pay some dividends.
00:45:38.220And unquestionably, what we see is that using the idea of deepfakes to create fear is significantly more common than the actual use of deepfakes.
00:45:49.000Now, that's limited to a political advertising conversation.
00:45:52.000We're not seeing a lot of deepfakes served in informational content, and certainly not on the paid content side.
00:45:58.060But the idea of fearing what's being delivered to the consumer is very much becoming part of a mainstream conversation.
00:46:07.840Yeah, well, wasn't there some insistence from the White House itself in the last couple of weeks that some of the claims that the Republicans were making with regards to Biden were a consequence of deep fake audio?
00:46:24.180So, not video, I don't think, but audio, if I got that right, does that story ring a bell?
00:46:30.060And I think, where we are at this stage in technology, it's very likely there is plenty of deepfake audio happening around the candidates.
00:46:37.700So whether you're Donald Trump or Joe Biden or even local political campaigns, it's really that straightforward.
00:46:45.760I think on the video side, there are going to be people working on it left and right.
00:46:49.900I think it's the idea of using that as a weapon to sow some form of confusion among the populace.
00:48:09.920They're going to self-evolve to a point where so much of the information that's being fed to them is just going to be disbelieved because it's going to be safer to not go down that path.
00:48:21.500I'm wondering if live events, for example, are going to become once again extremely compelling and popular because they'll be the only events that you'll actually be able to trust.
00:48:36.280I think it's also critical that we find a way to get a handle on the anti-news.
00:48:45.660Getting back to entities promoting trust in journalism, that is a very meaningful conversation, and it is something that we need to try to get back to.
00:48:55.560It's much less expensive to use automation to create some kind of situation where people continue to click.
00:49:03.920That's a terrible relationship with the digital ecosystem.
00:49:07.560It's not good for people to have that in their hand.
00:49:10.740And, you know, with the place where digital crime is today, if you're a senior citizen, your relationship is often net negative with the internet, right?
00:49:21.400You may want to stick to calling your kids on voiceover IP where you can see their face.
00:49:26.280Lots of different ways to do that in video calling.
00:49:28.900But for other things on the internet, including things as simple as email, the danger of engaging may outweigh any benefit you're going to get back.
00:49:39.640And I think as we move closer to that moment in time, this is where we all need to be picking up and focusing on digital safety, focusing on the consumer.
00:49:47.480However, I think corporates are going to have to engage on that.
00:49:53.220So let me ask you a question about that, because one of the things I've been thinking about is that a big part of this problem is that way too much of what you can do on the net is free.
00:50:06.920Now, the problem with free is that, let's take Twitter, for example.
00:50:11.960Well, if it's free, then it's 20% psychopaths and 30% bots because there's no barrier to entry.
00:50:21.100And so maybe there's a rule like this: wherever the discourse is free, the psychopaths will eventually come to dominate, and maybe quite rapidly.
00:50:33.500The psychopaths and the exploiters because there's no barrier to entry and there's no consequence for misbehavior.
00:50:39.240So, like, we're putting together a social media platform at the moment that's part of an online university and our subscription price will be something between $30 and $50 a month, which is not inexpensive.
00:50:54.020Although compared to going to university, it's virtually free.
00:50:56.980You know, and we've been concerned about that to some degree because it's comparatively expensive for, like, a social media network.
00:51:05.820But possibly the advantage is that it would keep the criminal players at a minimum, right?
00:51:13.100Because it seems to me that as you increase the cost of accessing people, you decrease people's ability to do, well, low-cost, you know, multi-person monitoring of the sort that casts a wide net and that costs no money.
00:51:30.320Right. So, have you, what are your thoughts about the fact that so much of this online pathology is proliferating because when we have free access to a service, so to speak, the criminals also have free access to us?
00:51:47.440Am I barking up the wrong tree or does that seem, does it mean that the internet is going to become more siloed and more private because of that?
00:51:55.180I think it's going to go in two ways. So, one, you will find safety in how much money you spend. And that's already true.
00:52:03.840So, when there are paywalls within even large news sites, the deeper you go into the paywall, the higher the cost to reach the consumer, right?
00:52:13.700Not just the cost coming from the consumer, but even for advertising and other content producers: the higher the cost, the lower the activity of the criminal, because it's more expensive for them to do business.
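[Editor's note: the cost logic here can be sketched as a toy expected-value model. Every number is hypothetical, chosen only to illustrate how raising the cost of reaching a consumer can flip a scam campaign from profitable to unprofitable.]

```python
# Toy expected-value sketch: a scam campaign pays off only while the cost
# of reaching each user stays below the expected take per user.
# All figures are hypothetical, for illustration only.

def campaign_profit(cost_per_user, users, hit_rate, take_per_hit):
    """Expected profit: scam revenue minus the cost of reaching users."""
    return users * (hit_rate * take_per_hit - cost_per_user)

# Free, ad-supported surface: reaching a user costs almost nothing.
open_web = campaign_profit(cost_per_user=0.001, users=1_000_000,
                           hit_rate=0.0001, take_per_hit=500)
# Deep behind a paywall: each impression costs real money.
paywalled = campaign_profit(cost_per_user=0.25, users=1_000_000,
                            hit_rate=0.0001, take_per_hit=500)

print(open_web)   # positive: the scam pays on the free surface
print(paywalled)  # negative: the same scam loses money behind the paywall
```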
00:52:26.080True throughout. I think the other requirement, because we're very acclimated to having free content, is that the entire supply chain is going to have to engage.
00:52:36.780So, when you think through who is responsible for the last mile of content that's able to reach our devices inside of our home, right?
00:52:45.080Is that the big telcos? Is that the companies that are giving us Wi-Fi and bringing data into our houses?
00:52:52.860Right now, they're putting their hands up and saying it's not our job to understand what happens to you on your device.
00:53:00.280If anything, there's a data requirement that says we're not allowed to know or we're not allowed to keep track of where you go and what comes onto your device.
00:53:09.500There's a big difference between monitoring where we go online and what is delivered into our device.
00:53:17.520And this is missing from the conversation.
00:53:20.640Privacy is critically important and privacy is about how we engage in our activities on the internet.
00:53:26.900The other side of that is what happens after the data about us is collected.
00:53:32.400And that piece is not something that is necessarily private.
00:53:36.220It should not be broadcast what is delivered to us.
00:53:38.520But someone needs to understand and have some control over what is actually brought in based on the data that is collected.
00:53:47.080And that is a whole of society, meaning all of the companies, all of the entities that are part of this ultimate transaction to put that piece of content on our phone and our laptop and our TV need to get involved in better protecting people.
00:54:00.240One of the primary issues is there are so many events, trillions of events per day on all of our devices, that even when you have paywalls, the problem is so huge that you can always find access to people's machines until we get together and do something better about it.
00:54:19.920So paywalls in some ways are a partial solution, but they're, okay, so that's useful to know.
00:54:26.500Now, do you have, I want to ask you a specific question, then we'll go back to classes of people who are being targeted by criminals.
00:54:35.700I want to continue walking through your list.
00:54:38.200Do you have specific legislative suggestions that you believe would be helpful at the federal or state level?
00:54:46.040And do you, are you in contact with people as much as you'd like to be who are legislators, who are interested in contemplating such legislative moves?
00:54:59.260I went, the reason I'm asking is because I went to Washington probably the same time I met you, and I was talking to a variety of lawmakers on the Hill there who are interested in digital identity security, but, you know, it isn't obvious that they know what to do because it's, well, it's complicated, you might say.
00:55:21.420It's extremely complicated, and I think the big tech companies are in some ways in the same boat.
00:55:27.040So do you already have access to people sufficiently as far as you're concerned who are drawing on your expertise to determine how to make the digital realm a less criminally rife place?
00:55:43.620What I find is that the state governments are really where the action is.
00:55:49.260And when I say, and they're closer to people, right?
00:55:52.520So the federal government is quite far away from, you know, a grandmother or someone in high school.
00:55:59.980The state governments know the people who run the hospitals.
00:56:02.500They know people at senior communities.
00:56:04.680They understand what's happening on the ground.
00:56:07.640They're also much closer, if not managing overall police forces, right?
00:56:12.660So that may be down at the county level or other types of districts, but they understand a daily police force.
00:56:20.000So I think what we're seeking is to influence states to take tactical action.
00:56:27.900And if that requires legislation, what that would be is putting funds forward to protect people from digital crime the same way that they're policing, or helping to police, crime against people in their homes, walking down the street, on our highways, in our banks, right?
00:56:48.920We're 20 years in from data collection, data targeting, third-party code, kind of dominating our relationship with our devices.
00:56:57.180It is the one piece that governments really haven't started to work on a whole lot.
00:57:03.540The United Kingdom, on the other hand, has three different agencies that have been given the authority to tactically and actively engage with the digital ecosystem.
00:57:13.400So those are the companies that make up the cloud, that serve advertising and serve content, that build websites and e-commerce systems.
00:57:20.040They're finding problems, and then they're engaging tactically with that digital supply chain to turn off attacks.
00:57:28.180Are they doing that in a manner that's analogous to the approach that you're taking, the creation of these virtual victims and the analysis of—
00:57:37.280I think mostly it's receiving feedback from people that are being targeted and getting enough information about those to then move it upstream.
00:57:48.160Legislation saying that a crime against a synthetic persona in a particular local geography counts as a crime, that would be a big leap for governments to take.
00:57:58.000That would be very, very useful in the ability to go out and actually prosecute.
00:58:02.880But I think that's going to be a very, very difficult solution.
00:58:06.580I think the problem must be addressed in cooperation with big tech and digital media, and that, as with a police force in a local market, content is targeted locally.
00:58:19.280So something is going to be served into the state of Tennessee differently than it served into New York State.
00:58:24.540As that information is gathered, it should be given to those who can turn off attacks quickly, that is crime reduction, and then ultimately be working together where if there's certainty that there is a crime, and the companies that are part of the supply chain have information on the actual criminal, that they're sharing that in a way that, one, they're not getting in trouble for sharing the information.
00:58:46.700But two, they're collectively moving upstream to that demand source that's bringing the content to our device.
00:58:53.280I think that becomes a natural flow at some point in the future.
00:58:58.760And I want to make sure that I'm making this clear, that's not about protecting a machine, that's about protecting the person at the other end of the machine, and keeping that mindset is critical.
00:59:25.780And do you have some sense of how widespread, first of all, if the 17-year-old is being targeted to purchase illicit drugs, are they being put in touch with people who actually do supply the illicit drugs?
00:59:40.500Like, has the drug marketing enterprise been well-established online?
00:59:48.000Yeah, so this is a place where the biggest tech and digital media companies have done a very good job removing that from digital advertising and targeted content on their pipes.
00:59:58.120But that is still something that's happening every single day, and actually growing, predominantly through social media channels or interactions between the buyer and the person who's going to end up selling the drugs.
01:00:11.060And the person could be in any country.
01:00:12.780It's coming through the mail, or it's leading to the streets to make a purchase.
01:00:18.500But what I can give you, if I'm going to get these numbers right, roughly 2,000 deaths from fentanyl or similar drugs in the Commonwealth of Virginia in 2023.
01:00:30.360And the belief is that greater than 50% of those drug transactions began online.
01:00:36.180So it is a predominant location for the targeting of people to buy, informing people that the drugs are available, and then ultimately making the sale.
01:00:46.140Okay, okay, and they break down the demographics in the same way, making the presumption that males of a particular age are, in all likelihood, the most likely targets.
01:00:58.840And using, I wonder what other demographic information would be relevant if you were trying to target the typical drug seeker.