In this episode of the podcast, we sit down with computer scientist and cryptographer Dr. Hans von Mises to talk about privacy, cryptography, and the importance of privacy in the modern age. Hans has dedicated his life to preserving privacy, and in doing so, he is challenging the authority of the most powerful people on the planet.
00:01:11.960What is privacy and why is it important?
00:01:14.860So I believe that privacy is core to freedom at the end of the day.
00:01:21.860I would even go as far as saying that it is synonymous with freedom.
00:01:25.740And it is protecting you, protecting your inner core essentially, protecting your identity as a human being from forces that don't want you to be an individual and a human being at the end of the day.
00:01:46.700I think what it really boils down to is, and in that regard, I think privacy is relatively similar to what was originally intended also with the Second Amendment in the United States.
00:02:01.920It is a tool for you as a human being to protect yourself against coercive force, against your very soul, your inner core.
00:02:11.800So there are forces, and this has always been true at every time in history, that seek to make people less human, to turn human beings into slaves or animals or objects.
00:02:24.180And privacy is the thing that prevents that.
00:02:27.240So the crazy principle that exists within this universe is that there's this asymmetry baked right into the very fabric that we exist in.
00:02:39.000There's certain mathematical problems where the effort required to undo them isn't just scaling linearly or exponentially,
00:02:47.760but it scales so violently that the universe itself prohibits anyone who doesn't have access, doesn't have permission to undo this mathematical problem, from doing so; they literally cannot.
00:02:59.100So what that means is that with a very little amount of energy, a minuscule amount of energy, a laptop, a battery, and a few milliseconds of computation,
00:03:09.600you can create a secret that not even the strongest imaginable superpower on Earth is able to recover without your explicit granting of access.
00:03:22.380That is the fundamental principle on top of which encryption, cryptography, and privacy in the modern age are built.
00:03:30.380And it's so fascinating that the universe itself allows for this computational asymmetry, where I can create a secret, I can encrypt something, I can make something hidden,
00:03:41.460and you with the most powerful imaginable coercive force, violence, you could imagine continent-sized computers running for the entire lifespan of the universe,
00:03:53.980you would not be able to apply that force to my secret because I have encrypted it and the universe inherently sort of smiles upon encryption and appreciates that.
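The asymmetry described here can be made concrete with a back-of-the-envelope sketch. This is only an illustration: the Landauer-limit and solar-output figures below are rough textbook values, and the comparison ignores everything except raw energy accounting.

```python
import math
import secrets

# Creating a secret: microseconds of laptop time and a few bytes of entropy.
key = secrets.token_bytes(32)  # 256 random bits

# Undoing it without the key: search a keyspace of 2^256 candidates.
keyspace = 2 ** 256

# Landauer's limit: any irreversible bit operation costs at least k*T*ln(2)
# joules; at room temperature (~300 K) that is about 2.9e-21 J per operation.
joules_per_op = 1.38e-23 * 300 * math.log(2)
attack_energy = keyspace * joules_per_op

# Rough total energy the Sun emits over ~10 billion years (~3.8e26 W output).
sun_lifetime_output = 3.8e26 * 1e10 * 365.25 * 24 * 3600

print(f"energy to enumerate 2^256 keys: ~{attack_energy:.1e} J")
print(f"Sun's lifetime energy output:   ~{sun_lifetime_output:.1e} J")
print("brute force feasible:", attack_energy < sun_lifetime_output)
```

Even at the thermodynamic minimum, enumerating the keyspace costs many orders of magnitude more energy than the Sun will ever emit, which is the "continent-sized computers" point in concrete numbers.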
00:04:05.280So I always found that so intoxicating, this concept that this is inherently baked into the universe,
00:04:12.200it is an interaction between mathematics and physics sort of, and is a fundamental property just like you could say nuclear weapons are a fundamental property of reality, right?
00:04:25.440And so encryption and privacy exist in this reality, and before we as humans figured it out, that wasn't necessarily clear, right?
00:04:36.860It could also be that you can never hide something, encrypt something, keep something to yourself, but it turns out you actually can.
00:04:47.740And what it conceptually allows you to do is to take something and move it into a different realm, the encrypted realm, right?
00:04:56.980And if someone else wants to go into that realm, follow you there, they would need unlimited resources to do so.
00:05:06.700And I would say that's what really got me into cryptography and privacy.
00:05:11.460Okay, I'm having all kinds of realizations simultaneously, first of all, that you're an extraordinary person.
00:05:18.180I think that's clear after listening for just three minutes.
00:05:21.100Okay, who are you, where are you from, and are you ready to suffer for your ideas?
00:05:29.140Because what you've just articulated is the most direct, subtle, but direct possible challenge to global authority anyone could ever articulate.
00:05:39.200But first, how did you come to this, where are you from?
00:05:42.360Tell us about yourself for just a moment.
00:05:44.580So I was born in Germany, I'm 25 years old, and I originally actually, in my life I studied law, and then later I studied mathematics and computer science.
00:06:01.360And then, at some point, I met a few people who also had these kinds of ideas about privacy, technology, distributed technology, decentralization.
00:06:13.540And we then decided to found a company that builds this kind of technology.
00:06:18.580And that's how I ended up here, I guess.
00:06:20.760So you're German, you're a product of Europe and European culture, which, for all of its wonderful qualities, is not a privacy culture.
00:06:30.320It built the world, and I love Europe and the culture, but it's not a privacy culture.
00:06:44.100If you view privacy as this inherent political thing that protects you as a human being, there is data protection laws, GDPR, right?
00:06:57.500There's fines against surveillance-capitalist tech giants in Europe.
00:07:01.740But as you said, I feel like most of that stuff is a charade.
00:07:06.300It's not really about protecting your privacy.
00:07:10.900And we are seeing that in the UK, in the European Union.
00:07:14.700I mean, there are so many cases that have already made some significant movements this year.
00:07:20.940So I would say for me personally, it has really been this technological and mathematical understanding of the power of this technology.
00:07:33.360So realizing this, realizing that the universe allows us to do these things, and the universe sort of has this built right into it, got me so fascinated that I really thought deeply about this.
00:07:49.840And what I realized, sort of, is that what humans have done in the past is that they've allowed information, right,
00:07:57.740any type of information that we now share with our mobile surveillance devices,
00:08:01.760to be encrypted and put at rest somewhere securely, right?
00:08:08.400That is how encryption has mainly been used.
00:08:11.280Or to do things like Signal is doing, where we do end-to-end encrypted messaging, right?
00:08:17.560Where we are able to send some message from one human to another human being via some untrusted channel, right?
00:08:27.000Where there can be interceptors that try to get those messages.
00:08:30.300But thanks to mathematics, we are able to send this message across the whole universe.
00:08:35.780And it arrives at the endpoint with no intermediary being able to take a look at the message because of this inherent property of the universe.
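The lock-and-unlock idea can be sketched with the simplest possible cipher, a toy one-time pad. This is not Signal's actual protocol (Signal uses X3DH key agreement and the Double Ratchet); it only illustrates how two endpoints with a shared secret can use an untrusted channel safely.

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR each message byte with a random key byte.
    Secure only if the key is truly random, at least as long as the
    message, and never reused."""
    assert len(key) >= len(message)
    return bytes(m ^ k for m, k in zip(message, key))

otp_decrypt = otp_encrypt  # XOR is its own inverse

# The two endpoints share a random key; the channel only ever carries ciphertext.
key = secrets.token_bytes(64)
plaintext = b"meet me at noon"
ciphertext = otp_encrypt(plaintext, key)

# An interceptor on the untrusted channel sees only random-looking bytes,
# while the intended endpoint recovers the message exactly.
assert otp_decrypt(ciphertext, key) == plaintext
```

Real end-to-end encryption replaces the pre-shared pad with key agreement and authenticated encryption, but the property shown is the same: intermediaries relay the ciphertext without being able to read it.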
00:08:44.140What I realized sort of has been that there's a missing piece, which is whenever we are accessing this information, whenever we are interacting with this information, whenever we want to utilize it, basically, we have to decrypt it again.
00:08:59.160Which then makes it accessible to whoever takes a look at it, right?
00:09:04.420Whoever runs the machine that you decide to put that data on, which can be AWS, which can be cloud providers, big data, big AI, whoever, right?
00:09:14.960And so this idea that I had was, what if we can take this asymmetry that is a fact of reality and move that to computation itself to enable that all of those computations can be executed in private as well.
00:09:31.540And then we can do some amazing things.
00:09:34.060Then the two of us can decide to compute something together, not just exchange information via some secure communication channel, but actually perform some mathematical function over something, produce an output from some inputs.
00:09:48.060But we can keep those inputs to ourselves.
00:09:51.000So Tucker has a secret, Yannick has a secret, and with this technology, we can produce some value, some information, while you don't have to share your secret, I don't have to share my secret, and we can scale that to enormous sizes where the entirety of humanity can do those things, where countries can do those things.
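The "Tucker has a secret, Yannick has a secret" idea is the core of secure multi-party computation. A minimal sketch of its simplest building block, additive secret sharing, is below; this toy version (with made-up parameters) is not the actual protocol being described, just the underlying trick.

```python
import secrets

P = 2**61 - 1  # a public prime modulus; all arithmetic is mod P

def share(secret: int) -> tuple[int, int]:
    """Split a secret into two random-looking shares that sum to it mod P."""
    r = secrets.randbelow(P)
    return r, (secret - r) % P

# Each participant holds a private number.
tucker_secret, yannick_secret = 42, 1337

# Each splits their input, so each party holds one share of every input.
t1, t2 = share(tucker_secret)
y1, y2 = share(yannick_secret)

# Party 1 holds (t1, y1), party 2 holds (t2, y2); each share alone is
# uniformly random, so neither party learns the other's input.
partial_1 = (t1 + y1) % P
partial_2 = (t2 + y2) % P

# Combining only the partial sums reveals the result, never the inputs.
result = (partial_1 + partial_2) % P
print(result)  # 1379 == 42 + 1337
```

Scaling this from a toy sum to arbitrary programs and many participants is what general MPC protocols do.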
00:10:10.600But importantly, at its core, what we're doing is we're implementing this asymmetry that exists within the universe, and bringing that to the next level, to the final form, sort of.
00:10:21.960And that's how I ended up founding Archeum, yeah.
00:12:35.260I can't think of a more virtuous project.
00:12:37.980When you said it in the first minute, the point of the project is to preserve humanity, to keep human beings human.
00:12:47.440They're not just objects controlled by larger forces.
00:12:59.420Can you be more specific about our current system and how it doesn't protect privacy?
00:13:05.560Yes, so I think there are a lot of things to unravel.
00:13:15.980If we take a look at the systems that we are interacting with every single day, those tools and applications, those social media networks, basically everything in our digital lives, and all of our lives have basically shifted from physical reality to this digital world.
00:13:37.580So everything we do, everything we do in this room, everything we do when we are out in the street, because all of that technology has become part of physical reality, has been consumed by it, sort of.
00:13:50.520And so all of this has been built on top of what the former Harvard professor Shoshana Zuboff has called surveillance capitalism, right?
00:13:59.540And I think that that really lies at the core.
00:14:01.940And it's relatively straightforward to understand what those companies are doing if you ask yourself, hey, why is this application that I'm using actually free, right?
00:14:14.080Why is nobody charging me to ask this super intelligent chatbot questions every day?
00:14:22.080Why are they building data centers for trillions of dollars while I don't have to pay anything for it, right?
00:14:28.640So that's the question that you need to ask yourself, right?
00:14:31.340And what you end up realizing is that all of those systems are basically built as rent extraction mechanisms, where you're not really a user, you're sort of a subject of those platforms, being used to extract value without you noticing.
00:14:54.860And they're able to extract value from you because all of your behavior, all of your interactions with those systems are being taken and they perform mass surveillance, bulk surveillance.
00:15:11.940We're not even talking about intelligence or governments or anything.
00:15:15.380We're just talking about those companies that exist within our economy.
00:15:19.820And so they record everything they can because every single bit of information that I can take from your behavior allows me to predict your behavior.
00:15:29.560And where I can predict your behavior, I can utilize that to, in the most simple case, do something like serving you ads, right?
00:15:37.380But in more complex cases, I can do things like I can steer your behavior.
00:16:05.600But at the same time, sort of nowadays, it has transformed into one of the biggest threats to human civilization.
00:16:16.320At the user level, at my level, the level of the iPhone owner, is it possible to communicate privately with assurance of privacy with another person?
00:16:29.760So we start with this concept of insecure communication channels.
00:16:34.320And since every communication channel is insecure, what we employ is end-to-end encryption.
00:16:42.360And end-to-end encryption allows us to take this information, take a message, and lock it securely so that only Tucker and Yannick are able to unlock them and see what's going on.
00:16:57.360So there have been many cases where big players with big interests, I guess, have attempted to undermine cryptography, have attempted to get rid of end-to-end encryption, to install backdoors.
00:17:11.120There has been what is commonly called the crypto wars in the 1990s, right?
00:17:16.380Where the cypherpunks fought for the right to publish open source encryption and cryptography.
00:17:26.400But at the end of the day, I would say, as a realistic assessment, this kind of cryptography is secure and it works.
00:17:32.660Now, that, unfortunately, is not the whole answer because what you have to think about is, now, what happens with those end devices, right?
00:17:41.880I mean, the message, the messenger, right, that is being sent from Yannick to Tucker might be secure.
00:17:47.880But now, if I cannot undermine and apply force to this message to understand what's inside, well, I'm just going to apply force to your phone.
00:18:01.220So, when we look at different applications, for sure, there is a whole variety of applications, messaging applications, right, that do not employ encryption and security standards and might collect all of your messages and images and utilize them for those machines, right, that extract as much value as possible from you.
00:18:26.400But there's applications like Signal that don't do that, that are actual open source cryptography technology that anyone can verify themselves and take this code and turn it into an actual application, install it on your phone.
00:18:40.520All of those things are possible, right?
00:19:13.980But also because people make mistakes, right?
00:19:16.300Honest mistakes that are non-malicious.
00:19:18.160And so, I think that in general also speaks for the importance of freely accessible hardware that people with technical skills can play around with and find issues in.
00:19:29.020But at its core, what you're being subjected to right now, I would say, is tactical surveillance.
00:19:37.680And what it means is that there's some actor, can be some state actor, can be someone else, that decides that Tucker Carlson is worth surveilling.
00:20:19.380And what we've seen over the last few years is sort of a shift away from tactical surveillance towards strategic surveillance.
00:20:28.980And surveillance capitalism has really helped this concept, because there's so much data that is being logged, that can be stored.
00:20:36.720There are so many new devices and applications that can be employed.
00:20:40.740And so, we see pushes like, for example, chat control within the European Union, which is sort of a push to implement backdoors within all of the messenger applications, to be able to scan your messages, take your messages somewhere else, and decide whether or not those people like what you're saying within your private messages.
00:21:02.860So, I would say, in general, as a normal human being, with your iPhone, you are still able to privately communicate.
00:21:15.740However, this ability has greatly been limited.
00:21:18.820If there is someone who wants to see your message, I would say they can, unfortunately.
00:21:25.220How difficult is it for a determined, say, state actor, an intel agency, to say, I want to read this man's communications, listen to his calls, watch his videos, read his texts?
00:21:40.000So, I think that, and we can look at different court cases that have publicly emerged in regards to Apple, for example, right?
00:21:47.780Where Apple has refused intelligence agencies' demands to give them backdoor access to their devices.
00:21:55.700And what's so important about this discussion that we are having here is that every time you build a system with backdoor access, someone in the future can decide to get access and take a look at what you're writing, right?
00:22:11.880And what that invites is for everyone to do that, because a backdoor inherently is a security flaw in the system.
00:22:18.820And it's not just some specific intelligence agency that decides to read your messages, right?
00:22:24.440It's every intelligence agency on Earth at that point, right?
00:22:27.700And so, that's why, as a nation, you cannot weaken security by getting rid of privacy without weakening your entire economy, cybersecurity, and also social fabric at the end of the day, right?
00:22:42.860And the whole strategic positioning of you as a nation.
00:22:45.140How difficult it is, I would say, also depends, from a practical operational security standpoint, on what you are doing with your phone, right?
00:22:59.660Is your phone a strict device that is only used for messaging, or do you also use it for different types of media?
00:23:09.920So, I think two years ago, there was this case where a zero-day exploit was being used against Apple devices, because when I sent you an image and your messenger had auto-download on, I could get full access to your phone by sending you a message.
00:23:31.860And you're not my contact even, probably, right?
00:23:35.180I just figure out what your phone number is, I send you an image, the image gets automatically downloaded, some malicious code that I have injected gets executed, and now I own your phone and can do whatever I want.
00:23:47.760And then, end-to-end encryption doesn't help you, right?
00:23:49.920Because I have literal access to the end device that decrypts this information.
00:23:53.980And so, that's very dangerous, that has been fixed, but I think what it highlights really is that complexity is the issue here.
00:24:01.460So, complexity in the kinds of applications that you're running, complexity in the underlying operating system that this device has, all of that complexity invites mistakes and also malicious security flaws to be installed in those systems.
00:27:23.540Not even the democratic processes are able to have oversight because it's all wrapped in secrecy.
00:27:30.340So that really brings us to the fundamental issue here, also with strategic surveillance, surveilling everyone.
00:27:35.820Just deciding, well, I'll take a look at everyone's phone, store everything, and maybe I don't like someone in the future, then I have this backlog of information.
00:27:45.360So the important question to consider here is thinking about, is there even a future where, from a legal standpoint, it is possible to implement procedures that guarantee that there is no secret surveillance in place?
00:28:01.960And I think the answer to that question is pretty clear.
00:28:24.960To allow for that to be implemented in the 21st century.
00:28:29.260But what we've seen, sort of, is that the tools that governments have access to are so powerful that it is impossible to make a law that prohibits their use.
00:28:46.140Because whoever has access to this technology within a centralized architecture, and that's always the case, basically becomes a single point of failure.
00:28:56.300And that single point of failure will necessarily be corrupted by the power that exists.
00:29:03.900Just a couple obvious lowbrow technical questions.
00:29:09.300Is the iPhone safer than the Android or less?
00:29:14.540So I would say a huge advantage that Android devices bring to the table, right, at least a subset of those devices, right,
00:29:26.300not speaking for the entirety, is that the operating system, for example, is publicly viewable by anyone, right?
00:31:05.340Is there an advantage to the iPad over the Mac from a privacy standpoint?
00:31:09.660I think what it boils down to there is what kind of applications could be installed on your system.
00:31:23.760I would say in general, devices like the iPhone or the iPad operate in a more sandboxed way where applications are actually isolated, right?
00:31:37.980Rather than how it works on operating systems like macOS or Windows, right?
00:31:43.360Where you could compromise the entire system way more easily, right?
00:31:47.760So on the iPhone, you just have an app store with applications and the level of compromise that such an application can have, theoretically, at least from the idea, is limited to just the single application, right?
00:32:02.620It doesn't have access to your messenger if you're installing an app.
00:32:06.480Although it could, I guess, if there's some flaw in the system, which always is the case.
00:32:11.500So you never have this absolute security.
00:32:13.840I think what it really boils down to is this idea that really emerged in the 1990s of decentralization, right?
00:32:25.020Moving away from central single points of failures towards decentralization, where we can mitigate a lot of these risks by not depending, I guess, on one single type of computer and not even depending on one single computer,
00:32:39.700but having many computers, which introduces redundancy, resilience, and, I guess, risk reduction and distribution to computer systems.
00:32:49.580So speaking more broadly about how the internet in a free society should be built, I guess, yeah.
00:36:28.300Everything sits within a distributed network where as long as you're not able to basically get access to the entire globally distributed network,
00:36:38.680to every single participant, you have security.
00:36:42.340And it's difficult to do that with your own phone.
00:36:47.700But at the end of the day, I think over time, those systems get more secure.
00:36:55.420However, what is important is to be certain that there are no backdoors explicitly installed, right?
00:37:02.880I think there's some countries where if you're buying a phone from there, you could be certain,
00:37:09.820okay, there might be something installed because the company itself is owned by the government.
00:37:14.180And we need legal frameworks for that.
00:37:19.340And also what we require sort of is that the manufacturing process itself mirrors distributed decentralized systems.
00:37:30.180Where there, again, is not a supply chain of single points of failure, where one single worker could decide to install some backdoor because they get paid off, right?
00:37:44.440And I think that Apple runs on that model already.
00:37:48.680So, I would be relatively comfortable with these kinds of systems.
00:37:53.400But there's also other interesting technologies.
00:37:56.040So, for example, Solana, which is an American company, blockchain network, right?
00:38:02.960They actually have their own phone company or offering phones.
00:38:09.760They have a very small manufacturer and they manufacture those phones because they say, well, those phones need to be very secure because you literally store your money on there now because your money is digital and on top of a blockchain network.
00:38:24.300And so, I think those are very interesting approaches where I'm really looking forward to seeing more phones like this where there's, then again, a competitive market emerging for who's building the most secure phone.
00:39:03.140With the EncroChat, or whatever it was called, there was this large-scale police operation to stop drug cartels, which worked out nicely, I guess, in the end.
00:39:16.860But the company itself was just a facade to sell backdoor phones.
00:40:05.240That anyone can verify for themselves.
00:40:08.240And what it means is that we have this global community of mathematicians and cryptographers that have invented those protocols, that have independently, without getting funding from CIA or whomever, thought of mathematical problems that they want to solve, that they are passionate about.
00:40:26.640And all of those people look at those open source lines of code and mathematical formulas, and they find those flaws in those systems.
00:40:36.700And so, that makes me confident in the design of Signal itself.
00:40:56.760So, I think it would be highly unlikely that Signal itself would actually turn out to not be secure.
00:41:05.600There has been this interesting case, back in the 2000s, where there was this attempt to actually undermine strong encryption, called, very exotic name, the Dual Elliptic Curve Deterministic Random Bit Generator.
00:41:27.900Nobody understands, no non-technical person understands what that means, right?
00:41:32.060What you need to understand in order to comprehend what happened there is that when we encrypt information, as I said earlier, we take something and move it into this different realm, where you cannot follow this information, because that would require you to have literally infinite resources, more energy than the sun will emit over its lifespan.
00:42:27.220And so, when we encrypt something, we make use of that chaos and we inject it into a message that we are sending, for example.
00:42:36.620And it's only possible to prevent that message from being decrypted in an unauthorized way if the randomness that has been injected into the message is actually unpredictable.
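To see why predictable randomness is fatal, here is a toy Python sketch: a "cipher" keyed from a seeded PRNG stands in for a backdoored generator like Dual_EC_DRBG (the real attack exploited hidden structure in the generator's curve points; this toy just assumes the attacker knows the generator's state).

```python
import random

def keystream(seed: int, n: int) -> bytes:
    """Derive n 'random' key bytes from a PRNG seed (toy stream cipher).
    random.Random is deterministic and NOT cryptographically secure."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"launch codes"
seed = 1234  # imagine a backdoored generator whose state the attacker knows
ciphertext = xor(plaintext, keystream(seed, len(plaintext)))

# The attacker doesn't brute-force anything: knowing how the randomness is
# produced, they simply regenerate the identical keystream and decrypt.
recovered = xor(ciphertext, keystream(seed, len(plaintext)))
assert recovered == plaintext
```

The encryption algorithm itself can be flawless; if the randomness feeding it is predictable, the whole construction collapses.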
00:44:16.600What we're doing with all of this randomness that we are injecting into information is we are basically describing what key is being used to unlock them.
00:44:29.500And if I don't know what the randomness looks like, if I don't know what the next playing card in the stack is,
00:44:36.220I have to take every single possible key and try to unlock the message with each one.
00:44:42.620So, you could think of it as, I have this message.
00:44:44.620Now, I want to apply violence to this message in order to recover it.
00:44:48.840What I'm doing is I take key number one, I try to unlock it, doesn't work.
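That key-by-key search can be sketched with a deliberately tiny toy cipher whose keyspace is a single byte. The "unlock" test here (output looks like lowercase text) is an assumption for illustration; the point is that a 256-key space falls instantly while a 2^256 space never does.

```python
def xor_cipher(data: bytes, key: int) -> bytes:
    """Toy cipher with a 1-byte key: XOR every byte with it."""
    return bytes(b ^ key for b in data)

secret_key = 0x5A
ciphertext = xor_cipher(b"hello world", secret_key)

# Brute force: take key number one, try to unlock, doesn't work; key number
# two, ... until one yields something readable (here: lowercase ASCII text).
candidates = []
for key in range(256):
    attempt = xor_cipher(ciphertext, key)
    if all(c == 32 or 97 <= c <= 122 for c in attempt):
        candidates.append((key, attempt))

# A 1-byte keyspace is exhausted in microseconds; the identical loop over a
# 256-bit keyspace would outlast the universe.
print(candidates)
```

The loop is the "apply violence to the message" procedure from the transcript; cryptography wins by making the loop's trip count astronomical, not by hiding the loop.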
00:49:11.400So it's literally impossible to then use some other alternative that is secure, because certification only gets provided for this backdoored technology.
00:49:24.100But it got uncovered thanks to Snowden.
00:49:49.760What was also uncovered is that the NSA actually paid this company that built those products 10 million U.S. dollars to use that as a standard.
00:49:59.220So yeah, that's why you cannot trust anyone.
00:50:03.320As you point out, it's not simply, I mean, so this is an intel agency trying to spy on its own people, the ones who pay for it to exist.
00:50:12.900But it's, and that's immoral and, you know, something that we should fight against.
00:50:17.840But they were also sabotaging the U.S. economy and U.S. national security.
00:50:22.760And because if your cryptography is fake, then that means you're exposed on every level throughout your society.
00:50:32.220And it's so interesting because it is their task to increase national security, right?
00:50:36.200That's why it was possible for them to do that in the first place.
00:50:41.900At that point, they were the leading cryptography research company in the world, sort of, right?
00:50:47.980And so that really is striking to me, that you're willing to undermine the entire security of your nation.
00:50:56.600And that, at the end of the day, puts you in a worse strategic position.
00:51:01.380I think many people don't realize that.
00:51:04.120I never thought about it until you mentioned it, but it just highlights, I mean, I love Ed Snowden and I'm not embarrassed by that, I'm proud.
00:51:10.960But it just highlights, you know, the suffering that he's been through in order to help his own country.
00:51:18.300And he's still slandered constantly for it.
00:51:43.220I think, as I said, I think this is a great example to look at, where even with those backdoors that had been implemented,
00:51:54.620there were cryptographers within this global open source mathematics cryptography community that rang the bell, but nobody was listening to them.
00:52:03.080But they actually identified the issue years in advance and rang the bell and said, this is not secure, not random, even within those companies and standardization institutes.
00:53:34.600And I think, I mean, it's interesting to think about, is there cryptography that is being developed in-house within militaries or whatever proprietary human organization, right?
00:53:46.500That is not publicly known, that is incredibly powerful.
00:53:51.520So, I mean, what I've been doing with my team, and I'm so glad that I have those incredible cryptographers in my team that actually understand all of those things on a way, way more detailed level than I do,
00:54:09.560is build this protocol that allows us to literally take everyone's data.
00:54:16.660So you could imagine the entirety of the United States, right?
00:54:20.300We take everyone's healthcare data, something like that, right?
00:54:24.260And then we say, well, we need to do something with that data.
00:54:27.720Let's say we need to research a disease or whatever.
00:54:30.920Instead of taking that data and passing it to some company that will inevitably expose it, lose it, it will get leaked, or it will be used against those people, we encrypt it.
00:54:40.520Nobody ever has to share any information.
00:54:42.740And we just run whatever computation we collectively decide we are going to run over this data.
00:54:48.320We do that, we get the result, we, I don't know, figure out a cure to cancer or whatever.
00:54:53.240But at no point in time, you ever had to share your data.
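One way such a collective computation can work is secure aggregation with pairwise cancelling masks. The sketch below is a toy (made-up participant names and values, a simplified trust model with no dropout handling) and is not the protocol being described, only the flavor of it.

```python
import secrets

P = 2**61 - 1  # public modulus

# Each participant holds a private value, say a health statistic.
values = {"alice": 7, "bob": 12, "carol": 5}
names = list(values)

# Every pair of participants agrees on a random mask; the first adds it and
# the second subtracts it, so all masks cancel in the total while each
# individual report looks uniformly random.
masks = {}
for i, a in enumerate(names):
    for b in names[i + 1:]:
        masks[(a, b)] = secrets.randbelow(P)

def masked_report(name: str) -> int:
    m = values[name]
    for (a, b), r in masks.items():
        if a == name:
            m = (m + r) % P
        elif b == name:
            m = (m - r) % P
    return m

# The aggregator only ever sees masked reports, never raw values,
# yet their sum is exactly the true total.
reports = [masked_report(n) for n in names]
total = sum(reports) % P
print(total)  # 24 == 7 + 12 + 5
```

This is the transcript's point in miniature: the statistic over everyone's data is computed, but no single raw record is ever shared.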
00:55:00.920And it sort of is the holy grail of cryptography, I would say, being able to do these kinds of things.
00:55:08.020Because you can now run any type of computer program instead of in the public, in private.
00:55:14.260And you can restructure the way that your entire economy and country can work, right?
00:55:20.700And that goes beyond just economical human interactions that also touches upon things like rethinking how we can actually improve democratic processes.
00:55:32.960Because what those computations inherently have as a property is so-called verifiability.
00:55:40.240So, what's the status quo sort of in the current internet is you task some cloud provider to run a computer program for you, right?
00:55:54.220Because you have limited resources, you want them to run that computer program for you.
00:55:58.720So, you pass them some information, an algorithm, and you get an output back.
00:56:03.660But how do you know that this output is actually correct, right?
00:56:07.580Could be that there was an error, could be that they maliciously tried to undermine the output that they have sent you.
00:56:15.840So, this technology that we've built actually solves this, right?
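As a loose analogy for verifiability, checking a claimed result can be far cheaper than recomputing it. The toy below uses sorting (real verifiable computation uses cryptographic proof systems such as SNARKs, not application-specific checks like this); `untrusted_sort` stands in for a cloud provider.

```python
from collections import Counter

def untrusted_sort(xs: list[int]) -> list[int]:
    """Stand-in for a cloud provider running the computation for us."""
    return sorted(xs)

def verify_sorted(original: list[int], claimed: list[int]) -> bool:
    """Cheap check that the claimed output really is a sort of the input:
    it must be non-decreasing and a permutation of the original."""
    in_order = all(a <= b for a, b in zip(claimed, claimed[1:]))
    same_items = Counter(claimed) == Counter(original)
    return in_order and same_items

data = [5, 3, 8, 1]
out = untrusted_sort(data)
assert verify_sorted(data, out)            # an honest result passes
assert not verify_sorted(data, [1, 2, 3])  # a tampered result is caught
```

The client never has to trust the provider: any wrong or malicious output fails the check, which is the property the transcript calls verifiability.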
00:56:44.360So, I'm very lucky that within my company, I have very experienced cryptographers who've literally worked more years on these specific issues than I have been in cryptography.
00:57:00.120And so, I'm sort of building on the shoulders of giants, of course, right?
00:57:06.580And there has, for a very long time, been research in those areas, being able to run those encrypted computations.
00:57:14.540But it has never been practical enough where it is fast enough, cheap enough, right?
00:57:20.760And versatile enough where you can actually do all of those things.
00:57:23.580And so, I think what really guided us is to, and what really guided me in the way that I designed the system is to think about, okay, how can I actually build this system so that people are going to use it and are going to build applications and are going to integrate that into systems, right?
00:57:45.020Because I think with privacy technology in general, in the past, what has been done is that it sort of has been created in an echo chamber, in a vacuum almost, where you're a smart cryptographer that builds amazing technology, but you maybe don't understand how markets work and how to get product market fit, how to actually get those users, right?
00:58:09.400And so, we've tried to build it in a different way, and that's how we ended up here.
00:58:15.340But to be honest, it was an evolutionary process for us.
00:58:19.420So, we originally started with a different kind of cryptography, I would say, that was more limited, that didn't allow for all of those interactions.
00:58:29.940And then, at some point, we realized that that was not good enough, that was not enough.
00:58:37.240And at that point, basically, everyone was still building with that technology, and we were like, let's do something different instead.
00:58:44.400Let's think about what the future will look like, how computation and privacy can converge into something bigger for the entirety of humanity.
00:58:52.240And that's then how we built it, in very, very quick time, actually.
00:59:10.640So, big names in the space of blockchain distributed systems, right?
00:59:19.800All of those networks, like Bitcoin, all of those networks are distributed in nature, decentralized.
00:59:27.460And, yeah, there's a lot of players within that space that truly believe in the value of privacy, that privacy is a human right and that privacy as a technology is inevitable, and they like to support it, but not just support it, right?
00:59:46.140They support it because it is something they believe in, but they invest in it because they have realized that this is one of the most powerful technologies that can exist for humanity, right?
00:59:59.740Being able to take information, move it into this realm, and then it can stay in this realm, and it can be processed, and everyone can do that.
01:00:08.720It is emancipating, and it is powerful for businesses, but also nation states.
01:00:13.260At the end of the day, it is a neutral technology, and so we have investors that believe in that.
01:00:20.620So, one of the applications, we were just talking off camera, one of the applications for this technology, well, one of the big ones, is the movement of money in a way that's private.
01:00:35.240And let me just add one editorial comment.
01:00:37.020The great disappointment of the last 10 years for me is that crypto transactions don't seem to be as private or beyond government control as I thought they would be.
01:00:45.220I hope they are someday, but watching the Canadian truckers, you know, have their crypto frozen was just such a shock.
01:00:56.260Yes, so if you think about Bitcoin as the state-of-the-art model of, or I guess the original, not state-of-the-art, but the original kind of blockchain network, right?
01:01:09.100What it is at the end of the day is a way for distributed people to find consensus over some unit of money, which is actually more like a commodity than a financial instrument.
01:02:29.760What you have in Bitcoin specifically is pseudonymity.
01:02:33.840So you don't see on the blockchain that Tucker Carlson has 10 Bitcoin or whatever and sent Yannick one Bitcoin.
01:02:41.880You instead see A, B, C, D, E, F, G, blah, blah, blah, whatever, right?
01:02:45.300A random string of numbers and letters has sent something to another random string of letters and numbers.
01:02:53.560However, they're linked to this identity that you have.
01:02:58.480So for every single transaction that you've performed in history on top of this distributed ledger, you will see all of those transactions.
01:03:07.620So I, when you, later after the show, send me one Bitcoin, I guess, right?
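The pseudonymity-plus-linkability property is easy to see in a toy sketch: an "address" is just a hash of a public key, meaningless on its own, yet every payment involving it is trivially groupable. This is simplified; real Bitcoin derives addresses from secp256k1 keys via SHA-256 plus RIPEMD-160 with a checksum:

```python
import hashlib
import secrets

def new_address():
    """A Bitcoin-style pseudonym: a hash of a public key (stand-in bytes here)."""
    pubkey = secrets.token_bytes(33)  # placeholder for a real secp256k1 public key
    return hashlib.sha256(pubkey).hexdigest()[:40]  # plain SHA-256 for portability

alice = new_address()
# The ledger records only opaque identifiers, never names...
ledger = [
    (alice, new_address(), 1.0),
    (new_address(), new_address(), 2.5),
    (alice, new_address(), 0.5),
]
# ...but every transaction by the same pseudonym is trivially linkable:
alice_history = [tx for tx in ledger if tx[0] == alice]
print(len(alice_history))  # 2: the whole history, forever, from one identifier
```

Nothing on the ledger says who `alice` is, but the moment one transaction is tied to a real identity, the entire history behind that string is exposed.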
01:04:02.100That is actually a dystopian scenario where we could end up if this is adopted as the technology where all of your money now sits and, and you're sending transactions,
01:04:12.600where you have this big upside of having cash-like properties, which is amazing, but you have this tremendous downside of literally everything being recorded for the conceivable future of humanity, right?
01:04:29.200And that inherently limits your freedom to use this technology.
01:04:34.080And so that is an issue that, that exists not just within Bitcoin, but also other blockchain networks.
01:04:41.080And Bitcoin is this pure form.
01:04:44.080That's why within this crypto industry there's a lot of competition between different players that say Bitcoin is this pure form that only allows transfers of money, right?
01:04:54.420And other networks allow execution as well.
01:04:57.400And that is, that has led to what is commonly being called smart contracts.
01:05:03.560So this concept of computer programs that simply exist in the ether, basically a computer program that can execute something that you tell it to do, and it will guarantee to do so.
01:05:15.920And this amazing property that all of the founding fathers of those networks basically identified as important is so-called censorship resistance, which I think is also important in real life.
01:05:28.860And so those networks provide censorship resistance.
01:05:32.760It doesn't matter if one computer decides, well, I'm not going to accept Tucker's transaction because I don't like Tucker.
01:05:38.740Well, there's going to be another computer that says, I will accept it.
01:05:41.620So that is censorship resistance that is inherently baked into those systems.
01:05:45.920And what that means is if you interact with this as this invisible machine, right, you get guaranteed execution for whatever you tell it to do, either send someone money or perform some other computational logic that is baked into the system.
01:06:02.520And so there had been, have been different kinds of pioneers on the, on the front of performing, yeah, adding cryptographic privacy to those systems.
01:06:14.340Um, there has, for example, emerged a network called Zcash, based on Zerocash, which is basically Bitcoin with cryptographic privacy.
01:06:23.100Um, and there have also been pioneers like the inventors of Tornado Cash, who built a smart contract that exists within this ledger and is unstoppable.
01:06:33.900Once you've uploaded it, you cannot stop it anymore.
01:06:37.900And, um, the kind of code that they implemented there gave you privacy on top of this public network, which was, or is, the Ethereum Virtual Machine.
01:06:49.140So they did that and, um, Tornado Cash did that. Tornado Cash...
01:06:52.980Were they, did they win the Nobel prize?
01:06:54.660Um, did they get the presidential medal of freedom?
01:06:57.280And what happened next when they offered privacy?
01:06:59.280So there were, I think, three founders: Roman Storm, who's an American citizen; Roman Semenov, who is a Russian national; and Alexey Pertsev, who is a Russian national as well and lives in the Netherlands.
01:07:18.640Um, he has been convicted of assisting in money laundering and sentenced to five years in prison, and Roman Storm has been convicted in the United States of conspiring to run a money transmitter without a license.
01:07:45.540Now, why has this happened? Why did they suffer such grave consequences?
01:07:53.940Brought on trial. I mean, if you look at what Roman Storm has faced, it was 40 years in prison for this, um, in the United States.
01:08:04.840In the United States of America. And why, why has that happened, right?
01:08:10.100They built a privacy tool where it was an illicit actor that used their privacy tool.
01:08:15.740Um, and that is a shame, because it was an actor that a lot of people agree is an illicit actor.
01:08:24.060I think the two of us also agree that North Korea laundering stolen hacked funds is an illicit actor.
01:08:34.520The, the underlying question really is...
01:08:36.260And we're sure that actually happened?
01:08:37.620We are sure that happened, yes. For sure that, that has happened.
01:08:40.240Um, and so they stole funds, because they were able to hack different systems, and then were able to utilize this platform to gain privacy and move those funds somewhere else.
01:08:56.840Did Roman Storm participate in the North Korean hacked-funds theft?
01:09:32.520Um, but the jury could not reach a unanimous decision on the main charges, I guess circumventing sanctions and helping with money laundering.
01:09:46.920Now, the interesting thing is, before they got arrested, what has happened?
01:09:51.060Um, OFAC, the Office of Foreign Assets Control in the United States, took the software that those developers had written
01:10:02.520and uploaded to the Ethereum network, where it has passed out of anyone's control, um, unstoppable by nature.
01:10:12.780They, they essentially wrote code for a software tool for anyone to get privacy.
01:10:18.260Um, that software tool got sanctioned.
01:10:21.560It got put on the SDN list, for Specially Designated Nationals, where you put the names of terrorists, and they put the smart contract's Ethereum address on this list, right?
01:10:40.500All of the companies closed their developer accounts.
01:10:45.300Um, the software they wrote, the free speech they performed by coming up with those ideas and publishing them to the world, got censored.
01:10:55.080Because they were added to a list which they don't even belong on.
01:11:01.060Without any vote in Congress, by the way, or this is just part of the, I think it's under State Department now, but I could be, or Treasury, I can't even remember.
01:11:19.520And so, so it got added onto this list.
01:11:21.900Um, and I think last year, um, a court in the, uh, state of Texas actually ruled that OFAC, um, does not have the statutory authority to do any of that.
01:11:36.100Um, and they then silently removed Tornado Cash from the SDN list again.
01:11:41.940However, nobody is, is able to use the tool now, right?
01:11:44.960Because every company, for compliance reasons, casts you out of its user base if you've ever touched anything related to it.
01:11:55.620And Roman Storm is, he was convicted, you said.
01:11:59.600He, he, there was a hung jury on, on the strongest charges, but on other charges he was convicted?
01:12:04.220He was convicted on one charge, um, which I think is called conspiracy to run a money transmitter, a financial institution, right?
01:12:18.440A bank, um, without a banking license.
01:12:43.360He made use of his inherent right for freedom of speech to build something that enables others to make use of their right for freedom of speech, right?
01:12:55.640Because that is, at the end of the day, the, the freedom of economic interaction, right?
01:13:01.400That is what he helped others protect for themselves.
01:13:05.240He never processed a transaction for anyone, right?
01:13:32.940So, um, there are, I think, incredible institutions like the Electronic Frontier Foundation, the EFF, and the DeFi Education Fund, but also companies like Coinbase, who have actually invested a substantial amount of money into defending Roman Storm.
01:13:54.740And, um, yeah, Alexey Pertsev as well.
01:13:58.340Um, I think Alexey Pertsev also doesn't get enough attention.
01:14:03.120He's now under house arrest in the Netherlands, um, and preparing to appeal his conviction, I think, something like that.
01:14:11.660Why are so many of the players in this Russian?
01:14:14.280Um, I think it really boils down to them having a deep understanding, I think historically, maybe culturally, about the importance of privacy
01:14:24.740in a society for upholding freedom, um, which is a shame, you know, that they...
01:14:32.880Well, they've, they've suffered for that knowledge.
01:14:54.740I think it's interesting, um, how all of us take it for granted that these kinds of people step out of their everyday lives and put a target on their heads by shipping this technology.
01:15:11.280To enable you to gain privacy. And simply the knowledge of the existence of bad actors in the world has made them victims and put them in jail, which is insane.
01:15:27.460Um, well, I mean, it's something the rest of us should push back against, I think.
01:15:37.520So, if you could be more precise about what you think the real motive was behind going after Tornado Cash and Roman Storm, like what, why was the U.S. government not prosecuting drug cartels in order to prosecute Roman Storm?
01:15:52.540I think, um, so that has taken place under the previous administration.
01:15:57.980Um, so I think President Trump with his administration has, um, done tremendous work in regards to pushing the adoption of decentralized technology, of really allowing us, all of the people in that space to, to try to rethink the financial system and, and build this technology.
01:16:19.400Because they've sort of realized that, um, technological innovation runs at a faster pace than, than legislative processes.
01:16:29.560And, and, um, under the previous administration, that looked differently.
01:16:34.220So, so I think, um, that has helped this technology spread a lot.
01:16:38.820Um, and it is, however, important to consider privacy.
01:16:45.080And when the executive order banning CBDCs, central bank digital currencies, was signed, an explicit reason why CBDCs should never be adopted in the United States was the privacy concern.
01:17:00.100Because if we look at all of those new digital currencies being built in Europe and all around the world, besides the U.S., which is great, which actually is amazing, I think, what we see is that all of them are surveillance machines to an even higher degree than the current financial system already is, right?
01:17:23.560It is already a, um, a surveillance system, but what's so important about this next generation of, of money is, um, we are sort of at a crossroads.
01:17:34.320Do we want our money to enable freedom for us, freedom of economic interaction, freedom of thought at the end of the day?
01:17:41.540Because whatever we think, we want to be able to put our money where our mouth is. Or do we want a monetary system that enables automatic subsequent action based on whatever activity you perform in your digital life?
01:18:00.780Which can mean things like, now all of your money is frozen and you don't have any access to it anymore because whatever you just did, um, was deemed as undesirable by big brother, I guess, right?
01:18:13.600So that is, that is literally the, the two possible futures that we have.
01:18:35.840So if you actually think about something like Tornado Cash, or, I mean, there's a lot of applications that, for example, utilize Arcium to also bring this level of privacy, right?
01:18:46.720Um, if you think about all of these systems, they are, in my mind personally, even superior to cash, as long as you have an internet connection. If you don't have an internet connection,
01:18:57.240maybe you cannot spend your money right now, um, but as long as that exists, they're even superior to cash, because you don't have any serial numbers anymore, right?
01:19:07.840Wait, so you say cash is being surveilled?
01:19:10.000Sure. I mean, when I go to the ATM and withdraw money, the serial numbers are recorded in some database.
01:19:17.220And when a merchant at Walmart, I guess, or wherever, puts that into their cash register, they can also record the serial numbers.
01:19:50.720I mean, there could be other tracking mechanisms.
01:19:53.000I don't know, but I've read about this technology, which clearly exists, and it's being used to turn even the cash system into a surveillance system.
01:20:04.060And, um, again, I think all of this is not even just someone with governmental authority deciding to surveil people, right?
01:20:16.840It is also companies seeing economic value in surveilling you, um, and then utilizing this new technology, utilizing the internet, to do that.
01:20:27.420And, um, it boils down to power, I would say, control, right?
01:20:31.480Um, if you have access to as much information as possible, you can better prepare for the future and you can predict behaviors of your users or different actors.
01:20:41.740And so that's why those, those systems get implemented.
01:20:43.700Um, so we are on this fork in the path towards the future, and that is what the people that are architecting those central bank digital currency systems have realized.
01:20:57.880And what's so interesting to me is this old concept that the cypherpunks in the 1990s came up with, which is "code is law," which expresses what has happened with Tornado Cash
01:21:10.680nicely, I think: it is sort of the ultimate law when you have this network that nobody controls and there's some piece of software and it just executes.
01:21:20.820Whatever is written within that software code executes, there's no way of stopping it, there's no way of doing anything about it.
01:21:28.060And so that's what I mean when I say code is law, code is law.
01:21:31.120And the architects of those alternative systems have realized that there's so much power in being able to, let's say, take your chat messages and see that you have said something against big brother, and big brother doesn't appreciate that, right?
01:21:45.660And so automatically now your money, um, is frozen and that is code is law, right?
01:21:53.160In the utopian sense and in the dystopian sense where software automatically can lock you out of all of those systems.
01:22:00.220And I would much rather, um, have a utopian future than a dystopian future.
01:22:05.020But at the end of the day, from a technological standpoint, those things are similar.
01:22:10.060The only difference really is cryptography.
01:22:15.660Because you're offering that on a scale even larger than anything Tornado Cash or Roman Storm attempted, it has to have occurred to you that, whether or not you have prominent investors, you face some risk.
01:22:31.420So I think, um, what I'm doing with Arcium at the end of the day is providing the most versatile and superior form in which you can execute a computer program, right?
01:22:45.500Within encryption, you can execute a computer program, and you can have many people contribute encrypted data, and you can do all sorts of things.
01:22:54.420You can do things starting with, um, financial transfers, right?
01:22:59.040You can add privacy to financial systems.
01:23:00.860But that doesn't just mean we're adding privacy to me and you, Tucker, interacting with each other.
01:23:06.720We can also add privacy to entire markets, right?
01:23:10.100Which, again, can also have downsides.
01:23:11.960I'm not arguing that there's only upsides with this technology.
01:23:15.200There might be actors that then utilize that, and I'm not just talking about criminal activity, but unethical activity, right?
01:23:25.140So, um, at its core, it is neutral technology.
01:23:28.440Um, but the use cases that I'm really focused on enabling also include things like enabling the healthcare system to actually utilize data that currently is being stored, but stored in a very inefficient way, where it's isolated, right?
01:23:45.700So, with my technology, we can take this data and use it without ever risking that data to be exploited, without ever taking ownership of your data because you're the patient, you're the human, right?
01:23:57.240I have no right to, to take ownership over that.
01:23:59.880And with that technology I don't need to, because you can consent and say, let's improve healthcare or whatever with my data. But you're not getting my data, because it's encrypted, right?
01:24:10.480It's this, um, I don't know, it's a crazy concept to wrap your head around.
01:24:14.200I, I, I get it, but it enables so much also on a national security level that it is strictly superior technology.
01:24:20.860And I think this example that I told you earlier about verifiability, right?
01:24:25.800Um, mathematically being able to be convinced that a computer, um, program, a computation that has been executed in privacy, right?
01:24:36.940Um, has been executed correctly is such an amazing concept.
01:24:41.180And the way I think about it really is as opening up a new design space altogether, um, and allowing companies to do actual innovation instead of innovating only on the front of how to extract as much value as possible from users by surveilling them.
01:24:59.960Um, so I don't really, I don't really think about it the way that you framed it.
01:25:06.140I'm building this generalized computing platform that can be used by anyone, um, because I don't have any control over it, right?
01:25:14.660I'm not building a controlled infrastructure.
01:25:17.320I'm building open, um, software that is used for good.
01:25:22.020And I'm grateful that you are, and I don't at all mean to make you pessimistic or paranoid, but in so doing, you're threatening current stakeholders.
01:25:40.260Um, I mean, when cars first came along, right, there were unions of horse-carriage taxi providers that did not want to see cars on the road.
01:25:53.860So there are always interests that try to utilize both technology and law to prevent others from getting into that position.
01:26:06.260Yeah, keep the current monopoly in place.
01:26:39.560So from the perspective of the average American consumer who's not following this carefully, when does your life begin to look different as a result of this kind of technology?
01:26:49.180When will you see this sort of thing in action?
01:27:16.740Um, so I think, I think it will, will affect your, your everyday life, um, positively.
01:27:23.320Um, once, I guess, an inflection point is reached on multiple fronts, right?
01:27:33.360I'm, I was talking about healthcare and national security, also financial system, right?
01:27:37.840Um, but it also, I mean, so that's a criticism I actually have for Signal.
01:27:43.100Um, that is that there exists one single point of failure within Signal's technological stack that I've been vocal about and dislike, which is what they call private contact discovery.
01:27:58.140Um, where I have a set of contacts on my phone, right?
01:28:19.440How does Signal ensure that those contacts are encrypted and secure, right?
01:28:25.760They, um, use trusted hardware for that.
01:28:29.340And that is a critical flaw within their infrastructure.
01:28:32.020So there's technology, um, trusted execution environments is what they're called, manufactured by, by Intel, for example.
01:28:40.840Um, and this technology comes with this, this promise of being secure and being able to basically do what, what we're doing with mathematics, but instead with trust.
01:28:51.540Um, so they say we built a secure machine.
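To see why contact discovery is hard without either trusted hardware or heavier cryptography: a service that asks clients to upload mere hashes of phone numbers gains no real privacy, because the phone-number space is small enough to enumerate. A toy demonstration over a deliberately tiny range (the number and prefix are made up):

```python
import hashlib

# Suppose a client "protects" a contact by uploading only its hash.
contact_hash = hashlib.sha256(b"+15551234567").hexdigest()

def crack(target_hash, prefix="+1555123"):
    """Recover the number by brute force; the full 10-digit space is only ~10^10."""
    for n in range(10_000):  # tiny demo range covering the last four digits
        candidate = f"{prefix}{n:04d}".encode()
        if hashlib.sha256(candidate).hexdigest() == target_hash:
            return candidate.decode()
    return None

print(crack(contact_hash))  # +15551234567: the hash hid nothing
```

Because hashing alone fails against such a small input space, a server either has to be trusted outright or has to run the lookup inside hardware like a TEE, which is exactly the trust assumption being criticized here.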
01:28:53.680Do you think we shouldn't trust Intel?
01:29:09.080Not just last year, but over the last 10 years, there have been a myriad of exploits of the technology.
01:29:15.460Um, so in the past, it has always been sold sort of as, here's this technology, um, and it does verifiability and privacy and just put your data in that.
01:29:27.340Um, there's no, there's no backdoor, right?
01:29:40.180And then last year, there were those researchers that said, well, if you have physical access to this computer, you can not only read out all of the data, but you can fake keys and then perform fake computations on behalf of other people.
01:29:56.660So if you're building a financial system with a computer like this, I can just change numbers, right?
01:30:02.460And I know what your numbers are, and I can change those numbers.
01:30:05.980And that's not even the core issue I have with that in the case of, of Signal, right?
01:30:10.220So Signal is, I think, still relying on that tech.
01:30:16.500I mean, I hope they run the hardware themselves, because at least then I have a little bit of a remaining trust assumption that, okay, they will not try to hack those PCs, which is relatively straightforward.
01:30:28.200You just connect a few cables at the end of the day.
01:30:30.920Um, and, and then you can exfiltrate the information, which is the, the, the interactions, right?
01:30:59.180So within the manufacturing process, I mean, I think it would be very naive to assume that there's no backdoor, similar to what we talked about earlier with Dual EC, right?
01:31:08.560Um, or something like the Clipper chip thing, right?
01:31:12.560That, that was, um, attempted in the 90s.
01:31:15.380So it's very likely, I would say, that there's some randomness tampering, let's call it that, that could be in place, because you are literally also getting keys right from the manufacturing process, right?
01:31:29.600So it's this proprietary supply chain and then they ship that computer to you and it comes with random keys, um, that have been generated in that proprietary production line.
01:31:41.320Um, so there's many single points of failure and that's what I, what I don't like about Signal because I don't want, um, this information out there, right?
01:31:59.900And then they could just build this thing without a single point of failure, and without giving a state a reasonable way to say, well, you actually do have this data.
01:32:12.320Right now they cannot really argue that they don't have that data, because someone could connect a few cables to that computer and then get that data.
01:32:19.460So it's not the secure device that people in the past claimed it was.
01:32:25.340So I think that is important, um, um, to resolve.
01:32:30.140I actually don't recall how this came to my attention.
01:32:32.540I wonder, uh, if any, if any big hardware manufacturer will begin to offer truly secure devices for sale.
01:33:21.600So, I think it is impossible to build secure hardware in that regard, where those claims of full privacy and security
01:33:31.900are factually true. Um, that is impossible.
01:33:35.680There have been so many techniques where you can use so many different tools to play around with those devices
01:33:43.380that it is literally impossible to implement secure and, um, verifiable systems.
01:33:50.580Because even while verifying them, you need to take them apart, um, sort of destroying them in the process.
01:33:58.280What does exist, however, is this concept of decentralization, and that's why it's so powerful.
01:34:04.580Because it doesn't really matter if this manufacturer here creates a backdoor, as long as I have 10 different computers, or 100 computers, right, from different manufacturers,
01:34:16.200and there's one that does not have a full system-level backdoor installed, I am secure under the trust model that we've developed in our company, right?
01:34:25.220So, I think that's why decentralization is so important.
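The "secure if any one manufacturer is honest" model can be sketched by XOR-combining key material from independent providers: the result stays secret unless every contributor is compromised. An illustrative construction, not Arcium's actual protocol; the vendor setup here is invented:

```python
import secrets
from functools import reduce

def combined_key(contributions):
    """XOR key material from independent providers into one key."""
    return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), contributions))

# Three vendors each contribute 16 bytes; suppose two are backdoored
# (their outputs are known to an attacker), and one is honest.
honest = secrets.token_bytes(16)
backdoored_a = bytes(16)        # attacker knows this: all zeros
backdoored_b = b"\x42" * 16     # attacker knows this too
key = combined_key([backdoored_a, honest, backdoored_b])

# Even knowing both backdoored contributions, the attacker learns nothing:
# the key is the honest randomness masked like a one-time pad.
print(len(key))  # 16
```

One honest source of randomness suffices; that is the whole argument for spreading trust across manufacturers rather than concentrating it in one.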
01:34:29.740That was the basis of our political system when it was created, that same concept.
01:34:32.940The power is dangerous and so it has to be spread among different holders, different entities, so it doesn't concentrate and kill everybody and enslave them.
01:34:47.100And, and I think, um, it is sort of important to look at surveillance in the same way, um, where if you, if you have access to surveillance, you basically have access to unlimited power.
01:35:00.940So, whatever surveillance system we, we implement, um, be it chat control in the European Union, where I've been very vocal, vocally opposed to on, on X.
01:35:11.620Um, and I actually just learned last week that the UK implemented their version of chat control on the 8th of January, which is a censorship machine.
01:35:27.460Um, and a surveillance backdoor, right, installed within all of your messaging applications.
01:35:34.620Um, and it comes with this claim of, well, we are implementing this because we need to fight child exploitation, right?
01:36:06.440The people engaged in importing drugs into our country, laundering the money, exploiting the children and committing serial acts of terror against their own population.
01:36:21.900So, um, what's so funny is that in 1999 there was some policing working group of the European Commission, and there was a transcript of their discussions.
01:36:35.400And literally within the transcript, when they were talking about implementing digital surveillance systems, they were like, I think we should switch our arguments over to, um, child exploitation because that is more emotionally charged, right?
01:37:02.420So, um, there, there is a reason why we don't believe that that's the actual reason, but what I'm arguing for is that that doesn't even matter.
01:37:13.080Even if, even if the politicians are convinced that it's about protecting the children and that's the most effective measure to do that, right?
01:37:21.320To surveil all of the chats. Um, what's going to happen, thanks to this being implemented as infrastructure that exists everywhere and there being a small circle of people that have access to this technology, is that it will get abused.
01:37:37.420Um, it is very easy to abuse those systems because the abuse itself happens within secrecy.
01:37:59.860By the way, a lot of these businesses draw the worst people, like the most unethical people have the most power in case you haven't noticed.
01:38:11.960I mean, there's a, there's a economical function sort of to reward this, right?
01:38:16.640Because if I build an application and you build an application, and we just provide some value to our users, and the users pay for that, that's basically capitalism, right?
01:38:29.220Um, all of that works out nicely, but then you decide, what if I take all of this information from my users and use it to extract additional value from them, right?
01:38:39.880You're way more profitable that way.
01:38:53.820And so unethical behavior gets rewarded in the system.
01:38:56.540Just to be clear about what you're saying, are you saying that all texts sent within the UK are now monitored by the UK government?
01:39:03.400Um, I'm not 100% familiar with all of the intricacies of the, um, Digital Services or Online Safety Act, I think it's called, in the UK.
01:39:13.440What is happening there is that there is censorship being applied to the messages.
01:39:18.260So you receive whatever unsolicited image, right?
01:39:22.640Um, and then, um, that's being censored.
01:39:25.180So, um, what's happening there is, I think, I think what's important to understand is that censorship is a byproduct of surveillance, generally speaking.
01:39:35.060And so, um, you need to take a look at all messages in order to be censored or something, to censored something, right?
01:39:41.700And so, that's what's happening there.
01:39:44.140Um, and even if we assume only the best of intentions, you have this infrastructure in place that tomorrow can simply be abused by someone.
01:40:06.820And to thousands of people, exclamation point.
01:40:09.220And she won't, and we'll see who gets arrested.
01:40:11.540Yeah, that's a, that's a great experiment.
01:40:13.880Actually, I, I need to attend a conference in, in the UK, um, this year.
01:40:19.520Um, and it's so funny, because a month ago there was, I think, some proposal that basically specifies that people that work on encryption are
01:40:30.160sort of persona non grata in the UK, something like that.
01:40:33.480I think it's not yet implemented, but I saw that on, on X.
01:40:36.800I mean, you can't get in the country if you're for privacy?