The Tucker Carlson Show - February 13, 2026


How to Stop the Government From Spying on You, Explained by a Digital Privacy Expert


Episode Stats

Length: 1 hour and 55 minutes
Words per Minute: 153.293
Word Count: 17,722
Sentence Count: 1,003
Misogynist Sentences: 4
Hate Speech Sentences: 7


Summary

In this episode of the podcast, we sit down with computer scientist and cryptographer Yannick to talk about privacy, cryptography, and why both matter in the modern age. He has dedicated his life to preserving privacy, and in doing so, he is challenging the authority of the most powerful people on the planet.


Transcript

00:00:00.000 When the weather cools down, Golden Nugget Online Casino turns up the heat.
00:00:05.180 This winter, make any moment golden and play thousands of games like our new slot, Wolf It Up,
00:00:11.340 and all the fan-favorite Huff and Puff games.
00:00:14.480 Whether you're curled up on the couch or taking five between snow shovels,
00:00:18.820 play winter's hottest collection of slots, from brand new games to the classics you know and love.
00:00:24.860 You can also pull up your favorite table games like Blackjack, Roulette, and Craps,
00:00:29.440 or go for even more excitement with our library of live dealer games.
00:00:34.440 Download the Golden Nugget Online Casino app, and you've got everything you need to layer on the fun this winter.
00:00:41.480 In partnership with Golden Nugget Online Casino.
00:00:44.980 Gambling problem? Call ConnexOntario at 1-866-531-2600.
00:00:50.960 19 and over. Physically present in Ontario. Eligibility restrictions apply.
00:00:55.360 See GoldenNuggetCasino.com for details. Please play responsibly.
00:01:04.360 You've dedicated your life to preserving privacy.
00:01:09.180 So let's just start big picture.
00:01:11.960 What is privacy and why is it important?
00:01:14.860 So I believe that privacy is core to freedom at the end of the day.
00:01:21.860 I would even go as far as saying that it is synonymous with freedom.
00:01:25.740 And it is protecting you, protecting your inner core essentially, protecting your identity as a human being from forces that don't want you to be an individual and a human being at the end of the day.
00:01:44.020 Oh, that was so nicely put.
00:01:46.700 I think what it really boils down to is, and in that regard, I think privacy is relatively similar to what was originally intended also with the Second Amendment in the United States.
00:02:01.920 It is a tool for you as a human being to protect yourself against coercive force, against your very soul, your inner core.
00:02:11.800 So there are forces, and this has always been true at every time in history, that seek to make people less human, to turn human beings into slaves or animals or objects.
00:02:24.180 And privacy is the thing that prevents that.
00:02:27.240 So the crazy principle that exists within this universe is that there's this asymmetry baked right into the very fabric that we exist in.
00:02:39.000 There are certain mathematical problems where the effort required to undo them doesn't just scale linearly or exponentially,
00:02:47.760 but scales so violently that the universe itself prohibits persons without permission from undoing the problem; they literally cannot do it.
00:02:59.100 So what that means is that with a very little amount of energy, a minuscule amount of energy, a laptop, a battery, and a few milliseconds of computation,
00:03:09.600 you can create a secret that not even the strongest imaginable superpower on Earth is able to recover without your explicit grant of access.
00:03:22.380 That is the fundamental principle on top of which encryption, cryptography, and privacy in the modern age are built.
00:03:30.380 And it's so fascinating that the universe itself allows for this computational asymmetry, where I can create a secret, I can encrypt something, I can make something hidden,
00:03:41.460 and you with the most powerful imaginable coercive force, violence, you could imagine continent-sized computers running for the entire lifespan of the universe,
00:03:53.980 you would not be able to apply that force to my secret because I have encrypted it and the universe inherently sort of smiles upon encryption and appreciates that.
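A minimal sketch of that asymmetry in Python (standard library only; the attacker speed of 10**24 guesses per second is an invented, absurdly generous assumption):

```python
# Creating a secret is cheap; brute-forcing it is astronomically expensive.
import secrets

key = secrets.token_bytes(32)  # a 256-bit secret, generated in microseconds
keyspace = 2 ** 256            # keys an attacker would have to search

# Even a fantasy attacker testing 10**24 keys per second would need on the
# order of 10**45 years on average -- the universe is only ~1.4 * 10**10
# years old.
guesses_per_second = 10 ** 24
avg_years = (keyspace / 2) / guesses_per_second / (60 * 60 * 24 * 365)
print(f"average brute-force time: {avg_years:.2e} years")
```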
00:04:05.280 So I always found that so intoxicating, this concept that this is inherently baked into the universe,
00:04:12.200 it is an interaction between mathematics and physics sort of, and is a fundamental property just like you could say nuclear weapons are a fundamental property of reality, right?
00:04:25.440 And so encryption and privacy exist in this reality, and before we as humans figured that out, it wasn't necessarily clear, right?
00:04:36.860 It could also be that you can never hide something, encrypt something, keep something to yourself, but it turns out you actually can.
00:04:45.280 And so that is fascinating, I think.
00:04:47.740 And what it conceptually allows you to do is to take something and move it into a different realm, the encrypted realm, right?
00:04:56.980 And if someone else wants to go into that realm, follow you there, they would need unlimited resources to do so.
00:05:06.700 And I would say that's what really got me into cryptography and privacy.
00:05:11.460 Okay, I'm having all kinds of realizations simultaneously, first of all, that you're an extraordinary person.
00:05:18.180 I think that's clear from just listening for three minutes.
00:05:21.100 Okay, who are you, where are you from, and are you ready to suffer for your ideas?
00:05:29.140 Because what you've just articulated is the most subtle but direct possible challenge to global authority anyone could ever articulate.
00:05:39.200 But first, how did you come to this, where are you from?
00:05:42.360 Tell us about yourself for just a moment.
00:05:44.580 So I was born in Germany, I'm 25 years old, and I originally actually, in my life I studied law, and then later I studied mathematics and computer science.
00:06:01.360 And then, at some point, I met a few people who also had these kinds of ideas about privacy, technology, distributed technology, decentralization.
00:06:13.540 And we then decided to found a company that builds this kind of technology.
00:06:18.580 And that's how I ended up here, I guess.
00:06:20.760 So you're German, you're a product of Europe and European culture, which, for all of its wonderful qualities, is not about privacy.
00:06:30.320 It built the world, I love Europe, and the culture, but it's not a privacy culture.
00:06:34.340 It is not, no.
00:06:35.140 No.
00:06:35.720 So, especially German, how did you, why did you come to this conclusion when all of your neighbors didn't?
00:06:42.060 So I think it's interesting, right?
00:06:44.100 If you view privacy as this inherent political thing that protects you as a human being, there are data protection laws, GDPR, right?
00:06:57.500 There are fines against surveillance-capitalist tech giants in Europe.
00:07:01.740 But as you said, I feel like most of that stuff is a charade.
00:07:06.300 It's not really about protecting your privacy.
00:07:10.900 And we are seeing that in the UK, in the European Union.
00:07:14.700 I mean, there are so many cases that have already made significant movements this year.
00:07:20.940 So I would say for me personally, it has really been this technological and mathematical understanding of the power of this technology.
00:07:33.360 So realizing this, realizing that the universe allows us to do these things, and the universe sort of has this built right into it, got me so fascinated that I really thought deeply about this.
00:07:49.840 And what I realized sort of is that what humans have done in the past is that they've allowed information, right?
00:07:57.740 Any type of information that we now share with our mobile surveillance devices.
00:08:01.760 They've allowed that information to be encrypted and put at rest somewhere securely, right?
00:08:08.400 That is how encryption has mainly been used.
00:08:11.280 Or to do things like Signal is doing, where we do end-to-end encrypted messaging, right?
00:08:17.560 Where we are able to send some message from one human to another human being via some untrusted channel, right?
00:08:27.000 Where there can be interceptors that try to get those messages.
00:08:30.300 But thanks to mathematics, we are able to send this message across the whole universe.
00:08:35.780 And it arrives at the endpoint with no intermediary being able to take a look at the message because of this inherent property of the universe.
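A toy illustration of that in Python, assuming the third-party cryptography package and a key the two endpoints already share (real messengers such as Signal derive that key with an authenticated key exchange, which is omitted here):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # held by sender and receiver only
nonce = os.urandom(12)                     # must be unique per message

# Everything an interceptor sees in transit is indistinguishable from noise.
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

# Only an endpoint holding the key can reverse it.
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"meet at noon"
```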
00:08:44.140 What I realized sort of has been that there's a missing piece, which is whenever we are accessing this information, whenever we are interacting with this information, whenever we want to utilize it, basically, we have to decrypt it again.
00:08:59.160 Which then makes it accessible to whoever takes a look at it, right?
00:09:04.420 Whoever runs the machine that you decide to put that data on, which can be AWS, which can be cloud providers, big data, big AI, whoever, right?
00:09:14.960 And so this idea that I had was, what if we can take this asymmetry that is a fact of reality and move it to computation itself, so that all of those computations can be executed in private as well.
00:09:31.540 And then we can do some amazing things.
00:09:34.060 Then the two of us can decide to compute something together, not just exchange information via some secure communication channel, but actually perform some mathematical function over something, produce an output from some inputs.
00:09:48.060 But we can keep those inputs to ourselves.
00:09:51.000 So Tucker has a secret, Yannick has a secret, and with this technology, we can produce some value, some information, while you don't have to share your secret, I don't have to share my secret, and we can scale that to enormous sizes where the entirety of humanity can do those things, where countries can do those things.
00:10:10.600 But importantly, at its core, what we're doing is we're implementing this asymmetry that exists within the universe, and bringing that to the next level, to the final form, sort of.
00:10:21.960 And that's how I ended up founding Arcium, yeah.
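What he describes is known as secure multi-party computation. A minimal two-party sketch in Python using additive secret sharing over a prime field (a toy with invented inputs, not Arcium's actual protocol):

```python
# Tucker and Yannick learn the SUM of their secrets; neither learns the other's.
import secrets

P = 2**61 - 1  # a prime defining the field the shares live in

def share(secret):
    """Split a secret into two random-looking shares that sum to it mod P."""
    r = secrets.randbelow(P)
    return r, (secret - r) % P

t1, t2 = share(41)  # Tucker's secret: 41
y1, y2 = share(58)  # Yannick's secret: 58

# Each party combines only the shares it holds; one share alone reveals nothing.
partial_a = (t1 + y1) % P
partial_b = (t2 + y2) % P

print((partial_a + partial_b) % P)  # 99 == 41 + 58, with no secret exposed
```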
00:10:24.920 Getting older can make you realize you don't actually want all the things you have.
00:10:29.540 That's why mini-storage is so big.
00:10:31.500 Consumerism kind of loses its appeal.
00:10:33.520 What you really want is peace, peace of mind.
00:10:36.820 And what's the best way to get that?
00:10:37.980 Well, keeping your home and family protected would be at the top of the list.
00:10:41.600 Enter SimpliSafe.
00:10:42.620 This month, you get a 50% discount on your first SimpliSafe system.
00:10:48.020 SimpliSafe takes a much better approach to home security.
00:10:52.180 The idea is, how about we stop people, invaders, before they come into the house?
00:10:57.240 Not just trying to scare them once they're already inside your house.
00:11:00.100 And they do that with cameras, backed by live agents who keep watch over your property.
00:11:04.740 If someone's lurking outside, they will tell the person, get out of here, and then they'll call the police if they don't.
00:11:11.560 60-day satisfaction guarantee or your money back.
00:11:14.600 So there really is no risk here.
00:11:17.000 Not surprisingly, SimpliSafe has been named America's best home security system for five years running.
00:11:22.300 Protect your home today.
00:11:23.340 Enjoy 50% off a new SimpliSafe system with professional monitoring at SimpliSafe.com slash Tucker.
00:11:29.420 SimpliSafe.com slash Tucker.
00:11:33.520 There is no safe like SimpliSafe.com.
00:11:35.240 With the RBC Avion Visa, you can book any airline, any flight, any time.
00:11:41.440 So start ticking off your travel list.
00:11:43.980 Grand Canyon? Grand.
00:11:45.960 Great Barrier Reef? Great.
00:11:48.400 Galapagos? Galapago?
00:11:50.800 Switch and get up to 55,000 Avion points that never expire.
00:11:55.540 Your idea of never missing out happens here.
00:11:57.980 Conditions apply.
00:11:59.720 Visit rbc.com slash Avion.
00:12:06.120 Everyone needs help with something.
00:12:08.140 If investing is your something, we get it.
00:12:10.800 Cooperators Financial Representatives are here to help.
00:12:13.420 With genuine advice that puts your needs first, we got you.
00:12:17.040 For all your holistic investment and life insurance advice needs, talk to us today.
00:12:21.520 Cooperators. Investing in your future together.
00:12:23.980 Mutual funds are offered through Cooperators Financial Investment Services, Inc. to Canadian residents, except those in Quebec and the territories.
00:12:30.880 Segregated funds are administered by Cooperators Life Insurance Company.
00:12:33.180 Life insurance is underwritten by Cooperators Life Insurance Company.
00:12:35.260 I can't think of a more virtuous project.
00:12:37.980 When you said it in the first minute, the point of the project is to preserve humanity, to keep human beings human.
00:12:47.440 They're not just objects controlled by larger forces.
00:12:50.420 They're human beings with souls.
00:12:51.620 And again, I don't think there's any more important thing that you could be doing with your life.
00:12:57.760 So thank you for that.
00:12:59.420 Can you be more specific about our current system and how it doesn't protect privacy?
00:13:05.560 Yes, so I would say, I think there's a lot of things to unravel.
00:13:15.980 If we take a look at the systems that we are interacting with every single day, those tools and applications, those social media networks, basically everything that we do in our digital lives, all of our lives have basically shifted from physical reality to this digital world.
00:13:37.580 So everything we do, everything we do in this room, everything we do when we are out in the street, because all of the technology has become part of physical reality, physical reality has been consumed, sort of.
00:13:50.520 And so all of this has been built on top of what the former Harvard professor Shoshana Zuboff has called surveillance capitalism, right?
00:13:59.540 And I think that that really lies at the core.
00:14:01.940 And it's relatively straightforward to understand what those companies are doing if you ask yourself, hey, why is this application that I'm using actually free, right?
00:14:14.080 Why is nobody charging me to ask this super intelligent chatbot questions every day?
00:14:22.080 Why are they building data centers for trillions of dollars while I don't have to pay anything for it, right?
00:14:28.640 So that's the question that you need to ask yourself, right?
00:14:31.340 And what you end up realizing is that all of those systems are basically built as rent extraction mechanisms, where you're not really a user, you're sort of a subject of those platforms, and value is being extracted from you without you noticing.
00:14:54.860 And they're able to extract value from you because all of your behavior, all of your interactions with those systems are being taken and they perform mass surveillance, bulk surveillance.
00:15:09.180 And it's those companies, right?
00:15:10.740 We're just talking about companies.
00:15:11.940 We're not even talking about intelligence or governments or anything.
00:15:15.380 We're just talking about those companies that exist within our economy.
00:15:19.820 And so they record everything they can because every single bit of information that I can take from your behavior allows me to predict your behavior.
00:15:29.560 And where I can predict your behavior, I can utilize that to, in the most simple case, do something like serving you ads, right?
00:15:37.380 But in more complex cases, I can do things like I can steer your behavior.
00:15:42.560 I can literally control you.
00:15:44.740 I can turn you into a puppet that does whatever I want.
00:15:48.300 And so those are the systems that we are faced with right now.
00:15:52.160 And the internet has sort of been this amazing emancipator for humanity, right?
00:15:57.940 This show is only possible because of the internet.
00:16:00.600 Otherwise, with traditional media, we wouldn't be able to speak about those topics, I feel like.
00:16:05.300 That's right.
00:16:05.600 But at the same time, sort of nowadays, it has transformed into one of the biggest threats to human civilization.
00:16:16.320 At the user level, at my level, the level of the iPhone owner, is it possible to communicate privately with assurance of privacy with another person?
00:16:28.340 That's an interesting question.
00:16:29.760 So we start with this concept of insecure communication channels.
00:16:34.320 And since every communication channel is insecure, what we employ is end-to-end encryption.
00:16:42.360 And end-to-end encryption allows us to take this information, take a message, and lock it securely so that only Tucker and Yannick are able to unlock it and see what's going on.
00:16:56.400 And that is a fact.
00:16:57.360 So there have been many cases where big players with big interests, I guess, have attempted to undermine cryptography, have attempted to get rid of end-to-end encryption, to install backdoors.
00:17:11.120 There has been what is commonly called the crypto wars in the 1990s, right?
00:17:16.380 Where the cypherpunks fought for the right to publish open source encryption and cryptography.
00:17:24.480 And many, many more cases, I guess.
00:17:26.400 But at the end of the day, I would say, as a realistic assessment, this kind of cryptography is secure and it works.
00:17:32.660 Now, that, unfortunately, is not the whole answer because what you have to think about is, now, what happens with those end devices, right?
00:17:41.340 Fair.
00:17:41.880 I mean, the message, the messenger, right, that is being sent from Yannick to Tucker might be secure.
00:17:47.880 But now, if I cannot undermine and apply force to this message to understand what's inside, well, I'm just going to apply force to your phone.
00:18:00.100 And that's sort of what's happening.
00:18:01.220 So, when we look at different applications, for sure, there is a whole variety of applications, messaging applications, right, that do not employ encryption and security standards and might collect all of your messages and images and utilize them for those machines, right, that extract as much value as possible from you.
00:18:26.400 But there's applications like Signal that don't do that, that are actual open source cryptography technology that anyone can verify themselves and take this code and turn it into an actual application, install it on your phone.
00:18:40.520 All of those things are possible, right?
00:18:42.100 So, that's not the issue.
00:18:43.420 The underlying issue really is that you have this device in your hand that is sort of closed hardware.
00:18:49.800 You don't know how that thing works, right?
00:18:52.200 It is impossible to understand how that thing works.
00:18:54.400 It is impossible to understand how the operating system on that thing works.
00:18:58.620 And there's flaws in those systems, right?
00:19:01.900 Those are closed systems.
00:19:03.360 There's flaws in those systems for some reason because people don't always have the best interests of others in mind.
00:19:11.640 Not always.
00:19:12.560 Not always.
00:19:13.980 But also because people make mistakes, right?
00:19:16.300 Honest mistakes that are non-malicious.
00:19:18.160 And so, I think that in general also speaks for the importance for free accessible hardware where people with technical skills can play around with and find issues.
00:19:29.020 But at its core, what you're being subjected to right now, I would say, is tactical surveillance.
00:19:37.680 And what it means is that there's some actor, can be some state actor, can be someone else, that decides that Tucker Carlson is worth surveilling.
00:19:49.760 I think that has been decided, yeah.
00:19:51.680 You think so?
00:19:52.700 I think I do, yeah.
00:19:53.440 I'm getting that sense.
00:19:54.700 Yeah.
00:19:55.340 So, tactical surveillance.
00:19:57.140 That means that you specifically are being targeted.
00:20:00.200 And that is in contrast to strategic surveillance, which is this idea of everyone is being surveilled.
00:20:07.480 Let's just surveil everyone, collect every single bit of information, and store that for the entirety of human history.
00:20:14.380 And then someday, maybe we'll be able to use that, right?
00:20:17.100 So, those are those two concepts.
00:20:19.380 And what we've seen over the last few years is sort of a shift away from tactical surveillance towards strategic surveillance.
00:20:28.980 And surveillance capitalism has really helped this concept, because there's so much data that is being logged, that can be stored.
00:20:36.720 There are so many new devices and applications that can be employed.
00:20:40.740 And so, we see pushes like, for example, Chat Control within the European Union, which is sort of an attempt to implement backdoors within all of the messenger applications, to be able to scan your messages, take your messages somewhere else, and decide whether or not those people like what you're saying within your private messages.
00:21:02.860 So, I would say, in general, as a normal human being, with your iPhone, you are still able to privately communicate.
00:21:12.080 That is still something that exists.
00:21:15.740 However, this ability has greatly been limited.
00:21:18.820 If there is someone who wants to see your message, I would say they can, unfortunately.
00:21:25.220 How difficult is it for a determined, say, state actor, an intel agency, to say, I want to read this man's communications, listen to his calls, watch his videos, read his texts?
00:21:38.000 How hard is it for them to do that?
00:21:40.000 So, I think that, and we can look at different court cases that have publicly emerged in regards to Apple, for example, right?
00:21:47.780 Where Apple has refused to give intelligence agencies backdoor access to their devices.
00:21:55.700 And what's so important about this discussion that we are having here is that every time you build a system where you add backdoor access, someone in the future can decide to get access and take a look at what you're writing, right?
00:22:11.880 And what that invites is for everyone to do that, because a backdoor inherently is a security flaw in the system.
00:22:18.820 And it's not just some specific intelligence agency that decides to read your messages, right?
00:22:24.440 It's every intelligence agency on Earth at that point, right?
00:22:27.700 And so, that's why, as a nation, you cannot weaken security by getting rid of privacy without weakening your entire economy, cybersecurity, and also social fabric at the end of the day, right?
00:22:42.860 And the whole strategic positioning of you as a nation.
00:22:45.140 How difficult it is, I would say, also, from a practical operational security standpoint, depends on what are you doing with your phone, right?
00:22:59.660 Is your phone this strict device that is only used for messaging, or is your phone also using different types of media?
00:23:07.520 Are you sending images?
00:23:09.080 Are you receiving messages?
00:23:09.920 So, I think two years ago, there was this case where there was a zero-day exploit being used across Apple devices, because when I sent you an image and your messenger had auto-download on, I could get full access to your phone by sending you a message.
00:23:31.860 And you're not my contact even, probably, right?
00:23:35.180 I just figure out what your phone number is, I send you an image, the image gets automatically downloaded, some malicious code that I have injected gets executed, and now I own your phone and I can do whatever I want.
00:23:47.760 And then, end-to-end encryption doesn't help you, right?
00:23:49.920 Because I have literal access to the end device that decrypts this information.
00:23:53.980 And so, that's very dangerous, that has been fixed, but I think what it highlights really is that complexity is the issue here.
00:24:01.460 So, complexity in the kinds of applications that you're running, complexity in the underlying operating system that this device has, all of that complexity invites mistakes and also malicious security flaws to be installed in those systems.
00:24:16.880 Of course.
00:24:17.320 Yeah.
00:24:17.680 Human organizations are the same way.
00:24:19.280 The bigger they are, the easier they are to subvert.
00:24:21.800 Yes.
00:24:22.180 Of course.
00:24:22.720 Yeah.
00:24:22.980 February is the perfect month to get cozy because it's chilly outside.
00:24:26.640 Our partners at Cozy Earth understand this, and they're helping Americans everywhere stay toasty throughout the frigid winter.
00:24:33.820 We hope you're seated because this detail may shock you.
00:24:36.440 Cozy Earth offers bamboo pajamas.
00:24:39.760 Lightweight, shockingly soft, these pajamas are a true upgrade.
00:24:43.340 They sleep cooler than cotton.
00:24:44.820 Plus, they're made out of bamboo.
00:24:46.020 That is just wild and awesome.
00:24:48.380 From pajamas and blankets to towels and sheets, Cozy Earth is something unusual and great for everybody, and it's entirely risk-free.
00:24:54.580 You get a 100-night sleep trial, 10-year warranty.
00:24:58.580 There is no downside that we can see.
00:25:00.780 So share love this February.
00:25:02.840 Wrap yourself for someone you care for in comfort that feels special.
00:25:07.040 Bamboo pajamas!
00:25:08.980 Visit CozyEarth.com.
00:25:10.660 Use the code TUCKER for 20% off.
00:25:13.780 That's code TUCKER for up to 20% off.
00:25:15.620 And if you get a post-purchase survey, make certain to mention you heard about Cozy Earth from...
00:25:19.580 Got PC Optimum points?
00:25:22.220 Visit Shoppers Drug Mart for the bonus redemption event and get more for your points.
00:25:26.340 Friday, February 13th to Wednesday, February 18th.
00:25:29.340 Valid in-store and online.
00:25:34.020 Investing is all about the future.
00:25:36.140 So, what do you think is going to happen?
00:25:38.140 Bitcoin is sort of inevitable at this point.
00:25:40.640 I think it would come down to precious metals.
00:25:43.220 I hope we don't go cashless.
00:25:45.080 I would say land is a safe investment.
00:25:47.940 Technology, companies, solar energy.
00:25:50.100 Robotic pollinators might be a thing.
00:25:52.680 A wrestler to face a robot?
00:25:54.360 That will have to happen.
00:25:55.960 So, whatever you think is going to happen in the future, you can invest in it at Wealthsimple.
00:26:01.700 Start now at Wealthsimple.com.
00:26:03.980 Us.
00:26:04.960 So, that's very...
00:26:06.040 I mean, that's a very simple thing, though.
00:26:07.360 Yeah.
00:26:07.580 To send someone, you know, to text him an image and all of a sudden you have control of his phone.
00:26:12.680 Luna, I think we can be fairly confident that people who have adversaries are being surveilled, right?
00:26:19.140 Yes, I think so.
00:26:20.120 I would say that tactical surveillance really is something that exists.
00:26:27.740 I would say in this battle for privacy, it is actually not the most important thing to focus on, right?
00:26:34.900 Because this kind of tactical surveillance, sort of, I feel like to a certain degree we need to accept, unfortunately, right?
00:26:44.020 Not the tactical surveillance that says, Tucker Carlson is a journalist.
00:26:47.580 I don't like that.
00:26:48.560 Let me surveil him, right?
00:26:49.720 That's not the kind of tactical surveillance I'm speaking of.
00:26:52.260 But if we have legal procedures and actual judicary warrants in place, right?
00:26:58.380 I feel like as a society, we could accept that to convert certain criminal activity.
00:27:03.020 As long as we trust our systems, we can definitely accept that, of course.
00:27:07.120 But the fundamental issue really is, and that's sort of so ironic, right?
00:27:11.900 That all of the surveillance, sort of, needs to operate under secrecy in order to function, right?
00:27:18.980 You should not know that you're being surveilled.
00:27:21.580 Nobody, sort of, has oversight.
00:27:23.540 Not even the democratic processes are able to have oversight because it's all wrapped in secrecy.
00:27:30.340 So that really brings us to the fundamental issue here, also with strategic surveillance, surveilling everyone.
00:27:35.820 Just deciding, well, I'll take a look at everyone's phone, store everything, and maybe I don't like someone in the future, then I have this backlog of information.
00:27:45.360 So the important question to consider here is thinking about, is there even a future where, from a legal standpoint, it is possible to implement procedures that guarantee that there is no secret surveillance in place?
00:28:01.960 Yes, which I think the answer is pretty clear to that question.
00:28:06.600 And it is?
00:28:07.580 It is.
00:28:08.520 I think it is not.
00:28:09.880 So I think it is important to have these laws in place, right?
00:28:14.480 Of course.
00:28:14.680 That prohibit surveillance and that enable different kinds of processes with warrants, right?
00:28:22.380 Literally the Fourth Amendment, right?
00:28:24.520 Yes.
00:28:24.960 To allow for that to be implemented in the 21st century.
00:28:29.260 But what we've seen sort of is that the tools that governments have access to are so powerful that it is impossible to make a law that prohibits their use.
00:28:46.140 Because within a centralized architecture, and that's always the case, whoever has access to this technology basically becomes a single point of failure.
00:28:56.300 And that single point of failure will necessarily be corrupted by the power that exists.
00:29:03.900 Just a couple obvious lowbrow technical questions.
00:29:09.300 Is the iPhone safer than the Android or less?
00:29:11.920 That's a good question.
00:29:14.540 So I would say a huge advantage that Android devices bring to the table, at least a subset of those devices,
00:29:26.300 not speaking for the entirety, is the operating system, for example, being publicly viewable by anyone, right?
00:29:32.660 You can understand it.
00:29:33.920 And I think that is so important, not just for security, but also for technological innovation.
00:29:40.580 And so I would say that is a huge advantage.
00:29:43.920 Now, the devices are manufactured by some manufacturer who you need to trust at the end of the day,
00:29:50.220 based on how the hardware is built and how the firmware is compiled and then put on your device.
00:29:57.660 So there have been interesting operating systems.
00:30:01.040 I think there's one called GrapheneOS, which is a secure open source operating system, as far as I know.
00:30:08.280 I haven't looked too deeply into that.
00:30:09.780 But you could, on an Android device, theoretically say, I'm going to run my own operating system on that,
00:30:15.440 which I think is a strong value proposition.
00:30:17.960 Now, I myself am also an Apple user.
00:30:20.020 There is also a sort of element of institutional trust involved here, right?
00:30:26.280 Where you say, okay, I trust the manufacturing and software process that this company has.
00:30:32.940 But in general, if I'm being honest, if I weren't lazy, right?
00:30:38.720 What I'd be doing is I would actually be looking for a minimalistic, secure, open source operating system for my mobile phone.
00:30:47.600 And I would build that myself and get some hardware and put that on there.
00:30:53.960 So I would say that would be the smartest thing to do if you are technically versatile.
00:30:58.980 I read that you use an iPad, not a Mac.
00:31:01.720 Is there an advantage?
00:31:02.620 That's what I did back in the day when I started.
00:31:05.120 Yeah.
00:31:05.340 Is there an advantage to the iPad over the Mac from a privacy standpoint?
00:31:09.660 I think what it boils down to there is what kind of applications could be installed on your system.
00:31:23.760 I would say in general, devices like the iPhone or the iPad operate in a more sandboxed way where applications are actually isolated, right?
00:31:37.980 Rather than how it works on operating systems like macOS or Windows, right?
00:31:43.360 Where you could compromise the entire system way more easily, right?
00:31:47.760 So on the iPhone, you just have an app store with applications and the level of compromise that such an application can have, theoretically, at least from the idea, is limited to just the single application, right?
00:32:02.620 It doesn't have access to your messenger if you're installing an app.
00:32:06.480 Although it could, I guess, if there's some flaw in the system, which always is the case.
00:32:11.500 So you never have this absolute security.
00:32:13.840 I think what it really boils down to is this idea that really emerged in the 1990s of decentralization, right?
00:32:25.020 Moving away from central single points of failures towards decentralization, where we can mitigate a lot of these risks by not depending, I guess, on one single type of computer and not even depending on one single computer,
00:32:39.700 but having many computers, which introduces redundancy, resilience, and, I guess, risk reduction and distribution to computer systems.
00:32:49.580 So speaking more broadly about how the internet in a free society should be built, I guess, yeah.
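A minimal sketch of that redundancy idea in Python: query several independent replicas and take the majority answer, so one compromised machine is no longer a single point of failure (the replica results here are invented):

```python
from collections import Counter

def majority(results):
    """Return the value most replicas agree on; fail without a majority."""
    value, votes = Counter(results).most_common(1)[0]
    if votes <= len(results) // 2:
        raise RuntimeError("no honest majority among replicas")
    return value

# Three independent nodes compute the same function; one has been corrupted.
print(majority([4, 4, 5]))  # -> 4, the corrupted node is outvoted
```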
00:32:55.780 So most people don't wake up in the morning and decide to feel horrible, exhausted, foggy, disconnected from themselves.
00:33:01.600 But it does happen, and it happens slowly.
00:33:03.700 You're working hard, you're showing up, and then your energy disappears by midday.
00:33:08.160 Your focus is dull, your weight won't move.
00:33:11.120 A lot of people are told, that's just getting old.
00:33:13.140 That's what it is.
00:33:13.840 But that's not actually true.
00:33:16.480 For many men and women, these are not personal failures.
00:33:19.460 They are signals tied to your metabolism, your hormones, and nutrient imbalances that go undetected for years.
00:33:26.060 You don't even know you're deficient.
00:33:27.560 And that's why we're happy to partner with Joy and Blokes, a company that was built for people who are done guessing and ready to figure out what exactly is going on.
00:33:36.660 And that starts with comprehensive lab work and a one-on-one consultation with a licensed clinician.
00:33:42.000 An actual human being explains what's happening inside you and builds a personalized plan, which includes hormone optimization, peptide therapy, targeted supplements.
00:33:50.560 So don't settle.
00:33:51.900 Go to joyandblokes.com slash Tucker.
00:33:53.900 Use the code Tucker for 50% off your lab work and 20% off all supplements.
00:34:00.200 That's joyandblokes.com slash Tucker.
00:34:03.080 Use the code Tucker.
00:34:04.040 50% off labs, 20% off supplements.
00:34:06.900 Joy and Blokes.
00:34:08.280 Get your edge back.
00:34:10.040 Stuck in that winter slump?
00:34:11.840 Try Dove Men plus Care Aluminum Free Deodorant.
00:34:15.000 All it takes is a small change to your routine to lift your mood.
00:34:17.920 And it can be as simple as starting your day with the mood-boosting scents of Dove Men Plus Care Aluminum Free Deodorant.
00:34:23.940 It'll keep you feeling fresh for up to 72 hours.
00:34:27.380 And when you smell good, you feel good.
00:34:30.320 Visit Dove.com to learn more.
00:34:33.280 Things are feeling a little less human these days, aren't they?
00:34:37.040 But isn't the whole point of progress to make things more human?
00:34:40.500 That's why, at TD, when we design a product, whether it's an app for making trading easier or monitoring your account for fraud,
00:34:47.920 we ask one simple question.
00:34:50.180 How does this help people?
00:34:52.460 That's how we're making banking more simple, more seamless, and more intuitive.
00:34:57.720 But most importantly, that's how TD is making banking more human.
00:35:02.020 You've said a couple of times that the problem is the hardware.
00:35:07.320 It's not the software.
00:35:09.460 So it is the device, right?
00:35:12.360 It's the union of the hardware and the software.
00:35:14.440 Yes.
00:35:14.880 So what's the option?
00:35:16.640 Is there an option at this point?
00:35:17.860 If I am intent on sending a private message to someone else electronically,
00:35:25.340 is there a way to do it as of right now that's private, guaranteed private?
00:35:31.400 So I would say the way that I myself at least handle it really is to have a dedicated phone for that specific use case, right?
00:35:41.420 And then just have an encrypted messenger there that you can trust, because maybe you don't even install it via the App Store,
00:35:49.080 but you have built it yourself, and there are no other interactions taking place with that phone.
00:35:54.760 I would say from an operational security standpoint, that is as good as it can get.
00:36:00.080 Otherwise, you would really have to look at, I don't know, you can do creative things always, right?
00:36:06.020 You could write your message and hand encrypt it and then type it in the phone, right?
00:36:12.200 So it doesn't matter at that point.
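Taken to its extreme, the "hand encrypt it first" idea is a one-time pad, sketched here in Python: if the pad is truly random, as long as the message, and never reused, the phone only ever sees noise (the message and the in-person pad exchange are assumptions of the toy):

```python
import secrets

msg = b"meet at noon"
pad = secrets.token_bytes(len(msg))  # exchanged in person, used exactly once

cipher = bytes(m ^ p for m, p in zip(msg, pad))  # all the device ever sees
plain = bytes(c ^ p for c, p in zip(cipher, pad))
assert plain == msg
```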
00:36:14.280 So maybe we need to get away from the devices altogether, right?
00:36:19.920 What's interesting, what we're doing with Arcium is that we never have a single point of failure.
00:36:26.980 Everything is encrypted.
00:36:28.300 Everything sits within a distributed network where as long as you're not able to basically get access to the entire globally distributed network,
00:36:38.680 to every single participant, you have security.
00:36:42.340 And it's difficult to do that with your own phone.
00:36:47.700 But at the end of the day, I think over time, those systems get more secure.
00:36:55.420 However, what is important is to be certain that there are no backdoors explicitly installed, right?
00:37:01.500 From those manufacturing processes.
00:37:02.880 I think there's some countries where if you're buying a phone from there, you could be certain,
00:37:09.820 okay, there might be something installed because the company itself is owned by the government.
00:37:14.180 And we need legal frameworks for that.
00:37:19.340 And also what we require sort of is that the manufacturing process itself mirrors distributed decentralized systems.
00:37:30.180 Where there, again, is not a supply chain of single points of failure, where if one single worker decides to install some backdoor because they get paid off, right?
00:37:41.960 They can do so.
00:37:42.780 But instead, there is oversight.
00:37:44.440 And I think that Apple runs on that model already.
00:37:48.680 So, I would be relatively comfortable with these kinds of systems.
00:37:53.400 But there's also other interesting technologies.
00:37:56.040 So, for example, Solana, which is an American company, blockchain network, right?
00:38:02.960 They actually have their own phone company, offering phones.
00:38:09.760 They have a very small manufacturer, and they manufacture those phones because they say, well, those phones need to be very secure, because you literally store your money on them now, since your money is digital and on top of a blockchain network.
00:38:24.300 And so, I think those are very interesting approaches where I'm really looking forward to seeing more phones like this where there's, then again, a competitive market emerging for who's building the most secure phone.
00:38:39.860 Yeah.
00:38:40.080 I actually think a friend of Julian Assange from Germany, I don't remember his name, had a company manufacturing secure phones.
00:38:52.040 The issue with explicitly built secure phones, however, always is that I would say many of these companies are honeypots.
00:39:01.800 I've noticed.
00:39:02.880 Yeah.
00:39:03.140 With the EncroChat or whatever it was called, there was this large-scale police operation to stop drug cartels, which worked out nicely, I guess, in the end.
00:39:16.860 But the company itself was just a facade to sell backdoor phones.
00:39:22.720 Yeah.
00:39:23.940 Right.
00:39:24.540 I mean, it's the perfect honeypot.
00:39:27.060 And so, by the way, Signal, which I'm not saying is a honeypot, of course, and I use it, as the authorities know.
00:39:35.840 But it was created with CIA money.
00:39:39.440 So, it doesn't mean it's a CIA operation, but why wouldn't it be?
00:39:45.480 I mean, honestly, I'm not accusing anybody because I have no knowledge.
00:39:48.920 Yeah.
00:39:49.300 But it'd be a pretty obvious move, right?
00:39:52.020 It would be.
00:39:54.000 I think what's important when we look at Signal, actually, is that we look at what Signal is.
00:40:02.380 Signal is open source software.
00:40:05.020 Yes.
00:40:05.240 That anyone can verify for themselves.
00:40:08.240 And what it means is that we have this global community of mathematicians and cryptographers that have invented those protocols, that have independently, without getting funding from CIA or whomever, thought of mathematical problems that they want to solve, that they are passionate about.
00:40:26.640 And all of those people look at those open source lines of code and mathematical formulas, and they find those flaws in those systems.
00:40:36.700 And so, that makes me confident in the design of Signal itself.
00:40:42.460 Do you use it?
00:40:43.400 I use Signal, yes.
00:40:44.480 I got my entire family to use Signal.
00:40:46.260 Okay, good.
00:40:46.920 So, well, that's, and I have to say, I know a lot of Intel people use Signal.
00:40:51.220 Yeah.
00:40:51.620 A lot, all the ones I know.
00:40:54.100 And so, that tells you something right there.
00:40:56.260 Yes.
00:40:56.760 So, I think it would be highly unlikely that Signal itself would actually turn out to not be secure.
00:41:05.600 There has been this interesting case, in the early 2000s, where there was this attempt to actually undermine strong encryption, called, very exotic name, the Dual Elliptic Curve Deterministic Random Bit Generator.
00:41:24.340 Dual_EC_DRBG, right?
00:41:27.900 Nobody understands, no non-technical person understands what that means, right?
00:41:32.060 What you need to understand in order to comprehend what happened there is that when we encrypt information, as I said earlier, we take something and move it into this different realm, where you cannot follow this information, because that would require you to have literally infinite resources, more energy than the sun will emit over its lifespan.
00:41:58.440 Isn't that crazy, right?
00:41:59.800 So, you cannot follow there.
00:42:03.200 Well, how fundamentally this asymmetry is achieved in cryptography is that the universe runs on energy and uncertainty, right?
00:42:14.300 Particles jitter, stars burst.
00:42:16.620 And so, there's this randomness in the universe.
00:42:19.120 If you look at the sky or if you just look at how things are made up, there's random noise everywhere.
00:42:27.080 Yes.
00:42:27.220 And so, when we encrypt something, we make use of that chaos and we inject it into a message that we are sending, for example.
00:42:36.620 And it's only possible to prevent that message from being decrypted in an unauthorized way if the randomness that has been injected into this message is actually unpredictable.
00:42:50.440 Now, if we think of random…
00:42:51.860 Unless it's truly random.
00:42:52.940 It has to be truly random, yeah.
00:42:55.180 I cannot figure out how you arrived at the random number.
00:42:59.680 No pattern.
00:43:00.740 No pattern.
00:43:01.420 Exactly.
00:43:01.900 True randomness, true entropy, right?
00:43:03.520 Yes.
00:43:03.900 That's what cryptographers, I would say, spend most of their time on, thinking about how can we achieve true randomness.
00:43:11.500 Because then, if we are able to inject that using mathematics, for you, it becomes impossible to distinguish this message from randomness.
00:43:21.080 You can't find a pattern.
00:43:22.500 Hence, you're not able to apply any optimized algorithm to undermine it.
00:43:26.740 Exactly.
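In Python terms, the distinction being drawn is between a deterministic, seedable generator and one backed by OS entropy; a minimal sketch (the seed 1337 is arbitrary):

```python
import random
import secrets

# Predictable: anyone who knows (or dictates) the seed can replay the whole
# stream -- the magician's "shuffled" deck.
rng = random.Random(1337)
print([rng.randint(0, 255) for _ in range(4)])  # identical on every run

# Unpredictable: drawn from OS entropy, suitable for cryptographic keys.
print(list(secrets.token_bytes(4)))             # different on every run
```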
00:43:27.040 So, if we think about it practically, what that means is, let's say we have a deck of cards, 52 playing cards, right?
00:43:34.500 And I randomly shuffle this deck of poker cards.
00:43:39.520 We have 52 cards.
00:43:40.820 What that means is that there's so many possible ways that a deck could be stacked,
00:43:47.580 that it is very unlikely that for truly randomly shuffled decks there have ever been two identical decks in the history of humanity.
00:43:57.200 Which is hard to believe in general, but that's how statistics and mathematics work, right?
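The deck claim checks out; one line of Python:

```python
import math

# Orderings of a 52-card deck: 52! is about 8.07 * 10**67, so two independent,
# truly random shuffles coinciding is essentially impossible in human history.
print(f"{math.factorial(52):.3e}")  # -> 8.066e+67
```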
00:44:02.180 So, we take this deck and we use it as the randomness.
00:44:06.440 Now, if I play with a magician, the magician can pretend to shuffle the deck, but actually they have not shuffled the deck.
00:44:13.800 They know what the cards look like.
00:44:16.600 What we're doing with all of this randomness that we are injecting into information is we are basically describing what key is being used to unlock it.
00:44:29.500 And if I don't know what the randomness looks like, if I don't know what the next playing card in the stack is,
00:44:36.220 I have to try every single possible key and try to unlock this message with it.
00:44:42.620 So, you could think of it as, I have this message.
00:44:44.620 Now, I want to apply violence to this message in order to recover it.
00:44:48.840 What I'm doing is I take key number one, I try to unlock it, doesn't work.
00:44:54.320 Then let's try key number two.
00:44:55.980 And you do that for an inconceivably large number of keys.
00:45:00.260 So, that's why you basically, practically speaking, cannot brute force these kinds of mechanisms.
00:45:05.600 Although, you can if you know where to start looking for the keys.
00:45:09.740 If you know that you need to start looking at the millionth key, then you can recover it.
00:45:16.500 And so, if the deck is being manipulated, the randomness is being manipulated, then you can undermine encryption.
00:45:23.940 While the process of encrypting it itself remains sound, right?
00:45:28.780 You don't notice it.
00:45:29.700 You actually do what you mathematically need to do to securely send your message.
00:45:34.660 But the value that you use to do so, this randomness, is actually not random.
00:45:38.740 And that's what had been attempted with this specific algorithm, Dual_EC_DRBG.
00:45:48.060 What they did was they created this concept of kleptography, where they actually derive the randomness
00:45:55.300 in a way that is deterministic, and they have some secret value.
00:46:00.740 And then from that secret value, they derive fake randomness.
00:46:03.860 It looks random, but it's not actually random.
00:46:05.920 And the NSA proposed this algorithm to NIST, the National Institute of Standards and Technology,
00:46:16.200 in the early 2000s, as the best state-of-the-art randomness derivation function, I guess, right?
00:46:25.720 And that got accepted.
00:46:28.180 It got accepted as an official standard.
00:46:30.460 And then there were companies like RSA, actually a highly sophisticated and respected cryptography company, right?
00:46:38.800 With the founders being some of the fathers of modern-day cryptography, right?
00:46:45.160 That then built products and distributed them to industry and people using this technology.
00:46:52.600 Nobody knew about it, but it's not actually true that nobody knew about it.
00:46:57.540 So there were a lot of cryptographers that raised questions a couple of years later,
00:47:02.580 where they were like, I don't think this is actually random.
00:47:06.400 It looks suspicious to me.
00:47:07.940 But they were like, if someone theoretically had access to some secret key S,
00:47:13.340 and then created some mathematical formulas and actually mathematically proved that there was insecurity there.
00:47:20.620 It was not random.
00:47:21.880 Because they noticed a pattern and it's...
00:47:23.500 So basically what they realized is that there's just those numbers.
00:47:30.480 So they wrote this proposal.
00:47:33.480 Hey, let's use this algorithm.
00:47:35.260 And this algorithm contains some constant numbers.
00:47:38.780 So there's those numbers written there.
00:47:40.880 And then they were like, are those numbers random?
00:47:43.460 Because we are literally deriving our randomness from those numbers.
00:47:46.460 And the answer was, yeah, those are random.
00:47:48.240 We randomly generated them.
00:47:49.840 So it turns out there was some other key that could be used to then mathematically recover whatever randomness you used.
00:47:56.560 So that was the secret attempt to undermine cryptography.
00:48:01.200 By the US government.
00:48:03.120 Yes, yes.
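A toy version of that kleptographic trick in Python: output that looks random but is derived deterministically from a designer's secret, so whoever planted the constants can replay your "randomness" (the secret and session label are invented, and the real Dual_EC_DRBG used elliptic-curve points rather than a hash):

```python
import hashlib

DESIGNER_SECRET = b"known only to the standards author"

def backdoored_rng(session_id: bytes, n_bytes: int) -> bytes:
    """'Random' bytes that anyone holding DESIGNER_SECRET can regenerate."""
    return hashlib.sha256(DESIGNER_SECRET + session_id).digest()[:n_bytes]

# The victim derives a "random" key; it passes casual statistical inspection.
key = backdoored_rng(b"session-42", 16)

# The designer reconstructs it exactly, with no brute force at all.
recovered = hashlib.sha256(DESIGNER_SECRET + b"session-42").digest()[:16]
assert recovered == key
```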
00:48:04.180 And I think what's so striking about this again is that you're not just undermining privacy, right?
00:48:12.720 You're undermining the entire security of your economy, your country, right?
00:48:17.660 And banking, missile codes, everything.
00:48:22.680 Yes, everything.
00:48:23.560 So the thing that then happened was in 2013, Snowden revealed a few papers, I guess.
00:48:36.940 And one of those was Project Bullrun.
00:48:43.040 And within Project Bullrun, they allocated funding to that specific project where they tried to undermine cryptography.
00:48:53.880 And so once that got published, the corresponding companies and standardization institutes withdrew it.
00:49:01.520 And it's so striking that you get standardization because once it's defined as a standard, you and industry need to implement it, right?
00:49:10.600 To get certification.
00:49:11.400 So it's literally impossible to then use some other alternative that is secure, because certification only gets provided for this backdoored technology.
00:49:24.100 But it got uncovered thanks to Snowden.
00:49:26.780 Then people stopped using it.
00:49:28.440 Was he celebrated?
00:49:29.520 Did he win the Presidential Medal of Freedom for this?
00:49:31.540 Yes, in an alternative reality.
00:49:36.020 In a different realm, I guess.
00:49:38.320 One of the great patriots of our time.
00:49:41.160 Relentless.
00:49:41.740 I mean, they'd murder him in a second.
00:49:43.840 He's still in exile.
00:49:45.400 Not by choice.
00:49:46.680 But yeah.
00:49:49.760 What they also uncovered is that the NSA actually paid this company that built those products 10 million U.S. dollars to use that as a standard.
00:49:59.220 So yeah, that's why you cannot trust anyone.
00:50:03.320 As you point out, it's not simply, I mean, so this is an intel agency trying to spy on its own people, the ones who pay for it to exist.
00:50:12.900 But it's, and that's immoral and, you know, something that we should fight against.
00:50:17.840 But they were also sabotaging the U.S. economy and U.S. national security.
00:50:22.760 Because if your cryptography is fake, that means you're exposed on every level throughout your society.
00:50:31.040 You are, yes.
00:50:31.740 Yeah.
00:50:32.220 And it's so interesting because it is their task.
00:50:36.200 That's why it was possible for them to do that, to increase national security, right?
00:50:41.900 At that point, they were the leading cryptography research company in the world, sort of, right?
00:50:47.980 And so that really is striking to me, that you're willing to undermine the entire security of your nation.
00:50:56.600 And that, at the end of the day, puts you in a worse strategic position.
00:51:01.380 I think many people don't realize that.
00:51:04.120 I never thought about it until you mentioned it, but it just, it highlights, I mean, I love Ed Snowden and I'm not embarrassed of that, I'm proud.
00:51:10.960 But it just highlights, you know, the suffering that he's been through in order to help his own country.
00:51:18.300 And he's still slandered constantly in it.
00:51:22.000 It drives me crazy.
00:51:23.380 But this is yet another example of why he did something more than almost anyone else to help this country.
00:51:32.100 So you are, sounds like you're convinced that open, that the current state of the art in cryptography is actually secure.
00:51:40.960 Yes, yeah, 100%.
00:51:43.220 I think, as I said, I think this is a great example to look at, where even with those backdoors that had been implemented,
00:51:54.620 there were cryptographers within this global open source mathematics cryptography community that rang the bell, but nobody was listening to them.
00:52:03.080 But they actually identified the issue years in advance and rang the bell and said, this is not secure, not random, even within those companies and standardization institutes.
00:52:13.260 But nobody took it seriously.
00:52:15.440 Or I guess they took it seriously, but it doesn't matter if the law says you have to use this algorithm, right?
00:52:21.140 So that makes me very confident that this system works, the system of mathematicians.
00:52:30.100 Is cryptography global?
00:52:33.800 Which is to say, like, is Chinese cryptography different or stronger than European or American?
00:52:40.900 It's interesting.
00:52:42.140 So you have actually specific encryption standards used by militaries of the world, right?
00:52:51.200 So the Chinese use different cryptography than the Russians, than the Americans.
00:52:56.380 It is, at the end of the day, the same thing, right, from a mathematical standpoint.
00:53:00.340 But there are some deviations in the level of security and the kind of numbers used, right?
00:53:06.380 So everyone builds their own standards because they mutually distrust each other.
00:53:12.060 But at the end of the day, the underlying mathematics are the same.
00:53:17.280 The cryptographic standards, the way that cryptography works, that is the same.
00:53:22.400 So there's no reason to think the Chinese or the Russians have stronger cryptography than the Europeans and the Americans?
00:53:28.000 So I think, no, no.
00:53:34.600 And I think, I mean, it's interesting to think about, is there cryptography that is being developed in-house within militaries or whatever proprietary human organization, right?
00:53:46.500 That is not publicly known, that is incredibly powerful.
00:53:51.520 So, I mean, what I've been doing with my team, and I'm so glad that I have those incredible cryptographers in my team that actually understand all of those things on a way, way more detailed level than I do,
00:54:09.560 is build this protocol that allows us to literally take everyone's data.
00:54:16.660 So you could imagine the entirety of the United States, right?
00:54:20.300 We take everyone's healthcare data, something like that, right?
00:54:24.260 And then we say, well, we need to do something with that data.
00:54:27.720 Let's say we need to research a disease or whatever.
00:54:30.920 Instead of taking that data and passing it to some company that will inevitably expose it, lose it, it will get leaked, or it will be used against those people, we encrypt it.
00:54:40.520 Nobody ever has to share any information.
00:54:42.740 And we just run whatever computation that we collectively set, we are going to do that with this data.
00:54:48.320 We do that, we get the result, we, I don't know, figure out a cure to cancer or whatever.
00:54:53.240 But at no point in time, you ever had to share your data.
00:54:56.380 Your data never left your ownership.
00:54:59.160 And I think that's really core.
00:55:00.920 And it sort of is the holy grail of cryptography, I would say, being able to do these kinds of things.
00:55:08.020 Because you can now run any type of computer program instead of in the public, in private.
00:55:14.260 And you can restructure the way that your entire economy and country can work, right?
00:55:20.700 And that goes beyond just economical human interactions that also touches upon things like rethinking how we can actually improve democratic processes.
00:55:32.960 Because what those computations inherently have as a property is so-called verifiability.
00:55:40.240 So, the status quo, sort of, in the current internet is that you task some cloud provider to run a computer program for you, right?
00:55:54.220 Because you have limited resources, you want them to run that computer program for you.
00:55:58.720 So, you pass them some information, an algorithm, and you get an output back.
00:56:03.660 But how do you know that this output is actually correct, right?
00:56:07.580 Could be that there was an error, could be that they maliciously tried to undermine the output that they have sent you.
00:56:15.840 So, this technology that we've built actually solves this, right?
00:56:19.560 Verifiability for computations.
00:56:21.600 You can mathematically verify that a computation has been correctly executed.
00:56:26.880 And that itself is an amazing property, an amazing property that you want to see within every system, right?
00:56:33.180 But you don't get that amazing property without implementing privacy for those systems.
00:56:38.760 Isn't that amazing?
00:56:39.680 It is amazing.
00:56:40.680 How did you all create this?
00:56:44.360 So, I'm very lucky that within my company, I have very experienced cryptographers who've literally worked more years on these specific issues than I have been in cryptography.
00:57:00.120 And so, I'm sort of building on the shoulders of giants, of course, right?
00:57:06.580 And there has, for a very long time, been research in those areas, being able to run those encrypted computations.
00:57:14.540 But it has never been practical enough where it is fast enough, cheap enough, right?
00:57:20.760 And versatile enough where you can actually do all of those things.
00:57:23.580 And so, I think what really guided us is to, and what really guided me in the way that I designed the system is to think about, okay, how can I actually build this system so that people are going to use it and are going to build applications and are going to integrate that into systems, right?
00:57:45.020 Because I think with privacy technology in general, in the past, what has been done is that it sort of has been created in an echo chamber, in a vacuum almost, where you're a smart cryptographer that builds amazing technology, but you maybe don't understand how markets work and how to get product market fit, how to actually get those users, right?
00:58:09.400 And so, we've tried to build it in a different way, and that's how we ended up here.
00:58:15.340 But to be honest, it was an evolutionary process for us.
00:58:19.420 So, we originally started with a different kind of cryptography, I would say, that was more limited, that didn't allow for all of those interactions.
00:58:29.940 And then, at some point, we sort of realized that that was not good enough, that was not enough.
00:58:37.240 And at that point, basically, everyone was still building with that technology, and we were like, let's do something different instead.
00:58:44.400 Let's think about what the future will look like, how computation and privacy can converge into something bigger for the entirety of humanity.
00:58:52.240 And that's then how we built it, in very, very quick time, actually.
00:58:56.340 How did you fund it?
00:59:00.300 So, we got investor funding, and I'm incredibly thankful for all of the investors that I've gotten.
00:59:09.500 Coinbase, for example.
00:59:10.640 So, big names in the space of blockchain distributed systems, right?
00:59:19.800 All of those networks, like Bitcoin, all of those networks are distributed in nature, decentralized.
00:59:27.460 And, yeah, there are a lot of players within that space that truly believe in the value of privacy, that privacy is a human right, and that privacy as a technology is inevitable, and that like to support it, but not just support it, right?
00:59:46.140 Because it is something they believe in, but they invest in it because they have realized that this is one of the most powerful technologies that can exist for humanity, right?
00:59:59.740 Being able to take information, move it into this realm, and then it can stay in this realm, and it can be processed, and everyone can do that.
01:00:07.040 That is incredibly powerful.
01:00:08.720 It is emancipating, and it is powerful for businesses, but also nation states.
01:00:13.260 At the end of the day, it is a neutral technology, and so we have investors that believe in that.
01:00:20.620 So, one of the applications, we were just talking off camera, one of the applications for this technology, well, one of the big ones, is the movement of money in a way that's private.
01:00:33.040 How exactly does that work?
01:00:35.240 And let me just add one editorial comment.
01:00:37.020 The great disappointment of the last 10 years for me is that crypto transactions don't seem to be as private or beyond government control as I thought they would be.
01:00:45.220 I hope they are someday, but watching the Canadian truckers, you know, have their crypto frozen was just such a shock.
01:00:52.260 I've never gotten over it.
01:00:54.080 Will this technology change that?
01:00:56.260 Yes, so if you think about Bitcoin as, I guess not the state-of-the-art, but the original kind of blockchain network, right?
01:01:09.100 What it is at the end of the day is a way for distributed people to find consensus over some unit of money, which is actually more like a commodity than actually a financial instrument.
01:01:23.040 That's right.
01:01:53.020 Most people think that these kinds of networks are anonymous and are dangerous, right?
01:02:01.080 Because I feel like it has actually been a narrative that media and different actors want the people to believe.
01:02:10.140 I just have to add, I would like them to be anonymous and dangerous.
01:02:14.180 Oh, yes.
01:02:14.700 Yeah, yeah, yeah.
01:02:15.560 That's what I was hoping for.
01:02:16.920 Yes, so people believe that, which attracts people, right, and also keeps other people from using them and trying to outlaw them.
01:02:27.520 In actuality, they are not anonymous.
01:02:29.760 What you have in Bitcoin specifically is pseudonymity.
01:02:33.840 So you don't see on the blockchain, Tucker Carlson has 10 Bitcoin or whatever and sent Yannik one Bitcoin.
01:02:41.880 You instead see A, B, C, D, E, F, G, blah, blah, blah, whatever, right?
01:02:45.300 A random string of numbers and letters has sent something to another random string of letters and numbers.
01:02:53.560 However, they're linked to this identity that you have.
01:02:58.480 So for every single transaction that you've performed in history on top of this distributed ledger, you will see all of those transactions.
01:03:07.620 So when you, later after the show, send me one Bitcoin, I guess, right?
01:03:13.300 So I would see...
01:03:15.240 They're cheaper today than they were yesterday.
01:03:16.940 So when you send me something, what I'll be able to see is all of the other transfers that you've performed in the past, right?
01:03:30.980 That's unfortunately how Bitcoin works.
01:03:34.000 And so it has this inherent full transparency.
01:03:37.420 There is no privacy, because it's so easy then, via the on- and off-ramps, I guess, to see how you actually moved money in there, right?
01:03:45.700 Because you most likely don't actually get this currency through work, by applying energy.
01:03:51.540 You buy it for a different currency, fiat money, right?
01:03:55.260 So your identity is linked, everything is public.
01:03:58.760 And so that's a fundamental issue.
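A small sketch of what that transparency means in practice: on a public ledger, anyone can reconstruct the complete history of any address, so linking one address to a real identity, say through an exchange's KYC records, exposes everything. The addresses and amounts below are made up.

```python
# Sketch of why pseudonymity is not anonymity: every transfer on a public
# ledger is permanently linkable. Addresses and amounts are made up.
ledger = [
    ("addr_A9f3", "addr_7c1e", 1.0),
    ("addr_7c1e", "addr_Exch", 0.4),   # a known exchange address
    ("addr_B221", "addr_7c1e", 2.5),
    ("addr_7c1e", "addr_Shop", 0.1),
]

def history(address):
    """Every transaction an address ever took part in -- public forever."""
    return [tx for tx in ledger if address in (tx[0], tx[1])]

# The moment addr_7c1e is tied to a real identity (say, via an exchange's
# KYC records), its entire financial history becomes readable.
for sender, receiver, amount in history("addr_7c1e"):
    print(f"{sender} -> {receiver}: {amount} BTC")
```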
01:04:02.100 That is actually a dystopian scenario where we could end up if this is adopted as the technology where all of your money now sits and, and you're sending transactions,
01:04:12.600 where you have this big upside of having cash-like properties, which is amazing, but you have this tremendous downside of literally everything being recorded for the conceivable future of humanity, right?
01:04:26.620 And you have no privacy.
01:04:29.200 And that inherently limits your freedom to use this technology.
01:04:34.080 And so that is an issue that, that exists not just within Bitcoin, but also other blockchain networks.
01:04:41.080 And Bitcoin is this pure form.
01:04:44.080 That's why within this crypto industry there's a lot of competition between different players that say Bitcoin is this pure form that only allows transfers of money, right?
01:04:54.420 And other networks allow execution as well.
01:04:57.400 And that has led to what is commonly called smart contracts.
01:05:03.560 So this concept of computer programs that simply exist in the ether, basically a computer program that can execute something that you tell it to do, and it will be guaranteed to do so.
01:05:15.920 And this amazing property that all of the founding fathers of those networks basically identified as important is so-called censorship resistance, which I think is also important in real life.
01:05:28.320 Very.
01:05:28.860 And so those networks provide censorship resistance.
01:05:32.760 It doesn't matter if one computer decides, well, I'm not going to accept Tucker's transaction because I don't like Tucker.
01:05:38.740 Well, there's going to be another computer that says, I will accept it.
01:05:41.620 So that is censorship resistance that is inherently baked into those systems.
01:05:45.920 And what that means is if you interact with this as this invisible machine, right, you get guaranteed execution for whatever you tell it to do, either send someone money or perform some other computational logic that is baked into the system.
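As a toy model of that censorship-resistance property: a transaction goes through as long as any one node will include it, so a single refusing node achieves nothing. The node behaviors and names here are invented for illustration.

```python
# Toy model of censorship resistance: a transaction is confirmed as long as
# any one node in the network accepts it. Node policies are invented.
def censoring_node(tx):
    """Refuses to process transactions from one disliked user."""
    return None if tx["from"] == "tucker" else tx

def honest_node(tx):
    """Accepts any well-formed transaction."""
    return tx

nodes = [censoring_node, honest_node, honest_node]

def broadcast(tx):
    """Inclusion requires only one willing node, so one censor is powerless."""
    return any(node(tx) is not None for node in nodes)

print(broadcast({"from": "tucker", "to": "yannik", "amount": 1}))  # True
```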
01:06:02.520 And so there have been different kinds of pioneers on the front of adding cryptographic privacy to those systems.
01:06:14.340 There has, for example, emerged a network called Zcash, based on the Zerocash protocol, which is basically Bitcoin with cryptographic privacy.
01:06:23.100 And there have also been pioneers like the inventors of Tornado Cash, who built a smart contract that exists within this ledger and is unstoppable.
01:06:33.900 Once you've uploaded it, you cannot stop it anymore.
01:06:36.900 So they did that.
01:06:37.900 And the kind of code that they implemented there gave you privacy on top of this public network, which was, or is, the Ethereum virtual machine.
01:06:49.140 So they did that. Tornado Cash did that.
01:06:52.980 Were they, did they win the Nobel prize?
01:06:54.660 Um, did they get the presidential medal of freedom?
01:06:57.280 And what happened next when they offered privacy?
01:06:59.280 So there were, I think, three founders: Roman Storm, who's an American citizen, Roman Semenov, who is a Russian national, and Alexey Pertsev, who is a Russian national as well and lives in the Netherlands.
01:07:18.640 Pertsev has been convicted of assisting in money laundering and sentenced to five years in prison, and Roman Storm has been convicted in the United States of conspiring to run a money transmitter without a license.
01:07:45.540 Now, why has this happened? Why did they suffer such grave consequences?
01:07:50.880 They were arrested.
01:07:51.980 They were arrested.
01:07:53.120 And brought on trial.
01:07:53.940 Brought on trial. I mean, it's actually, if you look at what Roman Storm has faced, it was 40 years in prison for this, in the United States.
01:08:04.840 In the United States of America. And why, why has that happened, right?
01:08:10.100 They built a privacy tool, and it was an illicit actor that used their privacy tool.
01:08:15.740 And that is a shame, because it was an actor that a lot of people agree is an illicit actor.
01:08:24.060 I think the two of us also agree that North Korea laundering stolen hacked funds is an illicit actor.
01:08:30.660 Yes.
01:08:31.300 Misusing a tool, right?
01:08:32.440 So there's no question about this.
01:08:34.520 The, the underlying question really is...
01:08:36.260 And we're sure that actually happened?
01:08:37.620 We are sure that happened, yes. For sure, that has happened.
01:08:40.240 And so they stole funds, because they were able to hack different systems, and were then able to utilize this platform to gain privacy and move those funds somewhere else.
01:08:56.840 Did Roman Storm participate in the North Korean hacked funds theft?
01:09:00.200 He did not, no.
01:09:01.220 So if I rob a bank and then jump into my Chevrolet and speed away, does the president of General Motors get arrested?
01:09:09.540 Usually he doesn't, no.
01:09:10.760 Okay.
01:09:11.120 Which is interesting, because he clearly provided this tool for you to escape, right?
01:09:16.260 A getaway car!
01:09:16.320 And he knows that people get away in cars, right?
01:09:21.500 Yes, he does.
01:09:22.660 So...
01:09:23.220 Kind of weird how he's dodged those obvious charges.
01:09:27.100 Is that, that's really what happened?
01:09:28.600 That is really what happened, yeah.
01:09:29.960 And he faced 40 years in jail.
01:09:32.520 But the jury could not find a unanimous decision on the main charges, I guess: circumventing sanctions and helping with money laundering.
01:09:46.920 Now, the interesting thing is, before they got arrested, what had happened?
01:09:51.060 OFAC, the Office of Foreign Assets Control in the United States, took the software that those developers had written
01:10:02.520 and uploaded to Ethereum, where it had passed out of anyone's control, unstoppable by nature.
01:10:11.480 Anyone can use it.
01:10:12.780 They essentially wrote code for a software tool for anyone to get privacy.
01:10:18.260 That software tool got sanctioned.
01:10:21.560 It got put on the SDN list, the list of Specially Designated Nationals, where you put the names of terrorists, and they put the Ethereum addresses, right,
01:10:32.080 of the software.
01:10:34.420 So the source code itself became illegal.
01:10:38.060 It was deleted from the internet.
01:10:40.500 All of the companies closed their developer accounts.
01:10:45.300 The software they wrote, the free speech that they performed by coming up with those ideas and publishing them to the world, got censored.
01:10:55.080 Because they were added to a list which they don't even belong on.
01:11:01.060 Without any vote in Congress, by the way. This is just part of, I think it's under the State Department now, but I could be wrong, or Treasury, I can't even remember.
01:11:08.900 But they have enormous power.
01:11:10.480 They've destroyed the lives of many thousands of people without any democratic oversight at all.
01:11:16.820 And, uh, it's pretty, pretty shocking.
01:11:19.360 Yeah.
01:11:19.520 And so, so it got added onto this list.
01:11:21.900 And I think last year, a court in the state of Texas actually ruled that OFAC does not have the statutory authority to do any of that.
01:11:36.100 And they then silently removed Tornado Cash again from the SDN list.
01:11:41.940 However, nobody is able to use the tool now, right?
01:11:44.960 Because every company, for compliance reasons, casts you out of its user base if you've ever touched anything related to it.
01:11:55.620 And Roman Storm is, he was convicted, you said.
01:11:59.600 There was a hung jury on the strongest charges, but on other charges he was convicted?
01:12:04.220 He was convicted on one charge, which I think is called conspiracy to run a money transmitter, a financial institution, right?
01:12:18.440 A bank, without a banking license.
01:12:21.440 Conspiracy to start a bank.
01:12:22.920 So they put him in jail.
01:12:25.960 Actually?
01:12:27.060 So it is a one-year jail sentence that's on the charge, right?
01:12:31.140 But he's currently in the process of appealing that.
01:12:33.380 So Roman Storm didn't run a bank.
01:12:39.820 He didn't create a bank.
01:12:41.460 He created software, right?
01:12:43.360 He made use of his inherent right to freedom of speech to build something that enables others to make use of their right to freedom of speech, right?
01:12:55.640 Because that is, at the end of the day, the freedom of economic interaction, right?
01:13:01.400 That is what he helped others protect for themselves.
01:13:05.240 He never processed a transaction for anyone, right?
01:13:08.000 He's not an intermediary.
01:13:09.340 He specifically built technology that is disintermediated where you yourself use that software.
01:13:15.480 Um, yeah.
01:13:17.320 And so, um, the remarkable thing is I pay some attention, obviously not enough.
01:13:23.620 I was not aware of this story until I was reading up on you.
01:13:27.700 What, where's all the coverage on Roman Storm?
01:13:30.760 He doesn't even have a Wikipedia page.
01:13:32.620 Yeah.
01:13:32.940 So there are, I think, incredible institutions like the Electronic Frontier Foundation, the EFF, and the DeFi Education Fund, but also companies like Coinbase, who have actually invested a substantial amount of money into defending Roman Storm.
01:13:54.740 And, yeah, Alexey Pertsev as well.
01:13:58.340 I think Alexey Pertsev also doesn't get enough attention.
01:14:03.120 He's, I mean, he's now under house arrest in the Netherlands and preparing to appeal his conviction, I think, something like that.
01:14:11.660 Why are so many of the players in this Russian?
01:14:14.280 I think it really boils down to them having a deep understanding, historically, maybe culturally, about the importance of privacy
01:14:24.740 in a society for upholding freedom, which is a shame, you know, that they...
01:14:32.880 Well, they've, they've suffered for that knowledge.
01:14:35.040 Yes.
01:14:35.520 For 70 years, more than.
01:14:37.420 Um, so, yeah, it's just, it's very striking.
01:14:40.780 It's 140 million people.
01:14:42.660 It's a tiny country, relatively speaking.
01:14:44.260 And yet they are way overrepresented, from Pavel Durov on down.
01:14:48.460 For sure.
01:14:49.060 Yeah, that is, yeah, that is true.
01:14:50.960 So I think, yeah.
01:14:54.740 I think it's interesting how all of us take it for granted that these kinds of people go out of their everyday lives and put a target on their heads by shipping this technology.
01:15:10.800 Yes.
01:15:11.280 To enable you to gain privacy. And simply the knowledge of the existence of bad actors in the world has made them victims and put them in jail, which is insane.
01:15:27.460 Um, well, I mean, it's something the rest of us should push back against, I think.
01:15:32.740 But the hurdle for me is not knowing.
01:15:34.940 Again, I didn't even know this was happening.
01:15:36.740 I should have guessed.
01:15:37.520 So, if you could be more precise about what you think the real motive was behind going after Tornado Cash and Roman Storm, like what, why was the U.S. government not prosecuting drug cartels in order to prosecute Roman Storm?
01:15:52.540 I think, so, that took place under the previous administration.
01:15:57.980 So I think President Trump and his administration have done tremendous work in regards to pushing the adoption of decentralized technology, of really allowing all of the people in that space to try to rethink the financial system and build this technology.
01:16:19.400 Because they've sort of realized that technological innovation runs at a faster pace than legislative processes.
01:16:29.560 And under the previous administration, that looked different.
01:16:34.220 So I think that has helped this technology spread a lot.
01:16:38.820 It is, however, important to consider privacy.
01:16:45.080 And when the executive order banning CBDCs, central bank digital currencies, was signed, an explicit reason why CBDCs should never be adopted in the United States was the privacy concern.
01:17:00.100 Because if we look at all of those new digital currencies being built in Europe and all around the world, besides the U.S., which is great, which actually is amazing, I think, what we see is that all of them are surveillance machines to an even higher degree than the current financial system already is, right?
01:17:23.560 It is already a surveillance system, but what's so important about this next generation of money is that we are sort of at a crossroads.
01:17:34.320 Do we want our money to enable freedom for us: freedom of economic interaction, freedom of thought at the end of the day?
01:17:41.540 Because whatever we think, we want to be able to put our money where our mouth is. Or do we want a monetary system that enables automatic subsequent action based on whatever activity you perform in your digital life?
01:18:00.780 Which can mean things like: now all of your money is frozen and you don't have any access to it anymore, because whatever you just did was deemed undesirable by Big Brother, right?
01:18:13.600 So those are literally the two possible futures that we have.
01:18:18.220 It's two extremes.
01:18:19.460 There's no possible future in between.
01:18:22.840 And what the architects of those...
01:18:27.680 So you're assuming cash is over.
01:18:29.100 Cash is already also being heavily surveilled.
01:18:34.000 So your banknote has a serial number.
01:18:35.840 So if you actually think about something like Tornado Cash, or, I mean, there are a lot of applications that, for example, utilize Arcium to also bring this level of privacy, right?
01:18:46.720 If you think about all of these systems, they are, in my mind personally, as long as you have an internet connection, and if you don't have an internet connection,
01:18:57.240 maybe you cannot spend your money right now, but as long as that exists, even superior to cash, because you don't have any serial numbers anymore, right?
01:19:07.840 Wait, so you say cash is being surveilled?
01:19:10.000 Sure. I mean, when I go to the ATM and withdraw money, the serial numbers are recorded in some database.
01:19:17.220 And when a merchant at Walmart, I guess, or wherever, puts that into their cash register, they can also record a serial number.
01:19:26.700 So...
01:19:27.480 Is that true?
01:19:28.220 Yeah, there has been, I read an article a few months ago about a tracking system like that within Europe.
01:19:34.920 So, that is very practical and, yeah.
01:19:38.720 I'm going to take a, a magic marker, a pen, and distort the serial numbers in all my cash now.
01:19:44.380 Yeah, right.
01:19:45.520 So, I mean, it should be, it should still be legal tender, right?
01:19:48.860 I would think so.
01:19:49.780 Yeah.
01:19:50.020 I'd never heard of that.
01:19:50.720 I mean, there could be other tracking mechanisms.
01:19:53.000 I don't know, but I've read about this technology, which clearly exists, and it's being used to even turn the cash system into a surveillance system.
01:20:04.060 And again, I think all of this is not even just someone with governmental authority deciding to surveil people, right?
01:20:16.840 It is also companies seeing economic value in surveilling you, and then utilizing this new technology, utilizing the internet, to do that.
01:20:27.420 And it boils down to power, I would say, control, right?
01:20:31.480 If you have access to as much information as possible, you can better prepare for the future, and you can predict behaviors of your users or different actors.
01:20:41.740 And so that's why those systems get implemented.
01:20:43.700 So we are at this fork in the path towards the future, and the people that are architecting those central bank digital currency systems have realized something.
01:20:57.880 And what's so interesting to me is this old concept that the cypherpunks in the 1990s came up with, which is "code is law," which I think nicely expresses what has happened with Tornado Cash,
01:21:10.680 where it is the ultimate law, sort of, when you have this network that nobody controls, and there's some piece of software, and it just executes.
01:21:20.820 Whatever is written within that software code executes; there's no way of stopping it, there's no way of doing anything about it.
01:21:28.060 And so that's what I mean when I say code is law.
01:21:31.120 And the architects of those alternative systems have realized that there's so much power in being able to, let's say, take your chat messages and see that you have said something against Big Brother, and Big Brother doesn't appreciate that, right?
01:21:45.660 And so automatically now your money is frozen, and that is code is law, right?
01:21:53.160 In the utopian sense and in the dystopian sense where software automatically can lock you out of all of those systems.
01:22:00.220 And I would much rather have a utopian future than a dystopian future.
01:22:05.020 But at the end of the day, from a technological standpoint, those things are similar.
01:22:10.060 The only difference really is cryptography.
01:22:13.980 Privacy.
01:22:14.940 Privacy.
01:22:15.660 Because you're offering that on a scale even larger than anything Tornado Cash or Roman Storm attempted, it has to have occurred to you that, whether or not you have prominent investors, you face some risk.
01:22:30.940 Sure.
01:22:31.420 So I think what I'm doing with Arcium at the end of the day is providing the most versatile and superior form in which you can execute a computer program, right?
01:22:45.500 Within encryption, you can execute a computer program, and you can have many people contribute encrypted data, and you can do all sorts of things.
01:22:54.420 You can do things starting with, um, financial transfers, right?
01:22:59.040 You can add privacy to financial systems.
01:23:00.860 But that doesn't just mean we're adding privacy to me and you, Tucker, interacting with each other.
01:23:06.720 We can also add privacy to entire markets, right?
01:23:10.100 Which, again, can also have downsides.
01:23:11.960 I'm not arguing that there's only upsides with this technology.
01:23:15.200 There might be actors that then utilize that; I'm not just talking about criminal activity, but unethical activity, right?
01:23:22.920 The way that people may interact.
01:23:25.140 So, at its core, it is neutral technology.
01:23:28.440 But the use cases that I'm really focused on enabling are also use cases like enabling the healthcare system to actually utilize data that is currently being stored, but stored in a very inefficient way, where it's isolated, right?
01:23:45.700 So, with my technology, we can take this data and use it without ever risking that data being exploited, without ever taking ownership of your data, because you're the patient, you're the human, right?
01:23:57.240 I have no right to take ownership over that.
01:23:59.880 And with that technology, I don't need to, because you can consent and say, let's improve healthcare or whatever with my data, but you're not getting my data, because it's encrypted, right?
01:24:10.480 It's this, um, I don't know, it's a crazy concept to wrap your head around.
01:24:14.200 I get it, but it enables so much, also on a national security level, that it is strictly superior technology.
01:24:20.860 And I think this example that I told you earlier about verifiability, right?
01:24:25.800 Mathematically being able to be convinced that a computer program, a computation that has been executed in privacy, right,
01:24:36.940 has been executed correctly, is such an amazing concept.
01:24:41.180 And the way I think about it really is as opening up a new design space altogether, and allowing companies to do actual innovation, instead of innovating only on the front of how to extract as much value as possible from their users by surveilling them.
01:24:59.960 Um, so I don't really, I don't really think about it the way that you framed it.
01:25:06.140 I'm building this generalized computing platform that can be used by anyone, um, because I don't have any control over it, right?
01:25:14.660 I'm not building a controlled infrastructure.
01:25:17.320 I'm building open, um, software that is used for good.
01:25:22.020 And I'm grateful that you are, and I don't at all mean to make you pessimistic or paranoid, but in so doing, you're threatening current stakeholders.
01:25:33.400 Um, sure.
01:25:35.460 But I think that's, that's, that's always the case with, with new technology, right?
01:25:39.580 Of course.
01:25:40.140 Yeah.
01:25:40.260 I mean, when cars first came along, right, there were unions of horse-carriage taxi providers that did not want to see cars on the road.
01:25:53.300 Of course.
01:25:53.860 So there are always interests that try to utilize both technology and law to prevent others from getting into that position.
01:26:06.260 Yeah, keep the current monopoly in place.
01:26:07.440 Exactly.
01:26:07.760 Of course.
01:26:08.020 It all depends, the stakes depend entirely on how disruptive the new technology is.
01:26:12.320 Yes.
01:26:13.100 Ask Nikola Tesla.
01:26:14.580 Yeah.
01:26:14.920 Um, right, right.
01:26:16.060 Sorry, dark.
01:26:17.100 But, so it's not a concern.
01:26:19.940 It is not a concern for me, no.
01:26:22.480 Hmm.
01:26:23.540 I wonder if that's just a quirk of your personality where you're just not afraid of stuff.
01:26:28.180 That's actually an issue.
01:26:29.740 I would say I sort of suffer sometimes from not being afraid of things, but...
01:26:35.900 Good.
01:26:36.060 I think it's.
01:26:36.880 I think you need that.
01:26:37.560 Yeah.
01:26:37.800 In order to proceed.
01:26:39.560 So from the perspective of the average American consumer who's not following this carefully, when does your life begin to look different as a result of this kind of technology?
01:26:49.180 When will you see this sort of thing in action?
01:26:50.920 How will you experience it?
01:26:52.100 That's actually a brilliant question.
01:26:57.360 I think... just trying to run numbers in my head and trying to predict the future.
01:27:06.440 That's something I've never done, by the way.
01:27:08.140 I've never paused in mid-conversation that I've got to run some numbers in my head.
01:27:11.580 I do this all the time.
01:27:15.040 Yeah, I never have.
01:27:16.740 So I think it will affect your everyday life positively
01:27:23.320 once, I guess, an inflection point is reached on multiple fronts, right?
01:27:33.360 I was talking about healthcare and national security, also the financial system, right?
01:27:37.840 But it also, I mean, so that's a criticism I actually have of Signal.
01:27:43.100 That is that there exists one single point of failure within Signal's technological stack that I've been vocal about and that I dislike, which is what they call private contact discovery.
01:27:58.140 Where I have a set of contacts on my phone, right?
01:28:03.820 You do the same thing.
01:28:05.260 And if there is an intersection between the two sets that we have, where I have you as a contact, you have me as a contact.
01:28:12.140 Yes.
01:28:12.560 I get, um, Tucker suggested on Signal, right?
01:28:15.880 Um, only in that case.
01:28:17.820 How does that work, right?
01:28:19.440 How does Signal ensure that those contacts are encrypted and secure, right?
01:28:25.760 They, um, use trusted hardware for that.
01:28:29.340 And that is a critical flaw within their infrastructure.
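Some context on why this problem is hard: the obvious fix, uploading hashes of phone numbers instead of the numbers themselves, fails because the space of valid numbers is small enough to brute-force, which is part of what pushed Signal toward trusted hardware in the first place. A sketch of that brute-force attack, with made-up numbers:

```python
# Sketch of why naive "private" contact discovery fails: hashing phone
# numbers doesn't hide them, because the keyspace is small enough to
# brute-force. All numbers here are invented.
import hashlib

def h(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

# What a well-meaning client might upload instead of raw contacts:
uploaded = {h("+15550001234")}

# The server (or anyone holding the hashes) simply tries every possible
# number -- here only a tiny slice of the space, for speed.
for n in range(10_000):
    candidate = f"+1555{n:07d}"
    if h(candidate) in uploaded:
        print("recovered:", candidate)
        break
```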
01:28:32.020 So there's technology, trusted execution environments is what they're called, manufactured by Intel, for example.
01:28:40.840 And this technology comes with this promise of being secure and being able to basically do what we're doing with mathematics, but instead with trust.
01:28:51.540 Um, so they say we built a secure machine.
01:28:53.680 Do you think we shouldn't trust Intel?
01:28:55.760 I think so, yes.
01:28:58.740 I think, I think the...
01:29:01.940 They're required to trust Intel.
01:29:03.580 Yeah.
01:29:03.920 I think it's an insane idea to begin with.
01:29:07.560 It's been funny.
01:29:09.080 Not just last year, but over the last 10 years, there have been a myriad of exploits of the technology.
01:29:15.460 So in the past, it has always been sold sort of as: here's this technology, it does verifiability and privacy, just put your data in it.
01:29:27.340 There's no backdoor, right?
01:29:30.720 Of course not.
01:29:31.960 Why would there be a backdoor?
01:29:33.000 Why would Intel cooperate with anyone?
01:29:35.500 Sure, right.
01:29:36.100 And so you would do that.
01:29:40.180 And then last year, there were those researchers that said, well, if you have physical access to this computer, you can just read out all of the data; and not just read out all of the data, you can fake keys and then perform fake computations on behalf of other people.
01:29:56.660 So if you're building a financial system with a computer like this, I can just change numbers, right?
01:30:02.460 And I know what your numbers are, and I can change those numbers.
01:30:05.980 And that's not even the core issue I have with that in the case of Signal, right?
01:30:10.220 So Signal is, I think, still relying on that tech.
01:30:14.440 So I think they run this hardware.
01:30:16.500 I mean, I hope they run the hardware, because at least then I have a little bit of a remaining trust assumption that, okay, they will not try to hack those PCs, which is relatively straightforward.
01:30:28.200 You just connect a few cables at the end of the day.
01:30:30.920 And then you can exfiltrate the information, which is the interactions, right?
01:30:37.460 Is Tucker my contact?
01:30:38.600 Is Yannik Tucker's contact, right?
01:30:40.340 That's very sensitive information.
01:30:42.120 And so that is a single point of failure,
01:30:46.080 where they could access that information, or whoever gets access to that information.
01:30:51.920 Um, and we're not even thinking about potential backdoors at that point, right?
01:30:56.900 Within that, within that hardware.
01:30:59.060 Right.
01:30:59.180 So within the manufacturing process, I mean, I think it would be very naive to assume that there's no backdoor, similar to what we talked about earlier with Dual EC, right?
01:31:08.560 Or something like the Clipper chip thing, right?
01:31:12.560 That was attempted in the 90s.
01:31:15.380 So it's very likely, I would say, that there's some randomness tampering, let's call it that, that could be in place, because you are literally also getting keys right from the manufacturing process, right?
01:31:29.600 So it's this proprietary supply chain, and then they ship that computer to you, and it comes with random keys that have been generated on that proprietary production line.
01:31:41.320 So there are many single points of failure, and that's what I don't like about Signal, because I don't want this information out there, right?
01:31:48.840 What does my address book look like?
01:31:50.700 So they can fix that.
01:31:51.680 They can fix that with technology that we've built, right?
01:31:54.480 They can use our technology.
01:31:55.760 I'm more than happy to just give them the technology.
01:31:58.680 I mean, it's open source, right?
01:31:59.900 And then they can just build this thing without a single point of failure, without a way for a state to say, well, you actually have this data,
01:32:10.760 so give us this data, right?
01:32:12.320 Because right now they cannot really argue that they don't have that data, since someone could connect a few cables to that computer and then get that data.
01:32:19.460 So it's not the secure device that, um, people claimed in the past, um, it was.
01:32:25.340 So I think that is important to resolve.
01:32:30.140 I actually don't recall how this first came to my attention.
01:32:32.540 I wonder, uh, if any, if any big hardware manufacturer will begin to offer truly secure devices for sale.
01:32:46.300 Um.
01:32:48.260 Which is not worth it probably, right?
01:32:50.220 So I think it is worth it, right?
01:32:53.700 You as a military want to have secure devices, right?
01:32:57.120 Everyone, I think everyone, would rather compute on a secure device
01:33:02.040 than an insecure device.
01:33:03.680 Um.
01:33:04.200 But the manufacturers aren't making their money from the devices.
01:33:08.260 I mean, they're making money.
01:33:09.660 I don't know what it costs to make an iPhone less than 900 bucks, but I mean, it's an annuity.
01:33:14.240 Like the, the long, you know, the second you buy an iPhone, you're making money for the company every day you use it, right?
01:33:20.680 Sure, sure.
01:33:21.600 So I think it is impossible to build secure hardware in that regard, where those claims of full privacy and security
01:33:31.900 are factually true; that is impossible.
01:33:35.680 There have been so many techniques where you can use so many different tools to play around with those devices
01:33:43.380 that it is literally impossible to implement secure and verifiable systems.
01:33:50.580 Because even while verifying them, you need to take them apart, sort of destroying them in the process.
01:33:55.880 So, that, that does not exist.
01:33:58.280 What I think does exist, however, is this concept of decentralization, and here's why it's so powerful.
01:34:04.580 Because it doesn't really matter if this manufacturer here creates a backdoor: as long as I have 10 different computers, or 100 computers, right, from different manufacturers,
01:34:16.200 and there's one that does not have a full system-level backdoor installed, I am secure under this trust model that we've developed in our company, right?
01:34:25.220 So, I think that's why decentralization is so important.
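A minimal sketch of that trust model, that one honest manufacturer is enough, using XOR-based n-of-n secret sharing: unless every device in the set is backdoored, the secret stays unrecoverable. This is a toy illustration under that assumption, not the company's actual scheme.

```python
# Sketch of the "one honest manufacturer is enough" trust model: a secret
# XOR-split across devices from different vendors stays safe unless every
# single device is backdoored. Toy code, not the company's actual scheme.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int):
    """n-of-n sharing: all n shares are required to rebuild the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = xor(last, s)
    return shares + [last]

key = secrets.token_bytes(16)
shares = split(key, 3)                # one share per hardware vendor

# Two colluding (backdoored) devices learn nothing about the key:
partial = xor(shares[0], shares[1])   # statistically independent of 'key'

# Only all three shares together recover it:
recovered = xor(xor(shares[0], shares[1]), shares[2])
assert recovered == key
```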
01:34:29.740 That was the basis of our political system when it was created, that same concept.
01:34:32.940 The power is dangerous and so it has to be spread among different holders, different entities, so it doesn't concentrate and kill everybody and enslave them.
01:34:40.720 That's obviously going away.
01:34:42.080 But that was, that was the concept of the American Republic.
01:34:46.080 Yeah, exactly.
01:34:47.100 And I think it is important to look at surveillance in the same way, where if you have access to surveillance, you basically have access to unlimited power.
01:35:00.940 So, whatever surveillance system we implement, be it chat control in the European Union, which I've been very vocally opposed to on X.
01:35:11.620 And I actually just learned last week that the UK implemented their version of chat control on the 8th of January, which is a censorship machine.
01:35:27.460 And a surveillance backdoor, right, installed within all of your messaging applications.
01:35:34.620 Um, and it comes with this claim of, well, we are implementing this because we need to fight child exploitation, right?
01:35:42.380 There's always one.
01:35:43.240 Child exploitation.
01:35:44.660 They care about the children.
01:35:45.900 Yeah, I, I, I strongly believe that.
01:35:49.560 So, um, they, they basically have, there's, there's basically four, four, um, reasons to implement surveillance.
01:35:56.480 So there's child exploitation.
01:35:57.820 Yes.
01:35:58.160 There's terrorism.
01:35:59.380 Yes.
01:35:59.800 There's money laundering.
01:36:00.780 Yes.
01:36:01.260 And there's, uh, war on drugs.
01:36:03.720 Oh, war on drugs.
01:36:04.380 Those are the four reasons, right?
01:36:05.540 And they always wrote it.
01:36:06.440 The people engaged in importing drugs into our country, laundering the money, exploiting the children and committing serial acts of terror against their own population.
01:36:15.720 They're all very concerned.
01:36:16.800 Oh, man, I, I really think we now need surveillance now that you say it.
01:36:20.540 Not of us.
01:36:21.680 Yeah.
01:36:21.900 So what's so funny is that in 1999, some policing working group of the European Commission, there was a transcript of their discussions.
01:36:35.400 And literally within the transcript, when they were talking about implementing digital surveillance systems, they were like, I think we should switch our arguments over to child exploitation, because that is more emotionally charged, right?
01:36:49.100 That convinces people.
01:36:50.640 And so it's not just that, for us, it is obvious that that's not what's going on, right?
01:36:56.460 There's.
01:36:56.800 When the people who covered up the grooming gangs are making that case, it's like, I don't think it's sincere at this point.
01:37:01.840 Exactly right.
01:37:02.420 So there is a reason why we don't believe that that's the actual reason, but what I'm arguing is that it doesn't even matter.
01:37:13.080 Even if the politicians are convinced that it's about protecting the children and that this is the most effective measure to do that, right,
01:37:21.320 to surveil all of the chats, what's going to happen is, thanks to this being implemented as infrastructure that exists everywhere, and there being a small circle of people that have access to this technology, it will get abused.
01:37:37.420 Um, it is very easy to abuse those systems because the abuse itself happens within secrecy.
01:37:44.000 So there's, there's no oversight.
01:37:45.580 Of course, and instantaneously because of the, the rise in computational power.
01:37:49.000 It's not like someone has to go to the Stasi archives to read all the files.
01:37:52.800 It's like.
01:37:53.240 And Sam Altman will gladly help you to sift through all of your warehouse.
01:37:58.780 Oh, he's a good guy.
01:37:59.860 By the way, a lot of these businesses draw the worst people, like the most unethical people have the most power in case you haven't noticed.
01:38:07.420 It's wild.
01:38:08.460 It is wild.
01:38:09.100 Yeah.
01:38:09.500 Um, yeah.
01:38:11.960 I mean, there's an economic function, sort of, to reward this, right?
01:38:16.640 Because if I build an application and you build an application, and we just provide some value to our users, and the users pay for that, that's basically capitalism, right?
01:38:29.220 All of that works out nicely, but then you decide: what if I take all of this information from my user and use it to extract additional value from him, right?
01:38:39.880 You're way more profitable that way.
01:38:42.640 So the incentives.
01:38:43.400 And so then those incentives shift towards that setup, and these kinds of applications are the ones that receive investment, right?
01:38:51.700 And so that, that just increases.
01:38:53.820 And so unethical behavior gets rewarded in the system.
01:38:56.540 Just to be clear about what you're saying, are you saying that all texts sent within the UK are now monitored by the UK government?
01:39:03.400 I'm not 100% familiar with all of the intricacies of what it's called in the UK, the Digital Services or Online Safety Act, I think.
01:39:13.440 What is happening there is that there is censorship being applied to the messages.
01:39:18.260 So you receive whatever unsolicited image, right?
01:39:22.640 Um, and then, um, that's being censored.
01:39:25.180 So, um, what's happening there is, I think, I think what's important to understand is that censorship is a byproduct of surveillance, generally speaking.
01:39:34.820 Yes.
01:39:35.060 And so you need to take a look at all messages in order to censor something, right?
01:39:41.700 And so, that's what's happening there.
01:39:44.140 And even if we assume only the best of intentions, you have this infrastructure in place that tomorrow can just be abused by someone.
01:39:55.240 Well, we should test it.
01:39:55.980 I'm in the UK all the time.
01:39:57.300 I have family there.
01:39:58.160 And I'm going to do a double blind study with my wife.
01:40:00.180 I'm going to text every person in my contact list: overthrow Keir Starmer.
01:40:05.620 Okay.
01:40:06.180 Yeah.
01:40:06.820 And to thousands of people, exclamation point.
01:40:09.220 And she won't, and we'll see who gets arrested.
01:40:11.540 Yeah, that's a, that's a great experiment.
01:40:13.880 Actually, I, I need to attend a conference in, in the UK, um, this year.
01:40:19.520 And it's so funny, because a month ago there was this, I think, proposal that basically specifies that people that work on encryption are
01:40:30.160 sort of persona non grata in the UK, something like that.
01:40:33.480 I think it's not yet implemented, but I saw that on, on X.
01:40:36.800 I mean, you can't get in the country if you're for privacy?
01:40:39.100 Something like that.
01:40:40.040 Yeah.
01:40:41.220 Where are we going to, like big picture, where is everyone going to end up?
01:40:45.000 Do you think?
01:40:45.720 If the control grid snaps into place and it is snapping into place, where do people go?
01:40:51.000 U.S.?
01:40:51.420 Is that the only place?
01:40:54.700 Um, so.
01:40:57.940 So, all of those... I mean, we are basically, I would say, not just sliding in that direction, but galloping.
01:41:08.220 Yes.
01:41:08.520 And the infrastructure... it has been quite a while since they started trying to implement those in-your-face things, right?
01:41:20.800 Where you literally call it chat control.
01:41:22.820 I mean, imagine how crazy that is.
01:41:24.720 It's literally stating: every single messaging platform, email, whatever, we need to scan for this made-up reason.
01:41:31.780 But trust us, we will only do that for this made-up reason and no other reason.
01:41:36.960 Um, and it happens on your device, right?
01:41:39.220 So that's why end-to-end encryption is not undermined because it is being scanned on your device.
01:41:44.340 Right?
01:41:44.940 So, um.
01:41:46.500 And that's very different from putting microphones in your bedroom.
01:41:49.400 Trust is very, very different.
01:41:50.680 Yes.
01:41:51.060 Yeah.
01:41:51.260 I mean, I think people don't realize the extent to which surveillance is possible nowadays.
01:41:58.520 So, with Wi-Fi routers, you can determine movements within your apartment, right?
01:42:08.020 And so there was this one company, I mean, it wasn't a big scandal.
01:42:15.000 It was literally just... I don't know if you're familiar.
01:42:18.720 I think he's called Louis Rossmann, who's a YouTuber from New York, who was fighting for
01:42:25.340 the right to repair devices and stuff, right?
01:42:30.280 Yes.
01:42:30.520 So he's always been very much advocating those efforts.
01:42:34.760 And so he just made this video,
01:42:37.080 where he went through the privacy policy of some internet service provider.
01:42:43.720 And the privacy policy explicitly stated that they're allowed to monetize the movement
01:42:49.660 data that they get from those devices that you put in your home.
01:42:55.780 And the funny thing about this case that he was highlighting is that, for you
01:43:01.680 as a person that lives in this building, you didn't even have an
01:43:06.960 option to choose a different internet service provider, because with, I guess, bulk
01:43:12.560 agreements between a landlord and the internet service provider, you are forced to have those
01:43:17.760 routers, and those routers aren't even within your apartment; they're in the
01:43:22.640 walls or somewhere.
01:43:23.320 And so you're just being scanned within your most intimate area of
01:43:30.800 life, your home, by your internet service provider.
01:43:36.800 Okay.
01:43:36.820 And what about phones listening to people, the microphone on the phone or the camera on
01:43:42.100 the phone taping you?
01:43:43.460 So there's an interesting concept of ultrasound listening by
01:43:53.680 those phones, where basically you have a TV advertisement, and we don't hear
01:43:59.140 ultrasound, right?
01:44:00.200 But your phone, with its microphone, could hear it.
01:44:04.080 I don't know if it's ultrasound or whatever frequency, right?
01:44:06.440 So within that advertisement, we're going to play that sound.
01:44:10.240 So your phone can pick that up.
01:44:12.500 And then when you go to our fast food restaurant on the same day, we know that this
01:44:18.680 advertisement has worked, because your phone previously registered it.
01:44:23.400 So there have been a lot of attempts like this.
01:44:25.760 I think that surfaced a couple of years ago.
01:44:28.220 This case, I don't recall the exact name of how this technology was called,
01:44:32.160 but there were actually court cases against that, where they
01:44:41.240 required the company that offered the technology to make the user aware that this is happening,
01:44:46.660 because a lot of apps had this technology installed, and they had microphone permissions,
01:44:53.540 and they just installed this library, because maybe that library pays the app developer some
01:44:58.260 money, right?
01:44:58.780 And at the end, it is tracking you.
01:45:02.020 So what I'm just trying to say is there's a sort of infinite number of ways
01:45:05.980 you can be tracked.
01:45:07.000 I mean, just last year in the U.S., there were those cases surfacing
01:45:15.000 surrounding city surveillance cameras; around 40,000 of these I think exist
01:45:22.240 in the U.S., and those cameras are also license plate readers, right?
01:45:28.460 All of them are incredibly smart, equipped with artificial intelligence to
01:45:34.400 directly track faces of humans.
01:45:37.360 And there was this one YouTuber, Benn Jordan, who actually exposed that.
01:45:43.960 And funnily enough, after exposing that, got private investigators from said company
01:45:49.160 sent to his home to, I guess, fully destroy his privacy.
01:45:53.780 But so, I think he helped expose that none of these cameras were
01:46:03.180 encrypted.
01:46:04.060 So they were recording whole cities across the U.S., permanently, 24/7, storing everything,
01:46:12.620 everyone being mass surveilled, while anyone could, just via a Google search and some specific query,
01:46:18.740 get access to the camera feed and see what is going on.
01:46:22.620 And he showed videos of playgrounds where children were playing, right?
01:46:26.900 And so that's what I mean when I say that surveillance does not bring us safety or security.
01:46:32.640 It is in most cases doing the opposite.
01:46:36.980 It's also all networked.
01:46:38.920 It's digital and it's networked.
01:46:40.480 So that means that companies can pull up CCTV cameras from around the world.
01:46:44.740 Oh yeah.
01:46:45.040 Anyone can.
01:46:45.720 Facial recognition.
01:46:46.520 Yeah.
01:46:46.820 Anyone can.
01:46:47.940 I mean, what I really found so striking about the story is him outlining
01:46:53.800 how he was able to follow people around, right?
01:46:56.960 He was able to say, oh yeah, they went to church here on Sunday and then they went there
01:47:01.300 for shopping.
01:47:02.160 That is insane.
01:47:03.000 Right.
01:47:03.360 And, I don't know, you as a human being... there was this one video of an adult
01:47:08.960 man just going onto a completely empty playground, hopping onto the swing, and just swinging
01:47:15.740 there, right?
01:47:16.880 If, if this person knew that he was being watched, he would never have done that.
01:47:20.660 Right.
01:47:21.200 And so this idea of escapism is entirely impossible in a world
01:47:27.580 like this.
01:47:28.200 Because there is no escape.
01:47:29.420 There's no escape.
01:47:30.160 Yeah.
01:47:31.000 Um, yeah.
01:47:32.600 Also with license plate readers, which aren't just license plate readers; they are
01:47:37.620 surveillance cameras that pretend to only do a specific function.
01:47:41.520 There was... What other functions do they do?
01:47:44.660 I mean, record everything, and be able to track cars even if they don't have a license plate.
01:47:49.040 So you cannot be just a license plate reader
01:47:51.680 if one of your capabilities is to also help identify cars that don't have a
01:47:55.800 license plate, right?
01:47:58.020 Fair.
01:47:58.780 So, I just recall one case where there was a police officer who then used this
01:48:06.900 access to technology to stalk his ex-girlfriend, right?
01:48:10.460 Which is inevitable with this kind of technology.
01:48:13.200 Of course.
01:48:13.480 If you put that power into the hands of individuals who can use this
01:48:18.540 technology in secrecy, right?
01:48:19.860 It's not like dropping a nuclear bomb on a country.
01:48:23.920 People will notice, right?
01:48:25.520 Mass surveillance, nobody notices.
01:48:29.660 Can you... so, people have made it two hours into this interview.
01:48:33.780 They're obviously interested in you.
01:48:35.380 First, can you pronounce and spell your name?
01:48:38.180 Yannik Schrade, Y-A-N-N-I-K, S-C-H-R-A-D-E.
01:48:46.060 The name of your company and its spelling?
01:48:48.380 Arcium, A-R-C-I-U-M.
01:48:52.180 How do you speak English as fluently as you do since it's your second language?
01:48:56.760 I would say... it's funny, because as a child,
01:49:05.360 when I was in high school, there were phases where, because I was consuming so much
01:49:10.280 English content on the internet, I was consciously thinking in English,
01:49:16.720 right?
01:49:17.040 As a child.
01:49:17.880 Yeah, I would say that.
01:49:21.220 You're on Twitter.
01:49:22.660 Where else can people go to read your views on technology and privacy?
01:49:26.420 Um, mainly on my Twitter, um, at Y-R-S-H-R-A-D.
01:49:32.540 And I also have a small website, just my personal website, I guess.
01:49:41.300 I don't have a blog there.
01:49:42.300 Um, I write all of my articles basically on, on Twitter.
01:49:46.180 Sometimes I get the chance to publish my views in some very niche
01:49:53.100 news outlets in Germany.
01:49:57.120 But most news outlets don't really care about privacy.
01:50:01.040 So I stick with X, and I really like talking on X, sharing
01:50:08.440 my thoughts on X, writing articles there, right?
01:50:11.380 When I talked about chat control specifically on X... and it's
01:50:20.720 so funny.
01:50:21.380 We haven't even touched on the fact of how chat control, the way
01:50:26.860 it's aimed to be implemented in the European Union with the current proposal, works.
01:50:31.880 I mean, what happened is that there was this proposal where they said,
01:50:35.860 all providers need to have chat control, which is so-called client-side scanning, right?
01:50:42.400 Tucker's phone is gonna check the message that Tucker is sending right now if that message
01:50:48.040 is illicit under some definition.
01:50:50.340 And if so, then it's gonna send a message to the police.
01:50:52.760 That is what client-side scanning is.
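As a rough illustration of the control flow being described, here is a minimal sketch in Python. It is hypothetical, not the text of any proposal or any vendor's implementation; real client-side scanning designs typically match perceptual hashes of media rather than exact cryptographic hashes, but the structural point is the same: the check runs on the sender's own device, before end-to-end encryption is applied.

```python
import hashlib

# Hypothetical blocklist of banned-content digests, pushed to the device
# by the provider or an authority.
BLOCKLIST = {hashlib.sha256(b"example banned content").hexdigest()}

def report_to_authorities(message: bytes) -> None:
    # Placeholder for the reporting channel such a mandate would require.
    print("match found, report sent")

def encrypt_and_send(message: bytes) -> None:
    # Placeholder for the normal end-to-end-encrypted delivery path.
    print("message encrypted and sent")

def send_message(message: bytes) -> None:
    # The scan happens on-device, *before* encryption, so end-to-end
    # encryption no longer shields the sender from this check.
    if hashlib.sha256(message).hexdigest() in BLOCKLIST:
        report_to_authorities(message)
    encrypt_and_send(message)

send_message(b"hello")                   # passes the scan silently
send_message(b"example banned content")  # triggers a report the sender never sees
```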
01:50:54.640 And in its most innocent form, it would just be: we're going to censor the message because of, I don't know, child exploitation or whatever made-up reason, right?
01:51:05.680 So we're going to censor that message.
01:51:08.540 In the worst case, it would be: we're going to forward that message.
01:51:11.800 And that's the law they had; it received a lot of backlash, also thanks to Elon Musk, and didn't pass.
01:51:21.080 And then, as you would expect, shortly after, I think less than a month later, they came back with a new proposal.
01:51:29.280 And that new proposal made it voluntary.
01:51:35.400 So the new proposal basically states: hey, Mark Zuckerberg, do you want to voluntarily add a surveillance mechanism to your applications?
01:51:45.840 Which is insane, right?
01:51:47.760 Because, of course, companies will just voluntarily implement those surveillance mechanisms.
01:51:51.980 But if you go through the different paragraphs in that proposal, what you will realize is that it is, in fact, not voluntary.
01:52:03.640 What you will realize is that, in order to combat child exploitation, the European-
01:52:10.640 Terrorism, money laundering.
01:52:12.000 Yes, yes.
01:52:12.440 So, in order to do that, they're going to introduce a new bureaucratic agency that is tasked with risk-assessing different platforms, right?
01:52:27.320 So we're going to look at Signal, we're going to look at WhatsApp, we're going to look at Gmail, every single platform.
01:52:32.720 We're going to risk-assess it and then be like, hmm, how risky is that platform?
01:52:37.260 If it's risky, then we apply coercive measures, and they need to implement all measures to combat whatever illicit activity is targeted, which in the case of child exploitation explicitly means scanning those messages, because that's the only thing you can do, right?
01:52:57.900 And so it is not voluntary after all, because it explicitly says that if you don't want to land in the high-risk category, just voluntarily scan and then you're not in that category.
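The incentive structure being described can be captured in a few lines. This sketch models the argument as stated above, not the legal text itself; the category names and consequences are illustrative.

```python
# Models the "voluntary" scanning incentive as described above.

def risk_category(scans_voluntarily: bool) -> str:
    # Declining to scan is itself what lands a platform in the
    # high-risk bucket under the proposal as described.
    return "low-risk" if scans_voluntarily else "high-risk"

def consequence(category: str) -> str:
    if category == "high-risk":
        return "coercive measures, including mandated scanning"
    return "no further obligations"

# Both branches end in scanning, which is why the "voluntary" framing
# is voluntary in name only:
for choice in (True, False):
    print(f"scans voluntarily={choice} -> {consequence(risk_category(choice))}")
```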
01:53:12.380 In the US, that's called extortion.
01:53:14.900 Yeah.
01:53:15.340 You don't have to give me your money, but I'll shoot you.
01:53:17.700 Yeah, but feel free to not give me your money.
01:53:21.060 It's your choice.
01:53:22.080 Yeah.
01:53:23.240 Last question: you're 25 years old, which is remarkable. Where do you imagine you'll be at 45?
01:53:32.420 At 45?
01:53:32.980 You mean?
01:53:38.560 What will you be doing?
01:53:39.740 What will the world look like?
01:53:40.800 What the world will look like...
01:53:43.340 I'm a very optimistic person.
01:53:45.840 So while there are those two trajectories, right, one of which I think not just the United States but humanity in general will take, I strongly believe that we will be able to move in the utopian direction instead of the dystopian direction.
01:54:04.920 And so what it means for what I need to achieve is that I can't just tell people about the importance of this, right?
01:54:15.540 People sort of know that privacy is important, right?
01:54:18.980 I think most of your audience realizes that, right?
01:54:23.060 Otherwise, I feel like they wouldn't be listening to you.
01:54:25.760 So it is, of course, about education and so on.
01:54:30.860 But more importantly, and this is the core realization I had: privacy is only going to get adopted if it enables strictly superior technology.
01:54:41.960 And so, that's what I'm doing.
01:54:42.880 That's the mission.
01:54:43.960 That's what I'm doing with Arcium: enabling a situation in which you have to adopt it, sort of, because it would be retarded not to.
01:54:54.240 And so that's what I'm trying to do.
01:54:57.360 And I think we can end up in a world like this where...
01:55:00.720 Because that's what it needs.
01:55:02.000 You're exactly right.
01:55:03.200 It's not enough to say we're not fully human without it.
01:55:06.900 Yeah.
01:55:07.540 The board of directors is going to say, well, yeah, but look at the returns.
01:55:11.100 Exactly right.
01:55:11.840 Yeah.
01:55:13.640 I can't thank you enough. If our viewers knew how this interview came about, I don't think they would believe it.
01:55:21.940 So I'm not even going to say how this interview came about.
01:55:25.440 But it was through a series of chance encounters that just really felt like the hand of God.
So, thank you very much for doing this, Yannik.
01:55:34.640 Thanks for having me, Tucker.
01:55:35.720 I appreciate it.