Signal is an encrypted messaging app that lets you send and receive messages without being spied on. It's designed to let you communicate with your friends, family, and significant others without fear of government surveillance. In this episode, we talk about how Signal came about, why it's important, what it's trying to do to make the internet more private, and why we should all be using it. We also talk about privacy more broadly, and how we can learn to be more courageous while living in a world where we're constantly being surveilled and tracked. This episode was produced and edited by Annie-Rose Strasser and Alex Blumberg. Our theme song is Come Alone by Suneaters, courtesy of Lotuspool Records. Our ad music is by Build Buildings. Please rate, review, and subscribe on Apple Podcasts or wherever you get your podcasts. The opinions stated here are our own. We are not affiliated with any of the companies mentioned, and we do not own the rights to any of the products or services discussed. Thanks for listening, and please share this podcast; your support is greatly appreciated.
00:00:53.000Okay, well, you know, I think ultimately what we're trying to do with Signal is stop mass surveillance to bring some normality to the internet and to explore a different way of developing technology that might ultimately serve all of us better.
00:01:09.000We should tell people, maybe people just tuning in, Signal is an app that is...
00:01:15.000Explain how it works and what it does.
00:01:41.000Typically, if you want to send somebody a message, I think most people's expectation is that when they write a message and they press send, that the people who can see that message are the person who wrote the message and the intended recipient.
00:01:56.000There's tons of people who are in between, who are monitoring these things, who are collecting that information.
00:02:02.000And Signal's different because we've designed it so that we don't have access to that information.
00:02:08.000So when you send an SMS, that is the least secure of all messages.
00:02:14.000So if you have an Android phone and you use a standard messaging app and you send a message to one of your friends, that is the least secure of all when it comes to security, right?
00:02:33.000So iPhones use iMessage, which is slightly more secure, but it gets uploaded to the cloud, and it's a part of their iCloud service, so it goes to some servers and then goes to the other person.
00:02:49.000It's encrypted along the way, but it's still, it can be intercepted.
00:02:59.000Yeah, like Jeff Bezos' situation, exactly.
00:03:02.000Fundamentally, there's two ways to think about security.
00:03:04.000One is computer security, this idea that we'll somehow make computers secure.
00:03:09.000We'll put information on the computers, and then we'll prevent other people from accessing those computers.
00:03:13.000And that is a losing strategy that people have been losing for 30 years.
00:03:18.000Information ends up on a computer somewhere, and it ends up compromised in the end.
00:03:22.000The other way to think about security is information security, where you secure the information itself, so that you don't have to worry about the security of the computers.
00:03:28.000You could have some computers in the cloud somewhere, information's flowing through them, and people can compromise those things and it doesn't really matter because the information itself is encrypted.
00:03:38.000And so, you know, things like SMS, you know, the iMessage cloud backups, most other messengers, Facebook Messenger, all that stuff, you know, they're relying on this computer security model. And that ends up disappointing people in the end.
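To make the distinction concrete, here is a minimal sketch of the information security model using the PyNaCl library (the library choice and the names are illustrative assumptions; Signal's actual protocol adds forward secrecy with a double ratchet and is considerably more involved). The point is only that the relay in the middle handles nothing but ciphertext:

```python
# Minimal sketch of the "information security" model: encrypt the message
# itself so that a compromised relay server reveals nothing useful.
# Requires PyNaCl (pip install pynacl). Illustrative only -- not the
# Signal Protocol, which adds forward secrecy via a double ratchet.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only the public halves are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# A server in the middle only ever sees `ciphertext`; compromising it
# yields nothing readable, which is the whole point of the model.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```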
00:03:59.000What was unsatisfactory about the other options that were available?
00:04:04.000Well, because the way the internet works today is insane.
00:04:09.000Fundamentally, I feel like private communication is important because I think that change happens in private.
00:04:16.000Everything that is fundamentally decent today started out as something that was a socially unacceptable idea at the time.
00:04:24.000You look at things like abolition of slavery, legalization of marijuana, legalization of same-sex marriage, even constructing the Declaration of Independence.
00:04:35.000Those are all things that required a space for people to process ideas outside the context of everyday life.
00:04:47.000Those spaces don't exist on the internet today.
00:04:49.000I think it's kind of crazy the way the internet works today.
00:04:53.000If you imagined, you know, every moment that you were talking to somebody in real life, there was somebody there just with a clipboard, a stranger, taking notes about what you said.
00:05:04.000That would change the character of your conversations.
00:05:08.000And I think that in some ways, like, we're living through a shortage of brave or bold or courageous ideas, in part because people don't have the space to process what's happening in their lives outside of the context of everyday interactions.
00:05:26.000That's a really good way to put it, because you've got to give people a chance to think things through.
00:05:32.000But if you do that publicly, they're not going to.
00:05:35.000They're going to sort of like basically what you see on Twitter.
00:05:40.000If you stray from what is considered to be the acceptable norm or the current ideology or whatever opinions you're supposed to have on a certain subject, you get attacked, ruthlessly so.
00:05:56.000So you see a lot of self-censorship, and you also see a lot of virtue signaling, where people sort of pretend that they espouse a certain series of ideas because that'll get them some social cred.
00:06:10.000I think that communication in those environments is performative.
00:06:14.000You're either performing for an angry mob, you're performing for advertisers, you're performing for the governments that are watching.
00:06:23.000And I think also the ideas that make it through are kind of tainted as a result.
00:06:31.000Did you watch any of the online hearing stuff that was happening over COVID? You know, where city councils and stuff were having their hearings online?
00:06:41.000It was kind of interesting to me because it's like, you know, they can't meet in person, so they're doing it online.
00:06:46.000And that means that the public comment period was also online, you know?
00:06:50.000And so it used to be that, like, you know, if you go to a city council meeting, they have a period of public comment where, you know, people can just stand up and say what they think, you know?
00:06:58.000And, like, ordinarily, it's like, oh, you got to go to city hall, you got to, like, wait in line, you got to sit there, you know?
00:07:02.000But then when it's on Zoom, it's just sort of like anyone can just show up on the Zoom thing.
00:07:06.000You know, they just dial in and they're just like, here's what I think, you know?
00:07:10.000You know, it was kind of interesting because particularly when a lot of the police brutality still was happening in Los Angeles, I was watching the city council hearings and people were just like, you know, they were just calling, you know, like, fuck you!
00:07:24.000I yield the rest of my time, fuck you!
00:07:26.000You know, it was just like really brutal and not undeservedly so.
00:07:33.000You know, what was interesting to me was just watching the politicians, basically, you know, who just had to sit there, and just, they were just like...
00:07:42.000And it was just like, you know, you get three minutes, and then there's someone else to get, you know, and they're just like, okay, and now we'll hear from, you know, like...
00:07:48.000And, you know, watching that, you sort of realize that it's like, to be a politician, you have to just sort of fundamentally not really care what people think of you, you know?
00:08:01.000You have to fundamentally just be comfortable sitting, you know, and having people yell at you, you know, in three minute increments for an hour or whatever, you know.
00:08:11.000And so it seems like what we've sort of done is like bred these people who are willing to do that, you know.
00:08:16.000And in some ways that's like a useful characteristic, but in other ways that's the characteristic of a psychopath, you know.
00:08:59.000No, but, and I'm, I think, you know, Trump is perfectly capable of just not caring.
00:09:03.000You know, just like people, like, you know, Grayson is just like, yeah, whatever, you know, I'm the best, they don't, you know.
00:09:07.000And, like, that's, you know, that's politics.
00:09:11.000But I think, you know, the danger is when that, you know, to do anything ambitious, you know, outside of politics or whatever, you know, requires that you're capable of just not caring, you know, what people think or whatever, because everything is happening in public.
00:09:24.000I think you made a really good point in that change comes from people discussing things privately because you have to be able to take a chance.
00:09:37.000You have to be daring and you have to be able to confide in people and you have to be able to say, hey, this is not right and we're going to do something about it.
00:09:46.000If you do that publicly, the powers that be that do not want change in any way, shape, or form, they'll come down on you.
00:09:54.000This is essentially what Edward Snowden was warning everyone about when he decided to go public with all this NSA information.
00:10:02.000He was saying, look, this is not what we signed up for.
00:10:06.000Someone's constantly monitoring your emails, constantly listening to phone calls.
00:10:13.000This mass surveillance thing...
00:10:13.000It's very bad for just the culture of free expression, just our ability to have ideas and to be able to share them back and forth and vet them out.
00:10:25.000I think when you look at the history of that kind of surveillance, there are a few interesting inflection points.
00:10:31.000At the beginning of the internet as we know it, in the early to mid-90s, there were these DOD efforts to do mass surveillance.
00:10:43.000They were sort of open about what they were doing.
00:10:47.000One of them was this program called Total Information Awareness.
00:10:53.000And they were trying to start this office, I think called the Total Awareness Office or something within the DoD.
00:10:58.000And the idea was they're just going to collect information on all Americans and everyone's communication and just stockpile it into these databases and then they would use that to mine those things for information.
00:11:09.000It was sort of like their effort to get in on this at the beginning of the information age.
00:13:42.000And people were like, no, I don't think so.
00:13:43.000But instead, everyone ended up just carrying cell phones at all times, which are tracking your location and reporting them into centralized repositories that the government has access to.
00:13:52.000And so, you know, this sort of like oblique surveillance infrastructure ended up emerging.
00:13:59.000And that was what, you know, people sort of knew about, but, you know, didn't really know.
00:14:34.000Cambridge Analytica was a firm that was using big data in order to forecast and manipulate people's opinions.
00:14:51.000In particular, they were involved in the 2016 election.
00:14:58.000It was sort of, you know, so it's like, you know, what Snowden revealed was PRISM, which was the cooperation between the government and these places where data was naturally accumulating, like Facebook, Google, etc., you know, and the phone company.
00:15:12.000And Cambridge Analytica, I think, was the moment that people were like, oh, there's like also sort of like a private version of PRISM, you know, that's like not just governments, but like the data is out there.
00:15:22.000And other people who are motivated are using that against us, you know?
00:15:25.000And so I think, you know, in the beginning it was sort of like, oh, this could be scary.
00:15:29.000And then it was like, oh, but, you know, we're just using these services.
00:15:33.000And then people were like, oh, wait, the government is, you know, using the data that we're, you know, sending to these services.
00:15:39.000And then people were like, oh, wait, like anybody can use the data against us.
00:15:43.000And they were like, oh, you know, it's like, I think things went from like, I don't really have anything to hide to like, wait a second, these people can...
00:15:49.000predict and influence how I'm going to vote based on what kind of jeans I buy?
00:15:54.000And then sort of where we are today, where I think people are also beginning to realize that the companies themselves that are doing this kind of data collection are also not necessarily acting in our best interests.
00:17:29.000Okay, I mean, I think you can say, like, no one anticipated that these things would be this significant.
00:17:36.000But I also think that there's, you know, I think ultimately, like, what we end up seeing again and again is that, like, bad business models produce bad technology, you know?
00:18:13.000And they're subject to external demands as a result.
00:18:17.000They have to grow infinitely, which is insane, but that's the expectation.
00:18:21.000And so what we end up seeing is that the technology is not necessarily in our best interest because that's not what it was designed for to begin with.
00:18:31.000That is insane that companies are expected to grow infinitely.
00:18:55.000Yeah, and that's why, I mean, I think the Silicon Valley obsession with China is a big part of that, where people, they're just like, wow, that's a lot of people there.
00:19:58.000Someone had to stick a – like, literally, they're getting it out of the ground, digging into the dirt to get it out of the ground.
00:20:05.000We were talking about it on the podcast.
00:20:06.000They were like, is there a way that this could – is there a future that you could foresee where you could buy a phone that is guilt-free?
00:20:17.000If I buy a pair of shoes, like I bought a pair of boots from my friend Jocko's company.
00:20:51.000With a phone, I have this bizarre disconnect.
00:20:53.000I try to pretend that I'm not buying something that's made in a factory where there's a fucking net around it because so many people jump to their deaths that instead of trying to make things better, they say, we're going to put nets up, catch these fuckers, put them back to work.
00:21:48.000You don't want to buy a slave phone, right?
00:21:52.000Yeah, I mean, but okay, so, you know, I feel like it's difficult to have this conversation without having a conversation about capitalism, right?
00:21:59.000Because, like, ultimately, you know, what we're talking about is, like, externalities, that the prices of things don't incorporate their true cost, you know, that, like, you know, we're destroying the planet for plastic trinkets and reality television, you know, like...
00:22:12.000We can have the full conversation if you like.
00:22:19.000Because when most people know the actual...
00:22:24.000From the origin of the materials, like how they're coming...
00:22:30.000How they're getting out of the ground, how they're getting into your phone, how they're getting constructed, how they're getting manufactured and assembled by these poor people...
00:22:41.000When most people hear about it, they don't like it.
00:22:57.000So, like, if you have a car that you know is being made by slaves, or a car that's being made in Detroit by union workers, wouldn't you choose the union-made car, as long as they're both of equal quality?
00:23:10.000I think a lot of people would feel good about their choice.
00:23:14.000If they could buy something that, well, no, these people are given a very good wage.
00:23:18.000They have health insurance and they're taken care of.
00:24:00.000I mean, you know, even agriculture, you know, it's just like, you know, the sugar you put in your coffee, you know, it's like, I've been to the sugar beet harvest, you know, it's apocalyptic, you know, it's like, you know, so I think there's just like an aspect of civilization that we don't usually see or think about.
00:24:21.000Not non-conscious, but I mean conscious capitalism would be the idea that you want to make a profit, but you only want to make a profit if everything works.
00:24:32.000Like the idea of me buying my shoes from origin.
00:24:36.000Like knowing, okay, these are the guys that make it.
00:25:38.000I think it's difficult to be in that market, if you want to be in the market of conscious capitalism or whatever, because it's a market for lemons.
00:25:46.000Because it's so easy to just put a green logo on whatever it is that you're creating, and no one will ever see the back of the supply chain.
00:26:24.000All the various elements that are involved in all these different processes, all these different things that we buy and use.
00:26:31.000And then, as you said, they're apocalyptic, which is a great way of describing it.
00:26:36.000If you're at the ground watching these kids pull coltan out of the ground in Africa, you'd probably feel really sick about your cell phone.
00:27:00.000But I think if you put humans together and you give them this diffusion of responsibility that comes from a corporation and then you give them a mandate, you have to make as much money as possible every single year.
00:27:11.000And then you have shareholders and you have all these different factors that will allow them to say, well, I just work for the company.
00:27:21.000You know, I just, you know, you got the guy carving up a steak saying, listen, I'm so sorry that we have to use slaves, but look, Apple's worth $5 trillion.
00:27:29.000We've done a great job for our shareholders.
00:27:32.000At the end of the line, follow it all the way down to the beginning, and you literally have slaves.
00:27:37.000Yeah, I fundamentally agree, and I think that that's, you know, that's...
00:27:45.000Anytime you end up in a situation where, like, most people do not have the agency that they would need in order to direct their life the way that they would want, you know, direct their life so that we're living in a sane and sustainable way,
00:28:04.000And I think that's the situation we're in now, you know.
00:28:06.000And honestly, I feel like, you know, the stuff that we were talking about before of, you know, people...
00:28:13.000You know, sort of being mean online is a reflection of that.
00:28:17.000You know, that's the only power that people have.
00:28:25.000If the only thing you can do is call someone a name, you're going to call them a name.
00:28:32.000And I think that it's unfortunate, but I think it is also unfortunate that most people have so little agency and control over the way that the world works that that's all they have to do.
00:28:48.000And I guess you would say also that the people that do have power, that are running these corporations, don't take into account what it would be like to be the person at the bottom of the line.
00:29:17.000They've probably done what they think is something.
00:29:22.000Even the CEO of a company is someone who's just doing their job at the end of the day.
00:29:27.000They don't have ultimate control and agency over how it is that a company performs because they are accountable to their shareholders, they're accountable to the board.
00:29:36.000I think there is a tendency for people to look at what's happening, particularly with technology today, And think that it's the fault of the people, the leaders of these companies.
00:29:55.000Slavoj Žižek always talks about when you look at the old political speeches, if you look at the fascist leaders, they would give a speech and when there was a moment of applause, they would just sort of stand there and accept the applause because in their ideology,
00:30:11.000they were responsible for the thing that people were applauding.
00:30:16.000And if you watch the old communist leaders, like when Stalin would give a speech and he would say something and there would be a moment of applause, he would also applaud.
00:30:24.000Because in their ideology of historical materialism, they were just agents of history.
00:30:29.000They were just the tools of the inevitable.
00:31:03.000And at this point, if we look at where we are in 2020, it seems inevitable.
00:31:09.000It seems like there's just this unstoppable amount of momentum behind innovation and behind just the process of Creating newer, better technology and constantly putting it out and then dealing with the demand for that newer,
00:31:24.000better technology and then competing with all the other people that are also putting out newer, better technology.
00:31:35.000We are helping the demise of human beings.
00:31:38.000Because I feel, and I've said this multiple times and I'm going to say it again, I think that we are the electronic caterpillar that will give way to the butterfly.
00:31:52.000We are putting together something that's going to take over.
00:31:56.000We're putting together some ultimate being, some symbiotic connection between humans and technology, or literally an artificial version of life, not even artificial, a version of life constructed with silicon and wires and things that we're making.
00:32:14.000If we keep going the way we're going, we're going to come up with a technology that... I think we're a ways away.
00:32:31.000Yeah, we're a ways away, but how many ways?
00:32:49.000The Turing test is if someone sat down with, like in Ex Machina. Remember? It was one of my all-time favorite movies, where the coder is brought in to talk to the woman, and he falls in love with the robot lady, and she passes the Turing test...
00:33:41.000I mean, just think that this man back then was thinking there's going to be a time where we will have some kind of a creation where we imitate life, the current life that we're aware of,
00:33:59.000where we're going to make a version of it that's going to be indistinguishable from the versions that are biological.
00:34:05.000That very guy, by whatever twisted ideas of what human beings should or shouldn't do, whatever expectations of culture at the time, is forced to be chemically castrated and winds up committing suicide.
00:35:23.000And so instead we have this empirical test where it's just sort of like, well, if you can't tell the difference without being able to see it, then we'll just call that.
00:35:33.000I think that is really a lot closer than we think.
00:36:02.000They're gonna come up with some cool name for them.
00:36:05.000Yeah, I mean, I think that there's a lot of, most of what I see in like the artificial intelligence world right now is not really intelligence, you know, it's just matching, you know, it's like you show a model 10 million images of cats, and then you can show it an image,
00:36:21.000and it will be like, I predict that this is a cat.
00:36:24.000And then you can show it an image of a truck, and it'll be like, I predict that this is not a cat.
00:36:30.000I think there's one way of looking at it that's like, well, you just do that with enough things enough times, and that's what intelligence is.
00:36:40.000The way that it's being approached right now, I think, is also dangerous in a lot of ways, because what we're doing is just feeding information about the world into these models, and that just encodes the existing biases and problems with the world into the things that we're creating.
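As a concrete illustration of the "matching" point, here is a toy nearest-neighbor classifier (a deliberately simplistic sketch with made-up numbers, not how production vision models actually work): it "recognizes" a cat purely by proximity to labeled examples, and it can only ever reflect whatever its examples contain, biases included:

```python
# Toy 1-nearest-neighbor "classifier": the prediction is pure matching
# against labeled examples. There is no understanding here; the model can
# only echo its training data, including that data's biases and gaps.
import numpy as np

# Pretend these are features extracted from images (made-up 3-d vectors).
train_X = np.array([[0.9, 0.1, 0.0],   # a cat
                    [0.8, 0.2, 0.1],   # another cat
                    [0.1, 0.9, 0.8]])  # a truck
train_y = ["cat", "cat", "not a cat"]

def predict(x):
    # Copy the label of the closest training example -- nothing more.
    distances = np.linalg.norm(train_X - x, axis=1)
    return train_y[int(np.argmin(distances))]

print(predict(np.array([0.85, 0.15, 0.05])))  # "I predict this is a cat"
print(predict(np.array([0.0, 1.0, 0.9])))     # "I predict this is not a cat"
```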
00:37:02.000This ecosystem is moving and it's advancing.
00:37:04.000The thing that I think is unfortunate is that right now, that ecosystem, this really capital-driven investment startup ecosystem, has a monopoly on groups of young people trying to do something ambitious together in the world.
00:37:23.000In the same way that I think it's unfortunate that grad school has a monopoly on groups of people learning things together.
00:37:31.000Part of what we're trying to do different with Signal is it's a non-profit because we want to be for something other than profit.
00:37:39.000We're trying to explore a different way of groups of people doing something mildly ambitious.
00:37:44.000Has anyone come along and go, I know it's a non-profit, but would you like to sell?
00:37:55.000It's kind of amazing, though, that you guys have figured out a way to create, like, basically a better version of iMessage that you could use on Android.
00:38:04.000Because one of the big complaints about Android is the lack of any encrypted messaging services.
00:38:57.000So, they've been trying to move from this very old standard called SMS that you mentioned before to this newer thing called RCS, which actually I don't know what that stands for.
00:39:08.000I think in my mind I always think of it as standing for too little too late.
00:39:17.000So they're doing that on the part of the ecosystem that they control, which is the devices that they make and sell.
00:39:23.000And they're trying to get other people on board as well.
00:39:27.000Originally, RCS didn't have any facility for end-to-end encryption.
00:39:31.000And they're actually using our stuff, the Signal Protocol, in the new version of RCS that they're shipping.
00:39:38.000So I think they've announced that, but I don't know if it's on or not.
00:39:42.000I have two bones to pick with you guys.
00:39:44.000Two things that I don't necessarily like.
00:39:46.000One, when I downloaded Signal and I joined, basically everyone that I'm friends with who was also on Signal got a message that I'm on Signal.
00:41:28.000With the discovery question of you don't want people to know that you're on Signal, it's kind of... So, we're working on it, but it's a more difficult problem than you might imagine because you want some people to know that you're on...
00:43:16.000And so, like, a lot of what we're trying to do is actually just square the way the technology actually works with what it is that people perceive.
00:43:23.000And so, like, fundamentally, right now, you know, Signal is based on phone numbers.
00:43:29.000If you register with your phone number, like, people are going to know that they can contact you on Signal.
00:43:35.000It's very difficult to make it so that they can't, you know. Like, if we didn't do that, they could hit the compose button and just see that they could send you a message.
00:43:45.000They would just see you in the list of contacts that they can send messages to.
00:43:48.000And then if we didn't display that, they could just try and send you a message and see whether a message goes through.
00:43:54.000It's always possible to detect whether you're on Signal, the way that things are currently designed.
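A rough sketch of why this is hard (a naive hashed-lookup illustration with invented numbers, not Signal's actual mechanism; Signal's real private contact discovery runs inside secure enclaves to hide the client's address book): any service that can answer "can I message this number?" necessarily reveals who is registered.

```python
# Naive contact-discovery sketch: a client asks which of its contacts are
# registered. Any scheme that answers "can I message X?" also reveals that
# X is on the service -- the tension described above. (Hypothetical
# illustration; hashing phone numbers is weak because the input space is
# small, and Signal's real design is different.)
import hashlib

def h(number):
    return hashlib.sha256(number.encode()).hexdigest()

# Server side: hashes of registered numbers (invented examples).
registered = {h("+15551230001"), h("+15551230002")}

def discover(my_contacts):
    """Return which of my contacts are registered -- thereby revealing it."""
    return [n for n in my_contacts if h(n) in registered]

print(discover(["+15551230001", "+15559998888"]))  # ['+15551230001']
```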
00:44:00.000It's interesting also how it works so much differently with Android than it does with iMessage.
00:44:04.000With Android, it'll also send an SMS. I noticed that I can use Signal as my main messaging app on Android.
00:44:13.000And it'll send SMS or it'll send a Signal message.
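The dual-transport behavior described here can be sketched like this (function and variable names are invented for illustration; the real client's logic is more involved): prefer the encrypted channel when the recipient is known to be registered, and fall back to plain SMS otherwise.

```python
# Sketch of Signal-as-default-SMS-app behavior on Android: use the
# encrypted channel when the recipient is registered, else fall back to
# plain SMS. Names are hypothetical; the real client is more involved.

REGISTERED = {"+15551230001"}  # numbers known to be on Signal (invented)

def send_message(recipient, text):
    if recipient in REGISTERED:
        # End-to-end encrypted: only the recipient can read it.
        return f"[Signal] encrypted to {recipient}: {text}"
    # Plain SMS: the carrier and anyone in between can read it.
    return f"[SMS] plaintext to {recipient}: {text}"

print(send_message("+15551230001", "hi"))  # encrypted path
print(send_message("+15559998888", "hi"))  # SMS fallback
```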
00:45:44.000But I mean, a lot of that is like, I think the reason why it is that way is kind of interesting to me, which is, you know, it's like these are protocol, you know, it's like when you're just using a normal SMS message on Android,
00:46:55.000And I think, I mean, it's like, I think the thing that everyone's worried about right now with Apple is like, you know, Apple, you know what I said before of like bad business models produce bad technology.
00:47:06.000You know, thus far, Apple's business model is much better than, you know, Google or Facebook or Amazon or, you know, like... Their business is predicated on selling phones, selling hardware.
00:47:20.000And that means that they can think a little bit more thoughtfully about the way that their software works than other people.
00:47:27.000And I think what people are concerned about is that that business model is going to change.
00:47:36.000They're approaching an asymptote of how many phones they can sell.
00:47:39.000And so now they're looking at software.
00:47:41.000They're like, what if we had our own search engine?
00:48:40.000They delete all the information after you make...
00:48:43.000If you go to a destination, it's not saving it, sending it to a server, keeping track of what was there and what wasn't there and how you traveled, and sharing that information.
00:49:28.000For sure, the intent behind the software that they have constructed, I think, has been much better than a lot of the other players in Big Tech.
00:49:36.000I think the concern is just that as that software becomes a larger part of their bottom line, that that might change.
00:49:44.000I wonder if they can figure out a way to have an I don't give a fuck phone or I care phone.
00:49:51.000Like, you want to have an I don't give a fuck phone?
00:49:53.000This phone is like, who knows what's making it?
00:52:03.000But I think if you were the CEO of Apple and you were like, this is a priority, we're going to spend, you know, however many trillions of dollars it takes to do this...
00:52:22.000But it's not like, even then, if you were just like, you know, I'm willing to take the hit, you know, I'm going to do, no one can oust me or whatever.
00:52:33.000Then it's like your share price plummets, which means that your employee retention plummets because those people are also working for the equity.
00:53:17.000It says, 100% recyclable and renewable materials across all of our products and packaging because making doesn't have to mean taking from the planet.
00:54:24.000If they can actually do that, 100% recyclable and renewable materials, if they can figure out a way to do that, and to have recyclable materials and have all renewable electricity,
00:54:40.000whether it's wind or solar, if they could really figure out how to do that, I think that would be pretty amazing.
00:55:14.000And that's the, when you get into these insidious arguments about, or conversations about conspiracies, like conspiracies to keep people impoverished, they're like, well, why would you want to keep people impoverished?
00:55:26.000Well, who's going to work in the coal mines?
00:55:29.000You're not going to get wealthy, highly educated people to work in the coal mines.
00:55:32.000You need someone to work in the coal mines.
00:55:36.000What you do is you don't help anybody get out of these situations.
00:55:40.000So you'll always have the ability to draw from these impoverished communities, these poor people that live in Appalachia or wherever their coal miners are coming from.
00:55:54.000Like, I have a friend who came from Kentucky, and he's like, the way he described it to me, he goes, man, you've never seen poverty like that.
00:56:02.000Like, people don't want to concentrate on those people because it's not as glamorous as some other forms of poverty.
00:56:07.000He goes, but those communities are so poor.
00:56:35.000That's why it's rare that a company comes along and has a business plan like Signal where they're like, we're going to be non-profit.
00:56:43.000We're going to create something that we think is of extreme value to human beings, just to civilization in general, the ability to communicate anonymously or at least privately.
00:56:57.000It's a very rare thing that you guys have done, that we decided to do this and to do it in a non-profit way.
00:57:04.000What was the decision that led up to that?
00:58:41.000And also, it's like, you know, a project like this is not just the software that runs on your phone, but the service of, like, you know, moving the messages around on the internet, and that requires a little bit of care and attention, and if you're not doing that, then it will dissipate.
00:58:56.000And if you're doing something non-profit, the way you're doing it, how do you pay everybody?
00:59:02.000Yeah, well, okay, so, you know, the history of this was, um, I think before the internet really took over our lives in the way that it has, there were the kind of social spaces for people to experiment with different ideas outside of the context of their everyday lives,
00:59:21.000you know, like art projects, punk rendezvous, experimental gatherings.
01:01:07.000There were some really important things that happened then.
01:01:10.000In the 80s, there was this person who was this lone maniac who was writing a bunch of papers about cryptography during a time when it wasn't actually that relevant because there was no internet.
01:01:20.000The applications for these things were harder to imagine.
01:01:24.000And then in the late 80s there was this guy, a retired engineer, who discovered the papers that this maniac, David Chaum, had been writing and was really...
01:01:37.000Was he doing this in isolation or was he a part of a project or anything?
01:01:45.000But he did a lot of the notable work on using the primitives that had already been developed.
01:01:55.000And he had a lot of interesting ideas and...
01:01:57.000There's this guy who was a retired engineer, his name was Tim May, who was kind of a weird character.
01:02:03.000And he found these papers by David Chaum, and was really enchanted by what they could represent for a future.
01:02:11.000And he wanted to write like a sci-fi novel that was sort of predicated on a world where cryptography existed and there was a future where the internet was developed.
01:02:19.000And so he wrote some notes about this novel, and he titled the notes The Crypto Anarchist Manifesto.
01:02:26.000And he published the notes online, and people got really into the notes.
01:02:31.000And then he started a mailing list in the early 90s called the Cypherpunks mailing list.
01:02:37.000And all these people started, you know, joined the mailing list and they started communicating about, you know, what the future was going to be like and how, you know, they needed to develop cryptography to live their, you know, crypto-anarchy future.
01:02:49.000And at the time, it's strange to think about now, but cryptography was somewhat illegal.
01:02:58.000So if you wrote a little bit of crypto code and you sent it to your friend in Canada, that was the same as, like, shipping Stinger missiles across the border to Canada.
01:03:07.000So did people actually go to jail for cryptography?
01:03:10.000There were some high-profile legal cases.
01:03:15.000I don't know of any situations where people were tracked down as munitions dealers or whatever, but it really hampered what people were capable of doing.
01:03:25.000There were some people who wrote some crypto software called Pretty Good Privacy, PGP. And they printed it in a book, like an MIT Press book, in a machine-readable font.
01:03:39.000And then they're like, this is speech.
01:04:31.000Okay, International Traffic in Arms Regulations.
01:04:34.000It's a United States regulatory regime to restrict and control the export of defense and military-related technologies to safeguard U.S. national security and further U.S. foreign policy objectives.
01:04:46.000ITAR. Yeah, they were closed, and Anguilla was closed until like November.
01:06:11.000I discovered sailing by accident where I was like...
01:06:15.000Working on a project with a friend in the early 2000s, and we were looking on Craigslist for something unrelated, and we saw a boat that was for sale for $4,000.
01:06:22.000And I thought a boat was like a million dollars or something.
01:06:27.000There's probably even cheaper boats, you know?
01:06:30.000And so we got really into it, and we discovered that you can go to any marina in North America and get a boat for free.
01:06:36.000You know, that like every marina has a lien sale dock on it where people have stopped paying their slip fees, and the boats are just derelict and abandoned, and they've, you know, put them on these stocks.
01:07:39.000Because it's pretty amazing, you know, me and some friends used to sail around the Caribbean and...
01:07:46.000You know, the feeling of, like, you know, you pull up an anchor, and then you sail, like, you know, 500 miles to some other country or whatever, and you get there, and you drop the anchor, and you're just like, we...
01:08:58.000So then you're just like, okay, well, we started here, and then we headed on this heading, and we did that, and we traveled 10 miles, so we must be here.
01:09:06.000And then once a day, you can take a sight with your sextant, and then you can do some dead reckoning with a compass.
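The navigation being described is mechanical enough to sketch. Here is a minimal dead-reckoning step (spherical-earth approximation, invented starting position): advance an assumed position along a compass heading for a logged distance, then correct it whenever a sextant sight gives you a fix.

```python
# Minimal dead-reckoning step: advance a position along a compass heading
# for a given distance (spherical-earth approximation; invented numbers).
import math

EARTH_RADIUS_NM = 3440.065  # mean earth radius in nautical miles

def dead_reckon(lat, lon, heading_deg, distance_nm):
    """New (lat, lon) after sailing distance_nm on heading_deg."""
    d = distance_nm / EARTH_RADIUS_NM  # angular distance in radians
    lat1, lon1, h = map(math.radians, (lat, lon, heading_deg))
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(h))
    lon2 = lon1 + math.atan2(math.sin(h) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Example: start near St. Martin and sail 10 nm due west (heading 270):
print(dead_reckon(18.07, -63.05, 270, 10))  # roughly (18.07, -63.22)
```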
01:10:06.000But, like, yeah, I was also, like, just weirdly ideological about it, where, like, I had a job once in the Caribbean that was, like, I was almost like a camp counselor, basically, where there was this camp that was like a sailing camp, but it was, like, 13 teenagers, mostly from North America.
01:10:22.000Showed up in St. Martin and then got on a boat with me and another woman my age.
01:14:31.000If I win, and then I would like pick the thing that was like their sort of deepest fear, you know, it's like the really shy person had to like write a haiku about every day and then read it aloud at dinner.
01:14:42.000You know, like the, you know, the person who was like really into like having like a manicure, like wasn't allowed to shave her legs for the rest of the, you know, like that kind of thing.
01:14:53.000And so then by the end of it, it was just like, you know, everyone had lost, you know, so everyone was like reading the haiku at dinner and doing, you know.
01:14:58.000How are you so good at rock, paper, scissors?
01:15:01.000It's just, you know, skill, muscle, intuition.
01:16:27.000The only way, if you ever put a monetary equivalent to that, it would have to be a spectacular amount of money for me to let someone else program the show.
01:16:35.000I've never let anybody do that before.
01:18:33.000So how did you learn how to do all this stuff?
01:18:36.000Was it trial by fire when you were learning how to use all this, I mean, I don't want to call it ancient equipment, but mechanical equipment to figure out how to...
01:18:52.000Where the fuck does one learn how to operate a sextant and then navigate in the ocean?
01:19:00.000Uh, yeah, just, I would, you know, I started, uh, you know, me and some friends got a boat and, um, we started fixing it up and making a lot of mistakes and then, you know, started taking some trips and then...
01:20:06.000A friend of mine was living in San Francisco and he wanted to learn how to sail.
01:20:09.000And I was like, you know, what you should do is you should get like a little boat, like a little sailing thingy, you know, and then you can just anchor it like off the shore in this area that no one cares about.
01:20:17.000And, you know, you could just sort of experiment with this little boat.
01:20:19.000And so he started looking on Craigslist and he found this boat that was for sale for 500 bucks up in the North Bay.
01:20:25.000And every time we called the phone number, we got an answering machine that was like, hello, you've reached Dr. Ken Thompson, honorary.
01:20:33.000I'm unable to take your call, you know?
01:20:41.000And so finally we got in touch with this guy.
01:20:43.000We go up there, and it's the kind of situation where, like, we pull up, and there's, like, the trailer that the boat's supposed to go on, and it's just full of scrap metal.
01:22:46.000Yeah, I was wearing a PFD, a Type II PFD, and we took it to this boat ramp, and it was the end of the day, and the wind was blowing kind of hard, and the conditions weren't that good, but I was like, oh, we're just doing this little thing, this little maneuver, and we were in two boats.
01:23:03.000I built this little wooden rowing boat, and my friend was going to go out in that with one anchor, and I was going to sail out this boat.
01:23:19.000So he's going to go out in this little rowboat, and I was going to sail out this little catamaran.
01:23:25.000And we had two anchors, and we're going to anchor it, and then we're going to get in the rowboat and row back.
01:23:29.000And it seemed a little windy, and I got in the boat first, and I got out around this pier and was hit by the full force of the wind and realized that it was blowing like 20 knots.
01:23:38.000It was way, way too much for what we were trying to do.
01:23:40.000But I had misrigged part of the boat, so it took me a while to get it turned around.
01:23:45.000And by the time I got it turned around, my friend had rowed out around the pier, and he got hit by the force of the wind and just got blown out into the bay.
01:23:53.000So he's rowing directly into the wind and moving backwards.
01:24:35.000And so I go to turn around, and right as I'm turning around, a gust of wind hit the boat and capsized it before I could even know that it was happening.
01:24:44.000It's one moment, you're on the boat, and the next moment you're in the water.
01:27:56.000And you think about all the people like Joshua Slocum, Jim Gray, people who were lost at sea, and you realize they all had this thing that they went through, you know, this hour-long ordeal of just floating alone, and no one will even ever know what that was or what that was like, you know?
01:28:11.000And eventually, I realized I wasn't going to make it ashore.
01:32:16.000Yeah, that's why we were talking about sailing and Anguilla.
01:32:18.000So the people who moved to Anguilla were part of this moment of like...
01:32:23.000How much did that shift your direction in your life though?
01:32:26.000Did it change the way... I mean, I haven't had a near-death experience, but I've had a lot of psychedelic experiences, and in some ways I think they're kind of similar, in that life shifts to the point where whatever you thought of life before that experience is almost like, oh, come on, that's nonsense.
01:32:47.000Yeah, I mean, it changes your perspective, or it did for me.
01:32:49.000And, you know, because also in that moment, you know, it's like, you know, I think you go through this sort of embarrassing set of things where you're like, oh, I had these things I was going to do tomorrow.
01:33:01.000Like, I'm not going to be able to do them.
01:33:04.000And then you're like, wait, why is that the thing that I'm concerned about?
01:34:22.000Because Jamie and I were talking about that one day.
01:34:24.000Because they had to do something because if they didn't do something, Justin Bieber would be the number one topic every day, no matter what was happening in the world.
01:34:32.000I can believe that they wanted to change that because the problem was, at the time, Twitter was held together with bubblegum and dental floss.
01:34:41.000Every time Bieber would tweet, the lights would dim and the building would shake a little bit.
01:35:00.000Okay, so there's, you know, people talk about, like, invisible labor.
01:35:03.000Like, the invisible labor behind that tweet is just kind of comical, because it's like, when he did that, you know, people, like, you know, it's like my first day there, you know, it's like he tweeted something, and, you know, the building's, like, kind of shaking, and, like, alarms are going off.
01:35:16.000People are, like, scrambling around, you know?
01:35:19.000You know, it's like this realization where you're just like, never in my life did I think that anything Justin Bieber did would like really affect me in any like deep way, you know?
01:35:28.000And then here I am just like scrambling around to like facilitate.
01:35:31.000What are your thoughts on curating what trends and what doesn't trend and whether or not social media should have any sort of obligation in terms of...
01:35:43.000How things, whether or not people see things, like shadow banning and things along those lines.
01:35:51.000I'm very torn on this stuff because I think that things should just be.
01:35:57.000And if you have a situation where Justin Bieber is the most popular thing on the internet, that's just what it is.
01:36:06.000I get how you would say, well, this is going to fuck up our whole program, like what we're trying to do with this thing.
01:36:14.000What do you mean, fuck up our whole program?
01:36:15.000Well, what you're trying to do with Twitter, I mean, I would assume what you're trying to do is give people a place where they could share important information and, you know, have people, you know...
01:36:29.000I mean, Twitter has been used successfully to overturn governments.
01:36:38.000Break news on very important events and alert people to danger.
01:36:42.000There's so many positive things about Twitter.
01:36:45.000If it's overwhelmed by Justin Bieber and Justin Bieber fan accounts, if it's overwhelmed, then the top ten things that are trending are all nonsense.
01:36:56.000I could see how someone would think we're going to do a good thing by suppressing that.
01:37:11.000Why do you think they kept him from trending?
01:37:16.000Well, I mean, I don't know about that specific situation.
01:37:20.000I mean, I think, you know, looking at the larger picture, right, like...
01:37:27.000In a way, you know, it's like, if you think about, like, 20 years ago, whenever anybody talked about, like, society, you know, everyone would always say, like, the problem is the media.
01:37:53.000People were, you know, getting their own printing presses.
01:37:56.000We were convinced that if we made publishing more equitable, if everybody had the equal ability to produce and consume content, that the world would change.
01:38:09.000In some ways, what we have today is the fantasy of those dreams from 20 years ago.
01:38:37.000And also, that anybody could share their weird ideas about the world.
01:38:42.000And I think, in some ways, we were wrong.
01:38:46.000You know, that we thought, like, you know, the world we got today is like, yeah, like, if a cop kills somebody in the suburbs of St. Louis, like, everybody knows about it.
01:38:55.000I think we overestimated how much that would matter.
01:38:58.000And I think we also believed that the things that everyone would be sharing were, like, our weird ideas about the world.
01:39:05.000And instead, we got, like, you know, Flat Earth and, like, you know, anti-vax and, like, you know, all this stuff, right?
01:39:12.000And so it's, like, in a sense, like, I'm glad that those things exist because they're, like, they're sort of what we wanted, you know?
01:39:19.000But I think what we did, what we underestimated is, like, how important the medium is.
01:39:25.000Like, the medium is the message kind of thing.
01:39:26.000And that, like, What we were doing at the time of writing zines and sharing information, I don't think we understood how much that was predicated on actually building community and relationships with each other.
01:39:42.000Like, what we didn't want was just, like, more channels on the television.
01:39:45.000And that's sort of what we got, you know?
01:39:47.000And so I think, you know, it's like everyone is, like, on YouTube trying to monetize their content, whatever, you know?
01:40:04.000I think, like, you know, now that there's, like, you know, these two simultaneous truths that everyone seems to believe that are in contradiction with each other.
01:40:12.000You know, like, one is that, like, everything is relative.
01:40:16.000Everyone is entitled to their own opinion.
01:43:52.000If you set aside all of the takedown stuff, all the deplatforming stuff, if you say, okay, Facebook, Twitter, these companies, they don't do that anymore.
01:45:14.000He had a Twitter account that was set up for...
01:45:19.000It was Unity 2020. And the idea was, like, instead of looking at this in terms of left versus right, Republican versus Democrat, let's get reasonable people from both sides, like a Tulsi Gabbard and a Dan Crenshaw.
01:45:33.000Bring them together and perhaps maybe put into people's minds the idea that, like, this idea, this concept of it has to be a Republican vice president or a Republican president.
01:47:29.000So they just decided this is not worthy of trending.
01:47:32.000So you have arbitrary decisions that are being made by people, most likely because they feel that ideologically Kanye West is not aligned with...
01:47:42.000I mean, he was wearing the MAGA hat for a while.
01:47:45.000So they just decided this is not trending.
01:47:51.000You've got millions and millions and millions of people who are watching it.
01:47:54.000But I think this is the point, you know: whether it's people, whether it's algorithms, there are forces that are making decisions about what people see and what people don't see, and they're based on certain objectives that I think are most often business objectives.
01:48:48.000But I think actually focusing on the outlying cases of this person was deplatformed, this person was intentionally, ideologically not promoted or de-emphasized or whatever.
01:49:02.000I think that that, like, obfuscates or, you know, draws attention away from the larger thing that's happening, which is that, like, those things are happening just implicitly all the time.
01:49:15.000And that, like, it almost, like, serves to the advantage of these platforms to...
01:49:20.000Highlight the times when they remove somebody because what they're trying to do is reframe this as, like, okay, well, yeah, we've got these algorithms or whatever.
01:49:27.000The problem is there's just these bad people, you know, and we have to decide there's a bad content from bad people and we have to decide, you know, what to do about this bad content and these bad people.
01:49:36.000And I think that distracts people from the fact that like the platforms are at every moment making a decision about what you see and what you don't see.
01:49:54.000There's a problem of deplatforming, because in many ways, deplatforming decisions are being made based on ideology.
01:50:02.000It's a certain specific ideology that the people that are deplatforming the other folks have that doesn't align with the people that are being deplatformed.
01:50:11.000These people that are being de-platformed, they have ideas that these people find offensive or they don't agree with.
01:50:17.000So they say, we're going to take you off.
01:50:25.000Well, I think that there's a tendency for a lot of these platforms to try to define some policy about what it is that they want and they don't want.
01:50:37.000I feel like that's sort of a throwback to this modernist view of science and how science works and we can objectively and rigorously define these things.
01:50:47.000I just don't think that's actually how the world works.
01:51:15.000But it's still useful, and that's why it's taught.
01:51:18.000Because you can use it to predict motion outcomes, that kind of thing.
01:51:21.000What's incorrect about Newtonian physics in the sense that they shouldn't be teaching it?
01:51:27.000I mean, today, you know, people believe that the truth is that, you know, there's like, you know, relativity, like gravity is not a force.
01:51:33.000There's like, you know, these planes and stuff, whatever, you know, that like there are other models to describe how the universe works.
01:51:39.000And Newtonian physics is considered outmoded.
01:51:43.000But it still has utility in the fact that you can use it to predict the...
01:51:47.000So you're talking about in terms of quantum physics and string theory and a lot of these more...
01:51:51.000Yeah, it's like relativity at the large scale, quantum physics at the small scale.
01:51:56.000And even those things are most likely not true in the sense that they aren't consistent with each other and people are trying to unify them and find something that does make sense at both of those scales.
01:52:05.000The history of science is a history of things that weren't actually true.
01:52:08.000You know, Bohr's model of the atom, Newtonian physics.
01:52:11.000People have these, you know, Copernicus's model of the solar system.
01:52:15.000People have these ideas of how things work.
01:52:18.000And the reason that people are drawn to them is because they actually have utility.
01:52:21.000That it's like, oh, we can use this to predict the motion of the planets.
01:52:24.000Oh, we can use this to send a rocket into space.
01:52:25.000Oh, we can use this to, you know, have better outcomes, you know, for some medical procedure or whatever.
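To ground the utility point, a small worked example (standard physical constants, nothing from the conversation): Newtonian gravity alone predicts a low-earth-orbit satellite's speed and period to within observation, which is exactly why an "untrue" model is still taught and used.

```python
# Newtonian gravity is "outmoded" yet still useful: predict the orbital
# period of a satellite at ~400 km altitude (roughly the ISS) from
# F = GMm/r^2 alone. Numbers are standard physical constants.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the earth, kg
R_EARTH = 6.371e6    # mean radius of the earth, m

altitude = 400e3                 # meters above the surface
r = R_EARTH + altitude           # orbital radius
v = math.sqrt(G * M_EARTH / r)   # circular orbital speed
period = 2 * math.pi * r / v     # seconds per orbit

print(f"speed ~ {v/1000:.1f} km/s, period ~ {period/60:.0f} minutes")
# ~7.7 km/s and ~92 minutes -- matching observation, no relativity needed.
```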
01:52:47.000When you look at the emergence of science and people conceiving of it as a truth, it became this new authority that everyone was trying to appeal to.
01:52:56.000If you look at all of the 19th century political philosophy, I mean, okay, I think the question of truth is, like, you know, it's even a little squishy with the hard sciences, right?
01:53:08.000But once you get into, like, soft sciences, like social science, psychology, like, then it's even squishier, you know, that, like, these things are really not about truth.
01:53:17.000They're about, like, some kind of utility.
01:53:18.000And when you're talking about utility, the important question is, like, useful for what and to whom?
01:53:25.000And I think that's just always the important question to be asking, right?
01:53:28.000Because, you know, when you look at, like, all the 19th century political writing, it's all trying to frame things in terms of science in this way that it just seems laughable now.
01:53:35.000But, you know, like, at the time, they were just like, we're going to prove that communism is, like, the most true, like, social economic system in the world, you know?
01:53:43.000Like, there are whole disciplines of that.
01:53:45.000You know, people had like PhDs in that, you know, their whole research departments in the Soviet Union, people doing that.
01:53:51.000And we laugh about that now, but I don't think it's that different than like social science in the West, you know?
01:53:57.000And so I think, you know, it's like if you lose sight of that, then you try to, like, frame social questions in terms of truths.
01:54:06.000It's like, this is the kind of content that we want, and we can rigorously define that, and we can define why that's going to have the outcomes that we want it to.
01:54:13.000But once you get on that road, you're like, okay, well, terrorist stuff.
01:54:17.000We don't like terrorist stuff, so we're going to rigorously define that, and then we have a policy, no terrorist stuff.
01:54:24.000And then China shows up, and they're like, we've got this problem with terrorists, the Uyghurs.
01:54:32.000I think if people from the beginning acknowledged that all of objectivity is just a particular worldview and that we're not going to rigorously define these things in a way of what is true and what isn't, then I think we would have better outcomes.
01:54:59.000But that's, it's on a social media platform.
01:55:01.000But isn't there a weird thing when you decide that you have one particular ideology that's being supported and another particular ideology that is being suppressed?
01:55:14.000And this is what conservative people feel when they're on social media platforms.
01:55:19.000Almost all of them, other than the ones we talked about before, Parler and Gab and the alternative ones, they're all very left-wing in terms of the ideology that they support.
01:55:30.000The things that can get you in trouble on Twitter.
01:56:13.000And kind of amazing that he didn't do anything along the way while he was witnessing people get deplatformed, and particularly this sort of bias towards people on the left and this discrimination against people on the right.
01:56:30.000There's people on the right that have been banned and shadow banned and blocked from posting things.
01:56:36.000You run into this situation where you wonder what exactly is a social media platform.
01:56:45.000If it's just a small private company, and maybe you have some sort of a video platform and there's only a few thousand people on it, and you only want videos that align with your perspective...
01:56:59.000But when you're the biggest video platform on earth like YouTube and you decide that you are going to take down anything that disagrees with your perspective on how COVID should be handled...
01:58:03.000Is it the way that people get to express ideas?
01:58:06.000And isn't the best way to express ideas to allow people to decide, based on the better argument, what is correct and what's incorrect?
01:58:16.000Like, this is what freedom of speech is supposed to be about.
01:58:20.000It's supposed to be about, you have an idea, I have an idea, these two ideas come together, and then the observers get to go, hmm, okay, well, this guy's got a lot of facts behind him.
01:58:39.000There's going to be some people that go, no, there's a suppression of hollow earth and hollow earth is the truth and hollow earth facts and hollow earth theory.
01:58:46.000But you've got to kind of let that happen.
01:58:49.000You gotta kind of have people that are crazy.
01:58:52.000Remember the old dude that used to stand on the corners with the placards on, the world is ending tomorrow?
01:59:21.000It's going to influence people that are easily influenced.
01:59:23.000And the question is, who are we protecting and why are we protecting these people?
01:59:27.000Well, okay, but I think, in my mind, what's going on is, like, the problem is that it used to be that some person with very strange ideas about the world wearing a sign on the street corner shouting was just a person with very strange ideas about the world wearing a sign on the street corner shouting.
01:59:43.000Now, there's somebody, you know, with very strange ideas about the world, and those ideas are being amplified by a billion-dollar company, because there are algorithms that amplify that.
01:59:52.000And what I'm saying is that instead of actually talking about that, instead of addressing that problem, those companies are trying to distract us from that discussion by saying... Would the correct way to handle it...
02:00:23.000Would it be to make algorithms illegal in that respect?
02:00:26.000Like to not be able to amplify or detract?
02:00:29.000To not be able to ban, shadow ban, or just to have whatever trends trend.
02:00:38.000Whatever people like, let them like it.
02:00:41.000And say, listen, this thing that you've done by creating an algorithm that encourages people to interact, encourages people to interact on Facebook, encourages people to spend more time on the computer, what you've done is you've kind of distorted what is valuable to people.
02:00:57.000You've changed it and guided it in a way that is ultimately, perhaps arguably, detrimental to society.
02:01:26.000It's complicated because, one, I have no faith in, like when you say ban or make it illegal or whatever, I have zero faith in the government being able to handle this.
02:01:37.000Yeah, nor do I. Every time I see a cookie warning on a website, I'm like, okay, these people are not good at this.
02:01:43.000This is what they've given us after all this time.
02:01:45.000These people are not going to solve this for us.
02:01:47.000And also, I think a lot of the dissatisfaction that people feel, and the discomfort that people feel, and the concern that people have, is a concern about power.
02:01:59.000That right now, these tech companies have a lot of power.
02:02:03.000And I think that the concern that is coming from government is the concern for their power.
02:02:10.000The right has made such a big deal about deplatforming.
02:02:15.000And I think it's because they're trying to put these companies on notice.
02:02:55.000Okay, I guess maybe, let me just reframe this to say that, like, I think it's interesting that we've hit an inflection point, right?
02:03:03.000Where, like, the era of utopianism with regards to technology is over.
02:03:43.000People more and more understanding how it is that these systems function.
02:03:47.000I think we increasingly see that people understand that this is really about power, it's about authority, and that we should be trying to build things that limit the power that people have.
02:03:58.000If you had your wish, if you could let these social media platforms, whether it's video platforms like YouTube or Facebook or Twitter, if you had the call,
02:04:13.000if they called you up and said, Moxie, we're going to let you make the call.
02:04:39.000But I think the way that messaging apps are going, there's a trajectory where a project like Signal becomes more of a social experience.
02:04:49.000And that, like, the things that we're building extend beyond just, like, you know, sending messages.
02:04:54.000Particularly, I think, as more and more communication moves into group chats and things like that.
02:04:59.000And, you know, the foundation that we're building it on is a foundation where we know nothing.
02:05:03.000You know, it's like, if I looked up your Signal account record right now of, like, all the information that we had about you on Signal, there are only two pieces of information.
02:05:10.000The date that you created the account and the date that you last used Signal.
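As a rough illustration of how sparse that is, here's a minimal sketch of such a record; the field names and layout are invented for illustration, not Signal's actual schema:

```python
# Hypothetical sketch only -- field names are invented, not Signal's real schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class AccountRecord:
    created: date    # the date the account was created
    last_seen: date  # the date the account last connected

# On this model, this is the entirety of what the server could produce about a user.
record = AccountRecord(created=date(2016, 5, 1), last_seen=date(2020, 12, 8))
print(record)
```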
02:05:35.000Well, I think, you know, some of the stuff that we're working on now of just like moving away from phone numbers, you can have like, you know, a username so that you can like post that more publicly.
02:05:41.000And then, you know, we have groups, now you have group links.
02:05:43.000And then, you know, maybe we can do something with events.
02:05:45.000And we can, you know, that's like, we're sort of moving in the direction of like, an app that's good for communicating with connections you already have to an app that's also good for creating new connections.
02:05:57.000Do you think that social media would be better served with the algorithms that are in place, with the mechanisms for determining what's trending in place, and with their trust and safety or whatever content moderation policy they have now, or with it wide open?
02:06:36.000I think bad business models create bad technology, which has bad outcomes.
02:06:40.000You know, that's the problem we have today, right?
02:06:41.000So the problem is that there's a financial incentive for them to...
02:06:45.000That if we, you know, if you look at, like, the metrics, you know, that we talked about, like, you know, what Facebook cares about is just, like, time that you spent looking at the screen on Facebook, you know?
02:06:54.000Like, if we were to have metrics, if Signal were to have metrics, you know, our metrics would be, like, what we want is for you to use the app as little as possible, for you to actually have the app open as little as possible, but for the velocity of information to be as high as possible.
02:07:07.000So it's like you're getting maximum utility.
02:07:09.000You're spending as little time as possible looking at this thing while getting as much out of it as you can.
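To make that inversion concrete, a metric built along these lines might look like the sketch below; the function and its inputs are hypothetical, not anything Signal has said it measures:

```python
# Hypothetical "velocity of information" metric: reward messages moved,
# penalize screen time. Purely illustrative, not an actual Signal metric.
def information_velocity(messages_exchanged: int, seconds_in_app: float) -> float:
    """Messages moved per second of screen time; higher means more utility
    for less attention spent."""
    if seconds_in_app <= 0:
        raise ValueError("seconds_in_app must be positive")
    return messages_exchanged / seconds_in_app

# The same 40 messages score higher when they take 2 minutes of attention
# rather than 20: maximum utility, minimum time looking at the thing.
print(information_velocity(40, 120))   # ~0.33
print(information_velocity(40, 1200))  # ~0.03
```

Under a metric like this, the optimization pressure runs opposite to an engagement metric: the app gets better as it needs less of your attention.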
02:07:13.000How could that be engineered, do you think?
02:07:32.000We don't have to, you know, satisfy public markets.
02:07:36.000We also don't have to build a pyramid scheme where we have like, you know, 2 billion users so that we can monetize them to like, you know, a few hundred thousand advertisers so that we can, you know, like we don't have to do any of that.
02:07:46.000And so We have the freedom to pick the metrics that we think are the ways that we think technology should work, that we think will better serve all of us.
02:07:55.000So what would serve us better is if a bunch of wild hippies like yourself, who don't want to make any money at all, put together a social media app.
02:08:36.000Well, that would be great if they could figure out a way to develop some sort of social media platform that just operated on donations and could rival the ones that are operating on advertising revenue.
02:08:48.000Because I agree with you that that creates a giant problem.
02:08:53.000And that's what we're working on, slowly.
02:08:57.000So you just look at it in terms of bad business model equals bad outcome.
02:09:51.000It's like these people are not going to save us, man.
02:09:53.000You know, and it's like anything that they do will probably just make things worse.
02:09:55.000Do you think that it's a valid argument that conservatives have though?
02:09:58.000That they're being censored and that their voice is not being heard?
02:10:02.000I know what you said in terms of, you know, that if someone had something on YouTube that said that gay people are inhuman and should be abolished, you'd ban and delete that video.
02:10:18.000But I think there's other perspectives, like the Unity 2020 perspective, which is not in any way negative.
02:10:25.000Yeah, I mean, I don't know what happened with that, but I feel like what I... I think it could be a part of this thing of just like, well, we create this policy and we have these...
02:10:33.000You know, we define things this way, and then a lot of stuff just gets caught up in it.
02:10:36.000You know, where it's just like, now you're like taking down content about the Uyghurs because you wanted to do something else.
02:10:40.000You know, that if people would just be more honest about, like, there is not really an objectivity...
02:10:44.000And, you know, we're looking for these specific outcomes, and this is why I think, you know, maybe we would have better results.
02:12:30.000She and other people working there and...
02:12:35.000They were organizing for, one, trying to apply the protections and benefits that full-time workers there had to a lot of the temporary workers, like the people who work in security, the people who are working in the cafeteria, the people who are driving buses and stuff like that,
02:12:51.000who are living a lot more precariously.
02:12:54.000But also for creative control over how the technology that they're producing is used.
02:13:01.000So Google was involved in some military contracts that were pretty sketch.
02:13:07.000Like applying machine learning AI stuff to military technology.
02:13:11.000And then finally, there had been a lot of high-profile sexual harassment incidents at Google where the perpetrators of sexual harassment were usually paid large severances in order to leave.
02:13:33.000And they, like a lot of people walked out.
02:13:35.000I don't know what the numbers were, but a lot of people, they managed to organize internally and walked out.
02:13:39.000And I think stuff like that is encouraging because, you know, it's like we look at the hearings and it's like the people in Congress don't even know who's the right person to talk to.
02:13:51.000You know, it's like, you know, old people talking about... But isn't that another issue where you're going to have people who have an ideological perspective?
02:14:13.000And that may be opposed to people that have a different ideological perspective, but they're sort of disproportionately represented on the left in these social media corporations.
02:14:24.000When you get kids that come out of school, they have degrees in tech, or they're interested in tech, they tend to almost universally lean left.
02:14:36.000Like, when it comes to the technology, I don't think people are...
02:14:42.000I think what almost everyone can agree on is that the amount of money and resources that we're putting into surveillance, into ad tech, into these algorithms that are just about increasing engagement, that they're just not good for the world.
02:14:55.000And if you put a different CEO in charge...
02:14:57.000That person's just going to get fired.
02:14:59.000But if the entire company organizes together and says, no, this is what we want.
02:15:03.000This is how we want to allocate resources.
02:15:05.000This is how we want to create the world, then you can't fire all those people.
02:15:11.000So they'd have to get together and unionize and have a very distinct mandate, very clear that we want to go back to do no evil or whatever the fuck it used to be.
02:15:25.000Yeah, where they don't really have that as a big sign anymore.
02:15:29.000Do you think that would really have an impact, though?
02:15:31.000I mean, it seems like the amount of money, when you find out the amount of money that's being generated by Google and Facebook and YouTube, the numbers are so staggering that to shut that valve off, to like...
02:15:51.000It's almost like it had to have been engineered from the beginning, like what you're doing at Signal.
02:15:57.000Like someone had to look at it from the beginning and go, you know what, if we rely on advertiser revenue, we're going to have a real problem.
02:16:05.000And I think, but I think it's, yeah, exactly.
02:16:07.000I mean, you know, I think you're right.
02:16:09.000And there's, you know, part of the problem with just relying on tech workers to organize themselves is that they are shareholders of these companies.
02:16:17.000You know, they have a financial stake in their outcome.
02:16:19.000And so that influences the way that they think about things.
02:19:05.000Yeah, what we need to do is take you to a yoga class and you go to an organic food store and you talk to people about their rights and then...
02:19:35.000Like, there's this, like, in the history of people who are, like, doing...
02:19:39.000Like, building cryptography, stuff like that, there was this period of time where the thesis was basically, like, all right, what we're going to do is develop really powerful tools for ourselves, and then we're going to teach everyone to be like us, you know?
02:19:52.000And that didn't work because, you know, we didn't really anticipate the way that computers were going.
02:19:57.000So I try to be, like, as normal as possible.
02:20:00.000You know, I just, like, have, like, a normal setup.
02:20:48.000Do you feel like you have extra scrutiny on you because of the fact that you're involved in this messaging application that Glenn Greenwald and Edward Snowden and a bunch of other people that are seriously concerned with...
02:21:05.000Security and privacy that maybe people are upset at you?
02:21:11.000That you've created something that allows people to share encrypted messages?
02:21:27.000But in some ways, that means that there's less pressure on me because, you know, it's like if you're the creator of Facebook Messenger and your computer gets hacked, like, that's everyone's Facebook messages, you know, gone.
02:23:04.000They would stop you, and they would be like, hey, we just need you to type in your password here so that we can get through the full disk encryption.
02:23:11.000And they would be like, well, if you don't do that, we're going to take this, and we're going to send it to our lab, and they're going to get it anyway.
02:23:36.000No, but I'm saying they didn't say, hey, you were...
02:23:40.000You're thought to have done this or there's some...
02:23:45.000No, they would always just be like, oh no, this is just random or whatever.
02:23:47.000But there would be two people at the exit of the plane with photographs of me, you know, waiting for me to step off the plane, and they would escort me.
02:23:53.000They wouldn't even wait for me to get to the...
02:23:55.000So did you have to have like a burner laptop?
02:23:58.000I just wouldn't travel with electronics, you know, because it was just...
02:24:06.000That was only internationally, though, because they can't do that domestically.
02:24:10.000So domestically, you just had long waits, and then they'd eventually give you a ticket?
02:24:16.000Yeah, they would eventually give you a ticket, and then you'd get the selective screening where they would take all the stuff out of your bag and, like, you know, go through your carry-on.
02:24:24.000And then at every connection, the TSA would come to the gate of the connecting thing, even though you're already behind security, and do it again at the connection.
02:24:57.000Yeah, I was thinking, actually, I was thinking on the way here, it's funny how, like, I remember after the last election, everyone was talking about, like, California leaving the United States.
02:26:09.000Like, a campaign that already existed, and everyone sort of got behind it, and he was just like, oh shit. And he lives in Russia now, you know. But he, like, didn't really understand optics, I think. The way that everyone found out that he lived in Russia was that he opened a California embassy in Moscow. So they announced, like, you know, Calexit has opened the first California embassy in a foreign country, but it was in Moscow, and this was right as all the Russian stuff was happening, you know. So
02:26:41.000if you're conspiratorially minded, you'd have drawn some incorrect conclusions.
02:26:51.000So what was your motivation to hang out with this guy for a whole day?
02:26:55.000I mean, I was just fascinated, you know, because here's this guy who's, like, doing this kind of ambitious thing, and it just, the optics seem so bad, you know?
02:27:00.000I think he reminded me of, like, the Hannah Arendt quote that's like, you know, if the essence of power is deceit, does that mean that the essence of impotence is truth?
02:28:06.000There's all these autonomous regions in the world that are essentially their own countries, you know, but they're not recognized by the UN or other countries, you know.
02:28:41.000So these countries all have their own soccer teams, but they can't play in FIFA because they're not recognized by the UN. So FIFA can't recognize them.
02:29:52.000I think it's, like, an interesting, you know, it's, like, in a way that I feel like, you know, society moves by, like, pushing at the edges, you know, that, like, it's the fringes that end up moving the center.
02:30:02.000I feel like, you know, looking at the margins of the way politics works is an interesting view of, like, how everything else works, you know. Like, going to Abkhazia, it was so crazy getting there, you know. It's like, you know, we traveled all through Russia.
02:30:17.000We get to this, like, militarized border.
02:30:19.000You go through these three checkpoints that aren't supposed to exist, but obviously exist.
02:30:23.000You know, you get to the other side, and it's just the same as where you just were.
02:30:28.000You know, these guys fought a brutal civil war, you know, with, like, genocide, like, full-on, you know, like, crazy shit.
02:30:41.000I feel like it's this thing you see again and again, that the institutions that we're familiar with in the world that exist are the institutions of kings.
02:30:52.000It's like police, military, a legal apparatus, tax collectors.
02:30:58.000Every moment in history since then has been about trying to change ownership of those institutions.
02:31:05.000And it's always sort of dissatisfying, you know?
02:31:08.000And, like, you know, just seeing that happen again and again.
02:31:11.000And just, like, you know, realizing that it's like maybe what we should be doing is actually trying to get rid of these institutions or change these institutions in some way, you know?
02:31:18.000Don't you think there's a very slow rate of progress, but ultimately progress?
02:31:24.000If you follow Pinker's work, it looks at all the various metrics like murder, rape, racism, crime, all these different things.
02:31:33.000Over time, we're clearly moving in a better direction.
02:31:41.000You know, I was listening to this podcast today.
02:31:45.000We were talking about religion, and it was discussing the Bible, and they were talking about all the different stories that are in the Bible, many of them that are hundreds of years apart, that were collected and put into that.
02:32:01.000Just stop and think about a book that was written literally before...
02:32:08.000The Constitution was drafted, and that book is being introduced today as gospel.
02:32:16.000And that there's a new book that's going to be written 200 years from now, and that will be attached to the new version of the Bible as well.
02:32:24.000And then one day someone will come across this, and it will all be interpreted as the will and the words of God that all came about.
02:32:48.000Alan Turing in 1952 being chemically castrated for being gay to, in my lifetime, seeing gay marriage go from something that was very fringe when I was a boy living in San Francisco to universal across the United States today,
02:33:06.000at least mostly accepted by the populace, right?
02:33:10.000That this is a very short amount of time where a big change has happened.
02:33:13.000And that these changes are coming quicker and quicker and quicker.
02:33:16.000I would hope that this is a trend that is moving in the correct direction.
02:33:21.000Yeah, certainly there are some things that are getting better, yeah.
02:33:24.000And I feel like, to me, it's important to, you know, for a lot of those things, like the things you mentioned, like gay marriage, I think it's important to realize that, like, a lot of those, a lot of that progress would not have happened without the ability to break the law, honestly.
02:33:41.000How would anyone have known that we wanted to allow same-sex marriage if no one had been able to have a same-sex relationship because sodomy laws had been perfectly enforced?
02:33:50.000How would we know that we want to legalize marijuana if no one had ever been able to consume marijuana?
02:33:56.000So I think a lot of the fear around increased surveillance is that that space dissipates.
02:34:08.000But, you know, on the other hand, you know, it's like we're living in the apocalypse, you know, that it's like if you took someone from 200 years ago who used to be able to just walk up to the Klamath River and dump a bucket in the water and pull out, you know, 12 salmon and that was, you know, their food.
02:34:22.000And you were like, oh, yeah, the way it works today is you go to Whole Foods and it's $20 a pound and it's, you know, pretty good.
02:34:27.000You know, they'd be like, what have you done?
02:35:55.000They've been working to try to make this mine a reality for, I think, a couple of decades now.
02:36:01.000And people have been fighting tirelessly to educate people on what a devastating impact this is going to have on the ecology of that area and the fact that the environment will be permanently devastated.
02:36:13.000There's no way of bringing this back and there's no way of doing this without destroying the environment.
02:36:18.000Because the specific style of mining that they have to employ in order to pull that copper and gold out of the ground involves going deep, deep into the earth to find these reservoirs of gold and copper, and there's sulfur they have to go through, and then they have to remove the waste.
02:36:35.000And mining companies have invested hundreds of millions of dollars in this and then abandoned it.
02:37:16.000It's pretty epic where he talks to this one guy who's dedicated the last 20 years of his life trying to fight this.
02:37:24.000Let me just find it real quick because it's pretty intense.
02:37:29.000And it's terrifying when you see how close it's come to actually being implemented and how if it happens, there's no way you pull that back.
02:38:51.000But these assholes that just want copper and gold are willing to do this.
02:38:54.000And there was this one politician in particular, or a lobbyist or whatever the fuck he is, that has a gigantic windfall coming if he can pull this off.
02:39:02.000But he stands to make, I think they said $14 million if he can actually get the shovels into the ground.
02:39:52.000I mean, we are getting along fine without that copper and without that gold, and we are using the resource of the salmon, and people are employed that are enjoying that resource, and they're also able to go there and see the bears eating the salmon and seeing this incredible wild place.
02:40:13.000Alaska is one of the few really, truly wild spots in this country.
02:40:21.000And if you get enough greedy assholes together, and they can figure out a way to make this a reality, and with the wrong people in positions of power, that's 100% possible.
02:40:33.000Yeah, you might even say we've organized the entire world economy to fuck that up.
02:40:39.000But I think the question of agency, of how we affect these processes, is tough.
02:40:46.000Well, I was joking, obviously, about killing that person, but recently one of the Iranian scientists was assassinated, and this brought up this gigantic ethical debate.
02:40:59.000And we don't know who did it, whether it was the Israelis. Mossad held a press conference to say, we didn't do it, while wearing t-shirts that said, we definitely did it.
02:41:07.000Assassinated Iranian nuclear scientists shot with remote-controlled machine gun.
02:42:37.000Like, if someone is actively trying to acquire nuclear weapons, and we think that those people are going to use those nuclear weapons, is it ethical to kill that person?
02:42:46.000And if that person's a scientist, they're not a...
02:42:50.000Yeah, I mean, I think the causality stuff is really hard to figure out, you know.
02:42:56.000But I think most of the time it's not about the one person, you know, that it's not, you know, maybe sometimes it is, but I think most, it's just like, I feel like assassination politics in the tech arena does not work, you know, that it's like you can get rid of all the people at the top of these companies and that's not what's going to do it,
02:43:12.000you know, that there are like these structural reasons why these things keep happening over and over again.
02:43:16.000Yeah, I think they're trying to slow it down, though, right?
02:43:58.000You know, you go down that road and, you know, weird things can happen too.
02:44:01.000A great example is, so one of the things that came out in a lot of the documents that Snowden released was that the NSA had worked with a standards body called NIST in order to produce a random number generator that was backdoored.
02:44:19.000So random numbers are very important in cryptography, and if you can predict what the random numbers are going to be, then you win.
02:44:27.000And so the NSA had produced this random number generator that allowed them to predict what the random numbers would be because they knew of this one constant that was in there.
02:44:39.000They knew a reciprocal value that you can't derive just by looking at it, but they knew it because they created it.
02:44:45.000And they had what they called a nobody but us backdoor.
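The underlying principle is easy to see in a toy model. The sketch below is not the generator at issue (widely reported to be Dual_EC_DRBG); it only shows why a random number generator with a predictable hidden state defeats any key built on top of it:

```python
# Toy model: a known seed stands in for the backdoor's known constant.
# If an attacker can reproduce your "random" stream, every key derived
# from it is theirs too. Never derive keys from a predictable PRNG.
import random

def derive_key(seed: int, nbytes: int = 16) -> bytes:
    # Deliberately insecure key derivation, for demonstration purposes only.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(nbytes))

victim_key = derive_key(seed=2007)    # victim trusts the generator
attacker_key = derive_key(seed=2007)  # attacker knows the hidden relationship

# "If you can predict what the random numbers are going to be, then you win."
assert victim_key == attacker_key
```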
02:45:05.000And so the idea was that, like, the NSA would have these capabilities, they had developed, you know, these vulnerabilities that they could exploit in situations like this, you know, that they could, like, take advantage of foreign powers and stuff like that in ways that wouldn't boomerang back at them.
02:45:20.000But what happened was, in, I think, the early 2010s, Juniper got hacked, and somebody secretly changed that one parameter.
02:45:33.000That basically re-keyed the back door to a different one that they knew the reciprocal value to.
02:45:41.000And it's most likely China or Russia that did this.
02:45:45.000And then what's kind of interesting is there was a big incident where the OPM, the Office of Personnel Management, I think, was compromised.
02:45:54.000And they have records on, you know, foreign intelligence assets and stuff like that.
02:45:59.000Their systems were compromised, it seems like, maybe by China.
02:46:03.000And what's sort of interesting is that they were running the Juniper networking gear that had been, you know, hacked in this one specific way.
02:46:12.000And so it's kind of possible that, like, you know, the NSA developed this backdoor that they were going to use for situations like this, you know, against foreign adversaries or whatever, and that the whole thing just boomeranged back at them, and the OPM was compromised as a result.
02:46:29.000But this is like, I don't know, I think it's, You know, it's easy to look at things like Stuxnet and stuff like that and just be like, yeah, this is harm reduction or whatever, you know, but like in the end, it can have real-world consequences.
02:46:43.000And this is also why people are so hesitant about, you know, like, the government is always like, well, why don't you develop a form of cryptography where it, like, works except for us, you know, weaken the encryption, you know.
02:48:59.000Is that a valid comparison to what they're doing in Silicon Valley?
02:49:03.000Like, Huawei did have routers that had third-party access, apparently, and it was shown that information was going to a third party when it was not supposed to, right?
02:49:18.000Well, okay, I think there's a couple...
02:49:21.000There have been incidents where it's like, yeah, there's data collection that's happening.
02:49:25.000Well, there's data collection happening in all Western products, too.
02:49:30.000And actually, the way the Western products are designed is really scary.
02:49:34.000In the telecommunications space, there's a legal requirement called CALEA, the Communications Assistance for Law Enforcement Act, that requires telecommunications equipment to have...
02:49:47.000To have eavesdropping, like, surveillance stuff built into it. Like, when you produce the hardware, in order to sell it in the United States, you have to have...
02:50:20.000And that data is end-to-end encrypted, so nobody can eavesdrop on those calls, including us.
02:50:27.000But so communication equipment that is produced in the United States has to have this so-called lawful intercept capability.
02:50:36.000But what's crazy about that is that's the same, you know, it's like these are U.S. companies and they're selling that all around the world.
02:50:41.000So that's the shit that gets shipped to UAE. Yeah.
02:50:44.000You know, so it's like it's the secondary effect thing of like the United States government was like, well, we're going to be responsible with this or whatever.
02:50:50.000We're going to have warrants or whatever.
02:50:52.000And then that same equipment gets shipped to tyrants and repressive regimes all over the place.
02:50:57.000And they just got a ready-made thing to just surveil everyone's phone calls.
02:51:01.000So it's like, I don't know, it's hard to indict Huawei for acting substantially different than the way, you know, the US industry acts.
02:51:13.000It's just, certainly they have a different political environment and, you know, they are much more willing to use that information to do really brutal stuff.
02:51:24.000Well, it wasn't just that they banned Huawei devices, but they also banned them from using Google.
02:51:30.000That's when I thought, like, wow, this is really...
02:52:16.000Oh, you want all this other stuff that we make that's not part of just the stock, free thing, like Play Services and all the Google stuff.
02:52:25.000Increasingly more and more of Android is just getting shoved into this proprietary bit.
02:52:29.000And they're like, okay, you want access to this?
02:52:31.000Then it's going to cost you in these ways.
02:52:34.000And I think it probably got to the point where Huawei was just like...
02:52:41.000We're not willing to pay, you know, either monetarily or through whatever compromise they would have to make, and they were just like, we're gonna do our own thing.
02:52:49.000I thought it was because of the State Department's boycott.
02:52:52.000Oh, it could have also been that there was a legal requirement that they stopped doing it, yeah.
02:52:59.000I think I might be right, but I'm not sure though.
02:53:02.000But it just made me think, like, I understand that there's a sort of connection that can't be broken between business and government in China, and that business and government are united.
02:53:13.000It's not like, you know, like Apple and the FBI, right?
02:53:27.000They just send it directly to the people.
02:53:30.000What we're terrified of is that these relationships that business and government have in this country are getting more and more tightly intertwined.
02:53:39.000And we look at a country like China that does have this sort of inexorable connection between business and government, and we're terrified that we're going to be like that someday.
02:53:59.000It was like, you know, that there are already these relationships, you know.
02:54:03.000You know, the NSA called it PRISM. And, you know, tech companies just called it, like, the consoles or whatever they had built for these, you know, for these requests.
02:54:21.000I think a lot of people, a lot of nations look at China and are envious, right?
02:54:24.000Where it's like, they've done this thing where they just, you know, they built like the Great Firewall of China, and that has served them in a lot of ways.
02:54:35.000You know, one, surveillance, obviously, like they have total control of everything that appears on the internet.
02:54:41.000So not just surveillance, but also content moderation, propaganda. But then also, it allows them to have their own internet economy.
02:54:52.000China is large enough that they can have their own ecosystem.
02:55:05.000And I think a lot of nations look at China and they're just like, huh, that was kind of smart.
02:55:08.000It's like you have your own ecosystem, your own infrastructure that you control, and you have the ability to do content moderation, and you have the ability to do surveillance.
02:55:16.000And so I think the fear is that there's going to be a balkanization of the internet where Russia will be next and then every country that has an economy large enough will go down the same road.
02:55:29.000There's a couple of things that happened that aren't exactly what you're saying, but seem to be directly related to this.
02:55:35.000Sweeping crackdown on facial recognition tech.
02:55:37.000House and Senate Democrats on Tuesday rolled out legislation to halt federal use of facial recognition software and require state and local authorities to pause any use of the technology in order to receive federal funding.
02:55:49.000The Facial Recognition and Biometric Technology Moratorium Act, introduced Thursday,
02:55:58.000marks one of the most ambitious crackdowns on facial recognition technology.
02:56:11.000I mean, I think this is connected to what you're saying, just in the sense that, like...
02:56:17.000You know, the people who are producing that facial recognition technology, it's not the government.
02:56:20.000It's, you know, Valenti or whoever sells services to the government.
02:56:23.000And then, you know, the government is then deploying this technology that they're getting from industry and in kind of crazy ways.
02:56:30.000Like, there's the story of the Black Lives Matter protester who, like, the police, like, NYPD, you know, not like the FBI, you know, NYPD, like, tracked him to his house using facial recognition technology.
02:57:47.000New York City Police Department uses facial recognition software to track down a Black Lives Matter activist accused of assault after allegedly shouting into a police officer's ear with a bullhorn.
03:00:13.000That the Kennedys called the Pupnicks.
03:00:16.000And the Pupnicks captivated the imagination of children across America.
03:00:22.000Because Jackie Kennedy said something, she was like, I don't know what we're going to do with the dogs, you know?
03:00:26.000And that ignited a spontaneous letter-writing campaign from children across America who all requested one of the puppies.
03:00:34.000Jackie Kennedy selected two children in America whose names were Mark Bruce and Karen House.
03:00:40.000And she delivered a puppy to each of these two people.
03:00:44.000One of them lived in Missouri, the other lived in Illinois.
03:00:49.000And I have sort of been obsessed with the idea that those puppies had puppies, and that those puppies had puppies, and that somewhere in the American Midwest today are the descendants of the original animals in space.
03:01:02.000The first animals to go to space and survive.
03:01:04.000They've probably been watered down so heavily.
03:01:24.000And so, yeah, I've been obsessed with the idea that these dogs could still be out there, and I've been trying to find the dogs.
03:01:32.000So I've been trying to track down these two people, notably Karen House, because she got the female dog.
03:01:37.000And I think she's still alive, and I think she lives in the Chicago area, but I can't get in touch with her because I'm not, I don't know, I'm not an investigative journalist.
03:01:45.000I, like, don't know how to do this or whatever.
03:01:46.000So, if anybody knows anything about the whereabouts of Karen House or the descendants of the Soviet space dogs, I'm very interested.
03:01:54.000My goal is just to meet one, you know?
03:01:56.000How should someone get in touch with you?