The Joe Rogan Experience - December 01, 2020


Joe Rogan Experience #1572 - Moxie Marlinspike


Episode Stats

Length

3 hours and 2 minutes

Words per Minute

177.8

Word Count

32,382

Sentence Count

2,571

Misogynist Sentences

25

Hate Speech Sentences

30


Summary

Signal is an encrypted messaging app that allows you to send and receive messages without being spied on by other people. It's designed to make it easier for you to communicate with your friends, family, and significant others without fear of government surveillance. In this episode, we talk about how Signal came about, why it's important, what it's trying to do to make the internet more private, and why we should all be using it. We also talk about privacy issues in general, and how we can all learn to be more courageous while living in a world where we're constantly being surveilled and tracked.


Transcript

00:00:15.000 We're gonna just sit here and talk for a long time, huh?
00:00:18.000 Yeah, we already started right now.
00:00:19.000 We already started.
00:00:20.000 It has begun.
00:00:21.000 Yes.
00:00:23.000 What was your question, though?
00:00:25.000 I was gonna ask, you know, like, what if something comes up, you know?
00:00:27.000 Like what?
00:00:28.000 You know, you need to, like, pee or something.
00:00:30.000 Oh, you can totally do that.
00:00:31.000 Yeah, we'll just pause and just run out and pee.
00:00:33.000 That happens.
00:00:34.000 Don't sweat it.
00:00:35.000 I want you to be comfortable.
00:00:36.000 Have you ever done a podcast before?
00:00:38.000 First time.
00:00:39.000 Really?
00:00:39.000 First time.
00:00:41.000 So tell me how, where Signal came from.
00:00:44.000 What was the impetus?
00:00:45.000 What was, how did it get started?
00:00:48.000 It's a long story.
00:00:49.000 Sorry, we got time.
00:00:50.000 We got plenty of time.
00:00:51.000 We got time.
00:00:53.000 Okay, well, you know, I think ultimately what we're trying to do with Signal is stop mass surveillance to bring some normality to the internet and to explore a different way of developing technology that might ultimately serve all of us better.
00:01:09.000 We should tell people, maybe people just tuning in, Signal is an app that is...
00:01:15.000 Explain how it works and what it does.
00:01:17.000 I use it.
00:01:18.000 It's a messaging app.
00:01:20.000 It's a messaging app, yeah.
00:01:21.000 Fundamentally, it's just a messaging app.
00:01:23.000 Yes.
00:01:24.000 Explain...
00:01:24.000 Lofty aspirations.
00:01:25.000 Yeah.
00:01:26.000 Yeah, it's a messaging app, but it's somewhat different from the way the rest of technology works because it is encrypted.
00:01:37.000 So...
00:01:41.000 Typically, if you want to send somebody a message, I think most people's expectation is that when they write a message and they press send, that the people who can see that message are the person who wrote the message and the intended recipient.
00:01:53.000 But that's not actually the case.
00:01:56.000 There's tons of people who are in between, who are monitoring these things, who are collecting data information.
00:02:02.000 And Signal's different because we've designed it so that we don't have access to that information.
00:02:08.000 So when you send an SMS, that is the least secure of all messages.
00:02:14.000 So if you have an Android phone and you use a standard messaging app and you send a message to one of your friends, that is the least of all when it comes to security, right?
00:02:26.000 Yeah, it's a low bar.
00:02:27.000 That's the low bar.
00:02:29.000 And then iPhone, what is this?
00:02:31.000 Signal.
00:02:31.000 Oh, there you go.
00:02:33.000 So iPhones use iMessage, which is slightly more secure, but it gets uploaded to the cloud, and it's a part of their iCloud service, so it goes to some servers and then goes to the other person.
00:02:49.000 It's encrypted along the way, but it's still, it can be intercepted.
00:02:55.000 Yeah, I mean, okay, so there's...
00:02:56.000 Like Jeff Bezos' situation.
00:02:59.000 Yeah, like Jeff Bezos' situation, exactly.
00:03:02.000 Fundamentally, there's two ways to think about security.
00:03:04.000 One is computer security, this idea that we'll somehow make computers secure.
00:03:09.000 We'll put information on the computers, and then we'll prevent other people from accessing those computers.
00:03:13.000 And that is a losing strategy that people have been losing for 30 years.
00:03:18.000 Information ends up on a computer somewhere, and it ends up compromised in the end.
00:03:22.000 The other way to think about security is information security, where you secure the information itself, that you don't have to worry about the security of the computers.
00:03:28.000 You could have some computers in the cloud somewhere, information's flowing through them, and people can compromise those things and it doesn't really matter because the information itself is encrypted.
00:03:38.000 And so, you know, things like SMS, you know, the iMessage cloud backups, most other messengers, Facebook Messenger, all that stuff, you know, they're relying on this computer security model, and that ends up disappointing people in the end.
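A minimal sketch of the information-security model described here, as opposed to trying to secure the servers themselves: the message is encrypted on the sender's device and the relay only ever handles ciphertext. This uses the PyNaCl library purely for illustration; it is not Signal's actual protocol, which layers on forward secrecy (the Double Ratchet), sealed sender, and more.

```python
# Illustrative end-to-end encryption with PyNaCl (pip install pynacl).
# Not Signal's protocol -- just the core idea: the relay sees ciphertext only.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at 6?")

# A compromised relay server only ever stores and forwards this ciphertext.
relay_storage = ciphertext

# Bob decrypts on his own device with his private key and Alice's public key.
plaintext = Box(bob, alice.public_key).decrypt(relay_storage)
print(plaintext)  # b'meet at 6?'
```

The point of the framing is that compromising relay_storage gains an attacker nothing without the keys held on the endpoints.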
00:03:56.000 Why did you guys create it?
00:03:59.000 What was unsatisfactory about the other options that were available?
00:04:04.000 Well, because the way the internet works today is insane.
00:04:09.000 Fundamentally, I feel like private communication is important because I think that change happens in private.
00:04:16.000 Everything that is fundamentally decent today started out as something that was a socially unacceptable idea at the time.
00:04:24.000 You look at things like abolition of slavery, legalization of marijuana, legalization of same-sex marriage, even constructing the Declaration of Independence.
00:04:35.000 Those are all things that required a space for people to process ideas outside the context of everyday life.
00:04:47.000 Those spaces don't exist on the internet today.
00:04:49.000 I think it's kind of crazy the way the internet works today.
00:04:53.000 If you imagined, you know, every moment that you were talking to somebody in real life, there was somebody there just with a clipboard, a stranger, taking notes about what you said.
00:05:04.000 That would change the character of your conversations.
00:05:08.000 And I think that in some ways, like, we're living through a shortage of brave or bold or courageous ideas, in part because people don't have the space to process what's happening in their lives outside of the context of everyday interactions,
00:05:25.000 you know?
00:05:26.000 That's a really good way to put it, because you've got to give people a chance to think things through.
00:05:32.000 But if you do that publicly, they're not going to.
00:05:35.000 They're going to sort of like basically what you see on Twitter.
00:05:40.000 If you stray from what is considered to be the acceptable norm or the current ideology or whatever opinions you're supposed to have on a certain subject, you get attacked, ruthlessly so.
00:05:56.000 So you see a lot of self-censorship, and you also see a lot of virtue signaling, where people sort of pretend that they espouse a certain series of ideas because that'll get them some social cred.
00:06:09.000 Yeah, exactly.
00:06:10.000 I think that communication in those environments is performative.
00:06:14.000 You're either performing for an angry mob, you're performing for advertisers, you're performing for the governments that are watching.
00:06:23.000 And I think also the ideas that make it through are kind of tainted as a result.
00:06:31.000 Did you watch any of the online hearing stuff that was happening over COVID? You know, where city councils and stuff were having their hearings online?
00:06:40.000 No, I did not.
00:06:41.000 It was kind of interesting to me because it's like, you know, they can't meet in person, so they're doing it online.
00:06:46.000 And that means that the public comment period was also online, you know?
00:06:50.000 And so it used to be that, like, you know, if you go to a city council meeting, they have a period of public comment where, you know, people can just stand up and say what they think, you know?
00:06:58.000 And, like, ordinarily, it's like, oh, you got to go to city hall, you got to, like, wait in line, you got to sit there, you know?
00:07:02.000 But then when it's on Zoom, it's just sort of like anyone can just show up on the Zoom thing.
00:07:06.000 You know, they just dial in and they're just like, here's what I think, you know?
00:07:09.000 And...
00:07:10.000 You know, it was kind of interesting because particularly when a lot of the police brutality still was happening in Los Angeles, I was watching the city council hearings and people were just like, you know, they were just calling, you know, like, fuck you!
00:07:24.000 I yield the rest of my time, fuck you!
00:07:26.000 You know, it was just like really brutal and not undeservedly so.
00:07:33.000 You know, what was interesting to me was just watching the politicians, basically, you know, who just had to sit there, and just, they were just like...
00:07:41.000 Take it!
00:07:42.000 And it was just like, you know, you get three minutes, and then there's someone else to get, you know, and they're just like, okay, and now we'll hear from, you know, like...
00:07:48.000 And, you know, watching that, you sort of realize that it's like, to be a politician, you have to just sort of fundamentally not really care what people think of you, you know?
00:08:01.000 You have to fundamentally just be comfortable sitting, you know, and having people yell at you, you know, in three minute increments for an hour or whatever, you know.
00:08:11.000 And so it seems like what we've sort of done is like bred these people who are willing to do that, you know.
00:08:16.000 And in some ways that's like a useful characteristic, but in other ways that's the characteristic of a psychopath, you know.
00:08:21.000 Yes.
00:08:22.000 Yes.
00:08:23.000 And I think what we're seeing is that that also extends outside of those environments.
00:08:27.000 To do anything ambitious today requires that you just are comfortable with that kind of feedback.
00:08:37.000 Like Trump's tweets.
00:08:42.000 Yeah.
00:08:59.000 No, but, and I'm, I think, you know, Trump is perfectly capable of just not caring.
00:09:03.000 You know, just like people, like, you know, Grayson is just like, yeah, whatever, you know, I'm the best, they don't, you know.
00:09:07.000 And, like, that's, you know, that's politics.
00:09:11.000 But I think, you know, the danger is when that, you know, to do anything ambitious, you know, outside of politics or whatever, you know, requires that you're capable of just not caring, you know, what people think or whatever, because everything is happening in public.
00:09:24.000 I think you made a really good point in that change comes from people discussing things privately because you have to be able to take a chance.
00:09:37.000 You have to be daring and you have to be able to confide in people and you have to be able to say, hey, this is not right and we're going to do something about it.
00:09:46.000 If you do that publicly, the powers that be that do not want change in any way, shape, or form, they'll come down on you.
00:09:54.000 This is essentially what Edward Snowden was warning everyone about when he decided to go public with all this NSA information.
00:10:02.000 We're saying, look, this is not what we signed up for.
00:10:06.000 Someone's constantly monitoring your emails, constantly listening to phone calls.
00:10:11.000 This is not this mass surveillance thing.
00:10:13.000 It's very bad for just the culture of free expression, just our ability to have ideas and to be able to share them back and forth and vet them out.
00:10:23.000 It's very bad.
00:10:25.000 Yeah.
00:10:25.000 I think when you look at the history of that kind of surveillance, there are a few interesting inflection points.
00:10:31.000 At the beginning of the internet as we know it, in the early to mid-90s, there were these DOD efforts to do mass surveillance.
00:10:43.000 They were sort of open about what they were doing.
00:10:47.000 One of them was this program called Total Information Awareness.
00:10:53.000 And they were trying to start this office, I think called the Total Awareness Office or something within the DoD.
00:10:58.000 And the idea was they're just going to collect information on all Americans and everyone's communication and just stockpile it into these databases and then they would use that to mine those things for information.
00:11:09.000 It was sort of like their effort to get in on this at the beginning of the information age.
00:11:15.000 And, you know, it was ridiculous.
00:11:17.000 You know, it's like they called it Total Information Awareness.
00:11:19.000 They had a logo that was like, you know, the pyramid with the eye on top of it.
00:11:24.000 Oh, yeah.
00:11:25.000 This is their logo.
00:11:26.000 Oh, God.
00:11:28.000 The pyramid with the eye, like, casting a beam on the earth.
00:11:31.000 That bit of Latin there means knowledge is power.
00:11:34.000 Oh, wow.
00:11:35.000 And interesting, this program was actually started by John Poindexter, of all people, who was involved in the Iran-Contra stuff, I think.
00:11:43.000 Really?
00:11:44.000 Yeah, yeah.
00:11:44.000 And he, like, went to jail for a second, then was pardoned or something.
00:11:49.000 So, anyway, you know, they're like...
00:11:51.000 It's just so fucked up that these people are in charge of anything.
00:11:53.000 I know, but what's also kind of comical is that they were like, this is what we're going to do.
00:11:58.000 Look at how crazy this is.
00:12:00.000 This is our plan.
00:12:01.000 And people were like, I don't think so.
00:12:04.000 What year was this?
00:12:06.000 This was like early, mid-90s.
00:12:08.000 Look at this.
00:12:09.000 Authentication, biometric data, face, fingerprints, gait.
00:12:15.000 Iris, your gait.
00:12:16.000 So they're going to identify people based on the way they walk?
00:12:20.000 I guess your gait is that specific?
00:12:22.000 Yeah.
00:12:23.000 Then automated virtual data repositories, privacy and security.
00:12:28.000 This is fascinating.
00:12:29.000 Because if you look at, I mean, obviously no one thought of cell phones back then.
00:12:33.000 Exactly, right.
00:12:34.000 So this is like kind of amateurish, right?
00:12:36.000 So it's like, they're like, this is what we're going to do, you know?
00:12:38.000 And people are like, I don't think so.
00:12:39.000 Even like Congress is like, guys, I don't think we can approve this.
00:12:43.000 You need a better logo, you know?
00:12:45.000 Yeah, for sure.
00:12:46.000 But it's just this whole flow chart.
00:12:50.000 Is that what this would be?
00:12:51.000 What do you call something like this?
00:12:53.000 What is it called?
00:12:56.000 Flowchart, I guess, sort of.
00:12:57.000 Designed to dazzle you.
00:12:59.000 Yeah.
00:13:00.000 It's like baffling to figure out what it is.
00:13:02.000 Like, first of all, what are all those little color tubes?
00:13:05.000 Those little ones?
00:13:06.000 Those little cylinders?
00:13:08.000 Those are data silos.
00:13:09.000 Oh.
00:13:09.000 That's the universal.
00:13:10.000 They're all different colors.
00:13:11.000 There's purple ones.
00:13:12.000 What's in the purple data?
00:13:13.000 Well, gait, maybe.
00:13:14.000 That's where gait lives, yeah.
00:13:16.000 It's all Prince's information.
00:13:18.000 Okay, so that, you know, this stuff all sort of got shut down, right?
00:13:21.000 Yeah.
00:13:22.000 They're like, okay, we can't do this, you know?
00:13:24.000 And then instead, what ended up happening was data naturally accumulated in different places.
00:13:33.000 Back then, what they were trying to do is be like, our proposal is that everyone carry a government-mandated tracking device at all times.
00:13:40.000 What do you guys think?
00:13:41.000 It'll make us safer.
00:13:42.000 And people were like, no, I don't think so.
00:13:43.000 But instead, everyone ended up just carrying cell phones at all times, which are tracking your location and reporting them into centralized repositories that government has access to.
00:13:52.000 And so, you know, this sort of like oblique surveillance infrastructure ended up emerging.
00:13:59.000 And that was what, you know, people sort of knew about, but, you know, didn't really know.
00:14:03.000 And that's what Snowden revealed.
00:14:06.000 It was like, you know, we don't have this.
00:14:08.000 Instead, it's like all of those things are happening naturally, you know.
00:14:12.000 You know, gait detection, fingerprint, you know, like all this stuff's happening naturally.
00:14:15.000 It's ending up in these places.
00:14:17.000 And then...
00:14:18.000 You know, governments are just going to those places and getting the information.
00:14:24.000 And then I think, you know, the next inflection point was really Cambridge Analytica.
00:14:29.000 You know, that was a moment where I think people were like...
00:14:32.000 Explain that to people, please.
00:14:34.000 Cambridge Analytica was a firm that was using big data in order to forecast and manipulate people's opinions.
00:14:51.000 In particular, they were involved in the 2016 election.
00:14:58.000 It was sort of, you know, so it's like, you know, what Snowden revealed was PRISM, which was the cooperation between the government and these places where data was naturally accumulating, like Facebook, Google, etc., you know, and the phone company.
00:15:12.000 And Cambridge Analytica, I think, was the moment that people were like, oh, there's like also sort of like a private version of PRISM, you know, that's like not just governments, but like the data is out there.
00:15:22.000 And other people who are motivated are using that against us, you know?
00:15:25.000 And so I think, you know, in the beginning it was sort of like, oh, this could be scary.
00:15:29.000 And then it was like, oh, but, you know, we're just using these services.
00:15:33.000 And then people were like, oh, wait, the government is, you know, using the data that we're, you know, sending to these services.
00:15:39.000 And then people were like, oh, wait, like anybody can use the data against us.
00:15:43.000 And they were like, oh, you know, it's like, I think things went from like, I don't really have anything to hide to like, wait a second, these people can...
00:15:49.000 Predict and influence how I'm going to vote based on what kind of jeans I buy?
00:15:54.000 And then sort of where we are today, where I think people are also beginning to realize that the companies themselves that are doing this kind of data collection are also not necessarily acting in our best interests.
00:16:07.000 Yeah, for sure.
00:16:09.000 There's also this weird thing that's happening with these companies that are gathering the data, whether it's Facebook or Google.
00:16:18.000 I don't think they ever set out to be what they are.
00:16:23.000 They started out, like Facebook, for example, we were talking about it before.
00:16:28.000 It was really just sort of like a social networking thing.
00:16:33.000 And this was in the early days.
00:16:35.000 It was a business.
00:16:36.000 I don't think anybody ever thought it was going to be something that influences world elections in a staggering way.
00:16:43.000 Especially in other parts of the world, where Facebook becomes the sort of de facto messaging app on your phone when you get it.
00:16:53.000 I mean, it has had massive...
00:16:55.000 Impact on politics, on shaping culture, on...
00:17:00.000 I mean, even genocide has been connected to Facebook in certain countries.
00:17:06.000 You know, it's weird that this thing that is in...
00:17:10.000 I don't know how many different languages does Facebook operate under?
00:17:16.000 All of them, yeah.
00:17:17.000 I mean, that this was just a social app...
00:17:22.000 It was from Harvard, right?
00:17:23.000 They were just connecting students together?
00:17:25.000 Wasn't that initially what the first iteration of it was?
00:17:29.000 Yeah.
00:17:29.000 Okay, I mean, I think you can say, like, no one anticipated that these things would be this significant.
00:17:36.000 But I also think that there's, you know, I think ultimately, like, what we end up seeing again and again is that, like, bad business models produce bad technology, you know?
00:17:45.000 That, like...
00:17:49.000 Mark Zuckerberg did not create Facebook because of his deep love of social interactions.
00:17:54.000 He did not have some deep sense of wanting to connect people and connect the world.
00:17:58.000 That's not his passion.
00:18:00.000 Jeff Bezos did not start Amazon because of his deep love of books.
00:18:05.000 These companies are oriented around profit.
00:18:09.000 They're trying to make money.
00:18:13.000 And they're subject to external demands as a result.
00:18:17.000 They have to grow infinitely, which is insane, but that's the expectation.
00:18:21.000 And so what we end up seeing is that the technology is not necessarily in our best interest because that's not what it was designed for to begin with.
00:18:31.000 That is insane that companies are expected to grow infinitely.
00:18:37.000 What is your expectation?
00:18:39.000 To take over everything.
00:18:40.000 To have all the money.
00:18:42.000 And then more.
00:18:44.000 Yeah, if we extrapolate, we anticipate we will have all the money.
00:18:47.000 There will be no other money.
00:18:50.000 If you keep going, that's what has to happen.
00:18:53.000 How can you just grow infinitely?
00:18:54.000 That's bizarre.
00:18:55.000 Yeah, and that's why, I mean, I think the Silicon Valley obsession with China is a big part of that, where people, they're just like, wow, that's a lot of people there.
00:19:03.000 Yes, that's a lot of people there.
00:19:05.000 You can just keep growing.
00:19:06.000 Yeah, there was a fantastic thing that I was reading this morning.
00:19:12.000 God, I wish I could remember what the source of it was.
00:19:15.000 But they were essentially talking about how strange it is that there are so many people that are...
00:19:29.000 We're good to go.
00:19:51.000 Oh, yeah.
00:19:52.000 You have eight grams of cobalt in your pocket over there.
00:19:55.000 Yeah.
00:19:56.000 Mined by actual child slaves.
00:19:58.000 Someone had to stick a – like, literally, they're getting it out of the ground, digging into the dirt to get it out of the ground.
00:20:05.000 We were talking about it on the podcast.
00:20:06.000 They were like, is there a way that this could – is there a future that you could foresee where you could buy a phone that is guilt-free?
00:20:17.000 If I buy a pair of shoes, like I bought a pair of boots from my friend Jocko's company.
00:20:24.000 He's got a company called Origin.
00:20:26.000 They make handmade boots.
00:20:27.000 And it's made in a factory in Maine.
00:20:30.000 You can see a tour of the factory.
00:20:31.000 These guys are stitching these things together, and it's a real quality boot.
00:20:35.000 And I'm like, I like that I could buy this.
00:20:37.000 I know where it came from.
00:20:38.000 I could see a video of the guys making it.
00:20:41.000 This is a thing that I could feel like...
00:20:43.000 I am giving them money.
00:20:45.000 They're giving me a product.
00:20:47.000 There's a nice exchange.
00:20:48.000 It feels good.
00:20:49.000 I don't feel like that with a phone.
00:20:51.000 With a phone, I have this bizarre disconnect.
00:20:53.000 I try to pretend that I'm not buying something that's made in a factory where there's a fucking net around it because so many people jump to their deaths that instead of trying to make things better, they say, we're going to put nets up, catch these fuckers, put them back to work.
00:21:09.000 Is it possible...
00:21:12.000 That we would all get together and say, hey, enough of this shit.
00:21:17.000 Will you make us a goddamn phone that doesn't make me feel like I'm supporting slavery?
00:21:22.000 Yeah, I mean, I think you're asking...
00:21:25.000 Too much?
00:21:28.000 I think you're asking...
00:21:30.000 I think that's the same as asking, will civilization ever decide that we collectively want to have a sane and sustainable way of living?
00:21:39.000 Yeah.
00:21:41.000 Sane and sustainable.
00:21:42.000 And I hope the answer is yes.
00:21:44.000 I think a lot of us do.
00:21:45.000 You do, right?
00:21:47.000 I do.
00:21:48.000 You don't want to buy a slave phone, right?
00:21:52.000 Yeah, I mean, but okay, so, you know, I feel like it's difficult to have this conversation without having a conversation about capitalism, right?
00:21:59.000 Because, like, ultimately, you know, what we're talking about is, like, externalities, that the prices of things don't incorporate their true cost, you know, that, like, you know, we're destroying the planet for plastic trinkets and reality television, you know, like...
00:22:12.000 We can have the full conversation if you like.
00:22:15.000 Let's start with phones, though.
00:22:18.000 Let's start with...
00:22:19.000 Because when most people know the actual...
00:22:24.000 From the origin of the materials, like how they're coming...
00:22:30.000 How they're getting out of the ground, how they're getting into your phone, how they're getting constructed, how they're getting manufactured and assembled by these poor people...
00:22:41.000 When most people hear about it, they don't like it.
00:22:44.000 It makes them very uncomfortable.
00:22:45.000 But they just sort of go, la la la.
00:22:48.000 They just plug their ears and keep going and buy the latest iPhone 12 because it's cool.
00:22:52.000 It's new.
00:22:53.000 What would they do instead?
00:22:54.000 Well, if there was an option.
00:22:57.000 So, like, if you have a car that you know is being made by slaves, or a car that's being made in Detroit by union workers, wouldn't you choose the car, as long as they're both of equal quality?
00:23:10.000 I think a lot of people would feel good about their choice.
00:23:14.000 If they could buy something that, well, no, these people are given a very good wage.
00:23:18.000 They have health insurance and they're taken care of.
00:23:21.000 They have a pension plan.
00:23:23.000 There's all these good things that we would like to have ourselves that these workers get.
00:23:28.000 So you should probably buy that car.
00:23:30.000 Why isn't there an option like that for a phone?
00:23:32.000 We looked at this thing called a Fairphone.
00:23:34.000 We're going over it.
00:23:35.000 Can't even fucking buy it in America.
00:23:37.000 Like, no, America has no options for the Fairphone.
00:23:41.000 They only have them in, like, Holland and a couple other European countries.
00:23:46.000 Yeah.
00:23:46.000 I mean, I think...
00:23:50.000 Yeah, maybe it's good to, you know, start with the question of phones.
00:23:52.000 I think if you really examined, like, most of the things in your everyday life, there is an apocalyptic aspect to them.
00:23:59.000 Yes.
00:24:00.000 I mean, you know, even agriculture, you know, it's just like, you know, the sugar you put in your coffee, you know, it's like, I've been to the sugar beet harvest, you know, it's apocalyptic, you know, it's like, you know, so I think there's just like an aspect of civilization that we don't usually see or think about.
00:24:21.000 Not non-conscious, but I mean conscious capitalism would be the idea that you want to make a profit, but you only want to make a profit if everything works.
00:24:32.000 Like the idea of me buying my shoes from origin.
00:24:36.000 Like knowing, okay, these are the guys that make it.
00:24:38.000 This is how they make it.
00:24:39.000 This makes me feel good.
00:24:41.000 I like this.
00:24:42.000 If there was that with everything...
00:24:45.000 If you buy a home from a guy who you know built the home, this is the man.
00:24:52.000 This is the chief construction guy.
00:24:56.000 These are the carpenters.
00:24:57.000 This is the architect.
00:24:58.000 Oh, okay, I get it.
00:25:00.000 This all makes sense.
00:25:01.000 Yeah, I mean, and I think that's the image that a lot of companies try to project.
00:25:06.000 You know what I mean?
00:25:07.000 Like, you know, even Apple will say, you know, it's like designed by Apple in California.
00:25:13.000 Sure, designed.
00:25:15.000 And I think that's the same as like the architect and the builders that you know, you know, but those materials are coming from somewhere.
00:25:22.000 That's true.
00:25:22.000 The wood is coming from somewhere.
00:25:24.000 And it's not just wood.
00:25:27.000 There's petrochemicals.
00:25:28.000 That whole supply chain is apocalyptic and you're never going to meet all of those people.
00:25:34.000 And so I think, sure, they're...
00:25:38.000 I think it's difficult to be in that market, if you want to be in the market of conscious capitalism or whatever, because it's a market for lemons.
00:25:46.000 Because it's so easy to just put a green logo on whatever it is that you're creating, and no one will ever see the back of the supply chain.
00:25:56.000 That's a sad statement about humans.
00:26:02.000 You know, that we're...
00:26:03.000 That this is how...
00:26:04.000 I mean, this is how we always do things if you let us.
00:26:10.000 If you leave us alone.
00:26:12.000 If there's a way...
00:26:13.000 You know, I mean, privacy is so important when it comes to communication with individuals.
00:26:18.000 And this is why you created Signal.
00:26:20.000 But when you can sort of hide...
00:26:24.000 All the various elements that are involved in all these different processes, all these different things that we buy and use.
00:26:31.000 And then, as you said, they're apocalyptic, which is a great way of describing it.
00:26:36.000 If you're at the ground watching these kids pull coltan out of the ground in Africa, you'd probably feel really sick about your cell phone.
00:26:46.000 Yeah, but I don't think...
00:26:52.000 I think it's a little more complicated than to say that just like humans are terrible or whatever.
00:26:57.000 No, I don't think humans are terrible.
00:26:59.000 I think humans are great.
00:27:00.000 But I think if you put humans together and you give them this diffusion of responsibility that comes from a corporation and then you give them a mandate, you have to make as much money as possible every single year.
00:27:11.000 And then you have shareholders and you have all these different factors that will allow them to say, well, I just work for the company.
00:27:19.000 You know, it's not my call.
00:27:21.000 You know, I just, you know, you got the guy carving up a steak saying, listen, I'm so sorry that we have to use slaves, but look, Apple's worth $5 trillion.
00:27:29.000 We've done a great job for our shareholders.
00:27:31.000 Yeah, yeah, yeah.
00:27:32.000 At the end of the line, follow it all the way down to the beginning, and you literally have slaves.
00:27:37.000 Yeah, I fundamentally agree, and I think that that's, you know, that's...
00:27:45.000 Anytime you end up in a situation where, like, most people do not have the agency that they would need in order to direct their life the way that they would want, you know, direct their life so that we're living in a sane and sustainable way,
00:28:01.000 that, yeah, I think is a problem.
00:28:04.000 And I think that's the situation we're in now, you know.
00:28:06.000 And honestly, I feel like, you know, the stuff that we were talking about before of, you know, people...
00:28:13.000 You know, sort of being mean online is a reflection of that.
00:28:17.000 You know, that's the only power that people have.
00:28:25.000 The only thing you can do is call someone a name, you're going to call them a name.
00:28:32.000 And I think that it's unfortunate, but I think it is also unfortunate that most people have so little agency and control over the way that the world works that that's all they have to do.
00:28:48.000 And I guess you would say also that the people that do have power, that are running these corporations, don't take into account what it would be like to be the person at the bottom of the line.
00:29:02.000 To be the person that is...
00:29:04.000 There's no discussion.
00:29:06.000 There's no board meetings.
00:29:08.000 Like, hey guys, what are we doing about slavery?
00:29:10.000 Well, no, I'm sure that they do talk about that, honestly.
00:29:15.000 But they've done nothing.
00:29:17.000 They've probably done what they think is something.
00:29:22.000 Even the CEO of a company is someone who's just doing their job at the end of the day.
00:29:27.000 They don't have ultimate control and agency over how it is that a company performs because they are accountable to their shareholders, they're accountable to the board.
00:29:36.000 I think there is a tendency for people to look at what's happening, particularly with technology today, And think that it's the fault of the people, the leaders of these companies.
00:29:53.000 I think it goes both ways.
00:29:55.000 Slavoj Žižek always talks about when you look at the old political speeches, if you look at the fascist leaders, they would give a speech and when there was a moment of applause, they would just sort of stand there and accept the applause because in their ideology,
00:30:11.000 they were responsible for the thing that people were applauding.
00:30:16.000 And if you watch the old communist leaders, like when Stalin would give a speech and he would say something and there would be a moment of applause, he would also applaud.
00:30:24.000 Because in their ideology of historical materialism, they were just agents of history.
00:30:29.000 They were just the tools of the inevitable.
00:30:31.000 It wasn't them.
00:30:33.000 You know, they had just sort of been chosen as the agents of this thing that was an inevitable process.
00:30:37.000 And so they were applauding history, you know.
00:30:39.000 Sometimes when I see the CEOs of tech companies give speeches and people applaud, I feel like they should also be applauding.
00:30:46.000 That it's not them.
00:30:50.000 Technology has its own agency, its own force that they're the tools of, in a way.
00:30:58.000 That's a very interesting way of looking at it.
00:31:01.000 Yeah, they are the tools of it.
00:31:03.000 And at this point, if we look at where we are in 2020, it seems inevitable.
00:31:09.000 It seems like there's just this unstoppable amount of momentum behind innovation and behind just the process of creating newer, better technology and constantly putting it out and then dealing with the demand for that newer,
00:31:24.000 better technology and then competing with all the other people that are also putting out newer, better technology.
00:31:33.000 Look what we're doing.
00:31:35.000 We are helping the demise of human beings.
00:31:38.000 Because I feel, and I've said this multiple times and I'm going to say it again, I think that we are the electronic caterpillar that will give way to the butterfly.
00:31:50.000 We don't know what we're doing.
00:31:52.000 We are putting together something that's going to take over.
00:31:56.000 We're putting together some ultimate being, some symbiotic connection between humans and technology, or literally an artificial version of life, not even artificial, a version of life constructed with silicon and wires and things that we're making.
00:32:14.000 If we keep going the way we're going, we're going to come up with a technology that I think we're a ways away.
00:32:31.000 Yeah, we're a ways away, but how many ways?
00:32:33.000 50 years?
00:32:34.000 The moment that I can put my hand under the automatic sink thing and have the soap come out without waving around, then I'll be worried.
00:32:45.000 That's simplistic, sir.
00:32:47.000 How dare you?
00:32:48.000 Here's a good example.
00:32:49.000 The Turing test is if someone sat down with, like in Ex Machina, remember, it was one of my all-time favorite movies, where the coder is brought in to talk to the woman, and he falls in love with the robot lady, and she passes the Turing test,
00:33:06.000 because he's in love with her.
00:33:08.000 I mean, he really can't differentiate, in his mind, that is a woman, that's not a robot.
00:33:15.000 Was it Alan Turing?
00:33:17.000 What was the gentleman's name?
00:33:19.000 Alan Turing.
00:33:19.000 Alan Turing, that came up with the Turing test.
00:33:22.000 You know, he was a gay man in England in the 1950s when it was illegal to be gay.
00:33:29.000 And they chemically castrated him because of that.
00:33:33.000 And he wound up killing himself.
00:33:35.000 That's only 70 years ago.
00:33:38.000 Oh yeah, yeah.
00:33:39.000 It's fucking insane.
00:33:41.000 I mean, just think that this man back then was thinking there's going to be a time where we will have some kind of a creation where we imitate life, the current life that we're aware of,
00:33:59.000 where we're going to make a version of it that's going to be indistinguishable from the versions that are biological.
00:34:05.000 That very guy, by whatever twisted ideas of what human beings should or shouldn't do, whatever expectations of culture at the time, is forced to be chemically castrated and winds up committing suicide.
00:34:20.000 Just by the hand of humans.
00:34:23.000 Fucking strange, man.
00:34:25.000 Like, really strange.
00:34:27.000 I mean...
00:34:29.000 Worse than strange.
00:34:31.000 Oh, yes.
00:34:32.000 Horrible.
00:34:33.000 But I mean, so bizarre that this is the guy that comes up with the test of how do we know when something is...
00:34:43.000 When it passes, when you have an artificial person that passes for a person, and then what kind of rights do we give this person?
00:34:51.000 What is this?
00:34:52.000 What is it?
00:34:54.000 If it has emotions, what if it cries?
00:34:56.000 Are you allowed to kick it?
00:34:58.000 You know, like, what do you do?
00:35:00.000 Like, that's—but I made it.
00:35:01.000 I turned it on.
00:35:02.000 I could fucking torture it.
00:35:03.000 But you can't.
00:35:04.000 It's screaming.
00:35:05.000 It's in agony.
00:35:06.000 Don't do that.
00:35:07.000 Yeah.
00:35:08.000 I mean, you know, I don't think about this stuff that often, but it is, you know, it's an empirical test, right?
00:35:13.000 So it's like, it's a way to avoid having to define what consciousness is, right?
00:35:18.000 Which is kind of strange.
00:35:19.000 We're conscious beings and we don't actually really even know what that means.
00:35:22.000 Right.
00:35:23.000 And so instead we have this empirical test where it's just sort of like, well, if you can't tell the difference without being able to see it, then we'll just call that.
00:35:33.000 I think that is really a lot closer than we think.
00:35:37.000 I think that's 50 years.
00:35:40.000 I think that if everything goes well, I think I'm going to be a 103-year-old man on my dying bed being taken care of by robots.
00:35:48.000 And I'm going to feel real fucked up about that.
00:35:51.000 I'm going to be like, oh my god.
00:35:53.000 I can't believe this.
00:35:54.000 I'm gonna leave and then all the people that I knew that are alive, they're the last of the people.
00:35:59.000 This is it.
00:35:59.000 The robots are gonna take over.
00:36:01.000 They're not even gonna be robots.
00:36:02.000 They're gonna come up with some cool name for them.
00:36:05.000 Yeah, I mean, I think that there's a lot of, most of what I see in like the artificial intelligence world right now is not really intelligence, you know, it's, it's just matching, you know, it's like you show a model, an image of 10 million cats, and then you can show it an image,
00:36:21.000 and it will be like, I predict that this is a cat.
00:36:24.000 And then you can show it an image of a truck, and it'll be like, I predict that this is not a cat.
00:36:30.000 I think there's one way of looking at it that's like, well, you just do that with enough things enough times, and that's what intelligence is.
00:36:36.000 But I kind of hope not.
00:36:40.000 The way that it's being approached right now, I think, is also dangerous in a lot of ways, because what we're doing is just feeding information about the world into these models, and that just encodes the existing biases and problems with the world into the things that we're creating.
00:36:57.000 That, I think, has negative results.
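A toy version of the pattern matching described above: a model fit on labelled examples that reports which side of a learned boundary a new example falls on, with no understanding involved. This is an illustrative scikit-learn sketch on made-up four-number "images", not how production vision systems are built, but the bias point carries over: the model can only reproduce whatever patterns are in its training data.

```python
# Toy "cat vs. not-cat" classifier: pure pattern matching over labelled data.
# Illustrative only; real systems use deep networks over millions of images.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Fake 4-pixel "images": cat-like examples cluster high, truck-like cluster low.
cats = rng.normal(loc=0.8, scale=0.1, size=(100, 4))
trucks = rng.normal(loc=0.2, scale=0.1, size=(100, 4))
X = np.vstack([cats, trucks])
y = np.array([1] * 100 + [0] * 100)  # 1 = cat, 0 = not cat

model = LogisticRegression().fit(X, y)

new_image = rng.normal(loc=0.8, scale=0.1, size=(1, 4))
print("predict cat:", bool(model.predict(new_image)[0]))
print("confidence:", round(model.predict_proba(new_image)[0, 1], 3))
# Whatever skew exists in the training data (here: whatever we chose to
# label as "cat") is exactly what the model will reproduce.
```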
00:37:00.000 But it's true.
00:37:02.000 This ecosystem is moving and it's advancing.
00:37:04.000 The thing that I think is unfortunate is that right now, that ecosystem, this really capital-driven investment startup ecosystem, has a monopoly on groups of young people trying to do something ambitious together in the world.
00:37:23.000 In the same way that I think it's unfortunate that grad school has a monopoly on groups of people learning things together.
00:37:31.000 Part of what we're trying to do different with Signal is it's a non-profit because we want to be for something other than profit.
00:37:39.000 We're trying to explore a different way of groups of people doing something mildly ambitious.
00:37:44.000 Has anyone come along and go, I know it's a non-profit, but would you like to sell?
00:37:50.000 Well, you can't do that.
00:37:53.000 There's nothing to sell.
00:37:55.000 It's kind of amazing, though, that you guys have figured out a way to create, like, basically a better version of iMessage that you could use on Android.
00:38:04.000 Because one of the big complaints about Android is the lack of any encrypted messaging services.
00:38:10.000 Or just good messaging services.
00:38:11.000 Yeah, they've just recently come out with their own version of iMessage, but it kind of sucks.
00:38:16.000 You can't do group chats.
00:38:18.000 There's a lot of things you can't do with it, and it's encrypted.
00:38:23.000 I don't think it's rolled out everywhere, too, right?
00:38:26.000 It's not everywhere.
00:38:27.000 I don't think it's rolled out at all, actually.
00:38:29.000 Oh, you could get a beta?
00:38:30.000 Is that what it is?
00:38:31.000 Yeah, I don't know what the...
00:38:32.000 Right, so it's like, you know, Android...
00:38:36.000 So Google, for Android, makes an app called Messages, which is just the standard SMS texting app.
00:38:41.000 And they put that on the phones that they make, like the Pixel and stuff like that, you know.
00:38:46.000 And then there's the rest of the ecosystem.
00:38:49.000 You know, there's like, you know, Samsung devices, Huawei devices, you know, all this stuff.
00:38:52.000 And it's sort of...
00:38:54.000 It depends, you know, what's on those things.
00:38:57.000 And...
00:38:57.000 So, they've been trying to move from this very old standard called SMS that you mentioned before to this newer thing called RCS, which actually I don't know what that stands for.
00:39:08.000 I think in my mind I always think of it as standing for too little too late.
00:39:11.000 But they're trying to move to that.
00:39:17.000 So they're doing that on the part of the ecosystem that they control, which is the devices that they make and sell.
00:39:23.000 And they're trying to get other people on board as well.
00:39:27.000 Originally, RCS didn't have any facility for end-to-end encryption.
00:39:31.000 And they're actually using our stuff, the Signal Protocol, in the new version of RCS that they're shipping.
00:39:38.000 So I think they've announced that, but I don't know if it's on or not.
00:39:42.000 I have two bones to pick with you guys.
00:39:44.000 Two things that I don't necessarily like.
00:39:46.000 One, when I downloaded Signal and I joined, basically everyone that I'm friends with who was also on Signal got a message that I'm on Signal.
00:39:57.000 So you ratted me out.
00:39:59.000 You ratted me out to all these people that are in my contact list.
00:40:02.000 Why do you want it to be difficult for people to communicate with you privately?
00:40:05.000 Well, me personally, because there's a lot of people that have my phone number that I wish didn't have my phone number.
00:40:10.000 And now all of a sudden they got a message from me that I'm on Signal.
00:40:13.000 And then they send me a message.
00:40:15.000 Hey, I'd like this from you.
00:40:16.000 I want you to do that for me.
00:40:17.000 How about call me about this?
00:40:19.000 I got a project.
00:40:21.000 So I just wish you didn't rat me out.
00:40:23.000 I wish there was a way that you could say, do you want Everyone to know that you just joined Signal.
00:40:29.000 Yes or no?
00:40:30.000 I'd say no!
00:40:32.000 Another one.
00:40:32.000 Those little dot dot dots, the ellipsis.
00:40:34.000 Yeah.
00:40:35.000 Can you shut that off?
00:40:36.000 Because I don't want anybody to know that I'm responding to a text.
00:40:39.000 You can turn it off.
00:40:39.000 Can you turn that off?
00:40:40.000 Oh, okay.
00:40:41.000 So it's in the settings?
00:40:42.000 Yeah, privacy settings.
00:40:43.000 Typing indicators, you can turn it off.
00:40:44.000 Read receipts, you can turn it off.
00:40:46.000 That's a big problem with iMessage.
00:40:47.000 People get mad at you.
00:40:49.000 They see the dot, dot, dots, and then there's no message.
00:40:52.000 Like, hey, you were going to respond, and then you didn't.
00:40:55.000 Why don't you just relax?
00:40:57.000 Just go about your life and pretend that I didn't text you back yet.
00:41:00.000 Because I will.
00:41:02.000 But it's not like the dot, dot, dots.
00:41:04.000 People are like, oh, it's coming.
00:41:05.000 Here comes the message.
00:41:06.000 And then there's no message!
00:41:10.000 Yeah, you can turn that off.
00:41:10.000 You can also turn off read receipts so people don't even know if you've read their message.
00:41:14.000 Yes, that's good, too.
00:41:15.000 Yeah.
00:41:16.000 My friend Sagar has it set up so that if he texts you, you have 30 minutes, bitch, and then they all disappear.
00:41:22.000 All the messages disappear.
00:41:24.000 Oh, oh, they disappear.
00:41:25.000 Yeah, yeah, yeah.
00:41:25.000 That's kind of a sweet move.
00:41:27.000 I like that.
00:41:28.000 With the discovery question of you don't want people to know that you're on Signal, it's kind of... So, we're working on it, but it's a more difficult problem than you might imagine because you want some people to know that you're on...
00:41:44.000 I'll text them!
00:41:46.000 So you want nobody to know?
00:41:48.000 Well, me personally, I have a unique set of problems that comes with anything that I do, like with messaging and stuff.
00:41:58.000 I've changed my number once a year, and I have multiple phone numbers.
00:42:04.000 I got a lot of problems.
00:42:05.000 But this is a unique problem with me.
00:42:08.000 All of a sudden, I'm like, how the fuck does he know?
00:42:11.000 And then I had to ask someone.
00:42:13.000 They go, oh no, when you sign up, it sends everybody on your contact list that's on Signal a message that says you're on Signal.
00:42:20.000 I'm like, oh!
00:42:21.000 Well, we don't send that, actually.
00:42:24.000 I know you don't care, but we don't actually know who your contacts are.
00:42:28.000 Signal does, though.
00:42:29.000 The app does.
00:42:31.000 The app on your phone does, and it doesn't even send a message to those people.
00:42:34.000 It's just that those people know your phone number, and that app now knows that that phone number is on Signal.
00:42:42.000 Did you do that just to get more people to use Signal?
00:42:47.000 Why, when you sign up for Signal, does it send all the other people in your contact list on Signal a message?
00:42:54.000 A lot of people like it.
00:42:56.000 So a lot of people like knowing who they can communicate with.
00:42:59.000 And the other thing is we try to square the actual technology with the way that it appears to work to people.
00:43:06.000 So right now, with most technology, it seems like you send a message and the person who can see it is the person who received the message.
00:43:12.000 You sent the message to, you know, the intended recipient, you know?
00:43:14.000 And that's not how it actually works.
00:43:16.000 And so, like, a lot of what we're trying to do is actually just square the way the technology actually works with what it is that people perceive.
00:43:23.000 And so, like, fundamentally, right now, you know, Signal is based on phone numbers.
00:43:29.000 If you register with your phone number, like, people are going to know that they can contact you on Signal.
00:43:35.000 It's very difficult to make it so that they can't, you know, that, like, If we didn't do that, they could hit the compose button and see just that they could send you a message.
00:43:45.000 They would just see you in the list of contacts that they can send messages to.
00:43:48.000 And then if we didn't display that, they could just try and send you a message and see whether a message goes through.
00:43:54.000 It's always possible to detect whether it is that you're on Signal the way that things are currently designed.
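To make that discovery trade-off concrete, here is a deliberately simplified sketch of phone-number-keyed registration lookup. This is not Signal's actual mechanism (Signal uses a private contact discovery service designed so the server learns as little as possible about your address book), but it shows the constraint being described: once registration is keyed by phone number, anyone who already has your number can learn that you're reachable.

```python
# Simplified illustration of phone-number-keyed discovery.
# NOT Signal's real design -- purely to show why hiding "X is registered"
# from people who already know X's number is hard.
import hashlib

def digest(number: str) -> str:
    """Hash a normalized phone number before it leaves the device."""
    return hashlib.sha256(number.encode("utf-8")).hexdigest()

# What a discovery service might hold: digests of registered numbers.
registered = {digest("+15551230001"), digest("+15551230002")}

# Any client can check the numbers it already knows against the service.
my_contacts = ["+15551230001", "+15559998888"]
reachable = [n for n in my_contacts if digest(n) in registered]
print(reachable)  # ['+15551230001']
```

Even if the service returned nothing, a client could simply attempt to send a message and observe whether it goes through, which is the "compose button" point above.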
00:44:00.000 It's interesting also how it works so much differently with Android than it does with iMessage.
00:44:04.000 With Android, it'll also send an SMS. I noticed that I can use Signal as my main messaging app on Android.
00:44:13.000 And it'll send SMS or it'll send a Signal message.
00:44:16.000 It doesn't do that with iPhones.
00:44:19.000 Yeah, Apple doesn't let you.
00:44:21.000 Yeah, I found that pretty interesting.
00:44:23.000 Because I tried to send people messages.
00:44:24.000 I thought it would just send it as an SMS and it didn't.
00:44:26.000 We would if we could, but Apple doesn't allow it.
00:44:31.000 It doesn't allow it.
00:44:32.000 Interesting.
00:44:33.000 Because Apple's scared of you.
00:44:35.000 Say it!
00:44:36.000 Say it!
00:44:37.000 They're fucking scared!
00:44:38.000 No, I mean...
00:44:39.000 They should be.
00:44:40.000 Apple is...
00:44:41.000 It's a better version of what they've got.
00:44:42.000 How about that?
00:44:44.000 I agree, but yeah, I mean, they have a much more complicated answer, but maybe you can distill it down to them.
00:44:50.000 You guys need to just develop your own version of AirDrop, and then no one will need Apple ever again.
00:44:56.000 That's what's holding people back, like a universal AirDrop.
00:44:59.000 AirDrop keeps a lot of fucking people on Apple.
00:45:02.000 It's the best.
00:45:04.000 You make a video, like a long video, a couple minutes long, and you can just airdrop it to me.
00:45:09.000 Whereas if you text it to me, especially if I have an Android phone, oh, it becomes this disgusting version.
00:45:15.000 I'll downsample it.
00:45:16.000 It looks terrible.
00:45:17.000 Yeah, no, that's true.
00:45:19.000 That's true.
00:45:20.000 Yeah, photographs are not too bad.
00:45:22.000 I think it does down-sample photographs as well, but not too bad.
00:45:26.000 It's like, you could look at it, it looks like a good photograph.
00:45:30.000 But video is just god-awful.
00:45:32.000 It's embarrassing when someone sends you a video and you have it on an Android phone.
00:45:35.000 You're like, what the fuck did you send me?
00:45:37.000 This is terrible.
00:45:39.000 What did you take this with?
00:45:41.000 A flip phone from the 90s?
00:45:43.000 It's so bad.
00:45:44.000 But I mean, a lot of that is like, I think the reason why it is that way is kind of interesting to me, which is, you know, it's like these are protocol, you know, it's like when you're just using a normal SMS message on Android,
00:46:00.000 you know, that was like this...
00:46:03.000 agreement that phone carriers made with each other in like, you know...
00:46:08.000 2002?
00:46:09.000 No, before that.
00:46:10.000 Really?
00:46:10.000 96?
00:46:11.000 Yeah, exactly.
00:46:12.000 And then they've been unable to change the way that it works since then because you have to get everyone to agree.
00:46:19.000 Right.
00:46:19.000 And is Apple holding back some sort of a universal standard?
00:46:24.000 Because if they did have a universal standard, then everyone would have this option to use.
00:46:28.000 You could use a Samsung phone or a Google phone.
00:46:30.000 You could use anything, and everybody would be able to message you clearly, without a problem.
00:46:35.000 Like, one of the things that holds people back is if you switch from an iPhone to an Android phone, you lose all those iMessages.
00:46:41.000 Sure, sure, sure.
00:46:42.000 Yeah.
00:46:42.000 Yeah, they're probably doing that intentionally because they...
00:46:46.000 Fucking weasels.
00:46:48.000 Don't they have enough money?
00:46:50.000 Like, Jesus Christ.
00:46:51.000 There's never enough.
00:46:52.000 That's the problem.
00:46:53.000 That is the problem, right?
00:46:54.000 Yeah.
00:46:55.000 And I think, I mean, it's like, I think the thing that everyone's worried about right now with Apple is like, you know, Apple, you know what I said before of like bad business models produce bad technology.
00:47:06.000 You know, thus far, Apple's business model is much better than, you know, Google or Facebook or Amazon or, you know. Their business is predicated on selling phones, selling hardware.
00:47:20.000 And that means that they can think a little bit more thoughtfully about the way that their software works than other people.
00:47:27.000 And I think what people are concerned about is that that business model is going to change.
00:47:36.000 They're approaching an asymptote of how many phones they can sell.
00:47:39.000 And so now they're looking at software.
00:47:41.000 They're like, what if we had our own search engine?
00:47:43.000 What if we had our own thing?
00:47:46.000 And the moment that that starts to happen, then they're sort of moving in the direction of the rest of big tech.
00:47:53.000 Which, you know, who knows how they do it, but that's what I think people are concerned about.
00:47:58.000 They've done a better job at protecting your privacy, though, in terms of, like, particularly Apple Maps.
00:48:04.000 Like, their Map app is far superior in terms of sharing your information than, say, like, the Google Maps.
00:48:13.000 But the argument you could make is that Google Maps is a superior product because they share that information.
00:48:20.000 Google Maps is also Waze now, right?
00:48:23.000 They bought Waze, which is fantastic.
00:48:25.000 It lets you know where the cops are, there's an accident up ahead, all kinds of shit, right?
00:48:30.000 But Apple Maps is not that good.
00:48:34.000 I use it because I like the ethic behind it.
00:48:37.000 I like their idea behind it.
00:48:40.000 They delete all the information after you make...
00:48:43.000 If you go to a destination, it's not saving it, sending it to a server, and making sure it knows what was there and what wasn't there and how well you traveled and sharing information.
00:48:56.000 They're not doing that.
00:48:57.000 They're not sharing your information.
00:48:59.000 Right?
00:49:01.000 We don't know.
00:49:03.000 I'm sure that they have a policy.
00:49:05.000 I haven't read the policy, and maybe the policy says that.
00:49:10.000 Supposedly.
00:49:12.000 You're still in the world of trying to make computers secure.
00:49:18.000 There's probably data, the data is probably accumulating somewhere, and maybe people can compromise those places.
00:49:26.000 Yeah.
00:49:27.000 We don't know.
00:49:28.000 For sure, the intent behind the software that they have constructed, I think, has been much better than a lot of the other players in Big Tech.
00:49:36.000 I think the concern is just that as that software becomes a larger part of their bottom line, that that might change.
00:49:44.000 I wonder if they can figure out a way to have an I don't give a fuck phone or I care phone.
00:49:51.000 Like, you want to have an I don't give a fuck phone?
00:49:53.000 This phone is like, who knows what's making it?
00:49:56.000 But look, it's really good.
00:49:58.000 It's got a 100 megapixel camera and all this jazz.
00:50:01.000 And a 5,000 milliamp battery.
00:50:04.000 And then you've got an I care phone.
00:50:05.000 And the I care phone, it's like an iPhone X. But what's different about the I care phone?
00:50:13.000 The I care phone, you get a clear line of distinction.
00:50:18.000 You get a real clear path.
00:50:21.000 This is where we got our materials.
00:50:23.000 These are the people that are making it.
00:50:25.000 This is how much they're getting paid.
00:50:27.000 Everyone is unionized.
00:50:29.000 They're all getting healthcare.
00:50:31.000 This is...
00:50:31.000 They'll have 401k plans.
00:50:35.000 It costs a little bit more.
00:50:36.000 It's not as good.
00:50:37.000 If you truly encapsulated all of the social costs with producing that phone, I think it would cost more than a little bit more.
00:50:44.000 How much more do you think it would cost?
00:50:45.000 I think some astronomical number.
00:50:50.000 I'm sure Apple would prefer not to have child slaves mining cobalt for the batteries that are in their phone.
00:50:56.000 Is that a thing you can say when a company is worth as much as most countries?
00:51:01.000 They have so much cash.
00:51:03.000 Can you really say that they would rather not use slaves?
00:51:07.000 Can you imagine?
00:51:08.000 I don't want to go broke.
00:51:10.000 I only have $14 trillion.
00:51:13.000 What am I going to do?
00:51:14.000 What am I going to do?
00:51:15.000 I need slaves.
00:51:16.000 I need someone to dig the coltan out of the Congo.
00:51:20.000 What would I do if I was them?
00:51:22.000 Well, first of all, it could never be them.
00:51:26.000 It would never work.
00:51:27.000 But if I was, I would say, hey, why don't we open up a factory in America?
00:51:32.000 And why don't we...
00:51:34.000 But you've got to mine the cobalt, and the cobalt isn't in America.
00:51:36.000 Right.
00:51:36.000 Why don't we get all of our cobalt from recycled phones?
00:51:40.000 Is that possible?
00:51:42.000 Who's going to recycle them?
00:51:43.000 That's a good question.
00:51:44.000 I think that's what the Fairphone is trying to do, right?
00:51:47.000 Aren't they using all recycled materials?
00:51:49.000 No.
00:51:50.000 Yeah, I mean, I don't...
00:51:52.000 Any image I've seen of electronic recycling is equally apocalyptic.
00:51:55.000 You know, there's just piles of shit, like, in some place next to a lake in China where people are...
00:52:01.000 You're bumming me out, man.
00:52:02.000 How about we do...
00:52:03.000 But I think if you were the CEO of Apple and you were like, this is a priority, we're going to spend, you know, however many trillions of dollars it takes to do this...
00:52:11.000 Your shareholders go, hey, fuckface.
00:52:13.000 You're fired.
00:52:14.000 Out!
00:52:16.000 You would have to be the grand poobah of Apple.
00:52:20.000 You'd have to be the ultimate ruler.
00:52:22.000 But it's not like, even then, if you were just like, you know, I'm willing to take the hit, you know, I'm going to do, no one can oust me or whatever.
00:52:32.000 I'm the grand poobah, you know?
00:52:33.000 Then it's like your share price plummets, which means that your employee retention plummets because those people are also working for the equity.
00:52:41.000 Right.
00:52:42.000 Stock options.
00:52:43.000 And then they get poached away by these other companies.
00:52:46.000 Dirty companies come and steal your clean employees.
00:52:48.000 This is what Apple's website says now.
00:52:50.000 It says they're committed to one day sourcing 100%.
00:52:54.000 Look at this.
00:52:55.000 Completely recycled, every bit as advanced.
00:52:59.000 One day.
00:52:59.000 We're committed to one day sourcing.
00:53:01.000 One day.
00:53:02.000 We're planning on the year 30,000.
00:53:06.000 I mean, you know, it's like, I don't...
00:53:08.000 They're not like sitting around twirling their mustaches.
00:53:11.000 You know what I mean?
00:53:12.000 It's just like, everyone likes good things and not bad things.
00:53:15.000 Maybe they are.
00:53:15.000 Let me read that again, Jamie.
00:53:17.000 It says, 100% recyclable and renewable materials across all of our products and packaging because making doesn't have to mean taking from the planet.
00:53:26.000 Oh, come on.
00:53:28.000 You guys...
00:53:28.000 It's like Nike.
00:53:30.000 It's the same thing too, right?
00:53:31.000 They're all committed to Black Lives Matter and all these social justice causes and they're using slave labor too.
00:53:37.000 You know, aren't they?
00:53:38.000 In China, they're using slave labor to make Nikes.
00:53:40.000 Probably.
00:53:41.000 So go back to that thing.
00:53:44.000 What are they trying to do?
00:53:46.000 I remember seeing a robot they have that can take the pieces out of it at a much faster rate than human hands probably can.
00:53:55.000 Oh, okay.
00:53:56.000 So that's why I was trying to dig through here, but I found that.
00:53:58.000 Well, that would be good.
00:53:59.000 I think that's the robot.
00:54:00.000 That's the piece-taking robot?
00:54:03.000 Daisy, it's good.
00:54:03.000 This is Daisy.
00:54:04.000 Don't name her.
00:54:06.000 Name her, you've got a problem.
00:54:08.000 There you go, 2030.
00:54:09.000 Right?
00:54:10.000 Entirely clean energy, which isn't quite as...
00:54:12.000 It's, you know...
00:54:13.000 2030 means transitioning hundreds of our manufacturing suppliers to 100% renewable sources of electricity.
00:54:22.000 Well, that's interesting.
00:54:24.000 If they can actually do that, 100% renewable sources, if they can figure out a way to do that, and to have recyclable materials and have all renewable electricity,
00:54:40.000 whether it's wind or solar, if they could really figure out how to do that, I think that would be pretty amazing.
00:54:46.000 But who's going to put it together?
00:54:49.000 Are they going to still use slaves to put it together?
00:54:52.000 I mean, I guess the people that are working at Foxconn aren't technically slaves, but would you want your child to work there?
00:54:58.000 You know?
00:55:00.000 Yeah, I mean, I think you can say that about a lot of the aspects of our economy, though.
00:55:03.000 You know, who would willingly go into a coal mine?
00:55:07.000 Yes.
00:55:07.000 Right.
00:55:08.000 Yeah.
00:55:09.000 You know, there's some element of coercion to a lot of what keeps the world spinning.
00:55:14.000 Right.
00:55:14.000 And that's the, when you get into these insidious arguments about, or conversations about conspiracies, like conspiracies to keep people impoverished, they're like, well, why would you want to keep people impoverished?
00:55:26.000 Well, who's going to work in the coal mines?
00:55:29.000 You're not going to get wealthy, highly educated people to work in the coal mines.
00:55:32.000 You need someone to work in the coal mines.
00:55:34.000 So what do you do?
00:55:36.000 What you do is you don't help anybody get out of these situations.
00:55:40.000 So you'll always have the ability to draw from these impoverished communities, these poor people that live in Appalachia or wherever their coal miners are coming from.
00:55:51.000 There's not a whole lot of ways out.
00:55:54.000 Like, I have a friend who came from Kentucky, and he's like, the way he described it to me, he goes, man, you've never seen poverty like that.
00:56:02.000 Like, people don't want to concentrate on those people because it's not as glamorous as some other forms of poverty.
00:56:07.000 He goes, but those communities are so poor.
00:56:12.000 Yeah.
00:56:13.000 40 million Americans, right?
00:56:14.000 Yeah.
00:56:14.000 40 million Americans are living in poverty.
00:56:15.000 Yeah.
00:56:17.000 I mean, I don't know if that conspiracy is accurate, but that's the one that people always want to draw from, right?
00:56:22.000 They always want to...
00:56:23.000 I mean, I don't think you need a conspiracy.
00:56:25.000 You know, you just...
00:56:25.000 You have...
00:56:26.000 You have poor people.
00:56:28.000 Structural forces, you know, that are like...
00:56:29.000 Yeah.
00:56:30.000 Yeah.
00:56:31.000 Yeah.
00:56:33.000 That's...
00:56:35.000 That's why it's rare that a company comes along and has a business plan like Signal where they're like, we're going to be non-profit.
00:56:43.000 We're going to create something that we think is of extreme value to human beings, just to civilization in general, the ability to communicate anonymously or at least privately.
00:56:57.000 It's a very rare thing that you guys have done, that we decided to do this and to do it in a non-profit way.
00:57:04.000 What was the decision that led up to that?
00:57:08.000 How many people were involved?
00:57:12.000 Now there's 20-something people.
00:57:18.000 Do you think that's a lot or a little?
00:57:24.000 I think that's a little.
00:57:26.000 I think it's always interesting talking to people.
00:57:29.000 A lot of times I'll meet somebody and they're like, oh yeah, you're the person who did Signal or something.
00:57:34.000 I'm like, oh yeah, yeah, yeah.
00:57:35.000 They're like, okay, cool.
00:57:36.000 What are you doing now?
00:57:37.000 I'm like, oh, I'm still working on Signal.
00:57:40.000 They're like, oh, is there another Signal that you're going to do?
00:57:44.000 You're going to do Signal 2?
00:57:45.000 I think it's hard for people to understand that software is never finished.
00:57:50.000 There's this...
00:57:52.000 Which is something that I really envy about, like, the kind of creative work that someone like you does.
00:57:57.000 You know, that, like, I envy artists, musicians, writers, poets, painters, you know, people who can create something and be done.
00:58:06.000 You know, that, like, you can record an album today, and 20 years later, you can listen to that album, and it'll be just good, you know?
00:58:12.000 It's like, software's never finished.
00:58:14.000 And if you stop, it'll just, like, float away like dandelions.
00:58:20.000 What happens if you stop?
00:58:21.000 Because software is not...
00:58:23.000 It's very hard to explain this.
00:58:25.000 It doesn't exist in isolation.
00:58:28.000 It's a part of the ecosystem of all software.
00:58:31.000 And that ecosystem is moving, and it's moving really fast.
00:58:33.000 There's a lot of money behind it, a lot of energy in it.
00:58:36.000 And if you aren't moving with it, it will just...
00:58:40.000 Stop working.
00:58:41.000 And also, it's like, you know, a project like this is not just the software that runs on your phone, but the service of, like, you know, moving the messages around on the internet, and that requires a little bit of care and attention, and if you're not doing that, then it will dissipate.
00:58:56.000 And if you're doing something non-profit, the way you're doing it, how do you pay everybody?
00:59:00.000 Like, how does it work?
00:59:02.000 Yeah, well, okay, so, you know, the history of this was, um, I think before the internet really took over our lives in the way that it has, there were the kind of social spaces for people to experiment with different ideas outside of the context of their everyday lives,
00:59:21.000 you know, like art projects, punk rendezvous, experimental gatherings.
00:59:33.000 The embers of art movements.
00:59:35.000 These spaces existed and were things that I found myself in and a part of.
00:59:40.000 And they were important to me in my life.
00:59:41.000 You look like a dude who'd go to Burning Man.
00:59:44.000 Actually, I'm not a dude that goes to Burning Man.
00:59:48.000 Maybe you're missing it.
00:59:50.000 I've been once.
00:59:50.000 I went in 2000, I think.
00:59:54.000 Early adopter.
00:59:57.000 Well, it's funny because at the time that I went, people were like, oh man, it's not like it used to be.
01:00:02.000 And now people are like, have you been?
01:00:03.000 I was like, I went once in 2000. Like, wow, wow, that's when it was like the real deal.
01:00:06.000 I'm like, I don't think so.
01:00:09.000 It's one of those things where it's like, you know, there's like day one and then on day two, they're like, ah, it's not like day one.
01:00:13.000 Right, of course, of course.
01:00:16.000 But yeah, I don't know.
01:00:18.000 Those things, those spaces were important to me and like an important part of my life.
01:00:21.000 And as more of our life started to be taken over by technology, me and my friends felt like those spaces were missing online.
01:00:33.000 We wanted to demonstrate that it was possible to create spaces like that.
01:00:39.000 There had been a history of people thinking about cryptography in particular, which is kind of funny in hindsight.
01:00:55.000 The history of cryptography is actually not long, at least outside of the military.
01:01:02.000 It really starts in the 70s.
01:01:07.000 There were some really important things that happened then.
01:01:10.000 In the 80s, there was this person who was this lone maniac who was writing a bunch of papers about cryptography during a time when it wasn't actually that relevant because there was no internet.
01:01:20.000 The applications for these things were harder to imagine.
01:01:24.000 And then in the late 80s there was this guy, a retired engineer, who discovered the papers that this maniac, David Chaum, had been writing and was really...
01:01:37.000 Was he doing this in isolation or was he a part of a project or anything?
01:01:40.000 No, I think David Chaum was...
01:01:41.000 I think he's an academic.
01:01:43.000 I'm embarrassed that I don't know.
01:01:45.000 But he did a lot of the notable work on using the primitives that had already been developed.
01:01:55.000 And he had a lot of interesting ideas and...
01:01:57.000 There's this guy who was a retired engineer, his name was Tim May, who was kind of a weird character.
01:02:03.000 And he found these papers by David Chaum, and was really enchanted by what they could represent for a future.
01:02:11.000 And he wanted to write like a sci-fi novel that was sort of predicated on a world where cryptography existed and there was a future where the internet was developed.
01:02:19.000 And so he wrote some notes about this novel, and he titled the notes The Crypto Anarchy Manifesto.
01:02:26.000 And he published the notes online, and people got really into the notes.
01:02:31.000 And then he started a mailing list in the early 90s called the Cypherpunks mailing list.
01:02:37.000 And all these people started, you know, joined the mailing list and they started communicating about, you know, what the future was going to be like and how, you know, they needed to develop cryptography to live their, you know, crypto-anarchy future.
01:02:49.000 And at the time, it's strange to think about now, but cryptography was somewhat illegal.
01:02:55.000 It was regulated as a munition.
01:02:57.000 Really?
01:02:57.000 Yeah.
01:02:58.000 So if you wrote a little bit of crypto code and you sent it to your friend in Canada, that was the same as, like, shipping Stinger missiles across the border to Canada.
01:03:06.000 Wow!
01:03:07.000 So did people actually go to jail for cryptography?
01:03:10.000 There were some high-profile legal cases.
01:03:15.000 I don't know of any situations where people were tracked down as munitions dealers or whatever, but it really hampered what people were capable of doing.
01:03:24.000 So people got really creative.
01:03:25.000 There were some people who wrote some crypto software called Pretty Good Privacy, PGP. And they printed it in a book, like an MIT Press book, in a machine-readable font.
01:03:39.000 And then they're like, this is speech.
01:03:41.000 This is a book.
01:03:43.000 I have my First Amendment right to print this book and to distribute it.
01:03:46.000 And then they shipped the books to Canada and other countries and stuff, and then people in those places scanned it back in.
01:03:52.000 To computers.
01:03:54.000 And they were able to make the case that they were legally allowed to do this because of their First Amendment rights.
01:04:01.000 And other people moved to Anguilla and started writing code in Anguilla and shipping it around the world.
01:04:09.000 There were a lot of people who were fervently interested.
01:04:12.000 Why Anguilla?
01:04:13.000 Because it's close to the United States and there were no laws there about producing cryptography.
01:04:20.000 I think that was something people thought about.
01:04:22.000 They have like three cases of COVID there ever.
01:04:26.000 Oh, really?
01:04:26.000 Yeah, it's a really interesting place.
01:04:28.000 Yeah, I used to work down there.
01:04:30.000 Really?
01:04:31.000 Okay, International Traffic and Arms Regulation.
01:04:34.000 It's a United States regulatory regime to restrict and control the export of defense and military-related technologies to safeguard U.S. national security and further U.S. foreign policy objectives.
01:04:46.000 ITAR. Yeah, Anguilla was closed until like November.
01:04:51.000 They wouldn't let anybody in.
01:04:53.000 And yeah, if you want to go there, they have like, I was reading all these crazy restrictions.
01:04:57.000 You have to get COVID tested and you have to apply.
01:05:01.000 And then when you get there, they test you when you get there.
01:05:03.000 Because they have no deaths.
01:05:06.000 Yeah, yeah, yeah.
01:05:07.000 That's cool.
01:05:08.000 Yeah, I like Anguilla.
01:05:09.000 It's an interesting place.
01:05:11.000 Yeah, this is what I was reading.
01:05:12.000 They're inviting companies to come move here.
01:05:14.000 Like, come work here.
01:05:15.000 Oh, interesting.
01:05:16.000 Come, we'll test the shit out of you.
01:05:19.000 You can't go anywhere, but come here.
01:05:21.000 It's beautiful.
01:05:21.000 It is beautiful.
01:05:22.000 I used to work on boats down there.
01:05:23.000 Yeah?
01:05:24.000 What'd you do on boats?
01:05:26.000 I was like really...
01:05:28.000 I don't know.
01:05:29.000 I, for a while, was really into sailing and I had a commercial license and I was moving boats around and stuff.
01:05:37.000 My parents lived in a sailboat for a while.
01:05:38.000 Oh, really?
01:05:39.000 Yeah.
01:05:40.000 Yeah, they just decided to just check out.
01:05:43.000 And this was like...
01:05:46.000 I want to say early 2000s, somewhere around then.
01:05:49.000 I lived on a sailboat for a few years until my mom got tired of it.
01:05:52.000 They go around the world?
01:05:53.000 They were in the Bahamas.
01:05:56.000 They were all around that part of the world.
01:06:01.000 They were in California for a little while on their boat.
01:06:04.000 They just decided, let's just live on a boat for a while.
01:06:09.000 Yeah, it's pretty crazy.
01:06:11.000 I discovered sailing by accident where I was like...
01:06:15.000 Working on a project with a friend in the early 2000s, and we were looking on Craigslist for something unrelated, and we saw a boat that was for sale for $4,000.
01:06:22.000 And I thought a boat was like a million dollars or something.
01:06:25.000 I was just like, what?
01:06:25.000 The sailboats are $4,000?
01:06:26.000 And this is just some listing.
01:06:27.000 There's probably even cheaper boats, you know?
01:06:30.000 And so we got really into it, and we discovered that you can go to any marina in North America and get a boat for free.
01:06:36.000 You know, that like every marina has a lien sale dock on it where people have stopped paying their slip fees, and the boats are just derelict and abandoned, and they've, you know, put them on these stocks.
01:06:42.000 Really?
01:06:43.000 Yeah.
01:06:43.000 You get a boat for free?
01:06:45.000 Yeah.
01:06:45.000 They have an auction.
01:06:46.000 There's usually like a minimum bid of, you know...
01:06:50.000 50 bucks?
01:06:50.000 50 bucks or whatever, you know.
01:06:52.000 And most times it doesn't get bid on and they chop the boat up and throw it away.
01:06:57.000 Really?
01:06:57.000 And if you show up...
01:06:58.000 So a functional boat?
01:07:00.000 Oh, functional.
01:07:01.000 Oh, that's the problem, right?
01:07:03.000 You know...
01:07:03.000 You gotta maintain the shit out of boats.
01:07:05.000 Yeah, so, you know, if you put some work into it, though, you can get it going.
01:07:09.000 And so we started doing that.
01:07:11.000 We were, like, you know, getting boats, fixing them up, sailing them as far as we could.
01:07:14.000 And then eventually I got a commercial license and started sailing other people's boats.
01:07:19.000 Wow!
01:07:19.000 All this on a whim of, how much does a boat cost?
01:07:22.000 You can get a boat for four grand?
01:07:24.000 Holy shit!
01:07:25.000 Next thing you know, you're working on boats.
01:07:28.000 Yeah, yeah.
01:07:28.000 I mean, I was...
01:07:29.000 It's a really...
01:07:30.000 It's a whole world, you know?
01:07:31.000 It's just like, you know, finding that link on Craigslist was like, you know, opening a door to another reality, right?
01:07:38.000 Where it's like...
01:07:38.000 Yeah.
01:07:39.000 Because it's pretty amazing, you know, me and some friends used to sail around the Caribbean and...
01:07:46.000 You know, the feeling of, like, you know, you pull up an anchor, and then you sail, like, you know, 500 miles to some other country or whatever, and you get there, and you drop the anchor, and you're just like, we...
01:07:56.000 It was just the wind.
01:07:57.000 The wind that took, you know, like, there was no engine, there was no fuel.
01:08:02.000 It was just the wind, you know, and you catch fish, and, you know, it's just like...
01:08:05.000 If you want to go real old school, you've got to use one of them...
01:08:07.000 What are those fucking sextants?
01:08:09.000 Sextants, of course.
01:08:09.000 Do you use one of those?
01:08:10.000 No, you didn't!
01:08:11.000 Did you really?
01:08:13.000 Yeah.
01:08:13.000 I was like really into like, you know, no electronics, like it's just complicated, you know, they're expensive or whatever.
01:08:18.000 So we had a taffrail log.
01:08:21.000 It's like a little propeller on a string that you connect to a gauge.
01:08:26.000 And as it turns, the gauge keeps track of how far you've traveled.
01:08:31.000 What?
01:08:32.000 Yeah, so it's like...
01:08:33.000 A propeller on a string?
01:08:35.000 So it's just a thing that turns the string at a rate depending on how fast you're moving.
01:08:39.000 So it can gauge how much distance you've traveled.
01:08:43.000 So is the string marked?
01:08:46.000 No, no, no.
01:08:47.000 It's just a constant length.
01:08:49.000 It's always spinning, and it's always turning the gauge.
01:08:52.000 And then it reads a number?
01:08:54.000 So it says how many miles?
01:08:55.000 So it's just like a dial showing the number of nautical miles you've traveled.
01:08:57.000 Wow.
01:08:58.000 So then you're just like, okay, well, we started here, and then we headed on this heading, and we did that, and we traveled 10 miles, so we must be here.
01:09:06.000 And then once a day, you can take a sight with your sextant, and then you can do some dead reckoning with a compass.
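What he's describing is the core loop of dead reckoning: read the distance off the taffrail log, apply it along the compass heading to advance your estimated position, and use the occasional sextant sight to correct the error that builds up. Here's a minimal sketch of that bookkeeping in Python; the start position and legs are made up for illustration, and it uses a flat-earth approximation that's only reasonable over short legs:

```python
import math

def dead_reckon(lat, lon, heading_deg, distance_nm):
    """Advance an estimated position by one leg of a passage.

    lat/lon are in degrees, heading in degrees true, distance in nautical miles.
    Flat-earth approximation: 1 nm is about 1 minute of latitude, and a minute
    of longitude shrinks by cos(latitude). Fine for short legs; on a real
    passage the accumulated error is what the sextant sight corrects.
    """
    h = math.radians(heading_deg)
    d_lat = distance_nm * math.cos(h) / 60.0                                  # north-south component, degrees
    d_lon = distance_nm * math.sin(h) / (60.0 * math.cos(math.radians(lat)))  # east-west component, degrees
    return lat + d_lat, lon + d_lon

# Hypothetical legs: (compass heading, distance read off the taffrail log in nm)
position = (18.07, -63.08)  # rough start near St. Martin, purely for illustration
legs = [(190, 25.0), (200, 30.0), (185, 18.0)]

for heading, miles in legs:
    position = dead_reckon(position[0], position[1], heading, miles)
    print(f"estimated position: {position[0]:.2f}, {position[1]:.2f}")
```

Small errors in heading and logged distance compound with every leg, which is how an estimate can drift tens of miles off by the time you take the next sight, as he mentions a bit later.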
01:09:14.000 Wow!
01:09:15.000 Wow!
01:09:16.000 Dude, you went old school.
01:09:18.000 Yeah, I once had a job, actually.
01:09:20.000 Who did you do this with?
01:09:22.000 Just friends, yeah.
01:09:23.000 And you gotta have some fucking committed friends.
01:09:26.000 Because, like, the friends had to be, you know, you had to be all on the same page.
01:09:30.000 Because they could be like, hey man, let's get a fucking GPS. You guys are assholes.
01:09:33.000 I don't want to die.
01:09:35.000 I'm not going to get eaten by a shark.
01:09:36.000 How much food do we have?
01:09:37.000 People die out here, man.
01:09:38.000 This is the ocean.
01:09:40.000 Yeah.
01:09:40.000 We didn't really have any money, so it wasn't much of a decision.
01:09:44.000 To put things in perspective, we took a trip through the Caribbean once from Florida.
01:09:51.000 The way that we got to Florida was riding freight trains.
01:09:54.000 We hopped trains to get there.
01:09:55.000 This was low-budget traveling.
01:09:58.000 You guys were hobos.
01:10:00.000 No.
01:10:01.000 That's a hobo move.
01:10:02.000 It was low-bagger, for sure.
01:10:06.000 But, like, yeah, I was also, like, just weirdly ideological about it, where, like, I had a job once in the Caribbean that was, like, I was almost like a camp counselor, basically, where there was this camp that was like a sailing camp, but it was, like, 13 teenagers, mostly from North America.
01:10:22.000 Showed up in St. Martin and then got on a boat with me and another woman my age.
01:10:28.000 And we were like the adults.
01:10:30.000 And it was just like we sailed from St. Martin to Trinidad over the course of six weeks with these like 13 kids on a 50-foot sailboat.
01:10:37.000 Who left their kids with you?
01:10:38.000 That's what I want to know, man.
01:10:40.000 It was like...
01:10:40.000 Is this you?
01:10:42.000 Me and my friends made a video called Hold Fast that was trying to demystify sailing.
01:10:49.000 Bro, you've been rocking this wacky hair for a long time.
01:10:51.000 Dude, I know.
01:10:54.000 You know, pandemic.
01:10:56.000 Wow.
01:11:00.000 Whoa, you had tornadoes out there?
01:11:02.000 Yeah.
01:11:03.000 And you caught fish?
01:11:04.000 Yeah, yeah.
01:11:04.000 So you lived off the fish that you caught, basically?
01:11:07.000 Yeah.
01:11:07.000 Yeah, fish, conch, seaweed.
01:11:10.000 Wow, seaweed?
01:11:11.000 Yeah.
01:11:12.000 So when you prepare seaweed, what do you do?
01:11:13.000 You boil it?
01:11:15.000 You're going to sharpen your fucking knife, son.
01:11:16.000 I know.
01:11:17.000 That's ridiculous.
01:11:18.000 What are you using, a pencil to try to kill that poor fish?
01:11:22.000 This whole video is embarrassing.
01:11:24.000 So thank you for that, James.
01:11:25.000 Because you kind of didn't know what you were doing?
01:11:26.000 And here's you with...
01:11:28.000 What are you doing here?
01:11:31.000 You're mapping out where you're at?
01:11:32.000 This is Dead Reckoning, yeah.
01:11:33.000 Dead Reckoning.
01:11:34.000 That position was 50 miles off.
01:11:36.000 50 miles off?
01:11:38.000 So where you thought you were versus where you actually were was 50 miles difference?
01:11:42.000 Yeah.
01:11:42.000 And you're going how many miles an hour?
01:11:44.000 Very slow.
01:11:45.000 If you're doing really well, you know you're making five knots.
01:11:48.000 Five nautical miles an hour.
01:11:49.000 Five miles an hour.
01:11:50.000 Jesus Christ.
01:11:52.000 So you're walking.
01:11:54.000 You're basically walking on the ocean.
01:11:56.000 Yeah.
01:11:57.000 Not walking.
01:11:58.000 It's slow going.
01:11:58.000 But you never stop.
01:11:59.000 That's the thing.
01:11:59.000 You can sail all night.
01:12:00.000 You can just keep going.
01:12:01.000 You're a light jog.
01:12:02.000 You're jogging on the ocean.
01:12:04.000 Anyway, I was a tyrant with these kids.
01:12:06.000 We had a nice boat and I disabled all of the electronics.
01:12:09.000 I disabled the electric anchor windlass.
01:12:12.000 How long was this boat?
01:12:13.000 How long was this boat?
01:12:15.000 This was 50 feet.
01:12:16.000 50 feet with 14 kids, you said?
01:12:18.000 I think 13. 50 is a big boat.
01:12:20.000 That's actually a big boat.
01:12:21.000 Yeah, but it doesn't seem like a lot of room for all these kids.
01:12:24.000 Yeah, people are like sleeping on the deck.
01:12:26.000 Oh my god, that's insane.
01:12:28.000 Did you feel weird?
01:12:29.000 I mean, you're responsible for their food?
01:12:31.000 You're responsible for making sure they don't fight with each other?
01:12:35.000 Yeah, I mean, I actually enjoyed it.
01:12:36.000 I think it was fun.
01:12:37.000 Yeah?
01:12:37.000 Well, it seemed like it.
01:12:40.000 You have to make it work.
01:12:42.000 There's no other solution.
01:12:44.000 You're on this boat with these kids.
01:12:46.000 Yeah, that's true.
01:12:46.000 Do you still keep in touch with those kids?
01:12:48.000 No, that was sort of like pre-social media.
01:12:52.000 Right.
01:12:54.000 They're going to reach out to you now.
01:12:56.000 Man, I remember that.
01:12:57.000 That was fucking crazy.
01:12:59.000 I can't believe my parents left me with you.
01:13:01.000 I can't believe they did either.
01:13:03.000 So did you have to sign any paperwork or anything?
01:13:06.000 How did you take care of these kids?
01:13:08.000 I'm sure I had to sign something.
01:13:09.000 I don't remember.
01:13:10.000 You don't remember?
01:13:10.000 Yeah.
01:13:11.000 Wow.
01:13:11.000 Was there any time where you were like halfway into this trip?
01:13:14.000 You're like, what have I signed up for?
01:13:16.000 Oh, sure.
01:13:17.000 All the time.
01:13:17.000 Yeah.
01:13:18.000 But I was...
01:13:18.000 You know...
01:13:19.000 I had never really been in a situation like that either.
01:13:22.000 And...
01:13:22.000 Who has?
01:13:24.000 I don't know.
01:13:25.000 It's like I didn't even have siblings.
01:13:26.000 You know?
01:13:27.000 Like it's like...
01:13:27.000 Oh, really?
01:13:28.000 Yeah.
01:13:28.000 So...
01:13:29.000 But I... And I was pretty...
01:13:31.000 You know, it was interesting.
01:13:32.000 I feel like I learned a lot.
01:13:33.000 And it was...
01:13:36.000 But I was pretty tyrannical in a lot of ways.
01:13:39.000 But in a way that I was trying to encourage.
01:13:42.000 It was fun to see particularly teenagers who had a really North American affect about how to be.
01:13:50.000 Just let all of that go over a few weeks on the ocean where it's just like, you know, it's just us.
01:13:58.000 We're here.
01:13:59.000 There's nobody else watching.
01:14:01.000 You know, we're sleeping next to each other.
01:14:03.000 You know, it's like the kids just getting comfortable with themselves, you know?
01:14:09.000 And, you know, I would try and like, so I was like, I am really into rock, paper, scissors.
01:14:17.000 How into it are you?
01:14:18.000 I'm undefeated.
01:14:21.000 How is that possible?
01:14:23.000 So whenever they wanted anything, I would be like, all right, rock, paper, scissors.
01:14:26.000 You know, they were like, can we like do this thing?
01:14:28.000 I'd be like, all right, we'll do rock, paper, scissors.
01:14:30.000 If you win, you can do this thing.
01:14:31.000 If I win, and then I would like pick the thing that was like their sort of deepest fear, you know, it's like the really shy person had to like write a haiku about every day and then read it aloud at dinner.
01:14:42.000 You know, like the, you know, the person who was like really into like having like a manicure, like wasn't allowed to shave her legs for the rest of the, you know, like that kind of thing.
01:14:52.000 Wow.
01:14:53.000 And so then by the end of it, it was just like, you know, everyone had lost, you know, so everyone was like reading the haiku at dinner and doing, you know.
01:14:58.000 How are you so good at rock, paper, scissors?
01:15:01.000 It's just, you know, skill, muscle, intuition.
01:15:04.000 Intuition.
01:15:04.000 Can we play right now?
01:15:05.000 You want to play?
01:15:06.000 Yes.
01:15:06.000 But I only play for stakes.
01:15:08.000 Okay.
01:15:09.000 What do you want to play for?
01:15:13.000 Okay.
01:15:14.000 How about...
01:15:20.000 If I win, I do the programming on your show for a week.
01:15:25.000 No.
01:15:28.000 That's worth a lot of money.
01:15:30.000 You can fuck off.
01:15:31.000 What kind of money?
01:15:33.000 I'm not saying the ads or whatever.
01:15:35.000 Programming.
01:15:35.000 Who's going to be on?
01:15:36.000 That's not possible.
01:15:38.000 We're booked up months and months in advance.
01:15:40.000 You were so confident until just now.
01:15:43.000 That's ridiculous to flip a coin on that.
01:15:46.000 There's no chance.
01:15:47.000 I mean, what would be...
01:15:48.000 Because then you'd make me have...
01:15:50.000 Listen, the whole reason why this show works is because I talk to people that I want to talk to.
01:15:53.000 That's why it works.
01:15:55.000 The only way...
01:15:56.000 You've got to do something to play this game.
01:15:57.000 That's not a risk.
01:15:58.000 That's just one week of your life.
01:15:59.000 No, that's abandoning the show.
01:16:00.000 That's one week of your life.
01:16:01.000 No, you could bring some assholes on here that I don't want to talk to and then I'm like, what am I doing?
01:16:05.000 No, no, no.
01:16:06.000 Impossible.
01:16:06.000 Alright.
01:16:07.000 Well, do you think that there's something of equivalent value?
01:16:09.000 No.
01:16:10.000 Of that?
01:16:11.000 Nothing that I can do.
01:16:12.000 No.
01:16:12.000 There's nothing that you could give me that would be worth a week of programming on the show?
01:16:17.000 What are you going to give me?
01:16:18.000 What about a day of programming?
01:16:19.000 You'd have to give me a spectacular amount of money.
01:16:22.000 I sent you a...
01:16:23.000 We can't make this about money.
01:16:26.000 But that's the only way I would...
01:16:27.000 The only way, if you ever put a monetary equivalent to that, it would have to be a spectacular amount of money for me to let someone else program the show.
01:16:35.000 I've never let anybody do that before.
01:16:37.000 Not even for one day?
01:16:38.000 No!
01:16:39.000 That was one of the big things about doing this show on Spotify.
01:16:42.000 They could have no impact at all on who gets on, no suggestions, no nothing.
01:16:48.000 The only way it works...
01:16:49.000 What was up with that dude in the suit outside with the clipboard that was telling me he was from Spotify?
01:16:53.000 Oh, he's from the government.
01:16:54.000 He's from the CIA. There's no one out there.
01:16:56.000 He's joking.
01:16:57.000 But the only way the show works, I think, the way it works, is I have to be interested in talking to the people.
01:17:04.000 That's it.
01:17:06.000 So it has to be, I get a, I have like all these suggestions for guests.
01:17:10.000 I go, oh, that kind of seems cool.
01:17:12.000 Oh, that might be interesting.
01:17:13.000 Let me read up on this guy.
01:17:14.000 What if it's like for a week, I give you the list of suggestions?
01:17:17.000 No.
01:17:18.000 No.
01:17:18.000 No input.
01:17:19.000 No?
01:17:19.000 No.
01:17:20.000 It's not.
01:17:21.000 That's a ridiculous.
01:17:21.000 Stand real.
01:17:22.000 Stand real.
01:17:22.000 Okay.
01:17:22.000 Okay.
01:17:23.000 All right.
01:17:23.000 Impossible.
01:17:24.000 In any case.
01:17:25.000 How about five bucks?
01:17:27.000 No.
01:17:28.000 No?
01:17:28.000 No, it's gotta be stakes.
01:17:29.000 Come on, man.
01:17:30.000 20 bucks?
01:17:31.000 20 bucks.
01:17:31.000 I got 20 bucks in my pocket.
01:17:33.000 Money is off the table.
01:17:34.000 We can't do money.
01:17:34.000 Money's off the table?
01:17:35.000 I forget that.
01:17:35.000 All right.
01:17:36.000 Sounds like someone's scared to lose at Rock, Paper, Scissors.
01:17:38.000 It sounds like someone else is scared to lose at Rock, Paper, Scissors.
01:17:40.000 No, you're asking me for something that's ridiculous.
01:17:42.000 You don't have anything.
01:17:44.000 You don't have anything that's worth a week of programming on this show.
01:17:48.000 You don't have it.
01:17:48.000 That's rough.
01:17:49.000 It doesn't exist.
01:17:50.000 That's rough.
01:17:51.000 No, it literally doesn't exist.
01:17:53.000 There's nothing that you can have that you could offer me that I couldn't buy myself.
01:18:00.000 I'll make your...
01:18:01.000 No, no, no.
01:18:02.000 It'll be interesting.
01:18:03.000 No, no, no.
01:18:04.000 You can't.
01:18:04.000 No.
01:18:05.000 All right, fine.
01:18:05.000 But that doesn't do anything for me.
01:18:07.000 That does something for you.
01:18:08.000 That does zero for me.
01:18:10.000 Of course, you would have, if you win, you would name your stake.
01:18:11.000 I don't have a stake.
01:18:12.000 There's nothing I want from you.
01:18:14.000 What you ask from me is a crazy thing.
01:18:17.000 Yeah, we can't play Rock, Paper, Scissors now, huh?
01:18:19.000 Interesting.
01:18:20.000 Anyway, we were talking about something else before all of this.
01:18:22.000 We're talking about the evolution of cryptography.
01:18:27.000 Sailing with children.
01:18:28.000 Sailing with children.
01:18:29.000 Well, at first we were talking about Anguilla and the fact that people are moving to Anguilla.
01:18:32.000 Yeah.
01:18:33.000 So how did you learn how to do all this stuff?
01:18:36.000 Was it trial by fire when you were learning how to use all this, I mean, I don't want to call it ancient equipment, but mechanical equipment to figure out how to...
01:18:44.000 Yeah.
01:18:46.000 Yeah.
01:18:47.000 The secret is to begin.
01:18:49.000 To start...
01:18:50.000 Like a sextant.
01:18:52.000 Where the fuck does one learn how to operate a sextant and then navigate in the ocean?
01:19:00.000 Uh, yeah, just, I would, you know, I started, uh, you know, me and some friends got a boat and, um, we started fixing it up and making a lot of mistakes and then, you know, started taking some trips and then...
01:19:11.000 Getting lost?
01:19:12.000 Yeah, I got lost a bunch.
01:19:13.000 I took a solo trip from San Francisco to Mexico and back, uh, on a little 27 foot boat with no engine and...
01:19:21.000 Whoa!
01:19:22.000 How long did that take?
01:19:24.000 Ah...
01:19:25.000 A few months.
01:19:26.000 And the way you did it, did you stay close so I could see the shore?
01:19:30.000 So if everything fucks up, I can kind of swim.
01:19:32.000 Yeah, well, no, you can't swim.
01:19:34.000 I learned that lesson, too.
01:19:35.000 No?
01:19:35.000 Why?
01:19:37.000 I mean, the closest I ever came to death in my life was just in the bay.
01:19:44.000 In the San Francisco Bay, I was on a boat that capsized, and I was probably 2,000 yards away from shore, and I almost drowned.
01:19:54.000 I mean, I didn't make it to shore.
01:19:57.000 Yeah, it's just the water's so cold, you know?
01:20:00.000 You didn't make it to shore?
01:20:02.000 Yeah, it's a long story.
01:20:03.000 I was like...
01:20:06.000 A friend of mine was living in San Francisco and he wanted to learn how to sail.
01:20:09.000 And I was like, you know, what you should do is you should get like a little boat, like a little sailing thingy, you know, and then you can just anchor it like off the shore in this area that no one cares about.
01:20:17.000 And, you know, you could just sort of experiment with this little boat.
01:20:19.000 And so he started looking on Craigslist and he found this boat that was for sale for 500 bucks up in the North Bay.
01:20:25.000 And every time we called the phone number, we got an answering machine that was like, hello, you've reached Dr. Ken Thompson, honorary.
01:20:33.000 I'm unable to take your call, you know?
01:20:35.000 And we were like, what is that?
01:20:35.000 Like, honorary?
01:20:36.000 It's a fake doctor.
01:20:37.000 Is he like a judge?
01:20:39.000 Chiropractor.
01:20:40.000 You know, like, what is it?
01:20:41.000 And so finally we got in touch with this guy.
01:20:43.000 We go up there, and it's the kind of situation where, like, we pull up, and there's, like, the trailer that the boat's supposed to go on, and it's just full of scrap metal.
01:20:50.000 Oh, boy.
01:20:51.000 And, you know, this guy comes out.
01:20:52.000 He's like, oh, yeah, this is the trailer.
01:20:54.000 We were going to do a metal run, but if you want the boat, you know, we'll take the metal off, you know?
01:20:59.000 And we're like, okay, you know, and he's like taking us around.
01:21:01.000 He's like, okay, the mast is over here.
01:21:03.000 And it's like under some leaves, you know, it's like, and then, you know, the hull is in the water here.
01:21:07.000 And he has like a dock behind his house, and the tide is all the way out.
01:21:11.000 So they're both just sitting in the mud, you know.
01:21:12.000 And I'm like, well, how do we get this out of here?
01:21:14.000 He's like, oh, you'd have to come back at a different time, you know, and then you take it over there.
01:21:17.000 And we're like, you told us to come now, like at this time, you know.
01:21:20.000 Anyway, so we go through all this thing, and my friend, who knows nothing about boats, is like, all right, Moxie, what do you think?
01:21:26.000 Should I get this?
01:21:27.000 And I was like, okay.
01:21:28.000 Oh, and we were like, so what's a doctor of what?
01:21:31.000 He's like, oh, self-declared.
01:21:32.000 We're like, oh, okay.
01:21:33.000 He's a self-declared doctor?
01:21:35.000 Honorary.
01:21:36.000 Honorary self-declared doctor.
01:21:37.000 You can do that?
01:21:38.000 I guess so.
01:21:39.000 Why not?
01:21:40.000 It's just an answer.
01:21:41.000 Jamie?
01:21:41.000 Yes.
01:21:42.000 Doctor?
01:21:43.000 Yes.
01:21:43.000 I think we should become doctors.
01:21:45.000 I just became one.
01:21:49.000 I tried that for a while, actually.
01:21:50.000 Yeah, did you really?
01:21:51.000 Yeah, I don't know.
01:21:52.000 I mean, I never went to college, so...
01:21:53.000 Did Hunter S. Thompson ever get an honorary degree, or did he just call himself Dr. Hunter S. Thompson?
01:21:58.000 Because he was calling himself Dr. Hunter S. Thompson for a while.
01:22:01.000 I was quickly looking up how fast he could do this legally.
01:22:04.000 Well, Bill Cosby became a doctor for a little bit.
01:22:06.000 They took it back, though.
01:22:08.000 That's when you know he fucked up.
01:22:09.000 Yeah, yeah.
01:22:10.000 They take back your fake doctor degree.
01:22:12.000 Yeah, yeah.
01:22:13.000 So this guy was like, you know, my friend's like, what do you think, Moxie?
01:22:15.000 I'm like, all right, Dr. Ken.
01:22:17.000 I would have to consider.
01:22:19.000 I'm not sure that I would do it, but I would consider taking this boat for free.
01:22:23.000 I'd have to think about it, but I would consider that, you know?
01:22:25.000 And he's like...
01:22:27.000 I might be amenable to that.
01:22:29.000 So we've gone from $500 to free.
01:22:32.000 And so we got this boat, and we had to deal with the metal and all that stuff.
01:22:37.000 We got the boat, and we were just trying to anchor it.
01:22:44.000 Did you bring life vests?
01:22:46.000 Yeah, I was wearing a PFD, a Type II PFD, and we took it to this boat ramp, and it was the end of the day, and the wind was blowing kind of hard, and the conditions weren't that good, but I was like, oh, we're just doing this little thing, this little maneuver, and we were in two boats.
01:23:03.000 I built this little wooden rowing boat, and my friend was going to go out in that with one anchor, and I was going to sail out this boat.
01:23:09.000 You built it?
01:23:10.000 Yeah, out of plywood.
01:23:11.000 It's stitch and glue.
01:23:14.000 But not the sturdiest vessel.
01:23:19.000 So he's going to go out in this little rowboat, and I was going to sail out this little catamaran.
01:23:25.000 And we had two anchors, and we're going to anchor it, and then we're going to get in the rowboat and row back.
01:23:29.000 And it seemed a little windy, and I got in the boat first, and I got out around this pier and was hit by the full force of the wind and realized that it was blowing like 20 knots.
01:23:38.000 It was way, way too much for what we were trying to do.
01:23:40.000 But I had misrigged part of the boat, so it took me a while to get it turned around.
01:23:45.000 And by the time I got it turned around, my friend had rowed out around the pier, and he got hit by the force of the wind and just got blown out into the bay.
01:23:53.000 So he's rowing directly into the wind and moving backwards.
01:23:56.000 Oh, shit!
01:23:58.000 And I was like, fuck.
01:23:59.000 And I'm on this little Hobie Cat, and it was moving so fast.
01:24:02.000 It was way too windy to be sailing this thing.
01:24:04.000 I've got just my clothes on.
01:24:06.000 I don't have a wetsuit on or anything like that.
01:24:07.000 I have a life jacket and just my clothes.
01:24:10.000 And we don't have a radio.
01:24:11.000 We're unprepared.
01:24:12.000 It's starting to get dark.
01:24:13.000 We don't have a light.
01:24:15.000 And I'm sailing back and forth trying to help my friend.
01:24:19.000 And it got to the point where I was like, all right, I'm just going to tack over.
01:24:23.000 I'm going to sail up to this boat that was called the Sea Louse.
01:24:27.000 Sail up to the Sea Louse.
01:24:28.000 I'm going to get my friend off of it.
01:24:30.000 We're just going to abandon it.
01:24:31.000 And then we're going to sail this Hobie Cat back.
01:24:34.000 If we can.
01:24:35.000 And so I go to turn around, and right as I'm turning around, a gust of wind hit the boat and capsized it before I could even know that it was happening.
01:24:44.000 It's one moment, you're on the boat, and the next moment you're in the water.
01:24:49.000 And the water is like 50 degrees.
01:24:54.000 It's a shock when it hits you.
01:24:56.000 And the boat was a little messed up in a way where I couldn't right it.
01:25:02.000 It had capsized, and then it capsized all the way and then sank.
01:25:06.000 So it was floating like three feet underwater, basically.
01:25:11.000 And so I'm in the water, but I'm still a little bit out of the water, but in the water.
01:25:16.000 And I had a cell phone that just immediately was busted.
01:25:20.000 And I look at my friend, and he's a ways away now.
01:25:24.000 He didn't see me, and I was yelling as loud as I could, but the wind is blowing 20 knots, and you can't hear each other.
01:25:33.000 It just takes your voice away.
01:25:38.000 I was screaming, I was waving, he wasn't wearing his glasses, and he just very slowly rowed away.
01:25:44.000 Oh my god!
01:25:46.000 And so then I was just like floating there.
01:25:47.000 I was starting to get dark.
01:25:49.000 He rowed away?
01:25:50.000 Did he notice that your boat had capsized?
01:25:52.000 No, he didn't even see me.
01:25:53.000 He thought that I just sailed somewhere else.
01:25:56.000 Because in his mind, I was the person with the experience.
01:25:59.000 Do you still talk to this dude?
01:26:00.000 Yeah, all the time.
01:26:01.000 I'd be like, you motherfucker.
01:26:03.000 I don't blame him.
01:26:04.000 In his mind, he was the person that was in trouble.
01:26:06.000 Right.
01:26:07.000 I understand.
01:26:08.000 And he thought I just sailed somewhere else.
01:26:10.000 That's crazy.
01:26:12.000 Yeah.
01:26:12.000 Sailed out of vision.
01:26:14.000 Yeah, and then, you know, it basically got dark.
01:26:17.000 I could see the shore.
01:26:18.000 I wasn't far away.
01:26:18.000 There's nobody on shore.
01:26:19.000 There's nobody around.
01:26:20.000 And the wind was blowing directly offshore.
01:26:22.000 So you have to swim, you know, swim into the wind and into the wind wave and all that stuff.
01:26:28.000 And eventually I tried swimming and I swam, you know, directly upwind.
01:26:32.000 Because I was like, OK, if I get separated from this boat and I don't make it to shore, then I'm definitely dead.
01:26:40.000 You know, like there's just no saving me.
01:26:42.000 So I was trying to go directly upwind so that if I felt like I couldn't make it, I would float back downwind and hit the boat again.
01:26:46.000 And so I tried, you know, I swam for probably like 20 minutes upwind and made no progress.
01:26:52.000 It didn't feel like any progress.
01:26:53.000 You know, in 50 degrees, you have 30 to 60 minutes before you black out.
01:26:57.000 My arms were just, you know, it's like I consider myself a strong swimmer.
01:27:01.000 Like I free dive, you know, all this stuff.
01:27:03.000 And I just, you know, it's like you read these stories about...
01:27:09.000 How people die.
01:27:11.000 They succumb to hypothermia on a local hike or they drown in the bay.
01:27:14.000 And the story's always like, well, Timmy was a strong swimmer.
01:27:16.000 And you're like, really?
01:27:17.000 Was Timmy really a strong swimmer?
01:27:19.000 Because he drowned in the bay.
01:27:20.000 And floating there, it just all came to me.
01:27:23.000 I'm like, wow, this is how this happens.
01:27:25.000 You just make a series of pretty dumb, small decisions until you find yourself floating in the dark in the bay.
01:27:32.000 There's no one around.
01:27:33.000 And it's a really slow process, too.
01:27:37.000 You just come to terms with the idea that you're not going to make it.
01:27:40.000 And it's not sudden.
01:27:43.000 It's not like someone shot you or you got hit by a bus or something like that.
01:27:45.000 It's like this hour-long thing that you're getting dragged through all alone.
01:27:50.000 And you realize that no one will ever even know what this was...
01:27:55.000 You know, how this happened?
01:27:56.000 And you think about all the people like Joshua Slocum, Jim Gray, people who were lost at sea, and you realize they all had this thing that they went through, you know, this hour-long ordeal of just floating alone, and no one will even ever know what that was or what that was like, you know?
01:28:11.000 And eventually, I realized I wasn't going to make it ashore.
01:28:13.000 I looked back.
01:28:14.000 The boat was, like, way far away from me.
01:28:16.000 I started, you know, drifting back towards it.
01:28:18.000 I was still trying to swim.
01:28:20.000 I realized at some point that I wasn't going to hit it.
01:28:22.000 I wasn't going to hit the boat on the way back downwind.
01:28:24.000 And I had to just give it all that I had to try to connect with the boat, you know, to stop myself from getting blown past it.
01:28:33.000 And in that moment, too, you realize that, like...
01:28:37.000 Uncertainty is the most unendurable condition.
01:28:41.000 You imagine yourself making it to shore and relaxing, just knowing that it's resolved.
01:28:46.000 And in that moment of like, I might not make it back to this boat, you're tempted to give up because it's the same resolution.
01:28:54.000 It's the feeling of just knowing that the uncertainties have been resolved.
01:28:59.000 And you have to really remind yourself that it's not the same.
01:29:02.000 You have to give it everything you have in order to survive.
01:29:04.000 That feeling that you're longing for is not actually the feeling that you want.
01:29:10.000 And I just barely got the end of a rope that was trailing off the back of the hull.
01:29:15.000 Pulled myself back on it.
01:29:16.000 Almost threw up.
01:29:18.000 Then I had to...
01:29:19.000 Then I was just floating there with the hull three feet underwater.
01:29:23.000 I tied myself to it.
01:29:25.000 I started to get tunnel vision.
01:29:27.000 And really, at the last minute, a tugboat started coming through the area.
01:29:35.000 And it was coming straight at me, actually.
01:29:37.000 And I realized that it probably just wouldn't even see me.
01:29:40.000 It would just run me over and not even know that...
01:29:44.000 I had been there.
01:29:44.000 It's totally possible.
01:29:46.000 I was trying to wave.
01:29:48.000 I could barely lift my arm.
01:29:49.000 I was trying to scream.
01:29:50.000 I could barely make any noise.
01:29:52.000 Somehow they saw me.
01:29:56.000 It took them 15 minutes to get a rope around me.
01:30:00.000 They started pulling me up the side of the boat.
01:30:02.000 Lining every tugboat is tires.
01:30:05.000 Tires, usually.
01:30:06.000 It has a fender.
01:30:08.000 I got wedged in the tires as they were pulling me up.
01:30:11.000 And I knew what was happening.
01:30:12.000 And I was like, all I have to do is stick my leg out and push against the hull of the boat to go around the tires.
01:30:19.000 And I couldn't do it.
01:30:21.000 And I could barely see.
01:30:22.000 And they swung me around and eventually pulled me up.
01:30:24.000 They put me in next to the engines in the engine room.
01:30:28.000 I couldn't even feel the heat.
01:30:32.000 And they called the Coast Guard.
01:30:33.000 And the Coast Guard came and got me.
01:30:34.000 It was really embarrassing.
01:30:36.000 And the Coast Guard guy is like... he's got all these blankets over me and he's trying to talk to me to keep me alert.
01:30:44.000 And he's like, so is this your first time sailing?
01:30:51.000 And I have a commercial, like a 250-ton master's license.
01:30:55.000 You need 600 days at sea to get this license.
01:30:59.000 And I was like, no, I have a master's license.
01:31:02.000 And he was like, what?
01:31:03.000 He's like, you're a fucking idiot, man.
01:31:06.000 Everything changed.
01:31:07.000 The tone totally changed.
01:31:08.000 Oh, my God, dude.
01:31:10.000 That's insane.
01:31:11.000 Yeah.
01:31:13.000 Did that change your appreciation for comfort and safety and just life in general?
01:31:20.000 Did it like...
01:31:22.000 Yeah, totally.
01:31:23.000 I mean, it changed.
01:31:23.000 Well, you know, for sure, the next day, I was like, you know, it's just like any near-death experience.
01:31:31.000 I feel like you're just like, what are we doing here?
01:31:33.000 You know, like, what's the...
01:31:34.000 Why are we wasting our time with this?
01:31:36.000 You know, at the time, I was working on Twitter.
01:31:39.000 And, you know, your co-workers are like, oh, we got this problem with the slave lag on the database.
01:31:44.000 And you're just like, what are we doing, man?
01:31:46.000 You know, shouldn't we be doing something else?
01:31:48.000 You know, like...
01:31:49.000 But you can't, I feel like, you can't live like that for long.
01:31:54.000 The what are we doing, man?
01:31:56.000 You know, it's like, it's impossible.
01:32:00.000 The world will, like, suck you back into it.
01:32:04.000 Yeah.
01:32:08.000 Unless you go to Anguilla.
01:32:10.000 Yeah.
01:32:11.000 I mean, a lot of those early crypto people are actually still in Anguilla.
01:32:14.000 Really?
01:32:14.000 Yeah, it's funny.
01:32:16.000 Yeah, that's why we were talking about sailing and Anguilla.
01:32:18.000 So the people who moved to Anguilla were part of this moment of like...
01:32:23.000 How much did that shift your direction in your life though?
01:32:26.000 Did it change, like, the way... Like, it seems almost... I mean, I haven't had a near-death experience, but I've had a lot of psychedelic experiences, and in some ways I think they're kind of similar, and that life shifts to the point where whatever you thought of life before that experience is almost like, oh, come on, that's nonsense. Yeah,
01:32:47.000 I mean, it changes your perspective, or it did for me.
01:32:49.000 And, you know, because also in that moment, you know, it's like, you know, I think you go through this sort of embarrassing set of things where you're like, oh, I had these things I was going to do tomorrow.
01:33:01.000 Like, I'm not going to be able to do them.
01:33:04.000 And then you're like, wait, why is that the thing that I'm concerned about?
01:33:10.000 Trivial things.
01:33:11.000 Yeah, trivial.
01:33:11.000 We're just like, oh, I was going to see that person tomorrow.
01:33:13.000 I'm not going to see that.
01:33:16.000 I feel like I remember I was supposed to meet somebody the next day.
01:33:19.000 I remember being worried that they would think that I stood them up or something.
01:33:23.000 You have the awesomest excuse ever.
01:33:28.000 I mean, just tell them that story, the way you just told it to me, and they're going to be like, dude, we're good.
01:33:34.000 Shit.
01:33:35.000 Fuck.
01:33:35.000 Glad you're alright.
01:33:36.000 Yeah.
01:33:37.000 My God.
01:33:38.000 That kind of stuff.
01:33:38.000 And then you get more into the...
01:33:41.000 Yeah, it changes the way you think about things.
01:33:43.000 And certainly, I was working at Twitter at the time, and I think it made me think about how I was spending my life.
01:33:50.000 Yeah.
01:33:56.000 I remember the first day that I was at Twitter.
01:34:03.000 At the time, the most popular person on Twitter was Justin Bieber.
01:34:08.000 He had more followers than any other person.
01:34:10.000 Was that when you guys were trying to rig it so that he wasn't trending number one always?
01:34:16.000 Because they did do that, right?
01:34:17.000 I don't remember that.
01:34:19.000 Conveniently.
01:34:22.000 Because Jamie and I were talking about that one day.
01:34:24.000 Because they had to do something because if they didn't do something, Justin Bieber would be the number one topic every day, no matter what was happening in the world.
01:34:32.000 I can believe that they wanted to change that because the problem was, at the time, Twitter was held together with bubblegum and dental floss.
01:34:41.000 Every time Bieber would tweet, the lights would dim and the building would shake a little bit.
01:34:48.000 Here it goes.
01:34:49.000 So they blocked me from trending.
01:34:51.000 This is 2010. I'm actually honored.
01:34:53.000 Not even Matt.
01:34:54.000 He's also 12. Then I get on and see, yet again, my fans are unstoppable.
01:34:59.000 Love you.
01:35:00.000 Okay, so there's, you know, people talk about, like, invisible labor.
01:35:03.000 Like, the invisible labor behind that tweet is just kind of comical, because it's like, when he did that, you know, people, like, you know, it's like my first day there, you know, it's like he tweeted something, and, you know, the building's, like, kind of shaking, and, like, alarms are going off.
01:35:16.000 People are, like, scrambling around, you know?
01:35:18.000 And it was just this...
01:35:19.000 You know, it's like this realization where you're just like, never in my life did I think that anything Justin Bieber did would like really affect me in any like deep way, you know?
01:35:28.000 And then here I am just like scrambling around to like facilitate.
01:35:31.000 What are your thoughts on curating what trends and what doesn't trend and whether or not social media should have any sort of obligation in terms of...
01:35:43.000 How things, whether or not people see things, like shadow banning and things along those lines.
01:35:51.000 I'm very torn on this stuff because I think that things should just be.
01:35:57.000 And if you have a situation where Justin Bieber is the most popular thing on the internet, that's just what it is.
01:36:03.000 It is what it is.
01:36:05.000 But I also get it.
01:36:06.000 I get how you would say, well, this is going to fuck up our whole program, like what we're trying to do with this thing.
01:36:14.000 What do you mean, fuck up our whole program?
01:36:15.000 Well, what you're trying to do with Twitter, I mean, I would assume what you're trying to do is give people a place where they could share important information and, you know, have people, you know...
01:36:29.000 I mean, Twitter has been used successfully to overturn governments.
01:36:35.000 I mean, Twitter has been used to...
01:36:38.000 Break news on very important events and alert people to danger.
01:36:42.000 There's so many positive things about Twitter.
01:36:45.000 If it's overwhelmed by Justin Bieber and Justin Bieber fan accounts, then the top ten things that are trending are all nonsense.
01:36:56.000 I could see how someone would think we're going to do a good thing by suppressing that.
01:37:05.000 Yeah, I see what you're saying.
01:37:07.000 Why do you think they did suppress that?
01:37:09.000 What do you think?
01:37:10.000 You worked there.
01:37:11.000 Why do you think they kept him from trending?
01:37:16.000 Well, I mean, I don't know about that specific situation.
01:37:20.000 I mean, I think, you know, looking at the larger picture, right, like...
01:37:27.000 In a way, you know, it's like, if you think about, like, 20 years ago, whenever anybody talked about, like, society, you know, everyone would always say, like, the problem is the media.
01:37:36.000 It's like the media, man.
01:37:38.000 You know, if only we could change the media.
01:37:40.000 And a lot of people who were interested in, like, a better and brighter future were really focused on self-publishing.
01:37:48.000 There was a whole conference about it, an underground publishing conference, now the Allied Media Conference.
01:37:52.000 People were writing zines.
01:37:53.000 People were, you know, getting their own printing presses.
01:37:56.000 We were convinced that if we made publishing more equitable, if everybody had the equal ability to produce and consume content, that the world would change.
01:38:09.000 In some ways, what we have today is the fantasy of those dreams from 20 years ago.
01:38:22.000 In a couple ways.
01:38:24.000 One, it was the dream that if a cop killed some random person in the suburbs of St. Louis, that everyone would know about it.
01:38:35.000 Everyone knows.
01:38:37.000 And also, that anybody could share their weird ideas about the world.
01:38:42.000 And I think, in some ways, we were wrong.
01:38:46.000 You know, that we thought, like, you know, the world we got today is like, yeah, like, if a cop kills somebody in the suburbs of St. Louis, like, everybody knows about it.
01:38:55.000 I think we overestimated how much that would matter.
01:38:58.000 And I think we also believed that the things that everyone would be sharing were, like, our weird ideas about the world.
01:39:05.000 And instead, we got, like, you know, Flat Earth and, like, you know, anti-vax and, like, you know, all this stuff, right?
01:39:12.000 And so it's, like, in a sense, like, I'm glad that those things exist because they're, like, they're sort of what we wanted, you know?
01:39:19.000 But I think what we did, what we underestimated is, like, how important the medium is.
01:39:25.000 Like, the medium is the message kind of thing.
01:39:26.000 And that, like, what we were doing at the time of writing zines and sharing information, I don't think we understood how much that was predicated on actually building community and relationships with each other.
01:39:42.000 Like, what we didn't want was just, like, more channels on the television.
01:39:45.000 And that's sort of what we got, you know?
01:39:47.000 And so I think, you know, it's like everyone is, like, on YouTube trying to monetize their content, whatever, you know?
01:39:51.000 And that, it's the same thing.
01:39:53.000 Like, bad business models produce, like, bad technology and bad outcomes.
01:39:56.000 And so I think there's concern about that.
01:40:00.000 But I think...
01:40:04.000 I think, like, you know, now there are, like, you know, these two simultaneous truths that everyone seems to believe that are in contradiction with each other.
01:40:12.000 You know, like, one is that, like, everything is relative.
01:40:16.000 Everyone is entitled to their own opinion.
01:40:18.000 All opinions are equally valid.
01:40:19.000 And two, like...
01:40:23.000 Our democracy is impossible without a shared understanding of what is true and what is false.
01:40:28.000 The information that we share needs to be verified by our most trusted institutions.
01:40:33.000 People seem to simultaneously believe both of these things, and I think they're in direct contradiction with each other.
01:40:39.000 So in some ways, I think most of the questions about social media in our time are about trying to resolve those contradictions.
01:40:46.000 But I think it's way more complicated than the way that the social media companies are trying to portray it.
01:40:56.000 Yeah, I think there's simplistic methods that they're using to handle complex realities.
01:41:05.000 Like, for instance, banning QAnon.
01:41:11.000 This is a big one, right?
01:41:14.000 QAnon's got these wacky theories and they're like, Jesus Christ, these are weaponizing all these nutbags.
01:41:20.000 We're just going to ban QAnon.
01:41:24.000 Because you think what they're saying is not true and not correct.
01:41:27.000 How far do you go with that?
01:41:30.000 You've sort of set a precedent.
01:41:32.000 And where does that end?
01:41:34.000 Because, you know, are we going to ban JFK theories?
01:41:37.000 Because the JFK murder is probably still relevant today.
01:41:40.000 Some of those people are still alive.
01:41:41.000 Do we ban...
01:41:42.000 There's theories about the Challenger, the space shuttle Challenger.
01:41:46.000 There's a lot of wacky conspiracy theories about space being fake.
01:41:54.000 Have you ever seen hashtag space is fake?
01:41:56.000 Yeah.
01:41:57.000 Go on there if you want to really fucking just lose all faith in humanity.
01:42:03.000 Look up space is fake.
01:42:06.000 Oh my god, there's so many people.
01:42:09.000 Yeah, and I think that people get something out of that.
01:42:11.000 Yeah, they do.
01:42:12.000 Well, people get something out of mysteries and maybe being on the inside and knowing things where the rest of the world is asleep.
01:42:20.000 This is the reason why people love the idea of red-pilled.
01:42:24.000 Somebody even suggested I call this room the red pill.
01:42:26.000 My friend Radio Rahim said, call it the red pill.
01:42:29.000 I'm like, ah...
01:42:30.000 There's a lot riding on that term.
01:42:47.000 Where does that end?
01:42:49.000 Does it end with Flat Earth?
01:42:51.000 Are you going to ban that?
01:42:53.000 They're going to go, oh, they're suppressing us.
01:42:54.000 And then they're going to find these...
01:42:56.000 That's the thing about all these weird alternative sources of social media, whether it's Parler or Gab, they become...
01:43:05.000 Shitfests.
01:43:06.000 If you go to those, especially Gab, it's just like, God damn, what have you guys done?
01:43:11.000 It's not even what have they done.
01:43:12.000 It's what have the people done that have all been kicked out of all these other places.
01:43:16.000 And then if you have a place that says, we're not going to kick you out.
01:43:19.000 And then all these fucking cretins come piling into these places.
01:43:23.000 And I'm sure there's a lot of good people on Gab.
01:43:25.000 Don't get me wrong.
01:43:25.000 There's a lot of people that are...
01:43:26.000 They just didn't want to be suppressed by social media.
01:43:30.000 Parler doesn't seem to be nearly as bad.
01:43:33.000 I've looked at that as well.
01:43:34.000 It's more like just super right-wing information type stuff.
01:43:39.000 And there's some reasonable people on Parler.
01:43:42.000 But...
01:43:42.000 But I think that there's a subtle thing there because I don't know how those things work.
01:43:46.000 But I think part of what...
01:43:52.000 If you set aside all of the takedown stuff, all the deplatforming stuff, if you say, okay, Facebook, Twitter, these companies, they don't do that anymore.
01:44:06.000 They've never done that.
01:44:07.000 They're still moderating content.
01:44:09.000 They have an algorithm that decides what is seen and what isn't.
01:44:12.000 Right.
01:44:14.000 But how is that algorithm programmed?
01:44:17.000 For Facebook and for YouTube and a lot of these things, it's done to encourage viewership.
01:44:24.000 It's done to encourage interaction, right?
01:44:27.000 It's done to encourage time spent looking at the screen.
01:44:31.000 Yeah, so that's how they monetize it.
01:44:33.000 They want more clicks and more ad views and all that jazz.
01:44:38.000 But when it becomes an ideological moderation, that's when things get a little weird, right?
01:44:43.000 But it is by definition an ideological moderation.
01:44:47.000 If you optimize for time spent looking at the screen, you're going to be encouraging certain kinds of content and not others.
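A minimal sketch of the point being made here, assuming a feed ranked purely to maximize predicted time-on-screen and interactions: the scoring weights themselves act as an editorial choice about which content gets amplified. The class, function names, weights, and example posts below are illustrative assumptions, not any real platform's ranking code.

```python
# Hypothetical sketch: ranking a feed purely by predicted engagement.
# Names, weights, and numbers are illustrative assumptions, not any
# real platform's algorithm.
from dataclasses import dataclass
from typing import List


@dataclass
class Post:
    post_id: str
    predicted_watch_seconds: float   # model's guess at time the viewer will spend
    predicted_interactions: float    # model's guess at likes, replies, shares


def engagement_score(post: Post) -> float:
    # The weights are a design choice; whoever sets them is deciding
    # which kinds of content get surfaced, even with no human in the loop.
    return 1.0 * post.predicted_watch_seconds + 5.0 * post.predicted_interactions


def rank_feed(posts: List[Post]) -> List[Post]:
    # Sorting by engagement implicitly favors provocative content over
    # content that is read quickly and generates little reaction.
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("calm-explainer", predicted_watch_seconds=40, predicted_interactions=1),
        Post("outrage-bait", predicted_watch_seconds=25, predicted_interactions=12),
    ])
    print([p.post_id for p in feed])  # ['outrage-bait', 'calm-explainer']
```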
01:44:55.000 Okay, but that's not always true.
01:44:56.000 I'll give you an example for us.
01:44:58.000 We did a podcast with Kanye West.
01:45:01.000 Kanye West was running for president, right?
01:45:04.000 And if you were running for president and you were outside the...
01:45:07.000 Like, for instance, Twitter banned Brett Weinstein's...
01:45:12.000 Brett Weinstein had a...
01:45:14.000 He had a Twitter account that was set up for...
01:45:19.000 It was Unity 2020. And the idea was, like, instead of looking at this in terms of left versus right, Republican versus Democrat, let's get reasonable people from both sides, like a Tulsi Gabbard and a Dan Crenshaw.
01:45:33.000 Bring them together and perhaps put into people's minds the idea that, like, this concept of it having to be a Republican vice president or a Republican president...
01:45:45.000 Maybe that's nonsense.
01:45:46.000 And maybe it would be better if we had reasonable, intelligent people together.
01:45:50.000 What is this?
01:45:51.000 Third video.
01:45:51.000 There's their video?
01:45:52.000 Yeah.
01:45:53.000 Well, it's a very rational perspective.
01:45:56.000 It's not conspiracy theory driven.
01:46:00.000 They got banned from Twitter.
01:46:02.000 For what?
01:46:03.000 For nothing.
01:46:04.000 Just because they were promoting a third party.
01:46:07.000 Because they were trying to come up with some alternative.
01:46:10.000 The idea was this could siphon off votes from Biden.
01:46:13.000 We want Biden to win because Trump is bad.
01:46:16.000 This is the narrative, right?
01:46:19.000 Yeah, I mean, I think that there's...
01:46:21.000 Man, there's a lot here, but the...
01:46:23.000 Yeah.
01:46:23.000 But I was going to say, I got sidetracked, I'm sorry.
01:46:26.000 Let me finish.
01:46:26.000 Yeah.
01:46:27.000 The Kanye West thing.
01:46:28.000 So we had a podcast with Kanye West.
01:46:31.000 It got, I don't know how many millions of views, but it was a lot.
01:46:37.000 But it wasn't trending.
01:46:39.000 And so, Jamie, you contacted the people at YouTube and asked them why it wasn't trending.
01:46:43.000 What was their answer?
01:46:44.000 It's not trending.
01:46:45.000 Um...
01:46:47.000 Like it didn't meet the qualifications they decided for trending or something like that?
01:46:52.000 No, like it didn't include everything you would assume, like you just said, all the interactivity comments.
01:46:58.000 It had more comments than any video we had.
01:47:00.000 That's what I mean.
01:47:01.000 Massive amounts of comments, massive amounts of views, but yet nowhere to be seen on the trending.
01:47:06.000 But I don't think there was a person involved.
01:47:08.000 Like, there was an algorithm involved that was trying to optimize for certain things.
01:47:12.000 Not in this case.
01:47:14.000 This specific case, yeah.
01:47:15.000 There's a team there.
01:47:16.000 There's separate teams at YouTube, from my understanding.
01:47:19.000 Yeah, and this separate team had made a distinction.
01:47:23.000 And I don't even know if they told the person who told me that what it was.
01:47:27.000 So that person may not know either.
01:47:29.000 So they just decided this is not worthy of trending.
01:47:32.000 So you have arbitrary decisions that are being made by people, most likely because they feel that ideologically Kanye West is not aligned with...
01:47:42.000 I mean, he was wearing the MAGA hat for a while.
01:47:45.000 So they just decided this is not trending.
01:47:48.000 But it is trending.
01:47:49.000 It's clearly trending.
01:47:51.000 You've got millions and millions and millions of people who are watching it.
01:47:54.000 Whether there, you know... But I think this is the point, you know: whether it's people or whether it's algorithms, there are forces that are making decisions about what people see and what people don't see, and they're based on certain objectives that I think are most often business objectives.
01:48:10.000 But not in this case.
01:48:12.000 In this case, the business objective was if they wanted to get more eyeballs on it, they would want it to be trending.
01:48:17.000 And people say, oh shit, Kanye West is on the JRE? Do people that like Kanye click on ads or not?
01:48:23.000 There's a lot in there that we don't know.
01:48:28.000 Oh, that's horseshit.
01:48:29.000 Come on, bro.
01:48:30.000 I don't know.
01:48:30.000 Maybe they're making an ideological decision.
01:48:32.000 Millions and millions.
01:48:33.000 When you have a video that's being viewed by that many people, there's going to be a lot of goddamn people clicking on ads.
01:48:39.000 No matter what.
01:48:40.000 The other thing that these platforms want is for the content to be ad safe.
01:48:44.000 It's like maybe advertisers don't...
01:48:46.000 I don't know.
01:48:48.000 But I think actually focusing on the outlying cases, where this person was deplatformed, this person was intentionally, ideologically not promoted or de-emphasized or whatever...
01:49:01.000 Shadow banning.
01:49:02.000 I think that that, like, obfuscates or, you know, draws attention away from the larger thing that's happening, which is that, like, those things are happening just implicitly all the time.
01:49:15.000 And that, like, it almost, like, serves to the advantage of these platforms to...
01:49:20.000 Highlight the times when they remove somebody, because what they're trying to do is reframe this as, like, okay, well, yeah, we've got these algorithms or whatever.
01:49:26.000 Don't talk about that.
01:49:27.000 The problem is there's just these bad people, you know, and we have to decide there's a bad content from bad people and we have to decide, you know, what to do about this bad content and these bad people.
01:49:36.000 And I think that distracts people from the fact that like the platforms are at every moment making a decision about what you see and what you don't see.
01:49:48.000 I see what you're saying.
01:49:50.000 So, there's more than one problem.
01:49:54.000 There's a problem of deplatforming, because in many ways, deplatforming decisions are being made based on ideology.
01:50:02.000 It's a certain specific ideology that the people doing the deplatforming have, one that doesn't align with the people that are being deplatformed.
01:50:11.000 These people that are being deplatformed, they have ideas that these people find offensive or they don't agree with.
01:50:17.000 So they say, we're going to take you off.
01:50:20.000 Yeah.
01:50:21.000 Or sometimes they just find themselves in a trap, you know?
01:50:24.000 A trap.
01:50:25.000 Well, I think that there's a tendency for a lot of these platforms to try to define some policy about what it is that they want and they don't want.
01:50:37.000 I feel like that's sort of a throwback to this modernist view of science and how science works and we can objectively and rigorously define these things.
01:50:47.000 I just don't think that's actually how the world works.
01:50:51.000 What do you mean?
01:50:52.000 How so?
01:50:54.000 I feel like we're just past that.
01:50:56.000 That it's not like...
01:50:57.000 First of all, I think science is not about truth.
01:51:01.000 It's just not.
01:51:01.000 It's about utility.
01:51:03.000 What do you mean?
01:51:05.000 Okay.
01:51:06.000 It's like...
01:51:08.000 I was taught Newtonian physics in high school.
01:51:10.000 Why?
01:51:11.000 It's not true.
01:51:13.000 That's not how the universe works.
01:51:15.000 But it's still useful, and that's why it's taught.
01:51:18.000 Because you can use it to predict motion outcomes, that kind of thing.
01:51:21.000 What's incorrect about Newtonian physics in the sense that they shouldn't be teaching it?
01:51:27.000 I mean, today, you know, people believe that the truth is that, you know, there's like, you know, relativity, like gravity is not a force.
01:51:33.000 There's like, you know, these planes and stuff, whatever, you know, that like there are other models to describe how the universe works.
01:51:39.000 And Newtonian physics is considered outmoded.
01:51:43.000 But it still has utility in the fact that you can use it to predict the...
01:51:47.000 So you're talking about in terms of quantum physics and string theory and a lot of these more...
01:51:51.000 Yeah, it's like relativity at the large scale, quantum physics at the small scale.
01:51:56.000 And even those things are most likely not true in the sense that they aren't consistent with each other and people are trying to unify them and find something that does make sense at both of those scales.
01:52:05.000 The history of science is a history of things that weren't actually true.
01:52:08.000 You know, Bohr's model of the atom, Newtonian physics.
01:52:11.000 People have these, you know, Copernicus's model of the solar system.
01:52:15.000 People have these ideas of how things work.
01:52:18.000 And the reason that people are drawn to them is because they actually have utility.
01:52:21.000 That it's like, oh, we can use this to predict the motion of the planets.
01:52:24.000 Oh, we can use this to send a rocket into space.
01:52:25.000 Oh, we can use this to, you know, have better outcomes, you know, for some medical procedure or whatever.
01:52:30.000 But it's not actually...
01:52:31.000 I don't...
01:52:33.000 I think it's not actually truth.
01:52:35.000 Like, the point of it isn't truth.
01:52:37.000 The point of it is that, like, we have some utility that we find in these things.
01:52:42.000 And I think that...
01:52:47.000 When you look at the emergence of science and people conceiving of it as a truth, it became this new authority that everyone was trying to appeal to.
01:52:56.000 If you look at all of the 19th century political philosophy, I mean, okay, I think the question of truth is, like, you know, it's even a little squishy with the hard sciences, right?
01:53:08.000 But once you get into, like, soft sciences, like social science, psychology, like, then it's even squishier, you know, that, like, these things are really not about truth.
01:53:17.000 They're about, like, some kind of utility.
01:53:18.000 And when you're talking about utility, the important question is, like, useful for what and to whom?
01:53:24.000 You know?
01:53:25.000 And I think that's just always the important question to be asking, right?
01:53:28.000 Because, you know, when you look at, like, all the 19th century political writing, it's all trying to frame things in terms of science in this way that it just seems laughable now.
01:53:35.000 But, you know, like, at the time, they were just like, we're going to prove that communism is, like, the most true, like, social economic system in the world, you know?
01:53:43.000 Like, there are whole disciplines of that.
01:53:44.000 People in...
01:53:45.000 You know, people had, like, PhDs in that, you know, there were whole research departments in the Soviet Union, people doing that.
01:53:51.000 And we laugh about that now, but I don't think it's that different than like social science in the West, you know?
01:53:57.000 And so I think, you know, it's like if you lose sight of that, then you try to, like, frame social questions in terms of truths.
01:54:06.000 It's like, this is the kind of content that we want, and we can rigorously define that, and we can define why that's going to have the outcomes that we want it to.
01:54:13.000 But once you get on that road, you're like, okay, well, terrorist stuff.
01:54:17.000 We don't like terrorist stuff, so we're going to rigorously define that, and then we have a policy, no terrorist stuff.
01:54:24.000 And then China shows up, and they're like, we've got this problem with terrorists, the Uyghurs.
01:54:30.000 We see you have a policy.
01:54:32.000 I think if people from the beginning acknowledged that all of objectivity is just a particular worldview, and that we're not going to rigorously define these things in terms of what is true and what isn't, then I think we would have better outcomes.
01:54:50.000 That's my weird take.
01:54:51.000 I mean, I think, you know, from the perspective of Signal, you know, it's like, do you know what's trending on Signal right now?
01:54:57.000 No.
01:54:57.000 Nothing.
01:54:58.000 Neither do I. No.
01:54:58.000 Okay.
01:54:59.000 But that's, it's on a social media platform.
01:55:01.000 But isn't there a weird thing when you decide that you have one particular ideology that's being supported and another particular ideology that is being suppressed?
01:55:14.000 And this is what conservative people feel when they're on social media platforms.
01:55:19.000 Almost all of them, other than the ones we talked about before, Parler and Gab and the alternative ones, they're all very left-wing in terms of the ideology that they support.
01:55:30.000 The things that can get you in trouble on Twitter.
01:55:36.000 What did you say?
01:55:37.000 But then the President of the United States just constantly violated every policy that they had.
01:55:43.000 But he's ridiculous.
01:55:44.000 That's a ridiculous example, right?
01:55:46.000 Because he's one person, and they've actually discussed this, that he and his tweets are more important.
01:55:53.000 It's more important that they allow these tweets to get out.
01:55:57.000 First of all, you can understand how fucking crazy this guy is.
01:56:00.000 And second of all, it's newsworthy.
01:56:03.000 He's the leader of the...
01:56:06.000 And also, it would be very costly from a business perspective.
01:56:11.000 Yes.
01:56:12.000 Very likely.
01:56:13.000 And kind of amazing that he didn't do anything along the way while he was witnessing people get deplatformed, and particularly this sort of bias towards people on the left and this discrimination against people on the right.
01:56:30.000 There's people on the right that have been banned and shadow banned and blocked from posting things.
01:56:36.000 You run into this situation where you wonder what exactly is a social media platform.
01:56:45.000 If it's just a small private company, and maybe you have some sort of a video platform and there's only a few thousand people on it and you only want videos that align with your perspective...
01:56:57.000 Okay, you're a private company.
01:56:58.000 You can do whatever you want.
01:56:59.000 But when you're the biggest video platform on earth like YouTube and you decide that you are going to take down anything that disagrees with your perspective on how COVID should be handled...
01:57:14.000 Including doctors.
01:57:15.000 This is one of the things that happened.
01:57:17.000 Doctors that were stating, look, there's more danger in lockdowns.
01:57:22.000 There's more danger in this than there is in the way we're handling it.
01:57:27.000 There's more danger in the negative aspects of the decisions that are being made than it would be to let people go to work with masks on.
01:57:37.000 Those videos just get deleted.
01:57:40.000 Those videos get blocked.
01:57:41.000 There's people that... Yeah.
01:58:02.000 A town hall?
01:58:03.000 Is it the way that people get to express ideas?
01:58:06.000 And isn't the best way to express ideas to allow people to decide, based on the better argument, what is correct and what's incorrect?
01:58:16.000 Like, this is what freedom of speech is supposed to be about.
01:58:20.000 It's supposed to be about, you have an idea, I have an idea, these two ideas come together, and then the observers get to go, hmm, okay, well, this guy's got a lot of facts behind him.
01:58:28.000 This is objective reality.
01:58:31.000 This is provable.
01:58:33.000 And this other guy is just a crazy person who thinks the world's hollow.
01:58:36.000 Okay?
01:58:37.000 This is the correct one.
01:58:39.000 There's going to be some people that go, no, there's a suppression of hollow earth and hollow earth is the truth and hollow earth facts and hollow earth theory.
01:58:46.000 But you've got to kind of let that happen.
01:58:49.000 You gotta kind of have people that are crazy.
01:58:52.000 Remember the old dude that used to stand on the corners with the placards on, the world is ending tomorrow?
01:58:58.000 They're still there.
01:58:59.000 Yeah, but those are on Twitter now, right?
01:59:01.000 But those people, no one said, you gotta get rid of that guy.
01:59:05.000 You would drive by and go, look at this crazy fuck.
01:59:08.000 Those crazy fucks making YouTube videos, those videos get deleted.
01:59:12.000 I don't know if that's good.
01:59:14.000 I kind of think that you should let those crazy fucks do that.
01:59:17.000 Because it's not going to influence you.
01:59:19.000 It's not going to influence me.
01:59:21.000 It's going to influence people that are easily influenced.
01:59:23.000 And the question is, who are we protecting and why are we protecting these people?
01:59:27.000 Well, okay, but I think, in my mind, what's going on is, like, the problem is that it used to be that some person with very strange ideas about the world wearing a sign on the street corner shouting was just a person with very strange ideas about the world wearing a sign on the street corner shouting.
01:59:43.000 Now, there's somebody, you know, with very strange ideas about the world, and those ideas are being amplified by a billion-dollar company, because there are algorithms that amplify that.
01:59:52.000 And what I'm saying is that instead of actually talking about that, instead of addressing that problem, those companies are trying to distract us from that discussion by saying... Would the correct way to handle it...
02:00:23.000 Would it be to make algorithms illegal in that respect?
02:00:26.000 Like to not be able to amplify or detract?
02:00:29.000 To not be able to ban, shadow ban, or just to have whatever trends trend.
02:00:36.000 Whatever is popular, popular.
02:00:38.000 Whatever people like, let them like it.
02:00:41.000 And say, listen, this thing that you've done by creating an algorithm that encourages people to interact, encourages people to interact on Facebook, encourages people to spend more time on the computer, what you've done is you've kind of distorted what is valuable to people.
02:00:57.000 You've changed it and guided it in a way that is ultimately, perhaps arguably, detrimental to society.
02:01:05.000 So we are going to ban algorithms.
02:01:08.000 You cannot use algorithms to dictate what people see or not see.
02:01:12.000 You give them a fucking search bar, and if they want to look up UFOs, let them look up UFOs.
02:01:17.000 But don't shove it down their throat because you know they're a UFO nut.
02:01:21.000 Don't curate their content feed.
02:01:24.000 Yeah, I mean, I think it's okay.
02:01:26.000 It's complicated because, one, I have no faith in, like when you say ban or make it illegal or whatever, I have zero faith in the government being able to handle this.
02:01:37.000 Yeah, nor do I. Every time I see a cookie warning on a website, I'm like, okay, these people are not the people that are good.
02:01:43.000 This is what they've given us after all this time.
02:01:45.000 These people are not going to solve this for us.
02:01:47.000 And also, I think a lot of what it is, the dissatisfaction that people feel and the discomfort that people feel and the concern that people have, is a concern about power.
02:01:59.000 That right now, these tech companies have a lot of power.
02:02:03.000 And I think that the concern that is coming from government is the concern for their power.
02:02:10.000 The right has made such a big deal about deplatforming.
02:02:15.000 And I think it's because they're trying to put these companies on notice.
02:02:23.000 It's like, fuck with us.
02:02:25.000 We will take power.
02:02:27.000 But they've done nothing about it.
02:02:30.000 Don't you think that they've actually made a big deal about deplatforming because the right has been disproportionately deplatformed?
02:02:37.000 I think the right is, like, doing fine.
02:02:40.000 How so?
02:02:41.000 I don't know.
02:02:42.000 I don't know what the numbers are, but I feel like it's, like...
02:02:44.000 Did you say that, though, because you're on the left?
02:02:47.000 Yeah, but that's Trump.
02:02:48.000 That's Trump.
02:02:50.000 He's an anomaly.
02:02:52.000 You can't really, you know...
02:02:55.000 Okay, I think, I guess maybe, let me just reframe this to say that, like, I think it's interesting that we are, we've hit an inflection point, right?
02:03:03.000 Where, like, the era of utopianism with regards to technology is over.
02:03:07.000 Yeah.
02:03:07.000 That it's just like, you know, after 2016, it was just like, big tech has zero allies anymore.
02:03:14.000 You know, on the left, everyone's just like, you just gave the election to Trump, you know?
02:03:17.000 And on the right, they're just like, you just removed somebody from YouTube for calling gay people an abomination.
02:03:21.000 Fuck you.
02:03:21.000 You know, like, they have no allies.
02:03:24.000 No one believes in the better and brighter.
02:03:28.000 No one believes that Google is organizing the world's information.
02:03:30.000 No one believes that Facebook is connecting the world.
02:03:33.000 And I think that there's an opportunity there.
02:03:38.000 That we're in a better situation than we were before.
02:03:41.000 All the cards are on the table.
02:03:43.000 People are more and more understanding how it is that these systems function.
02:03:47.000 I think we increasingly see that people understand that this is really about power, it's about authority, and that we should be trying to build things that limit the power that people have.
02:03:58.000 If you had your wish, if you could let these social media platforms, whether it's video platforms like YouTube or Facebook or Twitter... if you had the call,
02:04:13.000 if they called you up and said, Moxie, we're going to let you make the call.
02:04:18.000 What should we do?
02:04:19.000 How should we curate this information?
02:04:20.000 Should we have algorithms?
02:04:22.000 Should we allow people?
02:04:23.000 Should we just let it open to everything?
02:04:26.000 Everything and anybody.
02:04:27.000 What should we do?
02:04:28.000 Well, I mean, this is what we're trying to do with Signal.
02:04:30.000 But it's different, right?
02:04:31.000 Because you're just a messaging app.
02:04:33.000 We're just a messaging app.
02:04:35.000 No, I'm not saying that.
02:04:36.000 It's a very good messaging app that I use.
02:04:38.000 No, I understand what you're saying.
02:04:39.000 But I think the way that messaging apps are going, there's a trajectory where a project like Signal becomes more of a social experience.
02:04:49.000 And that, like, the things that we're building extend beyond just, like, you know, sending messages.
02:04:54.000 Particularly, I think, as more and more communication moves into group chats and things like that.
02:04:59.000 And, you know, the foundation that we're building it on is a foundation where we know nothing.
02:05:03.000 You know, it's like, if I looked up your Signal account record right now, of, like, all the information that we have about you on Signal, there's only two pieces of information.
02:05:10.000 The date that you created the account and the date that you last used Signal.
02:05:14.000 That's it.
02:05:14.000 That's all we know.
02:05:15.000 If you looked on any other platform, your mind would be blown.
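As a rough sketch of the data minimization described here, assuming an account record with just those two fields; the field names and example values are hypothetical, not Signal's actual schema.

```python
# Hypothetical sketch of a data-minimized account record: only the date the
# account was created and the date it last connected, as described above.
# Field names and example values are assumptions, not Signal's real schema.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class MinimalAccountRecord:
    created_on: date         # when the account was registered
    last_connected_on: date  # when the client last connected to the service
    # Deliberately absent: contacts, message content, group membership,
    # profile data, location, device metadata.


record = MinimalAccountRecord(created_on=date(2020, 1, 15),
                              last_connected_on=date(2020, 12, 1))
print(record)
```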
02:05:20.000 No, it's admirable what you're doing, and it's one of the reasons why I wanted to talk to you.
02:05:24.000 I think that foundation gives us...
02:05:27.000 Now that we have that foundation, there's a lot that we can build on it.
02:05:32.000 Would you do a social media app?
02:05:35.000 Well, I think, you know, some of the stuff that we're working on now is just, like, moving away from phone numbers, so you can have, like, you know, a username that you can post more publicly.
02:05:41.000 And then, you know, we have groups, now you have group links.
02:05:43.000 And then, you know, maybe we can do something with events.
02:05:45.000 And, you know, we're sort of moving in the direction of going from an app that's good for communicating with connections you already have to an app that's also good for creating new connections.
02:05:57.000 Do you think that social media would be better served with the algorithms that are in place, with the mechanisms for determining what's trending in place, and with their trust and safety or whatever content moderation policy they have now, or with it wide open?
02:06:22.000 Wild West.
02:06:24.000 I mean, I think it depends on when you say like better, you know, better for what, right?
02:06:29.000 Better for humanity.
02:06:30.000 Yeah, no.
02:06:31.000 I think...
02:06:32.000 Censorship is better?
02:06:33.000 No, no, no.
02:06:34.000 I think the problem...
02:06:36.000 I think bad business models create bad technology, which has bad outcomes.
02:06:40.000 You know, that's the problem we have today, right?
02:06:41.000 So the problem is that there's a financial incentive for them to...
02:06:45.000 That if we, you know, if you look at, like, the metrics, you know, that we talked about, like, you know, what Facebook cares about is just, like, time that you spent looking at the screen on Facebook, you know?
02:06:54.000 Like, if we were to have metrics, if Signal were to have metrics, you know, our metrics would be, like, what we want is for you to use the app as little as possible, for you to actually have the app open as little as possible, but for the velocity of information to be as high as possible.
02:07:07.000 So it's like you're getting maximum utility.
02:07:09.000 You're spending as little time possible looking at this thing while getting as much out of it as you can.
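A rough sketch of that alternative metric, under the assumption that "velocity of information" can be approximated as messages delivered per second the app is open; the function names and formula are illustrative, not Signal's actual metrics.

```python
# Illustrative sketch: contrasting an engagement-style metric (reward screen
# time) with a utility-style metric (reward information delivered per second
# of screen time). Function names and the formula are assumptions.

def time_on_screen(seconds_app_open: float) -> float:
    """Engagement-style metric: more time in the app scores higher."""
    return seconds_app_open


def information_velocity(messages_delivered: int, seconds_app_open: float) -> float:
    """Utility-style metric: messages delivered per second spent in the app."""
    if seconds_app_open <= 0:
        return 0.0
    return messages_delivered / seconds_app_open


# Delivering 50 messages in 1 minute scores ~0.83; the same 50 messages spread
# over 10 minutes scores ~0.08. time_on_screen rewards exactly the opposite.
print(information_velocity(50, 60), information_velocity(50, 600))
print(time_on_screen(60), time_on_screen(600))
```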
02:07:13.000 How could that be engineered, do you think?
02:07:15.000 That's what we're trying to do.
02:07:16.000 So you're trying to do that with a social media app as well?
02:07:18.000 Well, I mean, you know, we're sort of moving in that direction, right?
02:07:21.000 And it's like, and I think once you start from the principle of like, well, we don't have to have infinite growth.
02:07:27.000 We don't actually have to have profit.
02:07:30.000 We don't have to return.
02:07:31.000 We're not accountable to investors.
02:07:32.000 We don't have to, you know, satisfy public markets.
02:07:36.000 We also don't have to build a pyramid scheme where we have like, you know, 2 billion users so that we can monetize them to like, you know, a few hundred thousand advertisers so that we can, you know, like we don't have to do any of that.
02:07:46.000 And so We have the freedom to pick the metrics that we think are the ways that we think technology should work, that we think will better serve all of us.
02:07:55.000 So what would serve us better is a bunch of wild hippies like yourself, that don't want to make any money at all, putting together a social media app.
02:08:03.000 If you work at Signal, you get paid.
02:08:05.000 Oh yeah, I'm sure.
02:08:06.000 But I don't mean the company itself as a corporation.
02:08:12.000 You get paid, but that's it.
02:08:15.000 And how do you generate the income?
02:08:17.000 Well, you know, we do it by, like, tying ourselves to a community of users instead of advertisers, right?
02:08:23.000 So, where's the money coming from, though?
02:08:25.000 From people who use Signal.
02:08:26.000 So, similar to, like...
02:08:28.000 Do they pay for it?
02:08:29.000 No, no, it's, like, donation-based.
02:08:31.000 It's similar to, like, Wikimedia.
02:08:32.000 Oh, okay.
02:08:32.000 You know, it's, like, you know, Wikipedia exists.
02:08:34.000 There's no company.
02:08:35.000 There's no...
02:08:36.000 Well, that would be great if they could figure out a way to develop some sort of social media platform that just operated on donations and could rival the ones that are operating on advertising revenue.
02:08:48.000 Because I agree with you that that creates a giant problem.
02:08:53.000 And that's what we're working on, slowly.
02:08:57.000 So you just look at it in terms of bad business model equals bad outcome.
02:09:04.000 That's how you look at all these.
02:09:06.000 And it's also, by the way, why we have people mining cobalt in Congo.
02:09:12.000 And you don't think that they can regulate their way out of this situation.
02:09:16.000 With technology, I'm not super optimistic, yeah.
02:09:20.000 Just based on, you know, and even the hearings, you know, it's just like amateur hour.
02:09:24.000 So do you think that, yes, the hearings were amateur hour, yeah, when, yeah, there were some ridiculous questions.
02:09:32.000 Yeah, I mean, it's just like, they're talking to the wrong people.
02:09:35.000 They don't understand how stuff works.
02:09:37.000 They don't know that it's not Google, that's Apple.
02:09:37.000 You've been prepped for this.
02:09:39.000 Don't you have a team of people who...
02:09:41.000 Yeah.
02:09:42.000 Come on.
02:09:43.000 Yeah.
02:09:43.000 It's fascinating to watch, right?
02:09:46.000 It's like your dad who doesn't know how to...
02:09:48.000 How do I get the email?
02:09:51.000 It's like these people are not going to save us, man.
02:09:53.000 You know, and it's like anything that they do will probably just make things worse.
02:09:55.000 Do you think that it's a valid argument that conservatives have though?
02:09:58.000 That they're being censored and that their voice is not being heard?
02:10:02.000 I know what you said in terms of, you know, that if someone had something on YouTube that said that gay people are unhuman and they should be abolished and banned and delete that video.
02:10:15.000 I get that perspective.
02:10:18.000 But I think there's other perspectives, like the Unity 2020 perspective, which is not in any way negative.
02:10:25.000 Yeah, I mean, I don't know what happened with that, but I feel like what I... I think it could be a part of this thing of just like, well, we create this policy and we have these...
02:10:33.000 You know, we define things this way, and then a lot of stuff just gets caught up in it.
02:10:36.000 You know, where it's just like, now you're like taking down content about the Uyghurs because you wanted to do something else.
02:10:40.000 You know, that if people would just be more honest about, like, there is not really an objectivity...
02:10:44.000 And, you know, we're looking for these specific outcomes and this is why that I think, you know, maybe we would have better results.
02:10:50.000 Well, how does one fix this, though?
02:10:52.000 How does one, like, you worked at Twitter, you kind of understand these better than most, these social media platforms.
02:10:59.000 How would one fix this?
02:11:01.000 If they hired you, if they said, hey, Moxie, we're kind of fucked.
02:11:05.000 We don't know how to fix this.
02:11:09.000 Well, uh...
02:11:10.000 Is there a way?
02:11:11.000 Because it seems like they make so much money.
02:11:13.000 Yeah.
02:11:13.000 If you came along and said, yeah, well, you gotta stop making money.
02:11:16.000 They'd be like, get rid of that fucking nut.
02:11:18.000 Exactly, exactly.
02:11:18.000 Look at him, goddamn sailor.
02:11:20.000 Yeah.
02:11:20.000 What's he talking about?
02:11:21.000 What is he talking about?
02:11:22.000 Fuck out of here.
02:11:23.000 Stop making money.
02:11:24.000 Yeah.
02:11:24.000 What, you wanna play rock, paper, scissors?
02:11:26.000 Yeah.
02:11:28.000 You're crazy, man.
02:11:30.000 How do you fix this?
02:11:32.000 I mean, one thing I'm actually a little encouraged by is the organizing unionization stuff that's been happening in the tech industry.
02:11:42.000 So there's been a couple of walkouts and there's some increased communication among tech workers.
02:11:49.000 Normally you think about...
02:11:50.000 I'm not totally aware of this.
02:11:52.000 What have they been organizing and unionization about?
02:11:57.000 Well, normally you think about unionization as a process for improving material conditions for workers.
02:12:03.000 And there's some aspect of this in the organizing that's been happening.
02:12:06.000 Where have they been doing this?
02:12:09.000 Google is the big, where a lot of the activity has happened, but it's happening across the industry.
02:12:16.000 What are their objectives at Google?
02:12:18.000 At Google, there were some walkouts.
02:12:21.000 The objectives...
02:12:22.000 You should talk to Meredith Whitaker about this, actually.
02:12:25.000 She's really smart and has a lot to say.
02:12:28.000 Shout out to Meredith.
02:12:30.000 She and other people working there and...
02:12:35.000 They were organizing for, one, trying to apply the protections and benefits that full-time workers there had to a lot of the temporary workers, like the people who work in security, the people who are working in the cafeteria, the people who are driving buses and stuff like that,
02:12:51.000 who are living a lot more precariously.
02:12:54.000 But also for creative control over how the technology that they're producing is used.
02:13:01.000 So Google was involved in some military contracts that were pretty sketch.
02:13:07.000 Like applying machine learning AI stuff to military technology.
02:13:11.000 And then finally, there had been a lot of high-profile sexual harassment incidents at Google where the perpetrators of sexual harassment were usually paid large severances in order to leave.
02:13:30.000 And so they had a list of demands.
02:13:33.000 And they, like a lot of people walked out.
02:13:35.000 I don't know what the numbers were, but a lot of people, they managed to organize internally and walked out.
02:13:39.000 And I think stuff like that is encouraging because, you know, it's like we look at the hearings and it's like the people in Congress don't even know who's the right person to talk to.
02:13:51.000 You know, it's like, you know, old people talking about... But isn't that another issue, where you're going to have people who have an ideological perspective?
02:14:13.000 And that may be opposed to people that have a different ideological perspective, but they're sort of disproportionately represented on the left in these social media corporations.
02:14:24.000 When you get kids that come out of school, they have degrees in tech, or they're interested in tech, they tend to almost universally lean left.
02:14:34.000 Maybe, but I think most...
02:14:36.000 Like, when it comes to the technology, I don't think people are...
02:14:42.000 I think what almost everyone can agree on is that the amount of money and resources that we're putting into surveillance, into ad tech, into these algorithms that are just about increasing engagement, that they're just not good for the world.
02:14:55.000 And if you put a different CEO in charge...
02:14:57.000 That person's just going to get fired.
02:14:59.000 But if the entire company organizes together and says, no, this is what we want.
02:15:03.000 This is how we want to allocate resources.
02:15:05.000 This is how we want to create the world, then you can't fire all those people.
02:15:11.000 I understand what you're saying.
02:15:11.000 So they'd have to get together and unionize and have a very distinct mandate, very clear that we want to go back to do no evil or whatever the fuck it used to be.
02:15:24.000 Right.
02:15:25.000 Yeah, where they don't really have that as a big sign anymore.
02:15:29.000 Do you think that would really have an impact, though?
02:15:31.000 I mean, it seems like the amount of money, when you find out the amount of money that's being generated by Google and Facebook and YouTube, the numbers are so staggering that to shut that valve off, to like...
02:15:48.000 To shut that spout, good luck.
02:15:51.000 It's almost like it had to have been engineered from the beginning, like what you're doing at Signal.
02:15:57.000 Like someone had to look at it from the beginning and go, you know what, if we rely on advertiser revenue, we're going to have a real problem.
02:16:04.000 Yeah.
02:16:05.000 And I think, but I think it's, yeah, exactly.
02:16:07.000 I mean, you know, I think you're right.
02:16:09.000 And there's, you know, part of the problem with just relying on tech workers to organize themselves is that they are shareholders of these companies.
02:16:17.000 You know, they have a financial stake in their outcome.
02:16:19.000 And so that influences the way that they think about things.
02:16:23.000 But...
02:16:26.000 I think another aspect to all of this is that I think people underestimate just how expensive it is to make software.
02:16:32.000 And another thing that I think would really improve things is making software cheaper.
02:16:38.000 Right now, it's moving in the opposite direction.
02:16:40.000 It's getting harder, more expensive to produce software.
02:16:42.000 How so?
02:16:44.000 It used to be that if you wrote a piece of software, you just wrote it once for the computer.
02:16:49.000 And then that was your software.
02:16:51.000 Now if you want to write a piece of software, you have to write it at least three times.
02:16:53.000 You have to write it for iPhone.
02:16:55.000 You have to write it for Android.
02:16:56.000 You have to write it for the web.
02:16:58.000 Maybe you need a desktop client.
02:17:00.000 So it's like you need three or four times the energy that you used to have.
02:17:07.000 And the way that software works...
02:17:14.000 Not worth going into.
02:17:14.000 But it's getting more expensive.
02:17:16.000 What do you personally use?
02:17:18.000 Are you one of those minimalist dudes?
02:17:20.000 I notice you have a little tiny notebook here.
02:17:23.000 Oh, yeah, yeah.
02:17:23.000 And then you have two phones.
02:17:25.000 Yeah, I'm like...
02:17:26.000 I have to have...
02:17:28.000 I try to be like...
02:17:30.000 I just want to...
02:17:31.000 You're also one of those renegades with no phone case.
02:17:33.000 Oh, yeah, man.
02:17:34.000 I feel like that's like...
02:17:35.000 You and Jamie should get together and talk about it.
02:17:38.000 He's radical.
02:17:38.000 I mean, it's like people...
02:17:39.000 You know, it's like industrial designers put all of that effort into creating that thing.
02:17:43.000 Yeah, it's made of fucking glass and it costs a thousand bucks.
02:17:46.000 If you drop it with this thing on it, it doesn't get hurt.
02:17:50.000 And see this?
02:17:50.000 This little thing right here?
02:17:51.000 See, I stick my finger in there and then I can use it.
02:17:54.000 I can text better.
02:17:55.000 Really good.
02:17:56.000 Yeah.
02:17:57.000 And then also, if I want to watch a video, that'll prop it up.
02:18:00.000 You know?
02:18:03.000 You know how that works?
02:18:07.000 Isn't that better than no case?
02:18:09.000 I mean, some things I actually want to make more difficult for myself.
02:18:12.000 But I have two phones just because I'm trying to...
02:18:14.000 I always just want to keep tabs on how everything works everywhere.
02:18:17.000 So you have an Android and an iPhone?
02:18:20.000 Do you keep things stripped down?
02:18:23.000 No, I'm pretty...
02:18:24.000 I mean, I don't actually use...
02:18:27.000 TikTok?
02:18:29.000 Well, okay, my problem is that, like, I spend all day.
02:18:32.000 I think, you know, sometimes I go through this thing where, like, cryptography will be, like, in the news or something.
02:18:38.000 There'll be some, like, geopolitical thing that's happening.
02:18:40.000 And someone like, you know, Vice or something will get in touch with me.
02:18:42.000 And they'll be like, hey, we want to do a thing, like a video where, like, we follow you around for a day, like a day in the life.
02:18:48.000 You know, it's because it's so exciting.
02:18:49.000 Sounds good for them.
02:18:50.000 Annoying for you.
02:18:51.000 Well, the thing I'll usually write back is, like, okay, here's the video.
02:18:56.000 Me sitting in front of a computer for eight hours.
02:18:59.000 And they're like, oh, we can't make that video.
02:19:04.000 No one would want to watch that.
02:19:05.000 Yeah, what we need to do is take you to a yoga class and you go to an organic food store and you talk to people about their rights and then...
02:19:13.000 Yeah, exactly.
02:19:16.000 Unfortunately, I don't even want to watch the movie in my own life.
02:19:20.000 So that is my life.
02:19:23.000 I spend so much time looking at a computer for work that I... It's hard for me to continue looking at screens and stuff.
02:19:31.000 Yeah, I can only imagine.
02:19:34.000 But I try to be, like, a normal...
02:19:35.000 Like, there's this, like, in the history of people who are, like, doing...
02:19:39.000 Like, building cryptography, stuff like that, there was this period of time where the thesis was basically, like, all right, what we're going to do is develop really powerful tools for ourselves, and then we're going to teach everyone to be like us, you know?
02:19:52.000 And that didn't work because, you know, we didn't really anticipate the way that computers were going.
02:19:57.000 So I try to be, like, as normal as possible.
02:20:00.000 You know, I just, like, have, like, a normal setup.
02:20:02.000 I'm not, like, you know, I haven't...
02:20:03.000 I used to have a cell phone where I soldered the microphone differently so there was a hard switch that you could turn it off.
02:20:09.000 Really?
02:20:09.000 You did that?
02:20:10.000 Yeah, because you start thinking about how all this stuff works.
02:20:14.000 Do you ever fuck around with Linux phones or anything like that?
02:20:17.000 No.
02:20:17.000 I try to be normal.
02:20:20.000 I still do run Linux on a desktop just because I've been doing that forever.
02:20:26.000 And you keep a moleskin for what?
02:20:28.000 Just notes.
02:20:29.000 You don't put them on your phone?
02:20:32.000 Sometimes I do, but I like writing more, I guess.
02:20:35.000 Okay, so you just do it just because you enjoy it.
02:20:37.000 Yeah, but I guess, you know, you're right.
02:20:39.000 Maybe, you know, like, I feel the forces of darkness are not going to compromise.
02:20:47.000 Yeah.
02:20:48.000 Do you feel like you have extra scrutiny on you because of the fact that you're involved in this messaging application that Glenn Greenwald and Edward Snowden and a bunch of other people that are seriously concerned with...
02:21:05.000 Security and privacy that maybe people are upset at you?
02:21:11.000 That you've created something that allows people to share encrypted messages?
02:21:19.000 I mean, maybe.
02:21:20.000 I mean, I think...
02:21:21.000 Because you've kind of cut out the middleman, right?
02:21:23.000 You've cut out the third-party door.
02:21:26.000 Yeah.
02:21:27.000 And I think...
02:21:27.000 But in some ways, that means that there's less pressure on me, because, you know, it's like if you're the creator of Facebook Messenger and your computer gets hacked, like, that's it, everyone's Facebook messages are, you know, gone.
02:21:39.000 Yeah.
02:21:40.000 And, you know, for me, if, like, my computer gets hacked, I can't access anyone's Signal messages whether I get hacked or not, you know?
02:21:47.000 Right.
02:21:47.000 And so I have sort of less liability in that sense.
02:21:50.000 There was, like, a weird period of time where it was very difficult for me to fly commercially, like, on an airplane.
02:21:58.000 And I don't know why.
02:22:00.000 I think it had something to do with a talk that someone gave about WikiLeaks once, and they mentioned my name.
02:22:06.000 And after that...
02:22:07.000 You were getting flagged?
02:22:08.000 Yeah, it was very annoying.
02:22:09.000 I would go to the airport and I wouldn't be able to print a ticket at the kiosk.
02:22:13.000 I had to go talk to a person.
02:22:14.000 They had to call some phone number that would appear on their screen and then wait on hold for like 45 minutes to an hour to get approval.
02:22:21.000 And then they would get approval to print the ticket.
02:22:22.000 So you had to anticipate this when you travel?
02:22:24.000 Yeah.
02:22:25.000 So you had to go there way in advance?
02:22:26.000 Way in advance.
02:22:27.000 And then anytime I traveled internationally, on the way back through customs, they would seize all of the electronics that I had.
02:22:33.000 Jeez.
02:22:34.000 The US government would do this?
02:22:35.000 Yeah.
02:22:36.000 Customs and Border Protection.
02:22:37.000 They would seize your shit and would you get it back?
02:22:40.000 They would eventually send it back, but you just had to throw it out because it's not...
02:22:44.000 Who knows what they did to it, you know?
02:22:46.000 I would want to give it to someone and go, hey, tell me what they did.
02:22:50.000 Could you do that?
02:22:51.000 Is it possible to back-engineer whatever?
02:22:54.000 I never spent time on it.
02:22:56.000 How much time did they have your shit for?
02:22:58.000 It would be like weeks.
02:23:00.000 Weeks?!
02:23:00.000 Weeks!
02:23:02.000 Did you have to give them passwords and everything?
02:23:04.000 Well, that's the thing.
02:23:04.000 They would stop you, and they would be like, hey, we just need you to type in your password here so that we can get through the full disk encryption.
02:23:10.000 And I would be like, no.
02:23:11.000 And they would be like, well, if you don't do that, we're going to take this, and we're going to send it to our lab, and they're going to get it anyway.
02:23:16.000 And I would be like, no, they're not.
02:23:18.000 And they would be like, all right, we're going to take it.
02:23:20.000 You're not going to have your stuff for a while.
02:23:21.000 You sure you don't want to type in your password?
02:23:22.000 I would be like, nope.
02:23:23.000 And then it would disappear, and it would come back weeks later, and then it's like, How bizarre.
02:23:28.000 Yeah.
02:23:29.000 And with...
02:23:29.000 There was no...
02:23:32.000 They didn't have like a motive.
02:23:34.000 There was no...
02:23:35.000 That's the thing.
02:23:35.000 You never know why.
02:23:36.000 No, but I'm saying they didn't say, hey, you were...
02:23:40.000 You're thought to have done this or there's some...
02:23:45.000 No, they would always just be like, oh no, this is just random or whatever.
02:23:47.000 But there would be two people at the exit of the plane with photographs of me, you know, waiting for me to step off the plane and they would escort.
02:23:53.000 They wouldn't even wait for me to get to the...
02:23:55.000 So did you have to have like a burner laptop?
02:23:58.000 I just wouldn't travel with electronics, you know, because it was just...
02:24:00.000 Even your phone?
02:24:01.000 Yeah.
02:24:02.000 Yeah, I gave them my phone.
02:24:03.000 Oh, fuck.
02:24:05.000 Wow.
02:24:06.000 That was only internationally, though, because they can't do that domestically.
02:24:10.000 So domestically, you just had long waits, and then they'd eventually give you a ticket?
02:24:16.000 Yeah, they would eventually give you a ticket and then you'd get the selective screening where they would take all the stuff out of your bag and, like, you know, go through your carry-on.
02:24:22.000 They'd touch your dick too, right?
02:24:24.000 And then at every connection, the TSA would come to the gate of the connecting thing, even though you're already behind security, and do it again at the connection.
02:24:32.000 Really?
02:24:32.000 Yeah, I don't know.
02:24:33.000 It was weird.
02:24:34.000 It was just like a...
02:24:34.000 Connections too?
02:24:36.000 Yeah, yeah.
02:24:37.000 So they're trying to fuck with you?
02:24:39.000 I think so, yeah.
02:24:39.000 I don't know what that was.
02:24:40.000 And how long did that last for?
02:24:41.000 That was a couple years.
02:24:43.000 Yeah.
02:24:44.000 And when did it go away?
02:24:45.000 The day it went away, were you like, ooh?
02:24:46.000 Yep.
02:24:47.000 Yeah, one day it just stopped.
02:24:48.000 It really did change the game.
02:24:50.000 What year did it go away?
02:24:51.000 When Trump got into office?
02:24:53.000 No, it was way before that.
02:24:55.000 Yeah, I forget.
02:24:56.000 Yeah, I forget.
02:24:57.000 Yeah, I was thinking, actually, I was thinking on the way here, it's funny how, like, I remember after the last election, everyone was talking about, like, California leaving the United States.
02:25:07.000 Like, California seceding.
02:25:08.000 You remember that?
02:25:09.000 Yeah, hilarious.
02:25:10.000 And now everyone's talking about leaving California, like, after this election.
02:25:13.000 Yeah, imagine that.
02:25:14.000 President Newsom.
02:25:15.000 Yeah.
02:25:16.000 Locked down in a communist state.
02:25:18.000 But do you remember...
02:25:19.000 People discovered that the CalExit, the whole CalExit movement was started by a guy that lived in Russia.
02:25:24.000 Oh, it was one of those IRA things?
02:25:27.000 Internet Research Agency scams?
02:25:29.000 But it wasn't.
02:25:29.000 I actually tracked the guy down in Moscow one time.
02:25:33.000 You tracked him down?
02:25:34.000 He was just some guy.
02:25:36.000 Did he do it for goof?
02:25:38.000 No, he, like, really believes.
02:25:40.000 That California should leave?
02:25:41.000 Yeah, he, like, he lived in California and had been, for years, like, trying to foment this CalExit thing.
02:25:48.000 And he has all the stats on, you know, why it would be better for California and all this stuff, you know.
02:25:53.000 And then he sort of thought, well, this isn't working.
02:25:56.000 And he really liked Russia for some reason.
02:25:58.000 So he moved to Russia just before the election, not knowing what was going to happen.
02:26:02.000 And then when Trump won, people were like, wait a second, fuck this.
02:26:05.000 Maybe California should get out of here.
02:26:07.000 And they just found this...
02:26:09.000 Like, a campaign that already existed, and everyone sort of got behind it, and he was just like, oh shit, and he lives in Russia now, you know? But he didn't really understand optics, I think. The way that everyone found out that he lived in Russia was that he opened a California embassy in Moscow. They announced, you know, CalExit has opened the first California embassy in a foreign country, but it was in Moscow, and this was right as all the Russian stuff was happening, you know? So
02:26:41.000 if you're conspiratorially minded, you'd have drawn some incorrect conclusions.
02:26:44.000 Yeah, he was just...
02:26:46.000 I met with him.
02:26:48.000 I like hanging out with him for a day.
02:26:49.000 I think he really genuinely just...
02:26:51.000 So what was your motivation to hang out with this guy for a whole day?
02:26:55.000 I mean, I was just fascinated, you know, because here's this guy who's, like, doing this kind of ambitious thing, and it just, the optics seem so bad, you know?
02:27:00.000 Yeah.
02:27:00.000 I think he reminded me of, like, the Hannah Arendt quote that's like, you know, if the essence of power is deceit, does that mean that the essence of impotence is truth?
02:27:10.000 You know, that, like...
02:27:11.000 He sort of believed that just, like, the facts were enough.
02:27:15.000 You know, it's just, like, the stats of just, like, yeah, we spend this much money on, like, defense spending.
02:27:20.000 If we, you know, if we stopped, you know, it's like we would have, like...
02:27:23.000 So much money if California was a country.
02:27:24.000 And we would still have, like, the fourth largest military in the world.
02:27:27.000 And, you know, we would have, like...
02:27:28.000 You know, it's just, like, the numbers actually are compelling, you know?
02:27:31.000 And it was just sort of, like, that's...
02:27:32.000 You know, people will just see the truth, you know?
02:27:34.000 And I was like, dude, I think maybe you should, like, not live in Russia anymore, you know?
02:27:39.000 It was...
02:27:40.000 Dude.
02:27:41.000 Why did he go to Russia?
02:27:44.000 I don't know.
02:27:44.000 He had been teaching English, and I think he just sort of ended up liking Russia.
02:27:50.000 And so, yeah, he just decided to move there.
02:27:53.000 I was on the way with a friend to Abkhazia.
02:27:57.000 Have you ever heard of that place?
02:27:58.000 No.
02:27:59.000 It's an autonomous region of the country of Georgia.
02:28:05.000 And it's kind of interesting.
02:28:06.000 There's all these autonomous regions in the world that are essentially their own countries, you know, but they're not recognized by the UN or other countries, you know.
02:28:14.000 Like Texas.
02:28:15.000 You're in one right now.
02:28:16.000 I mean, these places are like, you know, militarized border, like they have their own, you know.
02:28:21.000 But they're not recognized by the UN. Yeah.
02:28:24.000 And so they all recognize each other.
02:28:26.000 And it's like, if you want to be a country, it's kind of interesting.
02:28:30.000 You need a lot of stuff.
02:28:31.000 You need a flag.
02:28:33.000 You need a national bird.
02:28:35.000 You need an anthem or whatever.
02:28:36.000 And you need a soccer team.
02:28:38.000 You definitely have to have a soccer team.
02:28:40.000 Interesting.
02:28:41.000 So these countries all have their own soccer teams, but they can't play in FIFA because they're not recognized by the UN. So FIFA can't recognize them.
02:28:48.000 So they have their own league.
02:28:49.000 It's like the league of unrecognized states and stateless peoples.
02:28:54.000 And they have their own World Cup.
02:28:55.000 And they have the World Cup in Abkhazia.
02:28:58.000 How many different countries are there that are like this?
02:29:00.000 There are a lot.
02:29:01.000 How many?
02:29:01.000 I mean, I don't know.
02:29:02.000 I don't know how many teams are in this league, called CONIFA.
02:29:06.000 I mean, it's 20 plus.
02:29:08.000 So there's 20-plus unrecognized countries or autonomous regions.
02:29:13.000 And also stateless people, so like the Kurds.
02:29:17.000 People from the Chagos Islands, who were basically evicted for a U.S. military base and now live in diaspora.
02:29:23.000 You know, places like Somaliland, Transnistria, South Ossetia, Lapalandia, you know, like, it's kind of interesting.
02:29:32.000 So I went with a friend to Abkhazia for the World Cup of all the unrecognized states.
02:29:36.000 How was that?
02:29:38.000 It was awesome.
02:29:39.000 Yeah?
02:29:39.000 It was like, yeah, it was really interesting.
02:29:41.000 I mean...
02:29:41.000 The smile on your face.
02:29:42.000 This is the biggest smile you've had the entire show.
02:29:46.000 It sounds like it was a great time.
02:29:48.000 I mean, it just is so fascinating to me.
02:29:51.000 And...
02:29:52.000 I think it's, like, an interesting, you know, it's, like, in a way that I feel like, you know, society moves by, like, pushing at the edges, you know, that, like, it's the fringes that end up moving the center.
02:30:02.000 I feel like, you know, looking at the margins of the way politics works is an interesting view of, like, how everything else works, you know, that, like, going to Abkhazia, it was so crazy getting there, you know, it's like, You know, we travel all through Russia.
02:30:17.000 We get to this, like, militarized border.
02:30:19.000 You go through these three checkpoints that aren't supposed to exist, but obviously exist.
02:30:23.000 You know, you get to the other side, and it's just the same as where you just were.
02:30:28.000 You know, you guys fought a brutal civil war, you know, with, like, genocide, like, full-on, you know, like, crazy shit.
02:30:36.000 And...
02:30:37.000 It's just kind of the same.
02:30:39.000 Was it worth it?
02:30:40.000 What's the deal?
02:30:41.000 I feel like it's this thing you see again and again of the institutions that we're familiar with in the world that exists are the institutions of kings.
02:30:52.000 It's like police, military, a legal apparatus, tax collectors.
02:30:58.000 Every moment in history since then has been about trying to change ownership of those institutions.
02:31:05.000 Hmm.
02:31:05.000 And it's always sort of dissatisfying, you know?
02:31:08.000 And, like, you know, just seeing that happen again and again.
02:31:11.000 And just, like, you know, realizing that it's like maybe what we should be doing is actually trying to get rid of these institutions or change these institutions in some way, you know?
02:31:18.000 Don't you think there's a very slow rate of progress, but ultimately progress?
02:31:24.000 If you follow Pinker's work, it looks at all the various metrics like murder, rape, racism, crime, all these different things.
02:31:33.000 Over time, we're clearly moving in a better direction.
02:31:40.000 Do you think it's just like...
02:31:41.000 You know, I was listening to this podcast today.
02:31:45.000 We were talking about religion, and it was discussing the Bible, and they were talking about all the different stories that are in the Bible, many of them that are hundreds of years apart, that were collected and put into that.
02:32:01.000 Just stop and think about a book that was written literally before...
02:32:08.000 The Constitution was drafted, and that book is being introduced today as gospel.
02:32:16.000 And that there's a new book that's going to be written 200 years from now, and that will be attached to the new version of the Bible as well.
02:32:24.000 And then one day someone will come across this, and it will all be interpreted as the will and the words of God that all came about.
02:32:35.000 Yeah.
02:32:38.000 Yeah, yeah, yeah.
02:32:42.000 Yeah.
02:32:48.000 Alan Turing in the 1950s being chemically castrated for being gay to, in my lifetime, seeing gay marriage as being something that was very fringe when I was a boy living in San Francisco to universal across the United States today,
02:33:06.000 at least mostly accepted by the populace, right?
02:33:10.000 That this is a very short amount of time where a big change has happened.
02:33:13.000 And that these changes are coming quicker and quicker and quicker.
02:33:16.000 I would hope that this is a trend that is moving in the correct direction.
02:33:21.000 Yeah, certainly there are some things that are getting better, yeah.
02:33:24.000 And I feel like, to me, it's important to, you know, for a lot of those things, like the things you mentioned, like gay marriage, I think it's important to realize that, like, a lot of those, a lot of that progress would not have happened without the ability to break the law, honestly.
02:33:39.000 Right, right.
02:33:41.000 How would anyone have known that we wanted to allow same-sex marriage if no one had been able to have a same-sex relationship because sodomy laws had been perfectly enforced?
02:33:50.000 How would we know that we want to legalize marijuana if no one had ever been able to consume marijuana?
02:33:56.000 So I think a lot of the fear around increased surveillance is that that space dissipates.
02:34:07.000 Yes.
02:34:07.000 Yeah.
02:34:08.000 But, you know, on the other hand, you know, it's like we're living in the apocalypse, you know, that it's like if you took someone from 200 years ago who used to be able to just walk up to the Klamath River and dump a bucket in the water and pull out, you know, 12 salmon and that was, you know, their food.
02:34:22.000 And you were like, oh, yeah, the way it works today is you go to Whole Foods and it's $20 a pound and it's, you know, pretty good.
02:34:27.000 You know, they'd be like, what have you done?
02:34:28.000 Oh, my God.
02:34:29.000 You used to be able to walk across the backs of the salmon, you know, across the whole river.
02:34:34.000 Well, we're trying to avoid slipping even further into that apocalypse.
02:34:38.000 I don't know if you've followed what's going on in the Bristol Bay of Alaska with the Pebble Mine.
02:34:43.000 No.
02:34:43.000 It's crazy.
02:34:44.000 They're trying to...
02:34:46.000 According to what Joe Biden said when he was running for office, when he's in office, that will not happen.
02:34:54.000 But they're trying to...
02:34:58.000 Do essentially the biggest mine in the world that would destroy the salmon population.
02:35:04.000 It would destroy it.
02:35:06.000 It would literally wipe out a gigantic, not just a gigantic industry, but a gigantic chunk of the salmon.
02:35:14.000 I think it's...
02:35:16.000 I forget which kind of salmon it is.
02:35:19.000 I don't want to say it's Chinook.
02:35:21.000 I forget what kind of salmon it is, but it's the biggest population.
02:35:25.000 Sockeye, thank you.
02:35:26.000 The biggest population of them, certainly in America, but I think in the world.
02:35:32.000 I think it's responsible for an enormous number of jobs.
02:35:37.000 Apparently there's fucking billions of dollars worth of gold and copper down there.
02:35:43.000 Yeah.
02:35:45.000 Earthworks.
02:35:45.000 What's at stake?
02:35:45.000 An average of 40 to 50 million wild salmon make the epic migration from the ocean to the headwaters of the Bristol Bay every year.
02:35:53.000 Like, no place on Earth.
02:35:54.000 The Bristol Bay watershed.
02:35:55.000 They've been working to try to make this mine a reality for, I think, a couple of decades now.
02:36:01.000 And people have been fighting tirelessly to educate people on what a devastating impact this is going to have on the ecology of that area and the fact that the environment will be permanently devastated.
02:36:13.000 There's no way of bringing this back and there's no way of doing this without destroying the environment.
02:36:18.000 Because the specific style of mining that they have to employ in order to pull that copper and gold out of the ground involves going deep, deep into the earth to find these reservoirs of gold and copper and sulfur that they have to go through, and then they have to remove the waste.
02:36:35.000 And mining companies have invested hundreds of millions of dollars in this and then abandoned it.
02:36:41.000 So they were like, we can't.
02:36:42.000 We can't fucking do this.
02:36:43.000 And then people are like, we can do it.
02:36:45.000 And then they've got...
02:36:46.000 And it's other companies that are...
02:36:48.000 I don't believe the company that's currently involved in this is even an American company.
02:36:52.000 I think it's a...
02:36:53.000 It's a foreign company that's trying to...
02:36:55.000 I think they're from Canada that are trying to do this spectacular...
02:36:59.000 I don't know which company it is, but it's...
02:37:02.000 My friend Steve Rinella from the Meat Eater podcast.
02:37:05.000 I want to recommend this podcast because he's got a particular episode on that where he talks about it.
02:37:13.000 Let me find it real quick.
02:37:15.000 Because it's...
02:37:16.000 It's pretty epic where he talks to this one guy who's dedicated the last 20 years of his life trying to fight this.
02:37:24.000 Let me just find it real quick because it's pretty intense.
02:37:29.000 And it's terrifying when you see how close it's come to actually being implemented and how if it happens, there's no way you pull that back.
02:37:39.000 Once they do it...
02:37:40.000 It's like all that Standing Rock shit.
02:37:43.000 They were like, no, the pipeline's going to be fine.
02:37:45.000 No way that it leaks into the water or whatever.
02:37:47.000 It's like, sure enough.
02:37:48.000 Exactly.
02:37:50.000 Unfortunately, I've already listened to it, so I'm having a hard time finding it in this app.
02:37:53.000 It's motherfuckers.
02:37:57.000 Did you find it?
02:38:01.000 Yeah, previously played...
02:38:03.000 Yeah, Half-Life of Never.
02:38:06.000 It's the October 5th episode.
02:38:09.000 That's a good title.
02:38:10.000 Yeah, and the gentleman's name is Tim Bristol, which is kind of crazy.
02:38:14.000 That is his birth name.
02:38:16.000 His name is Tim Bristol, and he's dealing with his Bristol Bay situation.
02:38:21.000 I mean, it's just a random coincidence.
02:38:24.000 Yeah, and you read all that shit about the...
02:38:26.000 Episode 241. Like when they were building all the dams in California, and it's just like the salmon just bashed themselves to death.
02:38:34.000 They had to set them on fire.
02:38:36.000 Seattle, yeah.
02:38:37.000 Same thing that happened up in Seattle.
02:38:38.000 These knuckleheads, they didn't understand the migration.
02:38:41.000 These salmon won't go anywhere else.
02:38:44.000 They have one specific river where they were born, and that's where they will die and spawn.
02:38:49.000 Ugh, it's crazy.
02:38:51.000 But these assholes that just want copper and gold are willing to do this.
02:38:54.000 And there was this one politician in particular that has a gigantic windfall, if he can pull this off, or a lobbyist or whatever the fuck he is.
02:39:02.000 But he stands to make, I think they said $14 million if he can actually get the shovels into the ground.
02:39:09.000 That's how much he earns.
02:39:12.000 So what are we going to do about it?
02:39:13.000 Kill that guy!
02:39:15.000 Assassination politics.
02:39:16.000 Yes.
02:39:17.000 Kill them all.
02:39:18.000 No.
02:39:19.000 I'm kidding.
02:39:20.000 Don't get me in trouble.
02:39:22.000 You can get banned off of YouTube for saying something like that.
02:39:24.000 I'm joking.
02:39:25.000 What should we do?
02:39:26.000 We should make people aware of it and make people aware that there are real consequences to...
02:39:33.000 We're good to go.
02:39:51.000 It's really that simple.
02:39:52.000 I mean, we are getting along fine without that copper and without that gold, and we are using the resource of the salmon, and people are employed that are enjoying that resource, and they're also able to go there and see the bears eating the salmon and seeing this incredible wild place.
02:40:13.000 Alaska is one of the few really, truly wild spots in this country.
02:40:18.000 Yeah.
02:40:19.000 And someone might fuck that up.
02:40:21.000 And if you get enough greedy assholes together, and they can figure out a way to make this a reality, and with the wrong people in positions of power, that's 100% possible.
02:40:33.000 Yeah, you might even say we've organized the entire world economy to fuck that up.
02:40:38.000 Yeah, yeah.
02:40:39.000 But I think that it's like the question of agency of how do we affect these processes is tough.
02:40:46.000 Well, I was joking, obviously, about killing that person, but there was a recent one of the Iranian scientists was assassinated, and this brought up this gigantic ethical debate.
02:40:59.000 And we don't know who did it, whether it was the Israelis. Mossad held a press conference to say, we didn't do it, while wearing t-shirts that said, we definitely did it.
02:41:07.000 Assassinated Iranian nuclear scientists shot with remote-controlled machine gun.
02:41:12.000 Holy fuck!
02:41:14.000 Holy fuck!
02:41:15.000 It was in broad daylight, which is what I was hearing about.
02:41:18.000 Oh my god!
02:41:20.000 Dude, we're killing people with robots now, right?
02:41:22.000 That was the other Iranian guy that got killed, Soleimani, who was also killed with a drone.
02:41:32.000 I mean, essentially...
02:41:33.000 It says out of another car, but whatever.
02:41:36.000 Oh, so a car was driving by and there was a remote-controlled machine gun?
02:41:40.000 Mm-hmm.
02:41:41.000 Fuck.
02:41:43.000 It says he was in a bulletproof car, too.
02:41:46.000 Wow, I don't know.
02:41:47.000 He was in a bulletproof...
02:41:48.000 Like, they knew they were going to kill this guy.
02:41:50.000 Yeah, they did, man.
02:41:51.000 Damn.
02:41:51.000 So, this is the question.
02:41:53.000 Oh, he got out of the car.
02:41:54.000 Oh.
02:41:55.000 Well, there you go.
02:41:56.000 You fucked up.
02:41:56.000 Stay in that bulletproof car.
02:41:58.000 If you know that a man is going to...
02:42:03.000 Like, what if someone did that to Oppenheimer?
02:42:05.000 You know, what if someone said, hey, we see where this is going...
02:42:09.000 And we need to find that Oppenheimer gentleman, and we need to prevent Little Boy from dropping down and killing how many people?
02:42:20.000 Half a million people.
02:42:22.000 What?
02:42:22.000 He got shot by that remote.
02:42:24.000 It was 164 yards away.
02:42:27.000 Shot him and his bodyguard, and then the car they were in exploded.
02:42:31.000 It lasted for three minutes.
02:42:33.000 The whole thing was three minutes long.
02:42:34.000 Wow.
02:42:36.000 So there's this ethical dilemma.
02:42:37.000 Like, if someone is actively trying to acquire nuclear weapons, and we think that those people are going to use those nuclear weapons, is it ethical to kill that person?
02:42:46.000 And if that person's a scientist, they're not a...
02:42:50.000 Yeah, I mean, I think the causality stuff is really hard to figure out, you know.
02:42:56.000 But I think most of the time it's not about the one person, you know, that it's not, you know, maybe sometimes it is, but I think most, it's just like, I feel like assassination politics in the tech arena does not work, you know, that it's like you can get rid of all the people at the top of these companies and that's not what's going to do it,
02:43:12.000 you know, that there are like these structural reasons why these things keep happening over and over again.
02:43:16.000 Yeah, I think they're trying to slow it down, though, right?
02:43:18.000 Like, this is the reason why...
02:43:20.000 Do you remember when they employed Stuxnet?
02:43:24.000 Stuxnet, yeah, yeah.
02:43:25.000 You know, I mean, that was for the same reason, right?
02:43:28.000 They were trying to disarm the Iranian nuclear capabilities.
02:43:32.000 Yeah, yeah, yeah.
02:43:33.000 That was the same thing, where they...
02:43:34.000 But that was kind of crazy.
02:43:35.000 They were like, we didn't do it while wearing t-shirts.
02:43:37.000 They were like, we definitely did this.
02:43:38.000 Yeah.
02:43:39.000 But they did that with computer virus, right?
02:43:42.000 Which is pretty fascinating.
02:43:43.000 Yeah.
02:43:45.000 And people didn't have a problem with that.
02:43:47.000 Well, I think some people had a problem with that, obviously.
02:43:50.000 Well, Iranians.
02:43:52.000 Yeah, but also just like, okay...
02:43:58.000 You know, you go down that road and, you know, where things can happen too.
02:44:01.000 A great example is, so one of the things that came out in a lot of the documents that Snowden released was that the NSA had worked with a standards body called NIST in order to produce a random number generator that was backdoored.
02:44:19.000 So random numbers are very important in cryptography, and if you can predict what the random numbers are going to be, then you win.
02:44:27.000 And so the NSA had produced this random number generator that allowed them to predict what the random numbers would be because they knew of this one constant that was in there.
02:44:39.000 They knew a reciprocal value that you can't derive just by looking at it, but they know because they created it.
02:44:45.000 And they had what they called a nobody but us backdoor.
02:44:49.000 No bus.
02:44:50.000 Nobody but us backdoor.
02:44:51.000 And they got NIST to standardize this thing, and then they got a company called Jupyter, who makes routers and VPNs and stuff like that.
02:45:01.000 Juniper, sorry.
02:45:02.000 To include this in their products.
02:45:05.000 And so the idea was that, like, the NSA would have these capabilities, they had developed, you know, these vulnerabilities that they could exploit in situations like this, you know, that they could, like, take advantage of foreign powers and stuff like that in ways that wouldn't boomerang back at them.
02:45:20.000 But what happened was, in, I think, you know, the early 2010s, Juniper got hacked, and somebody secretly changed that one parameter.
02:45:33.000 That was, like, basically the back door to a different one that they knew the reciprocal value to.
02:45:41.000 And it's most likely China or Russia that did this.
02:45:45.000 And then what's kind of interesting is there was a big incident where the OPM, the Office of Personnel Management, I think, was compromised.
02:45:54.000 And they have records on, you know, foreign intelligence assets and stuff like that.
02:45:59.000 Their systems were compromised, it seems like, maybe by China.
02:46:03.000 And what's sort of interesting is that they were running the Juniper networking gear that had been, you know, hacked in this one specific way.
02:46:12.000 And so it's kind of possible that, like, you know, the NSA developed this backdoor that they were going to use for situations like this, you know, against foreign adversaries or whatever, and that the whole thing just boomeranged back at them, and the OPM was compromised as a result.
02:46:28.000 Wow.
02:46:29.000 But this is like, I don't know, I think it's, You know, it's easy to look at things like Stuxnet and stuff like that and just be like, yeah, this is harm reduction or whatever, you know, but like in the end, it can have real-world consequences.
02:46:43.000 And this is also why people are so hesitant about, you know, like the government is always like, well, why don't you develop a form of cryptography where it like works except for us, you know, weaken the encryption, you know.
02:46:55.000 And it's like, well, this is why.
02:46:56.000 Because if you can access it, if anybody can access it, somehow that's going to boomerang back at you.
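A minimal sketch of the failure mode described above: if a generator's outputs are predictable to anyone who knows a hidden parameter, then every "secret" value derived from it is predictable too. This is a toy Python illustration, not Dual_EC_DRBG or anything NIST or Juniper actually shipped; all constants and names here are made up.

```python
# Toy illustration only: a linear congruential generator whose constants
# play the role of the hidden "backdoor" parameter. Anyone who knows
# MULTIPLIER/INCREMENT/MODULUS can recover the internal state from a single
# raw output and predict every later output -- including one used as a key.
MODULUS = 2**32
MULTIPLIER = 1664525      # the "nobody but us" knowledge in this toy model
INCREMENT = 1013904223

class ToyRNG:
    def __init__(self, seed: int):
        self.state = seed % MODULUS

    def next_value(self) -> int:
        # Each output *is* the next internal state -- the fatal design flaw.
        self.state = (MULTIPLIER * self.state + INCREMENT) % MODULUS
        return self.state

# Victim generates a nonce (visible on the wire) and then a "secret" key.
victim = ToyRNG(seed=0xC0FFEE)
public_nonce = victim.next_value()    # observer sees this value
secret_key = victim.next_value()      # observer never sees this directly

# Observer: knowing the constants, step the generator from the seen output.
predicted_key = (MULTIPLIER * public_nonce + INCREMENT) % MODULUS
assert predicted_key == secret_key    # the "secret" key was never secret
print(hex(predicted_key))
```

With the "nobody but us" assumption removed, the same prediction works for whoever learns, or swaps in, the constants, which is the boomerang described above.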
02:47:02.000 Well, I remember when there was a terrorist attack in Bakersfield, California.
02:47:07.000 Is that where it was?
02:47:07.000 I think it was Bakersfield.
02:47:08.000 Yeah.
02:47:11.000 San Bernardino.
02:47:12.000 San Bernardino, thank you, yeah.
02:47:13.000 And there was an iPhone involved, and Apple wouldn't open it for them.
02:47:19.000 It wouldn't allow the FBI to have access to it, and people were furious.
02:47:23.000 And they were like, if this starts here, this does not end well.
02:47:28.000 And I kind of saw their point, but I kind of saw the FBI's point too.
02:47:33.000 Like, did you just open this one?
02:47:35.000 This guy's clearly a murderer, has killed a ton of people, and created this terrorist incident.
02:47:41.000 Yeah, but I mean, it was a little disingenuous too, right?
02:47:43.000 Where it's like, the FBI had their entire iCloud backup for this device.
02:47:48.000 The only thing they didn't have was the previous two hours or something like that.
02:47:52.000 And the reason they didn't have it is because they fucked up and approached it in the wrong way and got themselves locked out of it.
02:47:59.000 Oh, really?
02:48:00.000 It was their own mistake that led to the situation where they didn't have the iCloud backup.
02:48:04.000 So then it's like, what are you really going to get off this phone?
02:48:08.000 The actual possibility of what was there was extremely marginal.
02:48:12.000 So do you think what they really want is the tools to be able to get into other people's phones?
02:48:16.000 They've just been waiting for the moment of like, okay, here we go.
02:48:20.000 We got terrorists.
02:48:22.000 That makes sense.
02:48:24.000 What did you think like when the State Department or whoever it was banned Huawei phones?
02:48:30.000 Yeah.
02:48:31.000 Did you think there was...
02:48:33.000 I mean, yeah, it's mostly political, right?
02:48:34.000 Like it's...
02:48:38.000 It's complicated, right?
02:48:39.000 Because there's, like, you know, companies like Huawei and, you know, the people who make TikTok.
02:48:48.000 Like, they're, yeah, they're doing, like, all the sketchy shit.
02:48:52.000 But it's the same sketchy shit that, like, all of Silicon Valley is doing, you know?
02:48:57.000 Like, it's not...
02:48:58.000 Is it really?
02:48:59.000 Is that a valid comparison to what they're doing in Silicon Valley?
02:49:03.000 Like, Huawei did have routers that had third-party access, apparently, and they were shown that information was going to a third party that was not supposed to be, right?
02:49:15.000 Wasn't that part of the issue?
02:49:16.000 Am I reading this wrong?
02:49:18.000 Well, okay, I think there's a couple...
02:49:21.000 There have been incidents where it's like, yeah, there's data collection that's happening.
02:49:25.000 Well, there's data collection happening in all Western products, too.
02:49:30.000 And actually, the way the Western products are designed are really scary.
02:49:34.000 In the telecommunications space, there's a legal requirement called CALEA, the Communications Assistance for Law Enforcement Act, or something like that, that requires telecommunications equipment to have...
02:49:47.000 To have eavesdropping, like surveillance stuff built into it, like when you produce the hardware, in order to sell it in the United States, you have to have...
02:49:54.000 Like which hardware?
02:49:55.000 Like phone switches and stuff.
02:49:56.000 It's like when you make a normal phone call, it has to have...
02:49:59.000 I forget what they call it.
02:50:01.000 The ability to tap.
02:50:02.000 Yeah, they call it something else.
02:50:04.000 But it has to have this ability to record conversations, intercept...
02:50:10.000 Lawful intercept, that's what it's called.
02:50:11.000 How does a Signal call work?
02:50:13.000 So Signal calls don't use the traditional telecommunications infrastructure.
02:50:18.000 It is routing data over the internet.
02:50:20.000 And that data is end-to-end encrypted, so nobody can eavesdrop on those calls, including us.
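A minimal sketch of the end-to-end idea described here, assuming the Python cryptography package: the two endpoints derive a shared key from an X25519 exchange and encrypt with it, so whatever relays the traffic only ever handles public keys and ciphertext. This is not Signal's actual protocol, which adds identity verification and ratcheting; the function names and labels are illustrative.

```python
# Minimal end-to-end sketch, NOT Signal's protocol. Each endpoint derives
# the same key from an X25519 exchange; the relay in the middle only ever
# sees public keys and ciphertext.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

def derive_key(own_private, peer_public) -> bytes:
    # Shared secret from the key exchange, stretched into a 32-byte AEAD key.
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-e2e-demo").derive(shared)

alice_priv, bob_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
# Only the public halves cross the network (via the relay).
alice_key = derive_key(alice_priv, bob_priv.public_key())
bob_key = derive_key(bob_priv, alice_priv.public_key())
assert alice_key == bob_key

nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(alice_key).encrypt(nonce, b"voice frame bytes", None)
# The relay forwards (nonce, ciphertext); without the derived key it's opaque.
plaintext = ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None)
assert plaintext == b"voice frame bytes"
```

The point is that the relay can forward encrypted frames without ever being able to decrypt them, which is why compromising the operator does not expose the calls.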
02:50:27.000 But so communication equipment that is produced in the United States has to have this so-called lawful intercept capability.
02:50:36.000 But what's crazy about that is that's the same, you know, it's like these are U.S. companies and they're selling that all around the world.
02:50:41.000 So that's the shit that gets shipped to UAE. Yeah.
02:50:44.000 You know, so it's like it's the secondary effect thing of like the United States government was like, well, we're going to be responsible with this or whatever.
02:50:50.000 We're going to have warrants or whatever.
02:50:51.000 And even that's not true.
02:50:52.000 And then that same equipment gets shipped to tyrants and repressive regimes all over the place.
02:50:57.000 And they just got a ready-made thing to just surveil everyone's phone calls.
02:51:01.000 So it's like, I don't know, it's hard to indict Huawei for acting substantially different than the way, than, you know, whatever, the US industry acts.
02:51:13.000 It's just, certainly they have a different political environment and, you know, they are much more willing to use that information to do really brutal stuff.
02:51:24.000 Well, it wasn't just that they banned Huawei devices, but they also banned them from using Google.
02:51:30.000 That's when I thought, like, wow, this is really...
02:51:32.000 Like, what do they know?
02:51:34.000 Or what has been...
02:51:35.000 Oh, Google...
02:51:36.000 Yeah.
02:51:37.000 Well, Google has...
02:51:40.000 No.
02:51:40.000 So, you know, Android...
02:51:43.000 You're talking about, like, so-called Android devices.
02:51:45.000 They can't use the Android operating system anymore.
02:51:47.000 They have to now...
02:51:49.000 They've developed their own operating system, and now they have their own ecosystem, they have their own app store, the whole deal.
02:51:54.000 Yeah.
02:51:55.000 But that's also...
02:51:56.000 That's a business thing, you know, where it's, like, Google's control over...
02:52:01.000 Google is producing this software, Android, and it's just free.
02:52:05.000 They're releasing it.
02:52:07.000 But they want to maintain some control over the ecosystem because it's their thing that they're producing.
02:52:11.000 And so they have a lot of requirements.
02:52:14.000 It's like, okay, you can run Android.
02:52:16.000 Oh, you want all this other stuff that we make that's not part of just the stock-free thing, like Play Services and all the Google stuff.
02:52:25.000 Increasingly more and more of Android is just getting shoved into this proprietary bit.
02:52:29.000 And they're like, okay, you want access to this?
02:52:31.000 Then it's going to cost you in these ways.
02:52:34.000 And I think it probably got to the point where Huawei was just like...
02:52:41.000 We're not willing to pay, you know, even either monetarily or through whatever compromise they would have to make, and they were just like, we're gonna do our own thing.
02:52:49.000 I thought it was because of the State Department's boycott.
02:52:52.000 Oh, it could have also been that there was a legal requirement that they stopped doing it, yeah.
02:52:56.000 Yeah, I think I might be...
02:52:58.000 Jamie will find out.
02:52:59.000 I think I might be right, but I'm not sure though.
02:53:02.000 But it just made me think, like, I understand that there's a sort of connection that can't be broken between business and government in China, and that business and government are united.
02:53:13.000 It's not like, you know, like Apple and the FBI, right?
02:53:17.000 Yeah.
02:53:19.000 In China, they would just give them the phone.
02:53:20.000 Oh yeah, of course.
02:53:21.000 They developed the phone.
02:53:22.000 They wouldn't have the tools to already get into it.
02:53:25.000 They wouldn't have to have this conversation.
02:53:27.000 Yeah, exactly.
02:53:27.000 They just send it directly to the people.
02:53:30.000 What we're terrified of is that these relationships that business and government have in this country, they're getting tighter and tighter intertwined.
02:53:39.000 And we look at a country like China that does have this sort of inexorable connection between business and government, and we're terrified that we're going to be like that someday.
02:53:50.000 Yeah.
02:53:51.000 Yeah.
02:53:52.000 Is it just it?
02:53:53.000 It is what it is?
02:53:54.000 Yeah.
02:53:55.000 I mean, and that's, I think, you know, a lot of what Snowden was revealing.
02:53:59.000 Yes.
02:53:59.000 It was like, you know, that there are already these relationships, you know.
02:54:03.000 You know, the NSA called it PRISM. And, you know, tech companies just called it, like, the consoles or whatever they had built for these, you know, for these requests.
02:54:13.000 But it's...
02:54:14.000 That's...
02:54:15.000 Yeah, it's happening.
02:54:16.000 And I don't...
02:54:17.000 Also, you know, it's sort of like...
02:54:21.000 I think a lot of people, a lot of nations look at China and are envious, right?
02:54:24.000 Where it's like, they've done this thing where they just, you know, they built like the Great Firewall of China, and that has served them in a lot of ways.
02:54:35.000 You know, one, surveillance, obviously, like they have total control of everything that appears on the internet.
02:54:41.000 So not just surveillance, but also content moderation, propaganda, but then also, It allows them to have their own internet economy.
02:54:52.000 China is large enough that they can have their own ecosystem.
02:54:57.000 People don't use Google there.
02:55:00.000 They have their own chat apps.
02:55:01.000 They have their own social networks.
02:55:03.000 They have their own everything.
02:55:05.000 And I think a lot of nations look at China and they're just like, huh, that was kind of smart.
02:55:08.000 It's like you have your own ecosystem, your own infrastructure that you control, and you have the ability to do content moderation, and you have the ability to do surveillance.
02:55:16.000 And so I think the fear is that there's going to be a balkanization of the internet where Russia will be next and then every country that has an economy large enough will go down the same road.
02:55:26.000 Was it, Jamie?
02:55:29.000 There's a couple things that happened that are what you're saying, but directly seems to be related to this.
02:55:35.000 Sweeping crackdown on facial recognition tech.
02:55:37.000 House and Senate Democrats on Tuesday rolled out legislation to halt federal use of facial recognition software and require state and local authorities to pause any use of the technology in order to receive federal funding.
02:55:49.000 The Facial Recognition and Biometric Technology Moratorium Act introduced Thursday.
02:55:58.000 Marks one of the most ambitious crackdowns on Facebook.
02:56:00.000 This has to do with that?
02:56:01.000 It said it was part of this boycott that had to do with Google's, like, antitrust suit.
02:56:07.000 That also had to do with Facebook, and they were looking into it.
02:56:09.000 This was from, like, a month ago.
02:56:11.000 I mean, I think this is connected to what you're saying, just in the sense that, like...
02:56:17.000 You know, the people who are producing that facial recognition technology, it's not the government.
02:56:20.000 It's, you know, Valenti or whoever sells services to the government.
02:56:23.000 And then, you know, the government is then deploying this technology that they're getting from industry and in kind of crazy ways.
02:56:30.000 Like, there's the story of the Black Lives Matter protester who, like, the police, like, NYPD, you know, not like the FBI, you know, NYPD, like, tracked him to his house using facial recognition technology.
02:56:43.000 And so, yeah.
02:56:44.000 How did they do that?
02:56:47.000 There's a story about it.
02:56:49.000 I've been finding stories.
02:56:51.000 No one knows what these things are.
02:56:52.000 There's things supposedly all over New York City and Manhattan that are tracking everybody's face as soon as they go in there.
02:56:58.000 I've watched news videos from local New York, local media, asking people, have you seen these?
02:57:03.000 What are they?
02:57:04.000 They get no answers.
02:57:05.000 Well, here's what's hilarious.
02:57:07.000 Crime has never been higher.
02:57:09.000 New York City crime right now is insane.
02:57:11.000 That shit's not doing anything.
02:57:14.000 Yeah.
02:57:15.000 Well, everyone's wearing a mask, too.
02:57:16.000 That's also part of the problem.
02:57:17.000 But I think the fear is that there's this circle of industry-producing technology that is going into government.
02:57:25.000 Stuff like facial recognition technology just makes existing power structures much more difficult to contest.
02:57:34.000 Do you use facial recognition on your phone?
02:57:38.000 No.
02:57:39.000 I don't have any apps or anything that use it.
02:57:41.000 You don't know with your iPhone?
02:57:43.000 Oh, no.
02:57:44.000 I just have a PIN.
02:57:44.000 Oh, you don't use it.
02:57:46.000 What's going on, Jamie?
02:57:47.000 New York City Police Department uses facial recognition software to track down a Black Lives Matter activist accused of assault after allegedly shouting into a police officer's ear with a bullhorn.
02:57:56.000 That's it?
02:57:59.000 What about that guy who punched Rick Moranis, you fucks?
02:58:03.000 They found him.
02:58:04.000 They did?
02:58:05.000 Yeah.
02:58:05.000 Like last week.
02:58:06.000 Right in jail.
02:58:07.000 But they did find him.
02:58:08.000 How'd they find him?
02:58:09.000 They have facial recognition, Joe.
02:58:11.000 But he wore a mask.
02:58:12.000 I don't know.
02:58:13.000 Anyway.
02:58:15.000 Listen, I think what you're doing is very important.
02:58:18.000 And I love the fact that you approach things the way you do.
02:58:22.000 And that you really are this idealistic person that's not trying to make money off of this stuff.
02:58:26.000 And you're doing it because you think it's the right thing to do.
02:58:31.000 If there is a resistance, people like you are very important.
02:58:35.000 What you've done by creating signal, it's very important.
02:58:39.000 There's not a lot of other options, and there's no other options that I think are as secure or as viable.
02:58:46.000 Thank you.
02:58:46.000 Thanks.
02:58:48.000 I appreciate you saying that.
02:58:49.000 And I support it, and I try to tell other people to use it as well.
02:58:54.000 Last word.
02:58:55.000 Do you have anything to say to everybody before we wrap this up?
02:58:58.000 That's a lot of pressure.
02:58:59.000 Sorry.
02:59:02.000 Can I put out a public plea for a project I'm trying to work on?
02:59:07.000 Sure.
02:59:08.000 Okay, I'm vaguely obsessed with this thing that happened in the 60s.
02:59:13.000 Are you familiar with the Soviet space dogs?
02:59:18.000 So the first animal in space was a dog named Laika.
02:59:23.000 Laika died in space, sadly.
02:59:26.000 The second animal in space was a dog called Strelka.
02:59:29.000 Strelka went to space, made it back to Earth, and had puppies.
02:59:34.000 Whoa, those puppies can read minds.
02:59:37.000 When Khrushchev came to visit JFK in 1961, he brought with him the ultimate insult gift, which was one of the puppies.
02:59:45.000 That's an insult?
02:59:46.000 Oh, dude.
02:59:46.000 It's like, oh, do you have anything that's been to space?
02:59:48.000 We have extra puppies.
02:59:50.000 You know, do you want one?
02:59:51.000 You know?
02:59:51.000 That's an insult?
02:59:52.000 Dude, it's the ultimate insult gift.
02:59:54.000 Like, the United States had no space program, had never been, the Soviet Union was, like, way ahead of them.
02:59:58.000 They're like, oh, we've just got extra animals that have been to space.
03:00:00.000 Like, here, have one, you know?
03:00:01.000 It's a puppy.
03:00:02.000 Stop being so personal.
03:00:03.000 That's what I would tell Kennedy.
03:00:04.000 Just take the puppy, bro.
03:00:06.000 Well, Kennedy took the puppy.
03:00:07.000 Kennedy took the puppy.
03:00:08.000 The puppy had a Cold War romance with one of Kennedy's dogs, and they had puppies.
03:00:12.000 Oh, snap!
03:00:13.000 That the Kennedys called the Pupnicks.
03:00:16.000 And the Pupnicks captivated the imagination of children across America.
03:00:22.000 Because Jackie Kennedy said something, she was like, I don't know what we're going to do with the dogs, you know?
03:00:26.000 And that ignited a spontaneous letter-writing campaign from children across America who all requested one of the puppies.
03:00:34.000 Jackie Kennedy selected two children in America whose names were Mark Bruce and Karen House.
03:00:40.000 And she delivered the two puppies, one to each of them.
03:00:44.000 One of them lived in Missouri, the other lived in Illinois.
03:00:49.000 And I have sort of been obsessed with the idea that those puppies had puppies, and that those puppies had puppies, and that somewhere in the American Midwest today are the descendants of the original animals in space.
03:01:02.000 The first animal to go to space and survive.
03:01:04.000 They've probably been watered down so heavily.
03:01:06.000 Maybe, but like...
03:01:07.000 Chihuahuas and German Shepherds and shit.
03:01:10.000 Well, they were all...
03:01:11.000 There they are, right there.
03:01:13.000 They were mutts.
03:01:13.000 They were random dogs that they found from around the, like, spaceport.
03:01:19.000 Because they thought that they would be, like, tougher.
03:01:21.000 Yeah.
03:01:21.000 Oh, wow.
03:01:23.000 But they were small.
03:01:24.000 And so, yeah, I've been obsessed with the idea that these dogs could still be out there, and I've been trying to find the dogs.
03:01:32.000 So I've been trying to track down these two people, notably Karen House, because she got the female dog.
03:01:37.000 And I think she's still alive, and I think she lives in the Chicago area, but I can't get in touch with her because I'm not, I don't know, I'm not an investigative journalist.
03:01:45.000 I, like, don't know how to do this or whatever.
03:01:46.000 So, if anybody knows anything about the whereabouts of Karen House or the descendants of the Soviet space dogs, I'm very interested.
03:01:54.000 My goal is just to meet one, you know?
03:01:56.000 How should someone get in touch with you?
03:01:58.000 I'm on the internet.
03:02:00.000 Okay.
03:02:00.000 Moxie.
03:02:01.000 Just like that.
03:02:02.000 I'm on the internet.
03:02:03.000 My name is Moxie.
03:02:06.000 I love it.
03:02:07.000 Thanks, man.
03:02:07.000 I really appreciate it.
03:02:08.000 I really enjoyed our conversation.
03:02:09.000 Thank you.
03:02:10.000 Bye, everybody.