The Joe Rogan Experience - September 15, 2020


Joe Rogan Experience #1536 - Edward Snowden


Episode Stats

Length

2 hours and 28 minutes

Words per Minute

161.6

Word Count

23,982

Sentence Count

1,431

Misogynist Sentences

5


Summary

Edward Snowden returns to discuss the recent federal appeals court ruling that the NSA's bulk collection of Americans' phone records, the first program he and journalists revealed in 2013, was unlawful, and what that decision does and does not change. The conversation covers the Patriot Act and its Section 215, current legislation aimed at restricting end-to-end encryption, how end-to-end encryption keeps platforms like Facebook and Google from reading the messages they relay, the NSO Group's Pegasus malware and the Citizen Lab researchers who track it, how warrants expanded from the "fruits and instrumentalities" of a crime to nearly every record of a person's life, why call metadata can be more revealing than the calls themselves, the culture of impunity for officials such as James Clapper, how the 2013 disclosures were handled through journalists rather than published wholesale, the new softcover edition of Permanent Record, and the growing public debate over a pardon.


Transcript

00:00:11.000 Good to see you again, man.
00:00:14.000 Good to see you.
00:00:14.000 Thanks for having me.
00:00:15.000 It's been like a year since we last talked.
00:00:18.000 It's been like a year, believe it or not.
00:00:19.000 You look exactly the same, and the studio looks exactly the same.
00:00:23.000 You might be on another part of the world.
00:00:24.000 No one knows.
00:00:26.000 Yeah, it's just my apartment that I rent.
00:00:30.000 You know, I don't like to give out a lot of information about where I'm at and that kind of stuff.
00:00:34.000 So it's very smart.
00:00:36.000 A plain wall that I've got the lights down low so it looks kind of a nice gray.
00:00:41.000 At least I think it's nice.
00:00:42.000 I think it's beautiful.
00:00:43.000 It looks amazing.
00:00:44.000 First of all, congratulations on the recent ruling.
00:00:48.000 It was a 9th District Court of Appeals.
00:00:52.000 Is that what it was?
00:00:53.000 It said that what you exposed with the warrantless wiretapping was in fact illegal.
00:01:00.000 And there are many people that are calling for you to be pardoned now.
00:01:05.000 Yeah, so much has happened.
00:01:07.000 This ruling, this is actually not the first time the federal courts or the appeals courts have struck down some of the federal surveillance programs as unlawful.
00:01:18.000 But this one is really important because it happened from an appeals court.
00:01:21.000 It wasn't from a single judge.
00:01:22.000 It was from a panel of judges.
00:01:26.000 And what they had ruled was that the NSA's bulk collection of Americans' phone records was illegal.
00:01:35.000 And this is the very first sort of mass surveillance program that I and the journalists broke the news about back in 2013. So this is a huge victory for privacy rights.
00:01:49.000 What it means is there was this provision of the Patriot Act.
00:01:54.000 Remember the Patriot Act?
00:01:55.000 Remember like a zillion years ago?
00:01:57.000 I do.
00:01:58.000 Everybody was like, Patriot Act, Patriot Act.
00:02:00.000 Your friend Alex Jones, you know, I think.
00:02:02.000 He was worried about the Patriot Act.
00:02:03.000 It's a terrible name.
00:02:05.000 There's a real problem with that name.
00:02:06.000 Because if you're against the Patriot Act, it's like against babies.
00:02:11.000 It's like, this is the pro-baby act.
00:02:13.000 But meanwhile, pro-baby act, they get to look through your email.
00:02:16.000 You know what I mean?
00:02:16.000 It's like the word patriot is attached to that in a very disingenuous way.
00:02:21.000 Calling that the Patriot Act, it's really creepy that they could do that.
00:02:25.000 It should have a number, like Bill A1. You know what I'm saying?
00:02:29.000 So you can debate the merits of it.
00:02:31.000 It's just so much propaganda attached to that name, like the Patriot Act.
00:02:36.000 This is one of the funny things because it should be a warning for anybody who's in, like, you know, just anywhere in the country and they hear on the news they're talking about, like, the Save Puppies Act.
00:02:46.000 There's actually one that they've been trying to push through recently, which is basically outlawing meaningful encryption from the major Internet service providers.
00:02:56.000 Like, if Facebook or Google, for whatever reason, got out of bed in the morning and they actually wanted...
00:03:02.000 To protect the security of your communications in a way that even they can't break.
00:03:09.000 Like right now, Google and Facebook, they do a great job keeping other people from spying on your communications.
00:03:17.000 But if Google wants to rifle through your inbox, right, if Facebook wants to go through all your direct messages and give that to the federal government, like you tap one button and boom, they've got all of it.
00:03:27.000 It happens every single day.
00:03:29.000 Well, companies like Facebook have recently realized this is a real problem for them.
00:03:36.000 Because first off, they get all these censorship demands that you've seen, where there's deplatforming requests.
00:03:44.000 And it happens in one country, right?
00:03:48.000 Government is allowed to decide what can and can't be said by this person on this platform.
00:03:53.000 Or the U.S. goes, look, we got a court warrant.
00:03:56.000 A judge said, we think this person's a criminal.
00:03:59.000 We want you to hand over everything you have on this person.
00:04:02.000 And they do it, right?
00:04:03.000 Facebook does this.
00:04:05.000 Well, guess who's next, right?
00:04:06.000 The Russian government shows up at the door the next day.
00:04:08.000 The Chinese government shows up at the door the next day.
00:04:11.000 And if these companies don't play ball, they get shut down in that country.
00:04:15.000 They can no longer operate.
00:04:19.000 And so the idea that a lot of them have, that they've considered, and this has actually become a bigger thing in the COVID crisis where we start talking about like contact tracing.
00:04:30.000 These companies want to know where everybody is at all the time so they can hand this over to medical authorities or whatever.
00:04:35.000 There's this idea called end-to-end encryption, which what it means is that when you send a message, you know, when Billy sends a message to Bobby, Billy and Bobby both have the keys to unlock that message.
00:04:48.000 And it could be sent through Facebook.
00:04:50.000 It could be sent through Google.
00:04:51.000 It could be posted on a bulletin board in the town square.
00:04:54.000 But without that key, which the people who run the bulletin board, right, the people who own the bulletin board, Google, Facebook, they don't have that key.
00:05:04.000 Only the phones at the end, the laptops at the end, the people who own those, they're the only people who have the key.
00:05:09.000 So if somebody comes to Facebook and says, we want to see that information, Facebook hands over the encrypted message, right?
00:05:17.000 And Facebook goes, well, here you go.
00:05:19.000 Here's our copy.
00:05:19.000 But we can't read it.
00:05:20.000 You can't either.
00:05:22.000 Now you've got to actually do some work on the government side and go get that key yourself.
00:05:27.000 And then you can read it, right?
00:05:28.000 But we can't read it.
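The exchange above describes end-to-end encryption at a conceptual level. Below is a minimal sketch of the idea, assuming the PyNaCl library; real messengers use more elaborate protocols (such as the Signal protocol, with forward secrecy and key verification), and the names billy and bobby are illustrative.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Illustrative only; not how Facebook or Google actually implement it.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
billy_key = PrivateKey.generate()
bobby_key = PrivateKey.generate()

# Billy encrypts for Bobby using his own private key and Bobby's public key.
ciphertext = Box(billy_key, bobby_key.public_key).encrypt(b"meet at noon")

# The platform (the "bulletin board") only ever stores and relays ciphertext,
# so handing over "its copy" reveals nothing without an endpoint key.
relay_copy = bytes(ciphertext)

# Only Bobby, holding his own private key, can open the relay's copy.
plaintext = Box(bobby_key, billy_key.public_key).decrypt(relay_copy)
assert plaintext == b"meet at noon"
```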
00:05:30.000 Congress is trying to stop basically the proliferation of that basic end-to-end encryption technology.
00:05:37.000 And they're calling it like the Child Online Predator Act or something like that.
00:05:42.000 Where they say it's all about protecting the posting of like child exploitation material and really, really horrible stuff.
00:05:50.000 But that's not actually what the law is about.
00:05:54.000 The law is about making it easier for spies and law enforcement to reach deeper and deeper into your life with a simple warrant stamped by any court.
00:06:03.000 And the funny thing is, this never used to be the way law enforcement worked in the United States.
00:06:09.000 I mean, when you hear about a warrant, what does that mean to you?
00:06:13.000 What can the cops get with a warrant?
00:06:16.000 Well, usually I think it means that they can come in your house and search.
00:06:21.000 Right.
00:06:22.000 The real issue with warrants when it pertains to encryption, like when you're talking about the Child Safety Act or whatever they're calling it...
00:06:32.000 Anyone would say, yes, we have to stop child predators.
00:06:36.000 But the problem with having the ability to use something like that to stop child predators, in my eyes, I start thinking, well, if I really wanted to look into someone, what I would do is I would send them some malware that would put child pornography on their computer,
00:06:52.000 and then I would have all of the motive that I need to go and look through everything.
00:06:58.000 Like, say if they were a political dissident, if they were doing something against the government, and you were someone who was acting in bad faith, and you decided, okay, we want to look into this guy, but we don't have a warrant.
00:07:11.000 What are the laws?
00:07:13.000 What can we get away with?
00:07:15.000 Well, we have the Child Endangerment Act.
00:07:17.000 And so because of that, we're allowed to peer into anything.
00:07:21.000 But we just have to have motive.
00:07:23.000 So we have to, well, do we have motive?
00:07:26.000 All you'd have to do is, and we both know this, it's very easy to put something illegal on someone's computer if they're not paying attention.
00:07:34.000 It's very easy to install, like you could send someone a text message that looks like a routing number for a package they're going to get.
00:07:42.000 They click on that, and then you, what is that, what the Israelis have, Pegasus.
00:07:47.000 Yeah, you've read up on this.
00:07:50.000 Well, it's from Bryan Fogel's new film, The Dissident, which is about Jamal Khashoggi's murder and how the Saudis actually tapped into Jeff Bezos's phone.
00:08:05.000 And that's where all of the, this is the suspicion, is that that's where all of those National Enquirer photos came out and all the attacks on him.
00:08:14.000 Because they had access to his actual phone through this.
00:08:17.000 So someone could easily get into your stuff if you're not paying attention.
00:08:21.000 And then they could use, you know, whatever acts they've come up with.
00:08:26.000 Whatever, it's the Patriot Act or whatever act.
00:08:28.000 Where they could just...
00:08:30.000 Get into everything you're doing.
00:08:32.000 Look at your WhatsApp messages.
00:08:33.000 Look at your Facebook messages.
00:08:36.000 It's real sneaky.
00:08:37.000 And it's dangerous.
00:08:38.000 It's a dangerous precedent to set.
00:08:40.000 Yeah, I mean, there's a lot to this.
00:08:42.000 Let me go into some of that in a little depth.
00:08:44.000 So you mentioned the NSO Group and their Pegasus malware set.
00:08:47.000 And this is very much a real thing.
00:08:49.000 Like, you're a well-read guy.
00:08:51.000 This is like...
00:08:54.000 This company, the CEO's name I think is Shalev Hulio, is run out of Israel.
00:09:00.000 It was previously owned actually by an American venture capital firm.
00:09:04.000 I believe they've been re-bought out, but it doesn't really matter.
00:09:09.000 Their entire business is preying on flaws in the critical infrastructure of all the software running on the most popular devices in the world.
00:09:19.000 The number one target, right, is the iPhone.
00:09:21.000 And this is because the iPhone, as secure as it is relative to a lot of other phones, is a monoculture, right?
00:09:30.000 You get these little software update notifications all the time that are like, hey, please update to the most recent version of iOS.
00:09:36.000 And that's a fabulous thing.
00:09:37.000 That's a wonderful thing for security.
00:09:40.000 Because the number one way that people's devices get screwed, if it's not just through user error, right?
00:09:46.000 Like you entering your password somewhere you shouldn't.
00:09:48.000 It's like a fake site.
00:09:50.000 It looks like Gmail, but it's not actually Gmail.
00:09:52.000 You just gave the guy your password.
00:09:54.000 Now he uses your password to log in.
00:09:55.000 But the way they actually break into a device is that it's not patched, right?
00:10:00.000 Patched means getting these security updates, these little code updates that fix holes that researchers found in the device's security.
00:10:10.000 Well, Apple's really good about rolling these out all the time for everybody in the world.
00:10:15.000 The problem is, basically, all these different iPhones, right?
00:10:20.000 You got an iPhone 6, you got an iPhone 8, you got an iPhone X, you got an iPhone, you know, 3, whatever.
00:10:27.000 These are all running a pretty narrow band of software versions.
00:10:34.000 And so these guys go, if they want to target, for example, Android phones, like Google phones, like a Samsung Galaxy or something like that.
00:10:41.000 There's like a billion different phones made by a billion different people.
00:10:45.000 Half of them are completely out of date.
00:10:48.000 But what it means is it's not one version of software they're running.
00:10:52.000 It's like 10,000.
00:10:53.000 And this is actually bad for security on the individual level.
00:10:57.000 But it's good for security in a very unusual way, which is the guys who are developing the exploits, the guys like this NSO group who are trying to find ways to break into phones, they now have to have, like, 50 different handsets running 50 different versions of software.
00:11:14.000 They're all changing.
00:11:15.000 They've got different hardware.
00:11:16.000 They've got different chipsets.
00:11:18.000 They've got all kinds of just technical variables that can screw up the way they attack your phone.
00:11:26.000 And then when they find one, it only works on like this Samsung Galaxy line.
00:11:31.000 It doesn't work on like the Google Pixel line or it doesn't work on like a Nokia line or something like that.
00:11:37.000 Whereas they realize if they find a way to attack an iPhone, which is actually, you know, this is difficult.
00:11:43.000 This is really difficult stuff.
00:11:45.000 Now, it works against basically every iPhone.
00:11:49.000 And who has iPhones?
00:11:51.000 All the rich people, right?
00:11:52.000 All the important people, all the lawmakers, all the guys who are in there.
00:11:55.000 So they've made a business on basically attacking the iPhone and selling it to every two-bit thug who runs a police department in the world.
00:12:04.000 You know, they sell this stuff to Saudi Arabia.
00:12:06.000 They sell this to Mexico.
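A toy back-of-the-envelope sketch of the monoculture point made above: when nearly every handset runs one of a handful of OS builds, a single working exploit covers almost the whole fleet, while the same effort against a fragmented fleet covers only a sliver. Every device count and version string below is invented for illustration.

```python
# Toy illustration: fraction of a device fleet one exploit can reach,
# depending on how fragmented the installed software is.
# All counts and version strings are invented for illustration.
from collections import Counter

iphone_fleet = ["iOS 13.7"] * 70 + ["iOS 13.6"] * 25 + ["iOS 12.4"] * 5
android_fleet = (
    ["Android 10 (vendor A)"] * 20 + ["Android 9 (vendor A)"] * 15 +
    ["Android 10 (vendor B)"] * 10 + ["Android 8 (vendor C)"] * 25 +
    ["Android 7 (vendor D)"] * 30
)

def coverage(fleet, exploitable_versions):
    """Return the fraction of the fleet running a version the exploit works on."""
    counts = Counter(fleet)
    return sum(counts[v] for v in exploitable_versions) / len(fleet)

# One exploit against the current iOS builds reaches most of the fleet...
print(coverage(iphone_fleet, {"iOS 13.7", "iOS 13.6"}))    # 0.95
# ...while one exploit against a single Android build reaches a sliver.
print(coverage(android_fleet, {"Android 10 (vendor A)"}))  # 0.2
```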
00:12:08.000 And there's a group of researchers in Canada working at a university called the Citizen Lab.
00:12:15.000 And these guys are really like the best in the world at tracking what NSO Group is doing.
00:12:19.000 If you want to learn about this stuff, the real stuff, look up Citizen Lab and the NSO Group.
00:12:25.000 And what they have found is all the people who are being targeted by the NSO Group, the classes of people, the countries that are using this.
00:12:35.000 And it's not like the local police department in Germany trying to bust up a terrorism ring or something like that.
00:12:43.000 It's the Mexican government spying on the head of the Mexican opposition.
00:12:48.000 Or they're trying to look at human rights defenders who are investigating student disappearances.
00:12:53.000 Or it's people like the friends and associates of Jamal Khashoggi, who was murdered by the Saudi government.
00:13:00.000 Or it's people like dissidents in Bahrain.
00:13:04.000 And these petro-states, these bad actors nationally, will pay literally tens of millions of dollars each year just to have the ability to break into an iPhone for a certain number of times, because that's how these guys do it.
00:13:18.000 They sell their business plan.
00:13:19.000 They go, we'll let you break into any iPhone just by basically sending a text message to this phone.
00:13:24.000 All you need to find is the phone number of a person who's running an iPhone, and we will exploit something.
00:13:31.000 Which will give you total control.
00:13:32.000 If that happens to someone, I'm sorry, but if that happens to someone, could they just get a new phone?
00:13:38.000 And does the exploit, is the exploit specific to their account?
00:13:42.000 Or is the exploit on the physical phone itself?
00:13:45.000 So the question or the answer to this is it really depends on the exploit.
00:13:50.000 Like the easiest forms of exploit, or rather the easier types of exploits, are where they send you a text message, right?
00:13:58.000 And it'll be like an iMessage or something like that.
00:14:00.000 And it's got a link in it that'll be like, oh gosh, terrible news.
00:14:04.000 You know, your buddy's father just died.
00:14:08.000 And we're making funeral arrangements.
00:14:10.000 Are you going to be there?
00:14:11.000 It's the day after tomorrow.
00:14:13.000 And when you click the link for the funeral arrangements, it opens your web browser.
00:14:17.000 And the web browser on your phone is always the biggest, most complicated process in it, right?
00:14:23.000 There's a zillion lines of code in this as opposed to an instant messenger where there's fewer lines of code in it.
00:14:30.000 And they'll find one thing in that, where there's a flaw that lets them feed instructions, not just to the browser, but basically escape the little sandbox that the browser's supposed to play in, that's supposed to be safe, where it can't do anything too harmful, and it'll run out of this sandbox,
00:14:47.000 and it'll ransack your phone's, like, hardwired operating system, the system image.
00:14:54.000 It'll like give them privileges to do whatever they want on your phone as if they are you and then as if they have a higher level of privilege than you.
00:15:04.000 They have system level privileges to change the phone's operation permanently, right?
00:15:09.000 And this is the problem is on the phone.
00:15:12.000 You can replace the phone, right, and they'll lose access to that.
00:15:16.000 But if they've already used that to gain the passwords that you use to access, you know, your iCloud or whatever.
00:15:23.000 When they have control of the phone, they've already got your photo roll, right?
00:15:26.000 They've already got your contact list.
00:15:28.000 They already have everything that you've ever put in that phone.
00:15:30.000 They already have all your notes.
00:15:32.000 They already have all your files.
00:15:33.000 They already have everything that's in your message history, right?
00:15:36.000 They can pull that out immediately.
00:15:38.000 And now, because they have, you know, all your contacts and things like that.
00:15:42.000 They see that phone stop being active.
00:15:45.000 They know you've changed your phone number.
00:15:46.000 All they have to do is find the new phone number and then they can try to go after you again.
00:15:50.000 The benefit is, with that old style of attack, if you get that message, and you don't click that link.
00:15:56.000 You're somebody in a vulnerable class, right?
00:15:59.000 You've had these kind of attacks against you before.
00:16:00.000 It looks suspicious.
00:16:02.000 You don't know who this person is.
00:16:03.000 The number isn't right.
00:16:04.000 Something like that.
00:16:05.000 And you save that link.
00:16:07.000 You don't click the link.
00:16:08.000 You don't do anything with that link.
00:16:09.000 But you send it.
00:16:11.000 To a group like Citizen Lab, they can basically take that link and use like a dummy phone, like sort of a Trojan horse, to go to the site that would attack your phone and catch it.
00:16:24.000 And this is what the sort of process that all of their research is based on.
00:16:28.000 There are other more advanced types of attacks that don't have these defenses against them and that are far more scary.
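As a small, purely illustrative sketch of the "preserve it, don't click it" advice above (not an actual Citizen Lab tool or workflow), the snippet below pulls any links out of a suspicious message and logs them so they can be forwarded to researchers without ever being opened in a browser. The message text and file name are made up.

```python
# Illustrative only: extract links from a suspicious message so they can be
# logged and forwarded to researchers, without ever opening them.
import re

SUSPICIOUS_MESSAGE = (
    "Terrible news, the funeral is the day after tomorrow. "
    "Arrangements here: https://example-funeral-notice.invalid/view?id=8841"
)

# Rough URL pattern; good enough to capture links for preservation.
URL_PATTERN = re.compile(r"https?://\S+")

def preserve_links(message, outfile="suspect_links.txt"):
    links = URL_PATTERN.findall(message)
    with open(outfile, "a", encoding="utf-8") as f:
        for link in links:
            f.write(link + "\n")  # recorded, never fetched
    return links

print(preserve_links(SUSPICIOUS_MESSAGE))
```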
00:16:36.000 Can I stop you for a second?
00:16:37.000 What is Citizen Lab?
00:16:39.000 Citizen Lab?
00:16:40.000 Yeah, the Citizen Lab is the name of this research group at the University in Canada who basically studies state-sponsored and corporate malware attacks against civil society.
00:16:51.000 It's run by a guy named Ron Deibert, I believe.
00:16:55.000 You guys will have to fact-check me on that one.
00:16:57.000 I think he just published a book.
00:16:59.000 Actually, he was publishing a book about all of this.
00:17:02.000 But it's really, they are the world leaders, in my opinion, in basically investigating these kind of attacks and exposing them.
00:17:10.000 It's true public service.
00:17:14.000 Let's go back to that one thing.
00:17:15.000 I asked you about warrants, and you talked about the fact that people could plant evidence on things and then get motivation, or rather they could show probable cause to the court to then investigate you, and then they can get everything.
00:17:31.000 And you said, you know, you thought...
00:17:33.000 That a warrant meant they can go and search your house.
00:17:35.000 And this is the kind of thing that we, you know, modern people are used to thinking of in the context of a warrant.
00:17:41.000 Cops go to a specific place looking for specific things that are elements of a crime.
00:17:48.000 Now, you know, you've heard all these things where, like, cops find a way to, like, stop somebody and they, like, are like, oh, I smelled pot or whatever, and they try to, you know, toss their car or whatever.
00:18:00.000 Or plain sight doctrines where they open the door and the guy sits down and talks to them and they go, oh, you know, I see a bong or something.
00:18:07.000 You know, that's paraphernalia, you're going to jail.
00:18:10.000 But until, I think it was 1967...
00:18:16.000 Warrants in the United States could only be used to gather two things that were called the fruits and instrumentalities of a crime, which meant even if the cops knew you did it, even if the cops knew you rode the subway or worked for this company or whatever,
00:18:32.000 they couldn't get all the company's records.
00:18:34.000 They couldn't, if they existed, get all the emails that you ever wrote.
00:18:39.000 They couldn't get your friend to turn over like an exchange of letters that you had with this person.
00:18:45.000 The fruits of the crime were the things that they gained from it, right?
00:18:48.000 If they robbed the bank, the cops could get the sack of money.
00:18:51.000 The instrumentalities were the tools that were used, right?
00:18:54.000 Like if you used dynamite or a crowbar or a getaway car.
00:18:59.000 They could seize all of those things.
00:19:01.000 But the idea that the cops can get everything, the idea that the FBI can get all these records, you know, all of these things, your whole history, is very much a new thing.
00:19:14.000 And nobody talks about that today.
00:19:16.000 We just presume it's normal.
00:19:18.000 We presume it's okay.
00:19:19.000 But between 1967 and today, think about how many more records there are about your life and how you live.
00:19:28.000 Private things about you that have nothing to do with criminality.
00:19:31.000 And everything to do with the intimacy of who you are.
00:19:35.000 And the fact that all of that now today is exposed.
00:19:38.000 And not just to, let's say you love the U.S. government.
00:19:41.000 Let's say you are, like, throwing cookouts for your local police department, but every other government in the world, too.
00:19:51.000 And we really need to ask ourselves, how much information do the authorities of the day need to do their job?
00:20:00.000 How much do we want them to have?
00:20:02.000 How much is proper and appropriate and necessary?
00:20:04.000 And how much is too much?
00:20:06.000 And if we decide the cops shouldn't have this, if we decide the spies shouldn't have this, well why in the hell should Facebook or Google or somebody trying to sell you Nikes, why should they have this?
00:20:19.000 Yeah, and what's the answer to that?
00:20:21.000 They shouldn't, right?
00:20:23.000 But nobody wants to go backwards.
00:20:26.000 Once you have gained a certain amount of access, and you can justify that access, like we're stopping crimes, like the Patriot Act, and then later the Patriot Act II, which was even more overreaching.
00:20:39.000 Once they have that kind of power, they never go, you know what, we went too far.
00:20:44.000 Yeah.
00:20:45.000 We have too much access to your privacy, and even if you've committed a crime, we shouldn't have unrelated access to all these other activities that you're involved in.
00:20:55.000 Yeah, and I mean, that's exactly the thing about the whole Save the Puppies Act, right?
00:20:58.000 If it's got a name like that, you've got to be like, no, something doesn't smell right here.
00:21:03.000 There's something bad in this.
00:21:06.000 And I mean, so this gets back to that...
00:21:13.000 Initial topic of what did the court decide, right?
00:21:17.000 So we had the Patriot Act, and the Patriot Act was this giant law that had been written long before 9-11.
00:21:24.000 It was just sitting on the shelf.
00:21:26.000 And the Department of Justice, the FBI, they knew they couldn't pass this.
00:21:30.000 They knew nobody would live with it because it was an extreme expansion of government authority.
00:21:36.000 And then 9-11 happened, right?
00:21:38.000 And that's really where it all started to go wrong.
00:21:40.000 That's where we got the rise of this new authoritarianism that we see continuing in the United States today, right?
00:21:45.000 Like if you think, and you know, like you have problems with what's happening under Donald Trump.
00:21:52.000 But you also had problems under, like, what was happening with Obama and the expansion of the war on whistleblowing.
00:21:58.000 You had problems with the way drone strikes were going out of control.
00:22:01.000 You go, well, really, where did this all start, right?
00:22:04.000 Where did this start to go wrong?
00:22:06.000 Personally, I think...
00:22:08.000 9-11 was where we made a fundamental mistake, and that was we were so frightened in the moment because we had had such an extraordinary and rare terrorist attack succeed, which by the way could have been prevented, and I think we discussed this in the last episode.
00:22:30.000 The Congress, you know, they were just terrified.
00:22:34.000 They said, look, intelligence services, cops, FBI, whoever, anything you want, blank check, here you go.
00:22:40.000 That was the Patriot Act.
00:22:41.000 And at the time, groups like the American Civil Liberties Union, they were like...
00:22:49.000 We are worried that this goes too far, because God bless them, that's what the American Civil Liberties Union does.
00:22:54.000 And one of the provisions that they had a problem with was this Section 215 of the Patriot Act, which I believe they were calling at the time the Library Records Provision.
00:23:05.000 And what it said, basically, this tiny little phrase in the law...
00:23:10.000 It said, the FBI can basically get any records that it deems relevant to a counterterrorism investigation under a warrant.
00:23:20.000 And the worst thing the ACLU could imagine was that these guys would go to the library and get what kind of books you're reading and like shock horror this is the worst thing these guys could do.
00:23:32.000 And so they protested and they lost and this passed and it went on, and lo and behold, 10 years later we find out in 2013 they had used this provision that people were worried about just going after individuals' library records to instead get the phone records of not an individual,
00:23:52.000 not a group, but everybody in the United States who was making calls on U.S. telecommunications providers delivered to the NSA daily by these companies, right?
00:24:03.000 So no matter who you were, no matter how innocent you were, the FBI was getting these because they said, well, every phone call is relevant to a counterterrorism investigation.
00:24:13.000 And the court finally went, you know, this is seven years after 2013, they went, guys, that's too much.
00:24:19.000 If your definition of relevance is basically anything, anywhere, all the time is relevant to a counter-terrorist investigation, the question is, what then is not relevant?
00:24:31.000 What is the limiting principle on this?
00:24:33.000 Where is the end?
00:24:35.000 And this is a very important thing, because even if it's not enough, right, even if this doesn't shut down all the programs, the program was actually already stopped a few years ago because of previous court decisions and changes in law, the fact that the courts are finally beginning to look at the impacts of these sweeping new technologies that allow governments to see all of these connections and interactions that we're having every day,
00:24:59.000 they're finally putting limits on it.
00:25:03.000 And that is, I think, transformative.
00:25:05.000 It is the foundation of what we will see in the future will begin to be the first meaningful guarantees of privacy rights in the digital age.
00:25:16.000 Now that you have been, at least according to this court, exonerated or justified, what happens to you?
00:25:28.000 And what happens to what they've been doing?
00:25:33.000 And how much of the breaks do they hit on this?
00:25:39.000 What changes?
00:25:40.000 Does anything change in the government's sweeping surveillance?
00:25:44.000 It's a great question.
00:25:45.000 I mean, you would think when you get a court, not even a first-level court, but an appeals court, that looks at these issues, you know, they're talking about serious stuff, they're talking about counter-terrorism investigations.
00:26:01.000 By the way, in the same thing, in the same decision, they said the government has been arguing, you know, for 20 years now, these programs were saving lives.
00:26:12.000 They were stopping terrorist attacks.
00:26:14.000 They said, you know, first they said mass surveillance had stopped 54 terrorist attacks in the United States.
00:26:20.000 Then they dropped it to seven.
00:26:22.000 And then they dropped it to one.
00:26:24.000 And the one terrorist attack or terrorist conspiracy, whatever, that they said it did stop was this case that was just decided.
00:26:35.000 And the court found, and this is important, after looking at the government's classified evidence, so this is not just the court deciding on their own.
00:26:43.000 This is the government going, look, here's all the evidence that we have, the top secret stuff, the stuff that nobody can see.
00:26:49.000 Please don't, you know, say our program is ineffective or whatever.
00:26:52.000 The court looked at it and they went, holy crap.
00:26:57.000 This invasion of hundreds of millions of Americans' privacy happening over the span of decades did not make a difference in this case.
00:27:08.000 They said even in the absence of this program, if it hadn't existed, if the government had never done it, they still would have busted this ring because they were already closing in on them.
00:27:20.000 The FBI already had all the evidence they needed to get a warrant to get the records through traditional means.
00:27:26.000 And the fact the government had been saying, Congress had been saying for years and years and years that this program was necessary, the court says that was misleading, which is legalese for saying the government are effing liars on this.
00:27:42.000 So that raises the question of, okay, as you said, Well, what now?
00:27:46.000 How does this change everything?
00:27:48.000 Well, it does mean the government has to stop doing this particular kind of program directly, but that program had already shut down.
00:27:58.000 And the government has a really great team of lawyers for every agency, right?
00:28:04.000 The DOJ has got lawyers, the White House has lawyers, the FBI has lawyers, the NSA has lawyers, and the CIA has lawyers.
00:28:09.000 And the only thing these guys are paid to do all day is to look at basically these legal opinions from the court that says all the ways the government broke the law and go, huh, is there any way we can just rejigger this program slightly so that we can dodge around that court ruling to go,
00:28:30.000 all right, you know, the abuses are still happening, but they're happening in a less abusive way, and then it's business as usual.
00:28:38.000 So this is always the process with the courts ruling against the government.
00:28:44.000 This is not an exceptional thing in the case of, you know, it's NSA, it's CIA. What happens is when the government breaks the law, as the court ruled they did last week, there is no punishment.
00:28:59.000 There is no criminal liability for all the bastards, the head of the FBI, the head of the NSA, who were violating Americans' rights for decades.
00:29:09.000 Those guys don't go to prison.
00:29:10.000 They don't lose their jobs.
00:29:12.000 They don't even smell the inside of a courtroom where they're the ones wearing handcuffs.
00:29:18.000 And because of that, it creates a culture of unaccountability, of impunity, right?
00:29:23.000 Which means with each generation of government officials, they study this.
00:29:27.000 They study the cases against them.
00:29:28.000 They study where they won.
00:29:29.000 They study where they lost.
00:29:31.000 And what they do is they try to create exactly what just happened, which is a system where they can break the law for 10 years, you know, 2001 to 2013 basically, and no one even knows that it's happening.
00:29:46.000 Classification protects that, right?
00:29:48.000 Then eventually it gets exposed.
00:29:51.000 There's a leak.
00:29:51.000 Of course, somebody blows the whistle on it, right?
00:29:54.000 It becomes a scandal.
00:29:56.000 The government, you know, they'll disown this program.
00:29:58.000 They'll change the law there.
00:30:00.000 But somebody, like the ACLU, will sue the government.
00:30:04.000 And so the courts will finally be forced to look at these things.
00:30:08.000 But the wheels of justice turn slow, right?
00:30:10.000 The government will try to put the brakes on it.
00:30:14.000 The plaintiffs, the civil society organizations that are suing will have to gather evidence.
00:30:19.000 It's really difficult to do because the government's not providing anything.
00:30:22.000 It's all classified.
00:30:23.000 And then basically it takes another five years, another ten years for the court to get to their verdict.
00:30:28.000 And then we have it.
00:30:30.000 But then nobody goes to jail, right?
00:30:31.000 Nobody actually faces serious consequences who is responsible for the wrongdoing.
00:30:37.000 And so the cycle continues.
00:30:38.000 But having said that, it might feel disempowering.
00:30:42.000 People might go, oh, we can't win.
00:30:45.000 But this is in the context of a system where we lack accountability, where the government does have a culture of impunity.
00:30:53.000 This is what winning looks like because things do get better.
00:30:57.000 The problem is they get better by decades.
00:30:59.000 They get better by half centuries and centuries.
00:31:02.000 If you look at the United States, you know, 200 years ago, 100 years ago, things were objectively worse on basically every measure.
00:31:10.000 The fact that we have to crawl to the future, it's a sad thing when we know it could be fixed very quickly by establishing some kind of criminal liability for people like James Clapper, the former Director of National Intelligence, who lied under oath to Congress and the American people, saying exactly this program didn't exist.
00:31:30.000 The NSA wasn't collecting any information on millions or hundreds of millions of Americans when in fact they were doing that every day.
00:31:38.000 Obama did not fire him, right?
00:31:40.000 Obama did not charge him.
00:31:42.000 Obama let him serve out the end of his days and then retire happily.
00:31:45.000 But it's not an Obama problem, right?
00:31:49.000 We see the same kinds of abuses happening under the Trump administration.
00:31:52.000 We saw the same kind of abuses happening under the Bush administration.
00:32:00.000 And the only way this changes materially is if our government changes structurally, right?
00:32:04.000 And that's kind of the issue that I think everybody in the country sees.
00:32:07.000 When you look at the economy, when you look at all the struggle, when you look at all the class conflict and the divide and the political partisanship that's happening today, The problem isn't, right, like about this law or this court ruling or this agency.
00:32:23.000 It's about inequality of opportunity, of access, even of privilege, right?
00:32:28.000 I know people don't like talking about that.
00:32:30.000 It's uncomfortable.
00:32:31.000 People are like, oh my God, you know, are you, like, whatever.
00:32:38.000 But the reality is we have a few people in the country, you know, the Jeff Bezos, the Bill Gates, that own everything.
00:32:44.000 Like ten people owning half the country.
00:32:46.000 And half the country owning nothing at all.
00:32:49.000 And this applies to influence, right?
00:32:53.000 When you have that kind of disproportionality of resources, you have that kind of disproportionality of influence.
00:33:02.000 Your vote means less.
00:33:04.000 Your ability to change the law means less.
00:33:06.000 Your access to the courts means less.
00:33:09.000 And that's how we end up in the situation where we are today.
00:33:14.000 That's very disheartening.
00:33:16.000 Well, it doesn't have to be, because the important thing is we can change it.
00:33:21.000 Can we, though?
00:33:22.000 I mean, like, what can we do?
00:33:24.000 And what can anybody change at this point to stop this overwhelming power that the government has to invade your privacy and to all the things that you exposed?
00:33:36.000 When you talk about how the particular program that was in place has been shut down, but all they do is manipulate it slightly, do it so that you can argue in court that it's not the same thing, that it's a different thing, come up with other justifications for it,
00:33:53.000 withhold evidence, and then drag the process out for years and years and years.
00:33:59.000 For you to be so optimistic is really kind of spectacular considering the fact that you've been hiding in another country, allegedly.
00:34:06.000 We don't even know.
00:34:06.000 You might be in Ohio.
00:34:07.000 We don't know.
00:34:09.000 You know, we don't know.
00:34:10.000 But you're essentially on the lam for exposing something that has now been determined to be illegal.
00:34:19.000 So you are correct.
00:34:21.000 When you go back to Obama's hope and what was his website?
00:34:26.000 Hope and change.
00:34:27.000 Hope and change.
00:34:28.000 A big part of hope and change was protecting whistleblowers.
00:34:32.000 Do you remember that?
00:34:33.000 And that was all deleted later.
00:34:36.000 Later on, they were like, yeah, let's go back and take that shit out.
00:34:39.000 We didn't know.
00:34:40.000 I didn't know what it was like to actually be president back then.
00:34:42.000 I was just trying to get in there.
00:34:43.000 But the hope and change stuff was still there when you were being tried.
00:34:48.000 It was still there when they were chasing you and trying to find your location.
00:34:53.000 When the Guardian article came out, the hope and change shit was still online.
00:34:58.000 And that's the fact that you're so optimistic, even though you've been fucked over royally.
00:35:04.000 I mean, you are, in my opinion, you're a hero.
00:35:07.000 I really think that.
00:35:08.000 And I really think that what you exposed is hugely important for the American citizens to understand that absolute power corrupts absolutely.
00:35:21.000 And these people had the ability to look into everything.
00:35:25.000 And they still do.
00:35:26.000 They have the ability to look into everything you're doing.
00:35:28.000 And the fact that through these years, it literally stopped zero terrorist attacks.
00:35:35.000 Zero.
00:35:37.000 So this sweeping, overwhelming intrusion of your privacy had no impact whatsoever on your safety.
00:35:46.000 Well, it wasn't about safety.
00:35:48.000 It was about power, right?
00:35:49.000 They told us it was about safety.
00:35:52.000 Again, it's the Save the Puppies Act.
00:35:55.000 If you see government saying all these things that work for safety, they're protecting you, and they never establish the efficacy of it, the chances are it probably isn't effective.
00:36:07.000 Because, you know, the government leaks all the time.
00:36:11.000 If they say, we saved this person, we did that.
00:36:15.000 Whenever they're being criticized, they go on TV and they very seriously go, oh, that's classified and we can't expose that.
00:36:21.000 You never hear of the successes we do because it's so important that they stay secret.
00:36:25.000 Look, I worked for the CIA, I worked for the NSA. That's bullshit.
00:36:29.000 When they do something great, it's on the front page of the New York Times by the end of the day because they're fighting for budget, they're fighting for clout, they're fighting for authority, they're fighting for new laws.
00:36:41.000 Constantly.
00:36:41.000 So there are no real accomplishments that are in the shadows that they just don't tell us about.
00:36:47.000 Think about when we got Bin Laden, right?
00:36:51.000 Obama's like, I want a press conference within the next 20 minutes.
00:36:55.000 And again, this is not to bag on Obama.
00:36:57.000 Any president would do this.
00:36:59.000 That's just how it is.
00:37:02.000 Of course, there are some secret successes, but it's about stuff that no one cares about.
00:37:09.000 It's stuff that wouldn't win the political clout.
00:37:11.000 It's like they gained an advantaged negotiating position on the price of shrimp and clove cigarettes, which was actually one of the stories that came out of some kind of classified disclosure that I think was from WikiLeaks.
00:37:26.000 That kind of stuff, it actually does happen, right?
00:37:30.000 But we're never having a conversation of, do you want to give up all of your privacy rights so that we can get better prices on shrimp and clove cigarettes?
00:37:37.000 Like, that would be a very different political conversation than, do you want to give up all of your privacy rights because if you don't, your children will die.
00:37:46.000 And, you know...
00:37:47.000 Save the puppies.
00:37:50.000 Right, exactly.
00:37:51.000 Save the puppies, 2020. Right.
00:37:55.000 So this thing you ask about me and optimism, I have been criticized relentlessly for being a naive optimist, right?
00:38:07.000 And my answer is that you don't do what I did unless you believe that people can do better.
00:38:13.000 I took a very comfortable life.
00:38:16.000 I was living in Hawaii with the woman that I loved.
00:38:19.000 I had to do basically no work but go in the office and read spy feeds about people all day long.
00:38:27.000 And I could have done that, you know, for the rest of my life quite happily.
00:38:31.000 It would have been great.
00:38:32.000 I set that on fire because I believed that what I saw was wrong.
00:38:37.000 And I believed that people deserved to know about it.
00:38:39.000 And I believed that if people did know about it, that things would change.
00:38:44.000 I did not believe it was going to save the world.
00:38:48.000 I did not believe I was going to get, you know, a ticker tape parade and a pardon.
00:38:53.000 You know, be welcomed with open arms.
00:38:55.000 There's actually, if you watch Citizenfour, which is the documentary from 2013 where I was meeting with reporters and Laura Poitras had the camera rolling in the room when we talked for the first time.
00:39:08.000 I said, you know, the government's gonna say I harm national security.
00:39:12.000 I put everybody in jeopardy.
00:39:13.000 They're gonna charge me under the Espionage Act.
00:39:16.000 And they really did try to destroy my life.
00:39:19.000 They tried to put me in prison forever.
00:39:21.000 And to this day, they are still trying to do the same thing.
00:39:25.000 That's just how it is.
00:39:27.000 You know, this wasn't like...
00:39:28.000 Even though.
00:39:29.000 Even though, yeah.
00:39:30.000 Even though the most recent ruling has showed that you were correct, and what they were doing was illegal, and you exposed a crime.
00:39:38.000 Yeah, well, I mean, this is a continuing story.
00:39:41.000 In 2013, you know, when this first came out, President Obama went out on stage, you know, because he was getting singed in the press, and said, you know, take it from me, nobody is listening to your phone calls.
00:39:56.000 Even though nobody said they were listening to your phone calls.
00:40:01.000 It wasn't like they had headsets on, you know, 300 plus million people in the United States.
00:40:07.000 You'd have to have computers do that.
00:40:10.000 But what they did do was they collected the records of your phone calls.
00:40:14.000 And to an analyst, to an intelligence analyst, that's more valuable than the transcripts of your phone calls.
00:40:20.000 We care less about what you said on the phone than who you called, when you called them, what else you were doing, what your phone was doing, right?
00:40:30.000 The websites that you would access, the cell phone towers they were connected to.
00:40:34.000 All of those things, that metadata creates what's called the social graph.
00:40:39.000 Your pattern of life. It says, based on when your phone becomes active in the morning, when you start calling people, when you start browsing, when you check your Twitter feed, you're scrolling on Instagram, whatever, that's when you wake up.
00:40:51.000 When it stops, that's when you go to sleep.
00:40:54.000 We see where you are.
00:40:55.000 We see where you live.
00:40:56.000 We see who you live with.
00:40:58.000 All of those things, right?
00:40:59.000 That's just from metadata.
00:41:01.000 You don't need the content of your communications.
00:41:03.000 I don't need to see what picture you posted on Instagram.
00:41:07.000 To know you're awake and active, and you're communicating with this person at this phone, this place, this area code, this IP address, you know, this version of software, whether they're using Android or iOS, you know, all of these things.
00:41:20.000 And now as we get smartphones, as your cars begin connecting to the internet, it's just richer and richer and richer data.
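As a toy illustration of the point about metadata, the sketch below derives a rough "pattern of life" from nothing but call records; the record format, the sample data, and the inferences are all invented for illustration, and no message content appears anywhere.

```python
# Toy illustration: a "pattern of life" inferred from call metadata alone.
from collections import Counter
from datetime import datetime

# (caller, callee, timestamp, cell_tower_id) -- metadata only, no content.
call_records = [
    ("alice", "bob",   "2020-09-14T07:05", "tower_12"),
    ("alice", "carol", "2020-09-14T09:30", "tower_12"),
    ("alice", "bob",   "2020-09-14T22:40", "tower_12"),
    ("alice", "bob",   "2020-09-15T07:10", "tower_12"),
    ("alice", "dave",  "2020-09-15T13:00", "tower_47"),
]

def pattern_of_life(records, person):
    times, contacts, towers = [], Counter(), Counter()
    for caller, callee, ts, tower in records:
        if person in (caller, callee):
            times.append(datetime.fromisoformat(ts))
            contacts[callee if caller == person else caller] += 1
            towers[tower] += 1
    return {
        # Earliest/latest activity hours approximate wake and sleep times.
        "first_activity_hour": min(t.hour for t in times),
        "last_activity_hour": max(t.hour for t in times),
        # Most-contacted numbers sketch the social graph around the person.
        "top_contacts": contacts.most_common(3),
        # The most frequent tower approximates where they live or work.
        "likely_home_tower": towers.most_common(1)[0][0],
    }

print(pattern_of_life(call_records, "alice"))
```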
00:41:28.000 I don't know where I was going with that, sorry.
00:41:31.000 I got off topic, but...
00:41:35.000 The bottom line is things get better.
00:41:38.000 They get better slowly.
00:41:39.000 Oh, right.
00:41:40.000 Sorry.
00:41:40.000 Now, Obama was saying, you know, nobody listens to your phone calls, right?
00:41:44.000 That was June 2013. By January of 2014, giving his State of the Union address, he went, although he could never condone what I did, The conversation that I started has made us stronger as a nation.
00:41:56.000 He was calling for the end of this program, the passage of a new law called the USA Freedom Act, another Save the Puppies Act, which was better than the thing it was replacing but still really bad.
00:42:10.000 And he did that not out of the goodness of his heart.
00:42:13.000 He did that because the court in December of 2013 had ruled these programs were unlawful and likely unconstitutional.
00:42:21.000 And this is again, it's not an Obama thing, it's a power thing.
00:42:25.000 This is how the system works, right?
00:42:27.000 But year by year, step by step, things get better.
00:42:30.000 We make progress a little bit at a time.
00:42:32.000 And the fact that someone is suing, the fact that the ACLU is bringing this case, and we should thank them for that, for years, which is a difficult and expensive proposition with no guarantee of success, means that we have stronger privacy rights seven years later as a result.
00:42:49.000 That doesn't mean we save the world.
00:42:52.000 That doesn't mean we relax.
00:42:53.000 We sit down on the couch.
00:42:55.000 You know, there's the golden sunset.
00:42:56.000 That's not how life works.
00:42:58.000 It is a constant struggle.
00:43:00.000 But when we do struggle, when we do stand up, we believe in something so strongly we don't merely believe in it.
00:43:08.000 But we risk something for that belief.
00:43:11.000 We work together and we pull the species forward an inch at a time.
00:43:16.000 We move away from that swamp of impunity and unaccountability into a future where, hey, maybe not just the little guy breaks the law and goes to jail, but maybe a senator, maybe an attorney general, maybe a president,
00:43:33.000 right?
00:43:34.000 And that would be a very good precedent to have.
00:43:38.000 Do you wonder whether or not someone will use you as a political chess piece at this point and decide?
00:43:47.000 Correct me if I'm wrong, but I'm pretty sure you have overwhelming support of the general public.
00:43:54.000 Most people believe that what you did was a good thing for America and that you are, in fact, a patriot.
00:44:01.000 I think the vast majority of people, and the people that I've talked to, I have talked to a few people that disagree with that, they're misinformed.
00:44:08.000 They were misinformed about what you did and what information you leaked or whether or not people's lives were put in danger because of that.
00:44:15.000 And I had to explain the whole chain of events and where the information actually was, how it was leaked and what you had done to protect people.
00:44:25.000 Could you please explain that?
00:44:27.000 Because it wasn't just that the information was dumped.
00:44:31.000 Yeah, so I mean, this is really the subject of our last conversation.
00:44:35.000 It goes on for three hours, but I wrote a book.
00:44:37.000 Yeah, but just so this will stand alone.
00:44:40.000 I'll go through it.
00:44:41.000 So the idea, and this is the subject of my first book, Permanent Record, which was why I came on last year.
00:44:48.000 And actually, just this week, the softcover came out, so it's more affordable for people who didn't want to get it before.
00:44:53.000 It is this story, right?
00:44:58.000 It's who I am, where I came from, why I did this, how, and what it meant.
00:45:03.000 I didn't just reveal information.
00:45:05.000 I gave it to journalists, right?
00:45:08.000 These journalists were only given access to the information on the condition that they would publish no story simply because it was newsworthy or interesting, right?
00:45:17.000 They weren't going to clickbait classified documents.
00:45:20.000 They would only publish stories if they were willing to make an independent institutional judgment and stand by it that it was in the public interest that this be known, right?
00:45:32.000 And then as an extraordinary measure on top of that, before they publish the stories, right?
00:45:36.000 And this is not me publishing things, putting them out on the internet or blog or something, which I could have done, would have been very easy.
00:45:43.000 It's not me telling them what to write or not to write.
00:45:45.000 They're doing this, the Guardian, the Washington Post, you know, Der Spiegel.
00:45:51.000 They are then going to the United States government in advance of publication and giving the government a chance, an adversarial opportunity to argue against publication.
00:46:01.000 To go, you guys don't get it.
00:46:03.000 You know, Snowden's a liar.
00:46:04.000 These documents are false.
00:46:06.000 Or, he's not lying, and yes, these are true, but these programs are effective, they're saving lives, whatever, and here's what we can show you to convince you, please don't publish this or leave out this detail.
00:46:16.000 And in every case I'm aware of, that process was followed.
00:46:19.000 And that's why now in 2020, remember, we're seven years on from 2013, the government has never shown a single example of any harm that has come as a result of the publication of these documents back in 2013,
00:46:34.000 the revelation of mass surveillance.
00:46:37.000 That's what I wanted to bring up.
00:46:40.000 It's unscientific, but I've seen polls run on Twitter very recently in the last few weeks when this pardon question came out, where 90%, like 90 plus percent of people were in favor of a pardon.
00:46:55.000 And that's crazy.
00:46:57.000 Even in 2013 when we were doing well, it was like 60%.
00:47:02.000 In favor among young people, but it was like 40% for older people.
00:47:07.000 But that's because the government was on TV every Sunday, you know, bringing these CIA suits going, who were there with their very stern faces going, oh, this caused great damage and it cost lives and everything like that.
00:47:20.000 But those arguments stop being convincing when seven years later, after they told us the sky is falling, the atmosphere never catches fire, right?
00:47:29.000 The oceans never boil off.
00:47:31.000 We're still alive.
00:47:33.000 And I think people can see through that.
00:47:35.000 And that was, again, exactly what you said.
00:47:38.000 People don't know this history, that 10% who are against it, and actually a lot of the 90% who are even in favor of it.
00:47:47.000 They don't know the details.
00:47:49.000 It wasn't well covered by the media at the time.
00:47:51.000 It was all about this person said that, that person said that.
00:47:53.000 Is it true?
00:47:54.000 Is it false?
00:47:57.000 They were playing on character.
00:47:58.000 They were trying to make a drama out of it.
00:48:00.000 And that's a big part of why I wrote Permanent Record.
00:48:04.000 And it's been tremendously gratifying to see people connect to it.
00:48:09.000 And actually, this, you know, I mentioned it, we talked on Twitter, when we were talking about the possibility of having this conversation.
00:48:16.000 And I was like, I looked back at our first conversation we had, and it's had like 16 million views, man, that's for a three hour conversation.
00:48:26.000 And then probably an equal amount of people just listened to it in audio.
00:48:30.000 Right!
00:48:31.000 And that was just for one clip on YouTube.
00:48:34.000 There were smaller clips talking about cell phone surveillance and that was like another 10 million views.
00:48:39.000 77,000 comments.
00:48:41.000 The book on Amazon has thousands of reviews.
00:48:45.000 It's got a 4.8 rating from the people who've rated it.
00:48:51.000 That's one of the best autobiographies according to ordinary people in the audience in like years.
00:48:56.000 And to see that after these years of attacks, to me, is evidence that despite all these news guys at night going, well, Senator, you know, no one really cares about privacy these days.
00:49:10.000 These kids with their Facebooks and their Instagrams, you know, people do care.
00:49:16.000 What they're actually feeling is kind of what you got to earlier, like this sensation that nothing changes.
00:49:22.000 Like, even when we win, we lose.
00:49:24.000 But the thing is, you've got to have a broader view of time.
00:49:28.000 You've got to look at the sweep of history rather than the atmosphere of the moment.
00:49:33.000 Because right now, yes, things are very bad.
00:49:38.000 And even if you love Donald Trump, because I know some of your viewers do, you've got to admit, a lot of things in the world suck right now.
00:49:45.000 A lot of things in the country suck right now.
00:49:48.000 But the thing is, they only get better if somebody does the hard work to make them better.
00:49:54.000 And there's no magic wand.
00:49:55.000 There's no happy ending, right?
00:49:57.000 Life is not that simple.
00:49:58.000 But together, we can make it better.
00:50:01.000 And we do that through struggle.
00:50:06.000 Do you—has there been any discussion about someone pardoning you?
00:50:10.000 Has there been—I mean, this was the question initially that led to this, but I wanted you to expand on what actually went down.
00:50:16.000 But has there been any discussion about you being pardoned or someone using you as, like I said, a political chess piece?
00:50:27.000 It would be a smart thing.
00:50:30.000 And if anybody has had a problem with the intelligence community, it is Donald Trump.
00:50:35.000 I mean, he's the only president in any memory that has had open disagreements and been openly disparaging of the intelligence community.
00:50:44.000 Well, that's not true.
00:50:45.000 There was JFK, but...
00:50:47.000 That didn't go very long.
00:50:48.000 That's right.
00:50:49.000 I forgot about that.
00:50:51.000 Good point.
00:50:51.000 Yeah, that went terrible for him.
00:50:54.000 For Trump, it actually seems to be a positive in some strange way.
00:50:59.000 If anybody is going to pardon you, I would imagine that would be the guy.
00:51:04.000 So this idea of the political bargaining chip has actually been used in a different way.
00:51:09.000 There was the idea...
00:51:11.000 And it's funny because this was actually promoted by all these CIA deputy directors and whatnot who were responsible for these abuses of Americans' rights, who were writing opinion pieces in the newspaper, and they were like, you know, what if Vladimir Putin sends Snowden to Trump as an inauguration gift?
00:51:29.000 Wouldn't that be terrible for him?
00:51:32.000 And they were like, hint, hint.
00:51:35.000 But I don't think when we talk about this stuff...
00:51:41.000 I don't think there's anything I can do to control it.
00:51:43.000 One of the things people have asked is, like, would I accept a pardon from Donald Trump?
00:51:49.000 And I think that misapprehends what a pardon is and how it works.
00:51:53.000 A pardon is not a contract.
00:51:55.000 A pardon is not something that you agree to.
00:51:58.000 A pardon is a constitutionally enumerated power.
00:52:02.000 I think it's Article 2, Section 2. The reason that it exists is basically as a check on the laws and the judiciary, where the laws as written become corrosive to their intention.
00:52:20.000 And this is something that I think actually is meaningful.
00:52:23.000 You know, people are like, are you going to ask Donald Trump for a pardon?
00:52:27.000 And the answer is no.
00:52:30.000 But I will ask for pardon for Terry Albury and Daniel Hale and Reality Winner and all the other American whistleblowers who have been treated unfairly by this system.
00:52:41.000 The whole thing that brought this up was two weeks ago.
00:52:43.000 Some journalist asked the president, like, oh, you know, what do you think about Snowden?
00:52:48.000 Are you going to pardon him?
00:52:50.000 And he said he seemed to be thinking about it.
00:52:52.000 He heard I have been treated very unfairly.
00:52:55.000 That's accurate!
00:52:57.000 Because it's impossible to get a fair trial under the Espionage Act, which is what I've been charged under.
00:53:03.000 And every American whistleblower since Daniel Ellsberg in the 1970s has been charged under this law, the Espionage Act, which makes no distinction between someone who is stealing secrets and selling them to foreign governments, which neither I nor any of these other people have done,
00:53:21.000 and someone giving them freely to journalists to advance the public interest of the American people rather than the private interest of these spies, you know, individually.
00:53:33.000 And this is the kind of circumstance for which the pardon power exists, where the courts and judges will not or cannot end a fundamentally unfair and abusive circumstance in the United States,
00:53:52.000 either because they're fearful of being criticized as soft on terrorism or whatever, or because the law prohibits them from doing so.
00:54:01.000 The problem with the Espionage Act is it means you can't tell the jury why you did what you did.
00:54:07.000 You cannot mount what's called a public interest defense.
00:54:10.000 Where you say, hell yeah, I broke the law.
00:54:14.000 I took a classified document and I gave it to the journalist and the journalist published it and then it went to the courts and the court said, This guy was right.
00:54:23.000 The government was breaking the law.
00:54:25.000 In the courts, if I were, you know, in prison today, as Reality Winner is in prison today, or rather Daniel Hale, who revealed government abuses related to the drone program, or Terry Albury, who revealed problems with racial policies in the FBI,
00:54:42.000 how they were being abused.
00:54:45.000 When these guys are on trial, all of that stuff is forbidden from being spoken.
00:54:51.000 Daniel Ellsberg's lawyer asked Daniel Ellsberg, why did you do it?
00:54:56.000 In court, in open court, under oath, you know, why did you publish or provide to journalists the Pentagon Papers?
00:55:03.000 And the prosecutor said, objection, objection, he can't say that.
00:55:06.000 And the judge said, sustained, fine, he can't say it.
00:55:09.000 And his attorney looked at the judge like he was crazy and said, I've never heard of a trial where the jury is not allowed to hear why a defendant did what they did.
00:55:21.000 And the judge said, well, you're seeing one now.
00:55:25.000 And this is why the pardon power exists.
00:55:29.000 Well, that's what's so creepy about something like the Espionage Act.
00:55:33.000 If you can't even establish a motive, you can't even explain that you are doing this for the American people, that there's a real precedent that should be set for this kind of thing, especially in regards to what you're being charged with, where it has now been determined that you were exposing something that was,
00:55:50.000 in fact, illegal.
00:55:51.000 And this is, it's an incredibly un-American thing.
00:55:57.000 It's very un-American.
00:55:59.000 It really is.
00:56:01.000 It's disturbingly so.
00:56:03.000 I mean, we see these kind of injustices happening in the United States every day, and it's not about the Espionage Act specifically.
00:56:09.000 I mean, you see with drug charges, you see with civil forfeiture, asset forfeiture, where like, you know, they take an old lady's car because her nephew was selling weed or something like that, and there's no way for her to get it back.
00:56:21.000 Whether we're talking civil or criminal, whether we're talking federal or state, we see where the system of laws in the United States is letting people down constantly.
00:56:38.000 But the question becomes, how do we fix this?
00:56:41.000 How does that get addressed?
00:56:43.000 And, you know, you can mount a national campaign, you can try to change the law, but as we talked about before, unless you're Jeff Bezos, unless you're Bill Gates, that's very difficult to do.
00:56:53.000 But the governor can pardon people for state crimes.
00:56:56.000 The president can pardon people for federal crimes.
00:56:58.000 But we have not developed a compassionate culture that actually looks at this.
00:57:04.000 Every president has abused their pardon power or their pardon authority to sort of let their cronies off the hook.
00:57:11.000 We've seen it under this president.
00:57:12.000 We've seen it under previous presidents.
00:57:15.000 But it is very difficult to establish an understanding among average people that it's actually okay for presidents to use this power more liberally, particularly when we're talking about nonviolent offenses, when we're talking about things that are not,
00:57:33.000 you know, that controversial, but are being controversialized because of the political atmosphere of partisanship, where everything has to be criticized for political advantage from one side or the other.
00:57:46.000 Everything's become a football.
00:57:49.000 Well, particularly in your case when you're talking about polls that show 90% of people support you being pardoned and this recent ruling that what you exposed was illegal.
00:58:00.000 I wonder how much the president actually knows about your case.
00:58:06.000 It's a good question.
00:58:07.000 He's famous for barely paying attention in briefings.
00:58:11.000 I can't imagine that in 2013 this was fully on his radar, where he investigated it and read all the documents and really got deep into it.
00:58:21.000 I can't imagine he really knows everything that went down.
00:58:25.000 I bet he hasn't seen Citizenfour.
00:58:28.000 Just call him and tell him to watch it.
00:58:31.000 Listen, if I had his number, I really would.
00:58:34.000 And I do know people who know him, and I am going to communicate that after this conversation.
00:58:40.000 I think that would be...
00:58:42.000 I literally think that would win him a tremendous amount of political favor.
00:58:46.000 I really do.
00:58:48.000 I think, particularly at this point in time, where...
00:58:51.000 People are really, look, if there's ever a time where people are fed up about the overreaching power of government, it's during this pandemic lockdown, you know, for good or for bad, whether it's correct or incorrect, people are very frustrated right now with power.
00:59:07.000 They're very frustrated right now with the draconian measures that some states have put in place to keep people from working and, in their eyes, keep people safe.
00:59:17.000 All this would contribute to the motivation to pardon you, because I think that it would show people that the president actually does agree that there have been some overreaches, and in your case, not just an overreach,
00:59:32.000 but a miscarriage of justice, a disgusting, un-American overreach.
00:59:37.000 I think when you ask this question about how much does he know about the case, it's fair to say not a lot because he's intentionally being misadvised by his advisors.
00:59:48.000 You've had the Attorney General, William Barr, who says he would be vehemently opposed to a pardon for me.
00:59:55.000 His Secretary of State, Mike Pompeo, has literally, I think said I should be killed.
01:00:00.000 John Bolton, at least, said I should be killed.
01:00:04.000 And, you know, I think when this conversation first came up a couple weeks ago, Mike Pompeo probably hid every pen in the White House because he's trying to make sure things like this don't happen.
01:00:14.000 I think there are a lot of people who try and control the president.
01:00:20.000 But this whole question about, you know, What's right for me?
01:00:26.000 What's right for the president in terms of political advantage is the wrong question.
01:00:32.000 This is why I haven't been advocating for pardon.
01:00:34.000 I didn't ask for a pardon from Obama.
01:00:37.000 I did ask for a pardon for Chelsea Manning, which we didn't get, but we did get clemency, and that's an important thing.
01:00:47.000 What we need is we need for pardons to be made not as a question of political advantage, but as a decision taken to further the public interest.
01:00:59.000 And this is why I say pardon all of these previous whistleblowers.
01:01:04.000 Thomas Drake, John Kiriakou, Terry Albury, Reality Winner, Daniel Hale.
01:01:08.000 There are many names.
01:01:10.000 Daniel Ellsberg, right?
01:01:11.000 He wasn't convicted, so he got out.
01:01:13.000 But these people deserve recognition as the patriots that they are, who stood up and took a risk for the rest of us.
01:01:20.000 Look at the current cases, right, that don't even require an exercise of the pardon power.
01:01:26.000 But Julian Assange, right now, today, is in court in the UK fighting an extradition trial to the United States.
01:01:33.000 For those who don't remember, this is the guy who's the head of WikiLeaks, right?
01:01:37.000 And he really fell out of favor in 2016 because he published the Hillary emails and everything like that, or Podesta emails.
01:01:46.000 But he's not being charged for that.
01:01:48.000 The extradition trial has nothing to do with that.
01:01:51.000 Actually, the U.S. government, under William Barr, the current Attorney General, is trying to extradite this guy and put him in prison for the rest of his life for the best work that WikiLeaks ever did, that has won awards in every country basically around the planet,
01:02:07.000 including the United States.
01:02:09.000 Which is the Iraq and Afghanistan war logs, right?
01:02:13.000 Detainee records in Guantanamo Bay.
01:02:16.000 Things that are about explicit war crimes and abuses of power, torture and people who were killed who shouldn't have been killed, violations of use of force protocols, and all of these things, right?
01:02:28.000 And this could all be made to go away if William Barr, the Attorney General, simply dropped the charges.
01:02:34.000 And he should.
01:02:36.000 Why isn't he?
01:02:38.000 Well, Julian Assange has literally been tortured.
01:02:42.000 I mean, the guy was locked in that embassy for how many years with no exposure to daylight, just completely trapped.
01:02:49.000 And you've seen videos of him skateboarding around the embassy.
01:02:53.000 I mean, it looks like he's going crazy in there.
01:02:54.000 And now he's in jail and on trial.
01:02:58.000 The whole thing is, it's so disturbing because, you know, when it boils down to, like, what did he do that is illegal?
01:03:08.000 What did he do that people disagree with, that people in the United States, the citizens, disagree with?
01:03:15.000 Well, he exposed horrific crimes.
01:03:18.000 He exposed things that were deeply...
01:03:22.000 that the United States citizens are deeply opposed to.
01:03:27.000 And the fact that that...
01:03:31.000 is something that you in this country can be prosecuted for, that they would try to extradite you and drag you from another country.
01:03:40.000 They'd kick him out of the embassy and bring him back to the United States to try him for that.
01:03:44.000 It seems like we're talking about some kangaroo court.
01:03:47.000 It seems like we're talking about some dictatorship where you have no protection to freedom of speech, no protection under the First Amendment, no protection under the rights of the press.
01:04:01.000 It's so disturbing that there are workarounds for our Constitution, our Bill of Rights, that we all just agree to, just accept that this is happening.
01:04:12.000 There's no riots in the streets for this.
01:04:14.000 No one's up in arms that they're trying to extradite Julian Assange.
01:04:20.000 It's not in the news.
01:04:22.000 For whatever reason, the mainstream news has barely covered his current court proceedings in the UK. Well, I think a lot of this comes down to the fact that they see Julian Assange, and by they I mean a lot of the mainstream media,
01:04:40.000 the broadcast outlets, as a partisan figure.
01:04:44.000 And it's really sad, because the most dangerous thing about the charges against Julian Assange is, if they extradite Julian Assange, if Julian Assange is convicted... He's charged under the Espionage Act, the same act that I'm charged under, the same thing that all these whistleblowers are charged under,
01:05:01.000 but he is not a source.
01:05:04.000 As abusive as these Espionage Act charges have been over the last 50 years, the government had sort of a quiet agreement.
01:05:14.000 They never charge the press outlets.
01:05:16.000 They never charge the New York Times.
01:05:18.000 They never charge the Washington Post.
01:05:19.000 They don't charge the journalists.
01:05:21.000 They charge their sources.
01:05:23.000 They charge the Chelsea Mannings, right?
01:05:25.000 They charge the Edward Snowdens.
01:05:27.000 They charge the Thomas Drakes, the Daniel Ellsbergs.
01:05:30.000 But the press, they're left alone.
01:05:32.000 They are breaking that agreement with the Julian Assange case.
01:05:36.000 Assange is not the source.
01:05:38.000 He is merely a publisher.
01:05:40.000 He runs a press organization.
01:05:42.000 People are like, oh, Julian Assange is not a journalist.
01:05:44.000 He's not whatever.
01:05:45.000 There is no way you can make that argument in court in a way that will be defensible, particularly given what we've talked about with the government and how careful they are to avoid prior court precedents and to work around it and create obscure legal theories
01:05:59.000 that are legal fictions.
01:06:01.000 Everyone knows they're a lie.
01:06:02.000 Everyone knows these theories are false.
01:06:04.000 But under the law, you know, they bend just enough that they can pass the argument through and get the conviction they want.
01:06:10.000 You cannot convict Julian Assange, the chief editor and publisher of WikiLeaks, under the Espionage Act, without exposing the New York Times, the Washington Post, CBS, ABC, NBC, you know, CNN, Fox,
01:06:26.000 whoever, to the same kind of charges under this president and every coming president.
01:06:32.000 And I think people don't think about that.
01:06:35.000 That is disturbing.
01:06:37.000 You know, another thing that's disturbing, well, there's many things that are disturbing about this case, but another thing that's been disturbing was he was a guy who the left supported up until 2016, and then it became inconvenient.
01:06:51.000 When he was dragging Bush, it was great.
01:06:53.000 Then when he's dragging Clinton, it's not so great.
01:06:56.000 Right, right.
01:06:57.000 When the footage was revealed from the, I believe it was a helicopter, that showed what they called collateral murder.
01:07:07.000 Remember that video that was put out?
01:07:09.000 Collateral murder in Iraq.
01:07:10.000 Yeah.
01:07:10.000 It was an Apache helicopter in Iraq firing on two Reuters journalists who were embedded with local militants or something.
01:07:19.000 Yes, exactly.
01:07:22.000 That was the left's—he was the darling of the left.
01:07:26.000 I mean, they were all free Julian Assange.
01:07:29.000 And it's just... it's so interesting how that narrative can shift so completely, to all of a sudden he's a puppet of Russia. That's what it became in 2016, and that propaganda stuck, and people who were pro Julian Assange before, now all of a sudden I've seen these people say fuck WikiLeaks, you know, and fuck Julian Assange, like that guy's a puppet of Russia. And I'm like, how much have you looked into this?
01:07:55.000 It's amazing how that kind of propaganda, when you just get the surface veneer of whatever the narrative they're trying to push, how well it spreads.
01:08:08.000 That all these people who were these educated left-wing people now all of a sudden were anti-WikiLeaks.
01:08:16.000 And I'm like, do you not remember how this whole thing got started?
01:08:20.000 It was the Iraq War, which we all opposed.
01:08:23.000 Do you not remember this whole bullshit lie about the weapons of mass destruction that got us into this crazy war?
01:08:31.000 And then Julian Assange and WikiLeaks exposed so much of this.
01:08:36.000 And yet, here we are: in 2016, it all gets turned on its head, and now he's a puppet of Russia and WikiLeaks is bad because, inconveniently, the information that he released damaged Hillary Clinton's campaign.
01:08:51.000 I think a lot of it comes down to people forgetting what principles are and why they're important, right?
01:09:01.000 You can hate Julian Assange.
01:09:03.000 You can think Julian Assange is a puppet of Russia.
01:09:05.000 You can think he's the worst person on earth, right?
01:09:08.000 He's a reincarnation of Hitler or Stalin or whatever, and still realize that convicting him harms you.
01:09:17.000 It harms your society.
01:09:19.000 It harms your children's future.
01:09:21.000 People forget about this in today's world where everything's become partisan.
01:09:25.000 But the ACLU cut their teeth.
01:09:27.000 They made their reputation on defending a Nazi march through a Jewish neighborhood.
01:09:36.000 And this is because it's about the right to assembly, the right to freedom of speech.
01:09:42.000 You do not have a right to be free from offense, right?
01:09:46.000 There is no constitutional right to a safe space.
01:09:51.000 But that doesn't mean you do nothing.
01:09:53.000 That doesn't mean you have no opinion.
01:09:54.000 That doesn't mean you have no political power.
01:09:57.000 What it does mean is that you have to recognize that everyone has the right to their own opinion, even terrible opinions.
01:10:05.000 What we have to protect is the speech, is the platform, is the assembly, is the association, is the process that allows us to understand and recognize and identify when people did break the law, when they did harm others, to go to a fair trial where the jury can consider why they did what they did,
01:10:22.000 what they did, and not just whether it was legal or illegal, but whether it was moral or immoral, whether it was right or whether it was wrong, and whether they are the lowest person, you know, the most ordinary citizen, or the most powerful. Whereas today,
01:10:42.000 you know, we call them public officials and private citizens, but with all of the surveillance, all of the data collection, people in power, commercially or governmentally, they know everything about us.
01:10:55.000 And we know nothing about them.
01:10:58.000 We break the smallest law, we go to jail, we get a fine, we get screwed, we can't get a job, we can't get a loan.
01:11:05.000 But if they, you know, flagrantly abuse their office, their authority, they get a pass.
01:11:11.000 They go on the speaker's circuit, you know.
01:11:13.000 It's all sunshine and rainbows for them.
01:11:17.000 And the way we change these things is remembering our principles and being willing to stand and defend them.
01:11:24.000 It's also instinctual for people to be partisan.
01:11:27.000 And it's tribal.
01:11:29.000 It's a tribal thing.
01:11:30.000 And in this day and age, people are rabidly partisan.
01:11:34.000 And the rejection of nuance is so disturbing to me.
01:11:38.000 And it's so disturbing that a lot of this happens from the left now.
01:11:41.000 Whereas the left used to be all about freedom of speech.
01:11:44.000 The ACLU is...
01:11:45.000 I mean, it's just...
01:11:46.000 You automatically think of liberal people when you think of the ACLU. But...
01:11:52.000 The ACLU, just for the record, is a nonpartisan organization.
01:11:56.000 Yes, but supported overwhelmingly by left-wing people.
01:12:03.000 I mean, obviously they are nonpartisan, but people are so partisan today that this rejection of nuance...
01:12:12.000 It's so easy for people to look at things as left versus right and ignore all of the sins of their team and concentrate on defeating the other side.
01:12:26.000 And it seems to be a giant part of the problem today, so much so that a lot of people are in favor of deplatforming people that simply disagree with them.
01:12:40.000 And I want to talk to you about that because it seems to be a gigantic issue.
01:12:44.000 Not seems to be.
01:12:45.000 It is a gigantic issue with social media, whether it's with Twitter or YouTube or many things.
01:12:51.000 In fact, Unity 2020 is something that...
01:12:56.000 My friend Bret Weinstein is putting together this idea that we should look across both parties for people that are reasonable and rational, and look at what we agree with, rather than simply sitting on partisan policy, on party lines, and only voting, you know,
01:13:21.000 blue across the board or red across the board.
01:13:23.000 And let's look at reasonable people from both sides, whether it's Dan Crenshaw and Tulsi Gabbard or whoever it is. They represent different parties, but they're both reasonable people.
01:13:33.000 Let's get them together and have these communications.
01:13:34.000 They were banned from Twitter.
01:13:36.000 They were banned from Twitter simply for saying, reject both Trump and Biden.
01:13:44.000 Look for a third choice.
01:13:46.000 So there's nothing offensive about what they did.
01:13:49.000 In fact, they're encouraging choice.
01:13:52.000 They're encouraging this idea that we don't have to be a two-party system.
01:13:57.000 That, in fact, even though we have had libertarian and green parties...
01:14:01.000 We kind of look at it like bullshit.
01:14:03.000 It's like a protest vote.
01:14:05.000 If you vote Green Party, you know you're not going to elect that person for president.
01:14:09.000 It's kind of like we tolerate it.
01:14:11.000 But when someone like Ross Perot came around, it threw a monkey wrench into the gears and became very dangerous for both sides because the Republicans lost a lot of votes and that's how Bill Clinton got into office and George H.W. Bush did not get a second term.
01:14:26.000 Directly because of the influence of Ross Perot.
01:14:28.000 So they changed the requirements for getting into the debates, and everything became very different and much more complicated after that.
01:14:36.000 The fact that Twitter would be willing to ban Unity 2020 specifically because they're calling for people to walk away from this idea that you have to either vote for Trump or Biden and trying to get mainstream acceptance of a potential third-party candidate is extremely disturbing.
01:14:56.000 But de-platforming in general, I think, is extremely disturbing because it's a slippery slope.
01:15:01.000 If you decide that someone has views that are opposite of yours and they bother you,
01:15:07.000 those views bother you, and you do whatever you can to get them off of a platform,
01:15:12.000 it's very dangerous, because someone from the right who gains power or someone from an opposing party that gains power, if they get into a position of power in social media, if they own a gigantic social media company like Twitter or YouTube and they decide in turn to go after people that agree with your ideology,
01:15:33.000 Well, then we have a freedom of speech issue.
01:15:35.000 And you're literally supporting the suppression of freedom of speech if you're supporting de-platforming people on social media.
01:15:42.000 And I've always thought that the answer to someone saying something you disagree with or someone saying something you vehemently oppose is a better argument.
01:15:54.000 That's what it's supposed to be.
01:15:55.000 It's supposed to be that you expose the problems with what they're doing.
01:15:59.000 And I'm seeing so many people, particularly on the left, that are happy when people get deplatformed.
01:16:06.000 And people that just are...
01:16:09.000 just are contrary to their perspective, contrary to their ideology.
01:16:14.000 And I think it's very dangerous, and it's too easy.
01:16:18.000 It's too easy to accept.
01:16:20.000 And this goes back to what you're saying.
01:16:22.000 This partisan viewpoint that we have today, fiercely, rabidly partisan, in a way that I've never seen in my life.
01:16:30.000 Yeah, I think the question of deplatforming, this is one of the central issues of our time that's really overlooked and it's underappreciated.
01:16:40.000 So many people on both sides are in favor of this when it's somebody they don't like, right?
01:16:48.000 Yes.
01:16:51.000 The central issue is this.
01:16:53.000 Do we want companies deciding what can and cannot be said?
01:16:59.000 Do we want governments deciding what can and cannot be said?
01:17:03.000 If the answer is yes, it is a very different kind of society than we have had traditionally.
01:17:10.000 I do think we need to understand where this impulse came from, how it came to be, and why it seemed reasonable.
01:17:19.000 And a lot of people forget this.
01:17:21.000 And it came from ISIS. If you remember, the Islamic State was all over YouTube.
01:17:27.000 They were all over Twitter.
01:17:28.000 They were all over Facebook.
01:17:29.000 And they were literally burning people alive in cages.
01:17:32.000 They were beheading people, you know, pushing people off buildings.
01:17:34.000 Just horrible stuff.
01:17:37.000 And that raises a tough question for a lot of these companies.
01:17:41.000 Now, it's very easy to make the argument that, alright, this is a direct call for violence.
01:17:47.000 This is literally supporting terrorism.
01:17:52.000 And as a private company, we have no obligation to let people use our platforms.
01:18:00.000 Therefore, we're closing their accounts, right?
01:18:02.000 We're shutting this off.
01:18:03.000 We're erasing it.
01:18:03.000 We can do whatever we want.
01:18:04.000 It's our website.
01:18:05.000 Don't like it?
01:18:05.000 Leave.
01:18:07.000 Constitutionally, there's no freedom of speech issue implicated there because the Constitution restrains the federal government and the state governments in certain circumstances.
01:18:20.000 Not private companies.
01:18:24.000 But once that precedent had been established that they would do this for ISIS, they started going, well, what about these other people?
01:18:33.000 What about these things that could be construed as calls to violence?
01:18:36.000 Okay, what if they're not violence at all?
01:18:40.000 What if it's harassment?
01:18:41.000 What if it's abuse?
01:18:42.000 What if it's racism?
01:18:43.000 What if it's, you know, criminality?
01:18:44.000 What if it's drug culture?
01:18:46.000 What if it's pornography?
01:18:47.000 What if it's whatever?
01:18:48.000 And there will always be more what-ifs.
01:18:51.000 And the categories of prohibited speech will constantly expand.
01:18:56.000 So we need to ask ourselves, well, who is best placed to make those decisions about what can and cannot be said?
01:19:03.000 Traditionally, access to broadcast was limited.
01:19:09.000 You had radio, you had TV. If you didn't have that, you had the soapbox on the corner, right?
01:19:14.000 Or the local university, the coffee shop.
01:19:17.000 And somebody owned those places, or somebody ran those places.
01:19:23.000 You know, the college president would say this person would be invited to speak, this person wouldn't be invited to speak.
01:19:30.000 And I actually think it's right and proper for people to be able to protest speakers to say this person shouldn't speak at our college.
01:19:38.000 But I think the college itself, the institution, has to be willing to make value judgments about why they invite certain people to speak.
01:19:47.000 And if that person's a very unpopular speaker, if that person is representing a viewpoint that is not well supported by the college, if it's not necessarily what students want to hear, but the administration believes, like the faculty believes,
01:20:03.000 that it's something students should hear.
01:20:07.000 Isn't that why we have universities?
01:20:10.000 We don't go to class to learn, you know, necessarily, like, you don't go to a literature course to read the things that you want to read.
01:20:18.000 You just go home and read those yourself.
01:20:20.000 You go to study a curriculum to get something else.
01:20:22.000 You want to benefit from the experience from the perspectives of others.
01:20:26.000 The question that people have is, how does this expand into the wider audience, right?
01:20:30.000 What happens when you move beyond universities?
01:20:32.000 What happens when you move to news broadcasts?
01:20:34.000 What happens when you move to the internet?
01:20:36.000 What happens when everyone everywhere can broadcast?
01:20:39.000 And this is where I think things get really tricky.
01:20:42.000 Not, can people say what they want?
01:20:45.000 As long as they're not advocating violence or whatever, I don't think this should be a difficult issue.
01:20:51.000 This gets complicated when you have things like YouTube's next video suggestion algorithm.
01:20:58.000 Because the idea of universal speech, universal ability to broadcast, is exactly as you said.
01:21:06.000 Well, what is the counter for this?
01:21:08.000 You've got frickin' Nazis on the internet.
01:21:10.000 And I'm not talking like, whatever, the guy's got a Trump sticker on his truck.
01:21:14.000 I'm talking goose-stepping, you know, swastika-bearing actual frickin' Nazi.
01:21:19.000 You have those people out there on the internet calling for violence, calling for all these terrible things.
01:21:25.000 And normally the way you deal with this, even in the case of something like ISIS, you drag them onto the platform.
01:21:32.000 You discredit their ideas before the world, because if you don't, if you drive them underground, if you make them, you know, this faction that's, you know, hanging out at a radical mosque, or, you know, they're hanging out at the hardware store if they're freaking Nazis or whatever,
01:21:49.000 there are places where they create their own community that is sheltered from other perspectives, sheltered from other ideas, and that is where extremism thrives, where it cannot be challenged, where it cannot be exposed for what it really is.
01:22:09.000 But when you've got YouTube going, oh, you like Nazi A? How about Nazi B? How about Nazi C, right?
01:22:15.000 These people never get exposed to counter speech.
01:22:18.000 And this is where things get tricky.
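As a rough illustration of the loop being described here, a minimal sketch of a similarity-driven "next video" recommender, assuming a toy catalog with made-up tags (this is not YouTube's actual system): it keeps surfacing whatever is closest to what the user already watched, so the viewer is never nudged toward counter-speech.

    # Toy "next video" recommender: always suggest the unwatched item most
    # similar to the user's history, so extreme viewing begets more of the same.
    # All video names and tags below are hypothetical, purely for illustration.
    CATALOG = {
        "video_a": {"conspiracy", "outrage", "politics"},
        "video_b": {"conspiracy", "outrage"},
        "video_c": {"cooking", "politics"},
        "video_d": {"cooking", "travel"},
    }

    def similarity(tags_a, tags_b):
        # Jaccard similarity between two tag sets.
        union = tags_a | tags_b
        return len(tags_a & tags_b) / len(union) if union else 0.0

    def next_video(history):
        # Score every unwatched video against the tags the user has already seen.
        seen_tags = set().union(*(CATALOG[v] for v in history))
        candidates = {v: similarity(seen_tags, tags)
                      for v, tags in CATALOG.items() if v not in history}
        return max(candidates, key=candidates.get)

    history = ["video_a"]
    while len(history) < len(CATALOG):
        history.append(next_video(history))
    print(history)  # walks through the most similar content first

Starting from "video_a", the loop recommends the near-duplicate "video_b" before anything with a different perspective ever appears, which is the self-reinforcing effect being discussed.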
01:22:21.000 Well, it also gets tricky when you decide that someone is saying something that's offensive and you remove them from the platform and then you open the door for other things being offensive, things that maybe aren't offensive to you.
01:22:37.000 The slope gets slippery.
01:22:39.000 And then you have wrong speak.
01:22:41.000 You have newly dictated language that you have to use.
01:22:46.000 You have new restrictions on ideologies, things you're not allowed to espouse.
01:22:51.000 I mean, Twitter will ban you for deadnaming someone.
01:22:55.000 They will ban you for life.
01:22:57.000 Meaning if you transition to be a woman and you call yourself Edwina and I call you Edward, I will be banned for life.
01:23:05.000 With no recourse, which is madness.
01:23:08.000 It's mad because I can call you fuckface and no one has a problem with it.
01:23:12.000 You know what I'm saying?
01:23:13.000 I could call you a terrible, I could call you that and there's no problem.
01:23:17.000 But if I choose a name that used to accurately represent you...
01:23:24.000 as a different gender, because this is some new, incredibly important distinction that we've decided on.
01:23:31.000 It takes precedence over everything else. It's more significant than insults, more significant than demeaning someone. I can call you a moron, I could demean your intellect, all those things are fine.
01:23:43.000 But if I choose to call you by a name that used to accurately represent you when you were a different gender, or when you identified with a different gender, because of today's political climate, that is grounds for banning you for life.
01:23:58.000 It shows you how incredibly slippery censorship can get, because I would have never imagined that.
01:24:04.000 If you said this to me 10 years ago, that if someone becomes a transgender person and you call them by their original name,
01:24:15.000 you could be banned from social media for life,
01:24:17.000 I'm like, get the fuck out of here.
01:24:18.000 They'll never get to that.
01:24:19.000 No one's going to be that unreasonable.
01:24:21.000 That's crazy.
01:24:21.000 Because you could call some people so many disparaging and insulting names, but you can't say a name that isn't even insulting.
01:24:30.000 Deadnaming, that's what it's called.
01:24:32.000 So it just shows you: if you agree with the deadnaming rule today, that opens up the door for all kinds of crazy shit five years from now, ten years from now, if we get more and more rabidly politically polarized and our idea of PC culture gets more and more extreme.
01:24:51.000 You're on a greased hill.
01:24:54.000 And if you decide to give up a little ground, the slide is imminent.
01:25:11.000 I think this is, like, you can argue on that axis, but I think incrementalism and the failures of imagination, going, you know, 10 years ago we couldn't imagine this would have been a bannable offense.
01:25:11.000 It's the wrong way to go about it.
01:25:14.000 Because if you go back to the founding of the country saying, you know, women should have the right to vote, black people should have the right to vote, you know, that was unimaginable.
01:25:21.000 That would get you equivalently deplatformed, not welcomed to the speaking community or whatever.
01:25:27.000 Sure, but those are positive and inclusive things.
01:25:30.000 I'm not saying...
01:25:32.000 I'm associating these directly.
01:25:33.000 I'm talking about the principle here.
01:25:36.000 I understand what you're saying.
01:25:46.000 And anybody technically today can decide who can and cannot speak on their platforms.
01:25:52.000 The question is, what should we do?
01:25:55.000 What kind of culture should we promote?
01:25:57.000 How should we have these conversations?
01:26:00.000 How should we make them available?
01:26:01.000 And I think civility is not too much to ask people generally.
01:26:05.000 As you say, you know, calling people fuckface or moron or whatever is completely normal on the internet, and that's not really going to get you banned from anywhere.
01:26:14.000 And now you have all of these companies contorting themselves to fit into these blocks to not isolate or anger all of these different demographics.
01:26:28.000 But if we truly want to have a global broadcast, a public commons, the question I think that's more important here is not so much what should and should not be banned, because that's accepting the premise of banning.
01:26:43.000 It's how do we create an inclusive platform where everyone can talk and even strictly and harshly disagree with each other without it coming down to name calling, without trying to dox people, without trying to basically dog whistle them or screw them or hurt them or harm them,
01:27:01.000 however.
01:27:02.000 Now look, I am not above calling people bad names on the internet.
01:27:06.000 I've said terrible things.
01:27:07.000 I grew up on the internet, right?
01:27:09.000 I was an asshole.
01:27:10.000 And we all were.
01:27:12.000 And the thing is, the worst things that we say at any moment today, they are permanent.
01:27:18.000 The internet never forgets, right?
01:27:19.000 So when you say these things, and, you know, there's a young audience listening right now to like everything, and they think it's cool, they think it's funny, or they don't think it's cool, or they don't think it's funny, but they think they shouldn't be deplatformed for it.
01:27:35.000 They're edgy, you know, they push the lines or whatever.
01:27:38.000 They get that out there, and they start emulating this behavior.
01:27:41.000 They start saying mean things.
01:27:42.000 They start saying cruel things.
01:27:43.000 I did it myself, right?
01:27:45.000 Not in this context, but in whatever the equivalent would be, you know, 20 years ago. And there are going to be consequences for that.
01:27:56.000 They're going to be judged by that.
01:27:58.000 Whether they should or should not, whether it is right or wrong, because as you said, there's so much tribalism today.
01:28:03.000 And I think we have to create positive examples.
01:28:06.000 I think you're right.
01:28:07.000 The deplatforming is a huge issue.
01:28:09.000 It is a tremendous issue, right?
01:28:11.000 But we should think about what it is that we're actually fighting against.
01:28:16.000 And I don't think, like, trans issues or whatever, when it comes down to basically civility, is the hill to die on.
01:28:23.000 Because I think there's better arguments.
01:28:26.000 Well, I certainly think we should encourage civility.
01:28:30.000 There's no doubt about it.
01:28:32.000 What I'm getting at is that the idea that you can be banned for life for that, it's preposterous.
01:28:39.000 I think civility is one of the most important things our culture could ever promote.
01:28:43.000 And I think it's very difficult to promote civility online because of the anonymous aspect of internet interaction.
01:28:51.000 Right.
01:28:51.000 There's no accountability.
01:28:52.000 You're not getting social cues from people.
01:28:54.000 It's just a completely different world when you're interacting with people, especially for kids.
01:29:00.000 I mean, if you'd given me the internet when I was 15 years old, I would have said the most horrific things to people, for sure.
01:29:06.000 And I'm sure many 15-year-old kids are doing exactly that right now.
01:29:10.000 I think...
01:29:12.000 The more we can encourage civility, the better we all are in all aspects of our life, whether it's person to person, face to face, or online.
01:29:20.000 I try very hard to only say things online that I would say to someone's face.
01:29:27.000 Online now, I do not interact with people.
01:29:33.000 I don't do it.
01:29:34.000 I don't believe in it.
01:29:36.000 I treat it the same way.
01:29:38.000 If it's avoidable, I avoid it.
01:29:40.000 And I think that's incredibly important.
01:29:42.000 But this does bring up an important point, which, I mean, really gets to the core of the issue: failures of civility.
01:29:49.000 The fact that people say bad things.
01:29:51.000 The fact that people don't have accountability.
01:29:53.000 There's a whole spectrum of people out there, from angels to devils.
01:29:58.000 There's ordinary people, and even the best of people have bad days and say terrible things.
01:30:03.000 For sure.
01:30:05.000 We do need people to have some responsibility for having a thicker skin.
01:30:09.000 Look, guys.
01:30:11.000 I've had people literally advocating my murder.
01:30:16.000 Torture and murder.
01:30:18.000 Horrible things.
01:30:19.000 Yeah, I've seen it.
01:30:20.000 For a year.
01:30:22.000 Yes.
01:30:22.000 And the people that I've blocked on my Twitter account are the ones who are posting about Bitcoin scams that are like, you know, send me five Bitcoin, I'll send you five Bitcoin back.
01:30:32.000 That's hilarious.
01:30:33.000 And I'm not saying this is the example to emulate.
01:30:37.000 What it is, though, is we have to recognize that some people aren't worth engaging.
01:30:42.000 Some people aren't worth listening to.
01:30:45.000 It's a lesson.
01:30:46.000 Right.
01:30:47.000 But that doesn't mean necessarily that you take their voice entirely.
01:30:51.000 Yes, I most certainly agree with that, particularly in terms of de-platforming.
01:30:58.000 My question to you about this is, and I've raised this question to many people and I really haven't got a satisfactory answer, do you think that things that get so huge, like Twitter or Facebook or even YouTube, do they become a basic right?
01:31:18.000 Is it like the utilities?
01:31:20.000 Is it like electricity and water?
01:31:23.000 The ability to communicate online seems to me a core aspect of what it means to be a human being with a voice in 2020. And I don't think it's as simple as saying that removing someone from Twitter is just a company exercising their right to have whatever they want on their platform.
01:31:44.000 I think when it gets as big as Twitter is, I think we've passed into a new realm, and I think we need to acknowledge that, whether it's Twitter or YouTube or Facebook or what have you.
01:31:53.000 And I think it should be very difficult to remove someone from those platforms.
01:31:58.000 And I think it should probably involve some sort of a trial.
01:32:02.000 I mean, this is a really, really tough issue.
01:32:07.000 It's much larger than just deplatforming, because what we're really talking about is the Internet as a public utility.
01:32:14.000 The internet is water and power.
01:32:17.000 And its ability to shape culture.
01:32:20.000 Right, right.
01:32:21.000 When you talk about something like Twitter, when the president is basically directing policy from Twitter, it's clear something has changed.
01:32:32.000 And threatening countries!
01:32:35.000 Our laws were not designed with that in mind.
01:32:41.000 And unfortunately, we have a legislature that's just fundamentally broken.
01:32:45.000 This gets back to the electoral system, which you talked about earlier.
01:32:48.000 Most countries in the world have a wide swath of parties.
01:32:53.000 They're not this two-party binary system where it's just two groups.
01:32:58.000 Largely neocorporatist groups that are just handing power back and forth.
01:33:01.000 The president changes, but the actual lawmakers, the actual structure behind the president, the advisors, are largely from the same cohorts.
01:33:11.000 We don't have that governmental structure that allows us to adapt in a way that truly represents, I think, the broadest spectrum of public opinion in a way that allows us to respond to changes in technology in a meaningful way,
01:33:32.000 which is what's left us stranded today where these companies are sort of deciding things for themselves.
01:33:37.000 It's because there is a vacuum of legislation.
01:33:42.000 Now there's a question, do we want legislation?
01:33:45.000 People on different spectrums from authoritarian to libertarian here will go, we want lots of legislation, we want no legislation.
01:33:53.000 But there is a push and there has been a push in Congress for years, actually since the 90s, with the Communications Decency Act and the first crypto war where the government was treating the ability to encrypt your communications to make them secret or private as you communicate with people online.
01:34:13.000 They were treating that as a weapon and saying you couldn't export this code without getting a license from the government and all kinds of craziness.
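To make the capability being fought over concrete, here is a minimal sketch of what encrypting a message looks like in code, using the third-party Python "cryptography" package's Fernet recipe (assumed to be installed). This is only an illustration of symmetric encryption, not the full end-to-end protocols real messaging apps use.

    # Scramble a message so that only a holder of the key can read it.
    # Fernet is a symmetric recipe: both parties must share the same key.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()     # the shared secret
    box = Fernet(key)

    ciphertext = box.encrypt(b"meet at the usual place")
    print(ciphertext)               # unreadable without the key
    print(box.decrypt(ciphertext))  # b'meet at the usual place'

Code like this is what the export rules of the 1990s treated as a munition.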
01:34:21.000 But the Communications Decency Act, the idea that there would be obscenity regulations, some years ago you may remember a scandal involving Backpage, which was like a variant of Craigslist that had a lot of prostitution ads on it.
01:34:38.000 Government has been trying more and more to say these kind of things can be done on the internet, these kind of things can be said on the internet, these kind of things can't be said on the internet.
01:34:48.000 And they have been doing this largely under the guise, I would argue, of the Commerce Clause, right?
01:34:55.000 The federal government, where do they get the constitutional authority to regulate what we say and do businesses wherever?
01:35:00.000 Well, they go, the internet is global, it's international, therefore it's interstate commerce, and so we're going to regulate this as if you're, you know, shipping bushels of corn from Iowa to Florida.
01:35:12.000 But it's a little bit different than that, and I think what we need to recognize is that the internet is a utility, and people, individuals, and corporate entities should be criminally liable for
01:36:01.000 the things that they do online.
01:36:02.000 Work this kind of stuff out, or at least hundreds of years.
01:36:07.000 But when you get the government and you get officials in Congress, you get officials at, you know, whatever the local department of this country or that country, you know, Russia's got a Telecommunications Censorship Bureau, China's got one, France, Germany, the United States, all of these guys have different regulatory authorities,
01:36:24.000 whether it's the FCC in the United States or Roskomnadzor in Russia.
01:36:29.000 And you cannot substitute their judgment.
01:36:32.000 For the judgment of a jury, for the judgment of the people and the public broadly.
01:36:37.000 And I think it's dangerous that we are trying to have the government pick winners and losers when whether you win or lose determines whether or not you can engage with the world, whether you can have a public presence on the internet because the internet is real life today.
01:36:58.000 Yeah, it is.
01:36:59.000 And could it be that the option would be to extend the First Amendment rights to the internet in general?
01:37:08.000 And to, if you want to run a social media platform, you know, other than what we're talking about, putting people in danger, doxing people, threatening people's lives, doing things that can cause direct harm to people, but the ability to express yourself in controversial ways.
01:37:25.000 Shouldn't we extend First Amendment protections to social media platforms.
01:37:32.000 I think this is a much more complicated question than it appears because you get into the whole thing of obligation of service.
01:37:41.000 There was a cause célèbre on the right, actually, that would seem like a similar issue.
01:37:48.000 Remember, there was the cake shop somewhere where they didn't want to serve like a same-sex marriage thing.
01:37:53.000 And again, this gets back to civility.
01:37:56.000 But some people, they have a very strong fundamental belief here that these people shouldn't be able to do this, that, or the other.
01:38:03.000 And if you impose that on them, that requirement on them, they've got to serve whatever their business is to these people that they don't like or that they don't agree with.
01:38:14.000 There's a compulsion of service there.
01:38:16.000 You start doing this with the internet, and then there's a completely different country, you know, let's say there's a website in Belgium that's now bound by American laws, that's bound by this, and Twitter can't ban this person even though they're against them.
01:38:32.000 It seems like...
01:38:34.000 But isn't that a different argument, though?
01:38:35.000 Because all these companies we're talking about, Twitter, Facebook, and YouTube, are all based in America.
01:38:46.000 Right, and would that be their loophole?
01:38:52.000 Yeah, would that be their loophole to get out of that, just sell it to China?
01:38:55.000 Right, but I mean, more fundamentally, we have to recognize that either, as a society, we can compel people to standards of civility, or we can't.
01:39:08.000 And we need to decide how we handle that, because that's what all of these tie around, right?
01:39:14.000 I think we have forgotten in many ways, we're just not teaching people the golden rule well enough, because we are all angry.
01:39:24.000 We are all in competition.
01:39:25.000 And the funny thing is the guy on the right who's poor and living in a trailer is not much different than, you know, the hippie on the left who's scrounging out of dumpsters, you know.
01:40:00.000 And we are all getting lost in our own ideological differences and losing sight of the things that actually tie us together and that if we worked together, maybe we could change in a more meaningful way.
01:40:13.000 And the more people you meet, the more people you talk to, the more you realize how malleable people really are and about how so many of these ideological perspectives that they so rabidly subscribe to, they've adopted because it allows them to be accepted by their community,
01:40:29.000 by whatever neighborhood they're in, whatever group of people they hang out with, and they choose to adopt these ideas about how the world is, and so many of those people just don't experience people that are different from them.
01:40:43.000 I mean, that is the case with racism, that's the case with homophobia, that's the case with many of the issues that people have with other folks, is that they just don't know people from those other groups, and they haven't experienced, you know, they haven't walked a mile in their shoes, as it were.
01:40:57.000 I think civility should be encouraged as much as possible.
01:41:02.000 Also, though, I'm a comedian and I talk a lot of shit, and that's in the sense of humor, which you can miss. And it's been done against me many times, where they've taken things I've said in jest and put them in quotes completely out of context, and it looks horrible, because that's not the way it was intended.
01:41:24.000 And it was intended in humor.
01:41:26.000 Now, if you do have laws that not just encourage civility, but mandate civility, you're going to have a real problem with humor.
01:41:36.000 Because you're basically going to cut the ankles out of comedy.
01:41:40.000 Not that I'm saying that all humor has to be mean and vicious.
01:41:43.000 It doesn't.
01:41:43.000 But some of the best is.
01:41:45.000 Well, it's also about saying things that can't be said, you know?
01:41:49.000 Yes.
01:41:49.000 Yes.
01:41:50.000 Saying things that can't be said.
01:41:51.000 I think there's a giant problem with online censorship today.
01:41:57.000 And I think it's one of the biggest problems of our era.
01:42:00.000 And I do think it is because there is a massive slippery slope.
01:42:04.000 And I do agree with you about the cake people.
01:42:07.000 You know, that was a big issue, the cause célèbre of the right, these people.
01:42:12.000 They should have the right, a lot of people felt, to not make a cake for someone who is doing something they think is immoral, right?
01:42:21.000 Being involved in a gay relationship.
01:42:23.000 But there's also the problem of sensationalizing these things because the people that did find those people that didn't want to make those cakes, they went to a bunch of people that agreed to make the cake first.
01:42:36.000 They went and tried to find someone who didn't want to make that cake, and then they turned it into a big story.
01:42:43.000 Now, even though I just think...
01:42:46.000 I mean, I think you should make a cake for gay people because there's nothing wrong with being gay.
01:42:50.000 I think the people that made that decision to not make that...
01:42:53.000 I feel bad for them.
01:42:55.000 I feel bad that they're bigoted in that way and that it's such a foolish thing to care who someone is in love with, whether it's the same sex or an opposite sex.
01:43:05.000 But also...
01:43:06.000 I think it's weird that someone wants to go around and try to find someone who won't make a cake for them.
01:43:11.000 Who wants to go from cake place to cake place to cake place until, like, aha!
01:43:15.000 I found a bigot.
01:43:17.000 And then make a big deal out of it.
01:43:19.000 Like, you know, you're searching for victimhood.
01:43:21.000 I mean, there's an argument that that's...
01:43:25.000 I mean, that's one way to look at it.
01:43:27.000 And another way to look at it is that's activism.
01:43:29.000 They're searching for injustice.
01:43:31.000 Agreed.
01:43:31.000 Agreed.
01:43:32.000 I agree.
01:43:33.000 This is the thing.
01:43:35.000 Like, what is right and wrong, this is what people forget, is changing constantly when we're talking about public opinion, because public opinion is changing constantly.
01:43:46.000 And this is why doing right by people, it's so sad that we've lost sight of this basic impulse to do unto others as you would have them do unto you.
01:43:56.000 Yes.
01:44:00.000 Because when you talk about the internet, when you talk about deplatforming, when you talk about humor, as you said, you know, people are going back and they're looking at your jokes.
01:44:07.000 They're putting them in quotes.
01:44:08.000 This is a different context.
01:44:09.000 You're being attacked by it.
01:44:11.000 Something you said looks bad.
01:44:14.000 There's things that you've said, things that I've said, things that the person listening right now has said, that they believed, that they meant, that they said ten years ago, that they said one year ago, that they said three weeks ago.
01:44:30.000 That they no longer believe.
01:44:31.000 That they've abandoned, that they've been persuaded otherwise, that they've changed their mind on.
01:44:35.000 And this was one of the central themes in the book, Permanent Record.
01:44:42.000 We are no longer allowed to forget
01:44:46.000 our worst mistakes, right?
01:44:48.000 They're there.
01:44:49.000 They haunt us.
01:44:51.000 They're used against us.
01:44:52.000 They're weaponized.
01:44:54.000 And this society has become aware of this.
01:44:57.000 Activists on all sides have become aware of this.
01:44:59.000 Immediately they use this to try to attack people on the other side of any issue that they don't like, to go after their credibility, to go after their character.
01:45:09.000 And what we are losing in that conflict, and this is a rational strategy on the part of both sides in the moment because they realize there is a real political advantage to be gained.
01:45:21.000 You can get people canceled very easily nowadays.
01:45:25.000 But the thing is, when we pin everyone to their worst moment, when we do away with the concept of forgiveness, we do away with the potential for growth,
01:45:43.000 for change, for persuasion.
01:45:45.000 And this gets back to those rat holes of extremism on YouTube, on Twitter, on everywhere else, where they start self-reinforcing and eventually reaching the bottom of the hole at the worst of the worst with everybody else who's been canceled too.
01:46:02.000 Part of that is because they can't climb out, or they think they can't climb out.
01:46:09.000 And there's a question, how do we resolve that?
01:46:13.000 One of the nice things about the pre-internet society was, as bad as you were, as ignorant, as racist, as exploitative, as whatever you don't like, right, as that person,
01:46:28.000 as that character was, they could find something new.
01:46:33.000 They could read a book.
01:46:34.000 They could meet someone.
01:46:35.000 They could change their mind.
01:46:36.000 And even if nobody in their town would ever forgive them, rightly, in some cases, because they had done something truly terrible, something truly unforgivable, they could leave.
01:46:45.000 They could move to a different town.
01:46:47.000 They could move to a different state, and that history would not follow them.
01:46:50.000 They could reinvent themselves, and they could become someone truly, honestly better, instead of being married to their prior ignorance.
01:47:02.000 That is a very important thing because we all are in a constant state of growth.
01:47:07.000 If you're not, you're really making some fundamental errors with your life.
01:47:12.000 We're all in this constant state of accepting and acquiring new information, gaining new perspectives, learning from our mistakes.
01:47:22.000 And unless you're Dr. Manhattan, unless you're some person who's not making any mistakes, and you just have this all-knowing vision of the world.
01:47:30.000 You're a finished product.
01:47:32.000 Like, please, if you are, share that with everybody else.
01:47:35.000 But most of us are not.
01:47:36.000 Most of us are in this weird state of being a human being on Earth where everyone is trying to figure it out in this incredibly imperfect world, incredibly imperfect society.
01:47:47.000 Everything from the structure, the economic structure to the societal structure, everything.
01:47:54.000 Down to the very last things.
01:47:57.000 Everything's imperfect.
01:47:59.000 And the idea should be that we're all communicating to try to grow together and that we're learning together.
01:48:05.000 And it's one of the more interesting things about interacting with people online is that you can get different perspectives.
01:48:10.000 And if you can let go of your ego and if you can let go of your preconceived notions, you can learn things about the way other people see and feel and think about the world that could change and enhance your own ideas.
01:48:24.000 And I think that it's important that we not just accept the fact that people are growing and getting better and improving, but that we encourage it.
01:48:35.000 We encourage it and we reward it.
01:48:38.000 I think that's one of the interesting things that...
01:48:42.000 We're struggling with.
01:48:43.000 I mean, you see this in the context of police violence.
01:48:46.000 You see this in the context of mass surveillance.
01:48:49.000 You see this in the context of cancel culture.
01:48:52.000 You see this everywhere.
01:48:54.000 One of the interesting things about this surveillance machine that has been built around us, the sort of architecture of oppression, the turnkey tyranny, as I describe it, is that so much is known about every person, regardless of how innocent or how guilty they are.
01:49:12.000 It's all in there.
01:49:13.000 You know, the files are waiting to be accessed.
01:49:15.000 The data just needs to be collated.
01:49:18.000 It's just waiting to be requested and analyzed and used.
01:49:25.000 What this means, like there's this old idea of the panopticon, right?
01:49:31.000 Which is you create a prison that is circular.
01:49:35.000 And in the middle of it, there's this great tower, right, that rises way up.
01:49:40.000 And at the very top of the tower, there's a mirrored glass room that the warden sits in.
01:49:47.000 And no prisoner knows where the warden is looking because the warden can see out but they can't see in.
01:49:53.000 And so everyone believes that they are watched and so the idea is that no one will misbehave because they're all afraid that they'll be retaliated against for breaking the rules or whatever.
01:50:05.000 But what we have seen as this surveillance machine has been built is we all realize intuitively, innately, inherently, in ourselves, even if we don't recognize it, even if we don't speak to it, we witness it in the news every night.
01:50:23.000 There are records of wrongdoing,
01:50:25.000 criminality at the highest and lowest levels of our government.
01:50:30.000 Corporations and prominent figures in society breaking the rules.
01:50:37.000 Ordinary people jaywalking, littering, polluting, small-scale petty stuff.
01:50:44.000 All of that somewhere there is a record of.
01:50:47.000 But in almost all cases it's not punished.
01:50:51.000 What has happened is we have broken the chain of accountability between knowledge of wrongdoing and consequence for wrongdoing.
01:51:05.000 And this happened without a vote.
01:51:07.000 It happened without our participation.
01:51:09.000 We weren't asked whether this was okay.
01:51:11.000 But I think in some way, that is beginning to change the moral character of people.
01:51:18.000 And what we need to do, starting with the top rather than the bottom, because China is trying to do the reverse, they're going, Alright, well, there's a simple solution to this.
01:51:26.000 Let's just start screwing everybody who breaks the rules instantly and immediately.
01:51:31.000 You know, you got a social credit score, you protested, so you're going off to a camp, you know, whatever.
01:51:38.000 But imagine what it would mean if we saw a world where any official, the minute they are guilty of the slightest infraction, is immediately exposed in the press, they go on trial,
01:51:54.000 they go through all this stuff, they're ruined, they're disgraced.
01:51:59.000 But it turns out every other member of Congress is going to court in the same week because everybody is in violation of something somewhere.
01:52:06.000 We all have some measure of guilt, large or small, even if we're completely innocent because, you know, our legal code is so complex there's no way you can make it through a week without breaking some kind of rule about you can't wear a green hat on Tuesday.
01:52:21.000 But if this happened, if there was accountability for infractions of the rules any time an infraction of the rules was witnessed, the laws would change instantly to enshrine the right to privacy, because the people in power wouldn't want to lose their position of power.
01:52:38.000 They would not want to lose this position.
01:52:41.000 And suddenly, when they have skin in the game, they would realize, oh, everybody deserves this.
01:52:46.000 And I think there's just something interesting to that.
01:52:49.000 I haven't thought this out all the way fully, so this could be, you know, give me some slack here.
01:52:55.000 But I think this is really what has changed.
01:52:59.000 We have built a panopticon, but what sits at the top of it is a computer.
01:53:05.000 That computer witnesses everything we do.
01:53:09.000 In reality, it's a distribution of computers.
01:53:12.000 They're owned by many people and answered to many people.
01:53:15.000 But it does not yet judge us by itself.
01:53:19.000 We judge us, for it.
01:53:20.000 And what is happening is the audience, society, the people have realized that they can see through this computer.
01:53:29.000 They can see through the panopticon from a certain angle, a certain degree, in a certain direction at any given time.
01:53:35.000 The cops that have been, you know, monitoring all of us for years, right, they've got surveillance and drones and stuff that they couldn't have imagined in generations prior.
01:53:45.000 But now every person on the street has a smartphone with a camera too, and the cops are being witnessed for the first time, and now people are trying to impose upon them the same judgment that has classically been imposed upon us. And this, I think, is one of the dynamics, one of the changes, that is leading to this increasing conflict in society: when you realize that the people that, throughout your
01:54:16.000 generation, as youth, we're told in Hollywood and stories, in our common shared national myths,
01:54:25.000 You know, the government's the good guys.
01:54:26.000 The FBI's going to get the gangsters and the terrorists and things like that.
01:54:30.000 They're the best of the best.
01:54:31.000 The fact that they are people too, they're not only fallible, but in some cases, you know, small-minded and vicious.
01:54:38.000 They are political.
01:54:39.000 They are partisan.
01:54:40.000 The same way everyone else is, people start questioning power and how it is used, the basic legitimacy, the way it impacts our lives, what the limits of it should be.
01:54:51.000 But people have not yet realized that one of the responses to this should be a limitation on the amount of power the government has, or rather not just government, but institution.
01:55:07.000 Institution is a concept, right?
01:55:09.000 Government or corporation.
01:55:11.000 The power of institutions to interfere in our lives should be limited.
01:55:15.000 Instead, what they're trying to do, both sides, you know, blue team, red team, whatever, they're squabbling, they're fighting over who has their hands on the trigger.
01:55:25.000 Who gets to aim the weapon rather than should the weapon exist?
01:55:31.000 Are you talking about police violence when you're saying these things?
01:55:35.000 That's a part of it, yes.
01:55:36.000 It's every direction.
01:55:38.000 But police violence is very much the public part of it that we see right now.
01:55:44.000 That seems to be one of the most complex abuses of power because the kind of power that you give someone when you allow them to be a police officer is literally the power to end life.
01:55:59.000 It's not just the power to kick you off of Twitter.
01:56:01.000 It's the power you give to this person, who's just a regular person, no different than you or I, with all sorts of problems in their own life, and stresses and strains, and a disproportionate amount of strain and stress for the actual job that they do.
01:56:16.000 I mean, it's a spectacularly stressful position to be in life.
01:56:20.000 But yet you give them the ability to literally, with a finger pull, end someone's life.
01:56:27.000 I think that's being exposed, because of these cell phone cameras and because of social media, in a way that no one ever would have dreamed imaginable before.
01:56:41.000 And exposing how...
01:56:45.000 almost impossible it is to have that position as a human being.
01:56:51.000 A position of power like that over folks, held by just a regular person with a normal psychology, not some incredibly brilliant Zen master, who's in charge of overseeing drug crimes or pulling people over or assault or whatever it is.
01:57:13.000 I don't know the solution to that.
01:57:16.000 There's all sorts of things at play.
01:57:18.000 Ignorance, foolishness, racism, anger.
01:57:21.000 But at the end of the day, it's about a human being's ability to have a massive amount of power by law over other human beings, which is always going to be a problem.
01:57:33.000 It's just going to be a problem.
01:57:35.000 Yeah, I mean, I think...
01:57:38.000 We've known about this, you know, there's aphorisms that go back a zillion years, you know, absolute power corrupts absolutely.
01:57:45.000 You know, you give a monkey a stick, the first thing he's going to do is he's going to look for something to hit with it.
01:57:51.000 But this is also one of the things you asked earlier about, like, how I can be hopeful, how I can be idealistic when I see the scale of the problems, the challenges arrayed against us.
01:58:05.000 When I understand not just that mass surveillance exists, but I understand the mechanics of it.
01:58:11.000 I understand how systemic it is.
01:58:12.000 I understand the resources behind it that want to prevent the change of it and instead want to entrench it and expand it to make it more powerful and have more influence over the direction of our lives.
01:58:23.000 Down to this basic stuff about, you know, we are told that cops are the best among us.
01:58:29.000 People sign up to be cops, I genuinely believe, because they want to serve and protect, more so than they just want to be the big tough cop guy.
01:58:38.000 And some people say, you know, that's naive, some people say that's petty, but I think it's different.
01:58:43.000 I think the reason that I feel this way, the reason that I am okay with seeing how much we fail, seeing how much incivility and violence and just ignorance that we have in the world today, is that I have a lower expectation of the individual at the moment,
01:59:02.000 but a higher appreciation for their potential.
01:59:06.000 And the reality is we are all inherently flawed.
01:59:10.000 I'm a terrible person.
01:59:13.000 And I think in a lot of ways, you're not as good as you want yourself to be.
01:59:20.000 But I know that I have become a better person with time.
01:59:25.000 You have become a better person with time.
01:59:27.000 I think we all have, and we all can, or those of us who have not could, if they chose to, or if they had guidance, or if they had love or friendship or someone who cared and directed them and helped them become better.
01:59:41.000 Yeah.
01:59:42.000 And that's, I mean, that is the story of human history because we were all the monkey and then we found a stick.
01:59:49.000 We could use it to beat somebody or we could use it to build a bridge.
01:59:52.000 But if you look around at the world today, there's a hell of a lot of bridges.
01:59:56.000 There are.
01:59:57.000 And I think in terms of police brutality, there's very few reasonable...
02:00:11.000 How do you deal with these violent encounters that police officers often have with people?
02:00:19.000 How do you deal with the PTSD that I believe a vast majority of these police officers suffer from?
02:00:27.000 Completely stressed out.
02:00:29.000 Every time they pull someone over, it could be the end of their life.
02:00:31.000 They might not go home to their families.
02:00:33.000 They really don't know.
02:00:34.000 And I think there's also a bunch of them that are emotionally and psychologically unqualified for the job to begin with.
02:00:41.000 And then here we are with these calls in America, at least, to defund the police, which I think is even more ridiculous.
02:00:48.000 I think, if anything, they need more funding and more training and a more stringent process of elimination, of removing people that aren't qualified for that job, because I believe very few people actually are qualified.
02:01:01.000 I think there's great police officers out there, I really do. Most of the interactions that people have with police officers aren't horrible, but there's enough of those horrible ones that are captured on video that we have this bias towards these negative results that we see over and over again,
02:01:18.000 and we don't take into account the full data set.
02:01:21.000 We're not taking into account all the interactions that people have with police officers because those aren't documented.
02:01:26.000 What we're getting in front of our face day in, day out are the terrible interactions.
02:01:31.000 And I don't see nor do I hear a real workable way of improving this.
02:01:39.000 You get people that are either calling to defund the police or you're calling for people to support police officers.
02:01:47.000 That's all you hear.
02:01:48.000 And from a few people like Jocko Willink, you see really great suggestions that they should be treated the same way they treat Navy SEALs, where you're spending literally 20% of your time training.
02:02:00.000 And you're going through psychological training, you're going through actual real-world situations where you're going over what's the correct protocol and how to handle certain situations.
02:02:12.000 And I think it's a giant problem in our society today, and I think that's an understatement.
02:02:19.000 That every time someone gets shot that shouldn't have gotten shot, particularly if it's a person of color, it becomes a gigantic flashpoint for our society.
02:02:29.000 Well, let me challenge you on that a little bit.
02:02:31.000 Because, I mean, we can have civil disagreements in a way.
02:02:36.000 That's why we have discussion.
02:02:38.000 I think there are things that we can do that don't require the idea of shutting down every police department.
02:02:44.000 I think that's sort of far beyond what people talk about when they talk about defunding the police.
02:02:52.000 I think the most common sense...
02:02:55.000 The other measure that is being discussed, and it's not being discussed as broadly in the mainstream news as it should be, is ending police unions, right?
02:03:06.000 Now, why do we talk about that?
02:03:09.000 This gets back to the same thing that we talked about earlier, with the court cases and the government.
02:03:13.000 They get caught doing something wrong.
02:03:16.000 But there's no consequence, right?
02:03:18.000 And people learn from that.
02:03:20.000 Each generation learns from the cases prior, right?
02:03:22.000 It's in training, people learn the rules, things like that.
02:03:26.000 The reason a lot of police violence occurs, even if it's not all of it, again, there's no magic wand we wave that saves the world, is the lack of accountability.
02:03:38.000 We know there are cops, and even cops say this, right?
02:03:43.000 There are cops out there who aren't good people.
02:03:46.000 There are cops out there who have abused their authority.
02:03:49.000 There are, you know, really tragic cases where a cop has done something straight up criminal, and they have faced no meaningful consequences as a result.
02:04:00.000 Maybe they lost their job, right?
02:04:02.000 But if it was anybody else, they would have gone to prison.
02:04:05.000 And so there's a question of how do we remediate this in a way that preserves the legitimate interests of, you know, police officers as a class, but also preserves the rights of the people who are being policed by,
02:04:26.000 by your own admission, in at least some cases, people who are abusing their authority.
02:04:31.000 And again, I'm not saying all cops are bad or anything like that.
02:04:35.000 But if we recognize there are abuses, and this is a class that is invested, as you said, with the power over life and death, we have to be willing as a society, and the people occupying this position have to be willing, to assume a higher standard of accountability than ordinary people,
02:04:52.000 right?
02:04:53.000 And if we can agree on that, everything else follows from it, I think.
02:04:59.000 We don't want to have a gun-toting, immunized class walking among us.
02:05:05.000 And I think even, you know, police officers, among themselves at least, would recognize this.
02:05:13.000 But it is rational for them
02:05:15.000 to resist this, from the interests of their class.
02:05:17.000 They're in a privileged position.
02:05:19.000 Why would they give that up?
02:05:21.000 The same way our spies are in a privileged position.
02:05:23.000 Why would they give that up?
02:05:25.000 But as a society, we exist to ask more.
02:05:29.000 And you raise valid points, right?
02:05:30.000 There's cops out there.
02:05:31.000 Go up to a dark car in the middle of the night.
02:05:33.000 They're afraid they're not going to make it home to their family.
02:05:36.000 That's reasonable and legitimate, right?
02:05:39.000 But being a police officer is a dangerous position that people have signed up to.
02:05:45.000 We give our police officers every advantage that could be given to them today.
02:05:49.000 I can tell you, from having lived all around the world, there's no cops in the world that are kitted out like cops in America are.
02:05:58.000 These guys look like something from a sci-fi movie.
02:06:04.000 And if there is a cop...
02:06:05.000 Well, some of them do.
02:06:05.000 Some of them do.
02:06:07.000 They're going to riots.
02:06:08.000 And look, there's good cops out there.
02:06:10.000 I had a lot of interactions with cops as a young man that were nothing but positive.
02:06:17.000 It's not that police as an idea are the enemy.
02:06:21.000 It is the system that is rotten, and I think even honest cops recognize that the system is fundamentally broken.
02:06:28.000 The question is not, or the question from their side should not be, can we stop reform?
02:06:35.000 Because if they are, if that's their position, I think they're doing the public a disservice, and I think to themselves they know they're doing a disservice.
02:06:42.000 It's how do we handle this appropriately?
02:06:45.000 How do we handle this in the right way?
02:06:47.000 And if there's cops out there who legitimately have served, you know, they've been out there for years, they've been exposing themselves to danger to keep people safe at night, they've done a good job, and they don't want to walk the beat anymore, that should certainly be an option that's available to them.
02:07:02.000 And from my perspective, as not a cop, I think when you look at the state of law enforcement in the United States,
02:07:10.000 that very much is an option.
02:07:13.000 You know, do they want to work on dispatch?
02:07:15.000 Do they want to work on investigation?
02:07:16.000 Do they want to be cross-trained in forensics?
02:07:19.000 There are ways that we can end issues or at least mitigate some of the issues that we see with policing today without saying cops are the worst people in the world and without saying, you know, these guys should be above the law.
02:07:35.000 Well, I don't think anybody's saying they should be above the law, but...
02:07:38.000 But factually today, they really are.
02:07:40.000 Excuse me?
02:07:41.000 I said factually today.
02:07:42.000 Like, as a matter of fact, whether we like to or not, you gotta admit, in most cases, cops are bulletproof.
02:07:49.000 Well, I don't know.
02:07:50.000 I don't think I agree with that.
02:07:51.000 I mean, if you look at what happened to the George Floyd case, obviously they were caught on camera.
02:07:57.000 So we're fortunate we got to, not fortunate, but we got to see what happened and they reacted accordingly.
02:08:06.000 What you were saying before you started this, though, was that we need to stop police unions.
02:08:13.000 But police unions aren't only around to protect people from the consequences of terrible policing, right?
02:08:21.000 They're also there to provide health insurance and reasonable amounts of counseling.
02:08:28.000 This is a great argument for everybody to have health insurance.
02:08:34.000 Oh, for sure, yeah.
02:08:35.000 No, I agree.
02:08:36.000 I think health insurance is a fundamental right of being a human being in a civilized society.
02:08:42.000 I think it should be treated the same way we treat the fire department.
02:08:45.000 I think it should be something that we all agree we should pay into because it benefits all of us.
02:08:50.000 I mean, I just think if we are a community, and that's what really a country is supposed to be, we're supposed to be a large community, wouldn't we want to protect the most vulnerable members of that community?
02:09:05.000 If you have a small, close-knit family, and something happens to someone in the family, everybody chips in to help that person.
02:09:11.000 You know, that's what I think health insurance should be.
02:09:15.000 I think it should be an important part of a culture, of a community, of a group of human beings that decide they're all on the same team.
02:09:24.000 We have to take care of the most vulnerable people.
02:09:27.000 I mean, I think that across the board.
02:09:28.000 And I mean, that's really the argument that I'm making for how we want our police to be.
02:09:33.000 When I say, you know, cops are bulletproof, I don't mean in the literal sense.
02:09:36.000 There are a lot of cops who have given their lives to stop very bad people, and we should honor them.
02:09:41.000 We should provide for their families.
02:09:43.000 But the way that we do that is providing a better society that's more fair to police by being more fair to everyone, right?
02:09:52.000 Agreed.
02:09:53.000 Agreed.
02:09:54.000 It's really this simple: as long as we have an occupation that is invested with exceptional authority, they must be invested with an extraordinary standard of accountability.
02:10:08.000 It's that simple from my perspective.
02:10:10.000 It doesn't have to be a terrible thing, it doesn't have to be an aggressive attack, but it's this basic principle.
02:10:16.000 Today, in the world of business, in the world of government, in the world of policing, anywhere you look, right, it's a common issue.
02:10:24.000 What we have is a disproportionate allocation of influence, a disproportionate allocation of economic resources, a disproportionate allocation of authority without an equal allocation of responsibility.
02:10:46.000 Well, I think we both agree on that.
02:10:48.000 And I think we also both agree that it's not a shock that a disproportionate amount of criminal activity exists in a place where there's a disproportionate amount of poverty.
02:10:58.000 Sure.
02:10:58.000 And a disproportionate...
02:10:59.000 Yeah, I mean, and very few economic opportunities.
02:11:04.000 I mean, this is something people don't want to talk about with terrorism.
02:11:07.000 But you're exactly right.
02:11:08.000 I mean, when you talk about...
02:11:33.000 If we want to solve the symptoms, which are criminality, right?
02:11:40.000 Because people forget that terrorism is a crime.
02:11:43.000 It's a very grave crime, but it's still a crime.
02:11:46.000 We have to go to the core causes.
02:11:50.000 Yeah, and you know, we were talking about this previously on a different show in regards to the way people reacted to the pandemic in terms of economic support to businesses and trillions of dollars that were allocated to all these various businesses to try to stimulate them and keep them active and alive and keep people working.
02:12:07.000 And my thought was, like, imagine if that same attention to detail had been paid to impoverished neighborhoods.
02:12:14.000 If they had decided, like, listen, there's obviously a disproportionate amount of crime and poverty in these neighborhoods.
02:12:21.000 We've got to figure out a way to lessen that burden and strengthen those neighborhoods.
02:12:27.000 And in a real simplistic way of putting it, the way I've always said, if you want to make America great, you want less losers.
02:12:35.000 What's the best way to have less losers?
02:12:37.000 Have more people with an opportunity to succeed.
02:12:41.000 More people who grow up in an area where it's actually safe, where there's economic possibilities, where you're given more access to education, more access to healthcare, more access to counseling.
02:12:55.000 More access to community centers, any kind of support that you could possibly give people that gives them more of an opportunity to get by in life.
02:13:04.000 And this is something that we've conveniently ignored, this need to strengthen these core and significant areas of our culture, but yet we do it when something comes along like a pandemic that might close down businesses,
02:13:21.000 and already thriving businesses at that.
02:13:23.000 I think we should have put, I think a long time ago, we should have put similar resources and attention into these impoverished neighborhoods that have been impoverished for decades.
02:13:34.000 And a lot of it because of slavery and a lot of it because of redlining laws and Jim Crow laws and all the things that happened after slavery.
02:13:41.000 There's so many areas of our country that just don't get better, and we don't do anything about it.
02:13:46.000 And we just assume that these crime-ridden areas will remain that way forever.
02:13:51.000 And they send cops there, and then you see the videos of the interactions that cops have with people, and it just creates more and more anger and more and more frustration, without any real...
02:14:03.000 without some sort of socially responsible action by the government, some sort of program where it's explained to people, explained to the general public, how this is going to benefit everyone,
02:14:19.000 that we will have less crime, that we will have more opportunity, that we will have more people
02:14:24.000 that are educated and empowered entering into the workforce.
02:14:28.000 We'll have more competition.
02:14:29.000 It'll strengthen the country as a whole.
02:14:31.000 It'll be better literally for every one of us.
02:14:34.000 And this is something that they didn't pursue.
02:14:37.000 And they haven't pursued it in this country, ever.
02:14:41.000 This gets back to that question that I was asking earlier.
02:14:44.000 It's one that I ask myself.
02:14:46.000 When you look at all the problems of today, and for somebody who's focused on privacy and surveillance issues, it's easy to be reminded every day of how deep in the hole we are.
02:14:57.000 Where did these things really start?
02:15:00.000 You talked earlier about a greased hill.
02:15:03.000 Where did the incline increase?
02:15:05.000 Where did things start to really go wrong?
02:15:07.000 Because they've always been going wrong in some area.
02:15:11.000 Again, that's our burden.
02:15:14.000 We've got to make things better because they're never going to be good enough where we start.
02:15:19.000 But in recent decades, things have gotten bad, and I think it goes back to the Patriot Act.
02:15:24.000 And you ask about economy, you talk about poverty, you talk about opportunity, how do we fix this?
02:15:32.000 Everybody is rehabilitating George Bush now as this, you know, nice little old guy painting his feet in the bathtub.
02:15:38.000 But it was the Patriot Act, George Bush, the Iraq War, and the policy of endless war that is sadly continuing today.
02:15:47.000 It's a bipartisan thing.
02:15:49.000 It continued under Obama.
02:15:50.000 It continued under Trump.
02:15:53.000 We have spent trillions of dollars, trillions of dollars, killing faraway people who, literally going by the statistics, are more likely to be non-combatants than combatants, I think.
02:16:09.000 Collateral damage is a real thing.
02:16:11.000 And even if every one of those people was someone we didn't like, was the level of effort, was the level of resources that we invested in it, was the cost to our national soul, worth whatever it is we can be said to have gained?
02:16:30.000 And I think the answer is that we have been generationally diminished.
02:16:39.000 Not by that president alone, but by the policies that that administration popularized that have been embraced and continued by the administration since.
02:16:52.000 And until we learn that lesson, we, you, me, everyone else will have an obligation to try and change things, to return us to a better path.
02:17:04.000 I agree with you and I also think there's a real good argument that there's certain aspects of technology that have been implemented in terms of like warfare and how we deal with terrorism that you could say short term perhaps might have eliminated some targets,
02:17:22.000 but I would argue long term probably encouraged more people towards radical fundamentalism, particularly drones.
02:17:29.000 When I tell people about the efficacy of drone attacks and how many people are killed by drone attacks, what I've gone into with people that really haven't focused on it is the amount of people that are innocent that are killed by drones, and that being the vast majority of cases,
02:17:46.000 that when you're dealing with 100 drone deaths, it might be like 84 of them are innocent.
02:17:53.000 Like, imagine that being anything else.
02:17:55.000 Imagine if the police did that, if they prevented crime but 84% of the people they killed were completely innocent.
02:18:03.000 You would say, that's insane.
02:18:04.000 Like, we have to stop that immediately.
02:18:06.000 But because it's done with a robot that flies through the sky remotely from Nevada by some guy with an Xbox controller, and he's launching missiles into some sort of a car convoy, we've accepted this.
02:18:20.000 And I think...
02:18:22.000 There's a real argument that it's being accepted because of the remote aspect of it, because we don't see it, we don't feel it.
02:18:31.000 It seems distant, and even seems distant from the person that's holding the remote control.
02:18:37.000 They're saying that the people that are doing that, that are responsible for operating these drones, are experiencing a new level of PTSD and a very severe form of it.
02:18:48.000 Many of them, they're haunted by the idea of what they've done and the fact that even though their own hands have done it, they weren't there to see it.
02:18:57.000 It's some sort of a bizarre disconnect, and that they're murdering, literally, who knows what percentage, but a very high percentage of innocent people.
02:19:09.000 This gets us back to what I was talking about in calling for the pardon of these different whistleblowers.
02:19:16.000 This is the core issue of Daniel Hale.
02:19:19.000 Daniel Hale is an American who I believe is still on trial.
02:19:24.000 He has yet to be convicted, but the government is going to bury this man if they get the chance, for revealing abuses in the drone program and the failures of the drone program.
02:19:35.000 And this also gets, you know, you talk about this question of efficacy and percentages.
02:19:40.000 We talk about mass surveillance.
02:19:43.000 Just last week, this was covered nowhere in media that I've seen so far in a prominent way.
02:19:50.000 I think the Washington Post wrote an article, but it, you know, it was buried.
02:19:54.000 It wasn't like a front page A1 sort of top of the fold splash on the FISA court.
02:20:00.000 A lot of people have heard about the FISA court because of the relationship to the Trump thing.
02:20:04.000 I hope one of your guys who works in production can pull out a headline or front page or the Twitter thread from Elizabeth Goitein, I think it's @LizaGoitein, who went through this.
02:20:18.000 It was published in the declassified version of the FISA reauthorization for last year, where every year the FBI submits this request for basically a blanket surveillance warrant that they can use on all these different people, for all these different sort of categories of behavior that they want to monitor, and the court goes through it.
02:20:42.000 And the FISA Court reauthorizes this annually.
02:20:45.000 And in this annual review, they look at, is the system functioning?
02:20:51.000 Is it effective?
02:20:52.000 Were the rules broken?
02:20:54.000 And one of these experts, I think she worked at the Brennan Center for Justice.
02:21:01.000 Correct me and edit me out if I'm wrong here.
02:21:06.000 There were thousands of cases in the last year.
02:21:09.000 Thousands of cases where the FBI looked people up under the aegis of a FISA warrant, right?
02:21:16.000 And this is like a mass warrant that's used for multiple people instead of one for everyone else.
02:21:20.000 And we know how bad these FISA warrants can be.
02:21:23.000 And over the course of thousands of cases, The court found that they had been unjustified in looking up these people's background in all but seven cases.
02:21:35.000 I think it was seven cases out of thousands.
02:21:38.000 And this is where it's at.
02:21:41.000 We have created a procedural state, a bureaucratic state, an automated system for policing, and I mean that broadly, I don't just mean, you know, guys in shiny shoes on the ground with a pistol on their waist.
02:21:57.000 I'm talking about, whether it's platform behavior and speech on Twitter,
02:22:01.000 whether it's surveillance behavior, both domestically against American citizens and abroad around the world.
02:22:07.000 We are trying to create a system that observes everyone and judges everyone in a way that we already know is not fair.
02:22:22.000 It is not used properly, it is not used appropriately, it is not used effectively, and I believe does more harm than good.
02:22:31.000 And why are we trying to create a system that sees everything we do and judges us, which is effectively trying to invent God, when we know that it is a dark and vengeful one? We need to think about the kind of technologies that we are putting in place that rule us,
02:22:53.000 but that we do not effectively control.
02:22:56.000 Well, I think there has to be repercussions.
02:23:00.000 When you're talking about that, where all but seven of them...
02:23:04.000 In this case, there weren't.
02:23:04.000 The court said, oh yes, the FBI broke the rules routinely.
02:23:06.000 They did it all the time.
02:23:08.000 So, we're going to go ahead and reauthorize this for next year.
02:23:11.000 Here's your rubber stamp.
02:23:12.000 Come back in, you know, 12 months.
02:23:15.000 Exactly.
02:23:15.000 But what I'm saying is I think we as a society need to demand repercussions for these overreaches.
02:23:22.000 Because it is a violation of law.
02:23:25.000 And if it's a violation of law with no consequences, then we're not talking about law anymore.
02:23:31.000 We're talking about nonsense.
02:23:33.000 We're talking about things you could just get away with.
02:23:35.000 It really is.
02:23:36.000 It's a king class.
02:23:37.000 It's someone who could just get away with things.
02:23:39.000 It doesn't make any sense.
02:23:41.000 Or a law that's only enforced against the powerless but not against the powerful.
02:23:45.000 Right, particularly if you or me or Jamie had done the same thing, we would for sure be in jail for a violation of privacy, for invading someone's privacy, for doing something that is against the law.
02:23:59.000 If we were tried, we would be convicted, we would wind up doing time or pay some extraordinary fine.
02:24:05.000 We would be in real trouble, is my point, but they're not in any trouble at all.
02:24:09.000 You cannot have that.
02:24:11.000 We can't have that in a society because if you have that ability to completely bypass any liability and any responsibility for a violation of law, then we've created two classes of human beings.
02:24:26.000 We've created human beings that are the governed, and then we've created human beings that are the governors, and the governors are exempt.
02:24:34.000 And that's not government anymore.
02:24:37.000 Now you're in a monarchy.
02:24:39.000 You're in some craziness.
02:24:41.000 Yes, you're rulers and the ruled.
02:24:44.000 And you can't have that.
02:24:45.000 We can't have that because of what you said earlier.
02:24:48.000 Absolute power corrupts absolutely.
02:24:49.000 That is absolute power.
02:24:51.000 There's no repercussions whatsoever for violating laws that can greatly...
02:24:57.000 impact people's lives in a negative way.
02:25:00.000 That's crazy. You can't have that, we can't have that, and we need to agree as human beings, particularly now, because of the age that we live in and the access to information that we enjoy. We're aware of this acutely.
02:25:12.000 It's obvious. It's right in front of our faces, and it's one of the many reasons why I think you should be exonerated, why I think you should be pardoned. I mean, you've exposed this and you've opened people's eyes to this.
02:25:26.000 The exponential increase in people's understanding and appreciation for that, based on your work and what The Guardian put out and how you exposed all that, it's changed the conversation.
02:25:42.000 And it needs to be changed, and the repercussions need to be changed as well.
02:25:48.000 Well, thank you.
02:25:51.000 I guess there's not much more to say than that, but I hope one day I will be able to come back.
02:25:59.000 If I see you in real life, man, I'll give you a hug.
02:26:01.000 I'll come on the show and be in the same room for once.
02:26:05.000 Yeah.
02:26:06.000 Well, hopefully COVID will be gone then.
02:26:08.000 Well, I'll test you first.
02:26:09.000 We'll test each other first.
02:26:11.000 But listen, I said it before.
02:26:13.000 I really do believe this.
02:26:15.000 I think you're a hero.
02:26:16.000 And I think that what you've done, history will be kind to you.
02:26:20.000 They will look back on what has been done to you.
02:26:24.000 And I think our government is on the wrong side of history.
02:26:27.000 I really do believe that.
02:26:29.000 And I think if people really did know the facts, particularly the way you explained it earlier about how the information was distributed and the way it was handled ethically and morally, you did the best you possibly could have done with that situation.
02:26:48.000 It's an incredibly bold move that you've done, and I feel like the time has come.
02:26:54.000 I really do, and I hope Trump listens to this.
02:26:59.000 I really do.
02:27:00.000 I hope he listens to this, and I hope he understands also what a political piece it would be.
02:27:06.000 I mean, this would be massive. If he pardoned you, I think it would be a massively positive move for the way that United States citizens view him.
02:27:19.000 Well, I hope what we see under this administration or any other, but certainly we don't have to wait much longer for, is ending the war on whistleblowers.
02:27:30.000 Because as much as I would like to come home, as much as I would like to see recognition from the system, that there are times when the only thing you can do is tell the truth, and that should not be a crime.
02:27:44.000 It's not about me.
02:27:45.000 It's about what happens to all of us.
02:27:47.000 It's what happens to the system.
02:27:49.000 It's how we restore, or rather realize, the ideal of a country that we were always told we had.
02:27:59.000 But in reality, we have never been as good as what we dreamed.
02:28:03.000 But we're getting closer.
02:28:05.000 And the way we do that is by admitting where we were wrong and doing better.
02:28:10.000 Thanks so much for having me on again.
02:28:12.000 I really appreciate this.
02:28:13.000 Thanks for being on, man.
02:28:14.000 Those words, that mentality, and your actions are what make you a hero.
02:28:18.000 I appreciate you very much, man.
02:28:20.000 Thanks so much.
02:28:21.000 Stay free, brother.
02:28:22.000 You too, my friend.
02:28:23.000 Take care.
02:28:24.000 Bye.