Real Coffee with Scott Adams - July 24, 2020


Episode 1069 Scott Adams: How to Reimagine Police Work to Free Resources for Social Services (Better Than Defunding)


Episode Stats

Length

16 minutes

Words per Minute

154.5

Word Count

2,523

Sentence Count

158

Misogynist Sentences

1


Summary


Transcript

00:00:00.000 Hey everybody, this is a special unexpected live stream. I want to talk about one topic, and one topic only today. It's the topic of defunding the police, which of course is a provocative thought. Some people want less police presence. Some people want more of it.
00:00:24.780 So the country is being torn apart, and I would like to suggest that this is not a question of left or right, even though we've made it that way. The question of defunding the police should be a technology and systems question. In other words, we should be thinking of it in terms of a puzzle to solve, and not a left problem or a right problem.
00:00:46.180 And the puzzle is this. How do you get more bang for your buck? A very basic corporate business decision. How do you take this thing we've been doing, policing, how do you get as much benefit as you can at the lowest cost?
00:01:02.180 And I would say that just thinking of it that way would make a big difference. And so I would like to say we should think of it more as reimagining the police, not defunding them. Defunding is sort of a, those are fighting words, and we don't need that.
00:01:20.600 But I would suggest that we already have in place everything we need, or very close to it, to be able to do something that really radically reimagines the police.
00:01:32.380 So what I'm going to tell you are some things that already exist. So if you say to me, Scott, Scott, Scott, we will never be able to develop these things, you're already behind the times.
00:01:43.980 Everything I talk about already exists. We just have to think about it in this way, and it's going to be able to move us forward.
00:01:52.100 And it looks like this. So you have a current police force with a current budget and a current way of doing things.
00:01:58.300 But at the same time, there's a little satellite around them of private companies who are doing functions that have, in some cases, wide application, but they're also really, really good for police.
00:02:11.760 And I'll talk about a few of these in a moment.
00:02:13.980 If you use this model, where you let private companies develop new technologies that would make policing way more cost-effective, so it gets the job done at a much lower cost, would you like to see these kinds of things experimented with inside the police force, in other words, inside a government-like entity,
00:02:38.720 or would you like to see a private entity doing the development and taking the risk?
00:02:43.980 It's a rhetorical question.
00:02:45.480 The last thing you want is a government entity or something like the police force doing technology innovation.
00:02:54.500 You don't want it there. You want it where it is, in the private industry.
00:02:57.800 Now, there are a few developments that really make the idea of changing or reimagining how police work is done very practical.
00:03:07.580 So when the left says, hey, let's defund the police and make money available for social services, I say, that is completely doable.
00:03:17.060 It is completely doable without losing a thing in policing, and because of this new technology that I'll talk about in a moment, you can actually get far better police results.
00:03:30.740 And I'm talking about multiples of better.
00:03:32.600 I'm not talking about a 10% improvement.
00:03:35.620 I'm talking about a five times improvement sort of situation.
00:03:39.060 Here are some examples.
00:03:41.980 Let's take DNA.
00:03:44.460 If you are a serial rapist, it is possible that the police already have several records of you with your DNA,
00:03:55.560 because you might be leaving it at each of these crime scenes.
00:04:00.220 But it would be very common for the police to have three separate DNA samples for three separate crimes and not know who you are,
00:04:10.320 because knowing your DNA doesn't automatically make you findable.
00:04:16.200 In approximately 15% of sex crimes, you can take the DNA and actually find the person.
00:04:23.600 The other 85% of the time, it's just data and a record, and it's a dead end.
00:04:28.760 You can't find the person.
00:04:30.820 But the newer DNA technologies, and we won't get into specific companies,
00:04:36.560 I'm going to talk generically now,
00:04:38.280 can take the DNA from the suspect and find their family members.
00:04:44.540 And if you find somebody's family members, because there are enough people who have DNA in different places, etc., that are accessible,
00:04:51.780 if you can find their family members, you can pretty much find the suspect almost every time,
00:04:58.280 because you just go to Uncle Bob and you say,
00:05:01.580 Hey, Bob, do you know anybody in your family who might live in this neighborhood?
00:05:07.480 And Uncle Bob will say, Yeah, that's my nephew, Jason.
00:05:11.520 And then you go check on Jason, and you've got your suspect.
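The familial-search idea described here can be sketched in a few lines of code. This is a toy illustration only: the locus names, profiles, and threshold are all invented, and real familial searching uses far more sophisticated kinship statistics. The core intuition is just that relatives share more DNA markers (alleles at STR loci) with a crime-scene sample than unrelated people do.

```python
# Toy sketch of familial DNA search: relatives share more alleles
# across STR loci than unrelated people, so a crime-scene profile
# that partially matches a database profile can point toward a
# suspect's family. All names and profiles below are made up.

def shared_alleles(profile_a, profile_b):
    """Count alleles shared between two profiles across common loci."""
    total = 0
    for locus in profile_a.keys() & profile_b.keys():
        remaining = list(profile_b[locus])
        for allele in profile_a[locus]:
            if allele in remaining:
                remaining.remove(allele)  # each allele matches at most once
                total += 1
    return total

def familial_candidates(scene_profile, database, threshold):
    """Rank database profiles by allele overlap; a high partial overlap
    (short of an exact match) suggests a possible relative."""
    scored = [(name, shared_alleles(scene_profile, prof))
              for name, prof in database.items()]
    return sorted((s for s in scored if s[1] >= threshold),
                  key=lambda s: -s[1])

# Hypothetical 8-locus profiles (real kits type 20+ loci).
scene = {"D3": (15, 17), "vWA": (16, 18), "FGA": (21, 24),
         "D8": (12, 13), "D21": (29, 31), "D18": (14, 16),
         "D5": (11, 12), "D13": (9, 11)}
database = {
    "uncle_bob": {"D3": (15, 16), "vWA": (16, 17), "FGA": (21, 22),
                  "D8": (12, 14), "D21": (29, 30), "D18": (14, 15),
                  "D5": (11, 13), "D13": (9, 10)},
    "unrelated": {"D3": (14, 18), "vWA": (14, 15), "FGA": (19, 20),
                  "D8": (10, 15), "D21": (27, 28), "D18": (12, 13),
                  "D5": (9, 10), "D13": (8, 12)},
}
print(familial_candidates(scene, database, threshold=6))
# → [('uncle_bob', 8)] — the relative shares one allele at every locus
```

The investigative step is then exactly the "Uncle Bob" conversation above: the partial match doesn't name the suspect, it narrows the search to one family.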
00:05:16.480 So imagine that level of improvement.
00:05:21.180 To go from about 15% of your cases solved.
00:05:26.480 I'm talking about violent rapes.
00:05:28.280 15%, one-five, is how many they can identify.
00:05:34.280 An external company can get them all.
00:05:39.380 Can get them all.
00:05:40.420 Now, nothing's 100%, right?
00:05:42.440 So when I say can get them all, you should translate that into your head to maybe 90% or something.
00:05:48.340 But to go from 15% to 90%, how expensive would that be?
00:05:55.600 And here's the kicker.
00:05:57.640 Not very.
00:05:59.040 It's not very expensive.
00:06:00.580 In fact, there are hundreds of thousands of rape kits,
00:06:04.500 in other words, samples taken from crimes, that have not been tested.
00:06:10.680 Hundreds of thousands of them.
00:06:12.540 Now, most of them will be tested, and they'll go through the system,
00:06:15.480 but they will not result in any kind of a match.
00:06:19.480 They could go to an external company and say,
00:06:23.560 For this reasonable dollar amount, can you solve all of our worst cases?
00:06:28.400 So you could take, for example,
00:06:29.780 all of the records that have multiple crimes associated with the same DNA.
00:06:35.220 And you can say to this company,
00:06:36.500 Look, we've got five DNA samples from five different rapes,
00:06:42.060 and we don't know who this is.
00:06:44.300 How many crimes do you prevent
00:06:47.880 if you take a serial rapist off the streets?
00:06:52.940 Maybe a lot.
00:06:54.540 You could end up solving, or not solving, but preventing,
00:06:58.380 10 rapes by getting one serial rapist.
00:07:03.960 And likewise, I suppose,
00:07:05.140 if it's some other kind of violent crime that leaves DNA.
00:07:08.060 So imagine how much is being left,
00:07:11.260 you know, sort of on the table of free money,
00:07:13.660 because this isn't completely being used at the moment.
00:07:17.700 And that's just one example.
00:07:19.540 Imagine having facial recognition
00:07:21.560 widely available for the police.
00:07:24.660 Imagine the police have a suspect,
00:07:26.380 get a photo somehow,
00:07:28.920 and they can identify them.
00:07:30.700 Now, you have the problem
00:07:32.040 that African-American faces are harder to identify,
00:07:35.860 but apparently that is not common
00:07:37.600 to all of the different facial recognition technologies.
00:07:41.020 There are some that are actually good
00:07:42.500 at distinguishing African-American faces,
00:07:45.380 and some are not.
00:07:46.600 So if you hear some stories about
00:07:48.040 facial recognition having problems
00:07:50.160 with black citizens,
00:07:51.840 that's true,
00:07:53.500 but it's not true of all the technologies.
00:07:57.120 At least not as true.
00:07:59.000 And certainly,
00:08:00.400 there's something about the process
00:08:02.140 that can be done better.
00:08:03.980 And I talked about this
00:08:05.280 on a separate Periscope,
00:08:06.860 that if you have a match,
00:08:08.480 you think you have a match,
00:08:09.620 and you go to the suspect's house
00:08:11.280 to find out
00:08:12.040 if your facial recognition
00:08:13.780 got the right person,
00:08:15.060 you should at least bring the picture with you.
00:08:17.620 So you can hold it up,
00:08:18.740 and when the person answers the door,
00:08:20.300 you can say,
00:08:21.320 okay, we didn't get a match this time.
00:08:23.760 It's obvious.
00:08:24.460 I'm looking at you.
00:08:25.160 I'm looking at the picture.
00:08:26.400 It's not you.
00:08:27.200 Sorry to bother you.
00:08:28.560 So there's something
00:08:30.020 that can be done differently in the system
00:08:31.800 to make the technology
00:08:33.620 even more bulletproof.
00:08:35.460 Likewise,
00:08:36.240 police are already using drones.
00:08:38.320 You can imagine
00:08:39.060 private drone companies
00:08:40.740 being able to
00:08:42.120 take the risk
00:08:43.520 away from the police
00:08:45.260 and to just try some stuff
00:08:47.200 and it works or it doesn't work
00:08:49.600 and then the police can watch it
00:08:50.860 and they can decide
00:08:51.540 to do more of it or less.
00:08:53.380 Imagine if you will,
00:08:54.680 I'm just brainstorming here.
00:08:56.260 Imagine if you will
00:08:57.060 if private drone companies
00:08:59.220 made a bunch of deals
00:09:01.100 to put drones on rooftops
00:09:02.940 just staged
00:09:04.260 in various places around the city.
00:09:07.280 I'm just making this up.
00:09:08.420 This doesn't exist.
00:09:09.820 And imagine if there was a crime
00:09:11.460 and the police force
00:09:12.820 had a deal with a drone company
00:09:14.720 and they said,
00:09:15.160 look,
00:09:15.840 we've got a crime
00:09:16.940 in this neighborhood.
00:09:18.240 Can you get us some eyes on it?
00:09:20.660 Drone takes off
00:09:21.660 and takes pictures
00:09:23.920 and they get the best look
00:09:25.940 at things that they can.
00:09:28.020 If the drone has
00:09:29.460 a good camera,
00:09:31.680 then you've got your facial recognition.
00:09:34.440 So some of these are being done
00:09:36.660 in some fashion.
00:09:37.860 But what you'll find is
00:09:39.120 you're going to say to yourself,
00:09:40.740 well, Scott,
00:09:41.140 you've just described
00:09:42.000 a perfectly good system.
00:09:43.300 So is this already working?
00:09:46.080 Isn't this already working?
00:09:47.540 Because why wouldn't it be?
00:09:48.800 The police know about
00:09:50.140 these technologies
00:09:51.140 in most cases.
00:09:52.300 They know about them.
00:09:53.680 Why aren't they using them?
00:09:55.520 Some are.
00:09:56.760 Some are not.
00:09:57.440 But why isn't it just like widespread?
00:10:00.580 Just this is the way you do it.
00:10:02.780 And the reason is
00:10:03.660 bureaucracy.
00:10:05.260 The reason is
00:10:06.220 the left and the right,
00:10:07.380 the battle over turf,
00:10:09.620 nobody wanting
00:10:11.100 to lose their job
00:10:12.020 on the old system
00:10:13.140 that's not working.
00:10:14.140 How do you get anything done
00:10:16.460 in a bureaucracy?
00:10:17.440 It's all that stuff.
00:10:19.520 But,
00:10:20.200 as these technologies
00:10:22.100 prove themselves
00:10:23.240 with the police entities
00:10:25.560 that do use them,
00:10:26.600 and they're all being tested somewhere,
00:10:28.500 as they're being tested,
00:10:30.160 it will be impossible
00:10:31.980 to ignore them.
00:10:33.460 Because if you have one city
00:10:35.480 that is solving
00:10:36.780 all of its sex crimes,
00:10:38.980 just think about that.
00:10:40.180 Suppose you had one city
00:10:41.220 that said,
00:10:42.000 we don't have any backlog,
00:10:43.380 we solved them all.
00:10:44.800 Just give us the DNA,
00:10:45.940 we'll solve it for you too.
00:10:47.580 How does Chicago,
00:10:49.640 just to use an example,
00:10:51.240 how do they not do this
00:10:52.640 if some other city,
00:10:54.400 let's say Baltimore,
00:10:55.780 tries it out
00:10:56.540 and it just works?
00:10:58.420 How do they not do something
00:10:59.620 that's so obviously good
00:11:00.760 and works and saves money?
00:11:02.620 Now,
00:11:03.120 the idea is
00:11:04.000 that each of these technologies
00:11:06.000 is so much more effective
00:11:07.720 than what the police
00:11:08.760 are doing already
00:11:09.580 that it should take
00:11:10.960 the amount of resources
00:11:11.980 they need way down.
00:11:13.880 Would the police
00:11:14.680 be in favor of this?
00:11:16.160 Let's say the individual
00:11:17.060 police officer,
00:11:18.240 not talking about bureaucracy,
00:11:20.540 not talking about left or right
00:11:22.060 or the mayor
00:11:23.440 or anybody else.
00:11:24.300 Would the individual
00:11:25.200 police officer
00:11:26.220 be happier
00:11:27.740 if these technologies
00:11:29.680 were more widely deployed?
00:11:31.560 I think yes.
00:11:32.760 They'd be happier
00:11:33.440 on a number of levels.
00:11:35.340 Number one,
00:11:36.000 if you're a police officer,
00:11:37.640 don't you like to solve crimes?
00:11:39.960 Of course you do.
00:11:41.520 What is better
00:11:42.540 if you're a police officer
00:11:44.040 than actually doing good work
00:11:45.900 and solving crimes
00:11:46.880 and making your community
00:11:48.480 a better place?
00:11:49.240 You'd probably get a raise.
00:11:50.920 So that's all good.
00:11:52.760 But I would argue
00:11:53.780 that also
00:11:54.480 there's a secondary benefit
00:11:55.880 which is really big.
00:11:57.380 It's really big.
00:11:58.380 Which is,
00:11:58.960 if you're a police officer,
00:12:00.120 all of these entities
00:12:01.360 which you would then
00:12:02.220 be working closely with
00:12:03.820 become a career path.
00:12:05.520 Because
00:12:06.820 wouldn't it be good
00:12:08.180 to have an experienced
00:12:09.980 police officer
00:12:10.940 working on your drone
00:12:12.960 startup,
00:12:14.840 working on any
00:12:15.920 of your other startups
00:12:16.840 that have policing
00:12:18.140 technology involved?
00:12:19.800 So this is the basic idea.
00:12:24.020 You'll notice
00:12:24.880 some themes
00:12:25.700 I'm pulling together.
00:12:26.720 One is that
00:12:27.840 A/B testing
00:12:28.980 is always the way to go
00:12:30.620 if you can.
00:12:32.320 If something
00:12:32.940 can be tested,
00:12:34.040 and doing it large,
00:12:36.640 you know,
00:12:36.840 rolling it out everywhere,
00:12:38.280 would be really expensive,
00:12:39.860 well then test it small.
00:12:41.420 This is the perfect situation
00:12:43.420 for testing it small.
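The "test it small" theme can be made concrete with a back-of-the-envelope comparison: run the new technology in one pilot city, keep a comparable control city on the old process, and check whether the gap in clearance rates is bigger than chance would explain. This is a minimal sketch using a standard pooled two-proportion z statistic; the case counts are invented, chosen to echo the 15% and 90% figures discussed above.

```python
# Minimal pilot-vs-control comparison: is the pilot city's clearance
# rate convincingly better than the control city's, or just noise?
# Uses the standard pooled two-proportion z statistic.
from math import sqrt

def two_proportion_z(solved_a, total_a, solved_b, total_b):
    """Z statistic for the difference between two clearance rates."""
    p_a, p_b = solved_a / total_a, solved_b / total_b
    p_pool = (solved_a + solved_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical year of data: pilot city solves 90 of 100 DNA cases,
# control city solves 15 of 100.
z = two_proportion_z(90, 100, 15, 100)
print(round(z, 1))  # prints 10.6 — far above ~2, so the gap is not noise
```

With an effect that large, even a single small pilot is decisive, which is exactly why testing small works here.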
00:12:45.920 Now the other thing
00:12:46.520 that's working here,
00:12:47.280 the other big theme,
00:12:48.740 is changing the system,
00:12:51.160 of course.
00:12:51.660 It's a change
00:12:52.640 in the system.
00:12:53.700 Instead of moving
00:12:54.500 all of the functions
00:12:55.640 of police
00:12:56.280 within the police
00:12:58.300 organization
00:12:59.220 and budget,
00:13:00.540 you can move
00:13:01.500 some of it out
00:13:02.140 to private corporations
00:13:03.180 and de-risk it.
00:13:05.480 In other words,
00:13:06.120 the private corporation
00:13:07.060 can take all of the risks
00:13:08.700 that a police entity
00:13:10.720 could not.
00:13:11.920 And I'm not talking
00:13:12.540 about life and death risks.
00:13:14.080 You know,
00:13:14.260 nobody should be taking
00:13:15.160 those.
00:13:15.960 I'm talking about
00:13:16.700 just something
00:13:17.240 that might be embarrassing
00:13:18.240 if it didn't work.
00:13:19.840 Something that
00:13:20.700 if a police force
00:13:22.120 put their budget
00:13:22.780 into it
00:13:23.440 and then it failed,
00:13:25.420 well,
00:13:25.780 that'd be bad
00:13:26.600 for the police,
00:13:27.600 bad for everybody.
00:13:28.740 So you take that risk
00:13:30.020 and you just move it
00:13:30.780 into places
00:13:32.120 where people like risk.
00:13:34.100 Venture capitalists,
00:13:35.460 entrepreneurs,
00:13:36.540 they like risk.
00:13:37.700 So move it
00:13:38.280 where they like it
00:13:39.000 and it won't hurt
00:13:39.640 the rest of the people
00:13:40.940 until it's proven.
00:13:42.500 I'm looking at the comments.
00:13:47.960 It says,
00:13:48.240 Scott does not talk
00:13:49.220 about the real problem
00:13:50.200 which is the wickedness
00:13:51.740 in the hearts of people.
00:13:53.960 Well,
00:13:54.580 that might be
00:13:55.240 a different periscope.
00:13:56.380 Somebody else
00:13:56.820 is going to have to do that.
00:13:58.780 But I do make
00:13:59.980 the following claim
00:14:01.080 that there are
00:14:02.380 a lot of things
00:14:03.000 that look like
00:14:03.620 they're problems
00:14:04.340 of the left
00:14:05.100 and problems
00:14:05.920 of the right
00:14:06.580 and not agreeing.
00:14:08.480 This is not one of them.
00:14:10.540 This is a problem
00:14:11.420 that has artificially
00:14:12.680 been put into
00:14:13.420 the world of politics
00:14:14.600 because everything is political these days,
00:14:16.820 but it has nothing
00:14:17.600 to do with politics.
00:14:19.020 There's nobody
00:14:19.720 in the world
00:14:20.420 who wouldn't like
00:14:21.680 better police,
00:14:23.520 police effectiveness,
00:14:25.460 with less violence,
00:14:27.080 and at lower cost.
00:14:29.740 There isn't anybody
00:14:30.800 who doesn't want that.
00:14:32.360 And so if you take it
00:14:33.240 out of the political realm
00:14:34.260 and put it
00:14:34.740 in the economic realm,
00:14:36.480 the entrepreneur realm,
00:14:37.900 the realm
00:14:39.020 where ideas
00:14:39.920 actually work
00:14:40.900 and can be tested,
00:14:42.340 well,
00:14:42.640 then you've got
00:14:43.080 a solution
00:14:43.480 that works for everybody.
00:14:44.800 And of course,
00:14:45.400 language gets us
00:14:46.300 in trouble
00:14:46.760 because is what
00:14:48.420 I just described
00:14:49.640 a case of defunding
00:14:51.200 the police?
00:14:52.720 Kind of.
00:14:54.000 Yeah.
00:14:54.520 Because the ultimate result
00:14:56.520 would be the police
00:14:57.380 would be far more effective
00:14:58.740 and there's also
00:15:00.100 another impact
00:15:01.040 on top of this.
00:15:02.460 Imagine you're a criminal.
00:15:04.160 Some of you
00:15:04.760 might be criminals,
00:15:05.840 so it's easy to imagine.
00:15:07.140 Imagine you're a criminal
00:15:08.160 and in today's world,
00:15:10.840 if you did a violent rape,
00:15:13.680 you could get identified
00:15:15.160 0.15,
00:15:16.640 not 0.15,
00:15:17.660 15% of the time.
00:15:20.460 That's not much
00:15:21.540 of a disincentive,
00:15:23.140 is it?
00:15:23.680 Because a lot of people
00:15:25.540 who are, you know,
00:15:26.400 inclined to crime,
00:15:27.800 if they see there's only
00:15:28.700 a 15% chance
00:15:30.800 of even being identified,
00:15:33.880 you might be willing
00:15:35.960 to do the crime.
00:15:37.080 Suppose you had
00:15:37.980 a 90% chance
00:15:39.440 of getting caught.
00:15:41.120 Would you do the crime?
00:15:42.760 Some people will
00:15:43.500 and they'll be caught.
00:15:45.540 But they won't do
00:15:46.320 a second crime
00:15:47.100 because they got caught
00:15:49.500 after the first.
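The deterrence point here has simple arithmetic behind it. If each crime is independently detected with probability p, an undeterred repeat offender commits on average 1/p crimes before the first catch. This toy calculation just plugs in the 15% and 90% figures from above; the independence assumption is a simplification for illustration.

```python
# Back-of-the-envelope deterrence math: with a per-crime catch
# probability p, the number of crimes until the first detection is
# geometrically distributed, with mean 1/p.

def expected_crimes_before_caught(p_caught):
    """Mean number of crimes committed until the first detection."""
    return 1 / p_caught

print(expected_crimes_before_caught(0.15))  # ≈ 6.7 crimes per offender
print(expected_crimes_before_caught(0.90))  # ≈ 1.1 crimes per offender
```

So raising the identification rate from 15% to 90% doesn't just solve more cases; it cuts the expected crimes per repeat offender by roughly a factor of six.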
00:15:50.160 So that's my idea.
00:15:52.740 I thought I'd put that
00:15:53.360 out there for a comment
00:15:54.240 and I'd love to see
00:15:57.020 your feedback.
00:15:57.940 So give me your comments
00:15:59.400 and we'll go from there.
00:16:01.220 And thank you.