Leo D.M.J. Aurini


Triage vs. the Trolley Problem


Episode Stats

Misogynist Sentences

3

Hate Speech Sentences

5


Summary

In this episode, I discuss the difference between artificial intelligence and pragmatic ethics, utilitarianism versus common sense, and why it's worrying me so much that technocrat-style utilitarianism seems to be so popular these days.


Transcript

00:00:00.000 Triage versus the trolley problem.
00:00:07.540 Now, what I'm going to be talking about in this video is the difference between
00:00:17.340 artificial intelligence and pragmatic ethics, utilitarianism versus common sense,
00:00:27.400 and why it's worrying me so much that this mechanistic, technocrat-style utilitarianism seems to be so popular these days.
00:00:41.920 And a perfect example of it is the trolley problem.
00:00:47.520 The trolley problem, if you haven't heard of it, the trolley problem goes like this.
00:00:53.580 Say you are standing by a railway track, and there is a trolley speeding out of control.
00:01:01.320 The brakes have failed, it's going to run right past you, and it's going to go into a tunnel where there are eight workers present.
00:01:09.820 And it's going to kill all of those workers.
00:01:12.580 Now, right next to you is a switch that would put it onto a different track and put it into a tunnel that only has one worker working inside of it.
00:01:21.560 Ethically speaking, morally speaking, should you pull the switch so that only one person dies versus eight people?
00:01:32.560 And this is basic utilitarianism.
00:01:34.920 It's better to only have one person die than to have eight people die.
00:01:38.580 So you pull the switch and you divert it down the second tunnel.
00:01:41.800 Now, we add a layer of complexity to it.
00:01:48.080 We ask the question: what if, instead of a switch, you're standing on a train platform next to a very fat man?
00:01:56.960 And you know that if you push the fat man in front of the trolley, it will slow it down enough to save the eight workers.
00:02:07.960 It's the same math, one life for eight lives.
00:02:12.660 But people tend to balk at the second option.
00:02:16.400 Why is that?
00:02:17.100 Well, some have pointed out that if you push the fat man, he's not really involved.
00:02:26.640 See, the workers, they signed up for some risk when they knew they were going to be working on the rail line, et cetera, et cetera.
00:02:32.080 Then there's the fact that pushing the fat man in front of the train to save eight lives
00:02:39.800 is the same logic as murdering one person and harvesting their organs to save eight lives.
00:02:47.340 It's the same logic, so why not murder that person?
00:02:53.640 This kind of points to the flaws in utilitarianism in general.
00:02:56.680 That utilitarianism is only concerned with the quote-unquote maximum good for as many people as possible.
00:03:06.060 And so by utilitarian ethics, if, you know, society would enjoy seeing somebody tortured to death,
00:03:11.800 and there are enough people that their happiness quotient outweighs his suffering quotient, then we should do it, right?
00:03:22.900 As Eliezer Yudkowsky once put it,
00:03:24.460 is it better to have one person tortured to death versus a googolplex of people getting a mote of dust in their eye?
00:03:33.340 And he argued that the one person should be tortured to death, which is insane.
00:03:39.500 See, this is where this logic leads.
00:03:41.040 This is where utilitarianism leads.
00:03:43.340 And I think that there's a deeper issue behind all of this.
00:03:48.300 This is somewhat related to a video I did recently about the difference between math, science, and engineering.
00:03:55.280 It's an issue of how do we understand the world.
00:04:02.620 Now, in that video, I pointed out, briefly, that mathematical knowledge is absolute:
00:04:06.620 when you prove something in math, you absolutely prove it.
00:04:09.920 Scientific knowledge is tentative,
00:04:14.180 and engineering is concerned with best practices.
00:04:19.760 The trolley problem assumes perfect knowledge.
00:04:29.920 Especially the second one, where you push the fat man in front of the train.
00:04:33.900 How would you know that the fat man is going to stop the train?
00:04:40.840 How would you know that beyond a doubt?
00:04:42.380 What if you push him in front of the train and it still kills the eight workers?
00:04:45.680 The trolley problem reminds me of the tricorder in Star Trek.
00:04:58.760 See, the tricorder.
00:05:03.960 You know, the Star Trek people, they pull out their tricorder,
00:05:06.280 they scan the environment,
00:05:08.880 and they tell you all the secret scientific knowledge.
00:05:12.740 There's a radio wave field here.
00:05:16.140 The atmosphere is composed of...
00:05:17.760 That's not how diagnostics actually works.
00:05:29.660 Diagnostics.
00:05:30.660 Trying to understand the environment.
00:05:32.780 Trying to figure out why your damn car isn't working.
00:05:36.240 Diagnostics is all about understanding relationships.
00:05:41.840 Let's take the simplest diagnostic.
00:05:44.100 One of the simplest out there.
00:05:45.280 It's a gauge: a fuel gauge, or a tire pressure gauge,
00:05:51.180 or whatever it might be.
00:05:55.900 Now, these gauges tell you: is the tank full or empty?
00:06:01.440 These are working...
00:06:03.460 Well, in the case of a fuel tank,
00:06:05.020 you're going to have a little float bobbing in it.
00:06:08.660 It goes up when it's full,
00:06:11.880 and it goes down when it's empty.
00:06:13.700 Thus, you get your fuel gauge.
00:06:17.000 You know, another one might be a pressure gauge.
00:06:19.540 Right?
00:06:19.820 Where there's a spring.
00:06:22.280 And we know that if it's exerting this much pressure,
00:06:25.140 there must be this many kilopascals inside of it.
00:06:28.780 This much pressure means even more.
00:06:30.500 You're determining relationships.
00:06:39.220 If you're trying to test an atmosphere on an alien planet
00:06:42.360 to figure out what percentage is the oxygen,
00:06:44.760 what percentage is the CO2 and the fluorine, whatever,
00:06:48.300 what you're going to do is you're going to take something
00:06:51.040 and have it react with the environment.
00:06:54.160 One of the things you have in the oil patch
00:06:57.500 is you have little labels
00:06:59.740 that will change color in the presence of sulfur dioxide.
00:07:08.560 And so you put these...
00:07:09.840 You put them down by your pants
00:07:11.100 because it's heavier than air.
00:07:12.220 All right?
00:07:12.540 You don't put them up here
00:07:13.480 because by then the sulfur...
00:07:14.700 It's already all the way up here.
00:07:16.340 You put them down on your pants,
00:07:19.560 and if you notice them change color,
00:07:22.260 you get the hell out of there.
00:07:26.480 But then you have to throw it out.
00:07:27.860 The diagnostic test has been done.
00:07:29.880 Think about back in science class
00:07:31.380 when you use those little pH strips
00:07:32.780 and you dip them in liquid
00:07:34.320 and the color it changes
00:07:35.500 tells you what the pH of that is.
00:07:37.320 That is a diagnostic test.
00:07:41.800 Now, it is theoretically possible
00:07:44.780 that when you put the little strip into acid,
00:07:48.860 well, you didn't mix it very well.
00:07:51.360 So the right side of the beaker
00:07:52.780 is more acidic than the left side of the beaker.
00:07:56.760 With your fuel gauge on your vehicle,
00:08:00.020 maybe some water got into the tank
00:08:02.120 and the little valve...
00:08:03.100 Oh, it rusted into position.
00:08:06.540 So you think you got a third of a tank left
00:08:08.640 and actually you're running on empty.
00:08:14.820 Your car won't start.
00:08:16.900 Electrics won't come on.
00:08:17.820 So you change the battery
00:08:19.080 and it starts working.
00:08:19.940 Does that mean the old battery was dead?
00:08:21.440 Well, probably.
00:08:23.640 But you haven't opened up the battery
00:08:25.660 and looked at its guts.
00:08:28.140 All right?
00:08:28.520 When you think the battery is dead
00:08:29.920 because those are the symptoms,
00:08:31.700 it doesn't necessarily mean that.
00:08:35.720 There could have been a loose wire.
00:08:39.200 So your old battery was fine.
00:08:41.480 But when you put in the new battery,
00:08:43.960 the wire got tightened,
00:08:45.440 it's no longer loose,
00:08:46.380 and your car works.
00:08:48.060 Unless you open up the battery itself
00:08:50.600 and investigate its guts,
00:08:52.180 you don't actually know that the battery is dead.
00:08:54.900 In the real world,
00:09:01.420 there are no tricorders.
00:09:04.120 Okay?
00:09:04.300 The tricorder is a magical scrying spell.
00:09:08.700 All right?
00:09:09.060 The Star Trek people,
00:09:09.840 they pull it out
00:09:10.440 and they boop, boop, boop,
00:09:11.600 have perfect knowledge
00:09:12.820 of everything around them.
00:09:16.200 That never happens.
00:09:18.140 Especially not with something
00:09:19.460 as generalized as a tricorder.
00:09:21.500 Fuel gauges are pretty accurate
00:09:25.320 because we put a lot of effort
00:09:26.700 into making them accurate.
00:09:28.520 They're not 100%,
00:09:29.820 but they're pretty bloody close.
00:09:32.560 Something as generalized as a tricorder,
00:09:35.320 utterly imaginary.
00:09:37.040 Utter, utter BS.
00:09:41.400 You know, how many times
00:09:42.220 have you seen somebody
00:09:42.920 pull out a tape measure on Star Trek?
00:09:44.560 Okay?
00:09:44.800 Tape measures,
00:09:45.740 you can trust, more or less.
00:09:47.860 Tricorders, not a chance in hell.
00:09:49.340 But it seems to me
00:09:52.460 that this Star Trek tricorder
00:09:53.800 inducted a whole generation
00:09:55.980 into this idea
00:09:57.900 of having perfect knowledge.
00:10:01.000 If you remember my past video,
00:10:02.960 it's, you know,
00:10:03.440 the scientists,
00:10:04.040 they developed the perfect dairy barn,
00:10:06.100 but it only works
00:10:06.740 for spherical cows in a vacuum.
00:10:10.160 These technocrats love to come up
00:10:11.720 with all of these systems
00:10:12.940 that would work great
00:10:14.340 if perfect knowledge were possible.
00:10:16.700 And this is the issue
00:10:19.860 with the trolley problem.
00:10:21.840 Okay, the trolley problem
00:10:22.780 is presuming
00:10:23.720 this perfect knowledge.
00:10:29.260 And the scary thing about it
00:10:31.000 is that they are trying
00:10:32.720 to program this stuff
00:10:33.900 into artificial intelligences.
00:10:40.500 So what's the alternative?
00:10:41.580 Well, triage.
00:10:48.440 When medics are dealing
00:10:50.820 with a mass casualty event,
00:10:53.420 they break people down
00:10:56.060 into three categories.
00:10:58.100 The first is people
00:10:59.340 that are injured,
00:11:00.540 but it's non-life-threatening.
00:11:02.480 They're not going to die
00:11:03.640 in the next 30 minutes.
00:11:04.820 The next group
00:11:07.120 is those that are injured
00:11:08.720 and are probably going to die
00:11:11.200 no matter what you do.
00:11:13.040 And the third is people
00:11:14.200 that are injured,
00:11:15.240 but immediate medical treatment
00:11:17.280 might save their lives.
00:11:20.420 So when the medics rush in,
00:11:22.840 they're making this judgment,
00:11:24.320 boom, boom, boom, boom.
00:11:28.560 They're using a very complex array
00:11:31.260 of priors, heuristics, instincts.
00:11:37.340 You know, Quintus Curtius
00:11:38.140 has written some interesting posts
00:11:39.980 about medical treatment
00:11:41.940 during the classical age.
00:11:46.880 And these doctors,
00:11:48.420 these medics back then,
00:11:49.280 didn't have the same
00:11:50.200 scientific knowledge
00:11:52.340 that we have now.
00:11:54.540 But there are so many cases
00:11:55.860 that he has cited
00:11:57.380 where the doctor
00:11:59.140 would go up to one person
00:12:00.200 with an arrow through the skull.
00:12:02.240 You know, after the battle,
00:12:03.160 he's still living,
00:12:03.700 he's riding his horse
00:12:04.580 with an arrow
00:12:05.160 sticking through his head.
00:12:06.480 Goes up to that guy and says,
00:12:08.260 okay, yeah,
00:12:08.800 you're going to be fine,
00:12:09.600 we're just going to have
00:12:10.020 to get that arrow out.
00:12:13.100 And he goes up to the next guy
00:12:14.520 who seems to have
00:12:15.620 a rather minor wound,
00:12:16.700 but he's like,
00:12:17.020 I'm sorry,
00:12:17.920 you're not going to live
00:12:18.720 through the night.
00:12:26.120 Tricorders don't exist.
00:12:27.620 Perfect scientific objective knowledge
00:12:31.500 doesn't exist.
00:12:33.700 However,
00:12:35.740 human assessment,
00:12:39.120 human instincts
00:12:40.520 are absolutely amazing.
00:12:46.080 When you get somebody
00:12:46.920 that's a real master
00:12:47.900 of their field,
00:12:49.820 they can often tell you things
00:12:51.960 that they don't even know
00:12:52.940 how they know them.
00:12:54.720 You know,
00:12:55.120 I've run into this myself
00:12:56.720 a few times.
00:12:57.340 When it comes to fixing cars,
00:12:58.460 I sometimes just know
00:12:59.940 what's wrong with the car
00:13:01.180 above and beyond
00:13:03.580 any diagnostics.
00:13:04.720 And this is because
00:13:05.080 I've spent a long time
00:13:06.680 fixing cars.
00:13:09.320 You just have this
00:13:09.840 gut instinct,
00:13:11.160 we call it.
00:13:12.040 Because the stuff
00:13:13.940 going on in the back
00:13:14.680 of our brain
00:13:15.160 is so complex
00:13:15.940 that we can't understand it.
00:13:16.980 We just,
00:13:17.480 we know.
00:13:21.620 And so when this medic
00:13:22.640 rushes into
00:13:23.700 the mass casualty event,
00:13:25.320 that's what he's doing.
00:13:27.220 He's not pulling out
00:13:28.320 a tricorder and saying,
00:13:29.440 well, this person
00:13:30.020 has a blood pressure.
00:13:31.400 No.
00:13:32.440 He's looking at the person
00:13:33.560 and boom,
00:13:34.060 snap judgment.
00:13:35.440 This person doesn't need help.
00:13:37.060 That person is beyond help.
00:13:38.760 This person,
00:13:39.660 I can help.
00:13:40.520 And I will take triage
00:13:48.260 over the trolley problem
00:13:49.320 any day
00:13:50.720 of my life.
00:13:54.740 See,
00:13:55.240 the trolley problem,
00:13:56.020 this whole idea
00:13:56.960 that we can have
00:13:57.660 this perfect knowledge
00:13:58.620 and use rationality
00:14:00.360 to make moral decisions,
00:14:01.860 and we can program computers
00:14:03.120 to do it first
00:14:03.940 so we don't have
00:14:04.500 human failings,
00:14:05.720 you're handing yourself
00:14:12.400 over to Leviathan
00:14:13.260 if you do that.
00:14:22.220 These technocrats,
00:14:23.660 they want to believe
00:14:26.220 that they're smarter
00:14:27.340 than anybody
00:14:27.900 who's ever lived.
00:14:29.760 That they're smarter
00:14:30.600 than anybody alive today.
00:14:31.720 That they can make
00:14:33.160 better decisions
00:14:33.960 than experts.
00:14:35.440 People,
00:14:35.900 when I say experts,
00:14:37.300 I don't mean the guy
00:14:37.880 who has 12 degrees.
00:14:39.260 I mean the guy who's
00:14:40.060 got his hands dirty.
00:14:41.720 Like,
00:14:42.340 the guy that
00:14:43.020 drilled four pins
00:14:44.260 into this hand
00:14:45.080 I might not take
00:14:46.480 his diet advice,
00:14:47.860 but when it comes
00:14:48.640 to drilling pins
00:14:49.700 into a hand,
00:14:50.380 when it comes
00:14:50.740 to hand surgery,
00:14:52.160 this guy's a bit
00:14:52.660 of a maestro.
00:14:55.060 I'm going to trust
00:14:55.960 his instincts
00:14:56.740 because he seemed
00:14:57.620 like he knew
00:14:58.120 what he was doing.
00:15:00.100 And I'm going to
00:15:00.820 trust his instincts
00:15:01.700 over an artificial
00:15:05.280 intelligence
00:15:06.140 that has been
00:15:07.660 designed to drill
00:15:08.520 the perfect holes
00:15:09.660 for pins to go
00:15:10.500 into your hand
00:15:11.300 so long as we're
00:15:12.220 dealing with
00:15:12.680 spherical hands
00:15:13.460 in a vacuum.
00:15:19.060 We are far too
00:15:19.960 ready to discount
00:15:20.920 human instinct
00:15:22.680 and understanding.
00:15:23.620 And when I say instinct,
00:15:24.200 I don't mean something lesser.
00:15:26.800 This is the most
00:15:27.860 powerful
00:15:28.500 computer in existence.
00:15:31.120 Okay,
00:15:31.220 nothing holds
00:15:32.360 a candle
00:15:33.080 to the
00:15:33.960 processing power
00:15:34.840 of this thing
00:15:35.740 right here.
00:15:37.800 And this thing
00:15:38.580 right here
00:15:39.120 takes into account
00:15:40.200 not just
00:15:40.960 thousands and
00:15:42.420 millions of
00:15:43.560 diagnostic heuristics
00:15:45.220 when you just
00:15:45.700 look at something.
00:15:47.300 You don't even know,
00:15:47.800 maybe it's that the skin's
00:15:48.580 a little bit pale,
00:15:49.220 and you know that guy's
00:15:49.840 going to die.
00:15:51.360 Who knows what it is?
00:15:52.680 All you know
00:15:53.240 is that you know
00:15:53.960 that that guy's
00:15:54.560 going to die.
00:15:54.960 This thing's
00:15:57.520 freaking amazing.
00:15:58.980 And on top of that
00:15:59.900 it includes
00:16:00.480 moral judgments,
00:16:01.800 it includes
00:16:02.260 empathy,
00:16:03.020 it includes
00:16:03.920 everything.
00:16:09.320 Humans might not
00:16:10.520 make the perfect
00:16:11.500 decisions all the
00:16:12.780 time,
00:16:13.800 but I'll promise
00:16:14.640 you this,
00:16:15.840 they'll make
00:16:16.360 better decisions
00:16:17.320 than an artificial
00:16:18.000 intelligence you
00:16:18.940 program with the
00:16:19.760 trolley problem
00:16:20.660 that winds up
00:16:22.260 cutting off
00:16:26.140 all of our
00:16:26.600 faces
00:16:27.000 and putting
00:16:28.600 wires in them
00:16:29.320 to make us
00:16:29.780 smile for all
00:16:31.160 eternity
00:16:31.640 because that's
00:16:32.680 what we told
00:16:33.280 it to do.
00:16:40.940 Don't
00:16:41.460 overestimate
00:16:42.180 your rationality
00:16:43.360 while
00:16:44.240 underestimating
00:16:45.400 this thing
00:16:47.600 in here
00:16:48.060 and this
00:16:49.700 thing in here.
00:16:51.180 Anyway,
00:16:52.460 Deus
00:16:52.760 Vult,
00:16:53.740 Aurini,
00:16:54.640 out.