RadixJournal - December 09, 2022


The AI Illusion


Episode Stats


Length

7 minutes

Words per minute

130.26

Word count

924

Sentence count

84

Harmful content

Toxicity

8 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, I talk about how language has been around for a very long time, and how it has changed over that time period, and why we should be worried about artificial intelligence (AI) transcending our understanding of language.

Transcript

Transcript generated with Whisper (turbo).
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
00:00:00.000 We have actually used language for a shorter amount of time than we would imagine.
00:00:08.700 And, you know, dogs can understand words.
00:00:13.780 I'm not sure a dog quite has a grammar, but they understand a sound wave as indicating something.
00:00:23.840 A dog knows its name.
00:00:25.600 If he hears the word walk or dinner, he perks up and is going to look at you and be like, oh, walk.
00:00:34.520 Yeah, that sounds great.
00:00:35.260 Yeah, let's do it.
00:00:36.220 So he understands language to some extent, but there's no actual grammar or logic exactly.
00:00:43.460 Now, you know, we're homo sapiens.
00:00:45.480 So, you know, wise man, the rational animal or something like that.
00:00:49.760 That is actually extremely incorrect.
00:00:53.380 I actually just saw something about this today.
00:00:55.600 There was an experiment in Germany showing that fish have a sense of numbers.
00:01:01.260 They have a sense of a larger and a smaller number.
00:01:04.020 And they can actually engage in a sort of addition to some extent.
00:01:09.080 And they don't have a frontal brain anywhere close to the extent that we have one.
00:01:15.520 We might even overestimate like the head as the seat of reason or something.
00:01:20.640 We have reason in our spinal cord.
00:01:23.860 And I've used this metaphor quite a bit.
00:01:27.700 And so I apologize if people are getting bored of it.
00:01:30.340 But there is literally no time to think if you are standing at a baseball plate and someone is throwing 70, 80, 90, 100 miles per hour.
00:01:42.980 You cannot think in that split second when you determine what the pitch is.
00:01:49.280 Is it a curve?
00:01:49.980 Is it a fastball?
00:01:50.700 Is it a changeup?
00:01:51.480 Is it inside?
00:01:53.540 Is it outside?
00:01:54.400 Is this the pitch I want to hit?
00:01:56.380 You have absolutely no time to think that.
00:02:01.840 And yet you do.
00:02:02.800 The idea that some of these baseball players could explain to you how a curveball curves?
00:02:11.160 They can't.
00:02:12.360 But they just do it.
00:02:13.860 And they know it in their bones.
00:02:15.760 Maybe kind of literally in their bones.
00:02:19.020 An outfielder, he hears a crack off the bat.
00:02:23.320 The audible level of the crack gives him information.
00:02:27.260 He sees the ball maybe even kind of peripherally to some degree.
00:02:31.560 And he sees it travel like 50 feet.
00:02:35.420 And he estimates exactly where he should run to.
00:02:39.080 And he hops to the exact spot, opens up his glove, and in a lackadaisical manner catches it.
00:02:45.480 There is reason, mathematics, rationality in our spinal cord.
00:02:52.200 And we kind of don't grasp this.
00:02:56.240 And your thinking consciousness, when you're using language,
00:02:59.940 it is kind of almost like a late stage of this.
00:03:04.560 And there also have been experiments.
00:03:06.720 I think I've mentioned these to other people.
00:03:08.540 And I'll mention two.
00:03:10.740 One of which is that your muscles will engage before you think to pick up your coffee.
00:03:18.700 Now, does that mean that we're all predetermined?
00:03:22.020 No, it does not mean that at all.
00:03:23.800 And what it means is that you are telling yourself in your mind using language, I want coffee, as a kind of post-facto rationalization of what you are instinctively doing.
00:03:38.280 Another thing.
00:03:40.280 Another thing.
00:03:41.520 So there's an experiment that's done where people are blindfolded.
00:03:45.340 And they tell people to pick up objects.
00:03:47.880 And they say, we want you to judge the texture of these objects.
00:03:53.440 And so you'll pick one up that will be like furry.
00:03:56.760 And you'll pick up another one that will be slick.
00:03:58.640 And then they'll say, which object was heavier?
00:04:02.460 And people will get it. 0.97
00:04:04.340 What that indicates, that might sound like dumb or obvious. 0.97
00:04:08.240 No, it's not dumb or obvious. 0.90
00:04:10.020 What that means is that not only are they engaged, they can engage in reason. 0.82
00:04:14.200 They're engaging in judgment unconsciously.
00:04:18.280 Your thinking, your language, comes kind of, it's like an aftereffect in a way.
00:04:24.520 And we developed language, and it's obviously immensely powerful, but we kind of shouldn't overestimate it.
00:04:34.000 We act in certain ways that are amazing and miraculous long before our language mind even gets its pants on in the morning.
00:04:44.980 And why I say all of this is that AI is nothing if not language.
00:04:51.820 I mean, it is computer code.
00:04:55.180 And ultimately, if you want to just reduce it to like the most basic thing, it is still language.
00:05:01.100 It is a binary, a one or a zero.
00:05:03.400 Machine code, the most basic kind of thing.
00:05:06.700 It is pure language.
00:05:09.640 What I am saying is that there's a kind of overestimation of language in these people who worry about like AI transcending itself or taking over the world or something like that.
00:05:22.320 It kind of can't think on some level.
00:05:25.280 Now, language matters.
00:05:26.440 Language affects us.
00:05:28.060 Language, our internal monologue is our way of rationalizing behavior.
00:05:32.640 It's also our way of kind of like super, it's a kind of superego, you could say.
00:05:36.960 It kind of like, you know, directs us and so on.
00:05:40.760 But it also shouldn't be overestimated.
00:05:45.180 We're biology.
00:05:46.080 We have instincts.
00:05:47.000 There were men around long before language.
00:05:49.820 We got on. 0.98
00:05:50.620 You know, we built campfires and hunted shit and had sex and reproduced and raised children and did a lot of stuff. 0.93
00:06:01.620 You know, none of that is possibly available in the computer. 0.98
00:06:06.420 The computer is pure code.
00:06:08.520 So this notion that like it's anything other than some like logical thing and that it's going to like be able to think for itself or overcome itself, I just find ridiculous.
00:06:25.320 And it's just based on a fundamental misunderstanding by a bunch of nerds. 0.84
00:06:35.620 Thank you.