The Jordan B. Peterson Podcast - July 21, 2022


272. Zeroes and Ones: Into The Depths of Computation | Jim Keller


Episode Stats

Length

2 hours and 19 minutes

Words per Minute

177.5

Word Count

24,760

Sentence Count

1,862

Misogynist Sentences

5

Hate Speech Sentences

6


Summary

Jim Keller is a microprocessor engineer known for his work at Digital Equipment, AMD, Apple, Tesla, and Intel. He was co-architect of what were among the earliest 64-bit microprocessors, the EV5 and EV6, designed in the 90s. In the later 90s, he served as lead architect for the AMD K8 microarchitecture, including the original Athlon 64, and was involved in designing the K7 (Athlon) and the Apple A4 through A7 processors. He is presently President and CTO at Tenstorrent, building AI computers.


Transcript

00:00:00.960 Hey everyone, real quick before you skip, I want to talk to you about something serious and important.
00:00:06.480 Dr. Jordan Peterson has created a new series that could be a lifeline for those battling depression and anxiety.
00:00:12.740 We know how isolating and overwhelming these conditions can be, and we wanted to take a moment to reach out to those listening who may be struggling.
00:00:20.100 With decades of experience helping patients, Dr. Peterson offers a unique understanding of why you might be feeling this way in his new series.
00:00:27.420 He provides a roadmap towards healing, showing that while the journey isn't easy, it's absolutely possible to find your way forward.
00:00:35.360 If you're suffering, please know you are not alone. There's hope, and there's a path to feeling better.
00:00:41.780 Go to Daily Wire Plus now and start watching Dr. Jordan B. Peterson on depression and anxiety.
00:00:47.460 Let this be the first step towards the brighter future you deserve.
00:00:57.420 Hello everyone, I'm pleased to welcome Jim Keller to my YouTube channel and podcast today.
00:01:16.000 Jim is a microprocessor engineer known for his work at Digital Equipment, AMD, Apple, Tesla, and Intel.
00:01:27.420 He was co-architect for what were among the earliest of 64-bit microprocessors, the EV5 and EV6 digital alpha processors designed in the 90s.
00:01:40.020 In the later 90s, he served as lead architect for the AMD K8 microarchitecture, including the original Athlon 64,
00:01:49.360 and was involved in designing the Athlon K7 and Apple A4 through 7 processors.
00:01:55.860 He was also the co-author of the specifications for the X86-64 instruction set and HyperTransport Interconnect.
00:02:05.740 From 2012 to 2015, he returned to AMD to work on the AMD K12 and Zen microarchitectures.
00:02:14.200 At Tesla, he worked on automotive autopilot hardware and software, designing the Hardware 3 autopilot chip.
00:02:21.440 He then served as senior VP of silicon engineering, heading a team of 10,000 people at Intel.
00:02:31.440 He is presently president and CTO at Tenstorrent, building AI computers.
00:02:40.020 He's also my brother-in-law, and we've talked a lot over the last 20 years.
00:02:50.020 He was a friend of mine before he married my sister, and we've known each other for a very long time since we both lived in Boston.
00:02:56.180 So I'm very happy to have you to talk to today, Jim.
00:03:00.600 I'm really looking forward to it.
00:03:03.560 So thanks for agreeing to do this.
00:03:05.940 Sure thing.
00:03:07.180 So let's walk through your career first.
00:03:10.340 It takes some unpacking.
00:03:11.760 Your resume takes some unpacking to be comprehensible, I would say.
00:03:16.420 So let's start with digital equipment.
00:03:18.280 You were working on very early stage, sophisticated microprocessors.
00:03:22.420 So tell me about that.
00:03:26.460 Well, the long story is I graduated college as an electrical engineer with a bachelor's, and I took a job in Florida because I wanted to live on the beach.
00:03:36.920 And it turned out to be a really interesting job.
00:03:39.360 I was at Harris, and I spent like two years working in labs, fixing up electrical equipment and doing some networking stuff and some digital design.
00:03:48.240 And at some point, a friend told me I should work at digital.
00:03:50.920 So I read about Digital, and I literally read the computer architecture manual for the VAX 780 on the plane on the way to the interview.
00:03:59.860 And then I interviewed them with a whole bunch of questions because I just read this architecture spec, which I didn't know that much about, to be honest.
00:04:07.240 But I was kind of a wise-ass as a kid.
00:04:10.340 And they hired me because they thought I was funny.
00:04:12.860 And so I got my architecture education working on the VAX 8800, working for a guy named Bob Stewart, along with some other really great architects.
00:04:23.480 So I spent about seven years in that group, you know, learning to be a processor architect.
00:04:29.860 And then I spent a little time at digital research in California for six months, and then went back and joined the Hudson team where they were building alpha processors and became co-architect with Pete Bannon on EV5 and then with Dirk Meyer on EV6.
00:04:44.920 Now, over about 15 years there, we worked on, I would say, three very successful shipping products, and then a couple other products that didn't get to market.
00:04:58.740 So those chips, from what I remember, were remarkably ahead of their time, but that didn't seem to save Digital Equipment Corporation.
00:05:08.220 Yeah.
00:05:08.540 Is that fair to say?
00:05:09.360 That's definitely fair to say.
00:05:11.560 But Digital literally made the world's fastest computer in the years they were going out of business.
00:05:17.340 And there was a complicated market dynamic.
00:05:20.200 Digital was very successful building mini computers in the mini computer age, which replaced the mainframes to a large extent.
00:05:27.520 But they missed the boat on the PC revolution and, let's say, workstations.
00:05:31.640 And they had a big, expensive computer mindset right when computer prices were falling a lot.
00:05:37.640 So we were building really fast processors for server and high-end workstations, and the market had kind of moved on.
00:05:44.700 And let's say a lot of crazy things were going on inside the company at the time.
00:05:49.140 Yeah.
00:05:49.580 Well, we're going to return to the topic of crazy things going on inside computers.
00:05:52.880 But it's interesting to note right there.
00:05:54.720 So that's a situation where a company has a great product but doesn't know how to launch it into the marketplace or is blinded by its own preconceptions.
00:06:03.280 It can't even necessarily say what happened.
00:06:04.840 More than that, Gordon Bell was CTO, and he was a really brilliant computer architect, but he also had really good, let's say, observational skills.
00:06:13.160 And midway through the VAX 8800, he decided the technology we were using was a little too late and redirected the program.
00:06:21.260 And it became a very successful product because he knew what was going on and he made decisions like that.
00:06:26.140 But when he left, I think digital became an argument between business unit managers, not about technology.
00:06:34.460 And Alpha was a great technology, but it went into business units that were aiming at high prices and high margins, not market penetration, and not basically keeping up with the software revolution that was happening at the time.
00:06:47.580 And so it was Ken Olson was a great manager, but he wasn't a technical leader.
00:06:53.720 And without Gordon Bell, the company kind of lost its way.
00:06:57.880 And when companies lose their way, they fail.
00:07:00.060 And they failed fast, too.
00:07:01.880 You know, they went from record profits to losing a billion a year, and they kind of rectified that for a little while.
00:07:08.760 And then sales dropped off again, and it was over.
00:07:12.140 Yeah, well, one of the things I've been struck by watching your career and talking to you over the years is exactly that.
00:07:17.280 The rate at which a company that appears dominant can disintegrate and disappear is really quite stunning.
00:07:24.500 I think Fortune 500 companies tend to last no more than 30 years.
00:07:29.920 That's approximately the span.
00:07:31.580 And that's not a tremendously long period of time.
00:07:35.380 So there's always dominant companies and always a handful of dominant companies, but the company that's dominant tends to shift quite quickly.
00:07:43.060 Well, I should return to this a couple of times.
00:07:45.520 There's, you know, the classic S-curve in economics.
00:07:48.580 You start out low, you solve a problem, you ramp up, you plateau, and then you fail.
00:07:53.260 And this dominates business, it dominates humanity at some level.
00:07:58.120 And it, you know, plays out over and over.
00:07:59.860 Now, it sounds like you didn't have much preparation. So how is it that you managed to do this job?
00:08:06.000 You intimated when we were talking that you weren't really trained for it.
00:08:09.540 And so you were trained as an engineer.
00:08:11.780 You had an, is it a bachelor's degree in engineering?
00:08:15.080 And how prepared were you as a consequence of your degree for any of the jobs that you undertook?
00:08:21.580 Training is highly overrated.
00:08:22.880 So, so a good engineering degree is math, physics, basic understanding of science, and some smattering of communication skills.
00:08:31.220 You can probably do a great engineering degree in two or three years if you're dedicated to it.
00:08:36.780 You know, the things that stretched my brain the most when I went to college were math and mechanical engineering.
00:08:41.940 I went to Penn State, and they used mechanical engineering and math as two of the weed-out courses to find out if you had the chops or the gumption to get through engineering.
00:08:54.980 So there was a fairly high failure rate there.
00:08:58.000 But mechanical engineering is a really interesting discipline because you have to think, think about solving math problems spatially.
00:09:04.220 Like, you know, do things like, how do you calculate the force on a rotating, accelerating object?
00:09:10.820 Like, it's somewhat complicated.
00:09:13.180 And it makes you really think.
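[Editor's note: the rotating, accelerating object Keller mentions combines a centripetal force component with a tangential one, at right angles to each other. A quick worked sketch, with made-up numbers purely for illustration:]

```python
import math

def net_force(m: float, r: float, omega: float, alpha: float) -> float:
    """Magnitude of the net force on a mass m (kg) at radius r (m),
    spinning at angular velocity omega (rad/s) while its spin rate
    accelerates at alpha (rad/s^2)."""
    f_centripetal = m * omega ** 2 * r  # pulls toward the center
    f_tangential = m * alpha * r        # pushes along the direction of motion
    # the two components are perpendicular, so combine them like a right triangle
    return math.hypot(f_centripetal, f_tangential)

# 2 kg mass on a 0.5 m arm, spinning at 3 rad/s, speeding up at 4 rad/s^2:
# centripetal = 2 * 3^2 * 0.5 = 9 N, tangential = 2 * 4 * 0.5 = 4 N
print(round(net_force(2.0, 0.5, 3.0, 4.0), 2))  # → 9.85
```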
00:09:14.900 So you have to learn to think.
00:09:16.360 And in engineering school, you never answer a multiple choice question.
00:09:20.180 You learn stuff and formulas, but then you calculate to understand what the result is.
00:09:27.800 And the problem sets are like little design exercises.
00:09:31.540 So how much of it do you think is pure screening, or let's say fundamental screening, for conscientiousness and IQ, and how much of it is learning to think?
00:09:48.580 How much of the education process is that?
00:09:50.900 Like, if you're going to hire an engineer, are you hiring fundamentally on the basis of IQ and you get smarter people from the top schools?
00:09:57.100 Or do you think that the engineering training actually does prepare people for a technical career?
00:10:03.780 Well, it depends on the engineer and it depends on the school and depends on their approach.
00:10:08.380 So my IQ isn't super high compared to really smart people.
00:10:12.000 I mean, it's high enough.
00:10:14.000 When I went to college, it took me about a year and a half to learn how to think properly.
00:10:18.760 And I found for me personally, I had to do the work on a regular basis early.
00:10:24.160 I didn't study for finals.
00:10:26.060 I wasn't the kind of person that could pick up a book, understand it, get an A the next day, and then forget about it.
00:10:31.600 I'm not, I can't do that.
00:10:33.680 So I had to learn how to do the work, go through the mechanisms, automatize some of the basics so I didn't have to think about them so hard.
00:10:42.780 But I literally let my brain work on this stuff so that I could use it to go problem-solve.
00:10:49.680 And especially in engineering, there's lots of different kinds of engineering.
00:10:52.780 There's like highly technical stuff where you turn the crank, like a skilled lawyer might.
00:10:57.880 But there's other stuff where you have to be really creative.
00:11:00.380 You have an unsolved problem that nobody solved before.
00:11:03.080 And as an engineer, you have a skill set, right?
00:11:06.320 But you have to apply it creatively.
00:11:07.760 And there's lots of high IQ people who aren't creative and there's low IQ people that are creative.
00:11:14.040 And you find in a big engineering team, there's a real diversity of personality types.
00:11:18.980 There's open-minded people, conscientious people, gregarious people.
00:11:23.280 And it takes many different kinds of people working together to do something sophisticated.
00:11:29.360 I'd say, you know, some of my senior classes in engineering were just going a little deeper on stuff I already knew.
00:11:37.100 So I could have left it after three years and been just fine.
00:11:41.300 But I do think the work I did did help me, you know, be an engineer.
00:11:46.880 But then the problems I saw, you know, I worked on after I graduated college.
00:11:51.900 Like in school, most of the problems you're given, there's a known answer because they're in a book, you know,
00:11:57.880 and you're in a room with 20 people and they're doing the same stuff.
00:12:01.240 When you're an engineer working in a company, they never give two people the same thing to do, because that's a waste of money, right?
00:12:07.100 And when you start engineering, you're given relatively small tasks by, you know, your manager or supervisor.
00:12:13.280 But as you go along at some point, depends again on who you are, you're working on stuff that you don't even know how to deal with.
00:12:22.760 You know, there's no answer in the book.
00:12:24.740 So it's, but it's not like physics, right?
00:12:26.940 Like physicists are a funny bunch.
00:12:28.260 I realized this the other day that physicists, they're supposed to work on stuff that's unsolved.
00:12:34.280 Unsolved.
00:12:35.800 Whereas engineers, you know, there's a big repertoire of engineering and then it's reduction to practice.
00:12:42.820 And then the world's complicated.
00:12:44.520 So you, you know, go build a new bridge that's never been built before.
00:12:48.300 It's not like bridges are unsolved problems.
00:12:50.700 This particular bridge hasn't been solved before.
00:12:53.140 You know, maybe unique challenges to it.
00:12:55.460 But it's not like physics where you're looking for an unknown particle or, you know, it's, you know, there's a pretty big dividing line between engineering and pure science.
00:13:05.340 Engineers typically work in domains where there's many, many knowns and the unknowns are problems of the combination of, you know, reality, you know, complexity.
00:13:15.720 Whereas physics, physics in principle, they're working on stuff that's fundamentally unknown.
00:13:21.720 As soon as it's known, they have to move on because, because then it's engineering, like, like physicists, you know, translate the unknown into engineering and engineering applies known concepts to unknown problems.
00:13:33.880 Okay.
00:13:34.000 So you, okay.
00:13:34.800 So now you, you went from digital to AMD and you learned how to design microprocessors.
00:13:40.900 So at AMD, you worked on the K8.
00:13:43.540 Yeah.
00:13:43.780 And at that point, AMD was losing ground to Intel.
00:13:48.420 Yes.
00:13:49.160 And so how did you fix that?
00:13:52.040 So basically, Dirk Meyer and I were co-architects of EV6, the third alpha chip.
00:13:58.480 He left Digital for AMD about a year before I did.
00:14:02.660 He started the K7 project.
00:14:04.720 When I joined, I started the K8 project, and then worked with him significantly on K7 as well.
00:14:10.860 So, um, and how did we do it?
00:14:14.100 Well, yeah.
00:14:15.340 So those were 64-bit chips that you guys designed to compete with the Intel chips that had dominated the home computer market at that point.
00:14:25.740 Well, so there's a funny thing, which is.
00:14:27.700 Like at some level building fast computers, isn't that hard, right?
00:14:33.360 So you have to have a goal.
00:14:35.780 Like, a lot of designers, they have a design.
00:14:39.720 And then the easiest thing for the next one is to go look at that design and make it like 10% better, 20% better.
00:14:46.540 Right.
00:14:46.980 But every one of those designs has limitations built into it.
00:14:51.200 Like, like it's sort of like, if you have, if you buy a two bedroom house, you can add one bedroom.
00:14:57.720 You can't add eight bedrooms, right?
00:15:00.060 If you want an eight bedroom house, you have to build a different kind of house.
00:15:03.360 Right.
00:15:04.040 So every design has kind of a, you know, a range that it can play in.
00:15:10.000 And you build the first one and, you know, you can make some improvements, but at some point the improvements don't really help that much.
00:15:16.980 Right.
00:15:18.080 And so AMD, they had a design called K5, which for complicated reasons didn't work out that well.
00:15:24.900 And they lost ground to Intel.
00:15:26.480 Before that, with the 386 and 486, AMD literally copied Intel's designs.
00:15:31.240 They were a clone manufacturer.
00:15:33.300 The K5 was their first design and it didn't work out that good.
00:15:36.720 And then they bought a company called Next Gen, which had K6, which is an okay design, but it wasn't competitive against Intel.
00:15:42.840 And then K7, Dirk was the chief architect of, and he designed a computer that was competitive and ahead of Intel, and some of that came from our work at Digital on EV5 and EV6.
00:15:57.040 Dirk worked on EV4 as well.
00:15:58.360 And some of it was just saying, in this day, you know, we have this many transistors, but you get more transistors every generation.
00:16:06.860 So you can basically imagine you're building a house, suddenly you have way more bricks and way bigger steel beams.
00:16:12.040 So your idea about what to build has to scale with that.
00:16:17.320 And then K7 was a 32-bit chip and then K8 was a 64-bit chip, you know, somewhat related to that as it turned out, but also it was built to be bigger.
00:16:28.400 And what I did is I wrote the performance model.
00:16:34.080 I came up with the basic architecture and I started to organize the team around building it.
00:16:38.420 And while we were doing that, we also wrote the thing called hypertransport spec, which became the basis of essentially all modern server computers or what's called two-socket servers.
00:16:48.420 We wrote that in '98, and in 2022 they're still building them that way.
00:16:54.540 And when you say you wrote it, what does that mean?
00:16:56.820 What does the process of writing that entail?
00:16:59.900 What is it that you're writing and how do you do that?
00:17:02.360 I'm dyslexic.
00:17:03.300 So I wrote a complete, you know, protocol spec about how two computer chips talk to each other in 18 pages, right?
00:17:11.680 Which is relatively terse.
00:17:14.420 And there's a couple of pictures and, you know, computer protocols are pretty straightforward.
00:17:18.620 There's a command, there's the address you're talking to, there's the data you're moving, there's some protocol bits that tell you how to exchange commands, right?
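[Editor's note: the message anatomy described here, a command, an address, a data payload, and a few protocol bits, can be sketched in a few lines. This is a toy illustration with made-up field names and widths, not the actual HyperTransport encoding:]

```python
# Toy sketch of a point-to-point link packet: a command, the address
# you're talking to, the data being moved, and a sequence field standing
# in for the protocol bits. All field widths here are hypothetical.
from dataclasses import dataclass

CMD_READ, CMD_WRITE = 0x1, 0x2

@dataclass
class LinkPacket:
    command: int  # what operation to perform
    address: int  # which location it targets
    data: bytes   # the payload being moved
    seq: int      # protocol bits: ordering / flow control

    def encode(self) -> bytes:
        # fixed-width little-endian header, then the payload
        header = (self.command.to_bytes(1, "little")
                  + self.seq.to_bytes(1, "little")
                  + self.address.to_bytes(8, "little")
                  + len(self.data).to_bytes(2, "little"))
        return header + self.data

def decode(raw: bytes) -> LinkPacket:
    command, seq = raw[0], raw[1]
    address = int.from_bytes(raw[2:10], "little")
    length = int.from_bytes(raw[10:12], "little")
    return LinkPacket(command, address, raw[12:12 + length], seq)

pkt = LinkPacket(CMD_WRITE, 0x1000, b"\xde\xad", seq=3)
assert decode(pkt.encode()) == pkt  # round-trips cleanly
```

[A real spec also pins down flow control, ordering rules, and error handling; the point is just that the heart of a link protocol is a small, fixed message layout.]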
00:17:25.920 And then Dirk took the spec and said, you mind if I flesh this out a little bit?
00:17:30.280 And three days later, he sent me a 50-page version of it, which clarified all the little bullets.
00:17:35.440 And then that specification, we literally used to build the interface between K8s, right?
00:17:41.980 So there's a couple levels of design.
00:17:45.080 What sort of impact did that have on the broader world?
00:17:50.900 What's the significance?
00:17:52.040 It's very difficult for non-engineers to understand any of this.
00:17:55.780 AMD market share and server went from 0% to 35%, which was a huge impact to the business.
00:18:04.320 And it became essentially the standard because apparently Intel had a version of that, but it didn't go to market.
00:18:10.840 But after Opteron came out to market, Intel built a similar version, a similar protocol for how to connect a small number of processors together with that kind of interconnect.
00:18:21.840 And then that, let's say, design framework became standard in the industry.
00:18:27.060 So if you go into a Google data center and you pull a server out, there'll be two sockets with an interconnect between the two of them.
00:18:34.360 And each socket will have memory attached to it.
00:18:36.440 And they call it the 2P server or two processor server.
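[Editor's note: a minimal picture of the topology being described, with illustrative names and numbers only, not any vendor's actual configuration:]

```python
# Sketch of a two-socket ("2P") server: two CPU sockets, each with its
# own locally attached memory, joined by a point-to-point interconnect.
sockets = {
    "socket0": {"memory_gb": 256, "links": ["socket1"]},
    "socket1": {"memory_gb": 256, "links": ["socket0"]},
}

def is_2p(topology: dict) -> bool:
    # exactly two sockets, each with local memory and a direct link
    # to the other socket
    return len(topology) == 2 and all(
        s["memory_gb"] > 0 and set(s["links"]) == set(topology) - {name}
        for name, s in topology.items()
    )

assert is_2p(sockets)
```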
00:18:40.160 And it had a really big impact.
00:18:42.960 We didn't do it because we thought it was going to have a big impact.
00:18:45.520 We did it because we thought it was a better way to build computers.
00:18:49.000 And at AMD, we were somewhat resource-constrained.
00:18:51.540 So we couldn't build a thing that looked like a big IBM server.
00:18:54.280 So we built what was basically a small server with the minimal amount of interconnect between it.
00:19:00.100 So it was a little bit of creativity by constraint.
00:19:04.740 Steve Jobs line.
00:19:06.700 And what function do those servers have, again, in the broader world?
00:19:10.320 What are they doing now for people?
00:19:12.220 Well, it's basically the entire cloud.
00:19:14.080 It's all Google, all Amazon, all Facebook, all Microsoft Azure.
00:19:18.260 But here's the interesting thing.
00:19:19.760 When we built them, the big server guys, servers used to be backplanes like this big with multiple CPU slots, multiple memory cards, multiple I.O. slots.
00:19:29.380 And the server manufacturers thought the server was oriented around the backplane.
00:19:33.780 So IBM, HP, Dell, they all turned this down.
00:19:36.660 But all the little startups at the time, like Google, were using PCs as low-cost servers.
00:19:43.900 And we made this, basically, you could take a PC board, instead of putting one computer on it, you could put two, which radically saved the money.
00:19:51.700 So when AMD made those kinds of servers, it was a way lower entry point for server-class technology.
00:19:57.680 And the little startups at the time used it, and then over 15 years, disrupted all the big server manufacturers.
00:20:06.480 So it's, you know, it's one of those, I couldn't say we planned it.
00:20:11.080 Like, we had constraints and a target market. We didn't know that it was going to become essentially how servers were built, you know, for 20-odd years.
00:20:19.240 But it happens.
00:20:20.380 After AMD, you went to Apple, you worked on the A4 through 7 processors.
00:20:27.780 Well, I worked at two startups that did processors for networking, SiByte and P.A. Semi.
00:20:34.160 And that was probably about five or six years.
00:20:38.020 And then I joined Apple in 2008.
00:20:40.140 So I guess I was, no, it must have been eight years.
00:20:42.440 I was at AMD in '98, '99, 2000.
00:20:47.720 And then I worked at startups for about eight years.
00:20:51.840 And then I went to Apple.
00:20:55.340 Yeah, and worked on mobile processors.
00:20:57.540 And so tell me about those chips and what you did at Apple.
00:21:00.620 The first funny part is, I had some friends that were working at Apple, and they wouldn't tell me what I was going to work on.
00:21:05.460 So when I interviewed there, they said, oh, you should just come here.
00:21:08.220 It'll be fun.
00:21:08.720 And I didn't actually know what I was going to work on.
00:21:12.300 They had a group called Platform Architecture, run by a guy named Mike Culbert, who was like the unofficial CTO of Apple.
00:21:19.700 He worked for Steve Jobs.
00:21:21.360 And he had a group of architects that looked at what Apple was doing and figured out what they should do next.
00:21:26.960 And I worked on a MacBook Air definition, like I wrote the power management spec and did some other architectural work, which ultimately was an NVIDIA chip called MCP89.
00:21:40.160 And then I was one of the chief architects of, you know, four generations of SOCs, what's called A4, A5, A6, A7.
00:21:48.820 And we did a lot of stuff there, but the division was, you know, mobile phones.
00:21:54.520 And SOCs are what?
00:21:56.380 Yeah, system on a chip.
00:21:57.920 Oh, yes.
00:21:58.780 To pack a computer into a phone, you have a piece of silicon about that big.
00:22:03.380 And all the components, the CPU, the GPU, the IO, are all on the same chip.
00:22:08.420 And when they first started building phone chips, they were considered to be very slow, low cost, you know, very integrated chips.
00:22:17.220 And we thought, if you looked ahead, because technology shrinks about every two years, in about six or eight years we'd have enough transistors on a phone chip that it would be more powerful than a PC at the time.
00:22:29.280 So we started architecting computers, interconnects, and other functions, so that when we had enough transistors, we could literally have, you know, a high-end desktop in a phone.
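[Editor's note: the back-of-the-envelope arithmetic behind that bet: if the transistor budget doubles roughly every two years, then six to eight years out you have 8x to 16x as many transistors to work with. A minimal sketch of that calculation:]

```python
# Back-of-the-envelope: transistor budget doubling every ~2 years.
def scaling_factor(years: float, doubling_period: float = 2.0) -> float:
    """How many times more transistors after `years` have passed."""
    return 2.0 ** (years / doubling_period)

# Six to eight years out, a phone SOC gets 8x-16x the transistors,
# enough to close the gap with a desktop-class chip of today.
assert scaling_factor(6) == 8.0
assert scaling_factor(8) == 16.0
```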
00:22:43.480 And Apple's DNA is, you create the product that kills your current product.
00:22:51.320 You create the product that?
00:22:52.800 That killed...
00:22:53.400 So every company has a great product, and they worry about competitors coming in and killing it.
00:22:57.700 And Apple wanted to be the first to kill their own products.
00:23:01.460 So Steve Jobs thought phones and tablets would replace PCs.
00:23:05.580 And he wanted to be the first to do it.
00:23:07.620 He didn't want somebody else to do it to him.
00:23:10.060 Did you know Jobs?
00:23:11.780 No.
00:23:13.180 I've seen him a couple times.
00:23:14.460 I said hi to him twice.
00:23:16.400 I felt like I knew him pretty well.
00:23:18.020 Everybody at Apple did.
00:23:19.240 Like, when Steve wanted something done, everybody knew the next day.
00:23:23.660 My boss, Mike, talked to him every single day, multiple times sometimes.
00:23:28.480 It's a joke.
00:23:29.100 Like, we'd walk in Mike's office, and he'd be holding the phone out like this.
00:23:32.700 He goes, Steve, he's a pest.
00:23:35.480 We're like, yeah.
00:23:36.660 So what?
00:23:37.620 But Mike could translate what Steve wanted into engineering stuff.
00:23:40.780 And Steve trusted Mike a lot.
00:23:44.800 And like, he could translate the vision into engineering.
00:23:48.540 And Steve's judgment on stuff like this is spectacular.
00:23:52.040 So, and did you have any sense, do you have any sense of why that is?
00:23:56.460 I mean, Jobs was famously, obviously originated Apple, and then was famously brought back in
00:24:01.360 to save them when they were in danger of extinction.
00:24:05.240 And then, in fact, did seem to save them.
00:24:07.540 And you never know when you hear about these things from the outside how much of that is
00:24:11.100 sort of a mythologization of a person and how much of it is, you know, this person was
00:24:16.200 really singular and unique.
00:24:18.260 Yeah.
00:24:18.700 And so.
00:24:19.100 Definitely singular and unique.
00:24:20.500 I mean, in your psychological parlance, as we've talked about, he would be considered high
00:24:25.580 in openness and disagreeableness, right?
00:24:29.680 And I think negative emotionality.
00:24:32.280 Like, he was a very difficult person.
00:24:34.900 But his solution to, you know, the risk that things could go really bad, and to being disagreeable, was:
00:24:42.140 I'm going to make it as great as possible.
00:24:44.960 And he was willing to take the risk for that.
00:24:47.140 You know, his public persona was very well practiced.
00:24:52.100 Mike used to say, the worse the practice for the, you know, Apple keynotes, the better
00:24:57.240 they would go off.
00:24:59.320 Like, he was throwing iPhones at one of the iPhone pre-launch
00:25:04.360 practices because nothing was right.
00:25:06.260 But then when he showed up and, you know, his persona of, you know, technical explainer,
00:25:12.620 let's say, you know, that was very real.
00:25:15.480 That's what he wanted to portray.
00:25:17.280 He believed every single bit of it.
00:25:18.720 You could tell.
00:25:20.300 So, you know, any engineers, when I joined Apple, I watched some of his early keynotes
00:25:25.280 when he came back to Apple and changed the Macs.
00:25:27.460 It was, you know, it's inspiring.
00:25:30.500 But, you know, it's also super tough, right?
00:25:32.480 Because he went into a company that was very dysfunctional, had a whole bunch of, you know,
00:25:36.940 engineering groups doing basically random stuff.
00:25:39.240 Let's say, senior managers who felt like they owned their product lines and knew what they
00:25:44.100 were doing.
00:25:45.680 And, you know, Steve wanted them to do what he wanted them to do, and they didn't want
00:25:48.540 to do it.
00:25:48.900 And I'm pretty sure he cleaned house pretty thoroughly.
00:25:51.600 And he famously reduced the product lines.
00:25:54.140 And, you know, from who knows how many products down to, like, four.
00:25:58.340 You know, there was consumer and professional.
00:26:01.360 So, do you think it was that disagreeableness?
00:26:04.580 I mean, we hear all the time now in the modern world about the necessity for empathy and so
00:26:09.520 forth.
00:26:10.000 And that's the agreeableness dimension.
00:26:12.780 And you're making the claim that Jobs was low in agreeableness and that he was able to
00:26:17.420 kill off malfunctioning projects.
00:26:19.240 And that's not exactly a nice thing to do.
00:26:23.440 And so imagine you go to a room full of people who dedicated the last five
00:26:28.080 years of their careers to building products that you can't sell.
00:26:33.260 And you say, we have to do something completely different.
00:26:35.520 And everybody's, you know, every day as an engineer, you work on something, you embrace
00:26:40.640 it, you love it, care about it.
00:26:43.020 Like engineers are very emotional people somewhere in their pointy little souls, right?
00:26:47.560 So, but if it's not working out for whatever reason, you have to do something different.
00:26:53.080 And if you listen to everybody, you'll never change anything.
00:26:58.080 Right.
00:26:58.580 It's difficult to get people reoriented.
00:27:01.580 Now, another line you gave me, I don't know where it came from, is you run fastest when
00:27:04.740 you're running towards something and away from something.
00:27:08.500 Yeah.
00:27:08.960 That was from animal experimental literature.
00:27:10.780 If you threaten a rat and offer a reward simultaneously, it will run faster
00:27:16.380 towards the reward than if you just reward it.
00:27:19.020 Because you get all your motivational systems on board that way.
00:27:22.280 Yeah.
00:27:22.640 Yeah.
00:27:23.340 So Steve was very good at the vision.
00:27:25.020 We are going to build this beautiful computer.
00:27:30.680 Right.
00:27:31.660 And you better goddamn build it now or you're going to die.
00:27:35.340 So that was his, okay.
00:27:36.720 So the openness, the openness, that's the creativity dimension.
00:27:41.200 That gives him the vision.
00:27:42.440 He's extroverted.
00:27:43.520 Can he communicate enthusiastically?
00:27:46.880 He can certainly put on the act.
00:27:48.520 I have no idea if he was extroverted or not.
00:27:50.140 I never saw him be extroverted in any natural setting.
00:27:54.340 I just, you know, I've seen him walk around.
00:27:56.380 Like I said, I used to see him in the cafeteria.
00:27:58.740 He was, he was visible on the Apple campus, even until his last days.
00:28:03.260 He didn't like to be bothered.
00:28:04.320 Like you didn't go up to Steve and say, hey, Steve, how's it going?
00:28:08.960 Right.
00:28:09.100 Well, that would be reflective of basic disagreeableness too.
00:28:12.160 Right.
00:28:13.000 You know, it's hard.
00:28:15.100 Sometimes people can communicate a vision very effectively because they are high in
00:28:19.120 openness.
00:28:20.240 Extroverted people are enthusiastic and assertive.
00:28:22.780 So they tend to be verbally dominant and can inspire people because they generate a lot of
00:28:26.920 positive emotion, but that can be mimicked by openness.
00:28:29.400 So, so you'll have to remember like Steve was part of Pixar and very much part of Hollywood
00:28:35.540 and, you know, creating movies and creating personas and characters and archetypes.
00:28:41.240 So he was super well grounded in how that stuff works and what works and doesn't work about
00:28:46.040 it.
00:28:46.900 Right.
00:28:47.420 And he had an unerring eye for beauty and elegance.
00:28:52.060 And he cared about it.
00:28:53.280 And he would fight for that.
00:28:54.740 And that's hard.
00:28:55.700 It's very hard to fight for beauty and elegance.
00:28:55.700 And I suspect it's particularly hard, maybe I'm wrong about this, but I would think
00:29:02.300 that would be a hard sell to at least a subset of engineers.
00:29:06.540 Yeah.
00:29:07.040 So another thing, this was explained to me by my boss at Tesla. I worked both for Elon and for a
00:29:12.940 guy named Doug Field.
00:29:14.380 And Doug said, there's this, you know, there's this productivity graph versus order.
00:29:19.720 So at the origin is zero productivity and chaos.
00:29:23.540 Right.
00:29:24.020 And then as you add order to your design methods, your productivity will go up.
00:29:29.720 And what happens with engineers is they understand this: they get better processes, they
00:29:34.700 get better training, they get better at working together.
00:29:38.260 Every single thing that makes the whole organization more orderly improves productivity.
00:29:42.820 Unfortunately, that peaks at some point and then too much order, productivity goes down.
00:29:49.440 And so then, you know, as I say, any idiot can see, you should be at the peak, you know,
00:29:54.560 enough order to really be effective, but not so much order you grind to a halt.
00:29:59.420 But why can't you stay there?
00:30:01.260 And the reason is, is once order takes over the organization, it's unstoppable.
00:30:08.400 Right.
00:30:09.180 It feels good.
00:30:10.560 You get even better at doing what you're doing.
00:30:12.720 You get even more organized.
00:30:14.580 You micromanage your time even better.
00:30:16.700 You close out all the creativity.
00:30:18.340 You're not open to change.
00:30:19.960 A whole bunch of bad things happen.
00:30:21.540 You shut out the disorderly people who actually know how to make a change and do something creative.
00:30:27.800 And the organization dies.
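The productivity-versus-order curve described here can be sketched as a toy inverted-U model. The functional form and the peak location below are illustrative assumptions; the conversation only specifies the shape: zero productivity in chaos, rising with order, peaking, then declining.

```python
import math

# Toy inverted-U model of the productivity-vs-order curve described
# above. The form order * exp(-order / peak_at) is an assumption;
# only the shape comes from the conversation.
def productivity(order: float, peak_at: float = 5.0) -> float:
    """Productivity rises with order, peaks at `peak_at`, then falls."""
    return order * math.exp(-order / peak_at)

# Adding order helps at first...
assert productivity(1) < productivity(3) < productivity(5)
# ...but past the peak, "too much order, productivity goes down".
assert productivity(5) > productivity(8) > productivity(12)
```

In this toy model the curve is nearly flat around its peak, which is one reason the slide into too much order is hard to notice from inside an organization.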
00:30:28.900 You think that Jobs was conscientious as well?
00:30:33.860 You know, like, was he in early?
00:30:35.400 Was he working 18-hour days?
00:30:37.540 I know he was up in the middle of the night because he called Michael and that stuff.
00:30:40.880 Well, the point is, both Steve and Elon were counter forces to order.
00:30:46.240 Right.
00:30:47.080 You have to be really strong to avoid your organization getting captured by order.
00:30:52.180 Well, order also has its remarkable air of moral virtue, right?
00:30:56.320 Because it's pure and it's efficient.
00:31:01.360 If you think about it, it feels good, you know?
00:31:01.360 But, you know, it's like alcohol.
00:31:03.120 The first drink feels right.
00:31:04.060 The second feels okay.
00:31:05.000 The third one, not that good.
00:31:06.640 But, you know, you keep remembering what the first one did.
00:31:09.820 So, you drink it.
00:31:11.440 Right.
00:31:11.940 You know, there's lots and lots of processes, you know, where some is good, too much is bad.
00:31:18.060 But the counter force to more is weak.
00:31:22.040 And that's the thing, you know.
00:31:24.520 So, Steve was interesting because he was simultaneously super creative and had visions, which could inspire people.
00:31:31.340 But he also prevented the company from being over-organized, which would have stopped them from doing what he wanted.
00:31:38.900 And that's hard because people, like I said, they get committed to what they're doing.
00:31:44.060 Yeah, well, it's an open question.
00:31:46.320 Like, imagine that the creative process has a productive component and then a culling component.
00:31:52.040 And the productive component looks like it's associated with openness.
00:31:55.320 But what the culling component is, is open to question.
00:31:59.300 And it does seem to me that, at least upon occasion, it's low agreeableness.
00:32:04.100 It's the ability to say, no, we're going to dispense with that and to not let anything stand in the face of that decision, which would also include often human compassion.
00:32:18.060 Yeah.
00:32:18.300 And people have different approaches to it.
00:32:19.880 Like Jobs would cull things because they weren't beautiful or they weren't great.
00:32:23.140 But, you know, Elon Musk is famous for going to first principles and really understanding it fundamentally and culling from a standpoint of knowledge.
00:32:33.300 Yeah.
00:32:33.760 And you've asked me, like, what makes an engineer great?
00:32:37.200 Like, so you have to have the will to creativity.
00:32:40.580 Like, now there's lots of engineering jobs that aren't creative.
00:32:43.420 Like, you need a skill set, you can exercise the skill set.
00:32:46.340 But if you're going to build new things, you need to be creative.
00:32:48.980 But you also have to have a filter good enough to figure out what's actually good and bad.
00:32:55.100 Like, I know a lot of really creative engineers and they find a new thing and they're excited and they go down the rabbit hole on it and they, you know, they can work on it for six months and nothing to show for it.
00:33:03.800 So you have to have that conscientiousness.
00:33:06.160 I don't know if it's conscientiousness, disagreeableness, you know, that, that taste on how.
00:33:10.760 Well, the conscientiousness would keep you working in the direction that you've chosen and, and doing that diligently and orderly.
00:33:18.780 The, the low agreeableness, well, that's, that's the open question because agreeableness is such a complicated dimension.
00:33:25.100 There's obvious disadvantages and advantages at every point on the distribution.
00:33:29.780 I mean, disagreeable people are often harder to work with because they don't care much about your feelings.
00:33:34.460 But one thing I've noted about working with disagreeable people is you always know what they're thinking.
00:33:39.360 And if you want someone to tell you what's stupid and wrong, they're perfectly willing to do that.
00:33:45.160 Yeah.
00:33:45.660 I used to watch, and I used to wonder.
00:33:48.120 So Dirk Meyer was a disagreeable manager, but he could tell you what was wrong with what you were doing in a way where you would go,
00:33:53.760 okay.
00:33:54.720 Like he was very unemotional about it.
00:33:56.680 Like he goes, Jim, I really like this and this, but this isn't working for shit.
00:34:00.840 Like, what are we going to do about it?
00:34:02.000 And it would just be so matter-of-fact.
00:34:06.200 I would say when I was younger, I was a lot less disagreeable.
00:34:10.520 You know, I'm fairly open-minded and, you know, I like to create new stuff.
00:34:17.380 But then I saw enough things fail over the years because we didn't make the, you know, let's say the hard choices about something.
00:34:23.540 And then, you know, you hate to work on something for two years and have it go away because at some point you realize you're doing a couple of wrong things.
00:34:31.060 And you didn't do something about it when you could.
00:34:35.240 And so as a, you know, as a manager and a senior leader, I'm somewhat famously disagreeable.
00:34:41.940 Part of it's an act to get people to move.
00:34:44.680 And part of it's, you know, my beliefs that I can't have people dedicate themselves to doing bad things for very long because it'll, it'll bite us.
00:34:53.980 Yeah, well, we've talked a little bit about this too, about the moral dilemma between agreeableness and conscientiousness.
00:35:00.500 They're both virtues.
00:35:03.020 Agreeableness seems to me to govern short-term intimate relationships like that between a mother and a child.
00:35:08.540 And it involves very careful attention to the emotional reactions of another person and, and, and the optimization of those in the short term.
00:35:18.580 But conscientiousness looks like a longer term virtue and they come into conflict at some point because sometimes.
00:35:25.380 They come into conflict in the midterm.
00:35:27.820 Yes.
00:35:28.760 Right.
00:35:29.060 Yeah.
00:35:29.260 It's, you know, it could just be, you know, how our brains see the future, but, but it's like, you know, if you're managing the group and you have to fire somebody, it's hard.
00:35:37.220 Right.
00:35:38.060 But do you want to fire five people now or everybody later?
00:35:40.660 Like, well, once you've internalized that and taken responsibility for that decision, then making management and leadership choices is always hard, but it's so much better to make them
00:35:54.100 and then succeed than it is to fail because you couldn't make the hard calls.
00:35:58.380 Yeah.
00:35:59.100 Well, it isn't obvious at all.
00:36:00.400 Who's, who's got the upper hand, you know, someone who fires early out of necessity, but is accurate and looking carefully or someone who, you know, is willing to let people drag on.
00:36:13.320 I'll give you, I'll give you two counter examples of that.
00:36:15.400 So Jack Welch in his book, Straight from the Gut, said a weird thing.
00:36:19.280 He said, you know, once you have a doubt on somebody, you never act fast enough, which, you know, it took me years to really believe that.
00:36:26.420 And then the other weird one is people say, Hey, I have this organization of a hundred people and there's five, five people that aren't working out, but I'm not sure who they are.
00:36:36.120 So I'm going to be really careful because I don't want to accidentally fire a good person.
00:36:42.260 Right.
00:36:42.640 That makes sense.
00:36:43.280 Right.
00:36:43.940 You got five bad people, you know, maybe you figure out who two or three of them are, but there's this other group of five or 10.
00:36:49.260 You're not sure which ones are the wrong ones.
00:36:52.320 Here's the sad truth.
00:36:53.600 There's a lot of people in the world.
00:36:55.360 You're better off firing too many than too few.
00:36:57.520 And how did you come to terms with that emotionally?
00:37:01.860 I mean, look, we have a mutual friend who fires people with great regularity, and I've talked to him, and he scores very high in disagreeableness.
00:37:09.640 And I talked to him about firing, which he's done a lot of, and he was actually quite positive about it.
00:37:15.020 He said, I don't fire anyone who I don't think is causing more trouble than preventing.
00:37:19.520 And so by firing the person that I'm firing, I'm actually doing a very large number of people, including potentially that person a favor.
00:37:28.260 It didn't bother him, but he was temperamentally wired that way, I would say.
00:37:32.580 But I would say, you know, Digital Equipment went bankrupt because they had bad people whom they didn't fire.
00:37:39.640 I've seen many groups fail because they couldn't clean house.
00:37:43.720 Right.
00:37:44.420 And the impact on, you know, the greater good equation is super easy.
00:37:48.200 You want to save 90 people or, you know, lose 100.
00:37:52.640 So, so that's true.
00:37:54.880 The thing that took me a while to realize is that the world needs shaking up all over the place, and individuals do too.
00:38:04.120 Right.
00:38:04.840 A lot of people who are not doing too good, they need a wake up call.
00:38:08.520 You give them a bad review and they kind of shrug.
00:38:10.400 They're like, what are you going to do about it?
00:38:12.080 You know, it's like a spoiled kid.
00:38:13.440 Nothing.
00:38:13.840 Right.
00:38:14.020 But when they actually get fired, they really have to do some soul searching.
00:38:19.380 And then there's the fact that if you're doing something good, there's always a queue of more people outside the door.
00:38:25.080 Now, here's another way to think about it.
00:38:26.220 Take a group of 100 people and rank them from top to bottom.
00:38:30.000 Human beings, by the way, are really good at this.
00:38:32.700 If you have four managers in the group, except for the managers' individual friends,
00:38:36.760 they'll tend to rank the 100 people the same way.
00:38:38.880 I've done this experiment many times.
00:38:41.060 So we're really good at ranking.
00:38:43.100 And there's a little bit of what are you ranking for?
00:38:45.060 You're ranking for creativity, productivity, conscientiousness.
00:38:49.420 But if you set the criteria right, people rank pretty well.
00:38:53.220 If you have 100 people in your group and there's 50 people outside, the distribution of those 50 people is around the average of the team.
00:39:02.860 Right.
00:39:04.460 So there's this idea that you fire the bottom 10% of a team because the random people you hired will be better on average than the bottom 10% of your team.
00:39:14.940 Right.
00:39:15.420 It's just math.
00:39:16.680 Unfriendly statistics.
00:39:18.240 Right.
00:39:18.540 But yes, I get the argument.
00:39:20.000 Right.
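The "it's just math" claim is easy to check with a quick simulation. All the specifics below are illustrative assumptions: performance is modeled as a standard normal draw, and outside candidates come from the same distribution centered on the team average, as described above.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

def replace_bottom_decile(team_size: int = 100) -> tuple[float, float]:
    """Return (mean of the bottom 10%, mean of their random replacements)."""
    # Performance modeled as N(0, 1); an illustrative assumption.
    team = sorted(random.gauss(0, 1) for _ in range(team_size))
    bottom = team[: team_size // 10]
    # Outside hires are drawn from the same distribution as the team average.
    hires = [random.gauss(0, 1) for _ in bottom]
    return statistics.mean(bottom), statistics.mean(hires)

# Averaged over many trials, random hires beat the bottom decile.
results = [replace_bottom_decile() for _ in range(200)]
avg_bottom = statistics.mean(b for b, _ in results)
avg_hires = statistics.mean(h for _, h in results)
assert avg_hires > avg_bottom
```

In this model the bottom decile of a 100-person team averages well below the mean while random hires average at the mean, which is the whole argument; the caveats about gaming, morale, and social glue are about what this model leaves out.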
00:39:20.180 The problem is that every company that does that, first it gets gamed because managers hire bottom 10 percenters.
00:39:27.280 So when they get to fire the 10%, they don't have to fire their friends.
00:39:31.440 Right.
00:39:31.840 And it also really is hard on morale.
00:39:34.500 Like people bond and there may be people in the bottom 10% of your group that are the social glue of the organization.
00:39:41.920 So you may be inadvertently taking out the stuff that makes the team work.
00:39:46.460 Right.
00:39:46.640 Well, that's a measurement error too, right?
00:39:48.300 It means that your criteria for competence aren't broad enough.
00:39:51.340 Yeah.
00:39:52.260 That's tough.
00:39:53.060 Good point.
00:39:53.500 Maybe you're ranking a little wrong, but impact on morale is high.
00:39:56.880 Rory Read was CEO of AMD when I joined, and we had a big layoff, which we had to do because we were running out of money.
00:40:05.800 We were broke.
00:40:06.220 And when we all settled, we landed on just the right amount of people for the money we had.
00:40:13.260 And he basically read us the riot act.
00:40:14.920 He said, guys, teams have to grow.
00:40:19.480 When you cut, you always cut further and then you grow.
00:40:25.580 People aren't happy unless they're growing.
00:40:28.580 Right.
00:40:29.020 It's like when you prune bushes and stuff, you don't prune the bush to where you want the bush.
00:40:33.260 You prune the bush past that point.
00:40:35.040 So it grows out and looks nice, right?
00:40:37.260 Things have to grow.
00:40:38.260 It's really an amazing dynamic.
00:40:42.000 Well, yeah.
00:40:42.300 Well, and it's never clear how much death there is involved in growth.
00:40:46.440 And the pruning analogy is exactly that.
00:40:49.560 And this is harsh stuff, obviously, but you're looking at one collapse or another.
00:40:54.880 It's right.
00:40:55.280 That's the thing.
00:40:56.480 It's not harsh.
00:40:57.320 Because it's beautiful when you prune your bush and it grows back beautifully.
00:41:00.960 It's great when you rebuild an organization that's really strong and powerful because you made the right calls, right?
00:41:07.840 Like this isn't just negative stuff.
00:41:10.800 It's hard stuff to do that creates something really great.
00:41:15.360 Like when I joined AMD in, what was it, 2013 or something, like they had two product lines, you know, Bulldozer and Jaguar, and they're both failing.
00:41:24.140 And I canceled both products.
00:41:30.460 Okay.
00:41:31.000 And so what was the human cost of that, I would say, both to you and also to the people that were involved?
00:41:37.640 Well, I did the math on it.
00:41:39.400 It's like, you know, we needed to be building eight-bedroom houses, and in one case we were trying to add six bedrooms to a two-bedroom house.
00:41:45.560 It was never going to work.
00:41:47.120 Yeah.
00:41:47.360 So you saw that as doomed to failure.
00:41:49.200 And the other one was structurally screwed up.
00:41:51.160 It seemed to be the right ballpark for the performance we should get, but the way it was engineered and built was sort of like, you know, you let the plumber do the architecting and the house looks like shit.
00:42:02.080 And it was difficult.
00:42:03.920 You know, for complicated reasons, technical reasons, there was no path out of where they were.
00:42:08.440 And when I realized I had to cancel them, yeah, it was sleepless nights.
00:42:14.400 Here we are.
00:42:15.020 We had revenue on that.
00:42:16.460 We had people committed to it.
00:42:17.540 People really liked it.
00:42:18.820 When I canceled it, especially on the Jaguar team, a significant number of people quit because they were angry about it.
00:42:25.120 There were some pretty big organizations.
00:42:28.340 There was some management we had to let go.
00:42:30.920 The best architect at the time, well, one of the best architects at AMD, was really a my-way-or-the-highway guy.
00:42:36.720 And he was, you know, he could not communicate what he was doing.
00:42:40.020 So I let him go, which is a strange thing to do.
00:42:44.200 So if you had ranked the organization, he was ranked near the top.
00:42:46.640 And I let one of those guys go because he wasn't effective working with the team.
00:42:50.420 And how did you justify that to yourself?
00:42:53.520 And how did you check yourself against stupidity and ignorance and, you know, self-interest?
00:43:00.900 And how did you know that what you were doing was right?
00:43:04.380 Well, I'm a little lower in conscientiousness than I should be for a senior leader, first of all.
00:43:09.040 So I knew this wasn't going to work.
00:43:11.900 So you're in a space where
00:43:13.460 the direction you're going is not going to work.
00:43:17.160 So you know how mosquitoes work?
00:43:19.240 Mosquitoes are fun.
00:43:20.420 So they detect two things.
00:43:21.780 They detect water vapor and carbon dioxide.
00:43:24.500 So mosquitoes will fly along in a direction as long as the water vapor and carbon dioxide are staying the same or going up.
00:43:32.220 But as soon as it starts to go down, they change direction in a random direction.
00:43:37.040 Right?
00:43:37.320 And within a couple of turns, they're aiming right for a mammal
00:43:40.280 they can bite.
00:43:42.120 It's gotten colorful.
00:43:42.860 So if you know you're going in the wrong direction, a change in direction may be just as bad, but there's some chance it's good, especially if you're somewhat smart and you have some experience.
00:43:55.120 Right?
00:43:55.180 So even a random move is better than no move if the outcome is certain failure.
00:44:00.820 And so that is some justification for taking a risk.
00:44:03.440 Now, there's an infinite number of fail directions, but you're somewhat informed.
00:44:07.900 Right?
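The mosquito strategy described here is essentially run-and-tumble search, the same scheme bacteria use in chemotaxis: keep going while the signal improves, tumble to a random heading when it drops. A minimal sketch under assumed specifics, with a 2-D walker and negative distance to a target standing in for the water vapor and CO2 signal:

```python
import math
import random

random.seed(7)  # reproducible illustration

def run_and_tumble(target=(50.0, 50.0), steps=2000, step_len=1.0) -> float:
    """Run-and-tumble search; returns the final distance to the target."""
    x, y = 0.0, 0.0
    heading = random.uniform(0, 2 * math.pi)
    # Signal = negative distance, so higher means closer (a stand-in for
    # the rising water-vapor/CO2 concentration near a mammal).
    signal = -math.hypot(target[0] - x, target[1] - y)
    for _ in range(steps):
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        new_signal = -math.hypot(target[0] - x, target[1] - y)
        if new_signal < signal:
            # Signal dropped: change direction at random (a "tumble").
            heading = random.uniform(0, 2 * math.pi)
        signal = new_signal
    return math.hypot(target[0] - x, target[1] - y)

start_dist = math.hypot(50.0, 50.0)  # walker starts about 70.7 units away
assert run_and_tumble() < start_dist  # random turns still home in
```

No turn is ever aimed at the target; only headings that happen to improve the signal persist, which is the sense in which a random move beats no move when the current direction guarantees failure.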
00:44:08.020 And then the other problem is when you build a house, it has a foundation.
00:44:11.880 Once the foundation is built, it's very difficult to change the top of the house a lot.
00:44:17.660 Like if you have a foundation for a two-story house, it's hard to make it into an eight-story house.
00:44:22.500 So when we canceled those projects, we consciously reset some design methodologies, some team organizations, some leadership. We said we're going to have best-in-class leadership, design methodology, and architectural tools,
00:44:44.400 and we're just going to take those as givens.
00:44:46.780 Now that the land had been cleared, we had the opportunity to do that.
00:44:52.500 And it was interesting in the design teams, it turns out there was some very good pieces, you know, in the two processors they had, but they weren't working together organically like they should.
00:45:03.880 And let's say the framework of the design wasn't big enough.
00:45:07.460 And then the tools over the years have evolved into lots of little local improvements, but it wasn't really the right tool set.
00:45:15.740 Now, AMD leapt forward when you did this, and they were the only competitor to Intel in a realistic sense.
00:45:21.640 And so this, these actions on your part were part of what made that company thrive and kept competition within the microprocessor world.
00:45:29.960 So these didn't have, these decisions didn't have trivial outcomes.
00:45:33.200 Oh, no, it had really great outcomes.
00:45:35.700 And there was, the really cool thing was, you know, when we did that, we didn't really bring in outsiders.
00:45:41.380 Like that Zen design was entirely based on people who worked at AMD at the time, right?
00:45:47.760 So what we needed was to clear the plate a little bit, to reestablish some, you know, first principles about how we were doing things to have a better goal.
00:45:57.700 There was a little bit of head knocking on getting like the methodologies straightened out.
00:46:03.340 You know, I was, let's say, fairly disagreeable about how we were going to get through that because people kept saying, oh, it's too hard to do this.
00:46:10.740 Well, is it any good?
00:46:11.740 No.
00:46:12.980 Well, if it's no good, it doesn't matter how hard it is.
00:46:16.780 You have to do it, right?
00:46:18.620 If you're going to drown, you don't go, well, a mile is too far to swim.
00:46:22.880 So I'm just not going to swim.
00:46:23.940 You're going to drown, right?
00:46:25.200 If you're a mile offshore, you're drowning, swimming a mile is the requirement, right?
00:46:31.660 And I've explained that like a million different ways.
00:46:34.300 When something is pretty good, you know, the world's, you know, divided into three things.
00:46:38.120 Things are good, so you're happy.
00:46:39.960 When things are bad, you fix it.
00:46:42.300 And then there's the middle ground where it's not great, but it's not killing you.
00:46:46.740 Those are the ones where human beings have a really hard time improving, right?
00:46:51.160 So by canceling the projects and declaring everything bad, everything could be improved.
00:46:58.720 And people would bug me about it.
00:47:01.460 It was like, Jim, this wasn't that bad.
00:47:03.400 Well, was it great?
00:47:04.200 Was it going to win?
00:47:04.940 No.
00:47:05.240 All right, it's bad.
00:47:05.940 I defined everything that's not great as bad.
00:47:08.280 So I moved, you know, a little on the continuum.
00:47:11.680 Everything 5% or more away from great is bad, period.
00:47:14.980 And how did you, like, how did you come to decide that was a good criteria?
00:47:21.060 What's that?
00:47:22.240 Why did you decide that was a good criteria?
00:47:25.240 Well, at the time we were competing with Intel, and on a whole bunch of metrics, like, literally,
00:47:30.900 their CPU had twice the frequency, twice the performance per clock.
00:47:35.480 On a whole bunch of metrics, theirs were just that good.
00:47:37.840 So we plotted them all.
00:47:39.020 Like, you know, I had a whole bunch of data on this stuff, but also as a mindset, like
00:47:45.640 if it wasn't best in class, like a computer has a whole bunch of things in it, there's
00:47:49.520 something called a branch predictor, which predicts which way branches are going.
00:47:53.300 Theirs was way better than ours.
00:47:54.600 We can measure it.
00:47:55.900 So we did.
00:47:57.320 There's a thing called, you know, a memory system.
00:47:59.340 Their memory system was way better.
00:48:01.000 It was twice as fast.
00:48:03.020 So it was like, well, we need to be within 10% of them on everything or we're going to get
00:48:07.100 our asses kicked.
00:48:07.920 And it would be really nice if we were better on some things.
00:48:12.240 So we measured all their stuff.
00:48:15.080 Like, if you're going to compete, you know, like in basketball, you don't just,
00:48:18.980 you know, play against yourself.
00:48:20.700 You know, you try to see what the other teams are doing, what they're good at, what
00:48:24.060 they're bad at. You know, every coach is great at doing analysis of all the competition.
00:48:30.080 And then, you know, you win two ways: you meet the competition at what they're good
00:48:35.440 at, and then you have some secret plays that you're better at and that surprise them.
00:49:40.520 So you went from AMD to Tesla.
00:49:46.680 So, and we talked a little bit about Elon Musk. I want to ask you about him and your experiences
00:49:52.020 at Tesla, but also and what you did there.
00:49:56.340 Sure.
00:49:57.020 So let's talk about Elon Musk to begin with.
00:49:59.260 Well, he's a pretty public guy. So I don't know that I can add that much.
00:50:05.820 Well, kind of what you said about Steve Jobs. Was it true?
00:50:08.940 Yeah, it's mostly true.
00:50:11.020 Like Elon's the real deal.
00:50:13.360 He's a really good engineer.
00:50:15.240 He has this belief that he can learn deeply about anything fast.
00:50:19.280 And he's practiced it and does it. I've seen him do it.
00:50:22.320 Like he gets into lots of details.
00:50:24.740 He has done five impossible things.
00:50:27.180 Yeah, yeah. It's pretty, it's pretty respectable.
00:50:30.480 And he likes the details and he has a good eye for it.
00:50:33.020 Usually, before Tesla, when you did a technical presentation, you had some kind of methodology for presenting the problem to an executive.
00:50:43.600 What's the problem?
00:50:44.860 What's, you know, the background of that?
00:50:47.300 Elon is solution first.
00:50:49.620 Like, if I don't like your solution, I don't care about the stupid problem and your data.
00:50:54.040 You know, just forget it.
00:50:54.880 No.
00:50:55.800 What's the solution?
00:50:57.040 Is the solution great?
00:50:58.380 Great.
00:50:59.300 Then everything else is backup.
00:51:01.100 You know, what's the data behind that thing?
00:51:03.440 Oh, we've got the data.
00:51:04.800 What was the problem?
00:51:05.500 Like, how did you figure this out?
00:51:06.720 Oh, here's the original problem we found.
00:51:11.380 Like, you know, his order is the reverse of most people's.
00:51:12.620 Most people tell you a story:
00:51:13.980 here's the problem.
00:51:13.980 You know, here's how we figured it out.
00:51:15.440 Here's the background data.
00:51:16.720 Here's how we built the solution.
00:51:17.840 Here's the solution.
00:51:19.200 Here's the next step.
00:51:20.060 Like, that's a typical technical presentation.
00:51:22.520 It's not a bad idea.
00:51:24.080 Elon hated this.
00:51:25.500 Like, stop with the bullshit.
00:51:27.220 What's the goddamn solution?
00:51:30.140 Can you give me an example of that?
00:51:35.500 We had a problem with low-resolution camera images, right?
00:51:40.240 And we were trying to improve, like, how the computer perceived roads in, like, low-light conditions.
00:51:44.860 So, we started with, well, here's the resolution of the camera.
00:51:50.020 Here's the light sensitivity.
00:51:52.060 Here's the images we're getting.
00:51:53.740 And he was like, what the?
00:51:55.680 Do you have a solution or not?
00:51:57.660 Well, yeah, we do.
00:51:58.680 We have the software that does this.
00:52:00.500 Well, page 12.
00:52:02.280 Why is it on page 12?
00:52:04.280 Put it on page one, in a really good place.
00:52:06.500 Here's the old image.
00:52:07.600 Here's the new image.
00:52:08.580 Here's how we did it.
00:52:11.500 Right?
00:52:12.000 That's what he wanted.
00:52:13.240 Now, partly, he's high bandwidth.
00:52:14.160 So what I'm wondering is, what was effective about that?
00:52:18.880 Was it that there was an ethos then that the most important thing that you had to bring forward wasn't a problem, but it was a solution?
00:52:25.700 And so, that people were striving constantly to generate and communicate solutions, which seems like a good strategy.
00:52:32.060 Well, no, I think he's worried.
00:52:34.160 So, engineers see the problem, and they start investigating it.
00:52:37.120 And then, as you're investigating, you develop understanding, and you're understanding the problem is good.
00:52:43.740 Like, you would think that, right?
00:52:46.480 You think that.
00:52:47.420 Lots of people think that.
00:52:48.280 Elon doesn't think that.
00:52:50.140 Elon wants a solution, right?
00:52:52.040 And if you're falling in love with your understanding, and you're falling in love with your little details.
00:52:57.200 Well, that also feels like work.
00:52:58.880 You know, you mentioned earlier that just because it's hard doesn't mean it's useful.
00:53:02.340 Right.
00:53:02.860 And focusing on a solution.
00:53:04.360 You want to know that you are focused like crazy on the solution.
00:53:08.020 Mm-hmm.
00:53:08.580 The best way to do that.
00:53:09.680 So, then you have two states, right?
00:53:11.480 If there's a problem, there's two states.
00:53:13.620 You have a solution, and you don't have a solution.
00:53:15.840 If you have a solution that's on page one, we're all happy.
00:53:20.700 Yeah, well, I've seen in the-
00:53:22.020 If you don't have a solution, why the hell are you talking to me?
00:53:24.620 Why aren't you finding a damn solution?
00:53:26.340 Yeah, yeah.
00:53:27.440 Well, I've seen in the software projects, other projects that I've been involved in, too, that focusing on a solution, I think this is along the same lines as what you're discussing, is, well, then you get to product a lot faster.
00:53:39.660 It's like, this thing has to exist and work.
00:53:42.220 Maybe it won't solve everything, but the problem with the problem is that you can indefinitely investigate it and expand it, and also that investigating feels like work, but it's not saleable.
00:53:53.380 Yeah.
00:53:54.040 Yeah, it puts in the forefront of your mind, you know, that constant, you know, you need to be creative in pursuing the problem, but also make sure you're really on track of the solution, and you're just not falling in love with the problems.
00:54:06.280 People call it admiring the problem.
00:54:07.760 Some engineers are great at admiring problems.
00:54:09.540 Like, I've worked with lots of people that come in with a 12-page presentation, and when I'm done, I'm like, did you guys just give me a whole bunch more data on the problem?
00:54:18.020 And they're like, yeah, we're really getting to the bottom of it.
00:54:19.920 It's like, no, you're not.
00:54:20.700 You're not getting anywhere.
00:54:22.300 Well, the bottom of a problem is a solution, because why would you just investigate the problem, right?
00:54:29.340 I mean, your destination point is a solution.
00:54:31.080 You know, they just keep getting deeper and deeper into problem admiration, and nothing happens.
00:54:38.120 Okay.
00:54:38.700 I'd say three-quarters of engineers would be perfectly happy to do that their whole life.
00:54:43.220 Because, so you explained this to me, complex mastery behavior, right?
00:54:47.520 So humans are very, you know, we like to learn, right?
00:54:51.700 We don't like to do dumb, repetitive things, right?
00:54:55.520 But we like to do things that are complicated, that we've mastered, that take skill and, you know, insight.
00:55:03.240 But you can have complex mastery behavior just analyzing problems.
00:55:08.200 Right, definitely.
00:55:09.220 Right, and there's careers for that.
00:55:10.300 Some people find they're really good at it.
00:55:11.720 They're analysts.
00:55:12.860 They generate lots of data.
00:55:13.980 But if you're in charge of solving problems, you know, that period needs to be focused, short, concise, and you need to move on to solutions, right?
00:55:24.680 I think that's probably why I'm not so temperamentally fond of activists.
00:55:29.760 Are they problem admirers?
00:55:31.860 Well, that's what it looks like to me.
00:55:33.320 It's like, well, that's a, and this is one of the reasons I admire Elon Musk.
00:55:36.520 It's like, well, you're concerned about the environment?
00:55:39.280 Well, why don't you build an electric car then?
00:55:42.260 Right.
00:55:42.560 Well, so a large number of people, and this is my favorite thing I learned, and it was working with mechanical engineers at Tesla, because they think the world's made out of silly putty, right?
00:55:53.760 They used to design, when we were building Model 3, they would design the part, and then they would joke about how they're going to make it.
00:55:58.920 Are they going to, you know, CNC it, like mill it?
00:56:01.480 Are they going to injection mold it, 3D print it, stamp it, make it with a hammer, you know, cut it out with scissors, you know, carve it out of a block?
00:56:09.540 Like, they had this cool machine that could carve 3D models out of clay.
00:56:14.540 Like, it was funny.
00:56:16.200 Like, so they could design things in their heads and on computers, and then go build any damn thing they want.
00:56:22.380 Like, if you ever look at a complicated mechanical assembly, there would be some screwed aluminum thing that would be milled somewhere and then drilled, and there's screws going through it.
00:56:32.400 And there'd be some little tab sticking off of it that holds another thing.
00:56:36.800 Like, they can make stuff, right?
00:56:38.820 They think they can make anything, right?
00:56:42.400 And there's a whole bunch of people in the world that don't think they can make anything.
00:56:44.960 They don't, they think the world is what it is.
00:56:49.000 I had a friend, he had a rattle on his dashboard, and he didn't know what to do about it.
00:56:55.540 And I was asking him where the rattle was, and, you know, and I was thinking that, I was talking to him about, like, how the dashboard's made.
00:57:02.880 And he goes, oh, I get it.
00:57:05.940 You think the dashboard's made of a whole bunch of parts that are put together some way.
00:57:10.540 I thought the dashboard was just the dashboard.
00:57:12.620 Like, he could, he didn't conceptualize it as there's this outer piece, and there's inner brackets, and there's radio, and there's just these things.
00:57:20.740 Whereas mentally, I can't help but see the whole thing in 3D, and then I'm wondering which piece is loose and where it is, right?
00:57:29.040 And then how to fix it.
00:57:30.540 Like, I'm not mechanical engineering creative, but I'm visual.
00:57:35.740 Like, and for people who get stuck on activism as problem description, they don't think the world can change.
00:57:45.200 Which, at some level, makes sense.
00:57:47.080 You know, for human evolution, like, you know, it was pretty much the same for a million years.
00:57:52.980 Like, it's weird how good we are at change.
00:57:54.700 And my best theory on it is from 0 to 20.
00:58:00.580 Like, your brain is going through radical change because you're going from, you know, silly putty and not knowing much to being pretty smart.
00:58:07.300 So you have to change and adapt really fast.
00:58:10.040 And then humans are adapted to deal with each other, and humans are fairly crafty.
00:58:15.880 You know, you have to deal with that.
00:58:17.300 But, you know, the lifestyle of most people from, you know, say it's 30 to death, it's fairly static.
00:58:26.500 And, you know, so we have this funny capacity for learning rapidly, exponentially, and then dealing with slowly changing environments.
00:58:35.540 But we're not naturally adapted to rapid change.
00:58:38.880 And the modern world is, well, especially in engineering, is rapid change.
00:58:43.580 It was just so funny.
00:58:49.720 Like, you never knew what they were up to.
00:58:51.620 Like, one day, like, they were working on the interior for the car, and they made this crazy-looking model, which kind of looked like a car, but it turned out it was a thing you could move around and have the attachment points for all the interior parts.
00:59:03.580 So you could basically, it looked like a weird skeleton, but it had the attachment points that you could adjust, and you could build all the interior parts and put a Tesla interior together right in the middle of the, where the engineering desks were.
00:59:17.420 It was really cool.
00:59:19.720 And let them go build it and think about it.
00:59:22.340 And then, because in the CAD model, in the computer, you could see it, but it didn't always work out in real life.
00:59:27.900 Like, we have a scale problem.
00:59:28.960 When you look at something that's small, even if you scale it up perfectly when it's big, sometimes that's just what you thought, and sometimes it doesn't work.
00:59:37.140 And so, you know, you want to do, you know, computers like to change things fast, but like real scale models that you sit in and live in and, you know, get a human experience for it.
00:59:47.100 And it was really fun for that to just show up and be like, holy cats.
00:59:50.120 I did a similar thing: we took all the electrical subsystems and motors and laid them out on two big tables, covered with all the electrical parts of a Model 3.
01:00:00.140 We stared at it, and once you see them all together, it's crystal clear, that could be a lot better.
01:00:07.340 Because, you know, there's three motors that look almost the same, why isn't that one motor?
01:00:11.520 There's these two parts that are completely separate assemblies, but if you build it together, you could have one thing do both things really much more naturally.
01:00:19.380 Right, and by laying that all out in front of you, you didn't have to do the mental work of representing that.
01:00:25.520 You could do the mental work of seeing how all the parts interconnected and what might be.
01:00:30.620 Yeah.
01:00:31.560 Yeah, Doug Clark, an architect I worked with when I was a kid at Digital, called it the interocular traumatic test.
01:00:37.180 When you look at it, does it bug you?
01:00:41.440 And a lot of things, when you really lay them out like that, you go, oh, we're not doing this right.
01:00:49.080 And Elon likes that kind of stuff.
01:00:52.040 Like, you know, you're almost afraid to show them.
01:00:54.520 Like, when you laid it all out, you looked at it and go, this is crazy.
01:00:57.600 So, it was like, do we show Elon or not?
01:00:59.500 Because he'll look at it and think, this is crazy.
01:01:01.620 Like, we built this car, we did this.
01:01:02.920 Okay, so I wanted to talk to you, too.
01:01:07.260 I want to talk to you like I'm someone very stupid.
01:01:11.040 And in this particular regard, I am.
01:01:13.700 I really don't understand how computation works.
01:01:17.700 And you're a microprocessor architect and you build computers.
01:01:20.720 And I listened to a discussion you had earlier this month, and there was a lot of it I couldn't follow.
01:01:28.340 I thought it might be helpful and interesting for you just to walk through for me and for my audience how a computer actually works, what it does, and how you build it, and then what it would be like to design and to architect a microprocessor.
01:01:45.620 Yeah, well, it's somewhat hard to describe, but there's a couple simple things.
01:01:52.800 So, let's start with the easiest thing.
01:01:54.640 The computers have three components, really.
01:01:57.860 Memory, programs, and input and output.
01:02:03.320 Right, those are the three basic things we always build.
01:02:06.880 Right, so memory is like the DRAM or the disk drive, place where you store data.
01:02:12.120 And it's just stored.
01:02:13.300 And it can have different representations.
01:02:15.580 We currently use ones and zeros.
01:02:18.100 So, you can take any bit of information and describe it as a sequence of ones and zeros.
01:02:25.940 And it's stored in silicon in either static and dynamic memories or on disk drives, which, you know, there's a couple technologies for that.
01:02:35.020 So, does memory make sense to you?
01:02:36.400 A place to store information.
01:02:37.680 Well, it does, although I have some difficulty in understanding exactly how the transformation is undertaken to represent things in zeros and ones.
01:02:46.420 I mean, I...
01:02:47.080 I'll give you a simple example.
01:02:49.540 So, if you shine a light on a photo, a photo detector, right?
01:02:55.760 So, the light comes in and it's a stream of photons, right?
01:03:00.840 And the photo detector counts the photons, literally.
01:03:05.760 So, every photon that hits it or a couple photons hit it, they cause some electric charge to move.
01:03:11.080 And that causes the circuit to wake up and say, I saw some photons.
01:03:14.280 So, say you're trying to evaluate how strong that light is.
01:03:20.040 So, it could be anywhere from nothing to super intense, right?
01:03:24.340 And then you might say, well, let's put that in a range of numbers from zero to a thousand, right?
01:03:31.620 So, then you start shining the light on it.
01:03:34.440 And so, you count the photons for, say, you know, a microsecond.
01:03:37.220 And then you translate how many photons you counted into the number, right?
01:03:43.060 So, and so, just imagine as the light varies up and down, the count, the number coming out of your photo detector is varying between zero and a thousand, right?
01:03:54.060 And we use base 10, but you can translate that to binary, which is base two.
01:04:00.400 And now you have ones and zeros.
01:04:01.420 So, you've basically now translated a light ray, optical information, into a count.
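The digitization Jim describes can be sketched in a few lines of Python. This is a toy model, not his actual hardware: the 0-to-1000 scale comes from the transcript, but the full-scale intensity and the 10-bit width are illustrative assumptions.

```python
# Toy model of digitization: sample an analog intensity, quantize it to
# a count between 0 and 1000, then write that count in base 2 (ones and
# zeros), as described in the conversation.

def quantize(intensity, full_scale=1.0, steps=1000):
    """Map an analog intensity in [0, full_scale] to an integer count."""
    count = round(intensity / full_scale * steps)
    return max(0, min(steps, count))  # clamp to the representable range

def to_binary(count, bits=10):
    """Represent the count as a string of ones and zeros (base 2)."""
    return format(count, f"0{bits}b")

count = quantize(0.5)           # half-intensity light
print(count, to_binary(count))  # 500 0111110100
```

As the light varies up and down, the count varies between 0 and 1000, and the binary string is what actually gets stored.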
01:04:10.780 And so, virtually everything can be represented by a count, apparently.
01:04:14.960 So, just think of your computer.
01:04:16.660 It has a camera, which is essentially doing just that.
01:04:19.860 Well, and you get different colors, but first you go through color filters.
01:04:22.760 So, you have a green filter and a red filter and a blue filter.
01:04:26.060 That's enough to represent the color spectrum.
01:04:28.040 And then you have a little photon counter underneath that.
01:04:32.320 And it counts for a little while.
01:04:33.860 And then you ship out the number and you reset it and count again.
01:04:36.880 So, if light varies, you're getting that.
01:04:38.940 There's a pretty big grid, like a modern camera has 12 million, you know, photo detectors in it.
01:04:45.280 Actually, 36 million because it's got different colors.
01:04:48.540 But, well, it depends on how they build it.
01:04:50.900 Like, they might have different pixels be different colors and then interpolate the colors.
01:04:55.520 Like, a keyboard is really simple.
01:04:57.100 So, the fundamental issue to begin with is that you reduce everything to a count and then you represent that count in base 2.
01:05:04.720 It's base 2, right?
01:05:06.060 Zero and 1.
01:05:07.340 And you can do the same thing with sound.
01:05:09.000 You can have a sound detector that basically counts, you know, intensity of sound waves.
01:05:12.640 And the keyboard, well, you have a little grid of keys.
01:05:15.560 And when you push a key, it sends which key was that.
01:05:19.080 That was key number 27.
01:05:20.800 So, now you can call it.
01:05:21.940 And it's either on or off.
01:05:23.140 On or off, you know.
01:05:24.260 So, D might be count number 26 and F is 27 and G is 28.
01:05:30.120 So, everything gets encoded into a number.
01:05:34.520 And then computers like binary numbers, you know, for technical reasons.
01:05:38.820 But that's not a big deal.
01:05:40.220 So, memory.
01:05:41.660 So, input and output is the first thing.
01:05:44.460 So, the computer is built around the memory.
01:05:46.940 So, the input and output system.
01:05:48.780 So, the input is all your photo detectors, sound detectors, keyboard detectors.
01:05:52.820 And there's amazing numbers of sensors these days.
01:05:56.200 You can detect gravitational waves, maybe.
01:05:59.140 You can detect, you know, photons.
01:06:01.120 You can detect electrical waves, sound waves.
01:06:03.600 You can detect temperature.
01:06:05.500 You know, it's very varied.
01:06:07.320 You turn all that stuff into a number.
01:06:09.020 And then your input device writes that into memory.
01:06:12.100 And memory stores information.
01:06:14.500 You know, memory used to be small and expensive.
01:06:16.360 And now it's big and cheap.
01:06:17.440 But it's still just memory.
01:06:20.780 And nothing happens to it in the memory.
01:06:22.520 It just gets stored.
01:06:24.040 So, we look inside a computer.
01:06:25.260 It's a big memory.
01:06:25.900 And it's full of numbers.
01:06:27.640 Now, the person who's writing the program, you know, is telling, like, the input and output.
01:06:34.520 Like, here comes the input video stream.
01:06:37.000 Put the video stream at address 1 million.
01:06:40.500 So, all the memory is addressed.
01:06:42.920 And the address typically starts at zero.
01:06:44.760 And the modern computer goes up to billions.
01:06:48.120 Okay.
01:06:48.320 So, walk through that again, the addressing.
01:06:50.140 So, what's exactly the function of that?
01:06:52.800 Well, you want to know where the memory is.
01:06:55.360 Okay.
01:06:55.740 Right.
01:06:56.020 You need to.
01:06:56.520 Okay.
01:06:56.780 Fine.
01:06:57.240 Yeah.
01:06:57.500 So, basically, your phone probably, I don't know, has 8 or 16 gigabytes of memory in it.
01:07:02.900 Maybe, yeah, 4 or 8.
01:07:04.680 I don't know.
01:07:05.560 So, billion.
01:07:06.540 You know, 8 billion bytes of information in there.
01:07:11.260 So, and when you're designing your programs, you kind of lay out, well, here's the operating system that's going.
01:07:17.720 Here's what the input and output buffers are.
01:07:19.820 Here's memory we're going to use to run some program.
01:07:23.260 So, and all that's addressed.
01:07:24.840 So, you can think of the addresses.
01:07:27.420 It's just like a post office, right?
01:07:28.640 So, you know, every house has a postal address.
01:07:33.600 You know, it's a street address and then your house address.
01:07:36.020 And so, you can find every person.
01:07:38.240 And the address corresponds, in some sense, to the physical location of the data.
01:07:44.000 And how are the zero and ones represented in the memory?
01:07:47.760 It's literally a voltage that's either high or low.
01:07:50.740 And zero is using ground, which is zero volts.
01:07:53.240 It's in modern, you know, DRAM cells probably stored at 1.1 volts.
01:07:59.340 And, you know, in a DRAM cell, it's a capacitor that's holding electrons.
01:08:03.560 So, basically, when you store the cell and either you drain all the electrons out, so it's zero volts, or you put a bunch of electrons in so that it holds a one volt.
01:08:14.060 So, it's literally a number of electrons in there.
01:08:16.520 There's a couple of ways to make memory cells.
01:08:18.180 There's another way, which is called a bistable element, where you have what's called cross-couple inverters.
01:08:23.020 But that's too complicated to explain.
01:08:26.480 And then, the memories are usually built in arrays.
01:08:28.780 So, there's an X, Y.
01:08:30.620 So, you take the number, and you say, I'll take the bottom half of the number to figure out which row it's in, and the top half of the number to figure out which column it's in.
01:08:38.100 And where the column and the row overlap, then I'll write my new data of a one and zero in that spot.
01:08:44.900 It's literally that simple.
01:08:46.260 So, if you look at a memory chip, you'll see this array of bits with little blocks on two edges.
01:09:01.820 Usually, you know, one side's the row, one side's the column, and then the bottom is what they call the sense amps, and you read it back out again.
01:09:01.820 So, a memory process is you activate the row and column to a spot, which gives you the address of that bit.
01:09:08.220 And then, you drive the bit in and charge up or discharge that cell, and then it holds them.
01:09:16.500 And it's super simple.
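The row/column addressing Jim just walked through can be sketched as a toy model: split an address into row bits and column bits, and store one bit where they overlap. The 16x16 grid size is an illustrative assumption.

```python
# Toy model of array-style memory addressing: the bottom half of the
# address picks the row, the top half picks the column, and the bit is
# stored where the row and column overlap.

ROWS, COLS = 16, 16
grid = [[0] * COLS for _ in range(ROWS)]  # 256 one-bit cells, all empty

def split(addr):
    """Bottom half of the address is the row, top half is the column."""
    return addr % ROWS, addr // ROWS

def write_bit(addr, bit):
    row, col = split(addr)
    grid[row][col] = bit      # charge or discharge that cell

def read_bit(addr):
    row, col = split(addr)
    return grid[row][col]     # sense the stored value

write_bit(200, 1)
print(read_bit(200))  # 1
```

The real chip does this with decoders driving row and column wires, but the address arithmetic is exactly this simple.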
01:09:18.220 Okay.
01:09:18.420 You can build a memory with a pegboard.
01:09:20.800 You could build a memory.
01:09:21.900 I mean, people literally did.
01:09:22.860 Way back when, there was something called core memory, where they had essentially the XY grid, and at each little place, there was a little magnetic bead; when you put the current through in one direction, you could make it be north-to-south, and in the opposite direction, south-to-north.
01:09:42.300 So, you basically remagnetize the little beads.
01:09:45.240 So, there's lots of ways to make memory, but currently, the really dense memories are called dynamic memories, where you literally put charge in there.
01:09:53.660 And then, there's fun stuff that happens, like flash cells, the cells got so small that the electrons, you know, from the quantum effects, tunnel out occasionally.
01:10:02.500 So, you put 25...
01:10:03.040 Right, because there's some doubt about where the electron actually is.
01:10:05.960 Yes.
01:10:06.320 And sometimes, it could literally jump out of the cell once it jumps out and doesn't come back.
01:10:10.280 So, they got down to, like, 25 electrons in a cell, and they would wander off over a couple hours, and you have to refresh them.
01:10:19.740 So, periodically, you go back and you read the data before too much of it's escaped, and you write it back in.
01:10:26.260 So, it's called refreshing the memory.
01:10:29.220 But DRAMs hold more charge than that, and then the flash guys figured out how to stack the cells.
01:10:33.080 So, modern flash chips, there's an XY grid, but there's also a Z dimension.
01:10:38.680 They're, like, 256 layers thick now.
01:10:41.540 So, it's, like, a three-dimensional memory.
01:10:44.080 But the simple thing still is, it's a linear range of addresses where you put some data.
01:10:50.800 Okay, so that's memory.
01:10:51.840 So, the next component?
01:10:53.520 Is programs.
01:10:55.160 This is the compute part.
01:10:57.460 So, a simple program is A equals B plus C.
01:11:03.080 Right.
01:11:04.060 So, the data at address A, so, when you write the program, you tend to use what they call variable names, A, B, and C.
01:11:12.420 But there's a tool called compiler, which will assign A to say address 100, and B to address 101, and C to address 102.
01:11:21.840 Right.
01:11:22.120 And then the computer, when it's running, says, do what I told you to do.
01:11:28.820 So, you see this program, A equals B plus C.
01:11:32.720 So, you get B.
01:11:35.560 You get C.
01:11:37.340 You add them together.
01:11:38.740 You put it in A.
01:11:40.840 And typically, what happens is you have what's called a local memory or a register file.
01:11:46.640 So, you get the data from memory into the register file.
01:11:49.260 You do whatever operation you're told to do, like add.
01:11:53.320 And then you put the result, A, back in memory.
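The "A = B + C" sequence just described can be sketched step by step: memory addressed by number, a small register file, and load / add / store operations. The addresses 100-102 follow the transcript; the data values and four-register file are illustrative assumptions.

```python
# A minimal sketch of "A = B + C" at the machine level: fetch the
# operands from memory into registers, add them in the processor, and
# store the result back at A's address.

memory = {100: 0, 101: 7, 102: 5}   # A at 100, B at 101, C at 102
regs = [0] * 4                       # a tiny register file

regs[0] = memory[101]        # LOAD  r0, [101] ; fetch B
regs[1] = memory[102]        # LOAD  r1, [102] ; fetch C
regs[2] = regs[0] + regs[1]  # ADD   r2, r0, r1 ; the actual operation
memory[100] = regs[2]        # STORE [100], r2 ; put the result in A

print(memory[100])  # 12
```

The compiler is what turns the variable names A, B, and C into those numeric addresses; the processor only ever sees loads, adds, and stores.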
01:11:54.680 And what are the range of operations, or is that too broad a question?
01:11:59.460 The fundamental operations, apparently, are arithmetic?
01:12:03.040 I've done this.
01:12:04.000 Like, the number of operations that a computer does, like instruction sets can have 100 or 500 or 1,000 different instructions.
01:12:11.940 But the most common ones are load data from memory to the processor, or the program ones.
01:12:18.860 Store memory back.
01:12:20.120 Those are your first two instructions.
01:12:22.700 And then add, subtract, multiply, divide, you know, clear.
01:12:28.900 You know, very simple.
01:12:30.920 It's written.
01:12:34.000 You know, then there's what's called logical operators, and or not.
01:12:37.320 Well, it's stunning to me, conceptually, thinking through this, that computers, which can produce whole worlds, in some sense, can do that as a consequence of zeros and ones and arithmetic operators.
01:12:50.780 Sure.
01:12:51.440 Well, your brain is doing something interesting like that.
01:12:56.160 There's no magic to it.
01:12:57.140 So, the key to programs is abstraction layers.
01:13:01.020 Right?
01:13:01.160 So, at some low level, you know, like I understand computers from atoms up to operating systems, which is, you know, fairly broad range, but there's lots of people who can do that.
01:13:11.900 Yeah, and I understand them like the surface of the keyboard.
01:13:15.640 Yes.
01:13:16.660 Yes, monkey with military helicopter, basically.
01:13:20.400 Pete Bannon had an interview question.
01:13:22.520 So, you know, computer scientists would say, tell me what happens when I move, when I type a key.
01:13:29.940 Right?
01:13:31.160 Because you can, you can talk all day.
01:13:32.820 I can talk all day about it.
01:13:35.140 You know, because the key is at the position, which encoded the number, which got sent into the memory.
01:13:39.980 There's an interrupt delivered to the processor to say, there's new data in memory.
01:13:43.760 Go take a look at it.
01:13:45.300 But you can describe that at many, many levels.
01:13:48.980 Right?
01:13:49.880 So, it's a good place to start.
01:13:51.560 It's not bad.
01:13:53.120 As an interview question.
01:13:54.760 As an interview question.
01:13:55.620 Yeah, right.
01:13:56.280 Some people, by the way, are stumped.
01:13:57.740 They go to college and they can't tell you what happens.
01:13:59.400 You know, when a key is clicked, which is weird.
01:14:03.840 So, so back to the computer.
01:14:06.520 So, yeah, it's the basic operations.
01:14:08.520 That's add, subtract, multiply, divide, you know, clear, set to one, and, or, not, XOR.
01:14:16.000 Where, you know, you can take a number and you can shift it around, you can mask it.
01:14:21.520 And so, different architectures.
01:14:22.900 How are those operators discovered, Jim?
01:14:25.480 I mean, I know that there are the arithmetic operators.
01:14:27.840 And then, is that just the question of how arithmetic was discovered?
01:14:31.280 But, I mean, there's a logic.
01:14:33.560 Way after, way after math.
01:14:36.620 So, computers, you know, at some level, they're doing arithmetic.
01:14:42.280 Like, it's not very sophisticated.
01:14:43.880 And I'll get to a little more complicated version of this.
01:14:46.720 And by the time we invented computers, people had a pretty good idea of number theory.
01:14:51.040 They'd figured out that base 10 was just one of the bases.
01:14:53.620 You could have 2, 3, 4, 5, 6, 7, 8.
01:14:56.440 People had, the philosophers had worked out what logic is.
01:14:59.640 You know, if this is true and this is true, then this is true, this is true, or this is true.
01:15:05.740 Like, the logical operators are real.
01:15:08.640 Like, there's a whole bunch.
01:15:10.200 There was a whole bunch of math.
01:15:11.040 Yeah, it's the real, it's the realism of them that's stunning to me.
01:15:15.340 Yeah.
01:15:16.360 So, there's the basic operator set.
01:15:19.980 And then, there's something called control flow.
01:15:22.720 So, computers, typically, they put a program like add, you know, A equal B plus C.
01:15:27.720 You know, D equals E plus F.
01:15:31.480 F equals E plus A, you know.
01:15:35.080 And you typically put that in what's called program memory, but it's just part of the memory of the computer.
01:15:40.840 And you have a program counter, which, you know, I don't know why it's called a counter,
01:15:45.260 but it's the thing that points at the next instruction to execute, and its default is,
01:15:51.320 do this instruction and then do the one right after it.
01:15:53.960 That became, like, the way computers are built.
01:15:57.020 That's an arbitrary choice, by the way.
01:15:58.520 You can have every instruction tell you where to get the next instruction.
01:16:01.740 There's a bunch of things you can do.
01:16:03.440 But for simplicity, people said, this piece of memory has programs in it.
01:16:08.380 Start at the first instruction and then do the next one and the next one.
01:16:12.160 That's the program counter.
01:16:14.600 But that's not good enough because then you would just start at the first one and you go to the end of memory and be done.
01:16:19.000 So there's something called control flow.
01:16:26.540 So imagine you wanted to add up a list of 10 numbers.
01:16:30.120 So your first instruction says, start at the first number.
01:16:34.020 Then you say that the sum equals the current sum plus the next number.
01:16:41.000 Increment the counter of how many numbers I've added by one, and then test.
01:16:48.900 Is the counter equal to 10?
01:16:51.960 If yes, keep going straight.
01:16:54.400 If no, go back to get the next number.
01:16:58.140 Right?
01:16:58.320 So you created a little loop.
01:17:00.920 Right?
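The little loop Jim just walked through maps directly onto code: a running sum, a counter, a test against 10, and a branch back if the test fails. The particular list of ten numbers is an illustrative assumption.

```python
# A direct sketch of the control-flow loop from the conversation:
# sum = current sum + next number, increment the counter, test against
# 10, and branch back to get the next number if not done.

numbers = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]

total = 0
counter = 0
while True:
    total = total + numbers[counter]  # sum = current sum + next number
    counter += 1                      # increment the count by one
    if counter == 10:                 # test: is the counter equal to 10?
        break                         # yes: keep going straight
    # no: go back and get the next number

print(total)  # 39
```

A `while` or `for` statement is just this pattern with the test and the backward branch written for you.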
01:17:01.460 So, and it turns out the computer scientists invented a whole bunch of loop constructs, what they call control flow constructs.
01:17:10.680 Do this while X is true.
01:17:13.160 Do this until the counter gets to a number.
01:17:16.940 Right?
01:17:17.440 So there's, so you can create little, you know, basically subprograms in the program.
01:17:23.520 Right?
01:17:24.000 And then there's a couple, you can, you can test like, hey, I need to decide if this is a dog or a cat.
01:17:31.280 So if it's a one, go look at here.
01:17:33.680 If it's a zero, go look at that.
01:17:36.100 Right?
01:17:36.560 So that's a conditional branch with loop branches.
01:17:39.920 And then somebody famously invented subroutines.
01:17:43.380 You notice how as he was writing the program, he'd write, he'd write this little routine, but it would be used a bunch of different times.
01:17:50.100 So rather than putting the code in multiple times, it was like define a word.
01:17:56.040 And then whenever I need to use the word, I don't have to put the whole definition for the word.
01:17:59.760 I just put the word.
01:18:01.460 So subroutine is like a local definition of something or a local computation that's used multiple times.
01:18:08.160 So your top-level program might be: go to the subroutine that adds up the numbers, and come back.
01:18:16.080 Now go to the subroutine that checks whether it's your bank balance or not, come back.
01:18:21.680 So the program, this becomes sequential operation, control flow, like doing loops, let's say do this until something's done.
01:18:30.780 And then conditional branches that says, depending on value, do this or that.
01:18:35.460 And then subroutines to do something atomic.
01:18:38.260 And that's essentially all the program.
01:18:41.900 Operations, loops, conditional branches, and subroutines.
01:18:46.860 That's it.
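The full picture just summarized, operations, loops, conditional branches, and subroutines, fits in a short sketch. The function names and data here are illustrative assumptions; the point is the "define a word once, use it many times" idea from the conversation.

```python
# Subroutines as "defined words": each routine is written once and then
# called by name from the top-level program, which just sequences calls.

def add_up(values):
    """Subroutine: the counting loop, packaged so it can be reused."""
    total = 0
    for v in values:       # a loop: do this until the list is done
        total += v         # an operation: add
    return total

def check_balance(balance, spend):
    """Subroutine: a conditional branch, do this or that depending on value."""
    return "ok" if balance >= spend else "insufficient"

# Top-level program: go to a subroutine, come back, go to the next one.
total = add_up([10, 20, 30])
status = check_balance(total, 50)
print(total, status)  # 60 ok
```

Everything a program does decomposes into these four things, which is why such a small instruction set can express so much.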
01:18:48.820 Now, why can computers construct worlds?
01:18:54.680 So I still remember when, so if you look at your screen, you know, the computer in front of you probably has two or four million pixels on it.
01:19:03.100 Seems like a lot.
01:19:04.180 And when they first started, you know, televisions, when they lit up screens, they were scanning a little electron beam across a phosphorescent surface, lighting it up and modulating the intensity of the electron gun to make the little phosphors brighter.
01:19:23.940 Right.
01:19:24.260 And it was writing, it writes one line at a time, wrote one line at a time at an incredibly fast rate by human standards.
01:19:31.120 And we saw that as continual.
01:19:33.160 Right.
01:19:33.840 Yeah.
01:19:34.220 And then continual motion.
01:19:35.640 The phosphor was designed to decay at the right rate.
01:19:38.140 So by the time you came back to it, it had just gotten a little dimmer.
01:19:41.840 And then you wrote it with the next value.
01:19:43.700 So it didn't flicker.
01:19:45.440 So your eye has some persistence.
01:19:46.820 So the electron hits it, it makes the phosphor light up with some photons of the right color.
01:19:52.380 And then it slowly decays.
01:19:54.060 And you scan down and it gets back there and writes it again before it's too dim.
01:19:58.100 And so the screen on a phosphor-based television, is it analogous?
01:20:04.540 It's analogous in some sense to the binary representation?
01:20:08.120 The dot is on or off?
01:20:11.540 That's entirely analog.
01:20:13.420 So it's digitized in the sense it's discrete, I'd say, in the sense that each little pixel, you can see the little phosphors in the screen.
01:20:21.780 Right.
01:20:22.320 Especially notice that when they went to color TVs because they have a red, green, and blue thing in there.
01:20:26.480 Right, right.
01:20:27.240 And they would hit them.
01:20:28.340 And they're essentially either on or off?
01:20:32.040 Well, they have a range.
01:20:33.280 So that beam is a variable intensity.
01:20:37.580 Right.
01:20:38.120 So now modern computers work differently.
01:20:40.480 So the screen in front of you has a little, it literally has an XY grid, and it can address each one of those things.
01:20:48.000 Right.
01:20:48.480 So, you know, you don't shoot a beam at it anymore.
01:20:51.140 You have an XY, you decode.
01:20:52.880 You know, it's almost like the screen looks like a big flat memory.
01:20:56.140 But instead of storing ones and zeros, it's storing color.
01:20:59.280 But they have the same kind of decay property, and you write to new color, and there's a bunch of stuff.
01:21:03.860 Now, here's the wild thing.
01:21:05.900 Computers are now so fast.
01:21:07.300 You can run a 10,000-line program for every single pixel on that screen.
01:21:15.480 So what does that imply?
01:21:17.960 Well, it turns out for a whole bunch of reasons, like if you want to make something look really good on the screen, so the world is relatively continuous, right?
01:21:28.700 So if you look at it, there's all this light reflecting around.
01:21:31.840 There's all these things going on.
01:21:33.460 There's no little pixels in the surface of your table, right?
01:21:37.680 To make a discrete grid look that way, you have to, you know, combine the colors.
01:21:44.120 You have to do a whole bunch of stuff.
01:21:45.580 You have to pretend you're shining lights on it.
01:21:47.860 You have to, you know, like there's a reflection from one surface to the next one.
01:21:51.100 And it turns out when you have thousands of instructions per pixel, you can start to make those pixels look realistic, right?
01:22:00.640 And the result, when you see it on the screen, looks so beautiful, you think, that's incredible.
01:22:06.860 But if you look in the pixel program, it's load the data into the register, add it to a number, test it against the number, subtract something.
01:22:14.340 There's something called clipping, like make sure the pixel doesn't get brighter than this and dimmer than that.
01:22:20.800 It's all simple operations.
01:22:23.500 Like there's nothing in the computer that's like, do a, you know, pixel operation, right?
01:22:31.300 Well, there may be a subroutine named that, but underneath it, it's just the same old stuff.
01:22:36.380 Computers always do load, store, add, subtract, multiply, divide, branch.
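[Editor's note: the simple operations described here, load, add, compare, clip, can be sketched as a toy per-pixel routine; the names and values are illustrative, not from any real graphics pipeline.]

```python
def shade_pixel(base, light, lo=0, hi=255):
    """Load a value, add to it, test against limits, clip. Nothing fancier."""
    value = base + light   # add the light contribution
    if value > hi:         # test...
        value = hi         # ...and clip: don't get brighter than hi
    if value < lo:
        value = lo         # ...and don't get dimmer than lo
    return value

# The same tiny routine, run once per pixel of a pretend framebuffer.
framebuffer = [10, 200, 250]
lit = [shade_pixel(p, 60) for p in framebuffer]
print(lit)  # -> [70, 255, 255]
```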
01:22:41.780 Okay, so did we, okay, so how far have we got, I'm listening to so many things, I'm having a hard time keeping track of the order.
01:22:50.780 You mentioned earlier that computers consist of four elements, I believe that's what you said.
01:22:56.520 Memory.
01:22:57.280 Yep.
01:22:57.880 Input and output.
01:22:59.080 Yep.
01:22:59.620 And compute.
01:23:01.020 Okay, three.
01:23:01.880 So I was counting input and output separately, but okay.
01:23:04.380 And have we gone through all three of them?
01:23:06.080 Yep.
01:23:06.860 Okay.
01:23:07.420 Okay.
01:23:07.820 So memory is just a place to store bits.
01:23:09.880 Yep.
01:23:10.120 Input and output is typically the way to, you know, it depends on what you're doing.
01:23:14.660 You might just send bits from one place to another, but it might also be, you could say, input and output in the computer and sensors are slightly different things.
01:23:22.480 Like sensors, you know, turn analog real world signals into bits.
01:23:26.920 Into digital.
01:23:28.540 Right.
01:23:28.780 And then programs basically transform the data in some way.
01:23:34.320 And programs are basically, you know, sequences of operations like add, subtract, divide, and then branches that either let you do loops or make decisions.
01:23:44.580 And then the hardware to let you do subroutines to break the program into pieces.
01:23:49.240 And that's pretty much it.
01:23:54.240 So to some degree, you take the world, you transform it into on, off, or yes, no.
01:24:00.740 Billions of those.
01:24:01.640 And then you manipulate the yeses and nos or the zeros and ones, and that can produce almost any sort of phenomenon that you can imagine.
01:24:10.560 Yeah.
01:24:10.800 Yes and no, that's not a very good way to put it, you know; ones and zeros is better, because then it's a mathematical representation, you know, a digital representation of an analog reality.
01:24:22.660 Something like that.
01:24:23.420 And is the analog reality analog all the way down, or is it digital at the bottom, quantum at the bottom?
01:24:33.020 So there's something called the fine-structure constant, which makes the universe look discrete, but it's a very, very small number.
01:24:38.460 Right.
01:24:40.440 So, and there's a fun fact, which is.
01:24:43.600 Is that the Planck length?
01:24:45.280 Is that associated with the Planck length?
01:24:47.260 Yeah.
01:24:48.280 And that's the smallest possible length.
01:24:50.140 I believe.
01:24:50.680 Like the mass of the universe is 10 to the 40th, and the Planck length is 10 to the minus 40th.
01:24:56.480 And there's a physics thread about the mystery of why those things are 10 to the 40th and 10 to the minus 40th.
01:25:04.900 Okay.
01:25:05.460 All right.
01:25:05.800 So let's move from that to, um, the questions I want to ask you.
01:25:12.840 Yep.
01:25:14.100 The thing that makes computers do what they do is abstraction layers.
01:25:17.840 So, so at the bottom, there's atoms.
01:25:21.220 So there's engineers who know how to put atoms together in a way that makes switches, which we call transistors.
01:25:27.100 Right.
01:25:27.660 So, and those guys are experts at that stuff.
01:25:30.160 Right.
01:25:30.980 And they just, they can operate at that level.
01:25:33.340 Then there's another thing where you take multiple transistors together and you basically make what's called logic gates, which literally do the ands and ors and inversions.
01:25:42.740 Right.
01:25:43.760 And then that's an abstraction layer.
01:25:46.560 We call it, you know, the physical design library or something like that.
01:25:50.100 And then people take those and they make them up into adders and subtractors and multipliers.
01:25:54.920 This is well-understood Boolean math.
01:25:57.660 So how do you, how do you add two binary numbers?
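[Editor's note: one textbook answer to "how do you add two binary numbers" is the full adder, built only from AND, OR, and XOR gates. A sketch in Python, with the bitwise operators standing in for the gates.]

```python
def full_adder(a, b, carry_in):
    """One bit of addition from gates: ^ is XOR, & is AND, | is OR."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_binary(x_bits, y_bits):
    """Ripple-carry adder: chain full adders, least-significant bit first."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # the final carry becomes the top bit
    return out

# 3 (bits 1,1,0) + 5 (bits 1,0,1), least-significant bit first -> 8 (0,0,0,1)
print(add_binary([1, 1, 0], [1, 0, 1]))
```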
01:26:00.580 So you make those and then there's another abstraction layer that says, all right, I'm going to take multiple operation units and put them together to make, you know, part of the computer.
01:26:10.660 Right.
01:26:11.360 And then you make, there's a bunch of those blocks and then that thing runs a program very simply.
01:26:17.260 And there's a small number of people who write programs at the low level, but then there's people who use what's called libraries where they, you know, they're, they're doing some higher level program.
01:26:25.800 And so they're going to do a matrix multiplying and do this and that, but they don't actually write that low level code.
01:26:30.640 So there's, you know, there's a stack of abstractions.
01:26:33.840 And when something gets too complicated, you split the abstraction layer into two things.
01:26:39.480 There used to be when, when people wrote a program, there's a program called a compiler that translated your C program or Fortran program into the low level instructions.
01:26:49.820 But it turns out there's too many languages up here and there's too many instructions here.
01:26:53.960 So now they translate it from the high level language into an intermediate representation, which is sort of, let's say a generic program.
01:27:00.920 And then there's another thing that translates the intermediate representation into the specific computer you have.
01:27:06.780 But that just keeps going higher and higher.
01:27:08.940 Like a lot of programmers, they, they use, you know, frameworks that can do amazing things.
01:27:15.800 Like you could literally write a program that says, search the internet for a picture of a cat, sort by color, output to my printer.
01:27:24.160 Like there's a language where that's a program.
01:27:27.120 Search the internet.
01:27:28.260 Holy cow.
01:27:28.740 That runs a trillion lines of code on a hundred thousand computers.
01:27:32.320 Find a cat.
01:27:33.560 That's a really expensive, that's a really complicated program.
01:27:36.180 So how much of the radical increase in computation power is a consequence of hardware transformation and how much of it is a consequence of the, the, the increasing density, let's say of these abstraction layers.
01:27:49.960 Well, so this is, this is where, you know, there's a really creative tension or dynamic interplay.
01:27:55.560 So when computers first started, they were so slow.
01:27:58.380 You ran really simple programs.
01:28:00.300 A equals B plus C times D.
01:28:02.540 Right.
01:28:03.660 And we've been going up the math hierarchy.
01:28:05.520 So then you could run a program that did what's called, you know, matrix math, like, or linear algebra systems of big equations, and then matrices, and then more complicated ones.
01:28:15.640 So as the computational power went up, you could dedicate more and more stuff to, you know, that kind of computation.
01:28:24.660 And then similar thing happened on abstraction layers.
01:28:27.820 Like it used to be, if you bought a million dollar computer, you hand wrote every line of code because you didn't want to waste time on the computer with like overhead.
01:28:38.280 But today, you know, that million dollar computer costs 10 cents.
01:28:42.060 You don't really care how many cycles you use, you know, parsing a cat video or something.
01:28:46.880 And so the computation capacity, let the abstractions at the programming level increase a lot.
01:28:55.560 So somebody had a graph about how many bytes does it take to store the letter A?
01:29:01.060 Like, it used to be one, and then Word for Windows, it's like 10 kilobytes per letter.
01:29:10.480 Because the letter has a font, it has a color, it has a shadow, you know, there's a whole bunch of, you know, and that's fine.
01:29:17.300 Like, on the million-dollar computer with, you know, a thousand bytes of memory, you wouldn't store a letter A like that.
01:29:24.380 You'd put it in one byte.
01:29:25.500 But now you have gigabytes and terabytes of storage, who cares?
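[Editor's note: the one-byte-versus-kilobytes point can be made concrete. A sketch, with a made-up "rich" representation standing in for what a word processor might store per character.]

```python
import sys

bare = b"A"          # the old representation: one byte, period
print(len(bare))     # -> 1

# A word-processor-style representation: the glyph plus all its formatting.
rich = {
    "char": "A",
    "font": "Times New Roman",
    "size_pt": 12,
    "color_rgba": (0, 0, 0, 255),
    "bold": False,
    "shadow": None,
}
# Even before counting the strings inside, the container alone dwarfs one byte.
print(sys.getsizeof(rich) > len(bare))  # -> True
```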
01:30:31.340 Okay, so you walked us through the basics of computation.
01:30:34.920 Now, can you shed some light on, like, I don't understand what you do as a computer architect.
01:30:42.580 Like, when you go to work, when you're working on a project, what is it that you're actually involved in doing?
01:30:48.580 Make adds go faster.
01:30:49.840 So, I'm a fairly low-level engineer.
01:30:57.480 You know, low-level in terms of the abstraction layers.
01:30:59.980 Like, I understand the higher ones.
01:31:02.360 But, you know, I talk to the people who make transistors and AND gates and OR gates.
01:31:08.900 And they talk to the people who know the atoms.
01:31:10.840 Right.
01:31:12.060 So, and I hardly ever talk to the atom people, but I know something about atoms.
01:31:17.200 So, I build, you know, when I'm architecting stuff, the functional units and then how they operate together at the low level that runs programs.
01:31:30.180 So, I don't write programs.
01:31:32.700 I build, I'm an architect of the computer that runs programs.
01:31:37.720 Right.
01:31:37.880 And then it used to be you could look at a computer and you know how a program works.
01:31:44.520 You run the first line, the second line, if there's a branch, the branch unit, and then you branch.
01:31:49.700 Right.
01:31:50.100 And the computer would literally have that in it.
01:31:52.760 So, that's an instruction, load the data, do the operation, which is a branch, execute the branch, if necessary, change the program counter.
01:32:03.360 So, you know, people, you know, there was a period of time where computers had like five stages in it.
01:32:08.460 And each one of them could say, that's the branch, that's the fetch unit, that's the load unit, that's the add unit, that's the branch unit.
01:32:15.640 Right, but modern computers are more complicated than this, right, because computers like that would do one instruction every five cycles.
01:32:25.500 And modern computers, the fastest one I know about, is doing 10 instructions in a cycle, in parallel.
01:32:33.420 Right, and this is difficult.
01:32:35.300 So, the best way to...
01:32:36.940 Unpack that, unpack that.
01:32:38.200 So, if you write a program, well, when you write, you write linear narratives.
01:32:46.520 Right, you write a sentence that makes sense, followed by another sentence.
01:32:52.120 Right, and so, as you're writing along, sometimes the one sentence defines the meaning of the next sentence.
01:32:57.960 Right, and then group it into paragraphs.
01:33:01.260 You might call those subroutines, right?
01:33:03.340 And sometimes the paragraphs have to be ordered, and sometimes the paragraphs, the order doesn't matter.
01:33:09.740 Right, so programs are written by human beings, and they're written in the same linear narrative.
01:33:15.640 So, if you want to go faster than parsing the instructions one at a time, in order, you have to do some analysis to say, all right, I got two sentences.
01:33:25.660 Are they dependent or not?
01:33:28.280 If they're dependent, I do them in order.
01:33:30.380 If they're not dependent, I can do them in parallel, or any order.
01:33:33.760 Right, and so modern computers, when they're reading the programs out, are analyzing the dependencies and deciding what can happen in what order.
01:33:44.460 What has to happen in order for correctness, and what can be reordered.
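[Editor's note: the dependency check described here can be sketched as: two instructions must stay in order if one writes a register the other reads or writes. The register names and instruction encoding are invented for illustration.]

```python
def dependent(i1, i2):
    """Each instruction is (dest_register, source_registers).
    True means the two must execute in program order."""
    d1, s1 = i1
    d2, s2 = i2
    return (d1 in s2        # read-after-write: second reads what first wrote
            or d2 in s1     # write-after-read
            or d1 == d2)    # write-after-write

# r1 = r2 + r3 ; r4 = r1 * r5  -> the second reads r1, so keep them in order
print(dependent(("r1", {"r2", "r3"}), ("r4", {"r1", "r5"})))  # -> True
# r1 = r2 + r3 ; r4 = r5 * r6  -> no shared registers, run them in parallel
print(dependent(("r1", {"r2", "r3"}), ("r4", {"r5", "r6"})))  # -> False
```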
01:33:49.700 And then it turns out, there's many places where you say, if there's an error, go here, but there's hardly ever an error, and you can predict that really well.
01:34:01.260 So, you say, I'm going, you're reading along, and you say, here's a point where I'm not sure which, should I read the next sentence, or should I jump to the next paragraph?
01:34:12.500 Right, so a modern computer predicts that.
01:34:18.340 It doesn't wait until you've fully understood all the sentences up to that point, so that you know exactly where to read next.
01:34:26.000 So, imagine, so now you're, you're reading this book, and you're reading sentences in dependency order, which means you haven't, so you get to a branch, and you haven't read all the sentences before that and understood them.
01:34:37.540 So, you don't know where to read the next paragraph, the next chapter, but we predict what's going to happen.
01:34:42.500 And we just keep on going.
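[Editor's note: the "predict and keep going" idea is classically implemented with a small saturating counter per branch. A sketch of the textbook 2-bit predictor, not any specific CPU's design.]

```python
class TwoBitPredictor:
    """Textbook 2-bit saturating counter: states 0-1 predict not-taken,
    states 2-3 predict taken, so one surprise doesn't flip a strong guess."""
    def __init__(self):
        self.state = 2

    def predict(self):
        return self.state >= 2          # True = guess "taken", fetch ahead

    def update(self, taken):            # learn from what actually happened
        self.state = min(3, self.state + 1) if taken else max(0, self.state - 1)

p = TwoBitPredictor()
hits = 0
for taken in [True] * 9 + [False]:      # a loop branch: taken 9 times, then exits
    if p.predict() == taken:
        hits += 1
    p.update(taken)
print(hits)  # -> 9: only the final loop exit is mispredicted
```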
01:34:45.400 And how does that tie into the process of designing the...
01:34:50.220 So, the goal, the goal of modern computers is to go fast.
01:34:54.060 Well, let me say there's three kinds of computers.
01:34:57.540 There's computers that run very simple programs in order.
01:35:04.100 Right?
01:35:04.920 They just do exactly what you told them to do.
01:35:07.680 And they tend to be small and simple, but they're so small and simple, you can make a chip
01:35:12.360 with 1,000 of those computers on it.
01:35:14.620 So, when you build a GPU that does a little program for every pixel on your screen, each one of those pixels gets its own program.
01:35:21.980 It's very simple.
01:35:23.360 But you sort of say the first 1,000 pixels you run on these 1,000 computers.
01:35:28.460 So, like a modern GPU has currently like 6,000 or 8,000 processors in it.
01:35:35.020 And they literally, you do the first 6,000 pixels, and then the next 6,000 pixels, and the next 6,000 pixels.
01:35:42.240 And they do that fast enough that you can run a fairly big program on every pixel on the screen for every screen refresh time.
01:35:49.100 So, you have simple computers that do stuff in order.
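[Editor's note: the batching described here, the first N pixels on N simple cores, then the next N, can be simulated sequentially. The pixel program and core count below are placeholders.]

```python
def pixel_program(p):
    """The same simple routine every core runs: one operation plus a clip."""
    return min(255, p * 2)

def render(pixels, num_cores=4):
    """Feed pixels to the cores a batch at a time, GPU-style.
    (Simulated sequentially here; real hardware runs each batch at once.)"""
    out = []
    for i in range(0, len(pixels), num_cores):
        batch = pixels[i:i + num_cores]              # the next num_cores pixels
        out.extend(pixel_program(p) for p in batch)  # one per core
    return out

print(render([10, 100, 200, 30, 40], num_cores=4))  # -> [20, 200, 255, 60, 80]
```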
01:35:53.320 Right?
01:35:53.980 And then you have, let's say, computers that are designed to run complicated, long programs as fast as possible.
01:36:02.560 Right?
01:36:03.020 And that's where you parse the instructions carefully, and you figure out what order you can do them in, and when possible, you reorder it.
01:36:09.600 Right?
01:36:11.980 And the reason you reorder it is because if this doesn't depend on this, I can do it in parallel.
01:36:15.800 Now I can do two things at a time.
01:36:17.480 The next thing I can predict that I can do it in parallel, now I can do three things.
01:36:22.580 And, you know, like I said, the computer in your desktop is probably doing three to five things at a time, and the best I know of is 10.
01:36:31.820 Right?
01:36:32.420 And that's because, and there's other sophisticated predictors in there.
01:36:36.320 So, to do that, you have to fetch large groups of instructions at a time, you have to figure out where the, like, the sentence boundaries are, figure out if they're dependent or not, figure out if you can predict where the next instructions are coming from when you hit branches.
01:36:51.440 And it turns out that's fairly complicated.
01:36:54.180 The difference between a little computer that does, let's say, one instruction at a time and a complicated one that does 10 instructions at a time, it's 100 times more complicated.
01:37:04.720 Right?
01:37:05.160 And from a, you know, what's-the-best-way-to-do-lots-of-instructions standpoint, complicated computers are not efficient.
01:37:13.000 But there's so many applications where people care how fast it is.
01:37:17.380 So, when you're, like, clicking on your webpage, you want that to come up as fast as possible.
01:37:23.200 So, the part of it that's, let's say, what's it called, you know, the logic of the webpage.
01:37:29.520 It's probably a serial narrative written by a human being.
01:37:34.480 So, you have to, you run that on a complicated computer that, you know, does it out of order and predicts what to do as fast as possible.
01:37:42.300 But when you render the screen itself, that runs on large numbers of simple computers to make all the pixels.
01:37:47.440 Right, and then there's a third kind of computer, which we're starting to invent, which is AI computers.
01:37:57.560 And that's what you're working on now.
01:37:59.140 Yes.
01:37:59.980 For Tenstorrent.
01:38:01.460 Yeah.
01:38:02.220 And there's a really good talk by Andrej Karpathy called Software 2.0.
01:38:07.420 So, all the, the first two kinds of computers, simple computers and complex out-of-order computers, they're running, running programs written by humans.
01:38:17.880 Right.
01:38:18.560 And if you look at the code, it's literally declarative statements about operations and where to go.
01:38:25.580 And it's serial.
01:38:26.860 It's a linear narrative.
01:38:28.000 Yeah, the different thing about AI computers is you use data to train the weights in neural networks to, to get you the desired result.
01:38:44.480 So, instead of, the programs are no longer written by humans.
01:38:48.420 Now, it turns out there's components of the AI stack that are written by humans, but at a high level, you use data to train them.
01:38:56.640 So, you have a big neural network and you want to detect cats.
01:39:01.780 So, you put a cat picture into the network when you start training and the output is gibberish.
01:39:07.060 And you compare gibberish to what a cat is and you calculate the difference in what the network said versus the desired result, which is the word cat.
01:39:16.540 And then they do something called backpropagation, which is mathematically sophisticated, but essentially takes the error and partitions it across the layers of the network.
01:39:25.460 Such that you've sort of bumped each neuron a little closer to saying cat next time, by taking the error at the end and distributing it across.
01:39:34.880 It's called backpropagation.
01:39:35.460 And then you put another cat in, and if you have the right size network and the right training methods, after you show that network a million cats, when you put a cat in, it reliably says cat.
01:39:48.540 And when you put a picture that's not a cat, it reliably says it's not a cat.
01:39:52.700 Right?
01:39:53.140 And you never wrote any code that had anything to do with cats.
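[Editor's note: the training loop described here, guess, measure the error, nudge the weights, can be shown with a single neuron instead of a deep network. The "images" are just two made-up numeric features; real backpropagation distributes the error across many layers.]

```python
import random
from math import exp

random.seed(0)

# Toy "images": two invented features; label 1 = cat, 0 = not a cat.
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.1, 0.2), 0), ((0.2, 0.1), 0)]

w, b, lr = [0.0, 0.0], 0.0, 0.5

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1 / (1 + exp(-z))            # squash the score into 0..1

for _ in range(2000):                   # show it "cats" over and over
    x, label = random.choice(data)
    err = predict(x) - label            # how wrong was the guess?
    # One-neuron backpropagation: nudge each weight against its share of the error.
    w[0] -= lr * err * x[0]
    w[1] -= lr * err * x[1]
    b    -= lr * err

print(predict((0.85, 0.9)) > 0.5)   # cat-like input:  True
print(predict((0.15, 0.1)) > 0.5)   # non-cat input:   False
```

No line of this code mentions cats in its logic; the behavior comes entirely from the labeled data, which is the point being made above.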
01:40:00.200 And can you understand what it is that the computer is doing now that it's recognizing cats?
01:40:06.380 A little bit.
01:40:07.400 So, people for years worked on visual computing, and they were trying to detect things like cats.
01:40:13.800 Right?
01:40:13.980 And cats have a whole bunch of artifacts.
01:40:15.600 They have round eyes.
01:40:16.560 They have pointy ears.
01:40:17.720 They have fluffy hair.
01:40:18.620 So, you could detect, it was called feature detection.
01:40:22.340 You would say, this will be a cat if I see the following colors, the following amount of fluffiness, the following number.
01:40:29.140 You know, two pointy ears, not three.
01:40:31.440 One or two round eyes, depending on the view.
01:40:35.120 Right?
01:40:35.520 So, you could write code, and the problem with that is, well, now the cat has an arbitrary orientation.
01:40:40.780 So, you have to, you do your feature detect on the picture, and the features have to search the whole image, and you have to rotate around.
01:40:48.100 You know, and it's sort of, and every single thing you want to detect, you have to write a unique program for.
01:40:54.380 You're done with cats, now you go to dogs.
01:40:56.720 And then what about the dog that has slightly pointy ears?
01:40:59.740 These dogs have round ears, and cats have pointy ears.
01:41:02.440 You know?
01:41:03.000 So, it was sort of endless thing.
01:41:07.160 Right, right.
01:41:07.640 Same thing with speech.
01:41:08.460 Endless detail by detail construction.
01:41:12.360 Yeah.
01:41:13.060 I had a friend who worked on speech recognition years ago.
01:41:16.060 So, you break speech into, you know, the phonemes, so you can see those, and then they have frequency characteristics, and you can differentiate vowels from consonants.
01:41:26.000 So, those people working on speech were doing a whole bunch of analysis of analog waveforms of sound.
01:41:34.180 And they were making some progress, but it never really worked.
01:41:38.800 And then they train neural networks by, you put the word in, and you have what's called supervised learning.
01:41:47.940 So, you play a language where you know what all the words are, and you keep telling the network how to correct.
01:41:54.100 And with like a billion samples and a big enough neural network, it can recognize speech just fine.
01:42:00.360 And if you train it with a broad variety of accents, it can work across accents.
01:42:10.040 And then it turns out, the bigger they made these networks, the more information they can put in.
01:42:14.860 And then, on the cat one specifically, they found, so, when they first had a neural network crack the cat problem, I forget, it was like 50 layers deep.
01:42:29.620 And if you looked in the layers, you could see that it was detecting pointy ears and eyes.
01:42:35.020 But it was also detecting a lot of other things.
01:42:38.260 And some things we don't know.
01:42:39.420 Yeah, well, if we see the back end of a cat walking away, we still know it's a cat.
01:42:43.300 And it pretty much lacks eyes and pointy ears from that perspective.
01:42:47.360 And the funny thing, if you take an object, like in light, right, it's like a phone, you can project the phone onto a flat surface.
01:42:55.880 Say that's a projection, right?
01:42:57.700 And as you move it around, you get different.
01:42:59.380 It's a shadow.
01:43:00.360 Shadow.
01:43:01.460 But think of it as a projection, right?
01:43:04.160 So, that's a projection of a light source on a flat plane.
01:43:08.820 It's a fairly simple projection.
01:43:11.760 But what if you had a light shaped like a cat and you shined that on the phone?
01:43:16.900 What would a projection look like?
01:43:20.180 And it turns out, mathematically, there's an arbitrary number of projections.
01:43:26.640 You can, like we think of projections in three dimensions because we're three-dimensional creatures.
01:43:31.940 Right?
01:43:32.640 But there can be lots of projections.
01:43:34.040 And then you can have the projection project on another plane.
01:43:40.940 So, what the neural networks are doing is they're teasing out all the details of what that is.
01:43:48.540 And some of the projection planes give you what's called, you know, size invariance or rotation invariance.
01:43:54.200 Like, you can recognize a cat no matter which way it's pointing.
01:43:58.240 Like, your brain is a little specialized.
01:44:00.040 Like, the faces.
01:44:01.020 Right.
01:44:01.400 It likes them to be vertical.
01:44:03.020 Right-side up.
01:44:03.760 Yeah.
01:44:04.060 But with a little bit of work, you can recognize an upside down face.
01:44:09.080 Unless you have a problem.
01:44:10.080 Okay, so we could do two things here.
01:44:12.720 We could either talk about your...
01:44:15.720 No, let's go into your...
01:44:18.200 You were an engineer and then you were a manager.
01:44:20.700 And you've worked in lots of companies, some of which were incredibly creative, some of which were thriving to an incredible degree, and some of which were collapsing and irreparable.
01:44:30.660 So, what have you learned about what makes companies work, and more importantly, what have you learned about what makes them not work, and maybe what do you do then?
01:44:43.940 Sure.
01:44:44.640 Well, that's a fun question.
01:44:47.060 Well, first of all, there's...
01:44:48.820 Like, I've noticed, and many people have noticed, this is not just me, that people...
01:44:53.060 Like, in engineering fields, people kind of bucket into, you know, technical people and management people.
01:44:59.280 Yeah.
01:45:00.660 And it's not that there aren't good technical managers, or there's not good managers, or technical people who can manage, right?
01:45:09.580 Right, but that's an intersection of two skills, say.
01:45:12.660 Yeah, but generally speaking, most people are one or the other.
01:45:15.700 And it's like, when you wake up in the morning, do you want to solve a problem, or do you want to organize a problem?
01:45:22.620 Like, are you worried about your schedule and your headcounts and how things are getting done, and did you hit the milestones,
01:45:29.060 or are you working on a technical problem?
01:45:32.160 And people...
01:45:33.320 And in engineering fields, it's often there's the fellow track with the technical leadership position,
01:45:37.900 or the director or VP track, the management leadership.
01:45:41.880 Right?
01:45:42.360 So I'm a technical person.
01:45:45.140 But you took on management roles repeatedly.
01:45:47.560 Well, I did, because I found out that if you're, generally speaking, the top of the organization is the manager, the VP.
01:45:55.260 And as a technical person, no matter how high you go, you're an advisor for that person.
01:46:00.040 And I decided consciously, after I worked at Apple, that I was going to be a VP and have everybody work for me, because then I can...
01:46:08.000 Right?
01:46:09.720 So then my skill set is somewhat unusual, and I'm not the only one, obviously.
01:46:14.340 But I decided to, you know, get on the management track so I could build the computers I wanted.
01:46:20.860 Because sometimes, when I wasn't the leader of the group, some managers, at some point, would decide they own the next decision,
01:46:27.760 and they would make some random decision.
01:46:30.220 I'd be grumpy about it, and there's nothing I could do about it, because people work for them, not for me.
01:46:34.860 So that's, you know, it was a conscious thing.
01:46:37.840 And I hired a consultant, Ben Katrao, who helped me reframe how I approached this.
01:46:43.740 Now, I'm still a technical person, but I found that it turns out there's a whole bunch of really good technical managers that I like to work with.
01:46:51.860 I like to organize stuff.
01:46:53.820 And I would say, you know, I maintain my openness and low conscientiousness and disagreeable behavior.
01:47:00.060 And I have people who work for me, who work on my team, who work with people that manage better.
01:47:06.300 So, even though I've been a manager, you know, at AMD, it was 2,400 people total at the end, and Intel was 10,000.
01:47:15.480 My staff was, you know, 15 or 20 people.
01:47:19.480 And usually, half of them are real managers, and half of them are technical leaders.
01:47:25.160 That's how I solve it.
01:47:26.880 And there are lots of companies run that way.
01:47:28.920 A lot of times, founders tend to be technical people, but people working for them are non-technical.
01:47:33.200 Or they're stronger on the management side than the technical side.
01:47:39.480 But for everybody, you need to decide who you are.
01:47:42.860 Like, I had a great technical manager at AMD, and one day, he was a little miffed because I was looking to, you know, a couple of the really technical heavyweights to solve a problem.
01:47:50.860 He said, you know, I'm pretty technical.
01:47:53.380 I said, yeah, I know.
01:47:55.220 I said, are you technical compared to Jim and to Barb?
01:47:58.640 And he goes, I guess not really.
01:48:00.820 I said, I know.
01:48:01.560 I really like, you know.
01:48:03.200 What I want you to do is you're running this project.
01:48:05.560 You have 150 people working for you.
01:48:07.780 You make all the technical decisions you can, but when it's out of your wheelhouse, we got serious experts.
01:48:15.460 And you have two choices.
01:48:16.960 You can call them or I can call them.
01:48:20.000 And he later told me, he said, I found out it was a lot better when I called them than when you called them.
01:48:25.020 And it was a successful thing.
01:48:27.660 And he was technical.
01:48:28.700 And he was really good at making good decisions, but he wasn't the strongest technical person in the group.
01:48:33.420 So that's the first thing is, you know, figure out who you are.
01:48:37.880 I've seen a lot of people fail in engineering because at some point they think, I'm technical, but I want to get on the management track.
01:48:44.000 But they're bored by management and they don't have a plan to deal with it.
01:48:48.160 And so they start.
01:48:48.660 Yeah, well, you weren't bored by management.
01:48:50.780 And so I joke that I decided to treat the organization as a computer architecture problem.
01:48:57.480 Well, that's exactly what I was going to ask.
01:48:59.040 What transformation did you have to undertake to?
01:49:01.700 Well, one of them was, what do I have to do to be effective?
01:49:06.320 Right.
01:49:06.680 So that's, you know, I hate to work on failed projects.
01:49:10.160 Right.
01:49:10.560 And then the next was the organizational problem itself is an architectural problem.
01:49:16.920 And then I kept, you know, for myself, well, I'm a funny kind of person: if something has a solution and it's being competently driven, I'm not that interested in it.
01:49:27.380 I like problems.
01:49:29.040 And so in a big organization, there's a million problems and I start sorting them by priority and then solving some of them or handing them out to the right people.
01:49:38.640 So there's a whole bunch of technical work to do on that.
01:49:42.040 And then I'm fairly good at skill assessing people who are technical, either for management or technical positions.
01:49:49.820 And then, you know, giving them work.
01:49:52.480 I like autonomy in management.
01:49:55.540 So if somebody's competent, they can do it and they understand it.
01:50:01.380 Then somebody gave me a bunch of books to read.
01:50:03.520 And one of the frameworks is goals, organization, contract, and capabilities. That's what we usually solve for.
01:50:12.480 So is the goal super clear?
01:50:15.200 Do we have the capability to solve the problem?
01:50:17.520 Is there a contract between me and the groups doing it, so they know what to do and what their goals and boundaries are?
01:50:23.860 Right.
01:50:24.340 And do they have the, you know, is the organization right? Like, a lot of times, you know, there's a joke that startups start with a problem and build an organization to support it.
01:50:36.120 But on the second or third system, the organization defines the problem rather than the problem defining the organization.
01:50:42.680 And then it breaks up.
01:50:44.280 Yeah.
01:50:44.700 Then the organization becomes the problem.
01:50:46.900 Yes.
01:50:47.700 Yeah.
01:50:47.900 They constrain the problem and become the problem.
01:50:50.620 Well, we had a number of discussions while you were doing this about ethics.
01:50:54.360 And, I mean, you said that you go, you look at the problems.
01:50:57.680 Well, that's hard, right?
01:50:58.740 Because you have to know enough to know what the problems are.
01:51:01.680 Then you have to be willing to look at the problems.
01:51:04.200 Well, then you prioritize them.
01:51:05.620 Like you skipped over that very quickly.
01:51:07.840 But all of that's extraordinarily difficult, I would say, both cognitively and emotionally.
01:51:13.300 Sometimes it is and sometimes it isn't.
01:51:15.200 Like when I joined AMD, the CPUs were less than half as fast as the competition.
01:51:19.420 And they had no plan to catch up.
01:51:21.900 So that wasn't that hard.
01:51:23.400 No, but what would be hard there, I would presume, is figuring out how it could be that
01:51:28.720 such an obvious problem had gone undetected and unsolved.
01:51:33.820 And then.
01:51:34.320 No, actually, one of their architects, when I was working at Apple, told me that they believed
01:51:38.680 that CPU performance had plateaued.
01:51:41.140 It wasn't going to get any faster.
01:51:42.800 And they were going to work on adding features to the rest of the chip.
01:51:46.600 And then Intel came in and said, we think computers are going to get 5% or 10% faster every
01:51:50.780 year.
01:51:51.380 And they did it.
01:51:52.100 One had one goal, which is, you know, that things had slowed down.
01:51:55.540 The other had a different goal.
01:51:56.800 5% or 10% isn't a lot.
01:51:58.680 But you do that 10 years in a row, and it compounds.
01:52:03.340 And, you know, the other guys weren't improving.
01:52:05.220 So that wasn't that complicated.
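The compounding arithmetic behind that 5% or 10% a year can be sketched quickly. This is an illustrative calculation, not anything from the conversation itself; only the rates and the ten-year horizon come from what's said above.

```python
# Illustrative sketch of how small yearly performance gains compound.
# The rates and horizon are the ones mentioned in the conversation;
# the function itself is plain compound-interest arithmetic.
def compounded_speedup(rate_per_year: float, years: int) -> float:
    """Total speedup after compounding a fixed yearly improvement."""
    return (1.0 + rate_per_year) ** years

print(round(compounded_speedup(0.05, 10), 2))  # ~1.63x after a decade at 5%/year
print(round(compounded_speedup(0.10, 10), 2))  # ~2.59x after a decade at 10%/year
```

So a vendor standing still for ten years ends up roughly 2x to 3x behind one that compounds 5-10% annually, which is the widening gap described here.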
01:52:08.180 Like Elon famously said, he tells everybody secret plans.
01:52:10.680 Nobody believes them.
01:52:11.480 And then he does them.
01:52:12.120 And they still don't believe them.
01:52:12.980 And then they're like, oh, shoot.
01:52:15.140 So Intel publicly said they were going 5% or 10% faster every year.
01:52:18.660 And AMD said, no, they're not.
01:52:20.620 You know?
01:52:22.000 And the results were, at some point, the gap got bigger and bigger.
01:52:26.560 You know, the people at AMD were committed to their plan.
01:52:28.820 I don't know why.
01:52:30.380 But it's interesting how these things get internalized.
01:52:33.900 And then you start, even when they, you know, at some point, you know how it is.
01:52:37.340 It's cognitive dissonance.
01:52:38.780 You say you're going to do something different.
01:52:40.600 But you learn how to do this other thing really well.
01:52:43.100 You keep doing it.
01:52:45.120 Right?
01:52:45.920 And then...
01:52:46.380 Well, and you build a whole machinery around it.
01:52:49.320 Yes, exactly.
01:52:50.600 You know, they had a big machine that did all kinds of stuff that was perfectly useful.
01:52:55.060 Right?
01:52:55.680 And good people doing it.
01:52:56.760 Like I said, we didn't hire any new people to build it.
01:53:00.140 But we did refactor, you know, reset the goals, refactor a whole bunch of the engineering.
01:53:05.980 Okay.
01:53:06.800 So, well, at AMD, you were successful twice.
01:53:10.480 And so, and the success was both building a chip that was competitive.
01:53:15.480 So, you had to put together the teams to build the chip, but also to transform the internal
01:53:21.140 structure of the company so that that became possible.
01:53:23.360 And then also to communicate that to your customers.
01:53:27.480 And so, what's the problem set there?
01:53:29.340 Well, I didn't communicate it to the customers.
01:53:31.120 I, you know, because, you know, in the computer world, performance sells.
01:53:37.260 Okay.
01:53:37.520 So, that's the first thing you brought to the table.
01:53:39.680 Performance sells.
01:53:40.780 And here, we're going to break that down.
01:53:42.340 Here are the measurements.
01:53:44.200 And there's lots of public benchmarks.
01:53:46.620 Like, everybody tries to game it, but generally speaking, there's a really big community of
01:53:51.040 computer users, and they know what they want, and they know what's fast.
01:53:54.080 Right.
01:53:54.360 And you know exactly how a computer works.
01:53:56.260 So, you can actually say, once you decide that faster is better, does that work
01:54:02.720 for all of the elements of design?
01:54:06.460 Well, it's a little complicated, but there's efficiency too, right?
01:54:09.380 There's certain things, like, you can make it faster, nobody would care.
01:54:12.120 It's like, yeah, there's some judgment calls in it, but it's not that complicated.
01:54:15.960 Like, you know, today on phones, there's a thing called Geekbench, and, you know, you
01:54:19.760 get a number at the end.
01:54:21.000 You know, is your Geekbench score 100 or 50?
01:54:23.420 100's better.
01:54:25.120 Right, right, right.
01:54:25.940 The people who made the benchmark, right, picked the components of your phone experience such
01:54:30.280 that the Geekbench number represented whether the phone was faster or not.
01:54:34.820 And then whether you care or not is another question.
01:54:37.200 Like, for the current applications, if they got twice as fast, you might not notice.
01:54:40.920 But as the computer gets faster, so new applications are possible.
01:54:44.660 And on the phone, where it's possible, it's great.
01:54:46.500 And where it's not possible, it feels slow and laggy.
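How a suite like Geekbench rolls component subtests into one headline number can be sketched with a geometric mean. The subtest names, scores, and unweighted mean below are hypothetical; a real benchmark picks and weights its own components.

```python
import math

# Hypothetical sketch: combine per-component subtest scores into one
# headline number via an unweighted geometric mean. The subtests here
# are made up; a real benchmark defines its own mix and weighting.
def headline_score(subtest_scores: dict[str, float]) -> float:
    logs = [math.log(score) for score in subtest_scores.values()]
    return math.exp(sum(logs) / len(logs))

phone_a = {"integer": 100.0, "float": 120.0, "memory": 90.0}
phone_b = {"integer": 50.0, "float": 60.0, "memory": 45.0}

# Phone B's subtests are exactly half of phone A's, so its headline
# score comes out exactly half as well: the "100 vs 50" comparison.
print(round(headline_score(phone_a) / headline_score(phone_b), 6))  # 2.0
```

The geometric mean is a common choice because it keeps one outlier subtest from dominating the headline number.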
01:54:48.820 Right, so performance wins, and, you know, different form factors, like a notebook or
01:54:55.500 a desktop or a phone, have different amounts of power available, they have to live within
01:54:59.520 the budget.
01:55:00.140 Okay, so you had a goal, you had the measurements in place, you decomposed that into tasks, you
01:55:04.560 assigned competent people.
01:55:06.380 What psychological factors got in the way?
01:55:09.780 Like, how did you see companies...
01:55:10.820 All of them.
01:55:11.380 Yeah, fair enough.
01:55:14.000 All of them.
01:55:14.720 But what did you see specifically interfere?
01:55:17.640 Once you have a good plan in place, that doesn't necessarily mean it's going to be implemented.
01:55:21.240 And so what are the mistakes that people make that you saw in large companies that doom the
01:55:26.340 companies or that stop them from transforming internally?
01:55:29.980 So there's a couple of very separate problems.
01:55:35.300 When somebody with a good set of ideas says, I need to transform this place.
01:55:41.480 Like, are the goals proper, right?
01:55:45.980 And then you want to say, do I have the capability in the team to do it?
01:55:50.140 Like, I worried when I went to AMD, I wouldn't have enough experts in certain things to do
01:55:54.800 it, and I'd have to go hire 50 people to fix it.
01:55:57.200 But it turns out there were plenty of good people, actually some
01:56:03.340 really great people.
01:56:04.440 So I was like, you know, pretty quickly checked off the capability box.
01:56:07.900 And then you start wondering, well, why the hell aren't we doing the right thing?
01:56:10.140 Well, the problem was the goals were wrong, and then the organization was wrong, right?
01:56:14.700 And then, generally speaking, if those aren't right...
01:56:17.340 So to begin...
01:56:17.960 So maybe to begin with, the goals weren't unreasonable, and no one knew.
01:56:23.120 But then, across time, the fact that one set of goals was better than the other...
01:56:27.800 The belief that computers weren't going to get much faster was a bad goal in a world where
01:56:31.580 the competitor believed they were going to get a lot faster.
01:56:34.480 Yes, and could do it.
01:56:35.560 And that became incrementally worse across time, to the point where it became cataclysmic.
01:56:40.800 Yeah.
01:56:40.880 So you got to get the goals right, and you got to establish where you have capabilities.
01:56:46.620 You know, those are the kind of fundamentals.
01:56:48.400 But then the organization building is hard, because somebody will tell you, so-and-so is
01:56:53.140 a great manager.
01:56:54.840 Well, is he?
01:56:56.120 Or, you know, like a lot of times, there's somebody that looks like a good manager, but
01:56:59.460 he just has three really good people working for him.
01:57:01.320 The problem with that is, when things are going well, the empty suit manager, with his
01:57:08.640 good people supporting him, they can look like they're making lots of progress.
01:57:13.400 But when they run into hard problems, and the technical guys don't want to do it, they
01:57:17.260 go to him, he makes a random decision, or does something dumb, or doesn't believe them.
01:57:21.440 Like, that happens a lot.
01:57:22.560 Like, a technical guy goes to the empty suit manager and says, you know, I think this
01:57:28.480 isn't working, you need to change.
01:57:29.940 And he says, no, we're fine.
01:57:32.600 We'll just plow right through it, right?
01:57:34.300 So you get these weaknesses in the organization, because you don't have skill at that level.
01:57:39.460 Like I said, I've worked with a lot of really good technical managers who know when they
01:57:44.940 can make the decision, and they know when they have to punt to somebody who's more of
01:57:48.440 an expert.
01:57:49.820 That's great.
01:57:50.780 And it turns out some people are so good at that, they can operate way higher than you'd
01:57:54.200 think, because even though they're not technically strong,
01:57:56.120 they're super good at translating and making judgment calls like that.
01:58:00.660 So you got to start building your organization.
01:58:04.020 And then there's stuff about how do you build teams.
01:58:05.960 Like some groups are what's called functional.
01:58:08.660 Like, all the people who do software are in one group, all the people who do hardware are in
01:58:12.440 one group, and all the people who do atoms are in another group, each with their own managers.
01:58:14.560 But if the thing you're building needs a little of all three of those
01:58:19.900 things, that's the difference between a functional organization and a product organization.
01:58:27.040 You might want a team with a couple of programmers, a couple of hardware people, and a couple of atom
01:58:31.340 people, all on the same team.
01:58:33.720 So they all have one goal, as opposed to the functional group, which says, I'm making the best
01:58:39.520 software in the world.
01:58:40.560 Well, is it the right thing for this product?
01:58:42.380 They go, I don't know.
01:58:43.340 I don't work on the product.
01:58:44.200 I work on software.
01:58:45.460 So I'm generally speaking, product focused.
01:58:47.920 So if you only have like five of some discipline, you tend to make a little functional team
01:58:56.140 like that.
01:58:56.780 And there's a couple of things in computer design which are functional.
01:59:00.100 But generally speaking, I like product focused organization.
01:59:04.560 So everybody's like, they're all working together on the same thing.
01:59:07.220 They may have different disciplines.
01:59:08.320 So you've encountered all sorts of frustrating situations when you've gone into companies
01:59:20.880 that where you're trying to put together a good product.
01:59:24.340 And so what do you see as particularly counterproductive?
01:59:29.860 And what have you learned to how to conduct yourself so that you can be successful?
01:59:34.640 Weak leadership, people who can't make the technical decisions they have to.
01:59:38.520 That's a big problem.
01:59:40.020 Functional organizations where people are optimizing for the function, not the product.
01:59:45.580 Bad goals is one of the worst things.
01:59:52.060 Some organizations have real capability gaps.
01:59:56.320 Like, you know, they think they have the right people, but they don't.
02:00:00.160 You know, some managers play favorites.
02:00:01.740 They think so-and-so is really good and they're not.
02:00:04.820 Yeah.
02:00:04.980 So that's a real functional analysis.
02:00:06.980 The company just can't do what it needs to do.
02:00:11.480 Yeah.
02:00:12.120 So, and we're still analyzing.
02:00:15.100 Here's a group.
02:00:16.260 They're actually from, you know, someplace.
02:00:18.100 There's a belief that we're going to build this product.
02:00:20.860 It has to be a great product.
02:00:22.060 And how do you do the, you know, basic blocking and tackling to make that successful?
02:00:27.760 Right.
02:00:29.320 That's different than the malaise that overtakes big successful companies, which you can generically call bureaucratic capture.
02:00:38.240 Right.
02:00:38.880 That's a different, different problem.
02:00:41.140 So, like, a company that's bureaucratically captured will manifest all kinds of bad behavior in the organization and product development.
02:00:49.400 But, and then, you know, some big companies where the, you know, the bureaucracy has taken over, there might still be groups that are really doing a great job making great products.
02:01:00.480 You know, so there's, you know, I think there are separate spaces and I understand both of them pretty well.
02:01:04.820 So, and again, you know, the way you solve big complicated problems, you have some abstractions about what you're dealing with.
02:01:11.580 So, you know, a framework like goals, organization, capability, and contract is, is a super clear method for evaluating what the hell is going on and then making changes.
02:01:20.900 You know, very specific changes to it.
02:01:24.020 Goals are clear.
02:01:24.920 You know, if you're not clear, nothing else matters.
02:01:26.980 Get the goals clear.
02:01:28.420 Right.
02:01:29.080 Capabilities, are they good?
02:01:30.920 If you don't have great capabilities, nothing will save you.
02:01:33.080 You have to have the ability to do the job you're doing.
02:01:35.940 You know, does your organization serve the goals?
02:01:39.480 That's a big problem.
02:01:40.580 That's a painful one.
02:01:41.980 That's because that's when you start changing who works for who and what the boundaries are.
02:01:47.480 But, but you have to, you have to do it.
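The goals / capability / organization / contract framework described above can be written down as a checklist. This encoding is my own sketch of what's said in the conversation, not an established tool; the field names and remedy strings are paraphrases.

```python
from dataclasses import dataclass

# My own sketch of the goals/organization/capability/contract framework
# as an evaluation checklist; the fields and remedies paraphrase the
# conversation and are not Keller's published method.
@dataclass
class TeamAssessment:
    goals_clear: bool       # is the goal super clear?
    capable: bool           # do we have the capability to do the job?
    org_serves_goals: bool  # does the org structure serve the goals?
    contract_agreed: bool   # do the groups know what to do, and their boundaries?

    def gaps(self) -> list[str]:
        remedies = {
            "goals_clear": "get the goals clear first; nothing else matters",
            "capable": "close the capability gap; nothing will save you without it",
            "org_serves_goals": "restructure, even though that is painful",
            "contract_agreed": "agree on scope and boundaries with each group",
        }
        return [fix for field, fix in remedies.items() if not getattr(self, field)]

print(TeamAssessment(True, True, False, True).gaps())
# -> ['restructure, even though that is painful']
```

The ordering mirrors the priority in the conversation: unclear goals invalidate everything downstream, so they come first in the checklist.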
02:01:49.980 Okay.
02:01:50.100 So let's, let's tackle it this way then.
02:01:52.060 So you're going to pick someone who has the optimal attributes to create and operate within a highly functional organization.
02:02:03.080 What are you looking for in that person?
02:02:05.400 What's, what's crucial?
02:02:07.140 Well, people are fairly diverse.
02:02:10.060 You know, that's the funny thing.
02:02:12.180 So engineers need to have this will to create if they're technical leaders, let's say.
02:02:17.300 And then they have to have the discernment to make, you know, decisions about whether they're actually making progress with the goals or just wasting their time on something cute.
02:02:25.800 Right.
02:02:25.960 That's, that's the thing.
02:02:28.620 Technical managers, you know, they need to know how to run a program.
02:02:33.020 They need to know how to hire and fire.
02:02:34.720 They need to know how to structure work.
02:02:36.540 They need to know how to evaluate how long it's going to take, how to evaluate whether people are making progress.
02:02:42.300 There's a whole bunch of things, but then people have very different styles.
02:02:46.480 Some people are very extroverted.
02:02:48.440 I worked with this woman.
02:02:49.300 She was great.
02:02:49.940 She would just have these team meetings and she would really get out there and energize the team.
02:02:55.180 And another guy in the same building was very low key and he would wander around and talk to people and have a really good sense of the team, like a, like an introvert versus extrovert style.
02:03:04.580 But they both worked.
02:03:06.640 They were both very competent and they were both, to me, you know, really good technical competency.
02:03:13.880 They weren't my technical leads, but they were technically competent enough to make the decisions and know when they had to punt the decision up.
02:03:21.860 So, who do you not want, who do you not, okay.
02:03:25.720 So, I mean, that kind of goes along with the management literature.
02:03:28.680 You see that you want people who are intelligent, especially for complex jobs, so they can learn.
02:03:33.640 You want people who are conscientious because they work hard and they have integrity.
02:03:37.480 Then with the other dimensions, it looks like there's a fair bit of variability, although too much negative emotionality can be a problem.
02:03:44.680 I think that's because it's associated with depression and too much anxiety and so on, but there's diversity in the other personality dimensions and that might be task specific.
02:03:54.240 But what do you, what's, what, what sort of person do you not want to work with?
02:03:59.020 Fakes.
02:04:00.700 There's lots of fakers out there.
02:04:02.780 You know, they have sales attributes, or, you know, they're extroverted, agreeable.
02:04:07.320 You know, they, they want to say everything is good all the time.
02:04:10.240 They're not sufficiently concerned about disaster and digging into stuff.
02:04:15.620 They may have some kind of narcissistic personality problem.
02:04:19.200 So, there's, there's.
02:04:19.880 So, they're imposters.
02:04:21.300 They're, they're mimicking competence.
02:04:23.120 Mimicking competence.
02:04:24.200 That's, that's a problem.
02:04:25.460 There are people who literally.
02:04:26.320 They take credit from other people.
02:04:28.100 Yeah.
02:04:28.500 I kind of put that in a separate boat, but there's people who take credit for the team.
02:04:32.700 Their teams, you know.
02:04:34.120 You know, like, I realized early on there's two kinds of managers: people who manage up and people who manage down.
02:04:38.940 You know, like, like as a manager, I, I, I often tangle with the people I work for, but
02:04:45.240 I always took care of the people who work for me.
02:04:47.600 But some other managers, you know, I had this one guy who worked for me.
02:04:50.060 I thought he was great.
02:04:51.000 And then I, I walked by a meeting he was having, he was abusing his team and they hated him.
02:05:02.460 I fired him because he always said the nice things to me and, you know, he dumped on
02:05:02.980 those people.
02:05:02.980 So it's, yeah, there's, there's a bunch of weird stuff that happens with management like
02:05:07.700 that.
02:05:08.280 Like, you have to be excited by that.
02:05:10.120 Like, if you're a senior manager in a high-tech thing, there are many people in the group that
02:05:13.820 are smarter than you, and you have to promote them and put them forward.
02:05:16.740 You can't be uncomfortable because somebody's smarter than you.
02:06:04.120 When I was at AMD, I had six senior fellows.
02:06:04.380 I think they were all smarter than me.
02:06:26.700 They weren't generalists, and they didn't
02:06:30.620 have my interest in, you know, the architecture of organizations, but man, were they smart.
02:05:35.760 Super good.
02:05:36.980 I could talk to them all, you know, I could keep up with them sometimes, but, you know,
02:05:41.880 I was more than happy to promote them as the smart guys.
02:05:45.600 Like, why were you confident enough, do you think, to allow yourself to be surrounded
02:05:48.700 by people like that? Like, I'm above-average
02:05:55.780 smart, but I've met people who are so smart.
02:05:58.600 Like, I knew Butler Lampson, you know, famously off-the-charts IQ, and his wife was smarter.
02:06:04.120 It was a joke that he spoke at half a Lampson, because he spoke really
02:06:09.040 fast.
02:06:10.260 But I, at a fairly young age, was competent at getting things done, and I worked with people
02:06:15.680 that were smarter than me, but they liked me, you know. I'm an engineer and
02:06:19.720 I build stuff. You know, these rocket scientists, they think it up and
02:06:23.900 then they hope somebody will build it for them because they're off on the next thing.
02:06:27.820 So that's, you know, a belief I have.
02:06:31.240 You know, it wasn't always easy.
02:06:33.160 I still remember working on EV5, and I went to the Digital research lab, and there were all
02:06:38.160 of these super smart people.
02:06:40.560 And I started describing what I was doing and I would describe something for about two minutes
02:06:44.520 and then they would spend five minutes taking it apart and analyzing how it could be
02:06:47.700 way better.
02:06:48.720 And then they'd ask me the next question.
02:06:50.700 And after an hour of that, I felt like, oh my God, I was just beaten to death.
02:06:55.460 And they were like, this is great, Jim.
02:06:57.460 I was like, you thought that was great.
02:06:58.960 They're like, yeah, we're glad you're doing it.
02:07:01.620 So I've always had that attitude since.
02:07:05.900 No, but yeah, it's hard on some people when they realize how smart some people are.
02:07:09.520 I make up for it because I'm open-minded, I worked my ass off for many years, and
02:07:17.100 I've dived into lots of things.
02:07:18.520 And then, you know, I'm not afraid to ask dumb questions.
02:07:21.460 I, you know, like, a lot of people are trying to protect and project who they are, so they
02:07:26.420 don't ask the right questions.
02:07:27.300 They don't learn.
02:07:28.700 And I'm like, I don't understand what the hell's going on.
02:07:31.360 I've done that in a room of 50 people.
02:07:33.320 And they're like, well, we thought you should know.
02:07:34.700 It's like, well, I don't, but I'm not going to leave until I do.
02:07:36.760 And then they give all the information and I'm smarter than I used to be.
02:07:42.200 And, you know, so that, that takes a certain, you know, mental resilience and sometimes it's
02:07:46.880 very hard on me, but, you know, again, it's sort of like, you know, do you fire the people
02:07:52.840 you have to fire to save the group and save the product, maybe save the company?
02:07:57.480 Yeah.
02:07:57.900 Because the net good is, it's really high and it's the right thing to do.
02:08:03.160 So exposing yourself is the right thing to do, ironically.
02:08:07.680 Now it's hard and it's hard and, you know,
02:08:09.600 Well, if you admit you're stupid, then sometimes you don't have to stay that way.
02:08:14.020 Yeah.
02:08:14.840 It's hard in some sick organization.
02:08:16.580 Sometimes it's not safe.
02:08:18.480 And I feel for people who are in places where they would really like to be more open and
02:08:22.840 can't because, you know, organizations that get political and bureaucratic are hard on
02:08:28.280 people that are actually trying to do the right thing and learn.
02:08:30.980 I totally understand that.
02:08:32.640 It takes a while to, you know, the psychological safety thing is,
02:08:36.760 it gets overused and it gets a bad rap, but having an organization where it's actually
02:08:41.040 safe to open your mouth and talk and ask questions and occasionally look stupid, you know, and
02:08:47.480 fumble a little bit and have your peers like support you with that and be happy for you
02:08:53.720 when you learn stuff.
02:08:55.520 That's really important.
02:08:56.960 It's hard to do.
02:08:59.020 You know, and so there's a real tension because, you know, as a leader, you have to
02:09:02.520 be disagreeable enough to do the hard things while still creating an environment where people
02:09:06.740 can open up and do that.
02:09:08.020 I would say I have mixed reviews on that topic because once I find things that are wrong and
02:09:15.500 people are doing the wrong thing, you know, I have to get to the bottom of it.
02:09:19.160 And I'm going to clash with some people.
02:09:24.060 It's never happened to them before.
02:09:26.560 Like, those people haven't really been taken apart, you know. They got A's in college and
02:09:30.980 they got good reviews and they rose to their Peter Principle incompetence point.
02:09:35.500 All of a sudden they're doing something they're in over their head on and they don't know what to
02:09:38.660 do about it.
02:09:39.220 They don't have a lot of practice.
02:09:41.360 So, yeah, it's a, it's a funny, it's a funny thing.
02:09:44.300 So, I'm going to close with a question about your current venture.
02:09:51.760 You're, you're, you're now working with a company that does AI computing and what do
02:09:56.360 you hope to do that you can talk about?
02:10:00.840 Well, so I was an investor in this company when it first started. Ljubisa Bajic, who's
02:10:06.320 the founder, worked with me at AMD, and I always thought he was an especially smart guy,
02:10:11.360 and I liked his approach to building AI computation.
02:10:16.220 I'm really intrigued about computers programmed by data, right?
02:10:21.140 I think it's more like how our brains work or our brains are really weird, right?
02:10:25.860 Because we think in this linear narrative, we have this little voice in our head, but
02:10:28.980 we know we have 10 billion neurons and they're clicking away, you know, exchanging, you
02:10:33.660 know, small amounts of neurotransmitters and electrical pulses.
02:10:36.320 You know, it's bloody hilarious, the gap between what a neuron looks like and what a thought
02:10:41.180 looks like.
02:10:43.140 And so, there's a really interesting opportunity to make big AI computers that are actually really
02:10:50.360 programmable.
02:10:51.700 So, one of the things we're doing is we're building the software stack that lets you build
02:10:56.740 the neural networks you want and then program and get the results you expect reasonably well,
02:11:01.540 as opposed to having a very large army of people tweaking it.
02:11:04.180 And so, there's a bunch of architectural, interesting things to do.
02:11:11.460 And then it's a startup, which, you know, we have chips that work, we're starting production,
02:11:15.760 we're going to start selling them.
02:11:17.580 You know, there's a whole bunch of work to do on how to engage with customers.
02:11:20.480 And a lot of customers we're talking to are super smart.
02:11:23.340 There's all these, you know, AI software startups with really smart people that, you know, have
02:11:28.480 some problems that, you know, basically, if the computers are a million times faster,
02:11:32.920 they're going to be easier to solve.
02:11:34.180 So, there's like a huge capacity gap on what they want to do.
02:11:39.440 So, participating in that is fun.
02:11:41.020 Like, I like that kind of thinking.
02:11:43.080 And your goal?
02:11:44.280 So, you go into an organization, you have a goal for the chips.
02:11:46.820 What's your goal for this organization?
02:11:49.380 Oh, we're going to be successful selling AI computers to a large number of people.
02:11:53.700 You know, significantly better performance, better programmability, lower cost.
02:11:57.960 And there's a bunch of innovation work to do around that, to make that really possible and doable.
02:12:06.700 Like, the AI field is relatively new.
02:12:09.220 The computers that run AI today are relatively clunky.
02:12:13.920 And, to me, you know, they need a lot of work and refinement so that you can get from the idea you want to express in the program
02:12:20.700 to the result you want, better and cleaner.
02:12:23.260 Okay, so, one final question for anyone who's listening who would like to pursue engineering as a career,
02:12:32.000 or let's say who wants to be successful within the confines of a big company.
02:12:36.400 What advice do you have for people?
02:12:39.760 What have you learned that you can sum up?
02:12:42.780 Yeah, Lex Friedman asked me that.
02:12:44.180 First, you have to, you know, you have to know yourself a bunch.
02:12:48.200 Like, what are you good at?
02:12:49.380 Like, you can't get really good at something you're not into and that you're not good at.
02:12:55.020 So, you have to have some natural talent for it.
02:12:57.960 And then you have to really spend some time figuring out what you like.
02:13:01.780 Like, I read this thing.
02:13:03.840 It was interesting.
02:13:04.860 Like, people think of college as expanding their possibilities.
02:13:08.040 And the university itself has so many options, you think that would expand your possibilities.
02:13:12.460 But once you pick one of them, and you study it for four, eight, ten years, you've narrowed your possibilities, right?
02:13:19.920 You're kind of stuck with your discipline, and you picked it at 20, which I think is crazy, by the way.
02:13:25.360 Like, I think if you want to be an engineer, a good general engineering degree, like mechanical engineering or electrical engineering will give you thinking skill sets.
02:13:33.300 I'm not a huge fan of people getting PhDs unless they really, really know they love it, right?
02:13:39.480 And then take some jobs where, you know, there's an opportunity to do something for a year or two and then do something else.
02:13:47.400 Like, I worked, my first job out of school was a random job.
02:13:51.440 But I worked on, like, five different projects in two years while I was there, you know, fixing hardware, building something, debugging something.
02:13:58.580 I learned a lot.
02:14:00.060 Then at Digital, I had many different roles, even though I stayed at that company for 15 years.
02:14:04.340 I wrote programs, I did logic design, I did testing, I did lab work, you know.
02:14:10.480 And so I got to see a lot of different things and get a feel for what I really liked.
02:14:15.580 And I worked with smart people that, you know, I had a lot to learn from.
02:14:21.600 Working hard when you're young is really useful.
02:14:24.920 You know, some people are like, well, you know, it's like the 10,000 hour problem.
02:14:28.680 And if you want to be an expert, you need to do that a couple of different times on different things.
02:14:33.060 And you can't do it unless you really love it.
02:14:36.320 A friend of mine's wife said, what do they put in the water?
02:14:38.560 All you guys do is talk about work.
02:14:40.240 Yeah.
02:14:40.660 So you figure out what you're competent at because you need that.
02:14:44.960 Figure out what you're interested in.
02:14:46.740 I mean, men and women seem to pick different occupations, not based on their competence, but on their interest.
02:14:51.440 And so interest is a very powerful, motivating factor.
02:14:55.860 Oh, and I've been in a lot of places where some of the best engineers were women.
02:14:59.520 So, you know, we know the numbers are smaller, but there's plenty of really great women.
02:15:04.760 Yeah, it certainly doesn't make it impossible.
02:15:06.520 It's just an indication of the, what would you call it, of the impact of interest as a phenomenon.
02:15:12.800 It's important as well as competence.
02:15:15.120 Yeah, maybe.
02:15:15.840 And so a diverse range of experiences.
02:15:20.280 Yeah.
02:15:20.660 Don't over-index on something before you're really, you know, sure that that's something that you're really going to like or be great at.
02:15:27.720 You know.
02:15:29.020 Don't be afraid to ask stupid questions if you don't know what you're doing.
02:15:34.700 Yeah.
02:15:36.840 Try to work with good people.
02:15:39.020 Work in good organizations. Like, if everybody hates the company you're working in, move somewhere else.
02:15:44.160 You want to work someplace where the energy is good, people are excited about what you're doing and why.
02:15:49.920 Like, sometimes you might be in a company that has, you know, something going wrong, but your group is, you know, going to change it.
02:15:56.060 That can be really fun.
02:15:57.440 But you need some camaraderie, some, you know, hope, right, where the goal is clear.
02:16:02.800 Right.
02:16:03.180 So that's an adventure.
02:16:04.720 You have a destination and the camaraderie along the way.
02:16:08.140 Yeah.
02:16:08.460 And there's, you know, there's so many places doing so many wild things.
02:16:11.760 Being stuck in a company you don't like that's going nowhere for 10 years, man.
02:16:16.160 You don't have that many 10-year stretches in your life.
02:16:19.940 You know, make sure you're actually getting, especially if you're young, different experiences.
02:16:24.160 Somebody said, you know, you either have 10 years of experience or one year of experience 10 times.
02:16:28.280 Right.
02:16:29.700 Now, sometimes you work on the same thing and you refine it and you become the expert.
02:16:34.960 But then you should feel like you're making progress and gaining expertise.
02:16:40.380 Right.
02:16:41.500 But if you're just kind of going through the motions over and over, doing the same thing,
02:16:44.240 then it's time to fire yourself.
02:16:46.680 If you're bored, you're not moving.
02:16:49.060 Right.
02:16:49.820 Like engineering is not boring.
02:16:51.640 Right.
02:16:51.860 It's relatively exciting.
02:16:53.880 Yeah.
02:16:54.000 I think that's actually a pretty good rule of thumb.
02:16:56.320 If you're bored, you're doing it wrong.
02:16:58.300 Yeah.
02:16:58.540 Something's wrong.
02:16:59.520 Yeah.
02:16:59.800 Something's funny.
02:17:00.380 Like, at AMD, we had this group that did tests, and it was kind
02:17:04.760 of dysfunctional. There were a couple of managers, and nobody liked it.
02:17:07.720 And at some level, the test engineering wasn't the hardest thing.
02:17:12.600 But I decided, that's stupid.
02:17:15.320 Why isn't our test group the best in the world?
02:17:17.940 So we got organized around it.
02:17:19.080 We had a really great leader.
02:17:20.060 We had a good team.
02:17:21.920 I told him I wanted it to be really great.
02:17:24.460 And I told the engineers to stop complaining about it.
02:17:27.660 If they had a problem,
02:17:28.280 they could come to me and we'd fix it.
02:17:29.420 And within two years, people were coming to me,
02:17:31.620 like, man, the test guys are killing it.
02:17:33.800 Yeah.
02:17:35.080 Yeah.
02:17:35.460 They went above and beyond.
02:17:36.760 They made it something of value.
02:17:40.020 You know, it was great.
02:17:41.120 It was super fun.
02:17:42.740 That's a really good place to end.
02:17:45.240 Cool.
02:17:46.260 Thanks, Jim.
02:17:47.260 Hey, good to see you, man.
02:17:48.580 Much appreciated.
02:17:49.480 Thank you for taking the time.
02:17:51.220 All right.
02:17:52.040 We'll talk soon.
02:17:53.280 Yeah.
02:17:53.700 Cheers.
02:17:54.060 Bye.
02:17:55.760 Going online without ExpressVPN is like not paying attention to the
02:17:59.360 safety demonstration on a flight.
02:18:01.320 Most of the time, you'll probably be fine.
02:18:03.300 But what if one day that weird yellow mask drops down from overhead and you have no idea
02:18:08.180 what to do?
02:18:08.980 In our hyper-connected world, your digital privacy isn't just a luxury.
02:18:12.800 It's a fundamental right.
02:18:14.100 Every time you connect to an unsecured network in a cafe, hotel, or airport, you're essentially
02:18:18.920 broadcasting your personal information to anyone with the technical know-how to intercept
02:18:23.020 it.
02:18:23.320 And let's be clear, it doesn't take a genius hacker to do this.
02:18:26.300 With some off-the-shelf hardware, even a tech-savvy teenager could potentially access
02:18:30.740 your passwords, bank logins, and credit card details.
02:18:34.060 Now, you might think, what's the big deal?
02:18:36.140 Who'd want my data anyway?
02:18:37.680 Well, on the dark web, your personal information could fetch up to $1,000.
02:18:42.100 That's right.
02:18:42.840 There's a whole underground economy built on stolen identities.
02:18:46.360 Enter ExpressVPN.
02:18:48.100 It's like a digital fortress, creating an encrypted tunnel between your device and the
02:18:52.100 internet.
02:18:52.800 Their encryption is so robust that it would take a hacker with a supercomputer over a
02:18:57.000 billion years to crack it.
02:18:58.440 But don't let its power fool you.
02:19:00.360 ExpressVPN is incredibly user-friendly.
02:19:02.600 With just one click, you're protected across all your devices.
02:19:05.620 Phones, laptops, tablets, you name it.
02:19:07.800 That's why I use ExpressVPN whenever I'm traveling or working from a coffee shop.
02:19:11.920 It gives me peace of mind knowing that my research, communications, and personal data
02:19:16.060 are shielded from prying eyes.
02:19:17.660 Secure your online data today by visiting expressvpn.com slash jordan.
02:19:22.660 That's E-X-P-R-E-S-S-V-P-N dot com slash jordan, and you can get an extra three months free.
02:19:29.040 Expressvpn.com slash jordan.