00:00:00.200We don't realize that we incrementally gave up on the idea that our lives could be filled with an exciting parade of major innovations.
00:00:12.300I see AI and I find it terrifying. I find it terrifying the effect it's going to have on people's jobs, the fact that it's going to obliterate a lot of jobs.
00:00:21.160What effect is that going to have on society? Am I right?
00:00:24.800You just lost your capitalist model. You have a fabric that is overlying the entire world that directs people whether to get up in the morning and what to do once they do rise.
00:00:37.060And it tells them how to do things without having a dictator. So it's not just the invisible hand, it's the invisible mesh.
00:00:45.480And this invisible mesh, if it breaks, means people are not going to know what to do in the morning.
00:00:50.220All hell's about to break loose. And my question is, are you trying to figure out which way the wave is going to break and get your surfboard in the water?
00:01:00.100We should be sitting around talking about what is the new economics.
00:01:04.260Instead, what we're talking about is how do we keep communism at bay while capitalism is struggling?
00:01:20.220Hello and welcome to Trigonometry. I'm Francis Foster.
00:01:25.160And this is a show for you if you want honest conversations with fascinating people.
00:01:30.320Our brilliant guest today is a physicist. He's one of the smartest people in the world and a friend of the show, Eric Weinstein. Welcome to Trigonometry.
00:01:36.500Hey, it's great to be here. Thanks, guys.
00:01:38.020It's really good to have you on the show, Eric. It's been a long time coming. So exciting to have you here.
00:01:42.840But listen, you know, for a lot of people who watch our show, you will be one of the IDW guys who was part of this sort of weird emerging thing a few years ago.
00:01:53.060And they won't actually know who you are exactly. They'll know you for the things you say.
00:01:57.920What has been your journey through life that leads you to be sitting here? Because you've done a lot of interesting things.
00:02:03.120Right. So sometimes I try to avoid being known to the audience.
00:02:06.780I would say that, by profession, I started off trying to do physics, but realized that physics was in a terrible situation.
00:02:15.200So I ended up doing the mathematics that allowed me to shadow the physics that I wanted to be doing.
00:02:21.360In a certain sense, a lot of who I am is a person who has believed deeply that the master narratives that govern our time have been getting weirder and wackier.
00:02:33.020And so I started, I would say, late 80s, early 90s, really exploring the idea that our institutions are much farther gone and much stranger and weirder than anyone expects.
00:02:44.760And because that has such a high social cost, or at least had such a high social cost, when you explored it at that time, it was kind of an open world.
00:02:54.420And I would say that people who had been particularly active in progressive politics were about the only people who believed that a lot of these structures were really far decayed, that the narratives were wildly off of what was actually happening.
00:03:07.500And I've tried tracking that, I would say, through economics, labor markets, financial instruments, political skullduggery, science, the ways in which the military complex interacts with all of these things from art to physics to news.
00:03:27.100And beyond that, I mean, my personal life is something that I really probably care about even more than any of this, and all of these actions are really because I have children on this planet, and I'm deeply concerned that they have a happy, optimistic, and positive future.
00:03:44.980And I have to, I feel personally responsible for clearing away a lot of stuff that we're not supposed to talk about, because as we are increasingly seeing, we grew up thinking we were in a free society, but that is actually governed by these incredibly strong narratives that are clearly untrue.
00:04:01.040And they're very difficult to source as to why is it that so many people pretend to believe things that no one can believe in an individual instance.
00:04:08.720I hear you, and particularly on the feeling of responsibility now that I've become a father as well.
00:04:21.900What was wrong with physics that made you do math?
00:04:23.320Same thing that's wrong with physics now.
00:04:24.760I mean, the fact is, this is the 50th anniversary of a couple of developments, one of which was February 1st, 1973, called the Kobayashi-Maskawa augmentation of the Cabibbo angle, which introduced three families of matter into the Standard Model.
00:04:42.060But that picture of the matter in this room, who and what we actually are as waves propagating through the space-time that Einstein gave us, that model has been stagnant for 50 years.
00:04:54.280And as I was just saying on Joe Rogan, if you think about songs from that period of time, like Crocodile Rock or Tie a Yellow Ribbon Around the Old Oak Tree, imagine that playing on a continuous loop for 50 years without any real progress in the underlying understanding of the world.
00:05:09.660That's a catastrophe because it means that you replace all of the people who knew what science was with a group of people who will tell you, well, in science, this is the way things go.
00:05:20.520Yet they have no understanding of science, having never contributed.
00:05:29.460Why has it happened, furthermore, across multiple fields?
00:05:32.300Why is it that, for example, evolutionary theory in terms of the sexual and natural selection theories, why did that stagnate when it tried to go into sociobiology and ran into political problems?
00:05:44.320Why is it that neoclassical economics hardened into dogma?
00:05:48.420In all of these situations, we have the fact that there was something that was going on, late 60s, early 70s, that ossified.
00:05:57.980And except in the fields of computation and communications, things largely stagnated.
00:06:04.640And it's hard to think about, because so much of our lives are digital. Thank God we had Moore's Law and these explosions in computation,
00:06:13.440but we almost don't notice that the feeling that the world is moving at breakneck pace, at breathtaking speed,
00:06:24.960derives almost entirely from our digital lives and the innovations in software, computation, and artificial intelligence, all of those things.
00:06:34.360But oddly, that neon sign, that poster, the fact that we're filming in a studio, all of those things were possible in 1973.
00:06:50.240So, I mean, that says a lot about not only science, but it says a lot about society.
00:06:55.980Because if society is not innovating, it's effectively dying, isn't it?
00:06:59.980Well, again, it's not that we're not innovating at all.
00:07:02.620But if you think about, you know, the jetpacks, for example, that were featured in a James Bond film in the 1970s, we always wondered, when were we going to get personal jetpacks?
00:07:14.920We're still dealing with the idea that they're quite hard to stabilize.
00:07:18.260If you go to the airport in Los Angeles, where I'm from, there's a completely space-age futuristic building that dominates the architecture.
00:07:27.140It's still the most futuristic thing in the city.
00:07:30.680I mean, we somehow took on a very different perspective on the future.
00:07:36.640I just was dealing with somebody involved in the relaunch of the DeLorean Motor Company.
00:07:40.500And 40 years later, the DeLorean is still absolutely something that excites us because it is new.
00:07:48.420And I just find this fascinating that we don't realize that we incrementally gave up on the idea that our lives could be filled with an exciting parade of major innovations.
00:08:01.280And I was just on stage in Miami at Bitcoin talking about the three white papers that changed the world.
00:08:07.220And one of them was Bitcoin, the blockchain white paper from 2008.
00:08:13.500One of them was something called Attention is All You Need from 2017, which changed the large language model landscape.
00:08:20.200And one of them was the 2018 proposal from the EcoHealth Alliance that we should start experimenting with furin cleavage sites and spike protein and coronavirus.
00:08:33.400Eric, you know, one of the interesting things that you mentioned there is something that I think is part of this whole thing that confronts all of us who are trying to think about these things,
00:08:43.680which is our vision of our future and of ourselves is fundamentally changed.
00:08:48.720And I remember as a boy growing up reading science fiction about the great challenges that humanity would face as it expanded into the universe and how when you create a spin-off of your civilization on a different planet or as you introduce robotics, as you introduce this, new challenges come along.
00:09:08.140And it was almost taken for granted in that era, having just watched, you know, first of all, my guys launch a man into space and your guys put one on the moon and all of that, that this would continue, that these vast breakthroughs in human achievement would continue.
00:09:23.380And now we sort of squabble about tax rates and stuff like that.
00:09:34.080Did we put a man on the moon, or did we put a man on an ICBM and say it was a moon mission?
00:09:39.620At some level, that was also a giant head fake, right?
00:09:43.700Because we knew that there was almost nowhere to go.
00:09:46.080I mean, basically, there's the moon and there's Mars and then you're out of range of anything interesting with chemical rockets.
00:09:51.600And I can even ask, you know, if you've ever been to Joshua Tree in Southern California, you have an idea of what it might be like to be on Mars.
00:09:58.700It's beautiful, but it gets old pretty quickly.
00:10:01.820I think in part that that's not the issue.
00:10:04.440The issue is what happens when you take a moonless night and you go out and there isn't a cloud in the sky and you lay on your back, maybe during a meteor shower, and you gaze up at the heavens and you think, why is it that Uruguay is on my bucket list when I can see the heavens?
00:10:25.580And I know that some of those are stars and some of them are galaxies.
00:10:30.140That's where we're supposed to be dreaming.
00:10:32.240And the only way to get out of here and to go find that and find out what the universe is, is physics.
00:10:37.740So, while many things stagnated, the unforgivable thing that's stagnated, the singular unforgivable thing, is our understanding of the most basic notion of who we are.
00:10:48.480And we've lost the taste; we're in the process of not only stagnating in physics, but killing the impetus to solve these problems. There's a new kind of ethos that says that to ask for an ultimate theory is immodest.
00:11:04.480It's destructive, it is fundamentally imperious. That really begins to scare me, that we start thinking about these things in terms of very personal negative characteristics, of arrogance, of hubris.
00:11:20.160And you understand where it comes from, right, because we did unlock this power.
00:11:25.180And in particular, in 1952, and forget about the actual atomic bombs dropped on humans, tragic as they may be, but the potential tragedy of hydrogen weapons, with that we came to understand we're really good at this.
00:11:43.900We're really incredible, and we have to watch ourselves, and I think that that's the right ethos: we need to worry about ourselves.
00:11:52.480But to stop ourselves, we've now crawled into the valley of death, and the trick is to get out to the other side, to get to the cosmos, and to start to feel that we're being invited to the world's greatest adventure.
00:12:07.780Well, people might say, and I mean, I have some sympathy with this argument, you made it yourself only a few minutes ago, which is, the pace of change is such that we are rapidly developing technologies that are breaking the world around us.
00:12:25.720If you look at the impact social media has had on the way that human beings communicate, I mean, nuclear weapons is another example, of course, but you could give others where the technological progress we make is so disruptive to our world that I don't blame people who think, why don't we just slow down a bit?
00:12:46.320And tell me something, when your wife's water broke, what was your sense of like, oh, shoot, we've got to stabilize the situation, how do we make sure that our child can stay in here forever?
00:13:03.600Yeah, what could happen next could be absolutely deadly.
00:13:06.880But to not understand that it is now time to call the hospital, to get that bag together, to run like hell, to care about those less fortunate than ourselves because they may be incapacitated, I don't think we're understanding what this moment is, for example.
00:13:25.320The idea that I believe it's four amino acids and 12 nucleotides that shut down planet Earth.
00:13:31.680Whether or not that came from a pangolin or a laboratory does not matter.
00:13:38.000It's a tiny change that led to virality.
00:13:41.840This is the leverage level that we're now talking about, where a tiny change in the world, with a large enough lever, moves everything.
00:16:13.400And this invisible mesh, if it breaks, means people are not going to know what to do in the morning.
00:16:18.960And now you can see that this is going to break it.
00:16:21.140Now, you have a brief period of time, which my wife has a term for.
00:16:26.200She's an economist with the Institute for New Economic Thinking.
00:16:29.300She says this is the golden age of AI complementarity, where a human being making prompts can ask the large language model or neural net, whatever, whatever you like, questions.
00:16:42.240And the two of them in dialogue can create something.
00:16:44.700It's sort of like when humans and computers started playing chess together.
00:16:48.060This is going to quickly give way to where the AI says, I can take it from here.
00:16:54.160And how quick do you think it's going to be, Eric?
00:16:59.420But keep in mind that there's some things that could happen.
00:17:05.120Because what these machines are doing is reading a human corpus when it comes to language models, let's say.
00:17:12.720It could be that they asymptote based on how clever we've been.
00:17:16.780For example, they don't do a very good job in areas where there are fewer than 200 people writing about a very high-level scientific subject.
00:17:23.720So I ask these computer models a lot about determinant line bundles, not something in general conversation.
00:17:32.000And they're terrible at it because they don't know what to read or how much.
00:17:37.140So it's possible that that could asymptote.
00:17:40.320But the thing you really have to keep in mind is that a clever person at one of these AI outfits might figure out how to teach computers to do things that no one has done.
00:17:56.120So we know that they have emergent behaviors, that you may not teach them Bengali, but they realize they have to learn Bengali in order to tell you about Tagore.
00:20:03.740But his whole narrative about getting off the planet and diversifying the number of places where the spark of human consciousness can be found without having all of our eggs in one basket, 100% right.
00:20:13.740And then it becomes a SpaceX pitch for Mars, which loses me.
00:20:17.380But then again, I don't own a chemical rocket company.
00:20:19.280In the case of AI, you know, the fact is he was seeing this early, and he saw Sam Altman make these other decisions.
00:20:27.360They were all at this, you know, Puerto Rico conference. Actually, I think Sam wasn't there.
00:20:30.920But all of the folks who were at the Puerto Rico conference knew that this was coming.
00:20:35.840Now, 2017 is the dividing line, I think, in AI, because this paper, Attention is All You Need, makes it clear to all of us, not through the paper, but through the consequences,
00:20:46.880that this is as real as a heart attack.
00:21:15.840If I give it a piece of typeset mathematics and ask it to generate the code that would give that, it doesn't seem to know how to do that yet.
00:21:24.140It doesn't seem to know how to do very specialized quantum field theory questions.
00:21:29.420Generically, I asked it, you know, I was told that it doesn't have a social intelligence.
00:21:35.540And that may be, but I asked it, you know, does this query make my ass look fat?
00:22:43.900Because art is an expression of the human soul.
00:22:45.960You can continue to express the human soul.
00:22:47.920But a different soul might express itself in a way that dwarfs your contribution.
00:22:51.920You have to ask yourself: does that affect how you feel about your own output?
00:22:54.540Or, for example, if it writes a beautiful love letter, but you know that it is soulless in your terms because it's a bit of linear algebra with nonlinear function theory thrown in, does that change the meaning, the poignancy of the words?
00:23:33.020That song was a failed song for multiple tries.
00:23:36.080In fact, Berry Gordy, I believe, said that the next person who mentions that song gets fired.
00:23:40.140Gladys Knight almost got that song to the point of stardom because the Pips had this thing where it's, oh, yes, I am, yes, I am.
00:23:46.300And it would echo as it would get fainter and fainter.
00:23:48.580It was genius, but it wasn't as cool as when Marvin Gaye was given the same song a half step or a whole step above where he could sing so he was forced to reach and stretch.
00:23:58.060And then you have the Halloween violins at the beginning that set the tone for this thing, right?
00:24:59.640I don't know whether it's the chord progression, which the thing can clearly do, or the fact that I believe that Gaye is straining and feeling what he's doing.
00:25:09.260I don't know whether it's the fact that this seems like a human expression, but he didn't write the song.
00:25:15.240It was somebody else's song; he animated it.
00:25:17.200So it really has to do with the context.
00:25:21.320And my claim is maybe this thing will come up with a better lyric or a better chord progression, but it'll matter less to you because it didn't come from a human heart.
00:25:29.880I'm saying that you're going to have to start picking apart the essence of each song or poem or story to figure out whether you continue to feel that it's art if it was written by linear algebra.
00:25:44.020What if we go outside of the realm of art and into the more practical and tangible things?
00:25:49.980I mean, you talk about the end of capitalism, and that makes sense.
00:25:52.680And if you look at, you know, communism or fascism or whatever, they're all responses to the economic circumstances that they encountered, right?
00:26:02.280So they're a way of trying to deal with the fact that you've got an industrial revolution, and suddenly everyone goes into factories, and now these people are there, and they're now in cities, and all of it comes out of that, right?
00:26:14.540So we're going to have to have a new whatever.
00:26:45.980You know, you can watch TV, you can hang around in the pool, and suddenly, you know, maybe your wife is looking at you thinking you're kind of useless.
00:26:55.020I kind of liked it better when he was going out and killing mastodons and dragging them back to my cave.
00:27:01.200And now you've got a new need, which is, okay, should we make some work for you that doesn't need to be done?
00:29:16.660But I've learned to respect these jobs so much more when I realized, oh, my God, my audience is studying spinors and differential operators in between, you know, Uber assignments or something like this.
00:29:31.340What do we do with those people who, for whatever reason, are not that way inclined, Eric, who can't access these type of technologies or these type of industries?
00:29:42.580Because that's going to prove a real problem.
00:29:44.400Well, first of all, why don't we create some art for them that excites them?
00:29:47.580Why don't we use art to reacquaint ourselves with the possible?
00:29:53.020I really feel like the artists have fallen down on the job.
00:29:56.180I feel like today's art is, in general, not reflecting our time.
00:30:00.580And as a result, it's becoming less and less relevant.
00:30:03.120And then people get more and more adamant that, no, art has never been more relevant because they know that it's become less and less relevant.
00:30:08.360And what do you mean by that, that it doesn't reflect our time?
00:31:13.480You know, but when Drake says, you used to call me on your cell phone, he's trying to take the fact that there is some way, you know, are you going to swipe right by me, baby?
00:31:26.060But then again, the Erie Canal, you know, it's not like it's the high seas and, oh, the Erie was a rising and the gin was getting low and I scarcely think we'll get a drink till we get to Buffalo.
00:32:38.920But on the other hand, we have a barbell of attention.
00:32:41.860We have the inability to get through a longer tweet, and the ability to watch Game of Thrones, with the longest character development anyone's ever seen,
00:32:50.580far dwarfing a movie and approaching, you know, what you do in spectacular fiction.
00:32:55.860We have to recognize that just as attention deficit disorder actually contains the ability to concentrate on something for months, you know, with laser focus.