Jaron Lanier - The Truth About Social Media
Episode Stats
Length
1 hour and 12 minutes
Words per Minute
178.8
Summary
Jaron Lanier is an American computer scientist, visual artist, computer philosophy writer, technologist, and futurist. He's considered one of the founders of the field of virtual reality. In this episode, we talk to him about his views on AI, virtual reality, and social media.
Transcript
00:00:00.000
it puts everybody into this behavior modification regime that even if it doesn't do what it's
00:00:05.340
supposed to, it has all these side effects. It makes everybody a bit more vain and crazy and
00:00:10.440
stupid and contentious and paranoid. That does happen. It's really good at that. And because of
00:00:17.400
that, it becomes this giant invitation to the worst actors in the world to try to go in
00:00:30.000
Hello and welcome to Triggernometry. I'm Francis Foster. I'm Konstantin Kisin. And this is
00:00:56.120
a show for you if you want honest conversations with fascinating people.
00:01:01.420
Our brilliant guest today is somebody we've been trying to get onto the show literally
00:01:05.140
since the day we started. He's an American computer scientist, visual artist, computer philosophy
00:01:10.220
writer, technologist, futurist, and composer of contemporary classical music as well. He's
00:01:15.560
considered one of the founders of the field of virtual reality. Jaron Lanier, welcome to the show.
00:01:21.800
It's going great. Listen, man, before we get into what we hope is going to be a brilliant
00:01:26.680
and fascinating conversation, tell everybody a little bit about your story. Who are you?
00:01:31.800
How did you get where you are? What has been your journey through life that leads you to be sitting here?
00:01:37.140
Oh, gosh, you know, I find that to be a confusing question.
00:01:40.120
Well, let's see. I'm known for a few different things and to different communities of people.
00:01:50.040
Some people only know me one way or the other. I approximately started virtual reality. I came
00:01:57.900
up with the term and I had the first startup and pioneered a lot of the apps and devices and
00:02:02.280
whatnot a long time ago in the 80s when I was a kid in my wasted youth. I write books that are
00:02:11.160
critical about technology. I'm very concerned that we have a responsibility as computer scientists to
00:02:16.340
think about what we're doing to the world. And when it's not working, we have to be honest about
00:02:21.440
it. And so I've written a bunch of books. The best known one might be Ten Arguments for Deleting
00:02:28.440
Your Social Media Accounts Right Now. But another one is called You Are Not a Gadget.
00:02:33.940
And I'm very skeptical of the idea of artificial intelligence. I'm very skeptical of the business
00:02:38.340
models for social media and other things like it. I'm very concerned about them. And then
00:02:44.120
yet another thing is, as an actual computer scientist, I do all kinds of things. We don't
00:02:48.920
have to get into that because it'll put you to sleep, but it's actually extremely important and
00:02:52.300
profound. And then I do other stuff too, but I think that's enough to get going.
00:02:58.440
Sure. Well, you've covered three things there that Francis and I are really keen to talk to you
00:03:04.380
about, which is AI, virtual reality, and of course, social media. I think we best start with social
00:03:09.760
media because I feel like of all the different stories we've covered on the show, the different
00:03:14.900
guests we've talked to, the polarization, the tone of our public discourse, the behavior of children in
00:03:23.320
terms of their self-image and their body image. I mean, any issue that is affecting modern society,
00:03:29.780
really, there is a role that social media has played. And by the way, you know, Francis and I
00:03:35.280
are both quote unquote content creators. I love using Twitter, for example, to talk to, you know,
00:03:40.480
hundreds of thousands of people on a daily basis. I think it's an incredible piece of technology,
00:03:45.020
but we're also paying a heavy, heavy price for some of these benefits. Talk to us about your broad
00:03:51.840
take on social media. You talk about the business model and the problems with that. What are some
00:03:56.500
of the impacts that we're seeing in the world, in your opinion, and where they're coming from?
00:04:01.320
Well, okay. The first thing I want to say is that in the biggest picture, something similar to social
00:04:07.920
media, as we know it, would be great. Like I worked really hard to help get the internet to happen,
00:04:14.780
which is a story we can talk about if you want, what my role was. But I still believe in the idea
00:04:19.760
of a more connected world. I still believe in the idea of people being able to have reach in new ways.
00:04:27.580
All of that's great. The problem that came up is the business model, which had a perverse effect
00:04:33.420
of promoting the worst sides of human nature. Okay. So it's a business model problem more than a
00:04:40.600
technology problem. And I really want people to remember that because a lot of times somebody will
00:04:45.040
come up to me and say, well, there's this or that about whatever social media platforms they like
00:04:50.160
that I love. And how can you say that I should quit it and that it's all horrible? And I'll say,
00:04:54.100
you know, I never told you to quit. I gave you arguments. I want you to think. I want you to be
00:04:58.900
responsible for your choices. But the good stuff is authentic. Like it would be absurd to argue that
00:05:06.520
the good stuff is inauthentic because that's ridiculous. I mean, it's plain in front of our faces
00:05:11.780
that they're good things. My personal favorite one remains an oldie but goodie, which is people
00:05:17.900
with unusual diseases can find each other and share notes. And that never used to be possible. That to me
00:05:23.520
is a great concrete example of something that the technology has brought into the world that
00:05:29.220
otherwise would not be in the world that has kept people alive. I love that. All right. So I mean,
00:05:34.600
it should be obvious. However, what happened was, for a lot of really stupid reasons,
00:05:42.660
the early internet companies ended up with this bizarre business model. And the business model
00:05:47.860
is you pretend to be socialist in the user's experience, but the way you're capitalist is
00:05:55.240
fake capitalist too. You end up with the worst of both worlds instead of the best of both worlds.
00:05:59.140
So you're fake socialist in the sense that you say, oh, sharing is great to share your information
00:06:04.800
and we'll give you free stuff. But the problem is there has to be a customer somewhere because it's
00:06:10.640
still a world driven by business. And so the customers who come in think that what they're
00:06:15.740
buying is mind control. They think that what they're buying is so much access to user data
00:06:20.520
and so much access to giant algorithms that can predict user behavior that they can just pour money
00:06:26.480
into Alphabet or Meta, Google or Facebook, and out will come massive behavior modification that will
00:06:34.040
throw elections or get people to buy products or whatever it is, or commit to a religion, all kinds
00:06:41.160
of crazy stuff, or just cynically bring about a quicker end to civilization or something, whatever the
00:06:49.760
person's into. The thing is, a lot of that stuff doesn't even work. I've been around the meetings
00:06:57.020
where the big companies sell the biggest clients on pouring money into this. And honestly, I don't think
00:07:04.980
you get a lot of mind control. I do think you get two things though. I think you get just a few big
00:07:12.800
companies holding the ability to blackmail everybody by controlling access of everybody
00:07:19.480
to everybody. So in other words, it's just the opposite. What the internet was supposed to be is
00:07:23.400
connecting everyone. And instead we have universal blackmail of forcing people to behave in certain
00:07:29.420
ways to get connected, which is the opposite of what was supposed to happen. So that's stupid.
00:07:34.780
But then the other thing is that even if the mind control to get somebody to buy a particular soap
00:07:41.060
is pretty sketchy, and I think a lot of times the data to support it is sketchy, what it does do is
00:07:46.840
it puts everybody into this behavior modification regime that even if it doesn't do what it's supposed
00:07:52.480
to, it has all these side effects. It makes everybody a bit more vain and crazy and stupid and contentious
00:07:59.380
and paranoid. That does happen. It's really good at that. And because of that, it becomes this giant
00:08:06.540
invitation to the worst actors in the world to try to go in and sabotage other people's societies.
00:08:12.320
So you have the lovely Mr. Putin having these basements in St. Petersburg create fake black
00:08:22.980
activists to try to make the black political movements in the U.S. more radical in order to
00:08:27.300
destabilize American society and make it kind of nutty. That's documented. I'm just using that example
00:08:32.680
because it's well-documented. There's many, many others that are as well. And so you overall have
00:08:40.940
this thing that darkens human prospects. It makes everybody, like I say, just a little like a lesser
00:08:46.340
version of themselves, a little more vain, a little more immature, a little more paranoid, a little more
00:08:50.760
ornery and irritable. And all of that is in service of a business model that doesn't really deliver a lot
00:08:58.540
of real value, but just creates artificial impediments to what would otherwise happen,
00:09:03.000
in my opinion. So I think it's a massive, massive screw up. And it happened.
00:09:11.680
You know, I was around Google when it was very, very small before Facebook existed. I sold them a
00:09:18.220
company. I used to know Sergey and Larry. They were so cute, so full of energy, so optimistic.
00:09:23.400
And the thing is that the surrounding social circumstance was incredibly adamant about two
00:09:33.520
different needs that were totally in contradiction with one another. On the one hand,
00:09:38.580
there was this leftist feeling because it's the Bay Area in California and everybody's supposed to
00:09:43.640
share. The internet is finally a chance to get rid of the evils of capitalism. But on the other hand,
00:09:49.460
we love our business hacker heroes. We love our entrepreneurs. We love our Steve Jobs, you know.
00:09:56.780
And so you're supposed to be this great business person, but you're supposed to hate business.
00:10:01.920
You know, like, how do you get those things to work? And the way to do it is by being fake,
00:10:06.200
where you create fake socialism as an experience and fake capitalism on the back end. You don't get,
00:10:12.720
like, whatever you believe about either of them, you know, this is the worst of both of them.
00:10:19.080
It's like the worst possible solution, but in a way, it's the only one the social pressures of the time would allow.
00:10:26.400
Jaron, can you dig in for us into that particular thing a little bit more?
00:10:30.500
What is the, I understand that me as a Twitter user, I'm not the customer, it's the advertiser
00:10:36.220
that's the customer. But beyond that, how is the business model making us different?
00:10:42.340
And when you talk about fake socialism and fake capitalism, dig into that a little bit more for us.
00:10:48.920
Yeah, sure. So there's a, we have to go a little bit into the science behind behavior modification.
00:10:56.420
So this goes backwards, it goes into the 19th century with Ivan Pavlov, yet another Russian. I see you
00:11:03.840
Well, I'm from Russia originally, so that's my heritage.
00:11:06.000
Okay. And I'm from Venezuela, so I've seen socialism done properly.
00:11:16.000
Okay. Excellent. Excellent. So we're all extremely well informed here.
00:11:24.020
Yeah. Pavlov, talk to us about Pavlov and his beautiful dogs.
00:11:29.100
One of the early behaviorists. And so these are people who said, we're going to do a methodical,
00:11:34.840
mathematical, scientific approach to understanding how training works. So, of course, people have
00:11:40.900
been training animals, and for that matter, other people since ancient times. There's nothing new
00:11:45.640
about that. But this was saying, okay, we're in the scientific age, we're going to do it methodically.
00:11:50.260
And so Pavlov started to experiment with like, you're going to put a creature in a cage and control
00:11:58.660
exactly what stimuli are available and exactly what outcomes are available. So you can really get
00:12:03.960
focused information about what works and what doesn't work to understand behaviorism. In the 20th
00:12:10.240
century, probably the most famous heir to Pavlov was B.F. Skinner. He was doing the same thing. In
00:12:18.160
Skinner's case, mostly with pigeons and rats, you put them in a cage. There are a few very
00:12:24.820
interesting things that came out of this methodical approach to training. One thing is that it can
00:12:33.920
get out of control. So if you put an animal in a cage and you say, if you hit this button, you'll get
00:12:41.260
a treat. If you hit this other button, you'll get an electric shock, let's say. They'll just sit there
00:12:47.500
hitting the treat button. What happens is it changes the nature of an organism or a creature from one
00:12:54.680
that has multi-level behaviors that can respond to different circumstances and can learn to one that's
00:13:00.040
very simple, just stuck on a little loop. Okay, that's a very important thing to observe. All right,
00:13:06.120
that's one thing. Another thing that's extremely interesting is that if you want to deeply embed a
00:13:16.540
behavioral pattern in the creature, instead of just having perfect feedback, where when you hit the
00:13:23.860
button, you immediately get the candy every time, you add a bit of randomness, or it might take a while
00:13:29.560
longer. And this bit of randomness, what it does is it absorbs the brain's ability to adapt and
00:13:36.940
focus it entirely on this, because the brain is like, well, what's going on? What's going on? And so it
00:13:41.640
actually strengthens the conditioning. Okay, so this is sometimes called operant conditioning. There
00:13:47.680
are many variations, and I could flip into technical talk to create many shades of distinction, but
00:13:53.640
I think I'm giving you a fair summary of how this stuff works.
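To make that concrete, here is a toy Python simulation, purely illustrative and not from the conversation: an agent presses a lever during training, then rewards are cut off, and it gives up once the dry spell exceeds anything it saw in training. Under continuous reward a single failure is surprising, so extinction is immediate; under an intermittent schedule, long dry spells look normal, so the behavior persists.

```python
import random

def train_and_extinguish(p_reward, n_train=1000, seed=0):
    """Train a 'lever press' under reward probability p_reward, then cut
    rewards entirely and count presses until the dry spell exceeds anything
    seen during training (a crude 'give up' rule)."""
    rng = random.Random(seed)
    # Training phase: track the longest run of unrewarded presses experienced.
    longest_dry_run, current = 0, 0
    for _ in range(n_train):
        if rng.random() < p_reward:
            current = 0                       # a rewarded press resets the run
        else:
            current += 1
            longest_dry_run = max(longest_dry_run, current)
    # Extinction phase: rewards are gone. Pressing persists until the dry
    # spell is clearly outside anything the training schedule ever produced.
    presses = 0
    while presses <= longest_dry_run:
        presses += 1
    return presses

print("continuous reward (p=1.0):", train_and_extinguish(1.0), "presses before quitting")
print("intermittent reward (p=0.3):", train_and_extinguish(0.3), "presses before quitting")
```

The numbers are arbitrary, but the asymmetry is the point: adding randomness to the reward makes the conditioned behavior far harder to extinguish.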
00:13:59.600
The people who studied this had different opinions about how it should be applied to human beings.
00:14:06.820
Okay, now, at the dawn of computers, one of the founding computer scientists
00:14:13.640
wrote a book called The Human Use of Human Beings. And this was a book pointing out that
00:14:24.560
with the advent of computers, you could automate this whole thing instead of having white-coated lab
00:14:30.380
researchers doing it. And there's a danger that people could make machines that would just make each
00:14:36.240
other trained in a way that they'd become crazy and very narrow, and it could really be extremely
00:14:41.100
dangerous for civilization. That was a book in the 50s. Sadly forgotten, Human Use of Human Beings.
00:14:51.660
But then we hit the 60s, and something really interesting happens. It's famously known that a
00:14:58.300
little into the late 60s, the beginning of the technology of the internet happened,
00:15:03.060
funded in the United States by military research. It was trying to come up with communications
00:15:09.060
networks that could survive a nuclear attack by being very adaptable, right? But before that,
00:15:15.560
there was a cruder technology for networking that was the first time computers were actually
00:15:19.960
networked at distances for people to use. And this was an experimental network that ran only between
00:15:28.200
universities in the American Midwest. And guess who got the job of designing the user experience in
00:15:34.760
it? None other than B.F. Skinner. So the very first network experiment with real people was by a
00:15:42.320
behaviorist. And his attitude, I think, was quite wrong, really wrongheaded. What he thought
00:15:48.040
is that if you could get everybody onto computers that are networked, and you could put everybody under the
00:15:55.620
influence of these algorithms, you could engineer a society that would become stable and productive and
00:16:03.580
happy and perfect. That's the right reaction. It's funny, right? It strikes us as a laugh line. It
00:16:15.260
seems ridiculous. But there are a couple of amazing things about it, which is that B.F. Skinner was on
00:16:21.380
the Western capitalist side of the Cold War. And yet what he was saying was not unlike what was
00:16:27.680
happening on the other side. This is exactly what people in places like East Germany and
00:16:33.340
North Korea and many other places have tried to do using other means to sort of put people under enough
00:16:39.600
behavioral control and sensory control that you can engineer this perfect, coherent society to your liking.
00:16:45.880
And jumping ahead a little bit, Jaron, it's exactly like what China is doing now to its own citizens, as we know,
00:16:50.660
right? Right. And unfortunately, I think China learned some tricks from our tech companies in
00:16:55.340
this case, something that I find shameful. Yeah. But anyway, sorry for interrupting you. Carry on.
00:17:01.300
No, it's right. It's right. So look, anyway, so Skinner did this experiment. It didn't work particularly
00:17:05.900
well. It was a tiny crude experiment, mostly forgotten, actually. So then the internet happens
00:17:17.400
initially between military sites and universities. But this is the radical 60s. So there's this kind
00:17:22.580
of hippie thing, which, hey, listen, I'm part of it. I'm a hippie. I'm a California hippie, proud of it.
00:17:29.920
But at any rate, there was this hippie sensibility that was anti-capitalism, not exactly pro-Soviet or
00:17:37.080
pro-the other side, but definitely in this other space and somewhat naively utopian on many levels.
00:17:45.380
I say that as somebody who's tried to live in a 60s commune and whatnot.
00:17:50.080
It's hard. People are hard. You can't just pretend people are easy when people are hard. I mean,
00:17:54.160
this is a fundamental issue that the left never gets, right? But anyway, so then when companies
00:18:06.200
like Google show up, the rhetoric around computer culture, which is, and computer culture is
00:18:12.100
incredibly powerful. I mean, it's not, this has never been an entirely top-down thing. This is a
00:18:17.600
community cultural effort. And computer culture is anti-capitalist on the surface, but as I say,
00:18:24.040
also pro-capitalist, as long as you're a hacker hero entrepreneur, you know. So it doesn't make sense.
00:18:29.920
And it doesn't, we were absolutely unwilling to take three seconds to step back and notice our own
00:18:35.540
contradictions, right? It's ridiculous, you know, very human, I think. So Google was born with this
00:18:45.100
ridiculous hippie illusion in front, even though it's actually a company. And I have to say,
00:18:51.540
I'm putting it this way because I'm pretty sure that the original people who made Google
00:19:03.640
would have been just as happy to do it differently, but the community pressure was so intense that they
00:19:08.540
weren't really given a practical choice. It was like this very intense dogma. Like as an example,
00:19:16.300
I'll say some things here that might upset some people who are listening, but like in the
00:19:21.380
early days, 20 years ago, or whatever, but more than that, actually, like the seeming good guys
00:19:29.860
who had nothing but very high self-regard, like the pirate parties in Europe, were feeding Google
00:19:36.340
business while pretending to be anarchists and anti-capitalists. And that was plainly true,
00:19:42.620
obviously true to everybody. They were even funded by Google in many cases. And yet somehow everybody
00:19:48.480
just ignored this. And I think it's an incredible testament to how people will be happily ready to
00:19:55.740
lie to themselves if it's comfortable. And I think that's universal. That's not just a problem
00:20:01.380
with this particular community. It's a universal human quality. And it sneaks up on us all the time.
00:20:06.800
Hey, Konstantin, do you want better mental health? I'm from Russia. We don't have mental health.
00:20:14.740
So how do you deal with mental health? You drink vodka, then go out and wrestle bear. If you live,
00:20:20.660
you feel better. If you die, you're not real man. What about the bear's feelings? It's Russian bear.
00:20:26.780
It has no feelings. People don't always realize that physical symptoms like headaches, teeth grinding,
00:20:32.560
and even digestive issues can be indicators of stress. And let's not forget about doom scrolling,
00:20:38.820
not sleeping enough, sleeping too much, undereating and overeating. Sleeping too much,
00:20:45.000
undereating. This is Western disease. Therapy has really helped me in my life to concentrate and focus.
00:20:52.000
It's really important to have someone impartial who you can talk to about the tricky issues that
00:20:57.680
you're struggling to deal with. Therapy has played a really important role in helping me to deal with
00:21:03.320
my ADHD and become better in all areas of my life. Why is he telling them how weak he is? Drink vodka,
00:21:11.220
feel better. BetterHelp is customized online therapy that offers video, phone, and even live chat
00:21:18.380
sessions with your therapist. So you don't have to see anyone on camera if you don't want to.
00:21:24.100
Triggernometry fans get 10% off their first month at betterhelp.com forward slash trigger,
00:21:29.740
especially if they're not real men. That's B-E-T-T-E-R-H-E-L-P dot com slash trigger.
00:21:40.740
We're talking about the worst of capitalism, but these companies in many ways are the worst of
00:21:46.080
capitalism. They're monopolies. They're incredibly powerful. They now have the ability to influence
00:21:52.520
elections like what we saw with Hunter Biden and Twitter suppressing the story. I mean, it's
00:21:58.480
terrifying, isn't it? Well, a couple of things to say there. One thing is I get calls from all kinds
00:22:07.300
of people who are upset about what happened to elections. I have more firsthand knowledge than
00:22:15.580
most people about what happened in the platforms. And there was more suppression of the left than the
00:22:21.280
right. Both things happened. But just to be clear about that, there was more stuff that happened
00:22:27.540
that was anti-Biden than pro-Biden. Like what? Tell us. Because I've never heard this argument. I'm
00:22:32.860
very open to it. Like what? Can you give us some examples? The documented, for-real, high-volume
00:22:41.120
foreign interventions that favored a candidate, favored Trump, mostly driven by Russia.
00:22:47.080
To say that that's all that happened would be ridiculous because all kinds of things were
00:22:52.540
happening. But the weight of it went in that direction. Does that mean anything? No, actually,
00:22:57.100
it doesn't. I know that it's frequently, you kind of want to believe, like if you're more sympathetic
00:23:03.220
to one side than the other side, you kind of want to believe the other side's got more advantage or
00:23:09.180
whatever. Honestly, this whole thing is a big, giant bundle of bullshit anyway. Like I think to worry
00:23:18.220
about whether the other side got more bullshit than your side is really irrelevant. The point is there
00:23:25.040
should be no bullshit. No argument there at all. But Jaron, can I just, can I just challenge you briefly
00:23:31.040
on this point? Sure, challenge. You know, look, I know I've been through this argument a lot. I
00:23:35.560
really want to emphasize, I do believe I have a clear picture of it. On the other hand, I just
00:23:41.340
really don't think it matters or it's that important. If it strikes you as being important, I really want
00:23:45.740
you, I want you to reconsider whether it is. Well, that's why I want to ask you the question,
00:23:50.160
because I am totally open for you to persuade me it's unimportant. The difference that I saw,
00:23:55.460
and look, neither Francis nor I are Trump fans. Neither of us is on the right. None of that,
00:23:59.920
right? We're just, as a principle, to me, watching a set of coordinated efforts by different tech
00:24:08.200
platforms to prevent a story about the son of a presidential candidate being published on the
00:24:14.280
eve of the election, that was unprecedented and very, very different to the tech platform messing
00:24:23.080
around with Russian bot farms or whatever. Do you see what I'm saying? Yeah. Oh, look, and you know
00:24:28.860
what, if the question before us was, were there parties who
00:24:39.980
worked together to try to support Biden in ways that make us uncomfortable about the future of a
00:24:45.760
decent, honest society and democracy? The answer is yes, there were. And so I would
00:24:51.540
validate what you're saying. I think that happened on a few different levels. I think
00:24:57.020
part of it was the fact that most people of influence in the tech world are educated and most
00:25:05.120
educated people are more sympathetic to the Democrats in the U.S. and to liberal positions in general.
00:25:14.000
And they naturally tend to support each other when they start to move in that direction. And even
00:25:19.360
without an explicit conspiracy, there can be an effective one. All right. I don't want to
00:25:25.380
deny that there's truth to that. I also want to point out that some of what they were resisting,
00:25:31.780
like if the question is, well, they're different questions. You could say, was there less
00:25:37.200
reporting than there might've otherwise been about Hunter Biden? Probably. On the other hand, a lot of the
00:25:42.460
promotion of the Hunter Biden story was also equally questionable and for the same reasons,
00:25:46.760
and with even more centralized coordination and more questionable issues and more money behind
00:25:53.400
it, to the best of my ability to read it, which is neither here nor there. In my view, in the
00:25:59.720
balance, if you compare all the different kinds of bias that were injected into the social media
00:26:04.000
systems, there was more pro-Trump stuff going on than pro-Biden stuff. Once again, neither here nor
00:26:12.140
there. And it makes no sense for any of us to adjudicate that. If I'm wrong, it really wouldn't
00:26:17.900
have any effect on any of the positions of consequence I'm taking here.
00:26:21.500
Truly. Okay. Jaron, that being the case, what do we do with these hugely powerful companies?
00:26:28.620
They, you know, they can pretty much do what they want. Yeah. They can control what they want.
00:26:34.220
They disseminate information as they want. They're too powerful, surely.
00:26:39.900
Yeah. So this is very problematic. So, going back to the book I mentioned, The Human Use of
00:26:45.340
Human Beings, I didn't mention the author, Norbert Wiener, I should mention. And I have to say,
00:26:49.900
this book is really hard to read because this guy's an ultra nerd and there's mathematical
00:26:53.660
equations in the middle of it. And it's a great shame; if he'd written it in an accessible style,
00:26:59.420
I think it would have changed the world, but he just wasn't that person, you know. At the time,
00:27:03.420
this is going way, way back. This is the Manhattan Project generation of scientists, you
00:27:08.140
know? So anyway, the problem is that if you have computer networks running the world, whoever
00:27:15.900
runs the networks runs the world. It's a much more powerful thing than money. It's a much more
00:27:20.060
powerful thing than votes. It's like the most powerful thing because you're directly
00:27:24.540
in control of the channels of action that exist in the civilization. And this was a danger that
00:27:31.020
was well articulated in advance by a variety of people. I tried to say what I could about it
00:27:36.300
at the time, as did a few others. Honestly, most of us in computer science were just so enthralled
00:27:42.540
by the prospect of power that we allowed our, kind of, I don't know, our egos to get the best
00:27:52.300
of us. Oh, we must, this must be for the good. We're the best people. We'll save the world if we
00:28:00.540
get powerful. I mean, and of course that's fallacious. Nobody's perfect, you
00:28:08.060
know, and no concentration of power is trouble free, you know? And so that's
00:28:08.060
what happened. And so now we do have this kind of curious situation. Now that said, there's a weird
00:28:16.060
kind of neutered quality to the new kind of power that makes it a little less horrible than it might
00:28:23.100
be. And if you want to look at the comparison, compare what companies like Meta and Alphabet
00:28:27.020
can do to what the Chinese are doing, or if it doesn't matter, the Russians or the Iranians,
00:28:31.900
or these all kinds of other people. Like there's a weird thing about Silicon Valley culture,
00:28:36.380
if I can call it that, or tech culture, in that it's so nerdy. It's so, we're not supposed
00:28:42.620
to use the term on the spectrum anymore, but I don't really know what language to use for this.
00:28:46.380
There's this kind of dryness to it that in a way, it's not exactly traditional power seeking,
00:28:51.980
and it doesn't have traditional power goals. It's more just this kind of
00:28:57.740
nerd supremacy without a particular direction or sensibility. It's truly different,
00:29:05.100
I think, than previous concentrations of power. I mean, like you go to Silicon Valley and every other
00:29:11.420
center of power in history has made itself beautiful and impressive, and Silicon Valley has
00:29:15.660
not. It's just kind of another shitty suburb, you know, and that's strange, but it reflects this kind
00:29:21.100
of incredible nerdiness. And so if you talk to Zuckerberg, it's just this kind of very formulaic
00:29:33.580
first order approximation of what an idea would be. Like the world should be more connected, so we'll make
00:29:37.740
it more connected, and that's to the good. But there's nothing more. There's no, there's nothing
00:29:41.900
there. I mean, of course, the individuals like getting rich, but they don't necessarily have
00:29:46.460
anything to do with that, you know. Same thing at Google. It's, it's kind of an odd thing. It's like
00:29:52.380
we will organize the world's information, which I think, by the way, is a misnomer, but we
00:29:57.340
don't need to go into the philosophy of that. But the thing is, it's a weirdly neutered thing. And so
00:30:02.540
the main problem with it is different from the problem of, oh, I don't know, a Putin-like figure
00:30:11.340
who wants to conquer a neighbor, or a Xi-like figure in China who wants to create this sort of
00:30:17.900
Chinese-centric model of the world that gives no room for even the identity of Tibetans or Uyghurs
00:30:24.460
or Taiwanese or anything. It's not like either of those things. It's more, it's a weird thing. It's
00:30:32.700
like the nerd kid in high school who doesn't even really have an agenda, but just is sick of being
00:30:41.180
the one dumped on, you know. It's a strange, inert kind of power to a degree. But because it has the
00:30:47.580
side effect of harming the personalities of everybody else, it still is,
00:30:54.220
the effects are still too negative. And of course, the other thing to say is, whenever you create a
00:30:58.940
center of power, it'll eventually be seized by the worst people. You start off with Bolsheviks,
00:31:03.820
then you get Stalinists. That's how it works. And when you get rid of the Stalinists, you don't
00:31:07.740
suddenly get democrats. You get Putins, you know. Like, centers of power are seductively horrible
00:31:19.020
for history. So there's a lot of reasons not to do it. But it's also important to understand that
00:31:25.020
when we talk about power of tech companies, it's a different kind of power than we're used to
00:31:28.940
historically. Jaron, it seems to me that, like all revolutions, it started off with utopians,
00:31:35.900
people wanting to build a better future, people having an idealized view of the world and how
00:31:41.260
they were going to change things to the better, and they were going to make everyone more connected,
00:31:44.700
and society was going to get richer and just better as a result, without realizing,
00:31:51.100
A, the consequences of their actions, but B, as well, you've got to monetize this thing. And if you're
00:31:57.740
going to monetize it, you want people to stay on the platforms. And how are you going to do that?
00:32:02.860
You're going to do it through algorithms and playing with people's emotions.
00:32:06.300
Yeah, just over the last weekend, I keynoted, or I co-keynoted with Neal Stephenson,
00:32:14.780
who wrote Snow Crash and many other science fiction novels, the big crypto conference,
00:32:20.140
which is called Consensus, put on by CoinDesk in Austin, which is kind of like,
00:32:26.620
we call it a dumpster fire in our current lingo. But anyway, whatever. The thing is, I thought,
00:32:32.460
wow, I'm going to come in here. This is a hostile audience. You're not going to want to hear me say
00:32:36.780
how scammy this whole thing is, and how whatever. But I put it in a positive light. What I said is,
00:32:42.060
look, instead of trying to make money from finding the next person to come along, who buys some token
00:32:47.580
for more than you paid for it, why don't we get these things to form value chains? Where if you have
00:32:52.700
a token for some piece of art, and some person uses that art and animation, and then somebody else adds
00:32:57.420
music to the animation, and somebody else adds a story, and somebody else does this and that,
00:33:02.140
and then it eventually makes money because people are willing to pay for it, then if royalties go
00:33:05.660
back to everybody, then instead of selling the token, you could earn royalties and dividends on it
00:33:10.940
and fund a society. Because one of the fundamental problems with tech culture and money in general is
00:33:17.580
that it has forgotten what money is for. If you go back to Adam Smith, or anybody who's been concerned
00:33:24.620
with how money works, the idea is that when you have money, that money goes to work for society,
00:33:30.540
that's how you earn interest on it, and the money goes up in value. The worst thing is to stuff your
00:33:34.860
money in a mattress and pull it out of circulation, right? And yet, the tech industry has sort of been
00:33:41.660
like putting money in a mattress, and that's even more so with the Web3 and crypto stuff.
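Purely as an illustration of the value-chain idea (the names and the 10% royalty rate are made up, not part of any real proposal), here is a minimal Python sketch in which each derivative work records what it builds on and passes a share of every payment up the chain:

```python
ROYALTY_SHARE = 0.10  # assumed: 10% of any payout is passed up to the parent work

class Work:
    """A creative work that records the work it builds on, if any."""
    def __init__(self, creator, parent=None):
        self.creator = creator
        self.parent = parent
        self.earnings = 0.0

    def receive(self, amount):
        """Credit a payment, recursively passing a royalty share upstream."""
        upstream = amount * ROYALTY_SHARE if self.parent else 0.0
        self.earnings += amount - upstream
        if self.parent:
            self.parent.receive(upstream)

# Art -> animation -> animation with music: a sale of the final piece
# pays everyone in the chain instead of only the last seller.
art = Work("painter")
animation = Work("animator", parent=art)
scored = Work("composer", parent=animation)
scored.receive(100.0)
for w in (art, animation, scored):
    print(w.creator, round(w.earnings, 2))  # painter 1.0, animator 9.0, composer 90.0
```

Instead of flipping a token to the next buyer, each contributor keeps earning as long as the chain keeps earning.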
00:33:46.700
And so what's funny is, when I said this, I thought they were going to be angry, but I had this whole
00:33:51.180
hall of people who started applauding, and then they stood up. And it's like, I think everybody knows
00:33:56.780
in their gut that the way we've been doing things is silly, and we need to be more focused on real
00:34:02.220
productivity, real investment, real accomplishment, real achievement, real building, real innovation,
00:34:07.820
and just moving bits around and screwing around with other people's behaviors doesn't actually do
00:34:12.140
anything for the future. And everybody really is in it for the future at the end of the day,
00:34:16.700
that's why people get into tech. So I, it might just evaporate, maybe I'm kidding myself,
00:34:22.620
but I have this feeling like there's this, and this is a very young audience. So these are new people,
00:34:27.900
and not the same old people I've been talking to for decades. And I just have this kind of optimistic
00:34:33.420
feeling that they're open to it, and that there's room for tech culture to improve. I really believe that.
00:34:39.980
Yeah, well, I know what you mean, Jaron, because I think, as I said to you right at the beginning,
00:34:44.220
when we started, I think, while many of us are grateful for the tremendous opportunities that
00:34:49.900
being connected with 8 billion other people has given us, at the same time, we cannot ignore some
00:34:56.780
of the terrible downsides, whether they are for society, or for our children, or for our own well
00:35:02.140
being, or any of those things. So, and I think a lot of people are starting to wake up to that. And
00:35:07.180
the other thing that resonated with me so strongly is when you were talking about,
00:35:10.940
you know, these are nerds who've got their own sort of agenda that's not quite as evil as it might be.
00:35:17.100
But I was thinking the whole time you were talking, yeah, but that's because they own
00:35:20.380
the thing that they created. What happens when they all get forced out, or they die, or whatever?
00:35:27.420
Yeah, yeah. I think about that all the time. Like, who's going to, who exactly is going to inherit
00:35:33.580
Facebook or Meta? Right. You know, it's not set up to allow anything. I mean, now, Sheryl just left,
00:35:40.220
and instead of a new Sheryl, it's just even more concentrated. And that is not the way to create
00:35:45.260
an institution for the benefit of humankind. You have to see beyond yourself, and it just hasn't
00:35:50.460
happened yet. So let me ask you about that, Jaron, because you talked about how this could all have
00:35:55.340
been done differently. And you spoke about it at the time, and the business model is wrong.
00:35:59.980
How do we fix social media? Or how do we make it better? How do we make it slightly less worse,
00:36:07.100
at least? Yeah, slightly less worse. I'll live with that. I think it's a really huge
00:36:15.180
problem to say, oh, I'm going to fix it. I'm going to make everything better, and everything's
00:36:19.020
going to be perfect. Because whenever somebody says that, you know, this is going to be really
00:36:23.820
crappy. If it's like, I can make this slightly less horrible, you know, that's the person you want
00:36:28.060
to listen to. So I think there's a few things that can be done differently. One of them, if anybody
00:36:34.300
wants to, the current issue of the Atlantic magazine in the US has a piece of mine on how to fix Twitter
00:36:41.100
that I wrote for Mr. Elon, and perhaps to no effect whatsoever, we shall see. But what it proposes
00:36:48.380
is an idea that I think is the least invasive and least impinging on free speech way of just improving
00:36:58.700
the conversation. And that is to demand that people form small associations, like form little zines,
00:37:06.300
or bands, or clubs, or brands, small enough that they can all know each other, and only publish through
00:37:11.900
those and rise or fall in terms of reputation and monetary income together in that band, like you
00:37:19.580
two are doing with your production here, right? Now, here's the reason why I'm saying that.
00:37:26.700
If you look at all the people who've studied societies that you'd want to live in, they've all
00:37:32.140
come to the conclusion that that mechanism, which is often called a societal institution, is at the
00:37:37.500
core of maintaining quality, all right? Some of the people who talked about that are de Tocqueville,
00:37:44.060
Hannah Arendt, but you know the one that really got me going on it some time ago was a friend of
00:37:48.780
mine named Muhammad Yunus, who won a Nobel for starting microlending. And there's a lot of ways that
00:37:55.340
microlending has perhaps not achieved all that some hoped it would achieve, but let's leave that aside.
00:38:00.460
There's one part of it that's worked really well, which is he had a bank in Bangladesh
00:38:06.540
trying to serve a super poor community where nobody had a credit rating. If they gave somebody
00:38:11.180
a loan, chances are the person would not pay it back because like, why would they? It was just free
00:38:15.260
money. They weren't even used to the whole thing. And they said, okay, we're not going to do loans
00:38:19.740
to individuals anymore. You guys have to form a group. You have to vouch for each other. If one of
00:38:23.580
you doesn't repay, you all are going to pay the price for it. And this is a way that we can distribute
00:38:29.100
the process of creating credit, or if you like, creating quality. And it worked. They
00:38:35.180
suddenly got better repayment rates than traditional banks are used to. Incredible. So when people have
00:38:40.540
a bit of a shared stake, they start to watch out for each other. And if somebody runs into legitimate
00:38:44.860
trouble, other people help them. If somebody wants to screw around, they get rejected. They get ejected
00:38:49.660
from the group. And suddenly you have quality going up. Magic, right? And so I would like to
00:38:56.060
see that in social media. And it would be strictly by free association. If you don't like your group,
00:39:00.220
you can always quit. You can get into another one. A group can decide who to accept, who to reject.
00:39:04.380
But the point is that the groups are small enough that the people really know each other. And then
00:39:09.580
what you get out of that is a few things. One, if somebody starts turning into a jerk, and
00:39:15.100
everybody turns into a jerk online from time to time, because we turn into a jerk from time to
00:39:20.300
time offline. I mean, let's be honest about ourselves. The other people say, hey, you know
00:39:25.820
what? Cool it. This is like a bit much. But another great thing is the group can post often enough to
00:39:32.540
keep the brand going and to keep subscribers, whereas individuals have to post too often to do so and to
00:39:37.980
stay sane. So that's a good thing. Another really good thing is that since the group is divvying up
00:39:43.740
benefits, and I want them to get subscriptions, donations, micropayments, I'm really into money
00:39:49.820
for this. I think there should be more money online. I really do. Because we live in a market world.
00:39:54.380
And to say, well, the online world won't be a market world just makes people become gradually
00:39:58.620
more and more obsolete as technology gets better. It's ridiculous. If we're going to be socialists,
00:40:05.420
we have to do it everywhere. We can't just do it online. So I'm really pro-capitalism online,
00:40:10.380
because I think it's the most viable way to make things workable. But anyway, if there's a bot,
00:40:16.700
if there's a fake person in your group, and that bot is getting some of the money, you suddenly have
00:40:20.300
a motivation to get rid of the bot. And now when Elon Musk is saying, oh, there's too many bots on
00:40:26.300
Twitter, he's absolutely right. A friend of mine at Facebook tells me that 99% of the new account
00:40:32.540
applications are from bots by their estimate. And the reason why is that people who want to mess up a
00:40:38.380
society put in bots in order to sway opinion, right? So there's every economic incentive to
00:40:43.980
make bots. Well, now there'll be an economic incentive to get rid of bots. And otherwise,
00:40:49.260
it's the platform's job to do it from on top, which is a losing game. But if it's distributed,
00:40:54.620
just like creating credit in microlending, all of a sudden it becomes doable and we can get rid of
00:40:58.860
the bots, which are terrible. They're poisonous. There are many other benefits. You can read the
00:41:04.780
piece in the Atlantic if you want. So another thing I'll point out is that I've tried to do a
00:41:11.660
ranking of the relative degree of terribleness in different online platforms.
00:41:17.100
How often are people jerks? And nothing's perfect. But the thing is, it's human beings,
00:41:23.660
so nothing's ever going to be perfect, right? But there are differences, okay? So
00:41:28.940
some of the places are really a cesspool. Facebook is junk. Horrible. YouTube is really
00:41:40.780
pretty bad. You can manage your experience on it to make it better, but it's really pretty awful.
00:41:47.980
There's an experiment. Now, some people challenge me and say, no, this isn't true. But I've done this
00:41:53.100
many times. And in my experience, having done it dozens of times, which is not a huge sample,
00:41:57.420
so not up to scientific standards. But if I get a room full of kids and I say, okay,
00:42:03.100
just either start with your account or start with a fresh identity where Google doesn't know who you
00:42:08.620
are and just let YouTube recommend follow-up videos, how many hops does it take until you end up in some
00:42:15.580
really weird, creepy, paranoid, ugly thing? And I usually found it to be in the teens, like in the late
00:42:21.260
teens of times, sometimes sooner, sometimes later. But that's horrible. That's horrible. So YouTube's
00:42:27.340
garbage. YouTube's a cesspool, despite many, many wonderful, valuable things on it.
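The hop experiment can be caricatured with a toy random walk; this is not based on any real recommendation data, just an illustration of how a small per-hop bias compounds. Give each video an "extremeness" score in [0, 1] and assume the recommender nudges each hop slightly upward on average:

```python
import random

def hops_until_creepy(bias=0.05, threshold=0.9, seed=None):
    """Random walk over 'extremeness' scores; bias is the assumed average
    upward nudge per recommendation hop. Returns hops until the walk
    crosses the 'really weird, creepy' threshold."""
    rng = random.Random(seed)
    score, hops = 0.1, 0                          # start at a mild video
    while score <= threshold:
        score += bias + rng.uniform(-0.03, 0.03)  # nudge plus noise
        score = min(max(score, 0.0), 1.0)
        hops += 1
    return hops

trials = [hops_until_creepy(seed=i) for i in range(1000)]
print("average hops to the weird zone:", sum(trials) / len(trials))  # roughly mid-teens
```

With these made-up parameters the walk lands in the weird zone in the teens of hops, which is only meant to show how quickly a gentle, engagement-driven drift can compound.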
00:42:31.580
What's something that's better? I might be biased because I work with Microsoft,
00:42:37.260
but GitHub, this place for sharing projects, is pretty good. There's an occasional jerk on GitHub,
00:42:45.660
but mostly people have a stake and they have a shared stake with people in their teams.
00:42:50.060
So it's a little bit like the thing I'm talking about for social media, where if you're working
00:42:54.460
with a bunch of people on a coding project, the last thing you're going to go and do is mess up the
00:42:59.900
value for everybody by mouthing off at somebody for no reason, like all of a sudden you feel a bit
00:43:04.300
of a sense of responsibility to your mates, your compatriots, right? That's a good thing.
00:43:11.820
So, and I believe in monetization, because I believe when people have a stake, when they have
00:43:18.140
something to lose, they'll think through their actions more. If the only thing you have to gain
00:43:24.140
is attention and you have nothing to lose, then you'll be a jerk because you have the incentive,
00:43:28.860
right? It's really simple. So I want to monetize the stuff. And then people say, oh, but I don't
00:43:33.580
want to have to pay. But the thing is, if you're earning, paying won't feel so bad. Like if somebody's
00:43:39.340
gainfully employed, it doesn't bother them to pay for stuff that keeps other people gainfully
00:43:44.140
employed so they can afford to buy whatever they do. Like if people are part of the cycle of the economy
00:43:50.220
and are part of a social contract that makes sense, they're more ready to pay for stuff.
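Pulling those pieces together, a minimal and purely hypothetical sketch of the group economics being described: income is split evenly across members, so a fake member dilutes every real member's share, and the group itself, by free association, can eject it.

```python
class Group:
    """A small publishing collective that shares income and membership decisions."""
    def __init__(self, members):
        self.members = set(members)

    def split_revenue(self, total):
        """Divide this period's subscriptions/donations evenly across members."""
        share = total / len(self.members)
        return {m: share for m in self.members}

    def eject(self, member):
        """Free association: the group decides who stays."""
        self.members.discard(member)

band = Group(["ana", "ben", "chloe", "bot_4471"])
print(band.split_revenue(100.0))  # the bot takes a cut from everyone: 25.0 each
band.eject("bot_4471")            # so the humans have a direct incentive to remove it
print(band.split_revenue(100.0))  # back to a three-way split
```

The platform no longer has to police bots from the top down; every group is already paying for its own.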
00:43:55.340
And we already know online that people pay for stuff. Sometimes it's possible. It doesn't have
00:44:01.420
to be all the time. It doesn't have to be like this religious thing that you never have an exception to,
00:44:06.140
but we just need a lot more of it. And Jaron, you're talking about, you know,
00:44:12.940
social media platforms and you mentioned Facebook and I agree with you and YouTube as well.
00:44:17.500
To me, the worst one is Instagram. It's the worst one. And there's somebody who used to,
00:44:25.820
I used to teach for many years. Go for it. Go for it. Tell me.
00:44:28.940
It's the ideal platform for me to do my bikini modeling. So
00:44:31.980
I don't know what you're telling me, but yeah, no, no, I hear you. I hear you. I get it.
00:44:38.780
But just, but the effect that it has on particularly on children and young girls in particular is awful.
00:44:47.580
It's awful. And, you know, and then I was reading, they didn't do it, thank the Lord,
00:44:53.100
but then they were talking about Instagram for kids.
00:44:56.140
I know. Oh, and, and like, you can just imagine the meeting where that came up and
00:45:01.340
somebody saying, how can we expand? How can we expand? Well, there's this population of kids,
00:45:05.820
you know, you know, it's just like, it's incredible that that conversation could even have happened.
00:45:16.140
No, no, because no, no, I was going to follow up because to me, the thing that is awful about
00:45:20.700
Instagram, it's the obsession with the physical self and the constantly looking at other people's
00:45:28.780
lives and other people's bodies and what that breeds in young people, particularly in women.
00:45:34.060
And, you know, I walk around London now and there's so many people having plastic surgery
00:45:40.460
and I haven't, there are no studies on this, but I go, this must be something to do with
00:45:45.260
what people are seeing constantly online, constantly on Instagram.
00:45:49.180
Right. Well, look, a couple of things to say there. One thing I've learned through
00:46:05.260
hard experience is that if a causal connection seems right, it doesn't necessarily mean it is
00:46:05.260
right. So you say that the plastic surgery thing tracks Instagram, it sounds right to me, but
00:46:12.700
I'm trying to keep us to somewhat high standards of scientific study, even though that's hard with
00:46:17.420
this because it happens so fast. And the only people with the data are the creepy people doing
00:46:22.460
it. So it's difficult, but I would just put that in the category of kind of
00:46:29.100
makes sense, but not confirmed. That's my advice to you. Like, we should do our best to try to,
00:46:36.460
because we can make all the arguments we need to and chart a future path without convicting every
00:46:42.140
little possible wrong along the way. Because it won't even happen. Nothing will happen anyway.
00:46:47.100
So that's one thing I'll say. The next thing I'll say is, the pumping up of vanity has
00:46:57.020
been happening for as long as women could spend money on anything, or men,
00:47:02.140
for that matter, just people. There's a sense in which it's ancient, and yet there's a sense in
00:47:08.380
which it's more immediate, more programmed, and more kind of nasty now than I think it was before.
00:47:15.420
So it's easy in this stuff to get caught up with this argument: well,
00:47:20.700
it's always been that way. Which is kind of true to a degree, but then the question of the amount
00:47:25.980
and kind is also important. There is a difference now. Sometimes it can be a little
00:47:29.740
bit hard to be articulate about what the difference is, but it's definitely there. And I think it
00:47:33.900
has to do with operant conditioning, with this use of behavior modification as technique.
00:47:38.060
Well, right. I was going to say human beings are the way they've always been, but the tools
00:47:42.860
are much more powerful. And I think that's, that's where we are, which is one of the things I wanted
00:47:47.260
to talk to you about because you talked somewhat almost romantically and wistfully about the early
00:47:52.780
days of the internet. And I remember that time: freedom, connection, anything is
00:47:58.540
possible. What are you talking about? Say again? You weren't born yet. What are you talking about? Well,
00:48:03.340
maybe not in those days, but I certainly remember like first encountering the mass
00:48:08.220
phase of the internet, right. When the ordinary user was using it. Right. Uh, and I was, I used
00:48:13.660
to play a lot of computer games at the time and I'd have friends all over the world that I would speak
00:48:18.140
with. We'd build connections. We'd play games together and we'd, we'd make the most outrageous
00:48:24.700
jokes to each other. And it was free and it was kind of edgy and it was like the wild, wild
00:48:30.380
west. And that was fun. And that was cool. And now we are in a very different place where
00:48:35.740
rather than talking about how much freedom we all have, it's much more about how do we prevent
00:48:40.740
quote unquote harm. And we've talked about some of the harms that come with social media and the
00:48:45.180
internet, but also there's a lot of this, like, you can't say this, you can't say that amplifying
00:48:50.540
this message causes this and that. Do you think the internet will ever be free again, even in the way
00:48:55.980
that it was? This is why I'm proposing this group structure I just talked about, because
00:49:01.180
on the one hand, if we just say, well, we have to lay off because all this, all of this telling
00:49:06.940
people what to say is leading in no good direction. Well, then the whole world goes to pot because
00:49:12.540
we let manipulative, creepy people have the most power. But then on the other hand, if we let things
00:49:18.140
just, if we get to control to the point where we control those people, we also control too much.
00:49:23.660
And if you want to see what that looks like, look at China. We can actually see what that looks like,
00:49:27.500
and it's not good. And I think China has become a little crazy, for instance, in its
00:49:33.020
management of COVID because of the inability for people to just communicate in a straightforward
00:49:38.620
way. I think, I think they've really hurt themselves. And so neither of those paths are good.
00:49:45.180
This group thing I'm talking about, I don't know if it'll work because it hasn't been tested enough.
00:49:48.940
So maybe it also is hopeless, but at least right now, it's an idea that's still standing,
00:49:53.740
that gives us an alternative to shutting people down or just letting society fall away to the
00:49:59.660
worst people. It gives us a path through that dilemma, at least it appears to, you know, so far,
00:50:04.700
or maybe some variation of it will, but I mean, it's at least a direction because I really don't like
00:50:10.060
either of the alternatives right now. Right now, if we don't go into the direction of the groups I was
00:50:14.380
talking about, we either have more and more censorious behavior online, or we have more and
00:50:20.700
more power for the most creepy people, or we have to just shut down the net. All three of those are
00:50:26.540
horrible. So this is another option. It's another direction. Maybe, maybe. Do you have a website or
00:50:32.540
do you plan to have a website? Because if you do, then Easy DNS is a company for you. Easy DNS is the
00:50:40.060
perfect domain name registrar provider and web host for you. They have a track record of standing up
00:50:45.660
for their clients, whether it be cancel culture, de-platforming attacks, or overzealous government
00:50:51.500
agencies. He knows about that. So will you in a second. Easy DNS have rock solid network infrastructure
00:50:58.780
and fantastic customer support. They're in your corner, no matter what the world throws at you,
00:51:04.620
unless it's your ex-girlfriend, in which case you're on your own. You know about that.
00:51:09.660
Move your domains and websites over to Easy DNS right now. All you've got to do is go to
00:51:15.020
easydns.com forward slash triggered. That's easydns.com forward slash triggered. Use our promo code,
00:51:21.900
which is also triggered and get 50% off the initial purchase. Sign up for their newsletter,
00:51:27.340
Axis of Easy, which tells you everything you need to know about technology, privacy, and censorship.
00:51:34.780
It's interesting you mentioned that, Jaron, because we use a platform called Locals,
00:51:39.420
which is a bit like Patreon, but there's a community element to it. So it's not just
00:51:45.260
money for access to extra content. It's a community of people who love our show.
00:51:49.500
And, you know, there's about 15,000 members in total. There's about a thousand super active users who
00:51:55.740
contribute or maybe 1,200, something like that. And we, at this level, at least, we don't have to do
00:52:01.660
any moderation. People aren't, generally speaking, having bitch fights with each other every three
00:52:07.980
minutes. People are generally respectful. There's a sense of, like, everybody's pulling in the same
00:52:12.620
direction. And one or two times when people have got out of line, me or Francis have gone,
00:52:17.660
guys, remember, this is like a cool space in which we all hang out and have a good time.
00:52:22.940
And everyone's going, oh, yeah, that's cool. Right. And people come and go. But I think your idea
00:52:27.660
sort of works. But I suppose the question for me is, in an internet that connects everyone to
00:52:34.380
everyone, how do you do that? Right. Well, all the platforms were
00:52:40.460
cute when they were small. Yeah. Yeah. Earliest days of TikTok weren't toxic. TikTok is currently
00:52:46.300
toxic. Right. The very earliest days of Instagram were actually cute, as hard as it is to remember
00:52:51.820
now. So smallness works. The question is, how can you make a network of small things that maintain
00:52:59.180
the good qualities of smallness while still scaling to the world? So that's what the idea
00:53:04.140
I'm presenting to you is an attempt to do. And it's going to be a project. It's a civilizational-level
00:53:12.460
project. I can't tell you that I know exactly how to do it. I just believe that through a process of
00:53:17.900
elimination, it's the last idea standing I know about. And I don't see anything that kills it
00:53:24.460
offhand, unless you just want to be cynical and say, ah, it's all going to be shit, which is,
00:53:28.060
you know, it might be. It might be, but maybe this is a path forward. I think there has to be a small
00:53:38.780
enough immediate community that it's possible for people to really talk to each other and
00:53:44.540
not become lost in this giant ocean of bots and unknowability. That's absolutely essential.
00:53:49.260
But there's another side of it I haven't talked about. Right now,
00:53:59.980
the way people perceive the world is social. This is something we often don't realize
00:54:05.740
about each other. There's an experiment I sometimes do with my students where if you go out on a crowded
00:54:10.380
sidewalk and you just start pointing at something, everybody will start looking there, even if there's
00:54:13.980
nothing there. And the reason why is that we evolved to rely on each other for cues about danger
00:54:20.380
and where to focus attention. This is something that's deep in us. Okay. So if you have this giant
00:54:25.900
unknowable crowd, whatever that giant crowd indicates, you will start to feel it.
00:54:32.460
So if there are enough bots in it trying to indicate that your society is shit or that the other
00:54:37.100
society is shit or whatever it is, you'll start to feel that. Now, if you can turn it into a
00:54:42.700
collection of groups as I've been talking about, maybe you can cut back the total number of players
00:54:47.980
by a hundred times or something like that, maybe even a thousand times. Now, your interface to that
00:54:55.260
world, instead of looking like this feed from a zillion people, will look like a newsstand with
00:55:01.180
covers that you've started to like. And you can add or remove from your newsstand and start to dive
00:55:05.340
into them, each of which represents a group. And in that circumstance, the way you receive information
00:55:11.900
is given context and packaging. So there were horrible Nazis and whatever kind of
00:55:20.060
creepy person, jihadists, whatever. These people have existed before. But the thing is, they always
00:55:25.660
sort of labeled themselves, like, we're in this magazine or whatever. When they're
00:55:30.060
just part of the feed, it makes the whole world feel horrible. Part of it leads to this rising
00:55:35.580
paranoia for everyone. If everything is once again compartmentalized, where there's at least some
00:55:40.780
context, where context hasn't been destroyed, then if you see some horrible message, it's, oh yeah,
00:55:46.620
those are the Nazis, whatever, fuck those people. So it gives you that ability to not have it be part
00:55:51.740
of your social cueing. It's an extremely important issue I hadn't talked about before. So we have to
00:55:56.540
think about realistically how people work cognitively and how to just build a technology that works with us
00:56:02.380
as we are, instead of as how we imagine ourselves.
00:56:05.100
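As a rough sketch of the newsstand-of-groups structure Lanier describes here, the following Python fragment shows one way the data model could look. It is only an illustration under assumed types: Post, Group, and Newsstand are hypothetical names, not any real platform's API.

```python
from dataclasses import dataclass, field

# Hypothetical model of the "newsstand" idea: a post never floats free in
# a global feed; it always arrives inside a labeled group, so context
# travels with the content.

@dataclass
class Post:
    author: str
    text: str

@dataclass
class Group:
    name: str                                    # the "magazine cover" label
    posts: list[Post] = field(default_factory=list)

@dataclass
class Newsstand:
    subscriptions: list[Group] = field(default_factory=list)

    def add(self, group: Group) -> None:
        self.subscriptions.append(group)

    def remove(self, name: str) -> None:
        self.subscriptions = [g for g in self.subscriptions if g.name != name]

    def front_page(self) -> list[tuple[str, str, str]]:
        # Every item carries its group label, so nothing reads as an
        # anonymous cue from an unknowable crowd.
        return [(g.name, p.author, p.text)
                for g in self.subscriptions
                for p in g.posts]

stand = Newsstand()
stand.add(Group("jazz-club", [Post("ada", "new gig on Friday")]))
print(stand.front_page())   # [('jazz-club', 'ada', 'new gig on Friday')]
```

The design point is that the group label is structural rather than optional metadata: a hostile post reads as "those people's content" instead of an ambient signal from the whole world.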
Jaron, your idea is brilliant. The one area of pushback that I would say is,
00:56:11.020
aren't we therefore just creating more echo chambers by doing this?
00:56:14.300
Yeah. I mean, look, here's what'll happen, and this is why I want it to be monetized.
00:56:21.500
If you have a bunch of people just saying, oh yeah, you're great. And those other people are
00:56:25.180
shit. And then you have a whole bunch of those, how many people are going to subscribe to that?
00:56:28.620
Well, a certain amount, it might even be a fairly large amount, but it won't be most people.
00:56:33.020
Like what'll happen, at a certain point, when you get to where you have
00:56:39.260
bands and brands and stuff, there's a bit of a filter that happens where if it's just self-indulgent,
00:56:46.060
echo chamber-y stuff, it'll start to fall. There's an amazing thing that happens. Like for instance,
00:56:52.860
if you look at the history of music, you might say, well, the music business is filled with producers
00:56:59.740
doing calculated artificial stuff and crappy artists who got in through nepotism and blah,
00:57:04.700
blah, blah, blah, blah. And all of that is true. And yet, if you look at the things that were hits
00:57:09.500
and the things that persist and the things that lasted over the years, at least in my opinion,
00:57:13.980
there was kind of a sieve in which quality stuff rose, you know, overall, with some exceptions,
00:57:19.020
and we can disagree about individual cases, but overall it happens. And I think it would start
00:57:24.220
to happen online too. That makes a lot of sense. One of the reasons I think the people who like
00:57:29.580
our show like it is they know we're not right or left. We're just trying to find the
00:57:33.820
answers in a complicated world. And we've got our own biases, which we're upfront about, but we're not
00:57:39.660
pushing an agenda. Because I do think that gets very boring. And we've seen other people go through
00:57:43.420
a process where they just end up in one thing and they're not curious anymore. And I think that's
00:57:48.140
very off-putting. I know what you mean. Look, we've done a lot on social media. We could talk for hours
00:57:54.540
more, of course. Talk to us about AI. You mentioned you were extremely sceptical. And about VR as well.
00:58:02.380
Just give us, you know, a Cliffs Notes version of your thoughts.
00:58:05.500
Oh yeah, sure. Well, look, on AI, just to be clear, the actual algorithms I'm a total enthusiast of,
00:58:12.220
and I've worked on and contributed to. To my knowledge, my little group, a little startup was
00:58:18.700
the first to do deepfakes, as we know them, and also the first to do Snap-filter kinds of things.
00:58:23.740
Because I think these types of algorithms have their uses, you know, and they're
00:58:29.180
scientifically interesting. The thing that bothers me, though, and that I think tends to create cover
00:58:34.380
for the worst uses of these things, that can make them feel anti-human and indeed be
00:58:39.660
anti-human, is this ideology that we're building a life form in the box. And even the term
00:58:46.780
artificial intelligence suggests an intelligence, you know, it suggests that there's some entity,
00:58:51.900
some creature, some, you know, something alive in there. And I really don't like that way of
00:59:00.140
thinking. I want to think of them as tools, not as creatures. Now, as it happens, the single person
00:59:05.900
who did the most to promote the creature way of thinking about computers was named Marvin Minsky.
00:59:11.180
And he was probably my most important mentor when I was young. And then I started to
00:59:15.580
argue with him about this. We're going back to like late 70s, early 80s, something like that.
00:59:21.260
And I'd say, you know, this is bullshit, Marvin. Like these things aren't alive. Like, look at this
00:59:25.260
thing. And I know, God, I hate to say this. Marvin wouldn't want me to say this, but he would say, look,
00:59:32.140
this lab is funded because the military believes that if we don't build these creatures, our
00:59:36.380
enemies will. So just play along and then you get a salary. You know, it's like a marketing thing.
00:59:42.460
And so I said, okay, okay. You know, I did it. And I think it still is sort of a marketing thing,
00:59:47.980
but it's become sort of a religious thing. It's like, we, the nerdy guys get to create life. Screw
00:59:52.940
those women, like, or don't screw them or whatever. Like, we're gonna
00:59:59.340
make this life ourselves. And the thing is, none of this ever gets settled totally. So there's no
01:00:04.860
absolute truth on this, because nobody can really know what else is conscious or what else is alive.
01:00:09.820
We don't really know. Is evolution itself, as a process, conscious? Is it intelligent?
01:00:17.980
You can debate that. I think you can see it either way. The terms have a lot of potential
01:00:22.300
wiggle room in definition and whatever. I mean, it's interesting, but you can't ever come to a
01:00:27.580
conclusion because there's no way to define your terms. And, if I can get slightly
01:00:32.700
sophisticated in my language use, we have no empirical channel to gather evidence to help the argument
01:00:37.660
along one way or the other anyway. So it's not resolvable, but we can ask pragmatically,
01:00:43.740
which way of thinking makes us more competent? I think that's a reasonable question. And thinking
01:00:50.540
of them as tools instead of creatures makes us more competent because then we can evaluate them
01:00:54.780
as tools. If we think they're creatures, then we're giving them too much deference. We never make
01:00:59.260
them better in the ways we need to. That's why I really don't like it when people
01:01:04.540
say, oh, I'm going to have this program that'll make music or art for you. And it'll be better than
01:01:07.740
a person. And the reason why is that then you'll change yourself to make it seem better.
01:01:11.980
The way I used to put it is, well, do you know what the Turing test
01:01:17.900
is? It was this idea from Alan Turing who started computer science. And it's ironic because if you
01:01:25.340
know anything about the story, he wrote this just before he committed suicide, and it's a
01:01:31.740
long, amazing story, but at any rate, if you take it at face value, what it says is
01:01:38.700
if you can't tell whether a program is a person or not, you might as well call it a person. Otherwise
01:01:43.740
you're prejudiced. Now, of course, he was being essentially tortured for being gay
01:01:49.660
and treated as a non-person. So there's a whole level to this thing that we could talk about, but
01:01:54.380
let's leave that aside for a second. Um, the thing is, if it's true that somebody
01:02:01.660
cannot tell if the machine is acting like a person or not, maybe there's a person behind it, you know,
01:02:06.860
they're just texting with you. Um, then one possibility is the machine elevated itself and
01:02:13.180
became a person. The other possibility is that the human lowered themselves and became an idiot to
01:02:17.580
believe the machine's a person. The test doesn't allow us to distinguish between the two.
01:02:22.060
This happened just last week, where an engineer was fired from Google for sort of believing their
01:02:27.100
program had become alive, but it's also possible that that guy became an idiot.
01:02:32.220
There's no empirical or scientific distinction between these two things. We can't tell which
01:02:36.700
is which. Anyway, um, people demonstrably are willing to make themselves idiots to socialize.
01:02:42.780
We do it all the time. We all become morons to try to impress a date. We all become morons to try
01:02:48.060
to impress a potential employer. We all become morons all the time with each other
01:02:53.660
in the hopes that it'll do some good. And every once in a while, maybe it does. It usually doesn't,
01:02:57.100
but we do exactly the same with machines that we think are people. So let's not think they're
01:03:00.860
people. Let's call them tools. If whatever it is isn't working, let's say that tool needs to be
01:03:05.660
better designed. Let's not say, Oh, you know, uh, I guess that's just the way it is. It's a creature.
01:03:11.500
No, no, no, no. And I think this issue of treating them as creatures instead of
01:03:18.380
tools is actually one of the core problems that got us into the trouble we're in, because Google
01:03:23.340
always conceived of itself as an AI company. And they always thought, well, if it's not doing what
01:03:28.380
we want, the thing is it's an emerging AI and we have to give it the space. It's like a child.
01:03:32.460
You let it grow and go through its tantrums or whatever. Um, and that attitude is absolutely
01:03:37.260
wrongheaded. They're tools. If they're not doing what we want, we change them.
01:03:41.500
And Jaron, on that, speaking of pragmatic questions to ask in this sort of conversation,
01:03:46.940
uh, you know these tools from the inside out; we don't. Is it not the case that the algorithms
01:03:52.380
that, say, determine our ability to search for things on Google are so complex now that
01:03:57.820
actually it's not within the realms of possibility for a single human being to assess that algorithm
01:04:03.580
in its entirety? Right. So this is a large topic to talk about.
01:04:12.060
I'm not here representing them at all, but I'm, uh, the so-called Prime Unifying Scientist at Microsoft,
01:04:18.140
and our office is funding and helping guide the OpenAI algorithms like GPT and DALL-E and Codex that
01:04:26.140
some of your, uh, viewers might be familiar with. And so this question of, can we understand what
01:04:33.180
these things even do? Um, in my opinion, part of the issue is that tech culture wants
01:04:44.140
them to be creatures instead of tools, and has kind of set them up to be hard to understand
01:04:48.860
more than was really needed. Now that's going to be a controversial statement and I'll get a lot of
01:04:53.340
pushback from people in the field. And yet I think it's true. Like for instance, one of the problems, do you know DALL-E?
01:05:00.700
No. Oh, you should look it up. It's D-A-L-L dash E. It's the latest, um, image generation stuff. So
01:05:07.580
you just give it some text and you say, hey, I would like to have an image of a
01:05:14.060
robotic pumpkin, uh, swimming the English Channel in the style of Turner. And you know what? It'll
01:05:20.540
synthesize that. It'll come out. And the way it does it is through this process of sort of randomly
01:05:26.940
starting to create images and comparing them with images that have matching keywords
01:05:32.220
nearby and going back and forth until it gets something that it reads as a match. And typically
01:05:37.900
what comes out actually looks like the thing you asked for. It's just astonishing actually. Now,
01:05:43.020
what's wrong with that? Well, there's a few interesting things about it. Um, one thing is
01:05:47.020
if you just give it nothing but the one word Asian, nothing will come out but porn, because all it does
01:05:51.260
is reflect the primary keywords that were online. And I don't think most of the billions of
01:05:56.780
people from Asia really want that result. I don't think that's great.
01:06:04.780
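As a toy illustration of the generate-and-compare loop described above: the hill-climbing sketch below is a deliberate caricature, not OpenAI's actual DALL-E architecture, and every function in it is an invented stand-in. Even so, it shows where the bias enters: the scorer can only reward whatever text-image associations were present in the scraped training data.

```python
import random

def prompt_target(prompt: str, size: int = 64) -> list[float]:
    # Invented stand-in for what a trained text-image scorer "expects"
    # to see for this prompt. In a real system this role is played by a
    # model trained on web data -- which is exactly where keyword bias
    # (like the "Asian" example above) creeps in.
    rng = random.Random(prompt)
    return [rng.random() for _ in range(size)]

def similarity(image: list[float], target: list[float]) -> float:
    # Higher when the candidate is closer to what the scorer expects.
    return -sum((a - b) ** 2 for a, b in zip(image, target))

def generate(prompt: str, steps: int = 5000) -> list[float]:
    # The back-and-forth loop: start from noise, make a small random
    # change, keep it if the scorer reads it as a better match.
    target = prompt_target(prompt)
    image = [random.random() for _ in range(len(target))]
    score = similarity(image, target)
    for _ in range(steps):
        candidate = image.copy()
        candidate[random.randrange(len(candidate))] += random.uniform(-0.1, 0.1)
        candidate_score = similarity(candidate, target)
        if candidate_score > score:
            image, score = candidate, candidate_score
    return image

prompt = "a robotic pumpkin swimming the English Channel, style of Turner"
img = generate(prompt)
print(round(similarity(img, prompt_target(prompt)), 4))
```

Real systems use diffusion models rather than random mutation, but the shape of the process is the one described here: propose, compare against the text, keep improvements.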
So there's something wrong with the way we're doing it. And then what we end up doing
01:06:08.940
is a little bit like the problem of trying to censor online speech to fix online speech. It's
01:06:13.660
like this game that you can never win. So now what you do is you create this whole organization that's
01:06:19.260
trying to go through the output of these giant models to have them not give terrible results.
01:06:24.700
But you know, it's a never-ending struggle. And it's not clear that you ever
01:06:29.740
even get to the point where it's okay. You know, it's just rough. And at what point is this kind
01:06:34.860
of safe for the world? You know? Um, so that's a tough one. Can I tell you what troubles me
01:06:41.500
about your answer very quickly, right? I asked you whether these algorithms are too complicated for a
01:06:47.980
single human being to understand, and you're a very smart guy. And you gave me an answer that's
01:06:53.660
too complicated for me to understand. Oh, I'm sorry.
01:06:56.700
No, no. But what I'm saying is, what troubles me about that is, if it was the case that a single
01:07:02.620
human being could understand them, I imagine you would have said yes, but you didn't.
01:07:05.980
Let me finish, give me a chance. Okay. What I was saying is, right now they can't
01:07:10.700
be understood, but I think they could be, we could do things differently so that they could
01:07:14.300
be understood, and actually be better. That's the big picture I'm trying to get across. Okay.
01:07:18.620
So let's suppose, instead of saying, we'll just randomly take whatever people upload onto
01:07:24.220
the internet and then use that as the basis for these types of algorithms. Um, what if instead
01:07:32.700
we said, you know what, we're going to put out a bounty. We need better input data that reflects
01:07:38.540
what Asia is. And then some of these clubs I was talking about, we sometimes call them MIDs,
01:07:43.660
but whatever, there's like these data trusts, these clubs of people like the ones who would publish,
01:07:47.420
um, they would start saying, okay, we've just assembled a hundred thousand images of Asia.
01:07:53.660
And then the people running the algorithm would say, wow, we just tested that. That improves
01:07:59.100
our performance. We're not just getting porn anymore. You know, we'll pay you this much.
01:08:02.860
And then they say, well, you know, we're going to collectively bargain with you. We think we
01:08:06.220
deserve that much. And then they go back and forth, and what they get is a fair market
01:08:10.860
price. So what you're doing is you're using the market to fix it. And now people understand what's
01:08:16.860
going on. And the way you know they understood it is they got paid, right? If they didn't
01:08:22.540
understand it, they wouldn't get paid. And so basically you end up in the same situation
01:08:27.100
that you have in capitalism, where understanding is reflected in payment, because
01:08:33.820
you only get paid if you're generating enough value for the other person to be able to pay, or to be
01:08:37.660
willing to pay. So the successful negotiation is your validation of understanding, whereas
01:08:42.620
otherwise, what the hell would it be? So what you do is instead of saying, oh no,
01:08:46.220
how can we fix this horrible thing we've made? You say, um, how do we motivate people to make
01:08:50.780
a wonderful thing? All right, that's the answer. To say, oh, I'm going to have these eggheads,
01:08:56.220
people like me, who are going to understand it, is ridiculous, because you shouldn't even trust us.
01:09:00.540
Like, suppose we could understand it. Do you want to trust me? No, of course not. That's not how
01:09:05.420
civilization should work. Do I want to trust a marketplace made of groups of people who are earning
01:09:10.700
their way to improving the performance of these things? Yes. I like that. And then what I like
01:09:15.340
also is that then every time there's some new algorithm or some new robot, instead of the people
01:09:20.700
saying, oh no, this thing's going to put me out of work. I thought I was going to make a living
01:09:23.660
as an artist, but now this thing's going to make the art. What will I do? I'll just be living in the
01:09:27.660
street. Instead they say, oh, this is a great opportunity. I'm going to join one of these groups and
01:09:32.060
we're going to improve that thing because right now it's garbage and we're going to make it better.
01:09:35.420
We're going to get paid for it. So it inverts the whole thing and turns what we call AI
01:09:41.260
into this endless, infinite set of new opportunities instead of this thing that
01:09:45.020
makes people feel obsolete. And that's what I call understanding.
01:09:50.700
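As a toy sketch of the data-bounty mechanism just described: a group (one of the MIDs, or data trusts) offers a dataset, the operator measures how much it improves a benchmark, and payment is bargained against that measured value. The metric, prices, and settlement rule below are all invented for illustration, not a real marketplace.

```python
from typing import Optional

def evaluate(training_data: list[str]) -> float:
    # Invented stand-in for a real benchmark score; here, just the
    # diversity of the training set.
    return len(set(training_data)) / max(len(training_data), 1)

def bounty_offer(baseline: list[str], contribution: list[str],
                 value_per_point: float = 1000.0) -> float:
    # Operator's opening offer: pay in proportion to measured improvement.
    improvement = evaluate(baseline + contribution) - evaluate(baseline)
    return max(0.0, improvement) * value_per_point

def negotiate(offer: float, ask: float) -> Optional[float]:
    # Crude settlement rule: meet in the middle if the group's ask is
    # within reach, otherwise no deal. Real collective bargaining would
    # be iterative, but the point stands: payment happens only when the
    # data demonstrably generates value.
    return (offer + ask) / 2 if ask <= 2 * offer else None

baseline = ["stereotyped image"] * 100                      # narrow scraped data
contribution = [f"everyday scene {i}" for i in range(100)]  # the group's dataset

offer = bounty_offer(baseline, contribution)
settled = negotiate(offer, ask=offer * 1.5)
print(f"opening offer: {offer:.2f}, settled price: {settled:.2f}")
```

The successful negotiation is the validation step: if the group's data didn't improve the model, the measured improvement, and therefore the payment, would be zero.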
Well, that is a beautiful message to end on. How realistic that is, we are going to find out.
01:09:55.100
Jaron Lanier, it's been an absolute pleasure. We're going to ask you a couple of questions from our
01:09:59.660
supporters, for our supporters, in that Locals group that I told you about,
01:10:04.780
right? The small group, people working together. But before we do, we've got one final question for you:
01:10:11.660
What's the one thing we're not talking about that we really should be?
01:10:17.980
Does AI fulfill the same emotional needs as your ridiculous royal family for people?
01:10:28.140
You're going to have to expand on that a little bit.
01:10:33.100
Well, as an American, I was looking at it like, what is this royal family? Why is it there? Why is,
01:10:41.660
Well, no, but the thing is, see, I think AI sort of is too. It's like,
01:10:46.860
it's a set of principles that come from very powerful people that are supposed to run society.
01:10:52.620
You know, I'm starting to feel like this whole gambit is falling flat. Never mind.
01:11:00.860
Jaron, it's been an absolute pleasure. If people want to find your work online, where's the best
01:11:12.140
place to do that? Where is the best place to engage with this wonderful mind of yours?
01:11:15.900
The last book I wrote was, wait, which one? I'm not sure if I remember the order. The last two were
01:11:22.780
called Dawn of the New Everything, which is a memoir of the early days of virtual reality in the 80s.
01:11:29.660
And the social media book I wrote was called Ten Arguments for Deleting Your Social Media Accounts
01:11:35.820
Right Now. In the UK and UK-adjacent countries around the world, it's published by Bodley Head,
01:11:43.020
which is a fun publisher you have. And, uh, I think on the cover of the VR memoir,
01:11:51.340
the head is wearing a headset. Um, it's an actual head, anyway. Um, and then there's two more
01:11:57.980
books coming out, but I'll leave you in suspense about those. Well, uh, if
01:12:06.380
you did give us your time once again, when the books come out, to chat about what you're talking
01:12:11.260
about, then we'd be very, very grateful. Uh, don't go anywhere, because we're going to ask you
01:12:14.940
a couple of questions for our Locals, but in the meantime, thank you so much for joining us.
01:12:18.700
And thank you for watching and listening. We'll see you very soon with another brilliant episode
01:12:22.540
like this one, all of which go out at 7 PM UK time. And for those of you who like your
01:12:27.340
Trigonometry on the go, it's also available as a podcast. Take care and see you soon, guys.
01:12:32.300
Do you see a connection between, on the one hand, the emergent use of social media and virtual reality, and on the
01:12:40.220
other hand, the growth of the feeling that we all sort of get to choose our own identity?