In this episode, I speak with Nicholas Christakis, sociologist and Director of the Human Nature Lab at Yale, about how technology and human relations have changed over the past decade. We talk about the impact technology has had on us and the role it has played in shaping us.
00:02:14.380And years ago, he told me he didn't use credit cards
00:02:16.380And, you know, he refused to get a cell phone, and he was trying to be off the grid because he didn't want to be surveilled. And I thought he was like a Luddite nut. Yet now, you know, I worry that my every move is being tracked by someone.
00:02:30.720So if, to the extent that you are arguing, and I think you are, that some of what ails
00:02:35.840us at present is due to some of these communication technologies and the ways they've been grafted
00:02:41.520onto very fundamental human desires and exploit those desires, to the extent that we grow
00:02:48.780as a society to cope with those threats, I think we will look back at this period as
00:02:53.880one in which we yielded to, were adversely affected by, and ultimately, let's say, overcame some of these threats.
00:03:03.600Not dissimilar to, as you and I remember, when you couldn't swim in Boston Harbor, the Charles was polluted, the air was polluted, and we sort of cleaned everything up in some sense.
00:03:15.240So maybe we'll clean everything up in that way, but it'll take some time.
00:03:18.140So what is your personal engagement with social media these days?
00:03:44.640And I found that I was acquiring a lot of knowledge: I curated a list of people with diverse expertise and beliefs and followed them, and I really enjoyed it.
00:03:54.660And then I felt it wasn't appropriate for me just to take from the commons; I had to give to the commons.
00:04:00.660So I tried to generate content that would reflect my expertise or my ideas and be useful to others.
00:04:06.940But in the last few years, I found it to be just incredibly toxic.
00:04:11.600And even when I tried to follow only my own people, the feed became full
00:04:17.660of garbage: a lot of trolling, a lot of mostly far-right conspiracy theories, also some left
00:10:53.940to a privileging of reputable sources.
00:10:56.880Like, you know, we've migrated so far away from the evening-news-with-Dan-Rather kind of thing to a world where everyone is an expert, and there's all this kind of good stuff, but also crap, online.
00:11:11.100I think, ironically, people may be willing to pay a bit more for reliability.
00:11:16.340You may not believe it unless you read it in The Economist, you know, then you'll believe it.
00:11:21.020You're not going to believe whatever you see otherwise online.
00:11:22.700So it may reprivilege, you know, sort of credible, real voices.
00:11:30.720I know you've done some research of late on AI and how it changes not just human behavior with respect to technology or information sources, but behavior toward one another, right?
00:11:44.300It alters the mechanics of human cooperation on some level.
00:11:48.200Well, you know, take that strand if you want, but I mean, just generally speaking, what are your thoughts about AI and where all of this is headed for us?
00:11:56.660So I want to tell a brief toy story or toy model or toy example of the question you just put.
00:12:02.640But before I tell that, I want to go on a slight digression.
00:12:06.380Because I struggle a lot, as I suspect you do, with, you know, what is happening with these incredibly powerful tools that are being so rapidly developed in our society.
00:12:16.140And there's this scene in the movie Fiddler on the Roof where the protagonist, a very poor milkman in the town of Anatevka, around the time of the Russian Revolution, just before actually, goes to the town center, and there's a big argument going on there.
00:12:32.020And someone makes a point, and Reb Tevye, he's the character, says, you're right.
00:12:36.440And someone makes the opposite point, and he says, you're right too.
00:12:40.480And then someone says, Reb Tevye, they can't both be right.
00:12:44.600And this is how I feel when I listen to debates by experts on AI.
00:12:49.260I listen to some computer scientists and some tech billionaires who talk about the amazing promise of AI and how there will be some bumps, but mostly it's going to be this extraordinary future and that to oppose it is to be a Luddite.
00:13:03.780And then I listen to other incredibly expert computer scientists and tech billionaires who say the exact opposite, who say, you know, I think I was at an event with Sam Altman a couple of years ago or a year ago, actually.