The Joe Rogan Experience - August 25, 2022


Joe Rogan Experience #1863 - Mark Zuckerberg


Episode Stats

Length

2 hours and 53 minutes

Words per Minute

168.9951

Word Count

29,346

Sentence Count

1,798

Misogynist Sentences

5

Hate Speech Sentences

1


Summary

In this episode, Joe Rogan talks with Mark Zuckerberg, CEO of Meta, about virtual and augmented reality. They discuss the new Quest headset coming in October, with eye and face tracking that carries your expressions onto your avatar, mixed-reality passthrough, and the broader roadmap toward delivering a real sense of presence with another person. Other topics include waveguide displays and the path to true AR glasses, the Ray-Ban smart glasses collaboration, VR fitness and social apps, remote work and holographic telepresence, wrist-based neural interfaces, and how much of human experience should remain physical rather than virtual.


Transcript

00:00:11.000 So, your new Oculus is awesome.
00:00:14.000 It's very impressive.
00:00:15.000 Yeah.
00:00:15.000 It's very cool.
00:00:16.000 Coming out in October, we're going to be talking about it at our Connect conference that's coming up.
00:00:25.000 Yeah, pretty excited about it.
00:00:27.000 It's so interesting.
00:00:29.000 When you put it on, so I'll just describe it to people.
00:00:32.000 When I put it on, there was an avatar in front of me, and it was an alien woman.
00:00:36.000 And the alien woman, when I moved my mouth, she moved her mouth.
00:00:39.000 When I moved my eyes left and right, it's tracking my eyes.
00:00:42.000 When I make an angry face, it makes an angry face.
00:00:47.000 It's incredible.
00:00:49.000 You can see the evolution and the progress of this stuff where it's getting to the point where it's mimicking human patterns in kind of a creepy way.
00:01:02.000 But it's very cool.
00:01:03.000 Yeah, so for me this stuff is all about helping people connect.
00:01:09.000 The way that I got into this is...
00:01:12.000 I don't know.
00:01:14.000 I just started thinking about what is the...
00:01:17.000 What would be the ultimate expression of basically people using technology to feel present with each other?
00:01:24.000 It's not phones.
00:01:26.000 It's not computers.
00:01:28.000 How do you get this sensation of actually being present like you're right there with another person?
00:01:33.000 And that's to me what virtual and eventually augmented reality are all about.
00:01:38.000 And there's just this whole technology roadmap that we basically just need to go run down over the next decade to unlock that.
00:01:45.000 So for the next device that's coming out in October, there are a few big features.
00:01:54.000 I mean, the one that you're talking about, basically social presence.
00:01:58.000 I mean, the ability to now have kind of eye contact in virtual reality.
00:02:04.000 Have your face be tracked so that way...
00:02:07.000 Your avatar, it's not just like this still thing, but if you smile or if you frown or if you pout or, you know, whatever your expression is, have that actually just in real time translate to your avatar.
00:02:18.000 I mean, that's obviously like our facial expressions are just a huge, that's like a, you know, there's more nonverbal communication when people are with each other than verbal communication.
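To make the idea concrete, here is a minimal sketch of how tracked expression weights might be mirrored onto an avatar each frame, with a little smoothing so the face doesn't jitter. The class names and blendshape keys are hypothetical, not the actual Quest or Movement SDK API.

```python
# Minimal sketch: driving an avatar from tracked expression weights.
# All names here (TrackedFace, Avatar, the blendshape keys) are hypothetical.
from dataclasses import dataclass, field

@dataclass
class TrackedFace:
    # Normalized weights in [0, 1], as a face tracker might report them.
    blendshapes: dict = field(default_factory=dict)

@dataclass
class Avatar:
    expression: dict = field(default_factory=dict)

def apply_expression(face: TrackedFace, avatar: Avatar, smoothing: float = 0.3) -> None:
    """Copy tracked weights onto the avatar with simple exponential smoothing
    so the rendered face does not jitter from frame to frame."""
    for key, target in face.blendshapes.items():
        current = avatar.expression.get(key, 0.0)
        avatar.expression[key] = current + smoothing * (target - current)

# Per frame: whatever the headset reports gets mirrored onto the avatar.
face = TrackedFace(blendshapes={"smile": 0.8, "jaw_open": 0.1, "brow_furrow": 0.0})
avatar = Avatar()
apply_expression(face, avatar)
print(avatar.expression)  # smile climbs toward 0.8 over successive frames
```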
00:02:27.000 You had a really good point, too, about face tracking.
00:02:30.000 If you're doing a FaceTime call, you don't look each other in the eye.
00:02:35.000 Because you're looking at the camera to look them in the eye, and then you don't see the person.
00:02:38.000 So if you look at the camera, you're looking up.
00:02:42.000 And if you look down at the actual screen, you're not making eye contact with the person.
00:02:48.000 But this is able to recreate actual eye contact.
00:02:52.000 Yeah.
00:02:52.000 With the avatar.
00:02:53.000 Yeah, no, this will be the first time really to do that.
00:02:56.000 You know, I mean, when we're using technology today, I mean, it's great to be able to make phone calls and video calls and all that.
00:03:02.000 I mean, if you can't be with someone today, it's nice to be able to see their face.
00:03:06.000 But when you're on a video call, you don't actually feel like you're there with the person, right?
00:03:10.000 I mean, you get some signals, some information, you can see their face.
00:03:14.000 But the whole time you're kind of trying to convince your brain that you're actually there with them.
00:03:21.000 But your brain knows, right?
00:03:22.000 It knows, at kind of a deep level, that you're not actually there with them.
00:03:27.000 You're just getting some information about what they look like.
00:03:30.000 And to me, what virtual reality unlocks is it basically really convinces your brain that you're there.
00:03:39.000 And when you're in there, you're...
00:03:43.000 You have to basically try to convince your brain that this isn't real, right?
00:03:46.000 And that you're not present.
00:03:47.000 And there are all these just subtle signals and things that either deepen the illusion or break it.
00:03:56.000 That each time we do a new version, we just try to...
00:04:02.000 You know, break down a few more of the barriers.
00:04:04.000 And one of the big ones early on, well, the first one, obviously, was just having a headset and being able to look around.
00:04:11.000 And for that, one of the key things is that your eye basically refreshes...
00:04:17.000 I'll call it every five milliseconds or something.
00:04:19.000 So if you turn your head and the image isn't kind of refreshed to where you're looking within five milliseconds, then there's this huge mismatch between your visual system and your vestibular system and your kind of balance in your ear.
00:04:33.000 And people used to kind of feel uncomfortable from that, right?
00:04:36.000 Because it's like a physical discomfort because what you were looking at didn't match as you were rotating your head.
00:04:43.000 So that was kind of the first thing.
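The vestibular-mismatch point can be framed as a simple latency budget: if tracking, rendering, and the display's own refresh interval add up to more than the window he mentions, your eyes and inner ear disagree. A rough sketch, with illustrative numbers rather than actual headset specs:

```python
# Rough motion-to-photon check: tracking + rendering + up to one full display
# refresh interval before the updated image is actually shown. The ~5 ms budget
# is the figure cited in the conversation and is illustrative only.
def meets_latency_budget(refresh_hz: float, tracking_ms: float, render_ms: float,
                         budget_ms: float = 5.0) -> bool:
    display_ms = 1000.0 / refresh_hz          # worst-case wait for the next refresh
    motion_to_photon_ms = tracking_ms + render_ms + display_ms
    return motion_to_photon_ms <= budget_ms

# A 90 Hz panel alone contributes about 11 ms, so a naive pipeline blows the budget;
# real headsets lean on techniques like reprojection to compensate.
print(meets_latency_budget(refresh_hz=90, tracking_ms=1.0, render_ms=5.0))  # False
```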
00:04:44.000 Then we got hands.
00:04:46.000 And there was this whole thing that was super interesting there where at first we wanted to, you know, display your whole arm, which makes sense, right?
00:04:54.000 Because, I mean, you'd think, okay, it's a little weird to just see your hand.
00:04:57.000 But it turns out that your brain is perfectly willing to just accept seeing your hand without your arm.
00:05:06.000 Because your hand is the thing that it's trying to manipulate.
00:05:09.000 And as a matter of fact, if we kind of interpolated and got your arm position wrong, right?
00:05:14.000 So we'd get into these cases where your hands were here, and we'd sort of guess that your arm was like that or something.
00:05:20.000 And if your arm was actually like that, but we displayed it so that it was in like that, you're like, ah, my elbow's broken!
00:05:25.000 It felt really wrong.
00:05:27.000 So it's actually much better to show a limited number of signals, but get them right.
00:05:31.000 And then you can just add on over time.
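The "show fewer signals, but get them right" point translates naturally into a confidence gate: render a body part only when its estimated pose is trustworthy, rather than guessing at an arm and getting the elbow wrong. A toy version, with made-up part names and thresholds:

```python
def visible_parts(pose_confidence: dict, threshold: float = 0.9) -> list:
    """Return only the body parts whose pose estimate is confident enough to draw."""
    return [part for part, conf in pose_confidence.items() if conf >= threshold]

frame = {"left_hand": 0.98, "right_hand": 0.97, "left_forearm": 0.55, "right_forearm": 0.60}
print(visible_parts(frame))  # ['left_hand', 'right_hand']: hands only, no guessed arms
```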
00:05:33.000 So for previous versions before this, the kind of eye contact was all just AI simulated, but we didn't actually know when you were making eye contact because we weren't tracking the eyes.
00:05:44.000 And now for this version and hopefully a lot of the different ones that we build going forward, you'll be able to have realistic facial expressions and more translated directly to your avatar.
00:05:56.000 But there's this whole roadmap of basically how do you deliver this real sense of presence, like you're there with another person no matter where you actually are.
00:06:03.000 It was very impressive because even when I moved my jaw side to side, it did that, it made the O face.
00:06:10.000 It's really interesting.
00:06:12.000 And you know, you were saying also that the way this is tracking is you're doing this without putting something on your body, without putting trackers on your body.
00:06:23.000 But do you ultimately think that that's, like, are we gonna go Ready Player One where you have like a haptic feedback suit?
00:06:29.000 And you have to zip this thing up to get into a game and in that way you're gonna be fully immersed, or do you think that it can get to the point where it can mimic the movements of your body accurately without you having to wear something?
00:06:44.000 So I think that there will be opportunities to wear things to augment the experience further, right?
00:06:49.000 So we already have these experiments with haptic gloves, where you can like, if you touch a digital object, right, if you drop a ball from one hand to the other, you can feel the ball in your hand physically.
00:07:00.000 And that's pretty cool.
00:07:03.000 I want to design this in a way where you don't need that, right?
00:07:07.000 So today there's two primary modes of doing the tracking.
00:07:10.000 I mean, there's this kind of notion of inside-out tracking.
00:07:12.000 You're wearing the headset and it tracks your motion.
00:07:15.000 It tracks your hands.
00:07:16.000 Eventually it'll track your legs with an AI model.
00:07:20.000 And you can do that all with your headset.
00:07:21.000 And the big advantage of that is you don't need to have a whole lot of different devices.
00:07:26.000 Eventually you'll be able to do it without even having controllers.
00:07:28.000 You'll just have the headset.
00:07:29.000 The headset will get smaller.
00:07:30.000 It'll be more portable.
00:07:31.000 You'll be able to bring it around.
00:07:32.000 You don't want to have a setup that has like 10 pieces.
00:07:36.000 There are going to be times when you want to have that, I think, for a sort of super deep experience, or maybe you have it at your home.
00:07:43.000 But I think ultimately people are going to just want to have versions of this that they can bring around, whether it's on an airplane or you're doing work at the office or you're going to a coffee shop or whatever.
00:07:53.000 For that, you really just want to make it work from the device.
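One way to picture the inside-out approach he describes: everything in a tracked frame comes from the headset itself, with hands observed by its cameras and, eventually, a body pose inferred by a model rather than reported by extra trackers. The types below are illustrative only:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Position (x, y, z) plus orientation quaternion (w, x, y, z).
Pose = Tuple[float, float, float, float, float, float, float]

@dataclass
class TrackedFrame:
    headset: Pose                          # always available from onboard sensors
    left_hand: Optional[Pose] = None       # seen by the headset's cameras; can drop out
    right_hand: Optional[Pose] = None
    inferred_body: Optional[dict] = None   # filled in by a body-pose model, if one runs

def has_both_hands(frame: TrackedFrame) -> bool:
    return frame.left_hand is not None and frame.right_hand is not None

frame = TrackedFrame(headset=(0.0, 1.7, 0.0, 1.0, 0.0, 0.0, 0.0))
print(has_both_hands(frame))  # False: hands not currently in view
```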
00:07:58.000 So just from a pair of glasses or something along those lines?
00:08:03.000 Right now, there's kind of two...
00:08:05.000 The concepts of virtual reality and augmented reality are sort of on two different development paths, but they're obviously fundamentally interrelated.
00:08:17.000 So virtual reality, it's kind of possible to build today.
00:08:20.000 Quest 2, it's pretty popular, doing well.
00:08:22.000 Hopefully the new one that comes out, I think it's a pretty big step above it.
00:08:28.000 But you can build that today.
00:08:30.000 There's a lot of new technology that we've researched that goes into that.
00:08:33.000 But it also is building on top of decades of advances in displays that came from TVs and then laptops and phones.
00:08:44.000 And some of the display technology gets to piggyback on those decades of innovation and all these different companies that have done that work before.
00:08:54.000 AR is a pretty different beast because what you really want to get to is not a headset.
00:09:00.000 You want to get to something that's like a normal-looking pair of glasses that is...
00:09:07.000 I mean, it won't be like a wireframe because you'll need to fit some electronics in it, where you'll basically need to have a computer in there and speakers and a microphone and batteries and a laser projector.
00:09:19.000 And then the display, which, you know, we and a lot of other folks think is going to be this technology called waveguides.
00:09:43.000 So is a waveguide a type of technology?
00:09:46.000 What is a waveguide?
00:09:47.000 Yeah.
00:09:49.000 Basically, they can be made of different substances, plastic, glass, different substrates.
00:09:58.000 And they basically get etched or printed in different ways.
00:10:01.000 And there's this big debate right now.
00:10:04.000 Where a lot of the research is going into what is the right way to basically create these waveguides that have the right properties.
00:10:10.000 Because you want, for augmented reality, to get something that's a wide enough field of view.
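A back-of-envelope relation behind that field-of-view question (the numbers are illustrative, not Meta's designs): light stays trapped inside a waveguide by total internal reflection, and the critical angle of the substrate bounds the range of angles it can carry, so higher-index materials leave more angular headroom for a wider field of view.

```latex
% Critical angle for total internal reflection in a substrate of refractive index n:
\sin\theta_c = \frac{1}{n}
\quad\Rightarrow\quad
\theta_c = \arcsin\!\left(\tfrac{1}{1.5}\right) \approx 41.8^\circ \ \text{(ordinary glass)},
\qquad
\theta_c = \arcsin\!\left(\tfrac{1}{2.0}\right) = 30^\circ \ \text{(high-index substrate)}
```

Rays have to strike the surfaces at angles steeper than the critical angle to stay guided, which is part of why the debate comes down to substrate materials and how the gratings get etched or printed.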
00:10:16.000 So you can imagine in five years, we're having this conversation.
00:10:20.000 I'm not here.
00:10:21.000 You're wearing AR glasses.
00:10:23.000 Hologram Mark is here.
00:10:26.000 So it's not only that it's kind of working as a hologram, but there are all these different dimensions beyond just being, like, a better video chat.
00:10:37.000 If we wanted to play poker, you know, it's like I could, you know, I could, like, deal a deck of cards and we could play.
00:10:43.000 Hologram me...
00:10:45.000 It could deal hologram cards, and you could have your glasses on, and physical you there could pick up the hologram cards, and you could have a poker night where some of your friends are there physically and some of them are there as holograms.
00:10:59.000 It's actually kind of wild.
00:11:00.000 One of the thought experiments that I like to do is thinking about how few of the things that we physically have in the world actually need to be physical.
00:11:09.000 Obviously, things like chairs need to be physical, where you're not going to be sitting on a hologram.
00:11:13.000 Food needs to be physical.
00:11:15.000 But most entertainment-type stuff, I mean, not just cards, but games, most media, TVs in the future probably won't need to actually be physical things.
00:11:24.000 It'll just be like an app.
00:11:27.000 We'll just have an app there on your wall, and it's like, snap your fingers.
00:11:33.000 Yeah.
00:11:33.000 Get the hologram there for the TV and we can have our glasses and watch whatever you want there.
00:11:38.000 I don't know, they're sort of limited to being rectangular now because of a bunch of limits in terms of the physics of how they get produced, but in the future you'll just have some high school students or college students developing apps and they'll just be wild.
00:11:52.000 Crazy stuff will just kind of get created.
00:11:55.000 So you'll eventually be able to kind of have that all come through these AR glasses.
00:12:01.000 So are there these AR glasses, are they in production now?
00:12:07.000 Are they in development now?
00:12:08.000 Like when you talk about this kind of technology where you can see things that aren't there and look at maps and watch videos and have it all on a small computer that's in the frame of glasses, do they exist already?
00:12:20.000 No.
00:12:21.000 I think we'll start to get stuff that kind of looks like the full version of this over the next...
00:12:28.000 I'd say...
00:12:30.000 Three to five years.
00:12:32.000 But I think it'll also start off pretty expensive once it's available, and then it'll take a while to work down to something that's like hundreds of dollars.
00:12:43.000 There are versions of this that you can start to see if you relax some of the constraints, right?
00:12:48.000 So the kind of ultimate AR experience is that like, okay, you just have normal looking glasses that can kind of have all of these, have holograms, make it so you can interact with people wherever you want.
00:13:01.000 But...
00:13:03.000 If you relax the form factor constraint, so you have a headset instead of normal-looking glasses, then the other thing that's coming in the new device that we're shipping in October is mixed reality in VR. So we got to play around with this a little bit in the sword fighting experience that we did.
00:13:22.000 Basically, the thing about mixed reality is you see the physical world around you.
00:13:27.000 In the context of VR, it's not happening through a waveguide.
00:13:31.000 It's basically happening through cameras on the device that capture the world and then translate that in real time into stereo images, so different images in both eyes, because that's how we see in 3D: our two eyes see slightly different things.
00:13:50.000 The computers are putting that together on the fly.
00:13:56.000 And then you can overlay digital objects on top of that.
00:13:58.000 So when we were sword fighting, it's like the version of me and my sword, it's like that was a digital thing, but otherwise it was in your lobby, right?
00:14:04.000 And you could see your lobby.
00:14:05.000 So you could start to see those kind of AR experiences starting to get built, but in a form factor around mixed reality VR first.
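A sketch of the passthrough loop he's describing: capture a frame from each outward-facing camera, adjust it for the matching eye, then composite the digital objects on top so each eye gets a slightly different image. Everything here is a stand-in for illustration; the real pipeline runs on the headset, not in Python.

```python
def reproject(image: str, eye: str) -> str:
    # Placeholder for correcting the camera viewpoint to the eye's viewpoint.
    return f"{image}|reprojected({eye})"

def composite(image: str, obj: str) -> str:
    # Placeholder for overlaying a rendered digital object on the camera image.
    return f"{image}+{obj}"

def passthrough_frame(captures: dict, scene_objects: list) -> dict:
    """Build one stereo frame: a separately composed image for each eye."""
    out = {}
    for eye, raw in captures.items():          # e.g. {"left": ..., "right": ...}
        image = reproject(raw, eye)
        for obj in scene_objects:
            image = composite(image, obj)
        out[eye] = image
    return out

print(passthrough_frame({"left": "cam_L", "right": "cam_R"}, ["sword", "avatar"]))
```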
00:14:17.000 So that's one direction that I think that the industry is exploring.
00:14:21.000 Yeah.
00:14:22.000 The other is basically looking at, okay, so we've got to constrain this form factor because we want to have something that looks like normal glasses.
00:14:28.000 What's the most technology that we can fit into a pair of normal-looking glasses today?
00:14:32.000 So you kind of go from both sides.
00:14:34.000 It's like, what's the experience that we want to have even if we can't get the form factor right?
00:14:38.000 And what's the best we can do with the form factor?
00:14:40.000 And then each year, those two basically converge.
00:14:43.000 But on the smart glasses side, we work with Ray-Ban to basically build these smart glasses.
00:14:50.000 And they're the best-selling smart glasses that have ever been built.
00:14:56.000 We're continuing to work on new versions of it, but basically you can get a pair of Ray-Ban Wayfarers now that have a microphone and they have a speaker and they can take photos and take videos and you can post them to Instagram.
00:15:09.000 They do it on voice command?
00:15:11.000 Yep, yeah.
00:15:11.000 Oh, so you can say take a photo of this?
00:15:13.000 Yep, take a photo, take a video.
00:15:14.000 And what kind of image quality are you getting off of these things?
00:15:19.000 It's pretty good.
00:15:21.000 I want to make sure I don't get the spec wrong and I just have all these different numbers in my head because I want to make sure I don't confuse it with the new version.
00:15:28.000 Is it like similar to like a selfie camera?
00:15:30.000 Yeah.
00:15:30.000 Limited in comparison to the back camera?
00:15:32.000 Yeah.
00:15:32.000 No, it's not quite as good as the back cameras today.
00:15:36.000 But it's...
00:15:37.000 But yeah.
00:15:38.000 No, it's like...
00:15:39.000 I mean you look at the quality and it's good.
00:15:41.000 And it fits in like the corner of glasses.
00:15:44.000 Does that bring about privacy concerns?
00:15:46.000 If people could just like start filming things?
00:15:49.000 Yeah.
00:15:49.000 So, I mean, we designed it so it has a light on it.
00:15:52.000 So whenever there...
00:15:52.000 Yeah.
00:15:53.000 I mean, that I think is actually a really important part of this.
00:15:55.000 Could you put a piece of tape over the light?
00:15:57.000 I mean, I guess in theory, but it's...
00:16:01.000 If you were a creep.
00:16:01.000 Yeah, there it is.
00:16:02.000 Yeah.
00:16:02.000 So that little thing in the corner, is that a highlight or a light?
00:16:05.000 No, that's the light.
00:16:07.000 And it blinks and it's a pretty active indicator.
00:16:12.000 And I think if you put a piece of tape over it, it would probably interfere with the camera.
00:16:17.000 And so those Wayfarers are essentially the same size as normal Wayfarers?
00:16:22.000 Do they have thicker arms?
00:16:24.000 I think it's ever so slightly thicker, but it's within the same ballpark of weight.
00:16:31.000 So we worked with Ray-Ban, and these are some of the most popular and successful glasses, and part of the reason why I wanted to work with them is because they know a lot about glasses design, and that's not my thing.
00:16:43.000 So I figure, okay, they'll really bring to the table some constraints around like, okay, how big can this actually be before it starts getting too heavy on your face and uncomfortable to wear for long periods of time?
00:16:54.000 And I've just learned a ton working with those guys.
00:16:57.000 I mean, they're super sharp.
00:16:58.000 They're this great Italian company.
00:17:00.000 And the collaboration has been awesome so far.
00:17:02.000 So I'm looking forward to building more stuff with them.
00:17:05.000 But yeah, so you basically have these two paths to the technology at once.
00:17:09.000 You're kind of trying to explore...
00:17:11.000 Yeah.
00:17:28.000 You know, make it a really great design.
00:17:30.000 And then just eventually these things converge.
00:17:32.000 And then eventually they'll converge and you'll get the functionality and you'll get the kind of form factor, but it'll still be kind of expensive for a little while.
00:17:38.000 And then you fast forward a few years from there and then I think it'll really be a mainstream thing.
00:17:42.000 But even VR today is doing quite well.
00:17:46.000 I mean, I don't think we've released exact numbers on the sales, but it's within the ballpark of Xbox or PlayStation or those kind of platforms.
00:17:57.000 Really?
00:17:57.000 Yeah, so I mean, we started off, this was sort of my theory on this is like, all right, gaming is use case number one for VR. But then pretty quickly, if you look at any platform, right, so computers, phones before, games are a huge part of those platforms.
00:18:14.000 But if you look at the main things that people do, it's really about communication, because I mean, this is what people do, right?
00:18:20.000 It's like we communicate.
00:18:21.000 And you know, that's kind of how we get meaning in our life is interacting with other people.
00:18:26.000 So I was like, all right, that's going to happen with VR. And sure enough, if you look at the top apps in VR now, the top few are basically social metaverse, hang out with your friends' apps that are not centered around any specific game.
00:18:41.000 So that kind of hypothesis around, okay, VR is starting to add different use cases.
00:18:46.000 It's going from games first, games are still growing and going to be huge, to just kind of social, hang out with friends, be present.
00:18:54.000 And we're getting all these other use cases that are kind of crazy and are happening sooner than I thought.
00:18:58.000 So, you know, another big one is fitness, right?
00:19:01.000 Just because, I mean, in a way, I mean, these are like the first physical computing platforms.
00:19:06.000 It's like you don't, like, move around while you're on your computer.
00:19:08.000 I guess you could a little bit on your phone, but it's sort of awkward because you're looking at the small screen.
00:19:12.000 But, like...
00:19:13.000 VR and eventually AR are really designed to be able to move around and do things and interact with the world, and that's really important to me.
00:19:20.000 I hate sitting in front of a desk.
00:19:23.000 I just feel like if I'm not active, I'm wasting my day.
00:19:28.000 So, I don't know, there have been these awesome experiences.
00:19:31.000 Basically, a couple of companies, you can kind of think about it like Peloton for VR, where it's like Peloton, they sell you the bike or the treadmill, and then you buy the subscription and you get the classes.
00:19:43.000 There's a couple of companies that basically do cardio, they do dancing, they do boxing.
00:19:48.000 But instead of having to buy a bike, you just have your Quest headset.
00:19:51.000 And once you have that, you buy a subscription to these companies, and you can just take lessons and do different things and fitness.
00:20:02.000 I thought that was pretty wild.
00:20:03.000 I thought that in the long term, something like that would start to happen.
00:20:06.000 But it happened way sooner than I thought, which was really cool to see.
00:20:11.000 Well, if you do one of the boxing games, you realize right away, this is a really good workout.
00:20:15.000 The virtual boxer, when they come towards you and they're in that ring and they start throwing punches at you and you're moving your head, you really wind up getting a really high heart rate.
00:20:27.000 You put out a lot of energy.
00:20:29.000 It's really good cardio.
00:20:30.000 I found my feet would hurt.
00:20:33.000 Because I was pivoting and moving so much, because I was constantly switching stances and trying to get away from punches.
00:20:40.000 And as you get further on in some of the games, the opponents become more difficult.
00:20:45.000 It's really exciting.
00:20:47.000 It's fun.
00:20:47.000 And you get out of there and you're really exhausted.
00:20:49.000 It's a really good workout.
00:20:51.000 Yeah, so I mean, and that's just the kind of nature of the whole platform, and that's one of the things that I love about it.
00:20:56.000 But it's, I mean, those aren't even trying to be fitness apps, right?
00:21:01.000 They're just fun.
00:21:02.000 Yeah, it's just that they just happen to be physical.
00:21:04.000 Are they capable of having two people, like, we had a fencing match today, you and I did.
00:21:09.000 Yeah.
00:21:09.000 Which is really fun.
00:21:10.000 Are they capable of doing that with boxing now?
00:21:13.000 Where two people have a...
00:21:14.000 Because the thing about the fencing match that we had that I thought was really interesting was like you were facing one direction like 30 feet away and I was facing another direction.
00:21:24.000 Like we weren't even facing each other.
00:21:26.000 It didn't even matter.
00:21:27.000 Yeah.
00:21:27.000 So you could be in Bangladesh and I can be in Rome and we could be playing a game together.
00:21:34.000 Yeah, so, I mean, the fencing demo, our internal team built, because we haven't released the new device yet, so in order to kind of make stuff work, we kind of build that ourselves.
00:21:44.000 But the boxing ones are made all by other game developers and different developers.
00:21:49.000 And they can do that?
00:21:50.000 Yeah, there's nothing stopping them from having a multiplayer mode.
00:21:52.000 I'm not sure if any of them do yet.
00:21:54.000 All the ones that I've played, I mean, I do Thrill of the Fight, and I really like Creed.
00:21:58.000 But I do those as...
00:22:00.000 As single player.
00:22:01.000 I don't know if they have multiplayer modes, but there's nothing holding them back from doing that.
00:22:05.000 So I'd imagine that they will add that over time.
00:22:07.000 It seems like a smart move.
00:22:09.000 I mean, we're talking about martial arts, like, in terms of, like, Muay Thai and other...
00:22:14.000 I think Jiu-Jitsu would be a real problem.
00:22:16.000 But, you know, because you'd have to physically have something to resist against.
00:22:19.000 But if you could figure out how to do a Muay Thai mode where...
00:22:25.000 The only problem would be things change when you make contact with stuff.
00:22:30.000 Yeah.
00:22:30.000 Things change in terms of, like, positioning and movement and what you're able to get away with and not get away with.
00:22:35.000 Whereas with boxing, boxing's pretty good for that.
00:22:38.000 Like it's probably like the best combat sport for VR because you don't even have to hit anything to feel like you kind of are.
00:22:46.000 And when you get hit with a jab, your screen lights up like you feel like you got hit.
00:22:52.000 Yeah, I mean for kicking, with punching it's a little easier to throw a punch and then just pull it back.
00:22:57.000 With kicking, if you're not hitting a pad or something, you want to like continue rotating or else it's tough to really put your weight into it.
00:23:05.000 Do you envision a world where one day the physical experience of the game is going to be inconsequential because everything is going to be taking place in your mind?
00:23:17.000 Like, it'll be so good, whether it's with haptic feedback or some other kind of input, where you'll be able to actually experience, very Matrix-like, something that's not there.
00:23:29.000 I mean, is that ultimately where all this is going?
00:23:31.000 I don't know.
00:23:35.000 I just think that so much of our experience is our body and not just our mind.
00:23:40.000 I mean, there's this strain of kind of philosophical thought that's like, okay, what is a human?
00:23:44.000 It's really just your brain, right?
00:23:46.000 And I don't subscribe to that at all because, I mean, I don't know how you feel about stuff, but I just feel like my whole energy level and mood and how I interact with the world is all just based on my body. It's so physical.
00:24:05.000 I guess maybe over time it would be possible to just simulate that through your brain, but I don't believe that we're just brains in tanks or just brains in a body.
00:24:16.000 I think our physical being and the actions that we take there are just as much of the experience of being human.
00:24:30.000 I would agree with that, but I would also say that a lot of people just like to sit down and watch movies, and that's a very alien experience to the human body, and it's something we've become very accustomed to.
00:24:39.000 So what I'm thinking is, if technology advances and it keeps going further in the direction that it's headed now, more immersive, more convincing, you know, that uncanny valley gets bridged and all of a sudden you have a real life experience,
00:24:55.000 Now, whether this is through some sort of neural link type deal or some new technology that tricks the mind into actual experiences.
00:25:06.000 I mean, ultimately, isn't that where this is all going to go?
00:25:08.000 Where you're going to be able to have experiences without having them?
00:25:11.000 And that's not to negate the beauty of real experiences or not to say we won't have real experiences anymore.
00:25:17.000 But if you wanted to have a real experience, and we talked about economic restrictions that would keep you from being able to fly to another part of the world, well, you could go there with your Oculus.
00:25:28.000 You could have a very realistic 3D representation of those places.
00:25:33.000 You took me to Rome today.
00:25:34.000 I got to see Rome.
00:25:36.000 It's very cool.
00:25:37.000 But do you think that ultimately that is going to get to a time where the technology is so advanced that it's indiscernible?
00:25:45.000 That you could have a podcast experience with me.
00:25:49.000 You and I could have this same conversation right now, but neither one of us be in this room.
00:25:55.000 Yeah, I think the nature of technology is that...
00:26:02.000 It's interesting to sort of hypothesize what the kind of extreme end state is going to be when something becomes kind of all-consuming.
00:26:10.000 But I think the normal way this stuff plays out is that some things are more easily mimicable or replaceable than others.
00:26:19.000 So we were talking before about, okay, boxing, yeah, you can do that pretty well.
00:26:23.000 Maybe one day we'll get Muay Thai and kicking in.
00:26:25.000 Jiu-jitsu, that's going to be pretty hard, right?
00:26:27.000 Because you need, like, all kinds of resistance.
00:26:29.000 So...
00:26:30.000 I mean, I think the way that this progresses is like, it'll keep on being able to do more things really well.
00:26:40.000 And I would guess that there will be other technologies or other things will advance in the world that will prevent any one thing from ever subsuming everything else.
00:26:50.000 So, I don't know.
00:26:53.000 I mean, maybe it's just because I'm in the position of, like, working on building this stuff every day.
00:27:01.000 I'm just trying to make it useful for a lot of things.
00:27:04.000 To jump to "it's so useful that it's better than everything"...
00:27:10.000 That's so far ahead of where we actually are because I'm in the trenches every day trying to get this to work.
00:27:15.000 Maybe too close to it.
00:27:17.000 But from a bird's eye view, if you looked at where this is going, it's going to become more immersive.
00:27:23.000 It's going to get better.
00:27:24.000 It's going to be more convincing.
00:27:26.000 And this is the real argument for simulation theory, right?
00:27:30.000 The argument for simulation theory is if there's so many civilizations out there in the universe and they're so advanced, ultimately one has to create a simulation.
00:27:41.000 It seems like that's going to happen.
00:27:43.000 If the human race could survive another 100,000 years, the odds we wouldn't create a really realistic simulation are probably pretty low.
00:27:54.000 Yeah, I think the question is just how realistic and how good.
00:27:57.000 To me, the holy grail is building something that can create a sense of human presence.
00:28:05.000 I've spent the last almost 20 years of my life building social software, making it so that whatever limited computation you have, you can kind of share something about your experience.
00:28:17.000 It started off with primarily text when I was in college.
00:28:20.000 Then we all got these smartphones, they had cameras, and then it became a lot of photos.
00:28:24.000 Now the mobile networks are good enough that it's starting to be a lot more video.
00:28:28.000 And to me, this kind of immersive experience is clearly going to be the next step.
00:28:32.000 But there's this question about...
00:28:37.000 So being able to feel like you're present with someone will unlock so many different types of value for a lot of people.
00:28:44.000 And there's social and entertainment.
00:28:46.000 There's professional.
00:28:49.000 I follow this economist who basically studies how economic opportunity and upward mobility are sort of limited, or vary, based on what zip code you grow up in.
00:29:00.000 Because there's different opportunities in different places.
00:29:03.000 Imagine if you didn't have to move to some city that didn't have your values in order to be able to get all the economic opportunities.
00:29:10.000 That would be awesome.
00:29:11.000 So in the future where you can just use AR, VR and teleport in the morning to the office and show up as a hologram, I think that's going to be pretty sweet.
00:29:19.000 It'll unlock a lot of economic opportunity for a lot of people.
00:29:25.000 Is it ever going to be 100% as good as being there in person?
00:29:30.000 Probably not.
00:29:31.000 But, like, I mean, I don't know.
00:29:33.000 When we were talking about doing this conversation, you know, we talked on the phone, right?
00:29:38.000 It's like I didn't fly down to Austin to talk about whether to have this conversation.
00:29:41.000 Sometimes it's like whatever amount of simulation you have is...
00:29:47.000 You can create a lot of value even if it's not 100% as good as the actual physical thing.
00:29:55.000 So I just view our job as we'll basically approach that like an asymptote.
00:30:00.000 I don't know if you'll ever be able to do all of the things that you can do in person with a person.
00:30:08.000 We'll just be able to do more and more.
00:30:10.000 If today it's gaming or hanging out, over the next few years it'll be working.
00:30:14.000 So hopefully you'll just be able to teleport in and basically just show up as a hologram and work remotely and live wherever you want, be with your family wherever they live, but just be able to show up in whatever place.
00:30:27.000 I think that that's going to be pretty awesome and I think we'll be able to do that pretty well.
00:30:31.000 It's going to be a real issue for commercial real estate.
00:30:36.000 There's not going to be a lot of offices.
00:30:38.000 If that actually becomes as good as having a cell phone in your pocket and being able to make a phone call, you could just sort of teleport to work.
00:30:46.000 Yeah.
00:30:47.000 It's going to be a problem.
00:30:47.000 No one's going to want to work.
00:30:50.000 Well, that's a different question.
00:30:51.000 I mean, whether or not they're going to physically want to be there, rather.
00:30:55.000 Maybe they'll want to work, but they're not going to want to go to the office.
00:30:58.000 Yeah, I mean, maybe.
00:30:59.000 Although I think being physically, being present with people, feeling a sense of presence is pretty important, regardless of where you do it.
00:31:06.000 I mean, I've found, you know, over the last couple of years, the way that stuff, that the work has been done has changed a huge amount.
00:31:13.000 And, you know, it's, there are all these things that are sort of complex about the office.
00:31:17.000 But like, I mean, I see people in person almost every day.
00:31:21.000 Sometimes I probably do more meetings in my house now than I would have before.
00:31:26.000 But Yeah, I don't know.
00:31:27.000 I do think that seeing people in person having that sense of presence makes a big difference.
00:31:34.000 I think so too, but there's definitely a big pushback now about people going to the office rather than working from home.
00:31:44.000 People would rather just do their work from home and they're like, with the internet connections as they are today and the ability to videoconference, why do I have to be physically in the building in order to get my work done?
00:31:54.000 Yeah, no, and I agree with that, too.
00:31:57.000 You know, our company is actually pretty forward-leaning on remote work.
00:32:00.000 I mean, just especially some types of work, especially software engineering, you can do pretty well from a lot of different places.
00:32:07.000 And if you're an engineer, sometimes it's actually better to not be in the office because then people aren't bugging you.
00:32:12.000 You kind of want like a block of like five hours where you can just work on a problem.
00:32:16.000 I don't know.
00:32:17.000 I have this thing where I'll be in the zone, kind of in a flow of concentration, working on something, and my wife will ask me some basic question, and I'll just be like, oh, man.
00:32:30.000 I just lost my flow, and it's like...
00:32:33.000 And from her perspective, it's like, oh, not a big deal.
00:32:36.000 That was a quick question.
00:32:37.000 Just go back to what you were doing.
00:32:38.000 It's like, no, that's not how it works.
00:32:41.000 So I do think, to some degree, having people be able to work remotely is actually pretty useful for a lot of things.
00:32:50.000 But I think we'll need to find this mix.
00:32:53.000 I physically run away from my wife when I have a joke idea.
00:32:56.000 If she's talking and I have an idea, I'll just run away.
00:33:00.000 I just go, I got an idea.
00:33:02.000 She gets it, so it's okay.
00:33:03.000 But yeah, if I'm in the middle of writing and she comes in and interrupts, it's over.
00:33:08.000 Yeah.
00:33:08.000 Just gets shattered.
00:33:10.000 So in some ways, well, you would have to have a real quiet and secure place, but I think for a lot of people, just the wasted time commuting and all that, if you could eliminate that through AR or VR, some sort of a hologram system,
00:33:27.000 just the stress of life would be so much better.
00:33:30.000 Yeah, I mean, that's been, for me, over the last couple of years with COVID, and just kind of rethinking the way that stuff...
00:33:40.000 I think reducing the commute has been one of the big efficiencies.
00:33:44.000 But also being able to live in different places has been nice.
00:33:47.000 I spent a lot of time down in Kauai earlier on, and I got really into surfing and hydrofoiling.
00:33:54.000 I just wake up in the morning and go do that, and then just be really refreshed and go do my full day of meetings, which is obviously not something I could do in Palo Alto.
00:34:05.000 So, I don't know.
00:34:06.000 I'm pretty positive on all this.
00:34:07.000 I think if you can give people the ability to get their kind of fluid state, like flow state work remotely, but then also just be able to kind of in a second teleport to a place and show up as a hologram and be present, I think that that's pretty valuable.
00:34:23.000 Now, that doesn't replace everything, right?
00:34:26.000 I mean, one of the things that I found is...
00:34:29.000 For, you know, larger meetings, one of the most useful things is not actually the meeting itself.
00:34:33.000 It's just getting a chance to catch up with people before and after the meeting, right, when you're in the hallway or something.
00:34:38.000 So, you know, yeah, there's a downside to being so efficient about being able to teleport in and out, too, because you can kind of miss some of those casual downtime moments.
00:34:47.000 But overall, yeah, I mean, I think it's going to create this kind of crazy amount of efficiency there.
00:34:53.000 Yeah, I think people are still going to crave real-world experiences no matter what.
00:34:58.000 Obviously, I do stand-up comedy, so obviously that experience, you must be there.
00:35:04.000 That's part of the fun, is being in the room with people.
00:35:07.000 But I can envision technology improving to the point where you could create a virtual comedy club.
00:35:13.000 And you would see all the different people that have the headsets on in the room, and you would probably get pretty close.
00:35:23.000 There was a lot of people that did Zoom stand-up during the pandemic, and it was awful, because there was no audience.
00:35:29.000 They were just basically doing their act with no crowd.
00:35:31.000 I'm like, don't do that.
00:35:33.000 Don't do that.
00:35:34.000 You need the feedback.
00:35:34.000 It's terrible.
00:35:35.000 It's super awkward just doing public speaking and not having any feedback.
00:35:38.000 Well, if someone's just doing public speaking, there are some really great podcasts where someone like Bill Burr just talks to himself.
00:35:47.000 It's just him ranting about life and stuff, and it's great.
00:35:50.000 He doesn't necessarily need someone to bounce off of.
00:35:53.000 But comedy is a different thing.
00:35:56.000 Comedy by itself with no audience is not good.
00:35:59.000 Yeah.
00:36:00.000 So there is already at least one experience like this that I'm aware of.
00:36:05.000 So we have this Horizon social platform and people can build worlds in it.
00:36:08.000 It's pretty simple today, but it's designed to be this really easy world building platform and people can go in and build stuff.
00:36:14.000 And people built this thing called the Soapstone Comedy Club.
00:36:17.000 And this is actually one of the stories that I've heard of people using VR that I think is really touching.
00:36:23.000 So there's this woman who basically lost her son and was really sad and was grieving for a while.
00:36:31.000 And comedy was just a really important outlet for her.
00:36:34.000 But she had a lot of social anxiety around going and physically being in front of people and performing and doing it at a club.
00:36:40.000 So she started doing it at the Soapstone Comedy Club and had a little bit more anonymity because it was in virtual reality, but she could feel a real sense of presence of other people there.
00:36:50.000 And, I mean, talking to her about it, it's been a really important
00:36:55.000 experience for her, to kind of have this creative outlet and help her get over this grieving that she's had.
00:37:01.000 And it's not something maybe that she would have been comfortable having the kind of full intensity experience of a physical comedy club, but you kind of got a bunch of the way there by feeling like you were present with people there.
00:37:12.000 My friend Brian Redband, he does this thing called Virtual Redband.
00:37:16.000 What does he do?
00:37:17.000 They go to diners and stuff like that.
00:37:19.000 They set it up.
00:37:20.000 VRChat.
00:37:21.000 Yeah.
00:37:21.000 So he does it in Oculus, and he has a bunch of his friends log on at the same time, and they go into a room together and hang out.
00:37:29.000 It's really interesting, because I think that...
00:37:31.000 To be able to have an online community where you go to a place and you all meet up and you're all talking and hearing each other's voices and seeing the avatar moving, like that alien avatar that you showed me today, it's very real looking.
00:37:47.000 I mean, I clearly see that it's this animated thing, but I would liken it to an avatar, like from the movie Avatar, like the Na'vi.
00:37:56.000 There's something cool about it where it's definitely a step above a lot of these things that I've seen in the past.
00:38:05.000 That's moving into this much more realistic sort of place where I could imagine a lot of people just deciding, like, today I'm going to be a penguin.
00:38:14.000 I'm going to go to this diner and hang out with these guys as a penguin.
00:38:18.000 And it's exciting.
00:38:20.000 It's kind of fun.
00:38:21.000 Part of what's a little trippy about it is that in some ways some of these experiences, I think, feel more realistic than...
00:38:29.000 Oh, yeah, for sure.
00:38:48.000 Well, I guess it doesn't quite work because of the headphones.
00:38:51.000 But normally, you know, if you talk, it's coming from that direction.
00:38:56.000 And spatial audio and kind of directional sound, building a spatial model of things, is how we make memories.
00:39:00.000 So you take something like Zoom, and it just completely blows that up.
00:39:04.000 Because now, you know, every meeting that you have looks the same, right?
00:39:07.000 And also, there's no symmetry, right?
00:39:10.000 So if you're in the top left box of my screen, that doesn't mean that I'm in the same place for you.
00:39:16.000 So we actually don't have any kind of...
00:39:28.000 Right, yeah.
00:39:34.000 It's like, okay, you have an avatar.
00:39:35.000 It's obviously not super realistic yet.
00:39:37.000 It'll get better and better over time as the computation gets better.
00:39:40.000 Although, as an aside, I'm not actually convinced that even when we have photorealistic avatars, that people are going to prefer that to the expressive ones.
00:39:46.000 But that's kind of a whole separate tangent that we can go down.
00:39:49.000 I think you'll clearly want the ability to do both, have a photorealistic one and an expressive one.
00:39:55.000 But yeah, I mean, if you're sitting around and someone's a penguin or your friends are clearly cartoony, but you're sitting around a table and you have a shared sense of space and your friend is to your right, which means that you're to their left.
00:40:07.000 And when they speak, you hear it coming from that direction.
00:40:09.000 You actually remember the spatial sense of that in the same way that you would a physical thing, which it's just kind of getting all those details right over time.
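The shared-space point maps onto very ordinary audio math: if your friend sits to your right, their voice should arrive louder in your right ear, and the geometry is symmetric from their side. A toy constant-power pan from a horizontal angle (real spatial audio also models timing differences and head shape, which this ignores):

```python
import math

def stereo_gains(azimuth_deg: float) -> tuple:
    """azimuth_deg: 0 is straight ahead, +90 is fully to the listener's right.
    Returns (left_gain, right_gain) using a constant-power pan law."""
    clamped = max(-90.0, min(90.0, azimuth_deg))
    pan = (clamped + 90.0) / 180.0 * (math.pi / 2)  # map [-90, 90] onto [0, pi/2]
    return (math.cos(pan), math.sin(pan))

print(stereo_gains(0))    # ~(0.707, 0.707): a voice straight ahead, equal in both ears
print(stereo_gains(90))   # ~(0.0, 1.0): a friend to your right, almost entirely right ear
```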
00:40:19.000 I just think that there's...
00:40:20.000 I mean, this to me, this is some of the most exciting work that I've gotten to do in a while because I just feel like building social experiences on phones is so constrained.
00:40:31.000 In some ways, it's awesome because there's billions of people that have phones.
00:40:34.000 So we can build services that get used by billions of people around the world, and that's obviously rewarding in its own way, too.
00:40:42.000 But having the ability to...
00:40:46.000 Define what these next platforms are going to be and have them break out of these boxes that have been really weirdly defined.
00:40:55.000 These things, phones, computers, they were not designed for basically communication and interaction.
00:41:02.000 They were designed for work and certain computational workloads.
00:41:07.000 So a lot of what I'm trying to do is like, okay, well...
00:41:15.000 Yeah.
00:41:17.000 Yeah.
00:41:29.000 A unit of how you interact is around people and how you express yourself.
00:41:34.000 And you'd want to be able to have an avatar and an expression of your identity and be able to just jump between a bunch of different experiences rather than have everything be so siloed.
00:41:44.000 It's pretty wild to try to build this all from the ground up because it's just this incredible breadth and amount of technology.
00:41:52.000 I often get criticized because we're investing just this huge amount in this.
00:41:57.000 We're going to spend...
00:42:30.000 And we haven't even gotten to neural interfaces yet, but we should definitely spend some time on that.
00:42:34.000 But it's like you kind of go across all these different things and it's just this incredibly wide amount of technology that needs to get built in order to basically build and deliver a realistic sense of presence like you're physically there with another person, which I just think is the most magical thing in the world.
00:42:49.000 Well, it's very exciting.
00:42:50.000 The idea that you could have an office in a jungle.
00:42:53.000 All of a sudden, we're going to call this meeting together and we're going to be on the moon.
00:42:57.000 We're going to be next to a volcano.
00:43:00.000 There's going to be a bubbling volcano right next to the desk.
00:43:02.000 It's cool.
00:43:04.000 I love the fact that there's people like you out there doing it.
00:43:07.000 It's expanding the possibilities for this stuff.
00:43:11.000 Neural interfaces.
00:43:13.000 What are your thoughts on that?
00:43:14.000 Where is that going?
00:43:16.000 And where are we at right now?
00:43:19.000 Go back to your comments about the Matrix before.
00:43:22.000 When people think about neural interfaces, or any interface, I think it's important to separate out their sort of...
00:43:51.000 So, some people, I mean, like Elon with Neuralink and those companies, I think, I mean, that's just taking this, like, super far off.
00:43:59.000 I mean, maybe it'll be ready in like a couple decades.
00:44:01.000 I mean, there will probably be interesting use cases... I don't want to be an early adopter.
00:44:15.000 Yeah, I think you want the mature version of that, not the one where it's going to get a lot better next year and you need to get your brain implant upgraded every year.
00:44:26.000 But here's the kind of version of this that I spend a lot of time thinking about.
00:44:29.000 So you have AR glasses, right?
00:44:32.000 And how are you going to control them?
00:44:34.000 How you kind of control any computation devices is obviously super fundamental to what the platform is.
00:44:41.000 So you have a bunch of different modes.
00:44:44.000 One of them is going to be voice.
00:44:46.000 You'll be able to talk to it.
00:44:47.000 But that doesn't always work.
00:44:49.000 If you're in a public place or you want to be discreet or you want to just not annoy the people around you, you're not going to want to dictate everything out loud.
00:44:57.000 A second way is going to be using your hands.
00:45:00.000 So let's say, okay, I snap my fingers.
00:45:02.000 We have a chess game or a poker game.
00:45:04.000 And okay, here's our chess board and I move a piece.
00:45:08.000 It's like, okay, yeah, that I'll do with my hands.
00:45:10.000 That's kind of cool.
00:45:11.000 But, like, you're not going to be walking down the sidewalk, like, manipulating stuff with your hands.
00:45:18.000 Minority Report.
00:45:20.000 Yeah, I mean, I think at some basic level, if you can get past that just being weird, I think most people's hands will just get tired.
00:45:28.000 I mean, if you hold your hands out like this for a long enough period of time, eventually you want to put your hands down.
00:45:39.000 You can basically go and have your mind give commands to the computer, in this case the glasses, without having to speak out loud, without having to wave your hands around.
00:45:53.000 Even though those things will be great for some use cases, you're not going to want them all the time.
00:45:57.000 So the research that we're doing...
00:46:00.000 It's basically input only, so it's not trying to send signals to your brain.
00:46:08.000 It's trying to make it so that your brain can communicate with the computer.
00:46:10.000 And the path that we have is it's based on the fact that we have all these extra motor neurons in our body, right?
00:46:17.000 And part of the reason for that is, like, in case you get hurt, you have neuroplasticity, you can rewire and find a different pathway to kind of send a signal to move your finger or something.
00:46:26.000 There's all these different ways that it turns out our brain could tell this finger to move.
00:46:30.000 But we've sort of optimized individually.
00:46:34.000 We kind of reinforce certain pathways and end up using one kind of motor neuron pathway to do a specific thing.
00:46:41.000 And you have all these others that are not that used.
00:46:43.000 So it turns out you can have a device on your wrist.
00:46:48.000 That basically your brain can communicate with your hand, tell your hand to move in like a pattern that it isn't used to, and then the wristband can sort of pick up those signals and translate them into completely different things like having a virtual hand move in front of you while your physical hand is just kind of sitting there at your side.
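A toy illustration of the wristband idea: treat a short window of activation picked up at the wrist as a pattern and map it to a command, so a barely visible movement can drive the glasses. The thresholding "classifier" below is purely hypothetical; the real system presumably uses learned models over multichannel signals.

```python
def classify_wrist_window(samples: list, fire_threshold: float = 0.6) -> str:
    """samples: normalized activation levels for one short time window.
    Returns a command name, or 'none' if nothing crossed the threshold."""
    peak = max(samples) if samples else 0.0
    if peak < fire_threshold:
        return "none"
    # Hypothetical mapping: a short burst means "select", a sustained one means "dismiss".
    active = sum(1 for s in samples if s >= fire_threshold)
    return "select" if active <= len(samples) // 4 else "dismiss"

print(classify_wrist_window([0.1, 0.2, 0.9, 0.3, 0.1, 0.1, 0.1, 0.1]))  # select
print(classify_wrist_window([0.7, 0.8, 0.9, 0.8, 0.7, 0.8, 0.9, 0.8]))  # dismiss
```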
00:47:13.000 So, you'll be able to have this experience in the future where, like, you're sitting in a meeting, and, you know, your wife texts you, and it pops up in the corner of your glasses, and you want to respond, but you don't want to, like, pull out your phone, because that's kind of rude, right?
00:47:28.000 So you just kind of, like, I don't know, twitch your wrist a little bit, maybe like this, like some super discreet motion that no one even knows you're doing it, and you just, like, send a message.
00:47:38.000 And...
00:47:39.000 That seems like a massive distraction.
00:47:42.000 I mean, people are already distracted by their phones.
00:47:44.000 Like when people get a text message and they're like, hang on a second, let me just answer this real quick.
00:47:49.000 And you're like, okay.
00:47:50.000 And you're sitting there having lunch with someone and they're not talking to you anymore because they're looking at their phone.
00:47:53.000 But now they're going to be looking at these AR glasses and just thinking out text messages.
00:47:59.000 And you won't even know that they're distracted.
00:48:02.000 They're just going to be not connecting with you.
00:48:05.000 I don't know.
00:48:06.000 I actually think...
00:48:09.000 I don't know.
00:48:09.000 One experience that I think has been interesting since I've been doing more Zoom calls, especially earlier in COVID, one thing that I think actually was quite good or is quite good is the ability to both kind of have everyone who you're meeting with on video chat,
00:48:25.000 but then also have a chat thread going with some of those people.
00:48:28.000 So that way, like, let's say there's something that you don't want to say to everyone who's in the room, but you want to ask one person.
00:48:33.000 It's like, hey...
00:48:34.000 I don't know.
00:48:56.000 Whereas, if I'm having a virtual meeting over Zoom or in VR and workrooms, you can just text people while you're doing that.
00:49:06.000 I actually think that it will unlock a massive amount of efficiency in communication and expression between people to make it so that people don't have to wait until they're done doing one thing to send a message to someone else.
00:49:21.000 But yeah, I do think that there's a separate question.
00:49:24.000 About if you have glasses and you're kind of going about...
00:49:27.000 It's one thing to have VR and you put it on when you want to go play a game or do a meeting.
00:49:32.000 In the kind of fullness of augmented reality, when you kind of have the glasses and you're, like, going about that through your life, having some kind of really smart do-not-disturb mode that has a sense of, like, okay, this thing really shouldn't distract you and you're doing something important,
00:49:49.000 that's going to be a really important AI problem, too, I think, to be able to kind of simulate and understand...
00:49:55.000 I don't think it's going to be as black and white as do not disturb on or off.
00:49:58.000 I think you want some intelligence there about routing and understanding which things you're going to want to get and which things not.
00:50:05.000 And maybe have certain people have priority, like if your wife or your family is trying to get a hold of you, they can get through, but business people can't get through.
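A rough sketch of that priority-routing idea: hold notifications during focus time unless the sender is on a short allow-list. The sender names, fields, and rules below are hypothetical, not any shipping feature.

```python
# Sketch of a "smart" do-not-disturb: deliver immediately only for priority
# senders while the user is in a focus state. Rules are illustrative only.

from dataclasses import dataclass

@dataclass
class Notification:
    sender: str
    text: str

PRIORITY_SENDERS = {"wife", "kids", "mom"}   # assumed user-configured list

def should_interrupt(note: Notification, in_focus_mode: bool) -> bool:
    """Show now only if focus mode is off or the sender has priority."""
    if not in_focus_mode:
        return True
    return note.sender in PRIORITY_SENDERS

if __name__ == "__main__":
    incoming = [
        Notification("wife", "call me when you can"),
        Notification("coworker", "can you review this deck?"),
    ]
    for note in incoming:
        verdict = "show now" if should_interrupt(note, in_focus_mode=True) else "hold for later"
        print(f"{note.sender}: {verdict}")
```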
00:50:12.000 Yeah.
00:50:13.000 I worry about additional distractions.
00:50:17.000 I mean, I do not keep my children from social media, because I feel like the world that they live in has social media in it, and I don't want them to be just completely disconnected from that.
00:50:28.000 I limit the amount of time they use their phones, and I try to talk to them about the importance of not being...
00:50:34.000 Like completely absorbed in social media and these kind of things that these kids do.
00:50:40.000 But I think it's a part of life.
00:50:42.000 And I think it's new and it's weird and it's confusing and it can be very addictive.
00:50:49.000 But I also think it's a part of life.
00:50:50.000 But going out to dinner with them is so hard.
00:50:53.000 They just want to check their...
00:50:55.000 Like, hey, put your phone down.
00:50:57.000 Stop snapping with your friends.
00:50:58.000 They're always Snapchatting.
00:50:59.000 I'm like, stop!
00:51:00.000 Stop doing that.
00:51:01.000 It's like, well, we got to stop that.
00:51:02.000 Yeah.
00:51:02.000 It's like, you just got to put it aside.
00:51:05.000 Just put it aside.
00:51:06.000 But if you have glasses on, that's going to be very difficult.
00:51:09.000 It's going to be very difficult to get people to, you know, especially if glasses have social media applications and also offer some sort of a benefit, like a net benefit to like the way you view life.
00:51:23.000 Like maybe give you information on the amount of calories that are, if you pick up a food item, like what is that?
00:51:31.000 Oh, look at all the calories.
00:51:32.000 Oh my God, it's got that oil in it.
00:51:33.000 That's not good for you.
00:51:34.000 Or, you know, other benefits, but also has social media.
00:51:39.000 You're going to come into this sort of weird place where you have to figure out whether or not this is a positive thing in your life.
00:51:46.000 Or whether or not it's overcoming and you're overwhelmed by it.
00:51:51.000 Yeah, and I think that that's something that...
00:51:54.000 It's going to end up being this balance, and hopefully our computers and platforms will help us find the reasonable balance on that.
00:52:00.000 I mean, one of the things that you keep – that you've said a few times is, okay, like, I'm not sure if I'd want to do this digitally.
00:52:05.000 I think about – like, it's like I want to have this experience in the real world.
00:52:09.000 I mean, here's one kind of philosophical way that I think about this is I actually think when you say the real world – I call that the physical world.
00:52:18.000 And I think there's the physical world and the digital world.
00:52:21.000 And I think the combination of those increasingly is the real world.
00:52:27.000 There's all this additional information that we bring to the physical experiences that we have that...
00:52:36.000 Whether it's digital or just from our own experience or studying that we've done, that's more than just kind of the physical kind of sensation that we get.
00:52:45.000 But the ratio of that may be shifting over time, right?
00:52:48.000 So in a world in the future where, you know, a lot of the things that might be physical today, I mean, maybe this kind of art and sculptures and stuff that you have here, maybe in the future they're not physical, maybe they're just holograms because you can change them really easily.
00:53:02.000 But maybe over time the sort of ratio of the amount of physical stuff that we interact with to digital stuff shifts and becomes more balanced or something like that.
00:53:13.000 Whereas, you know, historically it was all physical and there was very little kind of information or digital overlay on top of it.
00:53:20.000 And now I think it's just steadily been increasing.
00:53:22.000 But I mean, I think it's probably gonna be a lot healthier for us rather than consuming kind of all this additional context through this tiny little portal that we carry around on a phone.
00:53:31.000 And you're just kind of like looking at this and you're missing the whole context.
00:53:34.000 I think to have it be able to be overlaid and have kind of people be able to pop in and interact with them through it.
00:53:43.000 I think that's going to be powerful.
00:53:44.000 We'll obviously need to get the balance on this right, but that's sort of how I think about it.
00:53:48.000 I think probably the right way to think about what the real world is at this point is not actually just the physical world.
00:53:54.000 But the physical world, I'm probably more optimistic or believe that the physical world is probably more important to our being and essence and soul than a lot of other people in the industry.
00:54:05.000 So I really care about getting that balance right.
00:54:09.000 I think the balance is important, but I think you're correct.
00:54:11.000 I think there is an ever-increasing landscape of digital world that's undeniable, and it's a part of life now.
00:54:19.000 And as the technology improves, it's going to be a bigger and bigger part of life.
00:54:23.000 I wouldn't say my fear is, but my thoughts are that we're going to lead to a time someday where people become fully immersed 24-7 in a non-physical world.
00:54:33.000 And I think that's the matrix, and that's what people are worried about.
00:54:48.000 That as this technology advances, especially with some sort of neural interface, we're going to get to a place where we're not really here anymore, or we're always there.
00:54:48.000 How many people are on...
00:54:50.000 Do you limit your social media use?
00:54:53.000 How do you do it?
00:54:54.000 Me personally?
00:54:55.000 I mean, I'm just doing so many things that in practice there aren't as many hours in the day.
00:55:00.000 And my kids, I haven't had to think about it quite as much yet because they're pretty young.
00:55:05.000 Six and five, right?
00:55:07.000 Augie just turned five this weekend.
00:55:09.000 But it's...
00:55:11.000 So, I mean, they use...
00:55:13.000 I actually...
00:55:14.000 I want them to use technology for different things.
00:55:17.000 I mean, I teach them how to code.
00:55:18.000 I think it's like an outlet for creativity.
00:55:22.000 Yeah.
00:55:23.000 I mean, Augie especially.
00:55:27.000 Max likes building things.
00:55:28.000 Augie thinks about it as art.
00:55:29.000 So every night I try to do bedtime with them religiously.
00:55:34.000 So I try to end my meetings in order to be able to put them down.
00:55:38.000 And I ask them what activity they want to do.
00:55:41.000 Do you want to read or do you want to wrestle?
00:55:44.000 And Augie's just like, I want to do code art.
00:55:50.000 It's like, oh.
00:55:51.000 That's such an interesting way to think about it.
00:55:52.000 I always think about coding as like you're building something, and she just thinks about it as making the computer make art.
00:55:57.000 So, anyway, I think it's good for them to get that exposure.
00:56:03.000 But, I don't know, these things, it's not...
00:56:07.000 Everything that you're doing on a computer or screen isn't the same.
00:56:23.000 If you're building a relationship, then that is associated typically with a lot of long-term benefits and well-being, right?
00:56:31.000 Because, I mean, the relationships that we have in our lives, I view that as, like, the meaning, right?
00:56:36.000 That, to me, is, like, the point.
00:56:37.000 And that, I think, over time is what generally creates happiness for people and prosperity.
00:56:43.000 But...
00:56:46.000 If you're just sitting there and consuming stuff, I mean, it's not necessarily bad, but it generally isn't associated with all the positive benefits that you get from being actively engaged or building relationships.
00:56:55.000 Right.
00:56:56.000 And you could engage with people actively online and build digital relationships.
00:57:00.000 And especially as this technology improves, you could actually have meaningful experiences with someone's avatar.
00:57:07.000 It's very weird.
00:57:09.000 I just want to make it so that the experiences that we're having aren't just these passive things.
00:57:14.000 From my perspective, people spend a lot of time with screens today.
00:57:20.000 It's basically computers, phones, and TVs.
00:57:23.000 And I'm always amazed, because I spend all my time on phones and computers, that for Americans, still almost half the time that they spend on screens is TVs, more than phones or computers.
00:57:34.000 Really?
00:57:35.000 Yes.
00:57:36.000 I think it might have just tipped in the last few years to being more phones and computers than TV, but TV is huge.
00:57:45.000 Because I know Netflix and YouTube, I think a giant amount of their stuff is on phones.
00:57:51.000 Yeah, it's a lot on phones, but it's a lot on TVs too, and a lot of people are still just watching cable or different things like that.
00:57:59.000 When you think about new experiences, I actually think the first thing that they're going to go do is eat TV, right?
00:58:06.000 And kind of the more passive things.
00:58:08.000 So when people talk about being worried about the time that people are spending in different kind of social experiences...
00:58:18.000 I mean, the time has to come from somewhere.
00:58:20.000 I think it's worth looking at where it's coming from.
00:58:22.000 If it's coming from sleep, that's probably not great.
00:58:24.000 If it's coming from exercise, I wouldn't be that happy with that.
00:58:26.000 If it's coming from TV, I'm pretty fine with that.
00:58:29.000 I mean, that's actually maybe a net improvement in well-being for people overall if you're shifting from this more beta kind of consuming state to just being actually actively engaged, potentially building relationships.
00:58:42.000 And there's just a ton of TV time to eat, right?
00:58:46.000 So I think before we worry about this kind of consuming more and more of people's time, I actually just think looking at the mix of what people do today is good.
00:58:56.000 And my goal for these next set of platforms, they are going to be more immersive, and hopefully they'll be more useful.
00:59:02.000 But I don't necessarily want the people to spend more time with computers.
00:59:05.000 I just want the time that people spend with screens to be better.
00:59:08.000 Because, I mean, today so much of it is like you're just sitting around and, I don't know, in this beta state consuming stuff.
00:59:14.000 And I think that that's like...
00:59:16.000 I don't know.
00:59:17.000 So you asked me, how do I control my own social media time or my time on this stuff?
00:59:21.000 I do a bunch of social media.
00:59:23.000 I do a lot of messaging.
00:59:25.000 I really don't watch that much TV. And that's because I just don't have that much...
00:59:30.000 I don't know.
00:59:31.000 It puts me in this really weird mental state.
00:59:34.000 Unless there's something that I'm just like...
00:59:36.000 Really attached to.
00:59:38.000 And I really like watching UFC, for example, but that's because I also like doing the sport.
00:59:43.000 So it's like I have some kind of connection to it.
00:59:47.000 But, I don't know, just sitting around watching...
00:59:50.000 I don't usually get into a lot of TV shows or stuff.
00:59:54.000 Have you always been a very physical person?
00:59:56.000 Because I follow you on Instagram.
00:59:57.000 I see wakeboarding and stuff.
00:59:59.000 You're very active, which I think is a great message, too.
01:00:04.000 It's great for you, but it's also a great message for other people that here's this guy who's incredibly busy and his life is overwhelmed with technology, yet he's constantly doing physical things and using his body and exercising and getting out in nature.
01:00:19.000 Yeah, I mean, I think it's something that my parents really stressed for me early on.
01:00:23.000 They're like, my parents pushed me pretty hard.
01:00:26.000 They're like, you're going to do well in school, and you're going to be on three varsity sports teams.
01:00:30.000 That's it?
01:00:31.000 Well, I mean, and a lot of other stuff.
01:00:34.000 No, but I mean, you don't have any debate.
01:00:36.000 That's it.
01:00:36.000 This is the rule.
01:00:38.000 So...
01:00:39.000 It's like, that's just what you're going to do.
01:00:41.000 So, and I'm super grateful for it.
01:00:43.000 I mean, they weren't very prescriptive, right?
01:00:46.000 I mean, they didn't tell me I had to do computers.
01:00:48.000 They didn't tell me which sports I had to do, but they were like, this is important.
01:00:52.000 But I don't know.
01:00:53.000 I mean, I've found it, especially as the company has scaled and in some ways become more stressful, it's like more important, right?
01:01:01.000 And my sort of day is like, it's like, all right, you wake up in the morning.
01:01:06.000 Look at my phone.
01:01:07.000 You get like a million messages of stuff that come in.
01:01:11.000 It's usually not good.
01:01:13.000 People reserve the good stuff to tell me in person.
01:01:18.000 But it's like, okay, what's going on in the world that I need to pay attention to that day?
01:01:24.000 So it's almost like every day you wake up and you're, like, punched in the stomach.
01:01:27.000 And then it's like, okay, well, fuck.
01:01:28.000 Now I need to, like, go reset myself and be able to kind of be productive and not be stressed about this.
01:01:33.000 So how do I do that?
01:01:34.000 So basically I go, I, like, I read, I take in all the information, and then I go do something physical for an hour or two and just kind of reset myself.
01:01:45.000 And over time what I've found is that it's not actually just – I used to run a lot.
01:01:52.000 But the problem with running is you can think a lot while you're running.
01:01:55.000 So I've...
01:01:59.000 Especially over the last couple of years, I've gotten really into things that require full focus.
01:02:03.000 So, you know, at the beginning of COVID, I mentioned that I spent a bunch of time in Kauai.
01:02:09.000 Our family has a ranch down there.
01:02:11.000 And, like, I spent a lot of time foiling and surfing.
01:02:14.000 And it's like if you're foiling or surfing and you're on, like, a wave, you have to pay attention the whole time, right, or else you're going to fall and maybe get held under.
01:02:24.000 And it's, like, not...
01:02:25.000 That's not a great experience.
01:02:26.000 I don't know if you surf.
01:02:28.000 No, I don't.
01:02:29.000 But I tried foiling, and now I'm an oaf.
01:02:31.000 I couldn't get on the thing.
01:02:33.000 My young daughter is really good at it, my 12-year-old.
01:02:36.000 And she just zooms around and gets on that thing, and I just couldn't figure it out.
01:02:41.000 Yeah.
01:02:43.000 It's, um, you're talking about the eFoil for that.
01:02:47.000 Yeah, that, it's, I bet if you tried it for, like, a few days, you'd get it down pretty well.
01:02:54.000 Yeah, yeah.
01:02:55.000 I tried it for about three hours, and I was like, oh my god, this is awful.
01:02:59.000 But the actual foiling, not the e-foil.
01:03:02.000 I mean, the e-foil weighs like 40 pounds.
01:03:03.000 An actual foil board maybe weighs like 5 to 10 pounds.
01:03:06.000 So after you learn how to foil, flying an e-foil is like, it feels like you're flying a tank through the air or something.
01:03:12.000 But it's a pretty interesting learning curve, but it requires full engagement the whole time.
01:03:20.000 And then, I mean, that's sort of...
01:03:22.000 So it's cleansing.
01:03:23.000 Yeah.
01:03:23.000 Yeah, yeah.
01:03:24.000 And this is sort of how I got into MMA too.
01:03:27.000 Now I don't spend as much time in Kauai because things are ramping back up and I'm in the office a lot more.
01:03:32.000 So it's like, alright, there's not as much foiling in Palo Alto.
01:03:35.000 So it's like, alright, what's a thing that is both like...
01:03:39.000 Just super engaging physically, but also intellectually, and where you can't afford to focus on something else.
01:03:45.000 And I think to some degree, it's like MMA is the perfect thing, because if you stop paying attention for one second, you're going to end up on the bottom.
01:03:54.000 So I've just found that that is...
01:04:01.000 Just really important for me in terms of what I do and being able to just kind of maintain my energy level, maintain my focus.
01:04:10.000 Because then after an hour or two of working out or rolling or wrestling with friends or training with different folks, it's like, now I'm ready to go solve whatever problem at work for the day.
01:04:24.000 And I've fully processed all the different news for the day that's come in.
01:04:32.000 And we're just ready to go.
01:04:33.000 How did you get introduced to martial arts training?
01:04:36.000 Like, how long ago was this?
01:04:37.000 It was in the last 12 months, actually.
01:04:39.000 Really?
01:04:40.000 Yeah, because it was basically, I kind of, I used to just run a lot and just do that.
01:04:44.000 And then basically, since COVID, it's like, I got super into surfing and foiling, then got really into MMA. So how did you initially approach it?
01:04:54.000 Like, how did you get a trainer?
01:04:56.000 Like, what did you do?
01:04:57.000 I know a bunch of people who are into it.
01:05:01.000 There's actually this really interesting connection between people who surf and do jiu-jitsu.
01:05:06.000 Oh, yeah.
01:05:06.000 A lot of similarities.
01:05:08.000 So a bunch of the guys who I do that with, they kind of have gyms in Kauai.
01:05:18.000 You know, basically collected a bunch of recommendations, ran them by a bunch of people who I know, and I ended up...
01:05:26.000 I mean, I trained with this guy Dave Camarillo.
01:05:28.000 I know Dave.
01:05:29.000 Yeah, in Gorilla Jiu-Jitsu.
01:05:30.000 Yeah.
01:05:31.000 Yeah, so...
01:05:31.000 And he's awesome.
01:05:32.000 He's great.
01:05:33.000 Yeah, super nice guy, and I feel like I'm learning a ton.
01:05:37.000 And the crazy thing is, like, I've...
01:05:41.000 I don't know.
01:05:41.000 It really is the best sport.
01:05:44.000 The question isn't how did I get into it, it's how did I not know about it until just now.
01:05:48.000 From the very first session that I did...
01:05:56.000 Five minutes in, I was like, where has this been my whole life?
01:06:00.000 It's like, alright, my mom made me do three varsity sports, and my life took a wrong turn when I chose to do fencing competitively instead of wrestling in high school or something.
01:06:10.000 It's like there's something that's just so primal.
01:06:17.000 About it.
01:06:18.000 I don't know.
01:06:21.000 Since then, I've just introduced a bunch of my friends to it, and that's been really fun because now it's like we train together and we just wrestle together.
01:06:28.000 I don't know.
01:06:28.000 There's a certain intensity to it that I like.
01:06:33.000 It's sort of...
01:06:35.000 Maybe it's like there's this cultural thing where maybe a lot of people haven't considered it, but I've had 100% hit rate of introducing friends to it and converting them to people who now train.
01:06:46.000 Every single person who I've kind of shown it to is like, this is amazing.
01:06:53.000 This is obviously how I should be training and working out.
01:06:56.000 Yeah.
01:06:57.000 That's very impressive that it's 100%.
01:06:58.000 You must have some solid friends.
01:07:00.000 Because a lot of people get turned off by the amount of effort that's involved, but they get excited by the problem solving, which is the more fascinating part, like learning the techniques and focusing on memorizing the techniques and developing the skills.
01:07:14.000 And then drilling.
01:07:16.000 That's probably one of the most important things about jujitsu that people don't do enough of is drilling.
01:07:21.000 I made some of my biggest leaps in martial arts from blue belt to purple belt just through constant drilling.
01:07:28.000 I was drilling all the time with my friend Eddie Bravo.
01:07:30.000 So we were always working on techniques.
01:07:32.000 And so I was able to progress much quicker.
01:07:36.000 And then I noticed when I stopped doing that later, my progress kind of stagnated.
01:07:40.000 It's like there's a real clear correlation between the amount of energy you put into drilling and observing the technique and then just going through the motion with someone offering 20%, 30% resistance.
01:07:54.000 It's just not as fun.
01:07:55.000 Sparring is the most fun.
01:07:57.000 You immediately want to just slap hands and start rolling.
01:07:59.000 It's fun.
01:08:00.000 Yeah.
01:08:01.000 Yeah, I don't know.
01:08:02.000 I really trust Dave, and I think it's also...
01:08:06.000 Dave's great.
01:08:07.000 Yeah, and I'm also just depending on what's going on in my life and work that day.
01:08:14.000 Some days I have the energy to go spar, and some days it's good to just go drill.
01:08:19.000 It's like, all right, there's a lot going on.
01:08:21.000 There's...
01:08:24.000 It's better to just go do something like 20 times in a row.
01:08:27.000 Both things are important.
01:08:28.000 But if you can force yourself to drill more than you spar, you'll get better.
01:08:31.000 It's really important.
01:08:33.000 It's like the most important thing.
01:08:35.000 And it's the thing that people do the least of.
01:08:37.000 Because to really carve those pathways in your mind...
01:08:42.000 Like when someone is in this position, you do that and then you immediately get your knee in position and you shift your hips.
01:08:50.000 It gets driven into your programming so that in the flow of an actual rolling session, it just happens.
01:09:01.000 You see Tom Hardy just won a bunch of fucking matches.
01:09:06.000 Tom Hardy is like an ass kicker.
01:09:08.000 Him and Mario Lopez are out there competing in jiu-jitsu tournaments.
01:09:11.000 Like, this is wild.
01:09:13.000 And they're both like beginners.
01:09:15.000 They're both like blue belts.
01:09:16.000 I'm like, that's incredible.
01:09:17.000 Well, maybe I'll get there for my 40th birthday in a couple years.
01:09:19.000 You should do it.
01:09:20.000 Bourdain did it.
01:09:21.000 He did it when he was like, God, I think he was like 60. I do a lot of stuff like this with friends, but I also just find wrestling around with friends is just...
01:09:33.000 Yeah, it's awesome.
01:09:34.000 Yeah, wrestling is fun.
01:09:35.000 Jiu-jitsu is even more fun because it's like wrestling with like finishes.
01:09:39.000 Yeah.
01:09:40.000 I'm excited that you're into it.
01:09:41.000 It's really cool.
01:09:42.000 And I think, you know, it really enhances your enjoyment of watching it on television because you know what's happening.
01:09:50.000 But there's also, like, a lot of good parallels, I think, in philosophy for life and work and all that.
01:09:58.000 I mean, I think...
01:09:58.000 I mean, both surfing and foiling and jujitsu, MMA, it's, like...
01:10:06.000 I think it sort of teaches you about like the flow and momentum of things.
01:10:11.000 And I think businesses like this in a similar way where it's like the hardest thing is knowing when you're in a position where you need to push through versus sort of developing the intuition for when like, all right, when...
01:10:25.000 The momentum is just going in the other direction.
01:10:28.000 It's like, all right, you're not going to be able to pump over this swell.
01:10:31.000 If you keep your weight in this direction, you're going to get swept.
01:10:37.000 I do think it's a super concrete thing.
01:10:40.000 I think one of the things that's sort of frustrating running a company is the feedback loops are so long.
01:10:47.000 It's like, all right, so I showed you the pre-release version of the new headset... Yeah.
01:11:09.000 I mean, made those decisions years ago.
01:11:11.000 And we're not going to know if that's right until maybe a year from now until we see how that goes.
01:11:16.000 So there's something that I think is sort of, it's difficult, in running an enterprise of that scale, to, like, try to learn from things over such a long time scale.
01:11:40.000 Right, right.
01:11:49.000 Am I going to be able to go turn that direction on that wave, or am I just going to get swallowed?
01:11:56.000 I think that having that mix in your life feels really important and healthy to me.
01:12:04.000 I try to get my friends into it.
01:12:06.000 My kids do jujitsu, too.
01:12:08.000 I just think it's really important that they develop all these skills and appreciation for doing physical things, just like my parents taught me.
01:12:19.000 I don't know.
01:12:20.000 It's a big part of who I am, I think.
01:12:23.000 Going back to the other conversation about, are we just brains in tanks?
01:12:26.000 It's like, no, because this is the part of my life that I think is super fun.
01:12:32.000 The building things, that's super engaging, too.
01:12:36.000 It's a very different type of intellectual exercise.
01:12:40.000 There's also the one-on-one physical connection with a person. I would imagine that your life has got to be very bizarre, because you are the head of this enormous platform and you're dealing with so many human beings and so much negativity and positivity and all kinds of fires that you have to put out and all sorts of chaos, and to just have one thing,
01:13:06.000 one person right in front of you is probably really good to sort of clean the pipes out and just clear your mind and have your ability to focus on things sort of put into perspective.
01:13:19.000 It's got to be very unmanageable.
01:13:21.000 I've talked to people about this before.
01:13:23.000 Whenever people talk about social media sites and they're doing this and they're doing that, I'm like, could you imagine trying to manage at scale 5 billion people?
01:13:35.000 Or whatever it is.
01:13:36.000 How many people are on Facebook right now?
01:13:38.000 Facebook is almost three billion, but across our different properties, it's like a little more than three and a half billion.
01:13:44.000 That's so many people.
01:13:45.000 Yeah.
01:13:46.000 That's impossible.
01:13:47.000 Like no one can manage five billion people or three billion people or a billion.
01:13:53.000 It's like the numbers are just, they're so absurd.
01:13:57.000 It's so preposterous.
01:13:59.000 It doesn't make any sense.
01:14:00.000 So I would imagine that for your mind, the amount of pressure that's involved in just maintaining, it has to be looming over you at all times in the background.
01:14:13.000 It's probably very difficult for you to find things that filter that out.
01:14:18.000 Yeah.
01:14:23.000 There's always stuff to work on.
01:14:26.000 So I think a big part of trying to push forward is maintaining enough control of my time to push on the things that I believe need to be advanced for the future rather than being reactive.
01:14:43.000 Right.
01:14:45.000 Right.
01:14:47.000 Right.
01:14:55.000 By the way, I think that's probably a somewhat extreme version because I'm running this company, but I actually think this is probably true for everyone.
01:15:02.000 I think pretty much every person, I think, has a lot that gets thrown at them, and you could spend all of your time just reacting to that.
01:15:09.000 And I think a lot of what kind of creates the ability to be successful long-term and to build things and change your life and build products that change other people's lives is...
01:15:20.000 Carving out the time to do stuff that's proactive.
01:15:23.000 And that's both taking care of yourself and being physical and getting out there and also getting to spend time with my family and my girls and all that.
01:15:35.000 But it's also, I mean, I could spend all of my time just working on the things that we've already built and not trying to advance...
01:16:09.000 A lot of the economy for the country and the world so that way more people can pursue creative endeavors.
01:16:14.000 I think that's just going to be one of the most positive trends that comes out of this decade.
01:16:22.000 I agree with you 100% on that.
01:16:24.000 It's really an amazing time for people to be able to carve out an alternative living and to do so through social media platforms.
01:16:31.000 There's so many people using all the platforms that have developed these followings and started businesses, whether it's in fitness or there's people that are just chefs that cook online and they share and sell recipes.
01:16:46.000 It's amazing.
01:16:48.000 One of the things that I really admire about what you do is, you know, it seems like you have a real commitment to giving a voice to a lot of different types of people, right?
01:16:58.000 It feels like a big part of your theme is, you know, you have a lot of people on the show who would otherwise have, like, no chance of getting the exposure that they get from talking to you.
01:17:11.000 And, you know, part of the question that I wonder about is, on Instagram and on Facebook, you have your follow graph.
01:17:18.000 You have the people you choose to follow and you have your friends.
01:17:21.000 But can we build AI systems that can also just help recommend better content that you didn't know to follow yet?
01:17:28.000 Because it's up your alley.
01:17:31.000 It's aligned with the type of things that you care about, your values, your interests.
01:17:36.000 And I just view that kind of confluence of building...
01:17:40.000 It's a very specific AI problem.
01:17:42.000 It's not like this kind of general intelligence AI problem, but...
01:17:46.000 I tend to think about things in terms of more specific problems that you can break down and try to deliver value for people.
01:17:52.000 But I think I'd just love it if in, I don't know, a couple of years, a significant...
01:18:01.000 Not the majority, but a significant part of the Instagram and Facebook experiences were basically highlighting different creators who you might be interested in but might have not otherwise seen.
01:18:10.000 And I think that that would both be good for people who are using those experiences to discover more people, get more diversity of input into their lives, but also I think can help push that creative economy forward.
01:18:24.000 I don't know.
01:18:25.000 That's one of the things that I'm super passionate about right now.
01:18:29.000 Having an algorithm like that could really help.
01:18:32.000 One of the things that Spotify does really well is suggest new music.
01:18:36.000 I love how they do that, where if you like a certain kind of music and you develop these playlists, they'll start recommending you music.
01:18:47.000 And I found out about so many different bands that I would never know about before.
01:18:53.000 The thing that gets people with algorithms is that algorithms today have this negative connotation to them.
01:19:00.000 There's a lot of argument that algorithms cause dissent and cause arguments and cause strife and that people are focusing only on the things that upset them.
01:19:14.000 The real problem with that is that they're not taught how to think and focus on things.
01:19:18.000 Because what the algorithms pick up on is essentially what are you spending the most time on?
01:19:24.000 Well, if you're spending the most time on carpentry and parasailing and deep-sea fishing, that's what the algorithm is going to recommend to you.
01:19:32.000 Like my friend Ari, we went through this experiment where he only Googled puppies.
01:19:36.000 And he only searched YouTube for puppies.
01:19:38.000 And that's all they recommended to him.
01:19:39.000 Like, his YouTube recommendations were all just puppies.
01:19:41.000 And he's like, see, like, all this stuff that people are saying, like, oh, the algorithms are tearing us apart.
01:19:47.000 Like, no, we're tearing us apart.
01:19:49.000 The algorithms just highlight the things that you're interested in.
01:19:52.000 It's not like the algorithm is some tricky program designed by the communist government to try to get you to argue with each other.
01:19:58.000 No, you're arguing.
01:20:00.000 You like to argue.
01:20:03.000 Yeah, I mean, I think that the algorithms can also obviously be designed better or worse.
01:20:08.000 So, you know, one of the things that I'm pushing on a lot right now is there's this idea in designing recommendation systems of explore versus exploit in that it's like, okay, if someone has spent a bunch of time, you know, searching for puppies,
01:20:24.000 you know they like puppies, so if you show them a puppy video, they'll probably engage with that.
01:20:28.000 But if you only show them puppy videos...
01:20:31.000 Over the long term, you're missing an opportunity to understand what other things that they're interested in.
01:20:36.000 So even though it might not be kind of ideal for the experience today, carving off 5%, 10% of basically the experience to just try to expose people to different things to see if they're interested in that too ends up paying long-term dividends.
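What he's describing is essentially an explore/exploit policy. Here's a minimal epsilon-greedy sketch of that idea; the exploration fraction, topics, and engagement numbers are all made up for illustration, not actual ranking values.

```python
# Minimal epsilon-greedy sketch: most recommendations exploit known interests,
# but a small slice is reserved for exploring topics the system hasn't tried.

import random

EXPLORE_FRACTION = 0.10          # the "5-10%" carved off for exploration
known_interests = {"puppies": 0.9, "mma": 0.7}   # estimated engagement rates
unexplored_topics = ["woodworking", "jazz", "astronomy"]

def pick_next_topic() -> str:
    if random.random() < EXPLORE_FRACTION and unexplored_topics:
        return random.choice(unexplored_topics)           # explore
    # exploit: favor the topic with the highest estimated engagement
    return max(known_interests, key=known_interests.get)

def record_feedback(topic: str, engaged: bool) -> None:
    """Update the engagement estimate so exploration can pay off later."""
    old = known_interests.get(topic, 0.5)
    known_interests[topic] = 0.8 * old + 0.2 * (1.0 if engaged else 0.0)
    if topic in unexplored_topics:
        unexplored_topics.remove(topic)

if __name__ == "__main__":
    for _ in range(10):
        topic = pick_next_topic()
        record_feedback(topic, engaged=random.random() < 0.5)
    print(known_interests)
```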
01:20:56.000 So I do think that, like, these systems done well.
01:20:58.000 If you design them with a long-term perspective and you're not just trying to kind of maximize engagement today, but you're really trying to understand what people care about and who people want to become and what their values are, I think you can build some stuff that gets really good over time.
01:21:15.000 But I do think that the design of the system and the values that go into it matters quite a bit too.
01:21:21.000 I think so, too.
01:21:23.000 I'm neither pro nor con algorithms or recommendations.
01:21:29.000 I think it's a fascinating aspect of social media, but I do think that there are certain people that, unfortunately, when they get excited about a thing or when they start going online, they gravitate towards things that irritate them and upset them.
01:21:48.000 And that's the big concern that many people have with algorithms and with the use of social media, Twitter in particular, is that people are using it and getting upset and it's creating more tension and more of a divide.
01:22:08.000 Yeah, I mean, I think it's interesting to think about these services not just in terms of the information that is conveyed, but, you know, as a product designer... Right.
01:22:28.000 Right.
01:22:29.000 Right.
01:22:46.000 I find that it's hard to spend a lot of time on Twitter without getting too upset.
01:22:52.000 On the flip side, I think Instagram is a super positive space.
01:22:56.000 I think some of the critique that we get there is that it's very curated and potentially in some ways overly positive.
01:23:04.000 But I think the energy on Instagram is generally very positive and it's easy to spend time there and kind of just absorb a lot of the positivity.
01:23:14.000 I think that's true, but how did that happen?
01:23:17.000 Why is Instagram generally friendlier?
01:23:20.000 I mean, so, it's the design of the system, but one thing is I think images are...
01:23:30.000 A little less cutting, usually, and kind of critical than text.
01:23:38.000 I think the news in general is often negative.
01:23:42.000 I think the incentives of the news industry are often to...
01:23:47.000 Well, I think just the mission of the news industry is to kind of speak truth to power and highlight things, like hold people accountable.
01:23:55.000 So I think that even if you're looking at it from that perspective, I think a lot of the stuff is like it generally has this very critical tone to it.
01:24:03.000 But with everything, there's just a balance.
01:24:04.000 If you spend your whole life living in criticism, then that's super negative.
01:24:10.000 But I think that what we've tried to do with Facebook is have a little bit of both.
01:24:14.000 Facebook has images and videos, but it also has news.
01:24:17.000 And the part of it that's probably the most critical where we probably have the most controversies around the more newsy type stuff, the more political type stuff.
01:24:25.000 And over time, I've generally just felt like Hey, that's not even what people in our community tell us that they want.
01:24:30.000 People say that they come here because they want to connect with other people and explore interests.
01:24:37.000 So I just want to emphasize that more of it.
01:24:39.000 But there are some very intentional decisions that you can make in terms of designing this stuff.
01:24:44.000 So for example, on Facebook, when you're reacting to a post, in addition to liking it, you can heart it, you can give it kind of an angry emotion.
01:24:55.000 And one of the decisions that we've basically made is, if someone gives an angry reaction, we actually don't even count that in terms of whether to show that to someone else, or maybe we even discount it.
01:25:12.000 So you could kind of view it as, okay...
01:25:16.000 Someone chose that they, like, were interested in this post and chose to give an angry reaction, but we just don't want to amplify anger, right?
01:25:23.000 That's, like, not what I kind of view us as here to do.
01:25:26.000 So we're just going to basically take that signal and, like, not use it to show the post to more people.
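One way to picture that decision: reactions feed into a distribution score, and the angry reaction is simply given zero (or reduced) weight. The weights below are invented for illustration; they are not Facebook's actual ranking values.

```python
# Toy version of the signal-weighting decision described above: reactions feed
# a ranking score, but "angry" contributes nothing and so is never amplified.

REACTION_WEIGHTS = {
    "like": 1.0,
    "heart": 1.5,
    "comment": 2.0,
    "share": 3.0,
    "angry": 0.0,   # users can still express it, but it isn't amplified
}

def distribution_score(reaction_counts: dict) -> float:
    """Aggregate reactions into a single 'should we show this more?' score."""
    return sum(REACTION_WEIGHTS.get(kind, 0.0) * n
               for kind, n in reaction_counts.items())

if __name__ == "__main__":
    outrage_post = {"angry": 500, "comment": 40}
    baby_post = {"heart": 300, "comment": 120, "like": 200}
    print("outrage post:", distribution_score(outrage_post))  # anger adds nothing
    print("baby post:", distribution_score(baby_post))
```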
01:25:33.000 So how do you do that?
01:25:34.000 Like, how do you decide, like, what if it's anger but it's justifiable anger?
01:25:38.000 Yeah, I think that's exactly the right question. Basically, you know, when I was making that decision internally, a bunch of teams were like, well, you know, there is a lot of stuff that's wrong in the world, and people should be angry about that.
01:25:54.000 And it's like, yeah, I think that's probably, that's fair.
01:25:59.000 But I'm not here to design a service that makes people angry.
01:26:03.000 So I kind of think that there's a balance.
01:26:05.000 And it's not like there's not going to be any angry stuff.
01:26:07.000 I mean, people can still react and say that something is negative if they don't like it.
01:26:12.000 But I don't view our job as going and needing to kind of amplify all that stuff.
01:26:18.000 So why do you have the option to have an anger response?
01:26:21.000 Well, I think it's good for people to be able to convey it.
01:26:24.000 So is it like a thumbs down?
01:26:26.000 Yeah, it's like an angry emoji face.
01:26:28.000 Angry emoji face.
01:26:28.000 And so obviously there's the option to have angry comments, which is just people's ability to express themselves.
01:26:36.000 Yeah, yeah.
01:26:37.000 But we basically choose to...
01:26:41.000 If someone likes something or if a friend chooses to share something, we use that as a signal to say, hey, this might be something that you're interested in because someone reacted to this, right?
01:26:49.000 It's like a friend had some kind of emotional reaction to this and thought it was interesting enough to engage with, so you might also think it's interesting to engage with.
01:26:56.000 But we try to intentionally mute the kind of angry reactions just because that's just not what we're trying to do in the world.
01:27:05.000 I appreciate that.
01:27:06.000 What do you think about the argument that algorithms in general, because the fact that they sort of appeal to human nature, like they amplify the things that you're interested in, and unfortunately people are interested oftentimes in things that upset them.
01:27:24.000 What do you think about the argument that this is too much, whether it's too influential or has too much impact on people, and that a better solution would be to just let everything exist how it exists and not have any kind of algorithm, and let people find what they find and share what they share, and just let it exist in sort of the free market of ideas?
01:27:53.000 Yeah, so we actually started there, right?
01:27:56.000 Because at the beginning we didn't have the technology to do this kind of ranking.
01:28:01.000 And the very first thing that you run into is if you don't do any kind of ranking, the system gets gamed in different ways.
01:28:08.000 So if you're not ranking anything, the most recent stuff...
01:28:48.000 Even if we don't understand the kind of specific content of what she posted, there's going to be a ton of people commenting congrats and, you know, a ton of hearts and positive reactions.
01:29:00.000 So that's the question there.
01:29:02.000 Like, how does that...
01:29:03.000 Enter into the algorithm? Like, if you're going to favor something like your cousin's baby being born, how would you go about doing that? How does the AI figure out that your cousin just had a baby,
01:29:17.000 and that it should be in the feeds of all the people who follow her, because they would want to know she had the baby? Yeah, I mean, I think a lot of it is just...
01:29:26.000 There's a lot of signals that go into it, but at a simple level, you can kind of just look at who are the people who you care about and what do they find interesting.
01:29:37.000 So what are they commenting on?
01:29:39.000 What are they clicking on?
01:29:40.000 What are they...
01:29:40.000 Are they liking or hearting?
01:29:43.000 Those are some of the big signals.
01:29:45.000 And then there's some kind of content-based stuff, like we have a sense that you're interested in this type of thing or that you really hate politics or whatever, so we're going to show you a little bit less of that.
01:29:54.000 But in general, the thing that will differentiate that cousin's baby-was-just-born post is that even if our system has no idea what the post means, that post will almost certainly just have a ton of likes and comments on it that have a positive reaction.
01:30:10.000 So that will just tell us, okay, if your friends made 500 posts today, that's the one that has the most positivity around it.
01:30:21.000 We should probably show that more prominently.
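A toy version of that ranking intuition: score posts by positive engagement, weighted by how close the poster is to you, without looking at the content at all. The closeness values and weighting scheme are hypothetical, purely to show the shape of the idea.

```python
# Rough sketch: even without understanding the post's content, a post that
# draws lots of positive reactions from people close to you floats to the top.

closeness = {"cousin": 0.9, "acquaintance": 0.2, "brand_page": 0.05}

def rank_feed(posts):
    """Score each post by positive reactions, weighted by author closeness."""
    def score(post):
        positive = post["likes"] + post["hearts"] + post["comments"]
        return closeness.get(post["author"], 0.1) * positive
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        {"author": "brand_page", "likes": 900, "hearts": 10, "comments": 5},
        {"author": "cousin", "likes": 200, "hearts": 300, "comments": 150},
        {"author": "acquaintance", "likes": 30, "hearts": 5, "comments": 2},
    ]
    for post in rank_feed(feed):
        print(post["author"])
```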
01:30:24.000 So I think at a basic level, a system that didn't do that stuff...
01:30:37.000 Is there a way to stop businesses from spamming that you could just limit the amount of times they could post in a day?
01:30:46.000 I mean, there are a bunch of things like this that we've tried over time.
01:30:52.000 But then you start getting into some things that are basically pretty algorithmic or rules-based, which is like you start trying to rank stuff based on the quality of the posts or how much engagement they make.
01:31:12.000 I mean, I guess you could tell people they can't share more than this amount.
01:31:18.000 But, I don't know.
01:31:19.000 It kind of feels to me like you want to create a system.
01:31:24.000 And there are certain creators who do pump out a large amount of content.
01:31:29.000 And I'm not sure that you want to stop that.
01:31:32.000 I think you just want a system that can basically titrate it and show people the amount of it that they're interested in.
01:31:41.000 It's such an immense responsibility. And the fact that it's a private company in some ways troubles some people, because you have this ability to control the flow of information, and that's really never existed before. Obviously social media is very new, so that's never existed before. And then having a company that's run by human beings that have the ability to decide what gets broadcast,
01:32:11.000 what gets its signal amplified, what gets suppressed, all that stuff concerns a lot of people.
01:32:20.000 Because it's basically just individual human beings with their own biases and their own perspectives and their own view of the world.
01:32:28.000 And they have the ability to either slow down or ramp up or suppress or amplify so many different ideas.
01:32:36.000 And in turn, that can literally shape the way the cultural narrative goes on any given subject.
01:32:46.000 What is it like having that kind of responsibility?
01:32:50.000 Because it seems to me that that would be an immense burden.
01:32:54.000 That would be like a lot of thought would be involved in like, what are the negative consequences of the choices that we make?
01:33:01.000 What are the positive consequences of the choices we make?
01:33:05.000 Because, you know, as you said, you're controlling the signal of three plus billion people.
01:33:12.000 That is so astounding to even say.
01:33:15.000 Well, I think the important thing is that I don't exactly look at it the way that you said.
01:33:21.000 I view our job as empowering people to be able to express what they want and get the content that they want.
01:33:27.000 And whenever we try to exert some kind of opinion that's different from what people want, our products do worse.
01:33:34.000 And we exist in a very competitive space.
01:33:36.000 I mean, we have TikTok that's growing incredibly quickly.
01:33:41.000 There's a whole lot of other companies.
01:33:42.000 We talked about Twitter before.
01:33:43.000 You talked about Snap.
01:33:45.000 YouTube is huge, and people spend a ton of time on it, and there's just new social products all the time.
01:33:50.000 So if we don't empower people and help people get and advance their own goals, then we lose over time.
01:33:58.000 So I kind of think that that...
01:34:03.000 Obviously, serving a lot of people is a big responsibility, and we take that super seriously.
01:34:08.000 But we also appreciate and respect that this is a very competitive marketplace.
01:34:14.000 And our role in it is not to imprint our opinion, but to empower people.
01:34:20.000 And that's sort of the ethos where we started the company.
01:34:24.000 The initial mission statement is, give people the power to share and make the world more open and connected.
01:34:31.000 And we've evolved that over time to now make it about building community and bringing the world closer together.
01:34:38.000 But it's fundamentally that notion of giving people the power and empowering people is like really deep in the ethos of the company.
01:34:47.000 And I think whenever we mess that up, which we do frequently, we pay the price for it.
01:34:53.000 And people don't like those things that we do.
01:34:55.000 And then we have to run them back.
01:34:57.000 Well, that goes back to managing at scale, right?
01:34:59.000 Yeah.
01:34:59.000 Because you're just dealing with so many different people.
01:35:01.000 But what I'm saying essentially is I do agree that you are giving people the ability and the power to express themselves.
01:35:08.000 And if you don't do it correctly, they're going to go to the competitors.
01:35:11.000 But it's still an immense responsibility if there are choices being made as to what gets amplified, what gets suppressed, what gets...
01:35:29.000 Yeah.
01:35:29.000 I mean, there are a lot of different parts of what you just said.
01:35:37.000 I mean, I think in terms of kind of helping people discover the things that they want, I think that's a pretty different wing of what we do than the policy setting of what is not allowed, which I think is in a lot of ways...
01:35:49.000 I think a more controversial piece, because with what's not allowed you have to get into the nuances of specific types of content, whereas with the recommendation systems, you kind of want to build those to be agnostic of the type of content.
01:36:05.000 If I see that a team is trying to promote some type of content over another, they're almost certainly doing something wrong that is going to make us worse than a competitor in terms of effectiveness of our product.
01:36:17.000 Because you should just build this technology in a way that is agnostic and lets people express the interests and things that they want and gives them that.
01:36:27.000 And then every once in a while, I think that there are some editorial decisions that often they're important enough that I have to make them, right?
01:36:36.000 Like that thing that we talked about before, which is like, I just don't want there to be as much anger, so we're going to not take into account the angry reaction.
01:36:42.000 Right.
01:36:43.000 It's probably the case that there would be more engagement on the platform if we didn't.
01:36:47.000 It's like people are expressing something and we're choosing to not listen to that thing.
01:36:52.000 So at some level, it probably makes the product somewhat less engaging, but that's an example of an editorial decision that it's like, at some level, we're here not just to focus on what content people see, but the kind of emotional sense.
01:37:05.000 But I try to make those very few just because the technology can enable a vast breadth of interests that different people have.
01:37:17.000 And I think part of how you build something that can serve billions of people is by not telling people what to think, right?
01:37:24.000 And basically having humility and basically I don't know, just valuing humanity and valuing that people can believe different things and that those beliefs are probably grounded in real lived experiences that they had and aren't the result of them being tricked or something like that.
01:37:42.000 They believe what they believe for a reason and it's kind of good to generally let people express that.
01:37:50.000 I don't know.
01:37:51.000 That's a pretty deeply held belief that I have.
01:37:54.000 One of the things that's got to be bizarre about having a platform like Facebook is that you know that there are foreign actors that are utilizing the platform to either spread propaganda or to start arguments.
01:38:11.000 We read once that I think it was 19 of the top 20 Christian sites on Facebook were run by a troll farm.
01:38:21.000 I didn't see that one.
01:38:23.000 Let's see if we can find it.
01:38:24.000 It's pretty crazy.
01:38:25.000 The amount of resources that are put into creating fake pages or pages that don't really represent real people but promote certain ideologies or certain political agendas and that they'll use these and start arguments with people.
01:38:42.000 And you don't even realize...
01:38:43.000 The people think they're arguing with real people and they're reading the opinions of real people.
01:38:47.000 Meanwhile, there's a guy with a bank of phones in Macedonia.
01:38:52.000 Yeah, no, so we call this...
01:38:54.000 Here it is.
01:38:55.000 19 of 20 Christian Facebook pages are fake.
01:38:58.000 Yeah, so we call what you're talking about coordinated inauthentic behavior.
01:39:03.000 Coordinated inauthentic behavior.
01:39:05.000 So it's basically coordinated, right?
01:39:08.000 Some of these policy acronyms that we come up with, they have to be very specific, right?
01:39:12.000 Yeah, I get it.
01:39:15.000 But basically, we have a team of hundreds of counterterrorism and counterintelligence people... I don't know who's behind those fake pages,
01:39:35.000 but the Macedonian troll farms were basically a bunch of spammers who created fake pages and they wanted people to click on them so they could make money from the ads.
01:39:42.000 So that one's actually pretty easy to disrupt because you just make it so that they can't use the ads to monetize anymore and their whole economic incentive goes away and they sort of dry up.
01:39:51.000 Dealing with nation states is a lot harder because they're more kind of ideological or sovereignty motivated.
01:39:59.000 So there, I think you just kind of need to be very vigilant and it's more of an arms race and you just kind of are building up better technology for defense and you assume that they're going to keep on getting more sophisticated and you keep on needing to get better.
01:40:13.000 But I mean, at this point we have like tens of thousands of people working on this at the company.
01:40:19.000 I think we spend like...
01:40:20.000 $5 billion a year was the last stat on sort of all this community integrity work.
01:40:26.000 I mean, it's like our kind of defense budget, it's like...
01:40:30.000 I mean, just to put the numbers in perspective...
01:40:31.000 I love that you call it defense budget.
01:40:33.000 Basically, it's like...
01:40:35.000 I mean, it's...
01:40:36.000 To defend the integrity of the community.
01:40:41.000 But it's like...
01:40:42.000 I mean, it is, I think, bigger than the defense budgets of probably most countries.
01:40:47.000 But it's...
01:40:51.000 And this is obviously a super critical part of what we need to do.
01:40:54.000 But then, you know, there's also this important set of philosophical discussions, which is like, all right, so I think almost everyone will agree that, like, that's bad, right?
01:41:03.000 Like, you don't want, you know, countries basically creating networks of bots trying to convince people of stuff.
01:41:09.000 You don't want terrorism, right?
01:41:11.000 You don't want child pornography.
01:41:13.000 Like, you don't want people inciting literal violence.
01:41:16.000 Right?
01:41:17.000 But then the question is, okay, so you build these capabilities to try to find this stuff, and it's a combination of basically humans, really expert humans, and really powerful AI systems working together.
01:41:30.000 But sometimes they get it wrong, and then we end up taking down accounts that we weren't supposed to take down.
01:41:36.000 And that sucks, right?
01:41:38.000 Because then we're kind of getting in the way of people expressing legitimate things.
01:41:42.000 There's, you know, no system is ever going to be perfect, so the question is, you know, do you want more, I don't know, false positives or false negatives?
01:41:49.000 Do you want there to be more kind of fake Christian pages, or do you want to accidentally take down, I don't know, what was the example recently of, like, the comedian who had a profile photo that had kind of a gun in it,
01:42:08.000 and that we accidentally took down this guy's page.
01:42:13.000 We were talking about this before.
01:42:14.000 Oh, Kill Tony?
01:42:16.000 Yeah.
01:42:16.000 Yeah, the Kill Tony podcast.
01:42:18.000 Yeah, so it's like, alright, so some AI system was just like, alright, that's clearly violence, right?
01:42:23.000 It's like the profile picture literally has a kind of rifle sight and a gun that shows the guy dying.
01:42:30.000 So it's like...
01:42:32.000 So, I mean, that sucks.
01:42:33.000 This is some of the stuff that...
01:42:34.000 I mean, this hurts me, right?
01:42:36.000 Because when we take down something that we're not supposed to.
01:42:44.000 I mean, that is like...
01:42:46.000 I mean, that's the worst.
01:42:48.000 How do you discern?
01:42:50.000 Say, like, these Christian Facebook pages.
01:42:53.000 I don't know how they found out that 19 of 20 were fake.
01:42:57.000 But if someone just says, I am Bob Smith, and they post as Bob Smith, and they have a photograph, but really what they're doing is trying to talk shit about Joe Biden and get people to vote Republican in the midterms.
01:43:10.000 Like, how...
01:43:12.000 How do you know whether someone's real or not?
01:43:14.000 This is the big argument with Elon and Twitter.
01:43:17.000 Because Elon asked Twitter, like, what percentage of your website is filled with bots?
01:43:23.000 And they say 5%.
01:43:24.000 And he says, I don't believe you.
01:43:25.000 I think it's higher.
01:43:27.000 And let's find out how you've come to this conclusion.
01:43:30.000 And, you know, I believe they said that they just took a hundred random Twitter pages and looked at the interaction and there's some sort of an algorithm they applied to it.
01:43:40.000 But how do you discern?
01:43:43.000 Yeah, so I think estimating the overall prevalence is one thing, but I think that the question of looking at a page and is this page authentic, I think that there's a bunch of signals around that.
01:43:54.000 One of the things that we try to do is for large pages, we try to make sure that we know who the admin of that page is.
01:44:01.000 You should be able to run an anonymous page.
01:44:03.000 You don't necessarily need to out yourself and say who you are when running it, but we want to make sure that we sort of have, like, an identity for that person on file so that way we know, like, at least behind the scenes, that that person is real.
01:44:17.000 For certain political things, I think having a sense of what country they're originating from, I mean, some of that you can do just by looking at where their server traffic comes from, like, is the IP address coming from Romania or, you know, is...
01:44:33.000 Because if it's, like, an ad in some other country's election, then, you know, you probably want to make sure that that ad is, especially in countries that have laws around that, coming from someone who's a valid citizen or, like, at least in that place.
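A minimal sketch of the origin signal he mentions for political ads, where the ip_to_country lookup is a hypothetical stand-in for whatever geolocation data is actually used:

```python
def ad_origin_matches_election(advertiser_ip, election_country, ip_to_country):
    """Flag political ads whose server traffic originates outside the election's country.

    ip_to_country: a lookup function (hypothetical) mapping an IP address to a country code.
    """
    origin_country = ip_to_country(advertiser_ip)
    return origin_country == election_country

# e.g. ad_origin_matches_election("203.0.113.7", "US", lookup) would return False
# if the lookup resolved the address to "RO" (Romania), as in the example above.
```

This is only one signal; as he notes, it would be combined with things like knowing who administers the page.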
01:44:48.000 So there's a bunch of...
01:44:50.000 I think...
01:44:52.000 One theme in my worldview around this stuff, when it gets to some of the stuff that we talked about before, is I don't think that this stuff is black and white or that you're ever going to have a perfect AI system.
01:45:02.000 I think it's all trade-offs all the way down.
01:45:05.000 And you could either build a system that's overly aggressive and captures a higher percent of the bad guys, but then also by accident takes out some number of good guys, or you could be more conservative and say,
01:45:21.000 okay, no, the cost of taking out any number of good guys is too high, so we're going to tolerate having just a little bit more bad guys on the system.
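A minimal sketch of that trade-off, assuming a classifier that scores each account and a single takedown threshold; the scores and labels below are illustrative only, not anything from Meta's systems:

```python
def moderation_outcomes(scores, labels, threshold):
    """Count enforcement outcomes for a given takedown threshold.

    scores: model scores in [0, 1], higher = more likely to be a bad actor.
    labels: ground truth, 1 = actually bad, 0 = legitimate.
    """
    taken_down = [s >= threshold for s in scores]
    false_positives = sum(t and not y for t, y in zip(taken_down, labels))   # good guys removed
    false_negatives = sum((not t) and y for t, y in zip(taken_down, labels)) # bad guys missed
    return false_positives, false_negatives

scores = [0.95, 0.80, 0.60, 0.40, 0.20]
labels = [1, 1, 0, 1, 0]
for threshold in (0.5, 0.7, 0.9):
    fp, fn = moderation_outcomes(scores, labels, threshold)
    print(threshold, "false positives:", fp, "false negatives:", fn)
# Lowering the threshold catches more bad actors but removes more legitimate accounts;
# raising it spares legitimate accounts but lets more bad actors through.
```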
01:45:32.000 These are values questions, right, around what do you value more?
01:45:36.000 And those are super tricky questions.
01:45:39.000 And part of what I've struggled with around this is I didn't get into this...
01:46:11.000 But I also don't think that as a matter of governance, you want all of that decision-making vested in one individual.
01:46:29.000 So I think one of the things that our country and our government gets right is the separation of powers.
01:46:34.000 So...
01:46:35.000 You know, one of the things that I tried to create is we created this oversight board.
01:46:38.000 It's an independent board that basically we appointed people whose kind of paramount value is free expression, but they also balance that with things like when is there going to be real harm to others in terms of safety or privacy or other human rights issues.
01:46:53.000 And basically, that board...
01:46:58.000 People in our community can appeal cases to when they think that we got it wrong, and that board actually gets to make the final binding decision, not us.
01:47:05.000 So, in a way, I actually think that that is a more legitimate form of governance than having just a team internally that makes these decisions, or maybe some of them go up to me, although I don't spend a ton of my time on this on a day-to-day basis.
01:47:20.000 But like, I think it's generally good to have some kind of separation of powers where you're architecting the governance so that way you have different stakeholders and different people who can make these decisions and it's not just like one private company that's making decisions even about what just happens on our platform.
01:47:37.000 How do you guys handle things when they're a big news item that's controversial?
01:47:43.000 Like, there was a lot of attention on Twitter during the election because of the Hunter Biden laptop story, the New York Post.
01:47:52.000 Yeah, so you guys censored that as well?
01:47:54.000 So we took a different path than Twitter.
01:47:57.000 I mean, basically, the background here is the FBI, I think, basically came to us, some folks on our team, and was like, hey, just so you know, you should be on high alert.
01:48:08.000 We thought there was a lot of Russian propaganda in the 2016 election.
01:48:12.000 We have it on notice that basically there's about to be some kind of dump of...
01:48:20.000 That's similar to that.
01:48:21.000 So just be vigilant.
01:48:23.000 So our protocol is different from Twitter's.
01:48:25.000 What Twitter did is they said you can't share this at all.
01:48:29.000 We didn't do that.
01:48:30.000 What we do is we have...
01:48:32.000 If something's reported to us as potentially misinformation, important misinformation, we also have this third-party fact-checking program because we don't want to be deciding what's true and false.
01:48:43.000 And for the...
01:48:45.000 I think it was...
01:48:46.000 Five or seven days when it was basically being determined whether it was false, the distribution on Facebook was decreased, but people were still allowed to share it.
01:48:59.000 So you could still share it.
01:49:00.000 You could still consume it.
01:49:02.000 So when you say the distribution has decreased, how does that work?
01:49:06.000 Basically, the ranking in newsfeed was a little bit less.
01:49:09.000 So fewer people saw it than would have otherwise.
01:49:12.000 By what percentage?
01:49:14.000 I don't know off the top of my head, but it's meaningful.
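One plausible way such a demotion could be applied, as a sketch: the feed ranker multiplies a post's relevance score by a demotion factor while a fact-check is pending, so the post is still shareable and viewable but ranks lower. The factor below is made up; the actual ranking internals and percentages are not described in the conversation.

```python
DEMOTION_FACTOR = 0.5  # hypothetical: halve the ranking score while a fact-check is pending

def ranked_feed(posts):
    """posts: list of dicts with a base relevance 'score' and a 'fact_check_pending' flag."""
    def effective_score(post):
        score = post["score"]
        if post.get("fact_check_pending"):
            score *= DEMOTION_FACTOR  # the post is not removed, it just ranks lower
        return score
    return sorted(posts, key=effective_score, reverse=True)

# A demoted post can still surface; it just loses out against comparable undemoted posts.
```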
01:49:18.000 But basically, a...
01:49:22.000 A lot of people were still able to share it.
01:49:24.000 We got a lot of complaints that that was the case.
01:49:26.000 Obviously, this is a hyper-political issue, so depending on what side of the political spectrum, you either think we didn't censor it enough or censored it way too much.
01:49:34.000 But we weren't sort of as black and white about it as Twitter.
01:49:37.000 We just kind of thought, hey, look, if the FBI, which I still view as a legitimate institution in this country, it's very professional law enforcement, they come to us and tell us that we need to be on guard about something, then I want to take that seriously.
01:49:50.000 Did they specifically say you need to be on guard about that story?
01:49:54.000 No.
01:49:55.000 I don't remember if it was that specifically, but it basically fit the pattern.
01:49:59.000 When something like that turns out to be real, is there regret for not having it evenly distributed and for throttling the distribution of that story?
01:50:12.000 What do you mean evenly distributed?
01:50:14.000 I mean evenly in that it's not suppressed.
01:50:17.000 Yeah, yeah, yeah.
01:50:18.000 I mean, it sucks.
01:50:20.000 Yeah.
01:50:20.000 Yeah, I mean, because it turned out after the fact, I mean, the fact checkers looked into it, no one was able to say it was false, right?
01:50:27.000 So basically it had this period where it was getting less distribution.
01:50:32.000 So, yeah.
01:50:33.000 But I think it probably sucks, though, I think in the same way that probably having to go through a criminal trial but being proven innocent in the end sucks.
01:50:44.000 It still sucks that you had to go through a criminal trial, but at the end you're free.
01:50:49.000 So I don't know if the answer would have been don't do anything or don't have any process.
01:50:54.000 I think the process was pretty reasonable.
01:50:57.000 We still let people share it.
01:50:59.000 But obviously you don't want situations like that.
01:51:02.000 But certainly much more reasonable than Twitter's stance, and it's probably also a case of armchair quarterbacking, right?
01:51:09.000 Or at least Monday morning quarterbacking, I should say.
01:51:11.000 Because in the moment, you had reason to believe, based on the FBI talking to you, that it wasn't real.
01:51:19.000 And that there was going to be some propaganda.
01:51:21.000 So what do you do?
01:51:23.000 Yeah.
01:51:24.000 And then if you just let it get out there, and what if it changes the election and it turns out to be bullshit?
01:51:29.000 That's a real problem.
01:51:31.000 And I would imagine that those kind of decisions are the most difficult.
01:51:36.000 The decisions of, like, what is allowed and what is not allowed.
01:51:40.000 Yeah, I mean, what would you do in that situation?
01:51:42.000 I don't know what I would do.
01:51:43.000 I would have to, like, really thoroughly...
01:51:45.000 Well, first of all...
01:51:47.000 You're dealing with the New York Post, which is one of the oldest newspapers in the country.
01:51:51.000 So I would say, I would want to talk to someone from the New York Post.
01:51:57.000 And I would say, how did you come up with this data?
01:52:00.000 Like, where are you getting the information from?
01:52:03.000 How do you know whether or not this is correct?
01:52:05.000 And then you have to make a decision, because they might have got duped.
01:52:09.000 It's very...
01:52:11.000 It's hard, because everybody wants to look at it after the fact.
01:52:14.000 Now that we know that the laptop was real and it was a legitimate story and there is potential corruption involved with him, we think, oh, that should not have been restricted, that should not have been banned from sharing on Twitter.
01:52:32.000 Right.
01:52:33.000 I think everybody agrees with that.
01:52:34.000 Even Twitter agrees with that.
01:52:35.000 But the thing is, then, they didn't think that.
01:52:39.000 In the beginning, they thought it was fake.
01:52:41.000 So what do they do?
01:52:42.000 Like, if something comes along and the Republicans cook up some scheme to make it look like Joe Biden's a terrible person, and they only do it so that they can win the election, but it's really just propaganda, what are you supposed to do with that?
01:52:56.000 You're supposed to not allow that to be distributed.
01:52:58.000 So if they think that's the case, it makes sense to me that they would try to stop it.
01:53:04.000 But I just don't think that they looked at it hard enough. When the New York Post is talking about it, they're pretty smart about what they release and what they don't release.
01:53:15.000 If they're going over some data from a laptop and you could talk to a person, but again, this is just one story, one individual story.
01:53:25.000 How many of these pop up every day, especially in regards to polarizing issues?
01:53:31.000 Like climate change or COVID or foreign policy or Ukraine.
01:53:37.000 Anytime there's a really controversial issue where some people think that it's imperative that you take a very specific stance and you can't have the other stance.
01:53:48.000 Those moments on social media, those trouble a lot of people because they don't know why certain things get censored or certain things get promoted.
01:54:01.000 Yeah, I agree.
01:54:04.000 And it's like, to be in your spot, and one of the things that I really wanted to talk to you about is this, because to be in your spot must be insanely difficult.
01:54:12.000 To have, no matter what decision you make, you're going to have a giant chunk of people that are upset at you.
01:54:19.000 And there might be a right way to handle it, but I don't know what the fuck right way is.
01:54:23.000 Well, I think the right way is to establish principles for governance.
01:54:29.000 That try to be balanced and not have the decision-making too centralized.
01:54:34.000 Because I think that it's hard for people to accept that some team at Meta or that I personally am making all these decisions.
01:54:44.000 And I think people should be skeptical about so much concentration around that.
01:54:48.000 So that's why a lot of the innovation that I've tried to push for in governance is around things like establishing this oversight board.
01:54:57.000 So that way you have people who are luminaries around expression from all over the world, but also in the U.S. You know, folks like Michael McConnell, who's a Stanford professor, and I forget which Republican president appointed him,
01:55:14.000 but I mean, he was, I think, going to be considered for the Supreme Court at some point.
01:55:18.000 I mean, he's a very...
01:55:22.000 Yeah.
01:55:39.000 And that's a step in the right direction.
01:55:41.000 I mean, in the Hunter Biden case that you talked about before, I don't want our company to decide what's misinformation and what's not.
01:55:48.000 So we work with third parties and basically let different organizations do that.
01:55:54.000 Now, I mean, then you have the question of, are those organizations biased or not?
01:55:57.000 And that's a very difficult question.
01:56:00.000 But at least we're not the ones who are basically sitting here deciding...
01:56:02.000 We're not the ministry of truth for the world that's deciding whether everything is true or not.
01:56:07.000 So...
01:56:07.000 I'd say this is not a solved problem.
01:56:12.000 Controversies aren't going away.
01:56:14.000 You know, I think that there's...
01:56:15.000 It is interesting that the U.S. is actually more polarized than most other countries.
01:56:24.000 So I think sitting in the US, it's easy to extrapolate and say, hey, it probably feels this way around the whole world.
01:56:32.000 And from the social science research that I've seen, that's not actually the case.
01:56:36.000 There's a bunch of countries where social media is just as prominent, but polarization is either flat or has declined slightly.
01:56:43.000 So there's something kind of different happening in the US. But for better or worse, it does seem like the next several years are set up to be quite polarized.
01:56:56.000 So I tend to agree with you.
01:56:57.000 There are going to be a bunch of different decisions like this that come up.
01:57:02.000 Because of the scale of what we do, almost every major world event has some angle that's like the Facebook or Instagram or WhatsApp angle about how the services are used in it.
01:57:12.000 So, yeah, I think just establishing as much as possible independent governance so that way...
01:57:19.000 I'll obviously have to be involved, our teams.
01:57:22.000 Nick Clegg, who I appointed to be the president for all the policy issues for the company, and he was formerly the deputy prime minister in the UK, a successful politician there, and very well-versed in government and all those political issues.
01:57:43.000 We'll have to do some part of this, but I think also kind of getting to more and more independent governance is going to be an important part of how we deal with this.
01:57:51.000 Why do you think the United States is more polarized?
01:57:54.000 Like, what do you think is happening over here that's causing that?
01:57:58.000 I think that that's...
01:57:59.000 I mean, I'll speculate, but I think that there are people who have studied and thought about this a lot more.
01:58:05.000 I... I think there's probably a media environment...
01:58:22.000 I don't want to say uniquely, because we're probably not the only country that has this, but one where some of the news is so far left and some of it is so far right.
01:58:36.000 I think there's all this talk about filter bubbles on the internet, but I think even predating this, going back to the 70s or 80s when Fox News and all these other cable, these prominent media organizations were established,
01:58:52.000 I think that that has had a long-term effect and people have studied that.
01:58:58.000 But there might also be something about just the way that our governance is set up where we have two parties.
01:59:03.000 We have these primaries that basically make it so that it's almost like you're not promoting people who are trying to be the centrist.
01:59:14.000 You're basically promoting people who are the extreme of their party.
01:59:16.000 So I think that there are really sensible reforms, like open primaries, that I think would probably have a pretty big impact on the political culture in the country.
01:59:26.000 And some of these other countries that are a little bit more parliamentary by definition just allow there to be more candidates on more parts of the spectrum.
01:59:37.000 But I want to be careful about not talking too far out of school because I'm not a political scientist.
01:59:44.000 But I've obviously spent a little bit of time thinking about this because...
01:59:48.000 I think a lot of people want to point to social media as the primary cause of this.
01:59:52.000 And I just think when you look at the fact that polarization has been rising in the US since before the internet, that just makes it seem like it's very unlikely that social media is kind of the prime mover here.
02:00:02.000 And then if you look at...
02:00:05.000 Yeah.
02:00:19.000 I don't know.
02:00:20.000 I mean, it's a really tough set of questions.
02:00:23.000 I think you're dead on with the open primary idea because this idea that it's only party loyalists who get to vote on each side, you're promoting this ideological adherence instead of reasonable ideas that people can enjoy or not enjoy and resonate with or not.
02:00:41.000 I don't think social media is to blame, but I think social media for a lot of people, it accentuates the divide because it gives them more time to immerse themselves with it.
02:00:55.000 And I think it's an unfortunate aspect of some people that they spend a lot of time distracted on things that don't immediately affect them.
02:01:05.000 But those things become their main focus in life.
02:01:09.000 And I think that's a distraction that's almost like a form of procrastination that people get involved with.
02:01:16.000 And it just seems like a natural thing with people.
02:01:20.000 I think it's a time management issue, and I think it's a discipline issue.
02:01:23.000 And I think some people have never really been taught time management or discipline, especially in regards to the type of information that you take in.
02:01:31.000 They just, like, see something that upsets them.
02:01:33.000 What is that?
02:01:34.000 What's going on?
02:01:35.000 Why are they doing that?
02:01:36.000 And then they just get upset, and then that's their whole day.
02:01:39.000 And, you know, I see things in terms of, like, I'm very careful with time management.
02:01:47.000 Because, like, anything that's going to take up too much time that's not a net benefit, that I'm not enjoying, or that's going to wind up being a negative thing, I'm just not interested.
02:01:59.000 But I've developed this over time to recognize, like, that's a trap.
02:02:03.000 Get out of there.
02:02:04.000 Like, you can't put out all the fires.
02:02:07.000 You will be a fireman all day long.
02:02:09.000 There's no way.
02:02:10.000 If you just want to be upset at things and just engage with things that will upset you, there is no shortage of news stories.
02:02:16.000 There's no shortage of political issues.
02:02:19.000 There's no shortage of everything.
02:02:20.000 And you have to figure out time management and discipline.
02:02:23.000 And some people never do.
02:02:24.000 And I think that's more of the problem than social media and algorithms and all these different things that people are blaming for our woes.
02:02:34.000 More of the problem is a lack of education.
02:02:38.000 Like, explaining to people that you, if you're awake for one hour, during that hour, it is your choice what to think and focus on.
02:02:49.000 In 24 hours, it's the same thing, you're just spreading it out.
02:02:52.000 You decide what to do.
02:02:54.000 Now, if you want to spend all of your time going back and forth on Roe v. Wade on Twitter, good luck.
02:03:01.000 Go do that.
02:03:02.000 You're not going to change the landscape.
02:03:04.000 That's what voting's for.
02:03:06.000 You're not going to change people's opinions.
02:03:08.000 If you want to make interesting videos and post them on Facebook, okay.
02:03:12.000 If you want to talk about things and have a perspective that you think is really well-formed and it's compelling, Go ahead and make that.
02:03:20.000 But if you get sucked into that world of just looking for things to complain about, that's really you.
02:03:28.000 You don't have to do that.
02:03:29.000 No one's forcing you to do that.
02:03:33.000 Yeah, I mean, I think your point around what do people have control over is really important.
02:03:40.000 Because I think the people who are happy and productive, I think, tend to focus on things that they have some agency over.
02:03:48.000 Yes.
02:03:48.000 And it's not that the other issues aren't important, right?
02:03:53.000 And national civic issues, I mean, they matter.
02:03:55.000 But there probably is a healthy balance.
02:04:00.000 Where, yeah, I mean, this just goes back to the time management conversation we were having before, which, I mean, you could spend all your day and more a thousand times over just reacting to things that are going on in the world.
02:04:11.000 Yeah.
02:04:13.000 I do think there's really a thing around kind of narrowing the aperture in your life to, like, what's around you, the people you care about.
02:04:31.000 I think that that does drive a lot of happiness for people.
02:04:37.000 So I think it's one of the interesting questions is how do we balance now having access to a historically unprecedented amount of information about issues that are going on in other places, which on the one hand drives...
02:04:51.000 In theory, it should drive more transparency and accountability and energy towards those things.
02:04:56.000 But maybe that energy needs to come balanced over a longer period of time or something.
02:05:05.000 I tend to think that all the transparency that we've gotten from social media will lead to good progress on a lot of things.
02:05:11.000 But I do think it can, if you just focus on...
02:05:20.000 I do think there's a certain thing about people's happiness that has to come from what's right around you in your world.
02:05:28.000 Most definitely.
02:05:29.000 And that's not to say that you shouldn't get upset about important issues and express yourselves.
02:05:36.000 It's just a matter of how much time you're spending on it.
02:05:40.000 Unless you're really disciplined and really careful with your time, you can get sucked into these things and you could waste your life just arguing with people online.
02:05:53.000 I don't think it's healthy for folks.
02:05:57.000 For entertainers and comedians in particular, it's really bad.
02:06:01.000 Like, I see so many comedians that get so much anxiety from, like, reading comments and going back and forth with people who are, like, talking shit to them on Twitter.
02:06:10.000 Yeah.
02:06:11.000 And I always tell them, like, don't do that.
02:06:13.000 Like, it's really bad for you.
02:06:14.000 Yeah, and I struggle with this, too.
02:06:16.000 Do you?
02:06:16.000 Well, I mean, on the one hand, some of this is free product feedback.
02:06:20.000 Right.
02:06:21.000 So, I mean, so, like, I actually think this is, like, one of the hard things about it.
02:06:27.000 You don't want to be so closed that you're not listening to criticism because then you're not going to grow.
02:06:32.000 But I think finding people and outlets that will provide criticism but from a place of actually trying to help you grow rather than tear you down is very rare.
02:06:47.000 I struggle with this.
02:06:48.000 I do sometimes feel like I need to...
02:06:51.000 I do want to try to understand all of the different perspectives that people have.
02:06:55.000 But the thing that's tough is that a lot of those people aren't necessarily trying to help us build something better.
02:07:00.000 And there is just a lot of negativity and it gets to you.
02:07:05.000 And I think there's a question of balance where, at what point are you kind of better off?
02:07:13.000 It's like, yeah, you want to push forward on the things you believe in, but you don't want to, you know, put on blinders and not consider alternative viewpoints, but then you could spend all your time looking at critique that's not necessarily trying to be constructive, and then that's just going to be super negative for your mental health.
02:07:28.000 So, yeah, I mean, I think probably a lot of the happiest and most productive people... I don't think you should want to close off all that stuff completely, but I think at least being able to carve out a good amount of your day to be able to focus on what you want to push forward and things in your life that matter,
02:07:51.000 I think that's just really important to being a grounded person.
02:07:56.000 I think it's also important to establish an ethic where you communicate with people online the same way you communicate with them if they're in a room with you.
02:08:04.000 And I think that is not something that a lot of people adhere to.
02:08:09.000 People, they talk to people on Twitter like it's not a real human, they don't have real feelings, and you're just trying to say the most biting, mean, and cutting thing that you can.
02:08:21.000 And that's unfortunate.
02:08:23.000 I don't do that.
02:08:24.000 I used to engage in it.
02:08:26.000 Like, I used to argue with people back and forth, and then I realized, like, what am I doing?
02:08:29.000 Like, this is not good.
02:08:30.000 I always feel like shit.
02:08:31.000 I never feel good.
02:08:32.000 Even if I win the arguments, it doesn't feel good.
02:08:35.000 You're filled with anxiety, and then a new fire starts up in the comments.
02:08:39.000 You know, like, someone else will jump in, and then you've got a new opponent, and like, what are you doing?
02:08:44.000 Like, that's a...
02:08:46.000 Massive resource problem.
02:08:48.000 It's a giant issue in whether or not you want to focus on important things in life or whether you want to win these little verbal battles between people on Twitter or Facebook.
02:08:59.000 It's not necessary.
02:09:00.000 It's just for allocation of resources.
02:09:04.000 It's a terrible idea.
02:09:05.000 What we hear from our community is that that's not what people want to spend their time on.
02:09:11.000 I think that part of the challenge in designing products is that sometimes what people tell you they want to spend their time on is different from what they actually do spend their time on.
02:09:22.000 I'm sure.
02:09:38.000 There's some truth and aspiration in what they think they want to spend their time on, so there is some long-term value in helping out with that, too.
02:09:45.000 So that's why I've just consciously tried to downplay a lot of the political controversy on the services a bit. And what do people come to our services for?
02:09:58.000 It's connecting with other people.
02:10:01.000 Expressing what matters to you, which for most people isn't some big global issue.
02:10:08.000 It's something that matters in their life.
02:10:10.000 What's going on with my kid's life?
02:10:12.000 How's my wife doing?
02:10:14.000 What's going on in my local community?
02:10:20.000 I think that there's something that's powerful about being able to focus more on that and be a bit more grounded in that.
02:10:26.000 It's not that we do that perfectly, but I do think social media tends to allow that a lot more than previous mass media did, because by definition, mass media just had to focus on issues that concerned a lot of people at once, whereas...
02:10:40.000 I think one of the best parts of social media is that it is so inherently local to just what matters to you and your friend group.
02:10:48.000 What's more kind of local to you than the specific people that you have relationships with?
02:10:53.000 I mean, that's the appeal.
02:10:55.000 That's why so many people use it.
02:10:57.000 Very good point.
02:10:58.000 And also, the interesting aspect of social media that I think often gets ignored is the discussion of social issues.
02:11:05.000 I mean, people have a greater understanding about how most people think about social issues today than we ever did in the past.
02:11:12.000 We were sort of informed how we felt about things based on the news, based on, you know, the rare commentator on the news or stories that were in the news or editorials that were in the New York Times or what have you.
02:11:25.000 And now you get a day-to-day sense of how people feel about things.
02:11:29.000 And of course, it's also clouded by people that are saying things to sort of virtue signal and get people to like them based on the opinion they think is going to be the most likely to attract positive attention.
02:11:41.000 But it at least is opening up this new field of people openly debating and discussing ideas that used to be only talked about by people that were already approved and on television and in the media.
02:11:58.000 Yeah.
02:11:58.000 We sort of get this, particularly about videos, right?
02:12:01.000 Videos are a really interesting example of that because someone can have a really concise and interesting perspective on something and that'll get shared millions and millions of times.
02:12:10.000 And it just has to go viral.
02:12:12.000 It just has to catch someone's opinion and go, wow, she's got a really good point.
02:12:15.000 And then that gets out there.
02:12:17.000 That is really powerful.
02:12:19.000 And that's something that never existed before.
02:12:22.000 Where just a regular person with an interesting idea can just catch fire.
02:12:26.000 Yeah.
02:12:27.000 Yeah, and I think that that's an area, hopefully, with better recommendation systems that will be able to be more possible in the future than has been in the past, I think, as the AI can help people discover things that they might be interested in.
02:12:40.000 But, yeah, I mean, I'm a little more mixed on this.
02:12:46.000 I mean, I think on the one hand, I think the comments and actual discussions that happen online...
02:12:54.000 I mean, I think live interactions, like what we're having now, I think that's hopefully an interesting discussion for people who are watching.
02:13:04.000 But I think if you look at comment threads like you're talking about, I think that that experience probably needs a significant amount of innovation before it's good.
02:13:13.000 But I do think there's value in being able to see different people's opinions, maybe more the original posts than the comments back and forth.
02:13:25.000 Because when I see a friend has some opinion on something, I know where that person's coming from in terms of their values and their life story.
02:13:33.000 And that just means a lot more to me than, I don't know, the New York Times telling me that something is good or bad.
02:13:43.000 And there's also a lot of diversity because people tend to have friends who are from different backgrounds.
02:13:48.000 And before the internet, I think the average person basically had a few different media sources.
02:13:56.000 Each one had some kind of specific editorial leaning.
02:14:01.000 Now, the data that I've seen on this actually is that social media generally exposes people to a way more diverse set of views.
02:14:08.000 Now, there is a question about how people react to that.
02:14:11.000 I think sometimes when people see stuff that they don't agree with, there's a...
02:14:19.000 Yeah, I think.
02:14:39.000 The idea of filter bubbles and, like, people only seeing one type of thing,
02:14:42.000 I think has by this point been pretty thoroughly debunked, in terms of just, like, statistically, the diversity of what you're seeing online from different sources is way greater than it ever was before.
02:14:53.000 I think people just don't make very compelling arguments.
02:14:56.000 That's one of the reasons why so few people are willing to think and listen to differing opinions on things.
02:15:02.000 You know, it's so often people are either preaching to the choir or shouting down at the person that has the opposing view instead of expressing themselves in a very neutral and objective way that considers all the possibilities.
02:15:20.000 And this is like, I mean, this brings me back to fact checks, like fact checkers, because oftentimes fact checkers are incorrect, and they are biased, and it is subjective as to whether or not what is a fact and what is not a fact,
02:15:35.000 especially about some more controversial issues.
02:15:38.000 Like, how do you choose fact checkers, and how does a fact checker go through a mountain of data and come to a conclusion that is then used for content moderation?
02:15:55.000 Yeah, so there's a whole professional discipline around fact checking, where these organizations basically get accredited, and they're generally,
02:16:10.000 I think, quite professional about how they do this.
02:16:12.000 Generally.
02:16:13.000 Yeah.
02:16:14.000 And so, I mean, that's another thing is, you know, not only did we not want to be deciding what is true or false, we also didn't want to be in the business of deciding which fact checkers are professional and not.
02:16:24.000 So we basically outsourced that to this accreditation.
02:16:28.000 I mean, it's widely respected as sort of the best that there is, even though it's not without flaws, like you're saying.
02:16:37.000 But what we tried to do was basically give the fact-checkers basic guidelines to not focus on things that could be opinion-y, right?
02:16:49.000 So there are things online that are, like, obviously...
02:16:55.000 I don't know, just like obviously kind of wrong memes or I don't know, like crazy conspiracy theories or something like that.
02:17:04.000 And I think that that's a pretty categorically different set of things than, like, is there some shade to which some political candidate said something that was slightly false, and can we use that as an excuse to, like, ding them,
02:17:21.000 right?
02:17:21.000 So I think...
02:17:25.000 When the program is working the way it's supposed to, I mean, the overwhelming majority of people in our community tell us that they don't want to see things that are kind of obviously false flowing through the system, right?
02:17:36.000 It just decreases trust in the system.
02:17:38.000 And if there was a way to get rid of that, then it's like people on both sides of the political spectrum would want that to be the case.
02:17:44.000 I think where it ends up being an issue is when the fact checkers sort of veer towards getting into stuff that's not as obviously black and white and a little more political.
02:17:55.000 I mean, a lot of the stuff that's blatantly wrong isn't necessarily even political.
02:17:59.000 It's just like stupid shit.
02:18:02.000 So those are the areas that I've seen that become the most controversial.
02:18:07.000 How do you make decisions when, like, I can understand the wanting to stop the spread of misinformation, but there's certain things that are so dumb where I feel like they should be allowed to be spread,
02:18:24.000 like flat earth.
02:18:26.000 Like, if someone has a flat earth theory, God, I want to listen.
02:18:30.000 I want to listen because it's so dumb.
02:18:32.000 I want to know how does someone start to form these ideas.
02:18:37.000 Because there's a thriving community.
02:18:40.000 I don't know if you know, have you ever Googled hashtag space is fake?
02:18:43.000 I have not.
02:18:44.000 You should.
02:18:45.000 I'm not sure I have enough time for that.
02:18:47.000 There's a large group of humans out there that believe that we live in some sort of a dome and that there's essentially light bulbs hung in the sky.
02:18:57.000 It's so dumb.
02:18:58.000 But I mean, what do you do about that?
02:19:00.000 If I was running Facebook, I would let that stay.
02:19:04.000 I would leave that in there.
02:19:05.000 That's so dumb.
02:19:07.000 So one important nuance on this, though, is we don't block misinformation.
02:19:13.000 We basically just have a label that goes on it that says that a fact checker says this is false and show it a little bit less in the ranking and news feed.
02:19:23.000 But don't you stop a person's ability to share something with that tag?
02:19:29.000 No, you can share it.
02:19:31.000 But what if someone is a person that is known to spread certain misinformation?
02:19:36.000 Don't you make it so you can't tag that person?
02:19:39.000 It depends on what the stuff is.
02:19:40.000 I mean, we can go super deep on all the nuances.
02:19:43.000 There's misinformation that could lead to harm, right?
02:19:47.000 So misinformation that veers on things that lead to violence or health safety that we treat in one way.
02:19:53.000 And then there's just misinformation, like stuff that's wrong that people say reduces trust in the system when they see it, but that...
02:20:00.000 We have no reason to believe is going to lead to any, like, physical safety issue for people.
02:20:05.000 And that we just treat differently.
02:20:07.000 I mean, that it's like, yeah, we'll put a label on it.
02:20:09.000 We'll, you know, if we have the choice to either show your cousin's, you know, giving-birth photo or that, we'll kind of show the other content above it.
02:20:17.000 But fundamentally, we're not going to prevent you from sharing it or prevent people from seeing it.
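A minimal sketch of the two-tier handling described above, with hypothetical action names standing in for the real policies:

```python
def handle_flagged_post(fact_check_rating, risks_physical_harm):
    """Route a fact-checked post to an action, following the two tiers described above.

    fact_check_rating: e.g. "false" or "partly_false", or None if no fact-checker reviewed it.
    risks_physical_harm: True for misinformation tied to violence or health and safety.
    """
    if fact_check_rating is None:
        return "no_action"
    if risks_physical_harm:
        return "remove_or_restrict"  # the stricter treatment for harm-adjacent misinformation
    return "label_and_demote"        # still shareable and viewable, just labeled and ranked lower
```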
02:20:23.000 You know, when Jack Dorsey and I had a conversation about this, one of the things he said that he was in favor of and was trying to promote the idea of two versions of Twitter.
02:20:35.000 He wanted to have a moderated Twitter and then he wanted to have a Wild West Twitter.
02:20:41.000 Like, you wanted to have something where it's like 4chan or something.
02:20:45.000 Just, like, let people do whatever they want and just open up those barn doors and as soon as you go in there, it's chaos.
02:20:52.000 Like, what do you think about that?
02:20:54.000 And do you think that there is an important function of content moderation, to set a tone?
02:20:59.000 I mean, you can kind of see it.
02:21:02.000 Yeah, I mean, yeah.
02:21:03.000 I think that the tough thing is...
02:21:06.000 So there's, like...
02:21:09.000 Most of the categories of harmful content are things that I think almost everyone would agree on.
02:21:14.000 Right?
02:21:15.000 So it's like, you don't want foreign nations interfering, like the bot networks.
02:21:21.000 You don't want terrorism.
02:21:23.000 You don't want child pornography.
02:21:24.000 You don't want blatant intellectual property violations.
02:21:28.000 You don't want people promoting violence.
02:21:29.000 Okay, so you go through all this stuff.
02:21:30.000 I think that that's...
02:21:33.000 Most of that stuff, I actually think, is not that controversial.
02:21:37.000 It's like people want it gone and they expect us to, as a technology company that operates at scale, to be able to do this reasonably well.
02:21:47.000 So then I think that there are a couple of issues.
02:21:50.000 One is sometimes those systems get that stuff wrong, and we say that something is bad when it wasn't, or we miss something that is bad.
02:21:56.000 So there's that type of issue, which is like you just make an operational mistake, which is important, but is kind of one type of issue.
02:22:04.000 Then I think you get into the types.
02:22:08.000 There are a couple of types of issues, and I think misinformation is probably the biggest one.
02:22:15.000 Where there is actually just not widespread agreement at all about how to handle it.
02:22:20.000 I think that a large percent of the population, the vast majority, says that they don't want to see misinformation, but then people disagree on what misinformation is, right?
02:22:32.000 So people don't want to see what they think is misinformation, but honestly, even more than that, they don't want other people to see what they think is misinformation.
02:22:45.000 So that's pretty difficult because then different people have different views.
02:22:49.000 And I think that there, I mean, maybe you could have a policy like what Jack was talking about for that type of content.
02:22:58.000 I mean, I don't think you're going to have a Wild West version of social media where you're just allowing terrorism free-for-all.
02:23:02.000 I mean, that's crazy, right?
02:23:04.000 The Taliban is on Twitter, which is really wild because Donald Trump isn't.
02:23:12.000 Yeah, I'm not super deep on Twitter's policies, so tough for me to comment on that.
02:23:16.000 Well, Twitter has pornography.
02:23:18.000 I mean, they have hardcore pornography.
02:23:20.000 You could just accidentally stumble onto someone you follow's page and they'll have hardcore pornography.
02:23:25.000 Yeah, so that just goes back to your point around the feel of the community. Pornography is a thing that we don't allow.
02:23:36.000 And I think it's somewhat controversial.
02:23:39.000 Because, I mean, you could make a pretty good argument, I think, that this isn't doing physical harm to people.
02:23:46.000 I mean, I know that there's arguments on both sides of that, so I don't want to go super deep on that.
02:23:50.000 But...
02:23:52.000 I'd say our reason for not wanting pornography is more for the feel of the community than kind of the sense of harm.
02:24:01.000 And obviously child pornography is different.
02:24:03.000 That's obviously real harm.
02:24:08.000 But I think that's one category of content where it's kind of more of an editorial moderation decision.
02:24:14.000 I don't think it's a political decision.
02:24:15.000 It's more of like we want the feel of the service to be about people connecting with their friends and family and not necessarily coming across that kind of content.
02:24:24.000 But yeah, I mean, that's sort of, I think, how the whole thing breaks down.
02:24:28.000 I mean, there's most of the stuff that I think gets taken down.
02:24:31.000 Actually, most people would agree needs to get taken down.
02:24:34.000 And then I think there's mistakes.
02:24:36.000 And then there's, you know, stuff like how do you handle misinformation, which I think society as a whole doesn't agree on.
02:24:42.000 So my basic approach to that is...
02:24:45.000 Give people choice, and basically don't take it down, but let people share the stuff, but also flag it if an accredited fact checker said it might be false. And given that we shouldn't be the ones deciding what's true and false,
02:25:00.000 we kind of try to set up this independent governance to do that.
02:25:03.000 I think it's a pretty well-balanced system.
02:25:05.000 It's not perfect.
02:25:06.000 We'll need to keep on iterating on it and making it better over time, but those are the basic principles for kind of how I think about navigating that.
02:25:13.000 One thing that people freak out about, and oftentimes I'm a little skeptical of their concerns, is people think they're being shadow banned.
02:25:23.000 It's always.
02:25:25.000 People think they're being shadow banned.
02:25:27.000 Is shadow banning a real thing?
02:25:29.000 And what does that mean?
02:25:31.000 Well, I mean, there's no policy that is shadow banning.
02:25:34.000 So I think it's sort of a slang term.
02:25:37.000 But that maybe refers to some of the demotions that we're talking about, right?
02:25:43.000 So if someone posts something that gets marked as false by a fact checker, then it'll get somewhat less...
02:25:49.000 Just that post, or all of their posts in the future?
02:25:53.000 I think that if you do it once, then it's just that post.
02:25:56.000 And then I think if there's some history within a page or there's kind of different rules for pages and groups and different things, then there can be some kind of broader policy that applies.
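A heavily simplified sketch of the escalation he's describing, with a hypothetical strike threshold; the actual rules for pages and groups are not spelled out in the conversation:

```python
STRIKE_THRESHOLD = 3  # hypothetical number of fact-check strikes before page-level demotion

def page_enforcement(strike_count):
    """Map a page's history of fact-check strikes to an enforcement level."""
    if strike_count == 0:
        return "none"
    if strike_count < STRIKE_THRESHOLD:
        return "demote_offending_posts_only"  # a single strike affects only that post
    return "demote_page_distribution"         # a pattern triggers a broader, page-level policy
```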
02:26:11.000 But...
02:26:14.000 When I look into this stuff, because a lot of my friends and people I know just send me examples, because unfortunately there are a lot of mistakes.
02:26:19.000 I think part of the issue is that, okay, if there's 3.5 billion people using these services, and if we make a mistake 0.1% of the time, that's like...
02:26:32.000 Still, millions of mistakes.
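As a quick check on that arithmetic, treating it loosely as one decision per person and using the hypothetical 0.1% mistake rate he gives:

```python
users = 3_500_000_000  # roughly 3.5 billion people across the services
error_rate = 0.001     # his hypothetical 0.1% mistake rate
print(f"{users * error_rate:,.0f} mistakes")  # 3,500,000 -> millions of mistakes
```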
02:26:35.000 So there's all these cases, and that sucks.
02:26:38.000 There are all these cases where we missed something that we should have taken down, or we enforced something that we weren't supposed to.
02:26:48.000 But I'd say, as it relates to kind of concerns about shadow banning, a lot of the time when I look into stuff, people attribute some motive, like, ah, Meta has some stupid policy in place that blocked this, or they're banning this thing.
02:27:01.000 And a lot of the time, it was either just a mistake.
02:27:04.000 So nothing was supposed to happen, but there was some bug in the system or some system didn't work the way it was supposed to, which is a real issue, but it's not an ideological issue.
02:27:16.000 And...
02:27:17.000 A lot of the time, also, when people are worried about stuff like shadow banning, it actually, like, maybe their post just wasn't as good or something, and it just didn't get the distribution that they wanted it to.
02:27:30.000 But I don't know.
02:27:31.000 I mean, you highlighted some examples to me a few weeks ago of someone who was saying that they, like, couldn't follow your account or something, and you posted it, and it was...
02:27:39.000 And, I mean, so I, like, looked into it because I'm like, okay, I'm, like, I'm going to see Joe soon, and I kind of want to understand what these issues are.
02:27:48.000 That's an example where it's like it had nothing to do with your account.
02:27:51.000 Basically, there was some bug, and that person had kind of taken a bunch of actions quickly or something, and we just, for spam protection, stopped them from taking a bunch of actions.
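A minimal sketch of the kind of spam protection he's describing, assuming a sliding-window rate limiter; the window size and action limit are hypothetical:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_ACTIONS_PER_WINDOW = 30  # hypothetical limit on actions per user per minute

_recent_actions = defaultdict(deque)

def allow_action(user_id, now=None):
    """Return False if the user has taken too many actions in the recent window."""
    now = time.time() if now is None else now
    actions = _recent_actions[user_id]
    while actions and now - actions[0] > WINDOW_SECONDS:
        actions.popleft()                      # forget actions that fell out of the window
    if len(actions) >= MAX_ACTIONS_PER_WINDOW:
        return False                           # temporarily block further actions
    actions.append(now)
    return True
```

A limiter like this has no notion of which accounts the blocked user was trying to follow, which is consistent with the point that the block had nothing to do with the account in question.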
02:28:04.000 So I think people sometimes read in some ideological bent or policy thing into this that I think often isn't there.
02:28:12.000 But unfortunately, because the scale is so big, there are just going to be millions of mistakes.
02:28:18.000 You're going to be able to find almost any pattern that you want in that much data.
02:28:22.000 So I haven't figured out how to crack that nut of kind of communicating.
02:28:27.000 It's also an interesting problem because people don't really know what's going on behind the scenes.
02:28:34.000 So there's this sort of, in their eyes, a lack of transparency.
02:28:37.000 It's like, how does this all work?
02:28:39.000 So they assume there's nefarious intentions and that someone's censoring them.
02:28:46.000 I'm curious how you'd think about this if you were in my position.
02:28:50.000 I've thought about it.
02:28:52.000 I would imagine it would be incredibly overwhelming, and I'd probably be on Xanax.
02:28:58.000 I don't know.
02:28:59.000 Or you'd just work out more.
02:29:00.000 Just work out more.
02:29:01.000 I wouldn't take Xanax.
02:29:02.000 You'd just do more MMA. But the number of foreign countries that you're dealing with, too.
02:29:07.000 I mean, you have Facebook in how many languages?
02:29:10.000 Oh, I don't even know.
02:29:12.000 I mean, we just rolled out an AI tool that allows...
02:29:21.000 We're good to go.
02:29:36.000 Yeah, yeah.
02:29:36.000 That's there.
02:29:37.000 That should be there.
02:29:38.000 Yeah, no, it is there.
02:29:39.000 It's really cool.
02:29:39.000 Yeah, yeah.
02:29:39.000 I love that.
02:29:41.000 It's really nice, but with the number of countries that are using this, like, I wouldn't even want to just pay attention to the people that are speaking English.
02:29:51.000 If you're trying to moderate all the people that are speaking, you know, a million different languages, like, how?
02:29:57.000 Oh, yeah.
02:29:57.000 I can't imagine.
02:30:00.000 This is one of the things I wanted to talk to you about is when you first started Facebook...
02:30:05.000 You clearly could have never imagined that it would become what it is now.
02:30:11.000 What was it like going through the stages of growth of this thing where, oh great, it's successful.
02:30:19.000 Oh hey, Facebook is taking off.
02:30:21.000 Holy shit, we're overthrowing governments.
02:30:24.000 What is happening?
02:30:26.000 What is this thing now?
02:30:28.000 What has that been like for you to assume this position and to have this position evolve and spread and for you yourself to become this...
02:30:43.000 Worldwide figurehead and, you know, become this insanely successful person who is involved in this social media platform that is so massive.
02:30:55.000 It's just so beyond.
02:30:56.000 And now with WhatsApp and you have Instagram and you have Oculus, you have all this going on.
02:31:02.000 Yeah.
02:31:04.000 Am I freaking you out just thinking about it?
02:31:06.000 No, no, no.
02:31:07.000 I've spent a long time...
02:31:08.000 Yeah, it's like, breathe deeply.
02:31:11.000 No, I've spent a lot of time thinking about this.
02:31:13.000 Don't imagine.
02:31:14.000 It's like, why us?
02:31:16.000 What happened that we were the ones who built this?
02:31:20.000 Like me and this group of people.
02:31:20.000 Where's MySpace?
02:31:21.000 Where'd it go?
02:31:22.000 Yeah, you know?
02:31:24.000 Here's my reflection on this.
02:31:26.000 And I'm curious for your view on if you think this is just crazy.
02:31:29.000 But when I was getting started, I remember...
02:31:33.000 The night that I launched the original Facebook website at my college, it was just a website for my college.
02:31:40.000 It was literally a Facebook.
02:31:42.000 Harvard didn't have a paper Facebook and I was like, this is stupid.
02:31:46.000 Let's just make a version where people can input their own stuff because people like expressing stuff about themselves and people are really interested in learning about other people.
02:31:54.000 So let's go do that and we can help people connect around that.
02:31:58.000 So I launched it and I went to go get pizza with my friends.
02:32:01.000 And that night we were talking about how it was really cool that I got this out at Harvard and people were going to use it at Harvard, but someday someone was going to do this for the world.
02:32:13.000 It was not even a possibility for me that that was going to be us.
02:32:16.000 It was completely obvious that it was going to be someone else.
02:32:19.000 It was like, we're just college kids.
02:32:21.000 Who are we to do this?
02:32:23.000 You have Google and Microsoft and Yahoo at the time, these great technology companies that have thousands of engineers and all these servers and all these resources.
02:32:35.000 It just wasn't even a question.
02:32:36.000 It wasn't even a hope that I had that we would do it.
02:32:38.000 But then, okay...
02:32:40.000 So we just kind of kept going.
02:32:43.000 So we launched it at Harvard and then...
02:32:45.000 What year was that?
02:32:46.000 2004. And a bunch of students from other schools started writing in.
02:32:53.000 And as soon as I optimized the code and basically got...
02:32:57.000 I could make it so I could run this at more colleges at the same time.
02:33:01.000 I started launching at more colleges and like...
02:33:04.000 Launched it at like two more colleges at a time, like every week.
02:33:08.000 And it just kind of kept on going.
02:33:16.000 I think a lot of people just kind of wrote it off.
02:33:19.000 It was this thing that a lot of people in colleges loved, but people were like, okay, that's kind of a kid thing.
02:33:24.000 It's not going to be a global thing.
02:33:27.000 And I was like, no, someday someone's going to do this.
02:33:28.000 It's going to be a global thing.
02:33:31.000 But Google and Microsoft and all these companies could never really get motivated to do it because there was probably a bunch of internal bureaucracy or forces that were naysaying against why this was a valuable thing.
02:33:43.000 So then, okay, then the people who were in college started graduating, and they kept on using it.
02:33:48.000 So then it was like, okay, so this clearly isn't just a college thing.
02:33:52.000 And we started in 2007. We opened it up beyond college so that anyone can sign up, and people of all different ages started signing up.
02:34:01.000 And then the meme shifted from this is a college thing to this is a fad, right?
02:34:06.000 Because people, you know, you mentioned MySpace and, you know, there's this whole string of social apps like this.
02:34:12.000 There's Friendster and there's MySpace and it was the whole thing was there's like one after another.
02:34:15.000 So it's like, no, there's not going to be one for like 20 years, right?
02:34:18.000 That lasts for, like, 20 years or 30 years.
02:34:20.000 It's like this is a fad and probably...
02:34:23.000 Inside Google or Microsoft, there were probably people who thought that this would be a cool thing to do and go build this, but probably a bunch of bureaucracy and people naysaying on it.
02:34:34.000 So we just kept on going.
02:34:36.000 And then the next meme was, oh, it's never going to be a good business.
02:34:41.000 Okay, so there's 100 million people using it.
02:34:45.000 They've been going for like five or six years.
02:34:47.000 Yeah.
02:34:51.000 Maybe it's not a fad.
02:34:52.000 Maybe people can keep on using it.
02:34:54.000 But I mean, there's only one good internet ads business, and it's Google.
02:34:59.000 So it's like the chance that someone can invent another one, that seems really low.
02:35:03.000 So I don't know.
02:35:05.000 No one really motivated.
02:35:06.000 So I guess my reflection on this is that I think with so many things in the world, I think we did it because we just cared more and actually believed in it.
02:35:17.000 So we just kept going.
02:35:19.000 It shouldn't have been us.
02:35:21.000 People had more resources all along the way, but we cared more.
02:35:24.000 And I've just found that that's actually sort of something that I've noticed in other areas too.
02:35:31.000 So if you notice, if you think about what Elon's doing with rockets, or what we're trying to do with the metaverse, it's just like, these are sort of these crazy things.
02:35:42.000 And I do think at some level, that kind of caring and believing in it
02:36:06.000 is probably undervalued in terms of determining who ends up doing what in the world.
02:36:13.000 I think most people probably have an assumption of something that is so obviously true to you that you just assume that other people are going to go do it.
02:36:21.000 But just because it's so obvious to you doesn't actually mean that it's that obvious to other people.
02:36:25.000 You probably have this around a lot of the stuff that you do, the stuff that you talk about, the way you explore all these different topics on your podcast, the way you do comedy.
02:36:36.000 I don't know.
02:36:37.000 I mean, I'm curious if you've had that experience where you feel like those things are just so obvious that obviously other people should get them too, but then just like no one does.
02:36:46.000 Well, I think you had the advantage in the early days of being young and having a perspective that's not overly influenced by commerce and by corporations and by corporate politics.
02:37:04.000 If you think about it, Google tried to do it with Google+.
02:37:07.000 I remember I had a good friend that worked at Google.
02:37:10.000 But it was too late then.
02:37:11.000 Yeah, it was too late.
02:37:12.000 I was joking around with her.
02:37:13.000 I'm like, this is dog shit.
02:37:14.000 This is never going anywhere.
02:37:15.000 It's terrible.
02:37:16.000 And they didn't have an original idea.
02:37:17.000 I mean, you can come late to something as long as you sort of have a unique contribution that you can bring to this place.
02:37:26.000 Right.
02:37:26.000 But they didn't.
02:37:27.000 Yeah.
02:37:27.000 So they were trying to get people to escape, to leave other social media platforms and go to this other clunky one that's basically like a beta version of Facebook.
02:37:40.000 Yeah.
02:37:40.000 You can't build the same thing six years later.
02:37:43.000 You have to add...
02:37:44.000 Yeah, it wasn't compelling.
02:37:46.000 Well, it is interesting, though, because it's like when a product like yours achieves escape velocity, it gets to this point where it's just so big.
02:37:56.000 And it's like you have to have Instagram.
02:37:59.000 You have to have Facebook.
02:38:00.000 Everybody uses it.
02:38:01.000 It's just one of those things.
02:38:03.000 And I am always fascinated to, like, what is that like for the person who made it?
02:38:09.000 And it's got to be so bizarre to see it just continue to spread.
02:38:14.000 I mean, it's not getting smaller.
02:38:15.000 It just keeps getting bigger.
02:38:17.000 Yeah.
02:38:17.000 Which is nuts.
02:38:19.000 Like, you have three billion people and it's getting bigger.
02:38:20.000 That's nuts.
02:38:22.000 Yeah.
02:38:23.000 Well, how long do you see yourself doing this?
02:38:25.000 How long do you see yourself running it?
02:38:26.000 Do you think there's ever going to come a point in time where the stress is just overwhelming?
02:38:30.000 You're like, just pawn it off to somebody else and...
02:38:33.000 I'm not sure if that'll be the reason.
02:38:36.000 I think I'm probably going to do this for a while, just because of how I view the phases of the company. The first phase was building Facebook.
02:38:47.000 It was like, okay, can we build a social product that's super successful?
02:38:52.000 We did.
02:38:53.000 We basically made the most used service in the world.
02:38:55.000 And then it's like, okay, doing it once could just be luck, but can we do this multiple times?
02:39:01.000 And that's when Instagram joined us, super early.
02:39:04.000 I think there were 16 employees at the time, and I think it had 20 million people using it or something super early.
02:39:11.000 When did you pick up Instagram?
02:39:12.000 What year was that?
02:39:13.000 2012. It was the same year we went public.
02:39:15.000 But it was really small at the time.
02:39:17.000 Super talented team.
02:39:19.000 Kevin, super talented guy who created it.
02:39:22.000 Kevin and Mike.
02:39:23.000 And did a lot of awesome work together.
02:39:26.000 And then the WhatsApp folks joined.
02:39:28.000 I think there were about 60 people working at WhatsApp.
02:39:30.000 These were super early things when they joined us.
02:39:33.000 And we've scaled both of those, too.
02:39:35.000 I mean, WhatsApp now is, you know, more than two billion people.
02:39:40.000 Instagram is, I don't think it's quite two billion yet, but it's basically, it's on its way.
02:39:49.000 So, and then Messenger, we kind of grew from scratch, and that has more than a billion people too.
02:39:55.000 So now, for the second phase of the company, it's like we went from building one great social experience to building four.
02:40:03.000 It's like, all right, that's pretty good.
02:40:04.000 We can do this.
02:40:05.000 We can keep on evolving these things.
02:40:06.000 And as you say, they keep on growing and the businesses around them are good.
02:40:10.000 We're just empowering a lot of entrepreneurs around the world.
02:40:13.000 So really happy about all that stuff and there's a lot more to do.
02:40:18.000 But a lot of what we're doing in those experiences still feels constrained by the fact that it's happening on a phone.
02:40:25.000 And I just think phones are very limited.
02:40:27.000 So I think for the next chapter of what we're going to do, it's about continuing to build those, but also defining what the next computing platform is going to be, which is, for me, what the metaverse is all about and this kind of...
02:40:41.000 I don't know.
02:40:56.000 My outlook, though, on what types of things I want to focus on...
02:41:01.000 I mean, I'm curious how this has changed as you've grown in your career, too.
02:41:04.000 But for me...
02:41:08.000 For the first maybe 15 years of building the company, I was really just solely focused on, let's connect more people.
02:41:18.000 Let's grow this community to be bigger and bigger.
02:41:21.000 Let's grow the business to be bigger and bigger.
02:41:23.000 And now, I obviously care about that.
02:41:26.000 I want to continue seeing these things thrive.
02:41:30.000 But I think about my life more now in terms of projects that I want to take on on, like, a decade-long basis.
02:41:38.000 And, um...
02:41:42.000 And there are some things at work, right?
02:41:44.000 So, like, building out the metaverse with VR and AR and building out the whole developer community and creator community around that.
02:41:51.000 I view that as a 10-year project, maybe 15 if it takes longer.
02:41:54.000 But that's, like, something that I just want to kind of dedicate myself to.
02:41:59.000 I've done a lot of stuff on the philanthropy side, where, like, it's been really cool getting a chance to work with my wife Priscilla on this.
02:42:06.000 It just, like, opened up a whole new side of our relationship where, like...
02:42:40.000 And I think that's possible.
02:42:42.000 But not within the decade, but within a century.
02:42:46.000 So for that, we're taking on a bunch of kind of 10-year projects.
02:42:55.000 To just sort of be able to observe things about how human biology works that haven't been seen before.
02:43:02.000 So, like, one example is we're working on this imaging institute, an imaging project, where we want to be able to...
02:43:09.000 And you have, like, microscopes today, and they can see stuff, but it's, like, pretty hard to see things that are going on, like, inside your body, right?
02:43:16.000 You know, you're blocked by the other tissue that's in the way.
02:43:21.000 But now, through a combination of different techniques, you can use this, like, cryo-EM technique, where you can, like, take certain tissue out of a person, and it still will be...
02:43:36.000 I mean, there are techniques where some of the tissue will still be alive for some period of time, even though it's obviously going to die because it's been removed from you.
02:43:43.000 And you can look at that under a really powerful microscope.
02:43:46.000 And then you can use AI techniques over time to be able to kind of extrapolate from what you've seen in very high resolution in the tissue that you've removed from the body to now being able to...
02:43:58.000 Okay, even though there are optical and physical limits on what you can see with a microscope in a body, you can use all this data that's been generated, and AI, to effectively be able to see different cells interacting.
02:44:09.000 Like, no one has ever seen a synapse, you know, a neuron like fire and like what it looks like in the synapse of a brain before...
02:44:18.000 Like, in a living organism.
02:44:20.000 But I kind of think my engineering perspective on this is, like, how are you going to debug a system or help solve it if you can't, like, step through the code, right?
02:44:32.000 Like, one line at a time and, like, see everything that's happening, right?
02:44:36.000 If you want to really understand what's going on in the brain, you need to see that, right?
02:44:41.000 So, I mean, that's the kind of project that we're doing on the philanthropy side.
02:44:45.000 So that's, like...
02:44:46.000 That's pretty cool too, right?
02:44:47.000 But it's like a different kind of thing.
02:44:49.000 So over time, I've sort of broadened out...
02:44:53.000 It's not just...
02:44:55.000 For the first 10 years or so, the company was so all-consuming that I really couldn't do much else and I wasn't that well-rounded of a person.
02:45:02.000 I think having a family changes that.
02:45:03.000 I think it sort of forces you to become a little more balanced.
02:45:06.000 But...
02:45:07.000 But now I'd say there's projects at work.
02:45:10.000 There's projects in philanthropy.
02:45:11.000 There's also just personal stuff that I really enjoy, like building up our ranch to be self-sustaining and 100% off the grid, and being able to grow all the stuff that we want there and raise our own cattle.
02:45:27.000 I think that that's a cool, fun project too.
02:45:31.000 At this point, I sort of define meaning in my life more by getting to work with people who I really like on a different set of things, and just getting to learn from doing a bunch of different things.
02:45:45.000 But I'm curious how that's kind of shifted in your life as you've grown in your career too.
02:45:50.000 Well, I think what you're dealing with is that you've had so much success that you're comfortable enough not to think only about success and growing the business.
02:46:03.000 Instead, you're thinking about projects that are fascinating to you.
02:46:21.000 I think that's amazing.
02:46:24.000 I would love to see that in more people.
02:46:27.000 So many people just get caught up in the game of resources and numbers, and they just want to grow numbers and grow and have more money and have bigger toys and have bigger this and bigger that, and it's a trap.
02:46:39.000 You know, and I think sometimes people are caught in that trap when they're ahead of the game.
02:46:44.000 They're winning the game.
02:46:45.000 They've won the game, but yet they're still, like, sucked into it, and they never really branch off and find things that are deeply fulfilling to them.
02:46:54.000 It's really unfortunate because that's the trap of the businessman.
02:46:58.000 You know, the businessman gets consumed by just wealth.
02:47:03.000 They get consumed by success of the company, eternal growth, and it's a real trap.
02:47:11.000 I have a similar perspective in that I don't think about my show in terms of how it grows or how it does well.
02:47:20.000 I just do what I like to do.
02:47:23.000 And I think as much about archery and jujitsu and playing pool and automobiles.
02:47:32.000 I think about a lot of different things.
02:47:34.000 I don't just think about the podcast.
02:47:36.000 And when I do the podcast, the very fortunate thing that I have is that who I talk to is entirely based on whether or not someone's willing to talk to me and whether or not I'm interested in talking to them.
02:47:50.000 So that's all it is.
02:47:51.000 There's no external pressure.
02:47:53.000 So it never feels like a job.
02:47:56.000 It's always like, oh, Mark Zuckerberg.
02:47:58.000 I'd love to talk to Mark.
02:47:59.000 He seems like an interesting guy.
02:48:00.000 I don't like the way you sip water, though.
02:48:01.000 When you're sipping water in the Senate, you're sipping water like a robot.
02:48:04.000 Let me see you take a real drink.
02:48:06.000 Go ahead.
02:48:11.000 Honestly, the Senate testimony is not exactly an environment that is set up to accentuate the humanity of the subject.
02:48:19.000 It's quite the opposite, right?
02:48:22.000 I don't know.
02:48:23.000 I mean, if you're up there for six or seven hours, you're going to make some face that's worth making a meme out of.
02:48:31.000 Right, and then they're just going to only concentrate on that.
02:48:33.000 And that's going to be the big deal.
02:48:35.000 But, you know, I'm just fortunate that I can do this. And the appeal of the show... I mean, if I had to think about it, I don't think about it too much, honestly.
02:48:46.000 But if I thought about it, I think the appeal of the show is that I'm actually interested in what the guest has to say, and it's because those are the people I've chosen.
02:48:55.000 Whether it's I'm talking to a scientist or a philosopher or an athlete or you or anybody, I'm interested.
02:49:02.000 I'm genuinely curious, and if I wasn't, I wouldn't do it.
02:49:04.000 And so because of that, I think it translates to the people at home, and it resonates with people.
02:49:12.000 Yeah.
02:49:14.000 I was going to say paradox.
02:49:16.000 I'm not sure if it's a paradox, but I think that there actually is a feedback loop between those things, I would guess, though.
02:49:21.000 You're saying that you don't care as much about the viewership or the listenership of the show, but to some degree, because you're following your curiosity, you probably are producing a more interesting show that more people want to watch.
02:49:32.000 I'd bet that for me, if I were solely focused on just kind of... I'm not focused on the metaverse primarily because I think there's some near-term business opportunity.
02:50:01.000 I actually think we're going to lose a lot of money for a long time on this massive breadth of things that we're working on that we talked about before.
02:50:07.000 I think that if we do good work on this, I think we should be positioning ourselves quite well for the future.
02:50:12.000 And I do care about winning for all of our employees and our shareholders and stuff like that, too.
02:50:17.000 I mean, that obviously matters because we have all these awesome people who are actually doing the work.
02:50:21.000 Yeah, I mean, it's a brilliant gamble.
02:50:24.000 But it's also a very well-informed gamble.
02:50:28.000 Like, what you're doing is...
02:50:29.000 And you're not just gambling, you're innovating.
02:50:32.000 So it's a really cool place to be.
02:50:36.000 But what you're saying and what you're talking about is authentic focus and interest.
02:50:43.000 And I think today, in this world where there's so much bullshit, it's so hard to know what's real and what's not real.
02:50:51.000 People value authenticity.
02:50:55.000 They value it in a really unique way.
02:50:59.000 And when someone can create a service or a social media platform or something where people really believe that the people behind it are trying to make it the best thing possible,
02:51:11.000 and they're not just trying to grow it
02:51:15.000 forever and make it constantly get bigger and make more money,
02:51:19.000 and they're also genuinely trying to innovate, genuinely trying to expand into this new realm of the metaverse, and genuinely trying to moderate content in a thoughtful way...
02:51:32.000 That resonates with people.
02:51:34.000 It's very important.
02:51:35.000 Mm-hmm.
02:51:36.000 That's probably one of the reasons why it's so big.
02:51:38.000 And I think that applies to many, many, many things.
02:51:42.000 But I think people are oftentimes very short-sighted, and they're only thinking about growth.
02:51:48.000 They're thinking about constant, never-ending growth, and they're not necessarily thinking about, why am I doing this in the first place?
02:51:54.000 Yeah.
02:51:54.000 What if I didn't have to do this anymore?
02:51:56.000 What if I had so much money that I could do whatever I wanted?
02:51:59.000 Would I still do this?
02:52:00.000 And why would I do it?
02:52:01.000 And how would I do it differently?
02:52:02.000 How would I do it differently if I wasn't thinking about just money?
02:52:05.000 If I was thinking in terms of big-picture things and making it more enjoyable, making it more thrilling to me, making it more exciting.
02:52:15.000 And I think that we're very fortunate, you and I, in that we have the ability to make those kinds of choices.
02:52:22.000 And I think that kind of freedom is probably one of the most important freedoms for the Western person that's listening to this, that is not confined in a communist country.
02:52:31.000 There's obviously a lot bigger problems that they have.
02:52:34.000 But you're in a position where you can make your own choices.
02:52:38.000 What would be the best choice to make?
02:52:40.000 Well, the best choice to make is to actually follow your interests.
02:52:43.000 What is actually fascinating to you?
02:52:45.000 And for you, the fact that you're really interested in health and philanthropy and those things, as well as in the metaverse and AR and VR and all these different modalities, all these different ways to express it and how enriching it is to people's lives...
02:53:04.000 It's fucking cool.
02:53:07.000 Just trying.
02:53:08.000 Well, you're succeeding, man.
02:53:10.000 I mean, it's obvious.
02:53:11.000 It's pretty dope.
02:53:12.000 And I think we're like three hours in.
02:53:14.000 So we can wrap it up here.
02:53:16.000 All right.
02:53:16.000 Thank you for coming.
02:53:17.000 It was really cool to meet you.
02:53:19.000 Yeah.
02:53:19.000 And I appreciate all the things you're doing and what you're saying.
02:53:21.000 And I'm so glad you're into martial arts now, too.
02:53:24.000 It's great.
02:53:25.000 Awesome.
02:53:25.000 Yeah, no, it's great to get a chance to do this.
02:53:28.000 Yeah, let's do it again sometime, man.
02:53:30.000 Let's do it.
02:53:30.000 Whenever you have some new crazy shit coming out and whenever you take things to a new next level, come on in.
02:53:36.000 We'll talk.
02:53:37.000 Let's do it.
02:53:37.000 Thank you.
02:53:38.000 Bye, everybody.