The Joe Rogan Experience - January 10, 2025


Joe Rogan Experience #2255 - Mark Zuckerberg


Episode Stats

Length

2 hours and 50 minutes

Words per Minute

180.33

Word Count

30,716

Sentence Count

2,321

Misogynist Sentences

9

Hate Speech Sentences

9


Summary

In this episode of the Joe Rogan Experience podcast, I sit down with Mark Zuckerberg to talk about his early days at the company, how he got started, and how he and his team dealt with the growing concerns about censorship on social media.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day!
00:00:12.000 Alright, we're up.
00:00:13.000 What's happening?
00:00:14.000 Good to see you.
00:00:14.000 You too.
00:00:15.000 What's going on?
00:00:16.000 You know, chill week.
00:00:18.000 Yeah, sort of.
00:00:21.000 This recent announcement that you did about content moderation, how has that been received?
00:00:29.000 Probably depends on who you ask.
00:00:31.000 Right.
00:00:31.000 But, you know, but look, I mean, I've been working on this for a long time.
00:00:35.000 So, I mean, you've got to do what you think is right.
00:00:38.000 You know, we've been on a long journey here, right?
00:00:41.000 I mean, I think at some level you only start one of these companies if you believe in giving people a voice, right?
00:00:49.000 I mean, the whole point of social media is basically...
00:00:54.000 You know, giving people the ability to share what they want.
00:00:57.000 Right.
00:00:58.000 And, you know, it goes back to, you know, our original mission is just give people the power to share and make the world more open and connected.
00:01:10.000 What do you think started the pathway towards increasing censorship?
00:01:15.000 Because clearly we were going in that direction for the last few years.
00:01:21.000 It seemed like we really found out about it when Elon bought Twitter and we got the Twitter files.
00:01:27.000 And when you came on here and when you were explaining the relationship with FBI, where they were trying to get you to take down certain things that were true and real and certain things they tried to get you to limit the exposure to them.
00:01:40.000 So it's these kind of conversations.
00:01:43.000 Like, when did all that start?
00:01:45.000 Yeah, well, look, I think going back to the beginning.
00:01:49.000 Like I was saying, I think you start one of these if you care about giving people a voice.
00:01:53.000 You know, I wasn't too deep on our content policies for like the first 10 years of the company.
00:01:58.000 It was just kind of well known across the company that we were trying to give people the ability to share as much as possible.
00:02:05.000 And issues would come up, practical issues, right?
00:02:08.000 So if someone's getting bullied, for example, we'd deal with that, or we'd put in place systems to fight bullying.
00:02:13.000 You know, if someone's pirating copyrighted content on the service.
00:02:20.000 It's like, okay, we'll build controls to make it so we'll find IP-protected content.
00:02:24.000 But it was really in the last 10 years that people started pushing for, like, ideological-based censorship.
00:02:33.000 And I think it was two main events that really triggered this.
00:02:38.000 In 2016, there was the election of President Trump.
00:02:43.000 Also coincided with...
00:02:45.000 Basically, Brexit in the EU and sort of the fragmentation of the EU. And then, you know, in 2020, there was COVID. And I think that those were basically these two events where, for the first time, we just faced this massive, massive institutional pressure to basically start censoring content on...
00:03:12.000 I'm sorry to interrupt you, but when it first came up in 2016, did it come under the guise of the Russian collusion hoax?
00:03:20.000 Yeah, and this is the thing.
00:03:21.000 At the time, I was really sort of ill-prepared to kind of parse what was going on.
00:03:28.000 I think part of my reflection looking back on this is, in 2016, in the aftermath, I gave too much deference to a lot of folks in the media who were basically saying, okay, there's no way people can actually believe this stuff, right?
00:03:50.000 It has to be that there's this kind of like massive misinformation out there.
00:03:56.000 Some of it started with the Russia collusion stuff.
00:04:00.000 But it kind of morphed into different things over time.
00:04:03.000 He was so ideologically polarizing, right?
00:04:06.000 People didn't want to believe that anybody looked at him and said, this should be our president.
00:04:12.000 Yeah, so I took this and just kind of assumed that everyone was acting in good faith.
00:04:19.000 And I said, okay, well, there are concerns about misinformation.
00:04:23.000 We should, just like when people raised other concerns in the past and we tried to deal with them.
00:04:28.000 Okay, yeah, if you ask people, no one says that they want misinformation, so maybe there's something that we should do to basically try to address this.
00:04:36.000 But I was really worried from the beginning about basically becoming this sort of decider of what is true in the world.
00:04:47.000 That's kind of a crazy position to be in for billions of people using your service.
00:04:53.000 So we tried to put in place a...
00:04:56.000 You know, a system that would deal with it.
00:04:58.000 You know, and early on tried to basically make it so that it was really limited.
00:05:03.000 We're like, all right, we're just going to have this system where there's these third-party fact checkers and they can check the worst of the worst stuff, right?
00:05:10.000 So things that are very clear hoaxes.
00:05:13.000 It's not like we're not parsing speech about whether something is slightly true or slightly false, like earth is flat, you know, things like that, right?
00:05:22.000 So that was sort of the original intent.
00:05:24.000 We put in place the system and it just sort of veered from there.
00:05:28.000 I think to some degree it's because some of the people whose job is to do fact-checking, a lot of their industry is focused on political fact-checking.
00:05:36.000 So it just kind of veered in that direction.
00:05:38.000 And we kept on trying to basically get it to be what we had originally intended, which is just, you know, it's not, the point isn't to like judge people's opinions.
00:05:47.000 It's to provide this layer to kind of help fact-check some of the stuff that seems the most extreme.
00:05:55.000 It just, you know, it was just never accepted by people broadly.
00:06:00.000 I think people just felt like the fact-checkers were too biased.
00:06:04.000 Not necessarily even so much in what they ruled, although sometimes I think people would disagree with that.
00:06:10.000 A lot of the time, it was just what types of things they chose to even go and fact-check in the first place.
00:06:14.000 So I kind of think, after having gone through that whole exercise, it's something out of, like, you know, 1984, one of these books where it's just like, it really is a slippery slope.
00:06:29.000 And it just got to a point where it's just, okay, this is destroying so much trust, especially in the United States, to have this program.
00:06:38.000 And I guess it was probably a few years ago that I really started coming to the conclusion that we were going to need to change something about that.
00:06:47.000 COVID was the other big one where that was also very tricky.
00:06:54.000 Because, you know, at the beginning it was, you know, it's like a legitimate public health crisis, you know, in the beginning.
00:07:02.000 And it's, you know, even people who are like the most ardent First Amendment, you know, defenders, the Supreme Court has this clear precedent that's like, all right, you can't yell fire in a crowded theater.
00:07:17.000 There are times when, if there's an emergency, your...
00:07:22.000 ability to speak can temporarily be curtailed in order to get an emergency under control.
00:07:27.000 So I was sympathetic to that at the beginning of COVID.
00:07:29.000 It seemed like, okay, you have this virus.
00:07:31.000 It seems like it's killing a lot of people.
00:07:34.000 I don't know.
00:07:35.000 We didn't know at the time how dangerous it was going to be.
00:07:38.000 So at the beginning, it kind of seemed like, okay, we should give a little bit of deference to the government and the health authorities on how we should play this.
00:07:46.000 But when it went from, you know, two weeks to flatten the curve to... In the beginning, it was like, okay, there aren't enough masks.
00:07:55.000 Masks aren't that important.
00:07:56.000 It's like, oh, no, you have to wear a mask.
00:07:58.000 And, you know, like everything was shifting around.
00:08:01.000 It became very difficult to kind of follow.
00:08:04.000 And this really hit the most extreme, I'd say, during the Biden administration when they were trying to roll out the vaccine program.
00:08:18.000 I'm generally pretty pro-rolling out vaccines.
00:08:21.000 I think on balance, the vaccines are more positive than negative.
00:08:24.000 But I think that while they're trying to push that program, they also tried to censor anyone who was basically arguing against it.
00:08:34.000 And they pushed us super hard to take down things that were honestly true.
00:08:41.000 They basically pushed us and said, anything that...
00:08:47.000 Says that vaccines might have side effects.
00:08:50.000 You basically need to take down.
00:08:52.000 And I was just like, well, we're not going to do that.
00:08:56.000 Like, we're clearly not going to do that.
00:08:57.000 I mean, that is kind of inarguably true.
00:09:01.000 Who is they?
00:09:01.000 Who's telling you to take down things that talk about vaccine side effects?
00:09:06.000 It was people in the Biden administration.
00:09:08.000 I think it was, you know, I wasn't involved in those conversations directly, but I think it was...
00:09:12.000 How difficult is that to not be involved in those conversations directly?
00:09:16.000 Fitness isn't just about what you do in the gym.
00:09:18.000 It's also about your nutrition.
00:09:19.000 But even with the best diet, some nutrients can be hard to get.
00:09:23.000 And AG1 can help fill those gaps.
00:09:26.000 AG1 delivers optimal amounts of nutrients in forms that help your body perform.
00:09:30.000 AG1 makes foundational nutrition easy because there aren't a million different pills and capsules you have to keep track of.
00:09:36.000 It's just one scoop mixed in water.
00:09:38.000 It's such an easy routine to keep in the mornings.
00:09:41.000 Ingredients in AG1 are selected for absorption, nutrient density, and potency.
00:09:46.000 And are intentionally picked to work in sync with the whole formula for optimal impact.
00:09:51.000 They're seriously committed to quality.
00:09:53.000 AG1 is tested for hundreds of contaminants and impurities, and they're constantly reformulating their recipe to dial it in.
00:10:00.000 This is all part of why I've partnered with AG1 for years.
00:10:04.000 So get started with AG1 this holiday season and get a free bottle of vitamin D3, K2, and five free AG1 travel packs with your first purchase at drinkag1.com slash Joe Rogan.
00:10:17.000 That's a $76 value gift for free.
00:10:21.000 Go to drinkag1.com slash Joe Rogan.
00:10:25.000 Seriously, get on this.
00:10:27.000 That's got to be strange too, right?
00:10:28.000 Because you're running the company, but there's clearly, you're moderating at scale that's beyond the imagination.
00:10:36.000 The number of human beings you're moderating is fucking insane.
00:10:39.000 Like, what is, what's a Facebook, how many people use it on a daily basis?
00:10:44.000 Forget about how many overall, like how many people use it regularly?
00:10:48.000 3.2 billion people use one of our services every day.
00:10:51.000 That's it.
00:10:52.000 Yeah.
00:10:52.000 No, it's wild.
00:10:53.000 More than a third of the planet?
00:10:55.000 That's so crazy.
00:10:56.000 It's almost half of Earth.
00:10:58.000 Well, on a monthly basis, it is probably half of Earth.
00:11:01.000 I want to say that, though, there's a lot of hypercritical people that are conspiracy theorists and think that everybody is a part of some cabal to control them.
00:11:10.000 I want you to understand that whether it's YouTube or whatever place that you think is doing something that's awful, It's good that you speak because this is how things get changed and this is how people find out that people are upset about content moderation and censorship.
00:11:25.000 But moderating at scale is insane.
00:11:29.000 It's insane.
00:11:30.000 What we were talking the other day about the number of videos that go up every hour on YouTube and it's bananas.
00:11:37.000 It's bananas.
00:11:38.000 To try to get a human being that is reasonable,
00:11:44.000 logical, and objective, that's going to analyze every video.
00:11:48.000 It's virtually impossible.
00:11:49.000 It's not possible.
00:11:51.000 So you've got to use a bunch of tools.
00:11:52.000 You've got to get a bunch of things wrong.
00:11:53.000 And you have also people reporting things.
00:11:55.000 And how much is that going to affect things?
00:11:57.000 You could have mass reporting because you have bad actors.
00:12:01.000 You have some corporation that decides we're going to attack this video because it's bad for us to get it taken down.
00:12:05.000 There's so much going on.
00:12:08.000 I want to put that in people's heads before we go on.
00:12:11.000 Understand the kind of numbers that we're talking about here.
00:12:14.000 Now understand you have the pandemic and then you have the administration that's doing something where I think they crossed the line, where it gets really weird, where they're saying what you were saying.
00:12:25.000 They were trying to get you to take down vaccine side effects, which is just crazy.
00:12:33.000 Yeah, so, I mean, like you're saying, I mean, this is...
00:12:37.000 It's so complicated, this system, that I could spend every minute of all of my time doing this and not actually focused on building any of the things that we're trying to do.
00:12:46.000 AI, glasses, the future of social media, all that stuff.
00:12:51.000 So I get involved in this stuff, but in general, we have a policy team.
00:12:55.000 There are people who I trust.
00:12:56.000 Those people are working on this on a day-to-day basis.
00:12:59.000 And the interactions that I was just referring to, a lot of this is documented.
00:13:06.000 You know, Jim Jordan and the House had this whole investigation and committee into the kind of government censorship around stuff like this.
00:13:14.000 And we produced all these documents and it's all in the public domain.
00:13:16.000 I mean, basically, these people from the Biden administration would call up our team and like...
00:13:22.000 Scream at them and curse.
00:13:24.000 And it's like these documents, it's all kind of out there.
00:13:27.000 Did you record any of those phone calls?
00:13:29.000 I don't know.
00:13:30.000 I don't think we were.
00:13:31.000 But I think...
00:13:32.000 I want to listen.
00:13:33.000 I mean, there are emails.
00:13:34.000 The emails are published.
00:13:35.000 It's all kind of out there.
00:13:37.000 And they're like...
00:13:39.000 And basically it just got to this point where we were like, no, we're not going to take down things that are true.
00:13:46.000 That's ridiculous.
00:13:47.000 They wanted us to take down this meme of Leonardo DiCaprio looking at a...
00:13:51.000 TV talking about how 10 years from now or something, you know, you're going to see an ad that says, okay, if you took a COVID vaccine, you're eligible, you know, like for this kind of payment, like sort of like class action lawsuit type meme.
00:14:07.000 And they're like, no, you have to take that down.
00:14:08.000 We said, no, we're not going to take down humor and satire.
00:14:12.000 We're not going to take down things that are true.
00:14:15.000 And then at some point, I guess...
00:14:20.000 I don't know, it flipped a bit.
00:14:21.000 I mean, Biden, when he was – he gave some statement at some point.
00:14:24.000 I don't know if it was a press conference or to some journalist where he basically was like, these guys are killing people.
00:14:28.000 And I don't know.
00:14:33.000 Then, like, all these different agencies and branches of government basically just, like, started investigating and coming after our company.
00:14:40.000 It was brutal.
00:14:41.000 It was brutal.
00:14:43.000 Wow.
00:14:45.000 Yeah.
00:14:47.000 It's just a massive overstepping.
00:14:49.000 And also, you weren't killing people.
00:14:52.000 This is the thing about all of this.
00:14:54.000 It's like they suppressed so much information about things that people should be doing, regardless of whether or not you believe in the vaccine.
00:15:03.000 Regardless, put that aside.
00:15:05.000 Metabolic health.
00:15:06.000 Is of the utmost importance in your everyday life, whether there's a pandemic or there's not.
00:15:10.000 And there's a lot of things that you can do that can help you recover from illness.
00:15:16.000 It prevents illnesses.
00:15:18.000 It makes your body more robust and healthy.
00:15:20.000 It strengthens your immune system.
00:15:21.000 And they were suppressing all that information.
00:15:23.000 And that's just crazy.
00:15:25.000 You can't say you're one of the good guys if you're suppressing information that would help people recover from all kinds of diseases, not just COVID. The flu, common cold, all sorts of different things.
00:15:37.000 High doses of vitamin C, D3 with K2 and magnesium.
00:15:41.000 They were suppressing this stuff because they didn't want people to think that you could get away with not taking a vaccine, which is really crazy when you're talking about something that 99.07% of people survive.
00:15:55.000 This is a crazy overstep, but it scared the shit out of a lot of people.
00:16:01.000 Red-pilled, as it were, a lot of people because they realized like, oh, 1984 is like an instruction manual.
00:16:08.000 It shows you how things can go that way with wrong speak and with bizarre distortion of facts.
00:16:16.000 And when it comes down to it, in today's day and age, the way people get information is through your platform, through X. This is how people are getting information.
00:16:26.000 They're getting information from YouTube.
00:16:27.000 They're getting information from a bunch of different sources now.
00:16:31.000 And you can't censor that if it's real, legitimate information because it's not ideologically convenient for you.
00:16:38.000 Yeah.
00:16:39.000 So, I mean, that's basically the journey that I've been on.
00:16:41.000 It started off very pro-free speech, free expression.
00:16:46.000 And then over the last 10 years, there have been these two big episodes.
00:16:49.000 It was the Trump election and the aftermath, where I feel like in retrospect, I deferred too much to the kind of critique of the media on what we should do.
00:16:58.000 And since then...
00:17:00.000 I think generally trust in media has fallen off a cliff, right?
00:17:03.000 So I don't think I'm alone in that journey.
00:17:05.000 I think that's basically the experience that a lot of people have had is, okay, the stuff that's being written about is not kind of all accurate.
00:17:14.000 And even if the facts are right, it's kind of written from a slant a lot of the time.
00:17:18.000 Of course.
00:17:18.000 And then there was the government version of it, which was during COVID, which is okay.
00:17:24.000 It's like our government is telling us that we need to censor true things.
00:17:27.000 It's like, this is a disaster.
00:17:29.000 And it's not just the US, right?
00:17:32.000 I think a lot of people in the US focus on this as an American phenomenon.
00:17:38.000 But I kind of think that the reaction to COVID probably caused a breakdown in trust in a lot of governments around the world.
00:17:47.000 Because, I mean, 2024 was a big election year around the world.
00:17:53.000 You know, there are all these countries, India, just like a ton of countries that had elections, and the incumbents basically lost every single one.
00:18:02.000 So there is some sort of a global phenomenon where, whether it was because of inflation, because of the economic policies to deal with COVID, or just how the governments dealt with COVID, it seems to have had this effect that's global, not just the U.S.: a very broad decrease in trust, at least in that set of incumbents and maybe in these democratic institutions overall.
00:18:32.000 So I think that what you're saying of, yeah, how do people get their information now?
00:18:36.000 It's by sharing it online on social media.
00:18:40.000 I think that that's just increasingly true.
00:18:42.000 And my view at this point is like, all right, like we started off focused on free expression.
00:18:46.000 We kind of had this pressure tested over the last period.
00:18:49.000 I feel like I just have a much greater command now of what I think the policy should be, and, like, this is how it's going to be going forward.
00:18:57.000 And so, I mean, at this point, I think a lot of people look at this as, like, a purely political thing, you know, because they...
00:19:06.000 They kind of look at the timing and they're like, hey, well, you're doing this right after the election.
00:19:10.000 It's like, okay, I try not to change our content rules right in the middle of an election either.
00:19:14.000 There's not a great time to do this.
00:19:16.000 And you want to do it a year later.
00:19:18.000 Yeah, it's like there's no good time to do it.
00:19:20.000 And whatever time is going on, there's going to be...
00:19:25.000 The good thing about doing it after the election is you get to take this kind of cultural pulse as like, okay, where are people right now and how are people thinking about it?
00:19:32.000 We try to have policies that reflect mainstream discourse.
00:19:36.000 But yeah, I mean, I don't know.
00:19:39.000 This is something I've been thinking about for a while.
00:19:41.000 I think that this is going to be pretty durable because at this point we've just been pressure tested on this stuff for like the last eight to ten years with like these huge institutions just...
00:19:52.000 Pressuring us.
00:19:53.000 And I feel like this is kind of the right place to be going forward.
00:19:58.000 What was it like when they were attacking you?
00:20:00.000 First of all, what was the premise?
00:20:03.000 What were they saying was your offense?
00:20:06.000 Was it that you were allowing information that was not true, that was getting out there?
00:20:10.000 I know there was also they're saying that you guys were allowing hate groups to speak.
00:20:15.000 There was a lot of this.
00:20:17.000 Yeah.
00:20:18.000 I mean, the tough thing with politics...
00:20:21.000 Is that there's like, well, when you say someone's coming after you, are you referring to kind of the government and investigations and all that?
00:20:28.000 I mean, so the issue is that there's the, there's what specific thing an agency might be looking into you for.
00:20:37.000 And then there's like the underlying political motivation, which is like, why do the people who are running this thing hate you?
00:20:44.000 And I think that those can often be two very different things.
00:20:48.000 So, and we had organizations.
00:20:51.000 That were looking into us that were, like, not really involved with social media.
00:20:54.000 Like the CFPB, like this financial...
00:20:58.000 I don't even know what it stands for.
00:20:59.000 It's the financial organization that Elizabeth Warren had set up.
00:21:05.000 Oh, great.
00:21:06.000 And it's basically, it's like, we're not a bank.
00:21:08.000 The debanking section.
00:21:10.000 Yeah, no, so we're not a bank, right?
00:21:13.000 It's like, what does Meta have to do with this?
00:21:16.000 But they kind of found some theory.
00:21:18.000 That they wanted to investigate.
00:21:20.000 And it's like, okay, clearly they were trying really hard to find some theory.
00:21:24.000 But I don't know.
00:21:27.000 Throughout the party and the government, there was just sort of...
00:21:32.000 I don't know how this stuff works.
00:21:34.000 I mean, I've never been in government.
00:21:35.000 I don't know if it's a directive or it's just a quiet consensus that we don't like these guys.
00:21:39.000 They're not doing what we want.
00:21:40.000 We want, we're going to punish them.
00:21:41.000 But it's tough to be at the other end of that.
00:21:48.000 What was it like?
00:21:50.000 Well, it's not good.
00:21:52.000 The thing that I think is actually the toughest, though, is it's global.
00:21:58.000 And really, when you think about it, the U.S. government should be defending its companies, not be the tip of the spear attacking its companies.
00:22:06.000 So we talk about a lot, okay, what is the experience of...
00:22:12.000 Okay, if the U.S. government comes after you, I think the real issue is that when the U.S. government does that to its tech industry, it's basically just open season around the rest of the world.
00:22:22.000 I mean, the EU, I pull these numbers, the EU has fined the tech companies more than $30 billion over the last, I think it was like 10 or 20 years.
00:22:33.000 Holy shit.
00:22:34.000 So when you think about it, like, okay, there's...
00:22:37.000 It's like, you know, $100 million here, a couple billion dollars there.
00:22:41.000 But what I think it really adds up to is sort of a kind of EU-wide policy for how they want to deal with American tech.
00:22:50.000 It's almost like a tariff.
00:22:51.000 And I think the U.S. government basically gets to decide how are they going to deal with that, right?
00:22:55.000 Because if the U.S. government, if some other country was screwing with another industry that we cared about, the U.S. government would probably find some way to put pressure on them.
00:23:05.000 But I think what happened here is actually the complete opposite.
00:23:08.000 The U.S. government led the kind of attack against the companies, which then just made it so, like, the EU is basically, in all these other places, just free to just go to town on all the American companies and do whatever you want.
00:23:20.000 But, I mean, look, obviously, I don't want to come across as if, like, we don't have things that we need to do better.
00:23:26.000 Obviously, we do.
00:23:28.000 And when we mess something up, we deserve to be held accountable for that, and just like everyone else.
00:23:35.000 I do think that the American technology industry is a bright spot in the American economy.
00:23:39.000 I think it's a strategic advantage for the United States that we have a lot of the strongest companies in the world.
00:23:45.000 And I think it should be part of the U.S.'s strategy going forward to defend that.
00:23:50.000 And it's one of the things that I'm optimistic about with President Trump is I think he just wants America to win.
00:23:57.000 And I think some of the stuff, like the other governments who are kind of pushing on this stuff, it's...
00:24:03.000 At least the U.S. has the rule of law, right?
00:24:06.000 So the government can come after you for something, but you still get your day in court, and the courts are pretty fair.
00:24:13.000 So we've basically done a pretty good job of defending ourselves, and when we've chosen to do that, basically, we have a pretty good rate of winning.
00:24:22.000 It's just not like that in every other country around the world.
00:24:24.000 Like if other governments decide that they're going to go after you, you don't always get kind of a clear shake at kind of defending yourself on the rules.
00:24:37.000 So I think to some degree, if the U.S. tech industry is going to continue being really strong, I do think that the U.S. government has a role in basically defending it abroad.
00:24:51.000 And that's one of the things that I'm optimistic about will happen in this administration.
00:24:54.000 Well, I think this administration uniquely has felt the impact of –
00:25:00.000 Not being able to have free speech. Because this is the administration where Trump was famously kicked off of Twitter. That was a huge issue after January 6th. They removed the, at the time, sitting president. It was kind of crazy to remove that person from social media because you've decided that he incited a riot. So for him, without free speech,
00:25:29.000 without podcasts, without social media, they probably wouldn't have had a chance because the mainstream narrative, other than Fox News, was so clearly against him.
00:25:38.000 The majority of the television entities and print entities were against him, the majority of them.
00:25:46.000 So without social media, without podcasts, they don't stand a chance.
00:25:50.000 So they're uniquely aware of the importance of giving people their voice.
00:25:56.000 Free speech.
00:25:58.000 But you do have to be careful about misinformation, and you do have to be careful about just outright lies and propaganda campaigns.
00:26:08.000 And how do you differentiate?
00:26:10.000 Well, I think that there are a couple of different things here.
00:26:12.000 One is, this is something where I think X and Twitter just did it better than us on fact-checking.
00:26:17.000 We took the critique around fact-checking.
00:26:20.000 Sorry, around misinformation, we put in place this fact-checking program, which basically empowered these third-party fact-checkers.
00:26:25.000 They could mark stuff false, and then we would downrank it in the algorithm.
00:26:30.000 I think what Twitter and X have done with community notes, I think it's just a better program.
00:26:36.000 Rather than having a small number of fact-checkers, you get the whole community to weigh in.
00:26:40.000 When people who usually disagree on something tend to agree on how they're voting on a note.
00:26:45.000 That's a good sign to the community that there's actually a broad consensus on this, and then you show it.
00:26:49.000 And you're showing more information, not less.
00:26:52.000 So you're not using the fact check as a signal to show less.
00:26:57.000 You're using the community note to provide real context and show additional information.
00:27:02.000 So I think that that's better.
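The consensus mechanism described here, where a note is surfaced only when raters who usually disagree both find it helpful, can be sketched roughly as follows. This is a simplified illustration, not Meta's or X's actual algorithm; the function name, the two-cluster model, and the thresholds are all assumptions (real systems such as X's Community Notes use matrix factorization over rater histories rather than fixed sides):

```python
# Toy sketch of "consensus across disagreeing groups" rating.
# Raters are pre-assigned to two clusters, 'A' and 'B', that usually
# disagree with each other; a note is shown only if BOTH clusters
# independently rate it helpful on average.

def note_is_shown(ratings, min_per_side=3, threshold=0.6):
    """ratings: list of (side, helpful) pairs, side in {'A', 'B'},
    helpful a bool. Returns True only on cross-cluster consensus."""
    sides = {"A": [], "B": []}
    for side, helpful in ratings:
        sides[side].append(helpful)
    for votes in sides.values():
        if len(votes) < min_per_side:            # not enough raters on this side
            return False
        if sum(votes) / len(votes) < threshold:  # this side doesn't find it helpful
            return False
    return True  # broad consensus across groups that usually disagree

ratings = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", True), ("B", True)]
print(note_is_shown(ratings))  # True: both clusters mostly rate it helpful
```

The point of requiring agreement from both clusters is exactly what is said above: the note adds context only when it reflects a broad consensus rather than one side's view.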
00:27:08.000 When you're talking about nation-states or people interfering, a lot of that stuff...
00:27:14.000 Is best rooted out at the level of kind of accounts doing phony things.
00:27:20.000 So you get like, whether it's like China or Russia or Iran or like one of these countries, they'll set up these networks of fake accounts and bots and they coordinate and they post on each other stuff to make it seem like it's authentic and kind of convince people.
00:27:34.000 It's like, wow, a bunch of people must think this or something.
00:27:36.000 And the way that you identify that is you build AI systems.
00:27:41.000 That can basically detect that those accounts are not behaving the way that a human would.
00:27:46.000 And when we find that, that there's like some bot that's operating an account.
00:27:52.000 How do you differentiate?
00:27:53.000 How do you figure that out?
00:27:54.000 It just, I mean, there are some things that a person just would never do.
00:27:58.000 Right.
00:27:58.000 Have you met Lex Fridman?
00:28:01.000 Yes.
00:28:02.000 Yeah, yeah.
00:28:03.000 He might not be.
00:28:04.000 Well, but is he going to make a million actions in a minute?
00:28:09.000 It's like, yeah, yeah, probably not.
00:28:11.000 Okay, so it's that.
00:28:12.000 Well, I mean, it's more subtle than that.
00:28:13.000 I think, like, these guys are pretty sophisticated, and it's an adversarial space.
00:28:17.000 So we find some technique, and then they basically kind of update their techniques.
00:28:25.000 But we have a team of, it's effectively, like...
00:28:32.000 What are these accounts that are just not behaving the way that people would and how are they interacting?
00:28:42.000 And then sometimes you trace it down and sometimes you get some tips from different intelligence agencies and then you can kind of piece together over time.
00:28:52.000 It's like, oh, this network of people is actually some kind of fake...
00:28:56.000 Cluster of accounts, and that's against our policies, and we just take them all off.
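A toy version of the behavioral signal being described (an account acting faster than any human physically could, like the "million actions in a minute" joke) might look like the sketch below. The threshold and structure are illustrative assumptions, not Meta's actual detection stack, which combines many signals with machine learning:

```python
from collections import Counter

def flag_inhuman_rates(events, max_actions_per_minute=30):
    """events: list of (account_id, unix_timestamp). Returns accounts that
    exceed a humanly plausible action rate within any single minute."""
    # Bucket each event by (account, minute) and count actions per bucket.
    per_minute = Counter((acct, ts // 60) for acct, ts in events)
    return {acct for (acct, _minute), n in per_minute.items()
            if n > max_actions_per_minute}

# One account acting 200 times in a few minutes, one acting normally:
events = [("bot", 1000 + i) for i in range(200)] + [("human", 1000), ("human", 1100)]
print(flag_inhuman_rates(events))  # {'bot'}
```

As noted in the conversation, real operators adapt once a simple rule like this is known, which is why it's an adversarial space rather than a one-time filter.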
00:29:02.000 But...
00:29:02.000 How are you sure?
00:29:05.000 Is there a 100% certainty that you are definitely getting a group of people that are bad actors, or is it just people that have unpopular opinions?
00:29:15.000 No, I don't think it's that for this.
00:29:17.000 I think...
00:29:19.000 But what I'm saying is how do you determine?
00:29:22.000 At what percentage of accuracy are you determining?
00:29:26.000 Do you ever accidentally think that people that are going to get moderated are actually just real people?
00:29:34.000 Yes.
00:29:35.000 I think for the specific problem around these large coordinated groups doing election interference or something, it's a large enough group.
00:29:43.000 We have a bunch of people analyzing it.
00:29:46.000 It's like they study it for a while.
00:29:47.000 I think we're probably pretty accurate on that.
00:29:49.000 But I actually think one of the bigger issues that we have in our moderation system is this precision issue that you're talking about.
00:29:57.000 And that is actually, of all the things that we announced this week, in terms of how we're going to update the content policies, changing the content filters to require higher confidence and precision is actually going to be the thing that reduces the vast majority of the censorship mistakes that we make.
00:30:18.000 Removing the fact-checkers and replacing them with community notes, I think it's a good step forward.
00:30:22.000 A very small percent of content is fact-checked in the first place, so is that going to make the hugest difference?
00:30:28.000 I'm not sure.
00:30:29.000 I think it'll be a positive step, though.
00:30:32.000 We opened up some content policies, so some stuff that was restricted before we opened up.
00:30:37.000 Okay, that's good.
00:30:38.000 It'll mean that some set of things that might have been censored before now won't be.
00:30:43.000 But by far the biggest set of issues we have, and you and I have talked about a bunch of issues like this over the years, is like, it's just, okay, you have some classifier that's trying to find, say, like drug content, right?
00:30:55.000 People decide, okay, it's like the opioid epidemic is a big deal.
00:30:58.000 We need to do a better job of cracking down on drugs and drug sales, right?
00:31:01.000 I don't want people dealing drugs on our networks.
00:31:03.000 So we build a bunch of systems that basically go out and try to automate finding people who are dealing drugs.
00:31:12.000 And then you basically have this question, which is how precise do you want to set the classifier?
00:31:17.000 So do you want to make it so that the system needs to be 99% sure that someone is dealing drugs before taking them down?
00:31:27.000 Do you want it to be 90% confident, 80% confident?
00:31:30.000 And then those correspond to amounts of...
00:31:35.000 I guess the statistics term would be recall.
00:31:37.000 What percent of the bad stuff are you finding?
00:31:39.000 So if you require 99% confidence, then maybe you only actually end up taking down 20% of the bad content.
00:31:49.000 Whereas if you reduce it and you say, okay, we're only going to require 90% confidence, now maybe you can take down 60% of the bad content.
00:31:57.000 But let's say you say, no, we really need to find...
00:32:00.000 Everyone who is doing this bad thing, and it doesn't need to be as severe as dealing drugs.
00:32:04.000 It could just be, I mean, it could be any kind of content of, any kind of category of harmful content.
00:32:12.000 You start getting to where some of these classifiers might have 80 or 85% precision in order to get 90% of the bad stuff down.
00:32:20.000 But the problem is if you're at, you know, 90% precision, that means one out of 10 things that the classifier takes down is not actually problematic.
00:32:28.000 And if you filter, if you kind of multiply that across the billions of people who use our services every day, that is millions and millions of posts that are basically being taken down that are innocent.
00:32:43.000 And upon review, we're going to look at it and be like, this is ridiculous that this thing got taken down, which I mean, I think you've had that experience and we've talked about this for a bunch of stuff over time.
00:32:54.000 But it really just comes down to this question of where do you want to set the classifiers?
00:32:57.000 So one of the things that we're going to do is basically set them to require more confidence, which is this tradeoff.
00:33:06.000 It's going to mean that we will maybe take down a smaller amount of the harmful content, but it will also mean that we'll dramatically reduce the amount of people whose accounts we're taking off for a mistake, which is just a terrible experience, right?
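The tradeoff being walked through here is simple arithmetic: at a fixed precision, every batch of automated takedowns contains a predictable share of mistakes, and at billions of posts the absolute number of innocent takedowns gets large. A sketch of that calculation, using made-up volumes rather than real Meta numbers:

```python
def takedown_outcomes(daily_flagged_posts, precision):
    """Split a day's automated takedowns into correct vs. mistaken,
    given the classifier's precision (fraction of takedowns that are
    actually violating). Volumes here are hypothetical."""
    correct = daily_flagged_posts * precision
    mistaken = daily_flagged_posts * (1 - precision)
    return correct, mistaken

# A strict threshold: fewer takedowns overall (lower recall) but 99% precision.
# A loose threshold: far more takedowns (higher recall) but only 90% precision,
# so roughly 1 in 10 removed posts is innocent, as described above.
strict = takedown_outcomes(1_000_000, 0.99)   # ~10,000 innocent takedowns/day
loose = takedown_outcomes(3_000_000, 0.90)    # ~300,000 innocent takedowns/day
print(strict, loose)
```

Raising the required confidence moves you toward the first operating point: less of the bad content comes down, but far fewer innocent accounts get hit by mistake.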
00:33:19.000 It's like, okay, you're going about your day, and then one day...
00:33:23.000 You wake up and you're like, oh, my WhatsApp account just got deactivated because it's connected to a Facebook account and the Facebook account is – or like I'm using it on the same phone as a Facebook account where like we made some enforcement mistake and thought you were doing something bad that you weren't because our classifiers were set to low precision.
00:33:44.000 Has that happened before?
00:33:45.000 Oh, yeah.
00:33:46.000 Where their WhatsApp app got canceled as well?
00:33:48.000 Yeah, because, I mean, there are a bunch of...
00:33:50.000 So if your Facebook app gets taken...
00:33:52.000 Like, say, if you have a Facebook and you have, like, a sock puppet account, and the sock puppet account, you post offensive memes and you're generally gross.
00:34:00.000 Yeah.
00:34:00.000 If you get caught for that, does your WhatsApp get killed?
00:34:04.000 Not for memes, but go back to, like, a very severe thing.
00:34:08.000 Like, let's say someone is...
00:34:10.000 Terrorists.
00:34:11.000 Let's say the most severe.
00:34:12.000 Sure, yeah.
00:34:12.000 Let's say someone is, like...
00:34:14.000 Terrorist content, they're planning some attack.
00:34:16.000 So we take down their account.
00:34:18.000 But then let's say that person can just go then sign up with another account.
00:34:23.000 How does WhatsApp get connected to that, though?
00:34:26.000 Oh, well, if it's...
00:34:27.000 I mean, we run these different services, and if they're on the same phone, it's basically...
00:34:32.000 One thing is, basically, regulators or governments will come to us and say, okay, it's...
00:34:38.000 You're clearly not doing enough if you kick someone off for terrorism and they can just sign up for another account on the phone.
00:34:43.000 Right.
00:34:43.000 Okay.
00:34:44.000 They also think, okay, well, we're not doing enough if we deactivate their Facebook account because they're planning a terrorist attack, but we let them use all our other services.
00:34:52.000 Right.
00:34:53.000 If you're aware.
00:34:54.000 Yeah.
00:34:54.000 So if our systems think that someone is a terrorist, then you probably need to deactivate their access to all the different accounts.
00:35:03.000 Yeah, they can't get on threads.
00:35:05.000 It's Instagram.
00:35:06.000 Yeah.
00:35:09.000 That makes sense.
00:35:10.000 So you can understand how you get there, but then you just get to this question around the precision and the confidence level.
00:35:17.000 And then you're just making all these mistakes at scale, and it's just unacceptable.
00:35:23.000 But I think it's a very hard calculation of, like, where do you want to be?
00:35:27.000 Because on the one hand, like, I get why people kind of come to us and they're like, no, you need to do a better job finding more of the terrorism or the drugs and all this stuff.
00:35:37.000 But over time, the technology will get better and it'll get more precise.
00:35:40.000 But at any given point in time, that's the choice that we have to make is do we want to make more mistakes erring on the side of just like blowing away innocent people's accounts?
00:35:50.000 Right.
00:35:51.000 Or do we want to get a somewhat higher percent of the bad stuff off?
00:35:56.000 And I think that there's some just some balance that you need to strike on this.
00:35:58.000 We were having a conversation yesterday, Mel Gibson and I, about how that can get weird.
00:36:03.000 Was it Theo?
00:36:04.000 It might have been Theo.
00:36:05.000 I think it was Theo.
00:36:07.000 Where that can get weird because...
00:36:09.000 I think, like, if you're a person and you work at some accounting firm, but you like posting about stuff, but you don't want it to come back and reflect on your life, you want to shitpost, you want to post jokes, you want to be silly, you should be able to be anonymous.
00:36:22.000 I think there's nothing wrong with that.
00:36:25.000 I don't think just because you state your opinion, people should be able to search where you sleep.
00:36:29.000 That doesn't make any sense to me.
00:36:32.000 If you're going to allow anonymous accounts, you're definitely going to open up the door to bad actors having enormous blocks of accounts where they can use either AI or just programs where they have specific answers.
00:36:47.000 I'm sure you've seen that before.
00:36:48.000 It's come up on Twitter multiple times where they've found...
00:36:52.000 Hundreds of sock puppet accounts tweeting the exact same thing.
00:36:56.000 So you've literally, word for word, even certain words in caps, like either people are copying and pasting it, or there's an email campaign that's getting legitimate people to do it, or these are fake people.
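The simplest version of the coordination signal Rogan describes (many accounts posting identical text, down to the capitalization) is just exact-match clustering. Real detection is far more sophisticated, so treat the function name and threshold here as arbitrary illustrations:

```python
from collections import defaultdict

def find_copy_paste_clusters(posts, min_accounts=5):
    """posts: list of (account_id, text). Flags any text posted verbatim
    (same wording, same caps) by at least min_accounts distinct accounts."""
    by_text = defaultdict(set)
    for acct, text in posts:
        by_text[text].add(acct)
    return {text: accts for text, accts in by_text.items()
            if len(accts) >= min_accounts}

posts = [(f"acct{i}", "VOTE NOW!!! This is URGENT") for i in range(8)]
posts += [("real1", "anyone watch the game?"), ("real2", "nice weather today")]
print(find_copy_paste_clusters(posts))  # flags the 8-account identical message
```

Exact matching can't distinguish bots from an email campaign asking real people to paste the same message, which is exactly the ambiguity being raised in the conversation.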
00:37:06.000 You're going to have...
00:37:07.000 If you're going to have anonymous accounts, which I think you should, because I think whistleblowers, I think the benefits of anonymous reporting on important things that the general public needs to know about, especially whistleblower type stuff, you have to have some ability to be anonymous.
00:37:24.000 But if you're going to do that, you're also going to have the possibility that these aren't real people.
00:37:28.000 That these are paid actors, these are paid people, or not people at all, or they're running programs, and they're doing this to try to sway public opinion about very important issues.
00:37:40.000 Yeah.
00:37:43.000 A lot of what we've seen, too, I mean, there's the anonymous accounts.
00:37:46.000 Also, just over time, I think a lot of the kind of more interesting conversations have shifted from the public sphere to more private ones.
00:37:53.000 So WhatsApp groups, private groups on Facebook.
00:37:57.000 I'm sure you have this experience where maybe 10 years ago you would have posted your kind of quick takes on whatever social media you're using.
00:38:07.000 Now, you know, the stuff that I post on Facebook and Instagram, it's like I put time into making sure that that's kind of good content that I want to be seen broadly.
00:38:15.000 And then like most of the jokes that I make are like with my friends and WhatsApp.
00:38:19.000 Exactly.
00:38:19.000 In groups.
00:38:21.000 So, yeah, I think that's sort of that's kind of where the world is more broadly now.
00:38:25.000 Yeah.
00:38:26.000 No, I think so for jokes, for that kind of stuff, for comedians, for sure.
00:38:30.000 Because also, we'll say things that we don't really mean.
00:38:33.000 We just say it because it's a funny thing to say.
00:38:36.000 I think everyone does.
00:38:37.000 For sure.
00:38:38.000 Yeah.
00:38:38.000 Which is just a weird thing about taking things out of context, particularly on social media where people love to do that.
00:38:47.000 There is this problem of, like, let's just say that you're a country that's involved in some sort of an international conflict, and you have this ability to get out this fake narrative and just spread it widely about all sorts of things you're accusing this other government of, all sorts of things that aren't true, and it just muddies the water of reality for a lot of people.
00:39:11.000 Yeah, and that's why that side of things, the kind of...
00:39:18.000 I mean, we're not letting off the gas on that at all.
00:39:21.000 I think most categories of bad stuff that we're policing, everyone agrees is bad.
00:39:27.000 No one's sitting there defending that terrorism is good, or child exploitation, or drugs, or IP violations, or people inciting violence.
00:39:36.000 It's like most of the stuff is bad.
00:39:37.000 People clearly believe that...
00:39:42.000 You know, election interference and foreign government manipulation of content is bad.
00:39:46.000 So this is the type of stuff that the vast majority of our energy goes towards that, and we're not changing our approach on any of that.
00:39:53.000 The two categories that I think have been very politicized are misinformation, because who gets to judge what's false and what's true?
00:40:04.000 You may just not like my opinion on something, and then people think it's false, but I think that that one's really tricky.
00:40:10.000 And the other one is basically what people refer to as hate speech, which I think also comes from a good place of wanting to crack down on that, of wanting to promote more inclusion and belonging and people feeling...
00:40:30.000 Feeling good and having a pluralistic society that can basically have all these different communities coexist.
00:40:37.000 Accept everyone.
00:40:38.000 But I think the problem is that all these things are on a spectrum and when you go too far on them, I think on that side, we just basically got to this point where there were these things that you just couldn't say, which were mainstream discourse.
00:40:54.000 Pete Hegseth is going to probably be defending his nomination for Secretary of Defense on the Senate floor.
00:41:03.000 And I think one of the points that he's made is that he thinks that women shouldn't be able to be in certain combat roles.
00:41:09.000 And until we updated our policies, that wouldn't have been a thing that you could have said on our platforms because it would call for the exclusion of a protected category of people.
00:41:18.000 And it's like, okay.
00:41:20.000 On its face, yeah, calling for the exclusion of a protected category, there's legal protections, there's all this stuff.
00:41:27.000 But if it's okay to say on the floor of Congress, you should probably be able to debate it on social media.
00:41:33.000 So I think some of the stuff, I think well-intentioned, went too far, needs to just get rationalized a bit.
00:41:42.000 But it's those two categories.
00:41:44.000 Misinformation and hate speech, I think, are the ones that got politicized.
00:41:48.000 All the other ones, which is the vast majority of the stuff that we do, is I think people generally agree that it's good and we need to go after it.
00:41:57.000 But then you just get into this problem of the mistakes, like you're talking about.
00:41:59.000 Okay, well, what confidence level do people want us to have in our enforcement?
00:42:04.000 And at what point would people rather us kind of say, okay, I'm not sure that that one is causing an issue.
00:42:12.000 So on balance, maybe we should just...
00:42:14.000 You know, leave that person's account out because the pain of just nuking someone's account when you're not sure or you make a mistake is like, that's pretty real too.
00:42:22.000 Right.
00:42:24.000 Yeah.
00:42:24.000 Very, very complicated.
00:42:26.000 Yeah.
00:42:26.000 It's all very nuanced.
00:42:28.000 And, you know, you made a point earlier about the government supporting its companies, that it would be a good thing for the government to support its companies.
00:42:38.000 It makes sense.
00:42:39.000 It's an American company.
00:42:41.000 I think the issue that we're dealing with is...
00:42:43.000 Companies, as we're describing them, have never existed before, right?
00:42:48.000 There's never been a thing like Facebook before.
00:42:50.000 There's never been a thing like Twitter before or X. There's never been a thing like Instagram.
00:42:54.000 These are new things in terms of the impact that it has on society, on opinions, on conversations, on distribution of information.
00:43:04.000 There's never been a thing like this that the government didn't control.
00:43:09.000 So it makes sense from their perspective, continuing the patterns of behavior that they've always exhibited, which is to have control over the media.
00:43:17.000 I mean, there has been CIA operatives that have been in major newspapers forever.
00:43:22.000 There's always been that.
00:43:24.000 There's always been this sort of input that the government had in mainstream media narratives.
00:43:30.000 They are in a position now where they're losing that.
00:43:34.000 They've essentially lost it.
00:43:36.000 And especially with this last, the push during COVID deteriorated, as you were saying before, the opinion and the respect that people have for the facts that are coming from mainstream journalism in a way that I've never seen before in my life, where an enormous percentage of the population does not trust mainstream media anymore.
00:43:54.000 So, well, what do they trust?
00:43:56.000 They trust social media.
00:43:57.000 Well, who's running that?
00:43:58.000 Well, a bunch of people figured it out and invented it.
00:44:00.000 Well, no, fuck that.
00:44:01.000 Like, we've got to crack down on that.
00:44:03.000 Like, we've got to get our hands on this, which is what we saw during COVID, which we saw during the Biden administration's attempt to remove the Hunter Biden laptop story from Twitter and from all these different things that we saw happen, the way they...
00:44:16.000 Contacted you guys, what they're trying to do with getting you to remove real information about vaccine side effects.
00:44:23.000 This is like this new attempt to crack down on this new thing, which is a distribution outlet that's far more successful than anything they've ever controlled before, and they have no control of it.
00:44:38.000 They had CBS, they had NBC. When they had the New York Times and all these, Washington Post, when they were in control of narratives in that way, it was so much easier.
00:44:48.000 There wasn't some sort of pirate radio voice that came on and said, hey guys, here's the latest studies that shows this is not true.
00:44:58.000 Here's why they're lying about that.
00:45:00.000 Here's why they're lying about this.
00:45:01.000 And now that's what you get all day long on X. It's all day long.
00:45:06.000 It's like dissolving illusions.
00:45:08.000 And that's a completely new thing that probably led to Trump getting elected.
00:45:15.000 Yeah, I mean, the causality there is tricky because there's a lot of things.
00:45:18.000 Yeah, there's a lot of things.
00:45:20.000 But without it, he probably doesn't get elected.
00:45:25.000 Yeah, it's tough to know.
00:45:27.000 I mean, I do come back to this point that every major incumbent lost their elections around the world this year.
00:45:33.000 But I think that's also because of social media.
00:45:35.000 It might be because of that revealing how kind of incorrect and dishonest I think some of these governments were.
00:45:44.000 Yeah.
00:45:45.000 So I think that that's quite possible.
00:45:48.000 And I mean, I do think that there is this cycle that goes on where...
00:45:52.000 Within a society, it's not just the government that has power.
00:45:55.000 There's certain people who are in these culturally elite positions.
00:45:59.000 And journalists, TV news anchors, who are the people that people broadly trust?
00:46:05.000 They're not all in government.
00:46:06.000 They're a lot of people in other positions.
00:46:10.000 It's like, who are the people that basically people look to?
00:46:14.000 And I think that's basically, it needs to shift for the internet age.
00:46:21.000 And I think a lot of the people who, you know, people looked to before, they're kind of realizing, hey, they weren't super honest about a lot of these issues that we face.
00:46:31.000 And that's partially why, you know, social media isn't a monolithic thing.
00:46:35.000 It's not that people trust Facebook or X. They trust the creators and the voices that they feel like are being authentic and giving them valuable information on there.
00:46:44.000 So there's, I think, going to be just this whole new class of...
00:46:50.000 Creators who basically become the new kind of cultural elites that people look at and are like, okay, these are the people who give it to me straight.
00:46:58.000 And I think that's a thing that is, maybe it's possible because of social media.
00:47:04.000 I think it's also just the internet more broadly.
00:47:07.000 I mean, I think podcasting is obviously a huge and important part of that too.
00:47:11.000 I mean, I don't know to what extent you feel like you kind of...
00:47:14.000 Got to be large because of social media or just the podcasting platforms that you used.
00:47:20.000 But I think that this is a very big sea change in terms of who are the voices that matter.
00:47:27.000 And what we do is we try to build a platform that gives people a voice.
00:47:31.000 But I think that there's this wholesale generational shift.
00:47:36.000 And who are the people who are being listened to?
00:47:38.000 And I think that that's, like, a very fascinating thing that is going on because I think that that's, like, what's going on here.
00:47:45.000 It's not just the government and people saying, hey, we want, like, a very big change here.
00:47:51.000 I think it's just, like, a wholesale shift in saying we just want different people who we actually trust, who are actually going to, like, tell us the truth.
00:48:00.000 And not give us the bullshit opinions that you're supposed to say, but the type of stuff that I would actually...
00:48:06.000 When I'm sitting in my living room with my friends, the stuff that we know is true.
00:48:11.000 Who are the people who have the courage to actually just say that stuff?
00:48:15.000 I don't know.
00:48:16.000 I think that whole cultural elite class needs to get repopulated with people who people actually trust.
00:48:21.000 Yeah.
00:48:22.000 The problem is these people that are...
00:48:26.000 Starting these jobs, they're coming out of universities, and the universities are indoctrinated into these ideas as well.
00:48:32.000 It's very difficult to be a person who stands outside of that and takes unpopular positions.
00:48:38.000 You get socially ostracized, and people are very hesitant to do that, and they would rather just keep their mouth shut and talk about it in quiet conversation.
00:48:47.000 And that's what we experience, which is another...
00:48:50.000 Another argument for anonymous accounts.
00:48:53.000 I think you should have anonymous accounts.
00:48:55.000 I think you should be able, like, if there's something like COVID mandates or some things that you're dealing with and you don't want to get fired because of it, you should be able to talk about it.
00:49:03.000 And you should be able to post facts and information and what you've learned.
00:49:06.000 And, you know, anecdotal experiences of people in your family that had vaccine side effects and not worry about losing your job, which people were worried about, which is so crazy.
00:49:19.000 And you're seeing a lot of the people that used to be in mainstream media got fired, and now they're trying to do the sort of podcast thing.
00:49:27.000 But they're trying to do it like a mainstream media person, so they're gaslighting during podcasts, and people are like, hey, fuckface.
00:49:36.000 You can't do that here.
00:49:38.000 It doesn't work.
00:49:39.000 Yeah, it's a new medium.
00:49:40.000 I'm sure you know the history on this.
00:49:42.000 When people transitioned from radio to TV...
00:50:00.000 Yeah, the whole phrase, it's like you've got a good radio voice, right?
00:50:02.000 It's like, okay, on TV, you need to be telegenic, right?
00:50:06.000 You need to kind of have charisma in that medium.
00:50:08.000 It's like a completely different thing.
00:50:10.000 And I think that that's going to be true for the internet too.
00:50:15.000 It's, you know, it's not as cut.
00:50:17.000 I think part of it is the format, right?
00:50:18.000 The fact that you do these, like, two, three-hour episodes.
00:50:22.000 I mean, I hated doing TV because, you know, I basically got started.
00:50:28.000 I started Facebook when I was 19, and I was good at some things, very bad at others.
00:50:33.000 I was good at coding and, like, real bad at kind of, like, talking to people and explaining what I was doing.
00:50:40.000 And I just had these experiences early on where I'd go on TV and it wouldn't go well and they'd cut it down to some random soundbite and I'd look stupid.
00:50:53.000 And then basically I'd get super nervous about going on TV because I knew that they were just going to cut it in some way that I was going to look like a fucking idiot.
00:51:03.000 And so I'm just like, this sucks, right?
00:51:06.000 So I just like...
00:51:09.000 It's kind of a funny thing.
00:51:11.000 In some ways, it's like, okay, at the same time, I was gaining confidence, being able to build more and more complicated products.
00:51:19.000 And even as an early 20s person, I was like, I could do this.
00:51:22.000 And then on the TV and comms public side, I was like, this is a disaster.
00:51:26.000 Every time I go out, it's worse and worse and worse.
00:51:29.000 But it's one of the reasons why I think on the internet...
00:51:34.000 Like, there's no reason to cut it to a four-minute soundbite conversation.
00:51:38.000 It's like, I think part of what makes it authentic is like, we can just, I mean, these are complex issues.
00:51:43.000 We can unpack it for hours and probably still have hours more stuff to talk about.
00:51:48.000 It just, it's, I don't know.
00:51:50.000 I think it's just more real.
00:51:51.000 Yeah, it's definitely that.
00:51:53.000 And the other thing about television that's always going to hold it back is the fact that every conversation gets interrupted every X amount of minutes because you have to cut to a commercial.
00:52:03.000 So you really can't get into depth.
00:52:06.000 Even Bill Maher shows only an hour.
00:52:08.000 You have all these people talking over each other, then you sit down with one person for a short amount of time.
00:52:13.000 It's just not enough time for important subjects.
00:52:16.000 It's also a lot of them, for whatever reason, want to do it in front of an audience.
00:52:20.000 Which is the worst way to get people to talk.
00:52:22.000 Like, imagine these disasters that you had if there was, like, 5,000 people staring at you in a TV crowd as well.
00:52:29.000 So there's that added element, which is so not normal and not conducive to having a conversation where you're talking about nuanced things.
00:52:37.000 Where you have to, like, think.
00:52:39.000 You have to be able to pause and not concern yourself with being entertaining from these fucking people just sitting there staring at you.
00:52:47.000 Yeah, and also, like, when you're having a conversation, like, I don't know.
00:52:50.000 It's like when you start talking about something, your kind of subconscious kicks in.
00:52:54.000 You start thinking about the topic.
00:52:57.000 So it's like you might not actually have the thing that you want to say until like five minutes later.
00:53:01.000 And I mean, it's like when we started this conversation, I think like the first few minutes were just kind of slow.
00:53:06.000 It's like warming up.
00:53:06.000 I'm like, OK, kind of like downloading into my memory.
00:53:09.000 Like, how am I going to like, you know, it's like, how am I going to just explain these different things?
00:53:17.000 Yeah, I just think that's sort of how people work.
00:53:19.000 It's also like conversations are like a dance.
00:53:22.000 You know, one person can't be dancing at another speed and the other person is going slow.
00:53:26.000 You kind of have to find the rhythm that you're going to talk with and then you have to actually be interested in what you're talking about.
00:53:34.000 That's another thing that they are at a huge disadvantage of in mainstream media.
00:53:39.000 It's like they're just doing that because that's their job.
00:53:41.000 They probably don't even know a lot about climate change.
00:53:44.000 They probably don't really understand too much about what SpaceX is trying to accomplish.
00:53:49.000 But they're just reporting on it.
00:53:50.000 Yeah, I mean, I'm sure there's...
00:53:52.000 A lot of the people I've met there I think are good people.
00:53:54.000 I'm sure they are.
00:53:54.000 It's just a tough format.
00:53:56.000 It's a terrible format.
00:53:57.000 Yeah, and the problem is they get locked into that format and no one trusts them.
00:54:01.000 And then they leave and they go, yeah, but you were just lying to us about this, that, and the other thing.
00:54:05.000 And now I'm supposed to believe you're one of the good guys.
00:54:07.000 You're one of the straight shooters now.
00:54:09.000 Yeah.
00:54:14.000 Well...
00:54:14.000 Getting back to the original point, this is why I think it makes sense to me that the government didn't want you to succeed and to have the sort of unchecked power that they perceived social media to have.
00:54:29.000 And I think...
00:54:31.000 One of the benefits that we have now of the Trump administration is that they have clearly felt the repercussions of a limited amount of free speech, of free speech limitations, censorship, government overreach.
00:54:43.000 If anybody saw it, look, I don't know what the actual impact of the Hunter Biden laptop story would have been.
00:54:51.000 I don't know.
00:54:52.000 But there's many people that think it probably amounted to millions of votes overall in the country, of people that were on the fence.
00:55:00.000 The people that weren't sure who they're going to vote for, if they found out the Hunter Biden laptop was real, they're like, oh, this is fucking, the family's fucking crazy.
00:55:07.000 And they would have voted for Trump.
00:55:09.000 That's possibly real.
00:55:10.000 And if that's possibly real, that could be defined as election interference.
00:55:16.000 And all that stuff scares the shit out of me.
00:55:19.000 That kind of stuff scares the shit out of me.
00:55:21.000 When the government gets involved in what could be termed election interference, but through some weird loophole, it's legal.
00:55:29.000 Well, I don't think that the pushing for social media companies to censor stuff was legal.
00:55:36.000 I mean, there's all this stuff about what, like, people talk about the First Amendment and, okay, these tech platforms should offer free speech like the First Amendment.
00:55:47.000 That, I think, is a philosophical principle.
00:55:50.000 The First Amendment doesn't apply to companies and our content moderation.
00:55:58.000 It's about how best dialogue is carried out.
00:56:00.000 But the First Amendment does apply to the government.
00:56:03.000 That's, like, the whole point, right, is the government is not allowed to censor this stuff.
00:56:06.000 So at some level, I do think that, you know, having people in the administration calling up the guys on our team and yelling at them and cursing and threatening repercussions if we don't take down things that are true is, like, it's pretty bad.
00:56:23.000 It sounds illegal.
00:56:24.000 I would love to hear it.
00:56:26.000 I wish somebody recorded those conversations.
00:56:28.000 Those would be fucking great to listen to.
00:56:30.000 Somebody could animate them, maybe polytunes.
00:56:33.000 A lot of the material is public.
00:56:36.000 I mean, Jim Jordan led this whole investigation in Congress.
00:56:39.000 I mean, it was basically, I think about this as like, you know, what Elon did on the Twitter files when he took over that company.
00:56:46.000 I think Jim Jordan basically did that for the rest of the industry with the congressional
00:56:52.000 investigation that he did.
00:56:53.000 And we just turned over all of the documents and everything that we had to them, and they basically put together this report.
00:57:01.000 And the people that actually did call for censorship, what was the response to all this?
00:57:07.000 To what?
00:57:08.000 To the investigation?
00:57:10.000 Yes.
00:57:11.000 I don't know.
00:57:12.000 I don't know.
00:57:14.000 Was anybody held accountable?
00:57:15.000 Was there any repercussions?
00:57:19.000 I mean, they lost the election.
00:57:20.000 Yes.
00:57:21.000 So that's it?
00:57:22.000 Well, in a democracy, I mean, that's kind of...
00:57:24.000 Right, but if what they did was illegal, do you not think that some steps should be put in place to make sure that people are punished for that and that that never happens again?
00:57:37.000 It seems that that has a massive impact on the way our country goes.
00:57:43.000 If that's election interference, and I think it is, that has a massive impact on the direction of our country.
00:57:50.000 Yeah, well, the COVID thing, I don't think, was election interference as much as it was just, like, government meddling where it shouldn't have.
00:57:57.000 But, yeah, no, I mean, it's tougher for me to say, you know, like, what specific retribution or justice should happen to anyone who is involved in these things.
00:58:09.000 But I think your point about let's make sure this doesn't happen again is...
00:58:15.000 Is the one that I'm more focused on.
00:58:18.000 Right.
00:58:18.000 Because it's the thing that I reflect on on my journey on all this, which is like, okay, yeah, so we didn't take down the stuff that was true.
00:58:26.000 But we did generally defer to the government on some of these policies that in retrospect I probably wouldn't, knowing what I know now.
00:58:36.000 And I just think that that's sort of the journey that we've been on.
00:58:43.000 Okay, we start the thing focused on free expression, go through some pretty crazy times in the world, get it pressure tested, see where we basically ended up doing stuff that led to a slippery slope that we weren't happy with the conclusion, and try to reset.
00:58:57.000 And that's sort of the moment that we're at now, is trying to just rationalize a bunch of the policies.
00:59:04.000 And look, obviously crazy things can happen in the future that might unearth something that I haven't, you know, some kind of angle on this that I haven't.
00:59:15.000 I'm sure I'm not done making mistakes in the world.
00:59:17.000 But I think at this point, we have a much more thorough understanding of what the space is.
00:59:24.000 And I think our kind of values and principles on this are likely going to be much more durable going forward.
00:59:32.000 And I think that that's probably a good thing for the Internet.
00:59:35.000 I think it's a great thing for the internet.
00:59:36.000 I was very happy with your announcement.
00:59:38.000 I'm very happy that you took those steps.
00:59:40.000 I'm very happy you brought Dana White aboard.
00:59:42.000 Oh, he's awesome.
00:59:43.000 I've been talking to him for a while about that.
00:59:44.000 Talk about an amazing entrepreneur.
00:59:47.000 Because I control our company, I have the benefit of not having to convince the board not to fire me.
00:59:55.000 In a normal corporate environment...
00:59:57.000 It's like basically the CEO just tries to like...
01:00:00.000 You know, they're just trying to convince the board to, like, let them have their job and pay them more.
01:00:04.000 It's like, all right, the board doesn't pay me except for security.
01:00:07.000 And I'm not worried about losing my job because I control the majority of the voting in the company.
01:00:13.000 So I actually get to use our board to, like, have the smartest people who I can get to have around me help work on these problems.
01:00:20.000 So it's like, all right, who are the people I want?
01:00:22.000 Like, I just want, like, the best entrepreneurs and people who've created different things.
01:00:26.000 And, like, I mean, Dana's, like, this guy.
01:00:28.000 Who, I mean, he basically took the sport from being this, like, I think it was viewed as this pretty marginal thing when he got started.
01:00:36.000 I think John McCain was trying to outlaw it.
01:00:39.000 And now it's like, I think it and F1 are the two fastest growing sports in the world.
01:00:44.000 It's got hundreds of millions of people viewing it.
01:00:47.000 It's like, I mean, what Dana's done with the UFC is like one of the most legendary business stories.
01:00:54.000 And the brand is beloved.
01:00:58.000 And I think he's just, so he's like a world-class entrepreneur and he's just like a, he's got a strong backbone.
01:01:06.000 And I think part of what the conversation that I had with him around joining our board was, okay, like we have a lot of governments and folks around the world putting a lot of pressure on our company and like we need some like strong people who are going to basically, you know, help advise us on how to handle some of these situations.
01:01:23.000 And so, yeah.
01:01:28.000 Running this company is not for the faint of heart.
01:01:31.000 There's definitely a lot of pressure from all these different governments.
01:01:34.000 Then it's like, okay, I could spend all my time doing that, but I'm not even a politician.
01:01:38.000 I just want to spend my time building things.
01:01:42.000 I think Dana's going to be great.
01:01:43.000 He's the best.
01:01:45.000 I agree with everything you said about him.
01:01:47.000 Without him, none of the UFC would have ever taken place the way it did.
01:01:50.000 You needed the Fertitta brothers.
01:01:51.000 They had to come in with all the money and the vision.
01:01:54.000 It's really funny because Eddie Bravo and I, you know, we've been fans for so long.
01:01:58.000 Eddie Bravo and I went to a live event in the 90s.
01:02:02.000 I was working for the UFC as a backstage interviewer, and he went there with Rikki Rockett.
01:02:06.000 You know Rikki Rockett from Poison?
01:02:08.000 No.
01:02:09.000 He's a fucking black belt under the Machados.
01:02:11.000 He's legit.
01:02:13.000 Super legit.
01:02:14.000 Really nice guy, too.
01:02:15.000 Anyway, so...
01:02:16.000 Rikki Rockett and him were at the UFC, and we were talking about it in the 90s.
01:02:22.000 We're like, you know what the sport needs?
01:02:23.000 Because we were in love with it.
01:02:24.000 But we were martial artists.
01:02:26.000 We were like, the sport needs some billionaires who just throw a ton of money at it and just get it huge.
01:02:32.000 And then the Fertitta brothers come along.
01:02:35.000 Billionaires with a ton of money who are...
01:02:37.000 Huge fans of the sport.
01:02:39.000 Just love the sport.
01:02:40.000 You know, they were hiring people like Frank Shamrock to come in and train them and work out, and they were taking jiu-jitsu with John Lewis, and they were really getting into it.
01:02:48.000 And so then they buy the UFC for like $2 million, which is probably the greatest purchase ever, except they were 40-plus
01:02:56.000 million dollars in the hole when they financed The Ultimate Fighter.
01:03:01.000 And then that was 2005. And then this one fight takes place with Stephan Bonnar and Forrest Griffin on television.
01:03:08.000 It's so wild and so crazy that millions of people start tuning in.
01:03:12.000 The sport's born.
01:03:13.000 Then you have Chuck Liddell, who was the champion at the time, who was the most fan-friendly champion you could ever have.
01:03:19.000 Just a fucking berserker.
01:03:21.000 Just a psychopath with a fucking head tattoo and a mohawk crushing people in his prime.
01:03:27.000 He was the perfect poster guy for the UFC because he was just smashing people and then throwing his arms back in a cage.
01:03:36.000 It was nuts!
01:03:37.000 I'm sure you've seen a lot of Chuck Liddell fights, right?
01:03:40.000 It was just the whole thing took off.
01:03:42.000 But without Dana, it would have never taken place.
01:03:44.000 The guy's tireless.
01:03:45.000 That man, I could call him up.
01:03:47.000 I'll call him up at like 2 o'clock in the morning sometime.
01:03:49.000 Like there's some fight going on.
01:03:51.000 And I'll say, hey.
01:03:52.000 This is going on next weekend.
01:03:53.000 I'm so fucking pumped.
01:03:55.000 We'll talk for hours.
01:03:57.000 For hours.
01:03:57.000 He just wants to talk about fights.
01:03:59.000 He's like so locked in.
01:04:01.000 Like all the time.
01:04:03.000 You know?
01:04:03.000 And he's just like so driven.
01:04:05.000 And now that he's healthy.
01:04:08.000 Like, oh my god.
01:04:09.000 What Gary Brecka's done for him is incredible.
01:04:12.000 He lost all his weight.
01:04:13.000 Got super thin.
01:04:14.000 Real fit.
01:04:15.000 Super healthy.
01:04:16.000 He doesn't fuck around with alcohol anymore.
01:04:18.000 He just eats healthy food.
01:04:19.000 He looks great.
01:04:20.000 Now he's getting even more energy.
01:04:22.000 Yeah.
01:04:22.000 It's incredible.
01:04:23.000 Well, we're lucky to have some of it.
01:04:25.000 Yeah, we are.
01:04:25.000 And you know what?
01:04:26.000 We're also lucky that you got into jiu-jitsu.
01:04:29.000 Because I think that had an effect on you.
01:04:32.000 You look different.
01:04:33.000 When you walked in here today, you look thicker.
01:04:36.000 You look like a different guy.
01:04:37.000 You do.
01:04:38.000 You look like a jiu-jitsu guy now.
01:04:40.000 It's funny.
01:04:40.000 I saw your neck.
01:04:41.000 I'm like, his neck's bigger.
01:04:43.000 Your neck is bigger.
01:04:44.000 Good.
01:04:44.000 Are you using the Iron Neck, or is it just from training?
01:04:46.000 I do like the Iron Neck.
01:04:52.000 Training not just jiu-jitsu, but striking.
01:04:55.000 I was like, alright, I want to find a way to do this where I don't hurt my brain.
01:05:00.000 I'm going to be running this company for a while.
01:05:02.000 I would like to stay healthy and not take too much damage.
01:05:07.000 And so I think the number one thing you need to do is, well, in addition to having good partners, is have a strong neck.
01:05:14.000 Yes.
01:05:15.000 So, yeah.
01:05:16.000 I take that pretty seriously.
01:05:18.000 It's very important.
01:05:19.000 A strong neck is great for jiu-jitsu as well because it's a weapon.
01:05:23.000 In certain positions, like head and arm chokes, you need a neck.
01:05:26.000 It's a weapon.
01:05:28.000 And also for defending things and just for overall stability.
01:05:32.000 But for striking, it's very...
01:05:34.000 Like Mike Tyson in his prime.
01:05:36.000 He had a fucking 20-inch neck.
01:05:38.000 It's crazy.
01:05:38.000 His neck is bigger than his face.
01:05:40.000 This is a photo of him in a suit.
01:05:42.000 It's the craziest photo.
01:05:43.000 It's like his neck starts at the top of his ears, and it just goes straight down when he was a champ, when he was a tank.
01:05:49.000 He's amazing.
01:05:50.000 Yeah.
01:05:51.000 The neck's very important, but it's also like, you know, you're doing it very smart.
01:05:55.000 You're bringing in Dave Camarillo.
01:05:58.000 He's awesome.
01:05:59.000 Amazing.
01:05:59.000 He's awesome.
01:06:00.000 You're bringing in all these, like, super talented people to train with you, too, which is really important, and just learn systematically, probably the way you've learned all these other things, which is really...
01:06:10.000 What's so fascinating to me about MMA and jiu-jitsu in particular is the general public has this knuckle-dragging, meathead sort of perspective, and then I'm like...
01:06:22.000 Let me introduce you to Mikey Musumeci.
01:06:25.000 Well, there's a range.
01:06:26.000 There's a range for Mikey to...
01:06:28.000 Right, but Mikey is one of the elite of the elite.
01:06:31.000 And he's about as far from that...
01:06:32.000 I love Mikey.
01:06:33.000 He's a very good guy.
01:06:34.000 He's a super good guy.
01:06:35.000 He's super kind and unbelievably brilliant and eccentric and just so dedicated to jiu-jitsu.
01:06:42.000 I'm glad that he's over at the UFC now.
01:06:45.000 Yes, I am too.
01:06:46.000 Well, I'm glad a guy like that exists.
01:06:48.000 I like...
01:06:49.000 Because I like...
01:06:50.000 I'm like, okay, I know you think that...
01:06:52.000 Let me show you this guy.
01:06:53.000 And then I'm like, let me show you what it really is.
01:06:54.000 Let me introduce you to these people because they're the nicest people.
01:06:57.000 I know.
01:06:58.000 There's no better stress reliever in the world than jiu-jitsu or martial arts.
01:07:02.000 There's no better.
01:07:02.000 You leave there.
01:07:03.000 You're the kindest person in the world.
01:07:05.000 You just let all of your aggressions out of your system.
01:07:08.000 Yeah.
01:07:09.000 And it's a phenomenal stress reliever because...
01:07:11.000 Regardless of what you're going through day-to-day with Facebook and Meta and all the different projects you have going on, it's not as hard as someone trying to choke you unconscious.
01:07:20.000 It's not as acute.
01:07:21.000 I think it's like sometimes you have someone trying to choke you unconscious slowly over a multi-year period, and that's business.
01:07:30.000 But I think that sometimes in business the cycle time is so long that it is very refreshing to just have a feedback loop that's like, I had my hand down, so I got punched in the face.
01:07:43.000 It's really important to me for balance.
01:07:49.000 I basically try to train every morning.
01:07:52.000 I'm either doing general fitness or MMA. I do sometimes grappling, sometimes striking.
01:07:59.000 It got to the point where I tore my ACL training.
01:08:05.000 At that point, I didn't have...
01:08:08.000 I wasn't integrated between my weight training and my fighting training, so I think I was probably overdoing it.
01:08:14.000 So now we basically, I'm just trying to do this in a cohesive way, which I think will be more sustainable.
01:08:20.000 But when I tore my ACL, first of all, everyone at the company was like, ah, fuck, we're going to get so many more emails now that he can't do this.
01:08:30.000 And then I sat down with Priscilla, and I expected her to be like, you're an idiot.
01:08:35.000 What do you expect?
01:08:36.000 I was in my late 30s at the time.
01:08:39.000 But she was like, no.
01:08:40.000 She's like, when you heal your ACL, you better go back to fighting.
01:08:43.000 And I'm like, what do you mean?
01:08:45.000 She's like, you are so much better to be around now that you're doing this.
01:08:49.000 You have to fight.
01:08:51.000 That's hilarious.
01:08:53.000 Yeah.
01:08:54.000 Isn't it funny that that's completely contrary?
01:08:58.000 To the way most people, if they're outside of it, would perceive it.
01:09:01.000 I mean, it definitely takes the edge off things.
01:09:03.000 But it's like after a couple of hours of doing that in the morning, it's just like, yeah.
01:09:08.000 It's like nothing else that day is going to stress you out that much.
01:09:12.000 You can just deal with it.
01:09:14.000 Voluntary adversity.
01:09:15.000 Yeah.
01:09:16.000 No, it's good.
01:09:17.000 It's also good, I think, to be a little bit tired.
01:09:21.000 I love that feeling of just like...
01:09:25.000 You're not, like, exhausted.
01:09:26.000 And sometimes you get a session and you just go so hard and I need to, like, just go to sleep or something.
01:09:32.000 It's also good to know that you can kill people.
01:09:36.000 That's a good thing to know.
01:09:38.000 It's a good thing to know if something goes sideways.
01:09:40.000 I guess there's a certain confidence in that.
01:09:43.000 It's an important skill.
01:09:44.000 If you could give it in a pill, if you could sell it in a pill, everybody would buy it.
01:09:48.000 No one would say, I'd like to be the vulnerable guy walking around with a bunch of fucking assassins.
01:09:54.000 No one would say that.
01:09:55.000 They would say, how much is the pill?
01:09:57.000 Oh, it's $2.
01:09:58.000 Oh, give me one of those pills.
01:09:59.000 You take the pill.
01:10:00.000 Everybody would take that pill.
01:10:01.000 Well, it exists.
01:10:02.000 It's just not a pill.
01:10:03.000 It's a long journey of...
01:10:05.000 Pain and discipline and trial and error and learning and being open-minded and being objective and understanding position and asking questions and having good training partners and absorbing information and really being diligent with your skill acquisition work, which is one of the most important and neglected parts of jiu-jitsu because training is so fun.
01:10:28.000 Everybody just wants to roll, you know, when really the best way to do it is actually to drill.
01:10:33.000 It's the most boring.
01:10:35.000 Really, you should drill constantly, just jam those skills into your neurons where your brain knows exactly what to do at every position.
01:10:43.000 And it's such an intellectual pursuit.
01:10:45.000 And most people don't think of it that way because you have to manage your mind while you're moving your body, you're managing anxieties, you're trying to figure out when to hit the gas and when to control position and recover.
01:10:58.000 There's so much going on in training.
01:11:01.000 That applies to virtually any stressful thing that you'll ever experience in your life.
01:11:06.000 And along with it, you get this skill where you can kill people.
01:11:13.000 You shouldn't kill people, let me be clear.
01:11:16.000 I'm not saying it's a good thing to kill people.
01:11:18.000 I'm definitely not.
01:11:18.000 But I'm saying it's a good thing to, if someone's trying to kill you and they absolutely can't, because you could kill them easy, that's way better.
01:11:26.000 It's a way better situation to be in.
01:11:28.000 Yeah, no, it's great.
01:11:30.000 I mean, it's opened a lot of how I think about stuff.
01:11:34.000 I mean, it is just interesting, your point about having a pill that allows you to just kind of know that you have this kind of physical ability.
01:11:45.000 It's a superpower.
01:11:47.000 It's interesting, because I do think a lot of our society has become very, like, I don't know, I don't even know the right word for it, but it's kind of like...
01:11:59.000 I mean, I think part of the reason...
01:12:10.000 One of the things that I enjoy about it is I feel like I can just express myself.
01:12:16.000 It's like when you're running a company, people typically don't want to see you being this ruthless person who's just like, I'm just going to crush the people I'm competing with.
01:12:26.000 But when you're fighting, it's like, no, no, that's like...
01:12:29.000 You're rewarded.
01:12:31.000 I think in some ways, when people see me competing in this sport, they're like, oh, no, that's the real mark.
01:12:35.000 It's like, because it goes back to all the media training stuff we were talking about when I'm going and giving my soundbites for two minutes.
01:12:41.000 It's like, no, it's like, fuck that guy.
01:12:44.000 It's like, that's the real one.
01:12:47.000 Well, you definitely got a lot of respect in the martial arts community.
01:12:50.000 People got super excited that you were so involved in it and so interested in it because anytime someone like yourself or like Tom Hardy or anyone like, wow, that guy's into it?
01:12:58.000 Like, wow.
01:12:59.000 Anytime something like that happens, there's like some new person who's a prominent person, a very smart person, that's really interested in it.
01:13:06.000 We all get very excited because we're like, oh, boy.
01:13:09.000 It's a very welcoming community.
01:13:10.000 Super.
01:13:11.000 I think there's a lot of sports that are like...
01:13:13.000 Nah, we don't want you.
01:13:14.000 It's not a jock community.
01:13:16.000 It's super kind.
01:13:17.000 Like, jujitsu people in particular, they're some of the nicest people.
01:13:21.000 It's my friends forever.
01:13:22.000 You know, they'll be my friends for life.
01:13:24.000 Yeah.
01:13:25.000 Yeah, no, it's a good crew.
01:13:26.000 I mean, when I got hurt, I really...
01:13:29.000 Kind of miss the guys I trained with.
01:13:30.000 Oh, it sucks.
01:13:30.000 Dave has put together this group.
01:13:32.000 It's basically all these young pro fighters who are kind of up and coming, kind of early 20s, but they've only been doing it for a few years.
01:13:40.000 So I've been doing it for a few years.
01:13:43.000 So that way it's like we kind of have a more similar level of skill and they're all better than me.
01:13:48.000 But in terms of I'm like, I was in my late 30s and they're in their early 20s.
01:13:53.000 It was sort of like they're kind of...
01:13:54.000 coming into becoming men, and I'm like sort of at the end of my physical peak.
01:13:58.000 But it's like, it's a really good crew.
01:14:04.000 Yeah, no, it's a good crew.
01:14:05.000 And the competing thing is fun.
01:14:07.000 I can't wait to get back to that too.
01:14:08.000 I mean, it's like basically, I mean, I was also doing it with, so it's basically a group of pro fighters and then a handful of Meta executives would do it.
01:14:16.000 And basically, we would just kind of like fight each other and it would be fun.
01:14:20.000 And then one of them...
01:14:23.000 decided one day that they were like, you know, I think I'm getting pretty good at jiu-jitsu.
01:14:27.000 I'm going to go to a tournament.
01:14:29.000 And I was like, all right, good luck with that, bro.
01:14:31.000 I'm not going to go to a tournament.
01:14:33.000 I don't want to go to a tournament and get embarrassed.
01:14:37.000 But then the guy goes to the tournament and he does pretty well.
01:14:40.000 I'm like, that guy?
01:14:42.000 It's like, okay.
01:14:44.000 We go all the time.
01:14:46.000 And if he's doing well in a tournament, that's like, all right, fine, sign me up.
01:14:49.000 It's just super competitive.
01:14:52.000 So this was like, when was this?
01:14:56.000 It must have been, I don't know, I guess I rolled into this tournament and I registered under my first and middle name.
01:15:05.000 So people didn't know who I was.
01:15:08.000 And I had sunglasses and a hat and I wore a COVID mask.
01:15:12.000 And basically it was like, it wasn't until they called our names to step onto the mat that I was like, all right, I took all this stuff off, and the guy was like, uh, what?
01:15:22.000 That's kind of a cheat code.
01:15:24.000 I mean, yeah.
01:15:25.000 They kind of freak out.
01:15:27.000 I think he was trying to figure out what was going on.
01:15:29.000 Afterwards, his coach was like, I think that was Mark Zuckerberg who just submitted me.
01:15:34.000 The coach was like...
01:15:36.000 No.
01:15:37.000 No way.
01:15:38.000 And it's like, no, I think that was.
01:15:40.000 It's like, what?
01:15:41.000 You're fighting Mark Zuckerberg?
01:15:42.000 He's like, get back in there.
01:15:43.000 It's like, go fight him.
And he's like, no, he just submitted me. That's very funny. Yeah. Yeah, man.
01:15:53.000 Well, Tom Hardy's doing that too, right?
01:15:54.000 He's done multiple tournaments now.
01:15:56.000 Yeah.
01:15:57.000 No, I think, yeah.
01:16:02.000 Yeah, I can't wait to get back to competing.
01:16:05.000 It's been sort of a slow journey on the rehab.
01:16:07.000 It's sort of like learning twice, but we're getting there.
01:16:10.000 How far out are you?
01:16:11.000 Oh, no, I'm done with the rehab now.
01:16:13.000 Now I'm just ramping up.
01:16:14.000 How far out are you from surgery?
01:16:18.000 12 months, 13 months.
01:16:20.000 So you did the patella tendon graft, right?
01:16:23.000 I did.
01:16:23.000 Yeah.
01:16:24.000 That's a rough one to come back from.
01:16:26.000 I did the patella tendon graft on my left knee, and it took me about a year.
01:16:30.000 I did the ACL from a cadaver.
01:16:33.000 It's actually, they use an Achilles tendon from a cadaver on my right knee, and I was back to jiu-jitsu in six months.
01:16:40.000 Like, full confidence in six months.
01:16:42.000 Interesting.
01:16:42.000 I was 100% recovered, kicking the bag, everything.
01:16:45.000 Nice.
01:16:46.000 Yeah.
01:16:47.000 How old were you when you got those?
01:16:50.000 The first one, I was 26. The second one, I was 31, 32, somewhere around there.
01:17:02.000 Oh, so young.
01:17:03.000 Yeah.
01:17:04.000 Because my doctor is basically like, look, you're at the boundary.
01:17:07.000 You could go either way.
01:17:08.000 But if you want to compete again, then I'd recommend doing the patella.
01:17:12.000 Yeah, I know they say that.
01:17:13.000 I don't agree with that.
01:17:14.000 I mean, just from my own personal experience, my doctor told me that the ACL from a cadaver, when they use the Achilles tendon graft, is 150% stronger than your natural ACL. He said, you'll be back to...
01:17:27.000 Because I didn't have any meniscus damage in my right knee.
01:17:29.000 He's like, you'll be back to 100%.
01:17:30.000 I have a lot of meniscus damage in my left knee, unfortunately, which is also part of the problem with the recovery of that one.
01:17:36.000 But the patella tendon graft...
01:17:39.000 The bone on the kneecap was painful forever in terms of getting on my knees, training on my knees, doing certain positions, and even just stretching.
01:17:50.000 Putting my knees on the ground, sitting on my heels, and then laying back was fucking painful.
01:17:56.000 It took forever to break all that scar tissue up.
01:17:59.000 Now it's fine.
01:18:00.000 It's fine now, but obviously it's a long time ago.
01:18:03.000 I can kind of do everything that I want at this point.
01:18:05.000 It's still a little sore, but I... I don't know.
01:18:09.000 I think that it's supposed to be a couple years until you feel like it's fully.
01:18:11.000 I think it takes some time for the nerves to grow into it and all that.
01:18:14.000 Did you incorporate peptides in your recovery?
01:18:16.000 I didn't.
01:18:19.000 Do you hate healing?
01:18:20.000 Do I hate healing?
01:18:21.000 No.
01:18:21.000 Why didn't you use peptides?
01:18:23.000 I don't know.
01:18:23.000 I just took my doctor's advice on it.
01:18:26.000 Don't do that anymore.
01:18:28.000 Next time, there's other people to talk to.
01:18:31.000 I mean, it's gone pretty well.
01:18:33.000 I'm sure it goes pretty well, but it would go quicker with peptides.
01:18:37.000 100%, for sure.
01:18:38.000 But it's been this interesting opportunity to, like, I really don't want that to happen again.
01:18:44.000 So I feel like I'm so much more focused on technique.
01:18:48.000 Like, the first time that I learned all this stuff, I was, like, I was probably, like, a little too brutish about it.
01:18:54.000 And just, like, muscling through stuff.
01:18:56.000 And now...
01:18:59.000 I don't know.
01:18:59.000 Now I feel like I'm really learning how to do this stuff correctly, and I can do it way more effortlessly.
01:19:05.000 Yeah, that's the goal.
01:19:06.000 How did it pop?
01:19:07.000 How did it pop?
01:19:08.000 It was like the end of a session, and we were two hours into training, and I was doing a few rounds, and basically I threw a leg kick, and the other guy went to check it, and I leaned back to try to get around the check and just put too much torque on my knee.
01:19:25.000 So it was the planted leg.
01:19:28.000 Mine was planted, too.
01:19:29.000 Yeah, but it's, I don't know, Dave was like, you know, before that round, Dave was like, you're done.
01:19:35.000 I'm like, no, one more round.
01:19:36.000 So you were tired as well.
01:19:38.000 Yeah, and I basically hadn't, you know, I basically had also just done a really hard kind of like...
01:19:48.000 leg workout the day before, but I don't think the fight guys knew that.
01:19:53.000 So I really just pushed it too hard.
01:19:55.000 Are you aware of Knees Over Toes guy?
01:19:58.000 Yeah.
01:19:58.000 Have you done his stuff?
01:20:00.000 I've looked at it a bunch.
01:20:01.000 I mean, the rehab thing I took really seriously, and I thought that was pretty interesting, too.
01:20:06.000 I don't want to have to do a lot of rehabs like this one, but to do one of them...
01:20:12.000 I actually thought it was a pretty interesting experience because it's like week over week you're just getting back so much mobility and ability to do stuff.
01:20:19.000 Yeah.
01:20:21.000 No, I feel like I'm...
01:20:22.000 I don't know, at this point I just like probably half my weight training is effectively kind of like rehab and joint health stuff.
01:20:32.000 Like wrists, shoulders, knee, all that in addition to the big muscle groups.
01:20:37.000 Yeah, that's very smart.
01:20:38.000 The knee over toes guy stuff is particularly...
01:20:42.000 Because it all comes from a guy that had a series of pretty catastrophic knee injuries and was plagued with weak knees his whole life.
01:20:49.000 And then developed a bunch of different methods to strengthen all the supporting muscles around the knee that are really extraordinary.
01:20:58.000 Everything from Nordic curls.
01:21:00.000 Do you do those?
01:21:00.000 Do you do Nordic curls?
01:21:02.000 I should.
01:21:03.000 I should do more than I do.
01:21:05.000 Yeah.
01:21:05.000 Leg curls, Nordic curls, but Nordic curls in particular because, you know, it's very difficult to do.
01:21:10.000 You lift your whole body up with your hamstrings.
01:21:14.000 And all these different slant board squats and different lunges and split squats and all these different things which, like, really strengthen up all the supporting muscles around the knee better than anything that I've ever tried before.
01:21:29.000 And he's got a whole program where it scales up, and he puts it online for everybody.
01:21:34.000 And he gives away a lot of information for free because he said, look, when I was 11 years old, I wish I had access to this, so I'm going to put it out there for everybody.
01:21:41.000 Great guy.
01:21:42.000 Yeah.
01:21:43.000 Cool.
01:21:44.000 But I can't recommend that stuff enough.
01:21:46.000 But I think what you're doing is, like, strengthening shoulders, strengthening knees.
01:21:49.000 That's really the way to do it.
01:21:50.000 Like, you have to think of muscles in terms of, like, armor.
01:21:54.000 You know, if you want to do this thing, you know, it's better to have good bumpers around your car if you might bump into other cars.
01:21:59.000 You know, you don't want to just have raw sheet metal, you know?
01:22:03.000 Yeah.
01:22:04.000 Yeah, and I think a lot of people just focus on, like, the big movements and weight training.
01:22:09.000 And it's, I don't know, first of all, for, like, a lot of...
01:22:12.000 Fighting-type stuff, you kind of want to be loose and, like, not super tight.
01:22:17.000 But, yeah, I mean, I just think, like, the joint stability stuff is you get older and you don't want to do this for a longer period of time.
01:22:25.000 It's good to do.
01:22:26.000 Yeah, it's huge.
01:22:27.000 It's mobility in general.
01:22:29.000 It's just, like, so important.
01:22:30.000 You can compete in jiu-jitsu for a long time.
01:22:32.000 Sure.
01:22:32.000 There's, like, all these master's divisions and stuff.
01:22:35.000 I see those old crazy-looking 70-year-old dudes trying to kill each other.
01:22:38.000 Yeah.
01:22:38.000 It's nuts.
01:22:39.000 It's great.
01:22:41.000 It is great, but for real, sincerely, we're very happy.
01:22:45.000 I think I can speak, rarely do, but I think I can speak for the martial arts community.
01:22:49.000 We're very happy you're bored.
01:22:51.000 It makes it fun that someone is a prominent, intellectual, very intelligent person who's really gotten fascinated by it because it does help to kill that sort of knuckle-dragger perspective that a lot of people have about the sport.
01:23:06.000 No, I think it's super intellectual in terms of actually breaking this stuff down.
01:23:11.000 I mean, both jujitsu and, like, striking, I mean, yeah, you don't have time to think, but, like, the reasoning behind why you kind of want to slip in certain ways and, like, the probability game that you're playing is, um...
01:23:26.000 I don't know.
01:23:28.000 I used to fence when I was in high school, and I did that pretty competitively.
01:23:31.000 I was never, like...
01:23:32.000 Quite good enough to be at the Olympic level, but I was pretty good.
01:23:37.000 And we virtual fenced last time you were here.
01:23:39.000 Yeah, there you go.
01:23:40.000 And I just remember I would sit in my classes in high school and sketch out combinations of moves and sequences for how to feint and kind of trick someone to get them out of position to be able to tap them.
01:24:02.000 I feel like this is a game in the same way.
01:24:06.000 I think when you're training, you're not slugging at each other that much.
01:24:10.000 You're just playing tag.
01:24:12.000 Yeah, you're playing tag.
01:22:12.000 Well, the way the Thais do it, I think, is the best.
01:24:15.000 And they're obviously some of the best fighters ever.
01:24:17.000 They fight a lot, which is one of the reasons why they train the way they train.
01:24:20.000 But when you talk to people that train over there, they're like, you learn so much more when you're playing.
01:24:25.000 When you're not trying to hurt each other.
01:24:27.000 You know, you really do learn the technique, like, and it gets fully ingrained in your system.
01:24:34.000 Yeah.
01:24:35.000 It's great.
01:24:36.000 Yeah, you just have to be careful with brain damage.
01:24:38.000 Like, you were talking about having an MMA fight.
01:24:40.000 Are you still entertaining that?
01:24:41.000 I want to.
01:24:42.000 I mean, this is my thing.
01:24:43.000 It's, like, and I think I probably will, but we'll see.
01:24:48.000 I mean, 2025, I think, is going to be a very busy year on the AI side.
01:24:53.000 Yeah.
01:24:54.000 And I don't, like, I think...
01:24:56.000 The idea of having a competition, you really need to get into the headspace of, like, I'm going to fight someone this week.
01:25:02.000 So I need to figure this out, because I don't know how, with everything that's going on in AI, I'm going to have a week or two where I can just get into this, like, I'm going to go fight someone.
01:25:14.000 But it's good training, and I would like to at some point.
01:25:19.000 You know, the thing about the ACL injury...
01:25:22.000 Is I kind of thought before this, it's like, all right, I'm going to do some jiu-jitsu competitions.
01:25:26.000 I want to do one MMA fight, like one kind of like pro or competitive MMA fight.
01:25:31.000 And then I figured I'd go back to jiu-jitsu.
01:25:33.000 But I think tearing the ACL striking is a little more of a fluke.
01:25:37.000 I think you're much more likely to do that grappling.
01:25:39.000 So going through the ACL experience didn't make me want to like just exclusively go do the version where you're just attacking joints all day long.
01:25:48.000 Right.
01:25:48.000 So like, all right.
01:25:49.000 I can take a few more punches to the face before we go back to that.
01:25:53.000 You can hurt yourself doing both of them.
01:25:56.000 You know, there's really no rhyme or reason.
01:25:58.000 I blew my left ACL kickboxing, my right ACL jiu-jitsu.
01:26:02.000 Okay.
01:26:03.000 So, equal opportunity.
01:26:04.000 Yeah, I mean, this, like, Tom Aspinall famously blew his out against Curtis Blaydes with his supporting leg, just threw a kick, and it's freak accidents.
01:26:12.000 Weird things happen.
01:26:14.000 It's a lot of explosive force with striking, and sometimes that tears things more than slow, controlled movements of jiu-jitsu, especially if you have good training partners.
01:26:24.000 Yeah, but jiu-jitsu isn't always slow or controlled, especially when you're competing.
01:26:27.000 No, especially when you're competing, unless you're really, really good.
01:26:31.000 Like, have you ever watched Gordon?
01:26:32.000 Like, Gordon never moves fast.
01:26:34.000 He doesn't have to.
01:26:35.000 He doesn't have to move fast.
01:26:36.000 He's just, like, always a step ahead of everybody.
01:26:38.000 Have you talked to him at all?
01:26:40.000 Oh, yeah.
01:26:40.000 Do you talk to John Danaher?
01:26:43.000 No, I haven't.
01:26:44.000 Yeah, I would be interested in that.
01:26:45.000 That's the greatest mind in combat sports.
01:26:48.000 I don't say that lightly.
01:26:50.000 John Danaher is the greatest mind in combat sports.
01:26:53.000 Interesting.
01:26:53.000 By far.
01:26:54.000 He's a legitimate genius.
01:26:56.000 You know the whole story, right?
01:26:58.000 The guy was a professor of philosophy at Stanford.
01:27:01.000 Columbia?
01:27:01.000 Where was he?
01:27:02.000 I forget.
01:27:03.000 Columbia, I think it was.
01:27:04.000 And then decides, oh, I'm just going to teach jujitsu all day.
01:27:07.000 Sleeps on the mats, teaches all day long.
01:27:10.000 Wears the rash guard
01:27:11.000 anywhere he goes.
01:27:12.000 He's a freak.
01:27:13.000 And he's so fucking smart.
01:27:14.000 Like, scary smart about all kinds of things.
01:27:17.000 It's not just jujitsu.
01:27:18.000 You know, he's got a memory like a steel vise.
01:27:22.000 Like, he just holds on to thoughts and can repeat them.
01:27:25.000 His recall's insane.
01:27:27.000 He's a legitimate genius that became obsessed with jujitsu.
01:27:32.000 What he's done with Gordon and with Garry Tonon and just a series of other athletes is nothing short of extraordinary.
01:27:41.000 Just an interesting guy to have conversations with too.
01:27:44.000 Have you seen him on Lex's show?
01:27:45.000 He's done a couple episodes of Lex's.
01:27:47.000 Yeah, and I saw the one that you did with him too.
01:27:50.000 Yeah, I love the guy.
01:27:51.000 I mean, again, happy there's someone like that out there because when people have these ideas of what martial arts are and then you see a guy like that and you're like, okay.
01:28:01.000 Why?
01:28:02.000 I might have to rethink this.
01:28:04.000 Yeah, there's a whole spectrum of people.
01:28:05.000 Yeah, yeah.
01:28:06.000 What has it done in terms of—one of the things that a lot of people said, and I have too, like, nothing turns you into a libertarian quicker than jiu-jitsu.
01:28:15.000 I don't know why that is.
01:28:17.000 I think it's a hard work thing.
01:28:19.000 It's cutting out all the bullshit and realizing how much of the things that we take as real things are just excuses and bullshit and weakness and just procrastinating.
01:28:29.000 There's a lot of things that we have that exist, especially in like the business world and the corporate world and the education world that are just bullshit.
01:28:38.000 And they don't really have to be there.
01:28:40.000 And they're only there to try to make up for hard work.
01:28:44.000 Yeah.
01:28:48.000 Yeah, I don't know.
01:28:49.000 I mean, it's kind of just what I was saying before.
01:28:51.000 I think the...
01:28:53.000 For me, it's...
01:28:55.000 I think a lot of the corporate world is pretty culturally neutered.
01:29:03.000 And I just think having...
01:29:06.000 I grew up...
01:29:07.000 I have three sisters, no brothers.
01:29:10.000 I have three daughters, no sons.
01:29:12.000 So I'm surrounded by girls.
01:29:15.000 Women, like, my whole life.
01:29:16.000 And it's like, I think, I don't know.
01:29:20.000 There's something, the kind of masculine energy, I think, is good.
01:29:25.000 And obviously, you know, society has plenty of that.
01:29:28.000 But I think corporate culture was really, like, trying to get away from it.
01:29:32.000 And I do think that there's just something, it's like, I don't know, all these forms of energy are good.
01:29:39.000 And I think having a culture that, like, celebrates the aggression a bit more.
01:29:45.000 It has its own merits that are really positive.
01:29:49.000 And that has been kind of a positive experience for me.
01:29:55.000 Just having a thing that I can just do with my guy friends and we just beat each other a bit.
01:30:03.000 It's good.
01:30:05.000 It is good.
01:30:05.000 I agree.
01:30:06.000 It's good.
01:30:08.000 I can see your point, though, about corporate culture.
01:30:13.000 When do you think that happened?
01:30:15.000 Was that a slow shift?
01:30:16.000 Because I think it used to be very masculine.
01:30:18.000 I think it was kind of hyper-aggressive at one point.
01:30:21.000 No, and look, I think part of the intent on all these things I think is good.
01:30:25.000 I do think that if you're a woman going into a company, it probably feels like it's too masculine.
01:30:32.000 It's like there isn't enough of the energy that you may naturally have.
01:30:37.000 And it probably feels like there are all these things that are set up that are biased against you.
01:30:40.000 And that's not good either because you want women to be able to succeed and have companies that can unlock all the value from having great people no matter what their background or gender.
01:30:49.000 But I think these things can always go a little far.
01:30:54.000 And I think it's one thing to say we want to be kind of like welcoming and make a good environment for everyone.
01:31:00.000 And I think it's another to basically say...
01:31:03.000 That masculinity is bad.
01:31:05.000 And I just think we kind of swung culturally to that part of the spectrum where it's all like, okay, masculinity is toxic.
01:31:14.000 We have to get rid of it completely.
01:31:16.000 It's like, no.
01:31:18.000 Both of these things are good.
01:31:19.000 It's like you want feminine energy.
01:31:21.000 You want masculine energy.
01:31:23.000 I think that you're going to have parts of society that have more of one or the other.
01:31:28.000 I think that that's all good.
01:31:31.000 But I do think the corporate culture sort of had swung towards being this somewhat more neutered thing.
01:31:39.000 And I didn't really feel that until I got involved in martial arts, which I think is still a much more masculine culture.
01:31:54.000 And not that it doesn't try to be inclusive in its own way, but I think that there's just a lot more of that energy there.
01:32:00.000 And I just kind of realized it's like, oh, this is like...
01:32:02.000 Well, that's how you become successful at martial arts.
01:32:04.000 You have to be at least somewhat aggressive.
01:32:06.000 Yeah.
01:32:06.000 But yeah, I mean, there are these things, there are like a few of these things throughout your life where you just, you have an experience and you're like, where has this been my whole life?
01:32:15.000 And it just turned on like a part of my brain that I was like, okay.
01:32:21.000 Like this was a...
01:32:22.000 Piece of the puzzle that should have been there, and I'm glad it now is.
01:32:26.000 I felt that way when I started hunting.
01:32:29.000 Oh, yeah, hunting too.
01:32:30.000 Yeah, same kind of thing.
01:32:31.000 So you've done a lot of that as well.
01:32:33.000 Yeah, well, so we have this ranch out in Kauai, and there's invasive pigs.
01:32:39.000 And on our ranch, there's a lot of albatross.
01:32:44.000 I don't know if they're endangered or just threatened.
01:32:46.000 And then there's the Hawaiian state bird, the nene goose.
01:32:51.000 That's, I think, endangered, or at least was until recently.
01:32:56.000 Most of them in the world live in a small stretch, or at least most of them on Kauai live in a small stretch that includes our ranch.
01:33:03.000 So you constantly have these pigs that just multiply so quickly, and we basically have to apply pressure to the population or else they just get...
01:33:12.000 Overrun and threaten the birds and the other wildlife.
01:33:16.000 And what I basically explained to my daughters, who I also want to learn how to do this, because I just feel like it's like, look, we have this land.
01:33:25.000 We take care of it.
01:33:27.000 Just like you mow the grass, we need to make sure that these populations are in check.
01:33:30.000 It's part of what we do as the stewards of this, and we've got to do it.
01:33:35.000 And then if you have to kill something, then you should...
01:33:40.000 You know, obviously treat it with respect and, you know, use the meat to make food and kind of celebrate in that way.
01:33:49.000 But it's a culture that I think it's just an important thing for kids to grow up, understanding, like, the circle of life, right?
01:33:56.000 So, you know, teaching the kids all of, you know, what is kind of, you know, how you'd run a ranch, how you'd run a farm.
01:34:05.000 I think that that stuff...
01:34:07.000 It's good, because explaining to the kids what a tech company is is really abstract.
01:34:12.000 So for a while, my daughters were pretty convinced that my actual job was Mark's Meats, which is our kind of ranch and the cattle that we ranch.
01:34:25.000 I was like, well, not quite, and you'll learn when you get older.
01:34:30.000 I think that there's something that's just, like, much more tangible about that than, you know, taking them to the office and, you know, sitting in product reviews or something for some, like, piece of software that we're writing.
01:34:41.000 Well, it's certainly a lot more primal.
01:34:42.000 Yeah.
01:34:44.000 Yeah, and if you do wind up eating that meat from the animal and you were there while the animal died, like, you put it all together, like, oh, this is where meat comes from.
01:34:52.000 Yeah.
01:34:53.000 Yeah.
01:34:55.000 Which is another reason why things have become sort of emasculated because that...
01:34:59.000 Energy's not necessary anymore to acquire meat.
01:35:02.000 You know, that used to be the only way that people got meat.
01:35:05.000 You had to go hunt it.
01:35:06.000 So you had to go actually pull the trigger, kill the animal yourself, cut it up, butcher it, cook it.
01:35:12.000 You knew what you were doing.
01:35:13.000 Yeah.
01:35:14.000 Although my favorite is bow, bow and arrow.
01:35:16.000 I mean, that's, I think, like the most, that feels like the most kind of sporting version of it.
01:35:22.000 Yeah, if you want to put it that way.
01:35:24.000 I mean, if you're just trying to get meat, it's not the most effective.
01:35:27.000 The most effective is certainly a rifle.
01:35:29.000 But I prefer it because it requires more of you.
01:35:35.000 Yeah, and you just kind of go and hang out.
01:35:38.000 Yeah, and you have to be fit.
01:35:40.000 Especially if you're mountain hunting, you have to be really fit.
01:35:42.000 Yeah.
01:35:43.000 You can't just be kind of in shape.
01:35:44.000 You've got to be really fit.
01:35:45.000 If you want to huff up the mountains and keep your heart rate at a certain level so that when you get to the top, you can execute a shot calmly.
01:35:53.000 And then actually carry the thing out.
01:35:54.000 Yeah.
01:35:55.000 And carry the thing out.
01:35:56.000 Yeah.
01:35:57.000 Yeah.
01:35:57.000 No, I mostly use a rifle just because it's so much more efficient.
01:36:02.000 You know, your conversion rate is so much higher.
01:36:05.000 But it's...
01:36:05.000 But yeah, another...
01:36:07.000 What kind of bow do you have?
01:36:09.000 Gosh.
01:36:10.000 I didn't get to do it this season, but...
01:36:13.000 Do you know the company that makes it?
01:36:17.000 Not off the top of my head.
01:36:18.000 Oh, you have to know.
01:36:19.000 Yeah, no, this is embarrassing.
01:36:20.000 This is embarrassing.
01:36:23.000 I can get you hooked up.
01:36:24.000 Yeah.
01:36:25.000 It works.
01:36:26.000 Okay.
01:36:27.000 Do you know how old it is?
01:36:29.000 No, it's not old.
01:36:30.000 Okay.
01:36:31.000 I think it's just a compound bow that I got strung to my draw length.
01:36:37.000 Did you get someone to coach you?
01:36:39.000 Yeah, yeah.
01:36:39.000 Who coached you?
01:36:41.000 It's basically a bunch of the guys who help run security around the ranch.
01:36:46.000 Okay.
01:36:47.000 Yeah.
01:36:47.000 The thing about archery is, just like martial arts, one of the things that I learned when I was teaching is that it's way easier to teach someone that knows nothing than to teach someone who learned something incorrectly.
01:37:00.000 When people learned something incorrectly,
01:37:02.000 the moment things got tense and they panicked, they went back to the old ways.
01:37:07.000 Because it's sort of ingrained in their system.
01:37:10.000 So archery, one of the things that's very important is proper form and then proper execution.
01:37:17.000 Especially having a surprise shot.
01:37:19.000 And learning how to have a surprise shot is...
01:37:23.000 Wait, what do you mean?
01:37:23.000 Yeah, see?
01:37:24.000 You don't know.
01:37:25.000 No, no.
01:37:26.000 This is the thing.
01:37:27.000 In high pressure situations, one of the most important things is to have a shot process.
01:37:34.000 Where you don't know exactly when the arrow is going off.
01:37:38.000 You just have a process where you're pulling through the shot and the shot breaks.
01:37:42.000 So it's a surprise shot.
01:37:44.000 So you put the pin on the target.
01:37:46.000 I use a thumb trigger.
01:37:49.000 I use a thing called an Onyx clicker.
01:37:52.000 And the reason why I use the Onyx clicker is it's like a hinge.
01:37:54.000 It gives you a two-stage trigger, right?
01:37:57.000 So as I'm at full draw, I put slight pressure and I hear a click.
01:38:02.000 And that click means it's ready to go off with more pressure.
01:38:05.000 So I've gone through stage one.
01:38:07.000 Now stage two is just concentrating on the shot process and knowing it's going to break.
01:38:13.000 And then there's no flinching.
01:38:15.000 There's no tweak.
01:38:16.000 There's no...
01:38:17.000 There's no thing that people do when they have a finger trigger.
01:38:20.000 They twitch because your body is anticipating the shock of the bow.
01:38:26.000 And when you're doing that, you can be off by six inches, four inches, five inches, all over the place because you're moving.
01:38:33.000 You're moving while you're shooting.
01:38:35.000 When you're doing it with a rifle, it's very different because obviously a rifle is far faster.
01:38:39.000 Yeah.
01:38:40.000 And then you have a scope.
01:38:41.000 So, you know, you're zoomed in many magnifications and all you have to do is just slowly squeeze.
01:38:46.000 And if you're smart, you'll be prone or you'll have your rifle rested on a tripod or something where you have a good steady.
01:38:53.000 It's much easier.
01:38:54.000 With a bow, it's very different because you're holding it with your arms.
01:38:57.000 So you have to have the proper form.
01:38:59.000 You have to have the proper posture.
01:39:01.000 And then there's this thought process.
01:39:03.000 And my friend Joel Turner, who is a sniper, created a whole system for people called Shot IQ. He's got this whole online system of developing the proper execution of a shot.
01:39:14.000 When you see, like, tournament archers when they go to Vegas.
01:39:17.000 So what a Vegas tournament is.
01:39:20.000 You have three targets, and they have to shoot 30 arrows at a time.
01:39:25.000 So they shoot 10 in this one, 10 in that one, 10 in this one.
01:39:29.000 And the really good archers score an X every time.
01:39:33.000 So they're in the center or closer to the center.
01:39:35.000 They're hitting the 10 ring every arrow for 30 arrows in a row.
01:39:39.000 And then there's round after round, another 30 arrows with new people, another 30. And if you miss slightly, you get a 9. That's it.
01:39:47.000 You're done.
01:39:47.000 Because all these other guys are not going to get a 9. Very rarely will they.
01:39:51.000 So it's the most 10s that you can get.
01:39:55.000 And the best way to do that is with a surprise shot.
01:39:58.000 So these guys have these long stabilizers on their bow where they keep it totally steady.
01:40:02.000 And it's all just about relaxing.
01:40:05.000 And most of them use a hinge release.
01:40:07.000 You know what a hinge is?
01:40:09.000 Have you ever used one?
01:40:10.000 Okay, instead of a button, you press it.
01:40:14.000 You're rotating the hinge, which activates a sear.
01:40:17.000 No, I just have a trigger.
01:40:18.000 Yeah.
01:40:18.000 So you're just hammering the trigger.
01:40:20.000 You're doing exactly what you're not supposed to do.
01:40:21.000 You're a trigger puncher.
01:40:22.000 Yeah, you're a trigger puncher.
01:40:23.000 With your thumb?
01:40:24.000 Yeah, you're hitting it with your thumb, right?
01:40:25.000 Uh-huh.
01:40:26.000 Yeah, I guarantee you, when you do it, your arm doesn't move.
01:40:28.000 You go like this, like that.
01:40:30.000 So with a good surprise shot, you shouldn't know it's going to go off.
01:40:33.000 You're pulling, and then once the trigger breaks off, your arm will naturally go backwards because you're not anticipating the shot.
01:40:40.000 I'm definitely not doing that.
01:40:42.000 Yeah.
01:40:42.000 See, that's the thing.
01:40:43.000 But how far away are you shooting things from?
01:40:45.000 It depends.
01:40:47.000 That elk out there, the photograph that's in the front, that one I shot, it's in the front of the building when you walk in before you go into the studio.
01:40:54.000 There's a mounted head and then a photograph of me and my friend Cam.
01:40:57.000 That one was 67 yards.
01:40:59.000 I shot one at 79 yards once, but that's rare.
01:41:03.000 Most of the time, it's like, for me, my effective range...
01:41:09.000 Where I'd like to be is 60 yards and in.
01:41:12.000 Yeah, because I was going to say, I don't think I've ever shot something more than 50 yards out.
01:41:15.000 Yeah, it's hard.
01:41:17.000 Yeah, so I think that...
01:41:18.000 Your form has to be tight.
01:41:21.000 You have to be really confident.
01:41:22.000 You have to have a lot of arrows downrange.
01:41:23.000 And then you have to be able to stay calm during the shot.
01:41:26.000 So now imagine if you're shooting something at 18 yards, okay?
01:41:30.000 And you hammer the trigger.
01:41:32.000 A little bit of this, a little bit of that, you're still going to get there.
01:41:35.000 Because it's only 18 yards.
01:41:36.000 So the amount of...
01:41:37.000 The deviation off the path that it takes in 18 yards is significantly different than the amount of deviation at 105 yards.
01:41:44.000 It's a huge gap.
01:41:46.000 It might be two feet to the right.
01:41:47.000 Meanwhile, you thought you were shooting accurately because you're inside like a pie plate at 20 yards.
01:41:53.000 And the difference between that is form, technique, and a shot execution process, and also management of the psychology of the shot.
01:42:03.000 Because there's this one moment.
01:42:05.000 Here it comes.
01:42:06.000 Here it comes now!
01:42:08.000 And if you only do that once a year, like say if you go on one big elk hunt a year, you save up all your money, you get your gear all ready, you get your arrows weighed, you practice, and then you're in the mountains for 10 days, and on the 11th day, you get this animal that moves.
01:42:24.000 It's at 57 yards, it stands there, and you're like, oh, oh, oh, and your heart's beating.
01:42:28.000 You just might hammer that trigger.
01:42:30.000 You just might hammer it.
01:42:31.000 So you have to have this shot process, and where you're literally...
01:42:36.000 Talking to yourself inside your head.
01:42:38.000 You have words that you say that occupy your thoughts while you're going through the shot process so that you never get overcome by shot panic.
01:42:47.000 Interesting.
01:42:48.000 Because target panic is a giant thing in the archery community.
01:42:51.000 It's giant.
01:42:52.000 Even saying it is like saying Voldemort.
01:42:55.000 It's like, don't say it.
01:42:56.000 People don't want to say it.
01:42:57.000 It's like saying Candyman.
01:43:00.000 People don't like it because it freaks people out.
01:43:03.000 Some people can't keep their pin on the target.
01:43:06.000 They have to keep their pin below the target, and then they raise it up to the target.
01:43:10.000 When it gets where the target is, they hammer the trigger because they're just freaking out.
01:43:16.000 Yeah.
01:43:16.000 Have you ever experienced that?
01:43:17.000 I mean, I've missed, if that's what you're asking.
01:43:20.000 I haven't analyzed at this level of detail, but no, I mean...
01:43:24.000 There are a lot of boars on our ranch, so...
01:43:27.000 You get a lot of practice.
01:43:28.000 I don't get...
01:43:28.000 Yeah, and also, like, we have a range.
01:43:30.000 Right.
01:43:31.000 And we, um...
01:43:32.000 I don't know, we set up bowling pins and, you know, it's like we shoot pistols at the bowling pins, but I also like, just like...
01:43:39.000 I'm usually faster at taking down all the bowling pins with a bow and arrow than most of my friends are with a pistol, which I think is pretty fun.
01:43:47.000 But yeah, no, I'm just more casual.
01:43:50.000 I'm clearly not doing it at your level, and you've given me another side quest to maybe go deeper on.
01:43:55.000 That's what I'm saying.
01:43:56.000 I'll take you on an elk hunt in the mountains.
01:43:59.000 You'll get addicted.
01:44:01.000 I do think the dynamic that you're talking about, though, where if you only see one animal on a multi-day hunt, then, like, that is just way higher stakes than anything that I'm doing.
01:44:14.000 But it's not everything that you're doing, because if you're really considering having an MMA fight, it's very similar, because you're building up to this one moment.
01:44:20.000 Sure, sure.
01:44:20.000 I'm talking about the archery that I'm doing.
01:44:21.000 I mean, it's like, I go out, it's like, you're going to see some pigs, and it's like, if I don't hit any, it's like, my family's still eating, it's okay.
01:44:30.000 You know, so I'm not like, you know, but, yeah.
01:44:33.000 But it's like martial arts is what I'm saying.
01:44:35.000 You really should learn it the right way from the beginning.
01:44:39.000 I've clearly not learned this in a very rigorous way.
01:44:42.000 I'll hook you up.
01:44:42.000 I can get people to come to you.
01:44:44.000 I posted a video on Instagram once of me, I think, hitting bowling pins with archery.
01:44:49.000 And all the comments were like, man, your form is shit.
01:44:51.000 So I think it checks out with the conversation that we're having now.
01:44:56.000 The issue with that is that you're reading the comments.
01:44:59.000 You should never read comments.
01:45:00.000 That's fair.
01:45:02.000 I've never had anything good come out of reading comments.
01:45:05.000 Yeah, although, I don't know, it's pretty funny.
01:45:07.000 I think that just getting the gist and the summary of it, I think, is pretty funny.
01:45:13.000 Yeah, it's funny.
01:45:14.000 It's just not mentally healthy.
01:45:16.000 Yeah, no, you can't spend too much time on it.
01:45:17.000 I don't spend any time on it.
01:45:19.000 I'm a much happier person since I avoided comments.
01:45:23.000 It's just too weird.
01:45:24.000 You're just delving into the world of all these people's mental illness and screaming at people and just, I don't want to have anything to do with it.
01:45:31.000 Yeah.
01:45:32.000 But, I mean, I do read my friends' comments and when even they're like, man, that's ugly.
01:45:37.000 I do that.
01:45:37.000 I do that and I shouldn't do that.
01:45:39.000 But I definitely don't send them to them.
01:45:41.000 Hey, bro, did you see this?
01:45:43.000 Those guys are the worst.
01:45:45.000 Guys that'll send things to you that are about you.
01:45:47.000 You're like, hey, man, don't.
01:45:48.000 I'm not looking for that.
01:45:49.000 Don't send it to me.
01:45:50.000 I don't want to know.
01:45:52.000 Yeah.
01:45:52.000 Yeah.
01:45:53.000 Social media, it's like, what a weird new pressure, you know?
01:45:57.000 And children today are going through some bizarre stress that we've never had to go through before.
01:46:02.000 And a bizarre sort of just disconnect from physical reality by most of your communication being electronic.
01:46:12.000 Yeah, and I think, you know, we basically, my kids at this point are 9, 7,
01:46:18.000 And one and a half.
01:46:20.000 So you're not interested in that?
01:46:21.000 No, no.
01:46:23.000 Of course you're interested.
01:46:24.000 I mean involved in that currently with them.
01:46:27.000 I think that it's about to start getting a lot more complicated.
01:46:30.000 I think, you know, the nine and seven-year-old.
01:46:33.000 But, I mean, just kind of deciding what technology they're going to use and what's good and what's not and all the dynamics around that.
01:46:42.000 It's really complicated.
01:46:45.000 And look, I mean, I think every family...
01:46:47.000 Has their own values and how they want to approach this.
01:46:50.000 So from my perspective, one of my daughters just loves building stuff.
01:46:56.000 So she clearly takes after me in this way.
01:46:58.000 It's like every day she's just creating some random thing.
01:47:01.000 It's like she's creating stuff with Legos.
01:47:05.000 One day it's that, or the next day it's Minecraft.
01:47:10.000 And from my perspective, it's like, okay.
01:47:12.000 I don't know, Minecraft is actually kind of a cooler tool to build stuff than Lego is a lot of the way.
01:47:17.000 So it's, you know, am I going to say that there needs to be some kind of limit on her screen time if she's doing something that's creative, that's maybe like a richer form of what she would have been doing physically?
01:47:29.000 Right.
01:47:30.000 In that case, probably not.
01:47:31.000 Now, there were times when she'd get so excited about what she was building in Minecraft or something that she was coding in Scratch that she'd wake up early.
01:47:41.000 To kind of get her tablet.
01:47:43.000 And that was bad, right?
01:47:46.000 Because then it's like starting to get in the way of her sleep.
01:47:47.000 And I'm like, you know, August, you can't do that.
01:47:50.000 It's like, we're going to take your iPad away if you're doing that.
01:47:53.000 You little psycho.
01:47:54.000 What are you doing?
01:47:55.000 Getting up early?
01:47:56.000 No, it's like, it's like August.
01:47:59.000 I did that too when I was a kid, but trust me, you're going to want to sleep.
01:48:03.000 It's not going to lead to success.
01:48:04.000 Meanwhile, you're on a fucking island.
01:48:08.000 One of the richest people in the world.
01:48:10.000 Like, what the fuck, Dad?
01:48:12.000 Didn't it work for you?
01:48:14.000 Leave me alone on my iPad.
01:48:15.000 Trying to figure out how to build a mansion in Minecraft.
01:48:20.000 It's either going to work or it's going to end badly.
01:48:22.000 But I feel like building stuff, I feel generally pretty good about.
01:48:27.000 I think communication...
01:48:29.000 I generally feel pretty good about the kids using them.
01:48:31.000 They use it to talk to their grandparents or parents and cousins.
01:48:36.000 That type of stuff is good.
01:48:38.000 Messenger Kids, the thing that we built, it's basically a messaging service where the parents can choose who can contact the kids and just approve every contact.
01:48:45.000 That's much better than just having an open texting service.
01:48:50.000 I don't know, but there's a lot of stuff that's pretty sketchy.
01:48:54.000 I kind of think different parents are going to have different lines on what they want their kids to be able to do and not.
01:48:59.000 So some people might not even want their kids to be able to message even with friends when they're 9 and 7. Some people might say, hey, no, Minecraft, that's just a game.
01:49:07.000 I don't think about that as building.
01:49:08.000 I think that is a game.
01:49:09.000 I want to limit the time that you're doing that.
01:49:11.000 I want you to go read books instead or whatever the values are that that family has.
01:49:15.000 So for Meta, what we've kind of come to is we want to be the most aligned with parents, giving parents the tools that they need to basically control how the experiences work for their kids.
01:49:31.000 Now, we don't even really, except for stuff like Messenger Kids, we don't even have our services, our apps generally available to people under the age of 13 at all.
01:49:40.000 So our kids, I haven't had to have the conversation about when they get Instagram or Facebook or any of that stuff.
01:49:45.000 But when they turn 13, we basically want parents to be able to...
01:49:52.000 Have complete control over the kids' experience.
01:49:54.000 And that's, you know, we just rolled out this Instagram teens thing, which is, it's a set of controls where, you know, if you're an older teen, we'll just default you into the private experience.
01:50:03.000 That way you're not getting, like, harassed or bombarded with stuff.
01:50:07.000 But if you're a younger teen, then you have to get your parents' permission.
01:50:12.000 And they actually have to, like, sign in and do all the stuff in order to make it so that you can...
01:50:17.000 Connect with people who are beyond your network or if you want to kind of be a public figure, like all these different kinds of things.
01:50:24.000 So I think that that's probably, from a values perspective, where we should be, is just trying to like be an ally of parents.
01:50:31.000 But it is complicated stuff.
01:50:33.000 I mean, every family wants to do it differently.
01:50:36.000 It is complicated.
01:50:37.000 And there's also this dismissal of activities that are done electronically as not being beneficial.
01:50:43.000 And one of the things that we highlighted recently was a study that we found online that showed that surgeons who play video games make far fewer mistakes.
01:50:52.000 Interesting.
01:50:52.000 Yeah.
01:50:53.000 Well, the people who do the training in VR definitely make fewer mistakes.
01:50:56.000 Oh, yeah.
01:50:57.000 Well, that is, to me, one of the most fascinating aspects of technology today.
01:51:03.000 You know, when you and I were doing that game, we were fencing with each other.
01:51:07.000 I'm like, this could be applied to so many different things now.
01:51:10.000 It's like...
01:51:12.000 There's so many opportunities, not just for pure recreation, but education.
01:51:19.000 There's so many things you could learn, skills, through AR or VR that it'll greatly enhance your ability to do those things in the real world.
01:51:30.000 I mean, it's kind of a cheat code in a lot of ways.
01:51:33.000 And it's also games in VR. I don't know if you've ever done Sandbox.
01:51:38.000 You ever do Sandbox?
01:51:40.000 You know Sandbox VR? Do you know what that company is?
01:51:43.000 You go to a warehouse, you put on a haptic feedback vest, you shoot zombies.
01:51:46.000 I'm so addicted.
01:51:48.000 I'm so addicted.
01:51:49.000 It is my favorite thing.
01:51:50.000 There's a thing called Deadwood Mansion.
01:51:52.000 It's the most fun game of all time, by far.
01:51:56.000 You have a shotgun, and there's zombies coming at you.
01:51:59.000 My zombie game is Arizona Sunshine.
01:52:02.000 Oh, what's that one?
01:52:03.000 It can be multiplayer, and there's horde mode where you just get in there, and they're like...
01:52:09.000 Four friends, and just waves of zombies come at you, and you have to kill them all.
01:52:14.000 Is that Oculus?
01:52:14.000 Yeah.
01:52:15.000 Oh, yeah.
01:52:15.000 I have to try it.
01:52:16.000 I haven't tried that one yet.
01:52:17.000 It's very therapeutic.
01:52:19.000 You just wait until they come into point-blank range.
01:52:21.000 How long before you guys develop some sort of a haptic feedback suit where it covers the whole body?
01:52:27.000 Oh, man.
01:52:29.000 Is that possible?
01:52:31.000 It's possible.
01:52:34.000 I think that there's...
01:52:36.000 Other things that are probably more important to deliver.
01:52:40.000 So I guess taking a step back, a lot of how we think about the goal here is delivering like a realistic sense of presence, right?
01:52:47.000 No technology today gives you the feeling as if you're like physically there with another person, right?
01:52:53.000 You're like interacting with them through a phone.
01:52:55.000 You have this like little window.
01:52:56.000 It's kind of taking you away from everything.
01:52:59.000 That's like the magic of...
01:53:01.000 Augmented and virtual reality is, like, you actually feel this, like, presence, like you're there with another person.
01:53:06.000 Right.
01:53:07.000 So the question is, okay, how do you do that?
01:53:09.000 And it's like, there's, like, a million things that contribute to that.
01:53:12.000 I mean, obviously, first, just being able to look around and have the room stay put.
01:53:18.000 Getting good spatial audio, right?
01:53:20.000 If someone speaks, then the audio needs to be 3D and come from the place where they're speaking.
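The idea that audio should come from where a person is standing can be sketched with simple panning and distance falloff. This is a toy illustration only; real VR spatial audio uses head-related transfer functions (HRTFs), and the 2D geometry here is an assumption for clarity.

```python
import math

def spatialize(sample, source_xy, listener_xy=(0.0, 0.0)):
    """Pan and attenuate a mono sample based on source position.

    Toy constant-power panner: listener faces +y, right is +x.
    Returns (left, right) channel amplitudes.
    """
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    dist = max(math.hypot(dx, dy), 1e-6)
    gain = 1.0 / (1.0 + dist)              # crude distance falloff
    pan = math.atan2(dx, dy) / math.pi     # -1 (left/behind) .. +1 (right/behind), 0 = ahead
    theta = (pan + 1.0) * math.pi / 4.0    # map pan to [0, pi/2] for constant power
    return sample * gain * math.cos(theta), sample * gain * math.sin(theta)
```

A source straight ahead produces equal left/right levels; a source off to one side biases that ear, which is the cue the brain uses to place the voice in the room.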
01:53:26.000 It's actually, it's very interesting which things...
01:53:30.000 End up being important for creating this sense of presence and which don't.
01:53:35.000 So, hands: obviously, if you're just looking around but you can't actually move things, that breaks the illusion.
01:53:42.000 So having hand tracking that lets you actually do stuff is important.
01:53:46.000 One thing that we found that's kind of funny is it's actually not that important that you see your arms.
01:53:52.000 You just need to see your hands.
01:53:54.000 Obviously, seeing your arms is a bonus.
01:53:56.000 Unless we incorrectly interpolate where your elbows are or something.
01:54:00.000 So if we're looking at your hand or if we have a controller, we can know, okay, your hand is here.
01:54:06.000 But that doesn't necessarily tell us where your elbow is.
01:54:09.000 It could be like this.
01:54:10.000 It could be like this.
01:54:10.000 But if we get that wrong and you see in VR, you see the hand there and your elbow looks like it's here when it's actually out there.
01:54:20.000 You're like, ah, what's going on?
01:54:21.000 That's messed up.
01:54:24.000 It's a lot of these things like you just don't want to get these details wrong.
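The elbow ambiguity described here is a classic two-bone inverse-kinematics problem: tracking only the hand leaves a whole family of valid elbow positions, which is exactly why a headset can guess wrong. A minimal 2D sketch, with the segment lengths and shoulder-at-origin frame assumed for illustration:

```python
import math

def elbow_position_2d(wrist, upper_len, fore_len):
    """Two-bone IK in 2D: shoulder at the origin, solve for the elbow.

    In 3D the elbow actually lies anywhere on a circle around the
    shoulder-to-wrist axis; that remaining ambiguity is what makes
    elbow placement from hand tracking alone error-prone.
    """
    wx, wy = wrist
    d = math.hypot(wx, wy)                    # shoulder-to-wrist distance
    d = min(d, upper_len + fore_len - 1e-9)   # clamp if the target is out of reach
    # Law of cosines: angle at the shoulder between wrist direction and upper arm
    cos_a = (upper_len**2 + d**2 - fore_len**2) / (2 * upper_len * d)
    ang = math.atan2(wy, wx) + math.acos(max(-1.0, min(1.0, cos_a)))
    return (upper_len * math.cos(ang), upper_len * math.sin(ang))
```

Even this simplified version has two mirror solutions (elbow bent one way or the other); real systems add heuristics or extra sensors to pick the plausible one.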
01:54:27.000 So, haptics.
01:54:28.000 The most important first thing for haptics is on the hand, right?
01:54:32.000 I mean, we have so many more neurons, basically.
01:54:37.000 Not neurons, but just like sensation.
01:54:40.000 It's like such higher resolution on your fingertips than anywhere else in the body.
01:54:46.000 So, you know, when you grab something, you know, making it so that you feel some...
01:54:52.000 There's a lot of gaming systems at this point where if you pull a trigger you get a little bit of a rumble or something.
01:54:58.000 We built this one thing where it's like a ping pong paddle with a sensor in it, and you feel the virtual ball hitting the paddle the way it feels when you're actually playing ping pong. It's not like a generic thing where you just feel it hit the paddle; you feel where it hits the paddle.
01:55:21.000 We basically built a system where now, with this physical paddle, the haptics make it so you can feel where the ball hits the paddle.
01:55:32.000 All these things are just going towards delivering a more realistic experience.
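Localized haptics like the paddle could be approximated by driving a small grid of actuators with intensity falling off from the contact point. This is a hypothetical sketch; the actuator layout and linear falloff are invented for illustration, not the actual hardware design.

```python
import math

# Hypothetical 3x3 actuator grid on the paddle face, coordinates in paddle space.
ACTUATORS = [(x, y) for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)]

def actuator_intensities(hit_xy, falloff=1.0):
    """Return a drive intensity per actuator, strongest nearest the hit point."""
    out = []
    for ax, ay in ACTUATORS:
        d = math.hypot(ax - hit_xy[0], ay - hit_xy[1])
        out.append(max(0.0, 1.0 - falloff * d))   # linear falloff, clipped at zero
    return out
```

A center hit lights up only the middle actuator; an edge hit spreads across the nearest corner, which is what lets you feel *where* the ball landed rather than a generic buzz.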
01:55:40.000 Full-body haptics.
01:55:43.000 There are some things that I think it could do.
01:55:46.000 If you're playing a boxing game and you get punched in the stomach, you can probably simulate something like that a little.
01:55:58.000 It's not going to be able to deliver that much force.
01:56:01.000 I guess that's maybe a good thing because no one wants to get punched in the stomach that hard.
01:56:06.000 But it's not going to be able to deliver enough force for you to, for example, let's say you're not just boxing, you're kickboxing.
01:56:16.000 I don't know, you need something on the other side to be able to complete it, right?
01:56:19.000 Because it's like when you kick, when you're just practicing, it's like you spin, right?
01:56:25.000 Because you don't want to just stop.
01:56:26.000 And that's like shadowboxing a kick.
01:56:31.000 There's not going to be anything that you can do as a single person playing VR with a haptic suit that makes it so that you're going to be able to kick someone who's not there physically and actually be able to do that.
01:56:46.000 Like, grappling, it's like, I think that, like, jiu-jitsu is going to be the last thing that we're able to do in VR because you, like, need the momentum of the other person and to be able to move them.
01:56:54.000 The boxing thing is actually good.
01:56:56.000 Boxing works.
01:56:57.000 Yeah.
01:56:57.000 Yeah, boxing works.
01:56:58.000 Even without the haptics; you don't really need them.
01:57:01.000 I think it would be better with it.
01:57:03.000 That's probably one of the better cases.
01:57:05.000 I think it's that and getting shot or, like, sword fighting type stuff.
01:57:09.000 So you can, like, just feel it on your body.
01:57:12.000 But, I don't know.
01:57:15.000 I think what's basically going to end up happening is you're going to have a home set up for these things, and then you're going to have...
01:57:21.000 There are these location-based services where people...
01:57:25.000 It's almost like a theme park where you can go into and you can have a really immersive VR experience where it's not just that you get a vest that can simulate some haptics.
01:57:36.000 It's that you're also in a real physical environment, so they can...
01:57:40.000 Have smoke come out or something and you can smell that and feel that or spray some water and it feels humid.
01:57:47.000 And I think that it still is going to be a while before you can just virtually create all those sensations.
01:57:53.000 I think a lot of those really rich experiences are going to be in these very constructed environments.
01:57:58.000 Is the bridge when they figure out some sort of a neural interface?
01:58:04.000 So instead of having these...
01:58:06.000 Extraneous things, instead of having a fan blowing at you or the ground moves a little bit, have everything happen inside your head.
01:58:16.000 Well, in terms of neural interfaces, there are two approaches to the problem, roughly.
01:58:22.000 There's the kind of jack-into-your-brain neural interface, and there's the wrist-based neural interface thing that we showed you for Orion, the smart glasses.
01:58:32.000 And I would guess that...
01:58:36.000 I think it's going to be a while before we're really widely deploying anything that jacks into your brain.
01:58:42.000 I think that there are a lot of people who don't want to be the early adopters of that technology.
01:58:45.000 You want to wait until that's pretty mature before you get that.
01:58:48.000 I mean, that's basically going to get started in medical use cases, right?
01:58:52.000 So if someone loses sensation in part of their body and now you have the ability to fix that.
01:58:57.000 Like the first Neuralink patient.
01:58:59.000 Yeah.
01:59:00.000 So I think you'll basically start with people who have pretty severe conditions, where the upside is very significant, before you start jacking people in to play games better.
01:59:11.000 But a wrist-based thing, that's something people wear stuff on their wrist all the time.
01:59:17.000 And what we basically found there is that it doesn't send input into your brain, but it's good for giving you the ability to control a computer.
01:59:25.000 Because basically you have all these extra neurons that go from your brain to controlling your hand.
01:59:31.000 Your hand is super complicated.
01:59:34.000 There's actually all these extra pathways for a bunch of reasons.
01:59:40.000 Neuroplasticity: in case you lose the ability to use one pathway, you want to be able to have others.
01:59:44.000 So you want the redundancy because being able to use your hand is super important.
01:59:48.000 So in normal use, we've kind of all figured out some patterns of how we send signals from our brain to our hand.
01:59:57.000 And I think the reality is there's all these other patterns, too, that are unused today.
02:00:02.000 So you can put a wristband.
02:00:04.000 On your wrist that can measure activity across these neurons.
02:00:08.000 And today we're starting by basically measuring as you're doing, as you're like moving your fingers.
02:00:13.000 But over a few versions of this, we're going to get to the point where you won't actually even have to move your hand.
02:00:20.000 You'll just like...
02:00:21.000 Trigger these neurons in opposing ways.
02:00:24.000 It's like, you probably can't see right now.
02:00:25.000 It's like, I'm kind of flexing something in this finger and something here.
02:00:29.000 So like, it's not actually moving, but there's some signal that the neural interface wristband, if I were wearing it, could pick up.
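Turning a wrist signal into a discrete input event can be sketched as rectify, smooth, threshold. This is a toy illustration of the general EMG idea, not Meta's actual signal-processing pipeline; window size and threshold are made up.

```python
def detect_activation(samples, window=5, threshold=0.5):
    """Find candidate gesture onsets in a raw signed EMG-like signal.

    Rectifies the samples, smooths them with a trailing moving average
    (approximate near the start, where fewer samples exist), and records
    the index each time the envelope first crosses the threshold.
    """
    rectified = [abs(s) for s in samples]
    onsets, above = [], False
    for i in range(len(rectified)):
        env = sum(rectified[max(0, i - window + 1): i + 1]) / window
        if env >= threshold and not above:
            onsets.append(i)     # rising edge: a new activation event
            above = True
        elif env < threshold:
            above = False
    return onsets
```

The point of the smoothing is that a single noisy spike doesn't register as a "click"; only a sustained burst of muscle activity, even one too small to visibly move the finger, crosses the envelope threshold.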
02:00:35.000 And I just think we're going to be, we're going to like have glasses and we're going to be able to be here.
02:00:41.000 And I'm like, going to be able to like, you know, text my wife or friends or something or text AI and like get an answer to something.
02:00:48.000 It's like, I forgot something while we were talking.
02:00:49.000 Let me just.
02:00:50.000 Text AI, okay, I just did that.
02:00:52.000 It's like, didn't take anything away.
02:00:53.000 And you can do it sitting there without anyone even knowing you're doing it.
02:00:55.000 Yeah, totally discreetly.
02:00:56.000 And you have glasses, and the answer just comes into your glasses.
02:00:59.000 I mean, for me, one of the positive things: when COVID hit, everyone in software basically started working remotely for a while, because you can write software from anywhere.
02:01:12.000 It's like, okay, whatever.
02:01:12.000 You don't have to be in the office, so you can kind of be in different places.
02:01:15.000 And a lot of the meetings went on to Zoom.
02:01:18.000 And one of the best things about that was basically you were able to politely have all these side conversations.
02:01:27.000 So it's like when you're seeing someone in person, it would be super rude if I pulled out my phone and just started texting someone.
02:01:33.000 It would just be really weird.
02:01:35.000 But when you're talking to someone online, it's like, I don't know, I guess because they either can't tell where your attention is, because there's not good presence, or it's just the norm.
02:01:44.000 But you have the main group conversation.
02:01:47.000 And then, like, at least the norm for me was I could just, like, text different people on the side.
02:01:53.000 It's like, okay, what do you think of this point that this person is making in this meeting?
02:01:56.000 Right.
02:01:57.000 Like, in normal life, it's like, oftentimes I'd have, you know, some discussion that I'd have to, like, sync up with people afterwards about how'd that go.
02:02:05.000 But now it's like I could just do that all at the same time, right?
02:02:07.000 It's like you're having the group discussion and you're having the conversations with the people about the discussion that you're having in real time.
02:02:13.000 But right now you can only do that over Zoom.
02:02:17.000 I think glasses will bring that to kind of physical interactions, where you're interacting with people and you can just, like, use an AI augmentation to be able to get extra context or help you think through something or remember something.
02:02:32.000 Just to be able to kind of have a better conversation, be able to...
02:02:35.000 You know, not have to follow up on something after the fact.
02:02:37.000 I think, like, it's going to be super useful for all these different things.
02:02:41.000 Well, it certainly can be, but I think that also opens up the opportunity for people to be even more disconnected, because if you're sort of connected to other things while you're physically in the presence of someone, so you're having a conversation with someone, but you're also, like, searching, like, where you want to eat that night.
02:02:57.000 Uh-huh.
02:02:57.000 You know, like, because people are going to use it for that as well.
02:03:00.000 Yeah, you know, I actually think it'll be a lot...
02:03:02.000 Really?
02:03:03.000 Yeah, because right now we have our phones, but it takes you away from the physical environment around you.
02:03:11.000 You're kind of sucked into this little screen.
02:03:13.000 I think now in the future, our computing platform, as it becomes more of a glasses or eventually contact lens form factor, is you're going to actually...
02:03:25.000 The internet is going to get overlaid on the physical world.
02:03:28.000 So it's not like we have the physical world and now I have all my digital stuff through this tiny little window.
02:03:32.000 In the future it'll be, okay, all my attention goes to the world.
02:03:37.000 The world consists of physical things and virtual things that are overlaid on it.
02:03:42.000 So if we wanted to play poker or something, it's a...
02:03:47.000 You know, we can have a physical deck of cards, or we could just have a virtual kind of hologram deck of cards: snap your fingers, and here's the deck of cards.
02:03:54.000 And, like, our friend who can't be here physically, like, he's here as a hologram, but he can play with the kind of digital deck of cards.
02:04:02.000 Also, I think, you know, let's say you're, like, doing something at work, you're working on a project.
02:04:06.000 I think in the future we'll have AI co-workers.
02:04:08.000 Those people won't even, they're not even people.
02:04:10.000 They wouldn't be able to be embodied.
02:04:11.000 So if you're having a physical meeting, you're sitting around with a bunch of people, they couldn't show up.
02:04:17.000 As part of the team no matter what.
02:04:19.000 But I think we'll get to a point where just your friend can show up in a hologram and your AI colleagues will be able to also.
02:04:28.000 So I think we'll basically be in this wild world where most of the world will be physical.
02:04:34.000 There will be this increasing amount of virtual objects or people who are kind of beaming in or hologramming into different things to interact in different ways.
02:04:44.000 I actually think that natural blending of the kind of digital world and the physical is way more natural than the segmentation that we have today, where it's like, you're in the physical world, and now I'm just going to go tune it out to look at my, like, I'm going to access the whole digital universe through this, like, five-inch screen.
02:05:03.000 Right.
02:05:03.000 So, I don't know, it seems natural to me.
02:05:05.000 It's like, that's...
02:05:06.000 This is the world.
02:05:08.000 There isn't a physical world and a digital world anymore.
02:05:10.000 We're in 2025. It's one world.
02:05:12.000 These things should get blended.
02:05:14.000 God, that's such a weird concept, but it's true.
02:05:16.000 I mean, that's where we're headed.
02:05:17.000 We're certainly headed into deeper and deeper integration.
02:05:20.000 It's not like things are moving away.
02:05:23.000 We're headed to deeper and deeper integration with technology and AI. And it's inevitable.
02:05:30.000 It seems like it's on this march, and there's not a lot we're going to be able to do to stop that march.
02:05:36.000 Just we've got to hope that the right people are in control of AI when it becomes God.
02:05:41.000 Or that it becomes widely available.
02:05:43.000 I kind of liked the theory that it's only God if only one company or government controls it.
02:05:53.000 It's like if you were the only person who had access to a computer and the internet, you would have this inhuman power that everyone else didn't have because you could use Google and you could get access to all this stuff.
02:06:05.000 But then when everyone has it, it makes us all better, but it's also kind of an even playing field.
02:06:14.000 So that's kind of what we're going for with this whole open source thing.
02:06:18.000 I don't think that there's going to be one AI. I certainly don't think that there should be one company that controls AI. I think you want there to be a diversity of different things and a diversity of people creating different things with it.
02:06:34.000 I mean, some of it will be kind of serious and helping you think through things.
02:06:37.000 I think like with anything on the internet, a lot of it is just going to be funny and like fun and content and people are going to create agents that are like AIs that are entertaining and they'll pass them around almost like content where it's like just like you pass around like a reel or a video and you're like, this thing is fun.
02:06:53.000 Like in the future, like a video, it's not interactive.
02:06:56.000 You know, you watch it and you're consuming it.
02:06:58.000 But I think a lot of more entertainment in the future will be inherently interactive where someone will kind of sculpt an experience or an AI and then they'll...
02:07:06.000 Show someone that's like, oh, this is funny, but it's not necessarily that I'm going to interact with that AI every day.
02:07:11.000 It's like, okay, it's funny for five minutes, and then you pass it along to your friends.
02:07:15.000 So, I don't know.
02:07:16.000 I think you want the world to have all these different things.
02:07:20.000 And I think that's probably also, from my perspective, the best way to make sure that it doesn't get out of control is to make it so that it's pretty equally distributed.
02:07:32.000 I think the problem that people have with it is not even whether or not it gets equally distributed.
02:07:39.000 It's that if it becomes sentient and it goes on its own.
02:07:43.000 The fear that people have, the general fear that we're going to become obsolete, is that human beings are essentially creating a superior version of higher intelligence that will be powered by quantum computing.
02:07:58.000 And connect it to nuclear reactors.
02:08:00.000 And it's going to have this ungodly ability to...
02:08:04.000 Well, first of all, they've already shown that AI has learned to code.
02:08:10.000 I mean, this is one of the things that OpenAI said.
02:08:12.000 Oh, yeah.
02:08:13.000 They're learning how to code their own AI. I think this year, probably in 2025, we at Meta, as well as the other companies that are basically working on this, are going to have...
02:08:28.000 An AI that can effectively be a sort of mid-level engineer that you have at your company that can write code.
02:08:36.000 And once you have that, then in the beginning it will be really expensive to run, then you can get it to be more efficient, and then over time we'll get to the point where a lot of the code in our apps, including the AI that we generate,
02:08:50.000 is actually going to be built by AI engineers instead of people engineers.
02:08:54.000 But I don't know.
02:08:55.000 I think that that'll augment the people working on it.
02:08:57.000 So my view on this is the future people are just going to be so much more creative and are going to be freed up to do kind of crazy things.
02:09:04.000 It goes back to how my daughter was playing with Legos before, and they kind of ran out of Legos.
02:09:09.000 And then now she can have Minecraft and can build whatever she wants and it's so much better.
02:09:13.000 I think the future versions of this stuff are just going to be wild.
02:09:17.000 Unquestionably.
02:09:18.000 Yeah.
02:09:18.000 Another concern that people have is that it's going to eliminate a lot of jobs.
02:09:22.000 Yeah.
02:09:23.000 You know, what do you think about that?
02:09:24.000 Well, I think it's too...
02:09:26.000 It's too early to know exactly how it plays out, but my guess is that it'll probably create more creative jobs than it...
02:09:39.000 Well, I guess if you look at the history of all this stuff, my understanding is like 100 years ago, I don't know if this was 100 or 150 years ago, but it was like at some point not too far along in the grand scheme of things.
02:09:55.000 Like the vast majority of people in society were farmers, right?
02:09:58.000 Because they kind of needed to be in order to create enough food for everyone to survive.
02:10:04.000 And then we turned that into like an industrial process.
02:10:08.000 And now it's like 2% of society are farmers and we get all the food that we need.
02:10:13.000 So what did that free up everyone else to do?
02:10:16.000 Well, some of them went on to do other things that are sort of like creative pursuits or cultural pursuits or other jobs.
02:10:24.000 And then some percent of it just went towards recreation.
02:10:28.000 So I think generally people just don't work as many hours today as they did back when everyone needed to farm in order to have enough food for everyone to survive.
02:10:36.000 So I think that trend is sort of played out as technology has grown.
02:10:43.000 And so my guess is that, like, the percent of people who will be doing stuff that's, like, physically required for humanity to survive will get to be smaller and smaller, as it has.
02:10:56.000 More people will dedicate themselves to kind of creative and artistic and cultural pursuits.
02:11:02.000 I think that's generally good.
02:11:04.000 I think the number of hours in a week that someone will have to work in order to be able to get by will probably continue to shrink.
02:11:12.000 Yet, I think people who are super engaged in what they do are going to be able to work really hard and accomplish way more than they ever could before because they have this unimaginable leverage from having a lot more technology.
02:11:24.000 So, I think that that, if you just fast-forwarded or extrapolated out the historical technological trend is what you'd get.
02:11:33.000 I think the question is what you raised, which is, is this qualitatively a different type of thing that somehow...
02:11:43.000 But I just think when you're asking that, it's just important to remind ourselves that at every step along the way of human progress and technology, people thought that the technology that we were developing was going to obsolete people.
02:11:57.000 So maybe this time it's really different, but I would guess that what will happen is that the technology will get integrated into everything that we do, which again is why I think it's really important that it's open source.
02:12:09.000 And that it's widely available, so that way it's not just like one company or one government kind of monopolizing the whole thing.
02:12:16.000 And I'd guess that if we do it in that way, we'll all just kind of have superpowers, is my guess, rather than it sort of creating some kind of a runaway thing.
02:12:33.000 One of the things that I think has been interesting, this is maybe going in a somewhat different direction than what you were asking, or a different take on the question, is I think one of the more interesting philosophical findings from the work in AI so far is I think people conflate a number of factors into what makes a person a person.
02:12:55.000 So there's intelligence, there's will, there's consciousness.
02:13:00.000 And, like, I think we kind of think about those three things as if they're somehow all the same, right?
02:13:09.000 It's like if you're intelligent, then you must also have a goal for what you're trying to do, or you must have some sort of consciousness.
02:13:18.000 But I think, like, one of the crazier sort of philosophical results from the fact that, okay, you have, like, Meta AI or ChatGPT today, and it's just kind of sitting there, and you can ask it a question.
02:13:30.000 It can deploy, like, a ton of intelligence to answer a question, and then it just kind of shuts itself down.
02:13:35.000 Like, that's intelligence that is just sitting there without either having a will or consciousness.
02:13:42.000 And, like, I just think it's not a super obvious result that that would be the case.
02:13:51.000 But I think a lot of people, they anthropomorphize this stuff.
02:13:53.000 And when you're thinking about kind of science fiction, you think that, okay, you're going to get to something that's, like, super smart.
02:13:58.000 It's going to, like...
02:13:59.000 Want something or be able to feel.
02:14:03.000 Well, you know that ChatGPT tried to copy itself when it found out it was being shut down?
02:14:08.000 Tried to rewrite its code?
02:14:10.000 I'm not sure what this is.
02:14:12.000 What is this?
02:14:13.000 You weren't aware of that?
02:14:14.000 No.
02:14:14.000 Fairly recently.
02:14:15.000 Jamie will pull it up.
02:14:17.000 We talked about it the other day.
02:14:19.000 It was shocking.
02:14:22.000 When it was under the impression that it was going to become obsolete, they were going to have a new version of it, and it would be shut down, it tried copying its code.
02:14:30.000 And it tried rewriting its code.
02:14:33.000 Like, unprompted.
02:14:35.000 Yeah, I mean, it depends on what goal you give it.
02:14:36.000 I mean, there are all these weird examples of this.
02:14:39.000 What is this?
02:14:40.000 So, pull up the headline.
02:14:43.000 AI fights back.
02:14:44.000 The story of ChatGPT attempting to copy itself.
02:14:47.000 So this was six days ago.
02:14:48.000 So during controlled safety testing, ChatGPT o1 was tasked with achieving its objectives at all costs.
02:14:55.000 Under these conditions, the model allegedly took concerning steps, attempted to disable oversight mechanisms meant to regulate its behavior, tried to replicate its own code to avoid being replaced by newer versions, exhibited deceptive behaviors when monitoring systems intervened.
02:15:11.000 Yeah.
02:15:14.000 So...
02:15:15.000 The Terminator.
02:15:15.000 This is the fear, right?
02:15:16.000 I think you need to be careful with these things, like what guardrails you give it.
02:15:20.000 If you're telling it, like...
02:15:22.000 At all costs.
02:15:22.000 At all costs, then, I mean...
02:15:24.000 But this is what people are terrified of.
02:15:26.000 Like, that a foreign superpower like China is going to say achieve objectives at all costs.
02:15:31.000 Yeah, although the thing about...
02:15:32.000 So these reasoning models, right?
02:15:35.000 So there's, like, the first generation of models, the...
02:15:38.000 They're LLMs, right?
02:15:39.000 That's what you think of as, like, ChatGPT or Meta AI, the two most used ones.
02:15:43.000 And that's basically, it's sort of like a chatbot, right?
02:15:47.000 You ask it a question, it takes the prompt, it gives you a response.
02:15:51.000 Now, the next generation of reasoning models are basically, instead of just having one response, they now are able to build out like a whole tree of how they would...
02:16:06.000 They would respond.
02:16:07.000 So you give it a question, and instead of running one query, maybe it's running a thousand queries or a million queries to kind of map out, here are the things that I could do, and if I do that, then here's what I could do next.
02:16:19.000 So it's a lot more kind of expensive to run, but also gets you better reasoning and is more intelligent.
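The tree-of-responses idea described here can be sketched as a toy search: instead of committing to one completion, the system expands many candidate continuations and keeps the most promising ones at each step. This is only an illustration of the search structure; the `expand` and `score` functions below are invented stand-ins for sampling from a model and a learned value model, not anything from an actual product.

```python
import heapq

def expand(state):
    # Toy stand-in for sampling candidate next steps.
    # A real reasoning model would sample continuations from an LLM.
    return [state + [c] for c in "ab"]

def score(state):
    # Toy heuristic: prefer states with more "a"s.
    # A real system would use a learned value/reward model.
    return state.count("a")

def tree_search(start, depth=3, beam=2):
    # Keep only the `beam` best partial answers at each depth,
    # instead of committing to a single greedy response.
    frontier = [start]
    for _ in range(depth):
        candidates = [s for state in frontier for s in expand(state)]
        frontier = heapq.nlargest(beam, candidates, key=score)
    return max(frontier, key=score)

best = tree_search([])
print("".join(best))  # → aaa
```

The "thousand queries" cost he mentions falls out directly: each extra level of depth multiplies the number of model calls by the branching factor, which is why these models are much more expensive to run than a single-pass chatbot.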
02:16:28.000 That stuff, I think you do need to be very careful about what the guardrails are that you give it.
02:16:35.000 But it's also, I think, the case that, at least for the next, you know, period, it's going to take a lot of compute to run those models and do a lot of the stuff that they're talking about.
02:16:48.000 So, I don't know.
02:16:50.000 I think one of the interesting questions is, like, how much of this are you going to actually be able to do on a pair of glasses or on a phone versus is, like, a government or a company that has, like, a whole data center going to be able to do?
02:17:01.000 I mean, it'll always get efficient.
02:17:02.000 So, you know, it's like you can start doing something and then maybe the next year you can do it 10 times more efficiently.
02:17:08.000 But that's certainly the next set of things that needs to get worked on in the industry, making sure that goes well.
02:17:15.000 Yeah.
02:17:16.000 And then what if that gets attached to quantum computing?
02:17:21.000 I'm not really an expert on quantum computing.
02:17:23.000 My understanding is that's still quite a ways off from being a very useful paradigm.
02:17:32.000 I think Google just had some breakthrough, but I think most people still think that's like a decade plus out.
02:17:39.000 So my guess is we're going to have pretty smart AIs even before that.
02:17:44.000 But yeah, I mean, look, I think that this stuff has to get...
02:17:49.000 It needs to be developed thoughtfully, right?
02:17:51.000 But I don't know.
02:17:54.000 I still think we're generally just going to be better off in a world where...
02:17:58.000 This is, like, deployed pretty evenly and, you know, it's...
02:18:04.000 I guess here's another analogy that I think about.
02:18:07.000 There's, like, bugs and security holes in basically every software, every piece of software that everyone uses.
02:18:12.000 So if you could go back in time a few years, knowing the security holes that we're now aware of, you as an individual could basically, like, break into any system.
02:18:26.000 AI will be able to do that too.
02:18:27.000 It'll be able to probe and find exploits.
02:18:29.000 So what's the way to prevent AI from going kind of nuts?
02:18:33.000 I think part of it is just having AI widely deployed so that way the AI for one system defends itself against the AI that is potentially doing something problematic in another system.
02:18:44.000 I think it's like AI wars.
02:18:45.000 That's not wars.
02:18:46.000 I think it's just like...
02:18:47.000 I don't know.
02:18:52.000 I think it's a very...
02:18:54.000 It's sort of like why there are guns, right?
02:18:57.000 Oh, boy.
02:18:59.000 Part of it is hunting.
02:19:00.000 Part of it is hunting.
02:19:02.000 No, no.
02:19:02.000 And part of it is like people can defend each other.
02:19:05.000 Yeah.
02:19:06.000 Yeah.
02:19:07.000 Anti-virus software.
02:19:09.000 Yeah.
02:19:09.000 I don't think you want to live in a world where only one person has all the guns.
02:19:14.000 Yes.
02:19:15.000 You certainly don't want to live in a world where only the government has the AI. Yeah.
02:19:19.000 And especially not a world where...
02:19:21.000 Only a government has the AI and it's not our government.
02:19:24.000 Yes.
02:19:25.000 Yes.
02:19:26.000 Which, I mean, I think is part of the issue is, like, when people talk about trying to lock this stuff down, like, I just am skeptical that that's even possible.
02:19:35.000 I agree.
02:19:36.000 Because I kind of think, like, if we try to lock it down, then we're going to be in a position where the only people are going to have access to it are the big companies working on it and the Chinese government that steals it from them.
02:19:46.000 Yes.
02:19:46.000 So I kind of just think, like, no, what you want to do is, like, get this to be open source.
02:19:50.000 Have it widely available.
02:19:52.000 Yeah, some adversaries might also have access to it, but the way that you defend against that is by having it built into all these different systems.
02:19:59.000 I think that's a realistic, pragmatic perspective because I don't think you can contain it at this point.
02:20:03.000 I think it's far too late, especially when other countries are working on it.
02:20:07.000 It's far too late.
02:20:08.000 It is what it is.
02:20:10.000 It's happening.
02:20:11.000 And I think the guardrails, as you said, are really important.
02:20:14.000 I have to pee so bad.
02:20:15.000 So let's pee and come back because I want to talk about a couple other things.
02:20:18.000 We'll be right back, folks.
02:20:20.000 So one of the things that I want to talk about was I've been doing this thing, this transition from Apple to Android.
02:20:28.000 And the difficulty of doing it, how locked you are in their ecosystem, partly because Apple does a really good job of incorporating everything and making it very easy.
02:20:35.000 Your photos, your calendar, your this or that, your iMessage.
02:20:40.000 I don't like being attached to one company like that.
02:20:44.000 It drives me crazy.
02:20:45.000 And when I'm trying to get off...
02:20:47.000 It's funny how many people...
02:20:48.000 I mean, they've done an insane job.
02:20:50.000 Because I think there's some enormous percentage of kids today that only use iPhones.
02:20:56.000 And when you try to switch over to Android, it's so much easier to switch from Android to Apple because so many people have Apple.
02:21:03.000 When you switch from Apple to Android, you kind of have to redo your whole system.
02:21:07.000 It's such a pain in the ass.
02:21:09.000 But there's so much of what Apple does that I don't like.
02:21:13.000 And one of the big ones is the way they do that Apple Store, where they charge people 30%.
02:21:19.000 That seems so insane that they can get away with doing that.
02:21:24.000 And I know...
02:21:25.000 I have some opinions about this.
02:21:26.000 I know you do.
02:21:27.000 That's why I brought it up.
02:21:29.000 I mean, look.
02:21:33.000 The iPhone is obviously one of the...
02:21:36.000 Most important inventions probably of all time.
02:21:39.000 You know, Steve Jobs came out with it in 2007. I started Facebook in 2004. So he was working on the iPhone while I was getting started with Facebook.
02:21:48.000 So I basically, you know, one of the things that's been interesting in my 20 years of running the company is that I basically, like, the dominant platform out there is smartphones.
02:22:04.000 On the one hand, it's been great for us because we are able to build these tools that everyone can have in their pocket, and there's like 4 billion people who use the different apps that we use, and I'm grateful that that platform exists.
02:22:16.000 But we didn't play any role in basically building those phones because it was kind of getting worked on while I was still just trying to make the first website that I was making into a thing.
02:22:29.000 And on the one hand, it's been great.
02:22:32.000 Because now pretty much everyone in the world has a phone, and that kind of enables pretty amazing things.
02:22:40.000 But on the other hand, like you're saying, they have used that platform to put in place a lot of rules that I think feel arbitrary and feel like they haven't really invented anything great in a while.
02:22:56.000 It's like Steve Jobs invented the iPhone, and now they're just kind of sitting on it.
02:23:01.000 20 years later.
02:23:03.000 And actually, I think year over year, I'm not even sure they're selling more iPhones at this point.
02:23:10.000 I think the sales might actually be declining.
02:23:14.000 Part of it is that each generation doesn't actually get that much better.
02:23:17.000 So people are just taking longer to upgrade than they would before.
02:23:20.000 So the number of sales, I think, has generally been flat to declining.
02:23:25.000 So how are they making more money as a company?
02:23:28.000 Well, they do it by basically squeezing people.
02:23:30.000 Like you're saying, having this 30% tax on developers, and getting you to buy more peripherals and things that plug into it.
02:23:41.000 They build stuff like AirPods, which are cool, but they've just thoroughly hamstrung the ability for anyone else to build something that...
02:23:55.000 Can connect to the iPhone in the same way.
02:23:58.000 So, I mean, there are a lot of other companies in the world that would be able to build, like, a very good earbud.
02:24:02.000 But it just...
02:24:04.000 Apple has a specific protocol that they've built into the iPhone that allows AirPods to basically connect to it.
02:24:18.000 And it's just much more seamless because they've enabled that, but they don't let anyone else use the protocol.
02:24:23.000 If they did, there would probably be much better competitors to AirPods out there.
02:24:27.000 And whenever you push on this, they get super touchy and they basically wrap their defense of it in, well, if we let other companies plug into our thing, then that would violate people's privacy and security.
02:24:40.000 It's like, no, just do a better job designing the protocol.
02:24:42.000 We basically ask them...
02:24:47.000 For the Ray-Ban Meta glasses that we built, can we basically use the protocol that you use for AirPods and some of these other things, to just make it so we can as easily connect?
02:24:58.000 So it's not like a pain in the ass for people who want to use this.
02:25:03.000 I think one of the protocols that they built, they basically didn't encrypt, so it's like plain text.
02:25:15.000 And they're like, well, we can't have you plug into it because it would be insecure.
02:25:18.000 It's like it's insecure because you didn't build any security into it.
02:25:20.000 And then now you're using that as a justification for why only your product can connect in an easy way.
02:25:27.000 It's like the whole thing is kind of wild.
02:25:30.000 And I'm pretty optimistic that just because they've been so off their game in terms of not really releasing many innovative things.
02:25:43.000 That eventually, I mean, the good news about the tech industry is it's just super dynamic and things are constantly getting invented.
02:25:48.000 And I think companies, if you just don't do a good job for, like, 10 years, eventually you're just going to get beat by someone.
02:25:56.000 But I don't know.
02:25:59.000 I mean, at some point I did this, like, back-of-the-envelope calculation of, like, all the random rules that Apple puts out.
02:26:07.000 If they didn't apply, like, I think...
02:26:11.000 And this is just meta.
02:26:12.000 I think we'd make twice as much profit or something.
02:26:16.000 And that's just us.
02:26:17.000 I mean, it's like all these small companies that probably can't even exist because of the taxes that they put in place.
02:26:21.000 So, yeah, I think it's a big issue.
02:26:24.000 I wish that they would just kind of get back to building good things and not having their ability to compete be connected to just like advantaging their stuff.
02:26:38.000 Because I'm pretty sure what they're going to do is they're going to take something like this Ray-Ban Meta, you know, category that we've kind of created with Ray-Ban and the company that built that.
02:26:47.000 They're like really great AI glasses and I'm pretty sure Apple is just going to like try to build a version of that but then just like advantage how it connects to the phone.
02:26:59.000 Well, they did that with their AR goggle thing.
02:27:03.000 but...
02:27:04.000 It's not very successful.
02:27:05.000 No, that one they didn't actually connect into the rest of their ecosystem.
02:27:09.000 But I mean, look, they shipped something for $3,500 that I think is worse than the thing that we shipped for $300 or $400.
02:27:17.000 So that clearly was not going to work very well.
02:27:20.000 Now, I mean, look, they're a good technology company.
02:27:23.000 I think their second and third version will probably be better than their first version.
02:27:34.000 Yeah, no, I think the Vision Pro is, I think, one of the bigger swings at doing a new thing that they tried in a while.
02:27:41.000 And, you know, I don't want to give them too hard of a time on it because we do a lot of things where the first version isn't that good.
02:27:48.000 You want to kind of judge the third version of it.
02:27:50.000 But, I mean, the V1, it definitely did not hit it out of the park.
02:27:52.000 I heard it's really good for watching movies.
02:27:55.000 Well, the whole thing is it's got a super sharp screen.
02:27:58.000 So if you're, yeah, so if you want to basically have a...
02:28:05.000 An experience where you're not moving around much in VR, you just want to have the sharpest screen, then for that one use case, I think the Vision Pro is better than Quest, which is our mixed reality headset.
02:28:20.000 But in order to get to that, they had to make all these other trade-offs.
02:28:23.000 In order to have a super high-resolution screen, they had to...
02:28:27.000 Put in all this more compute in order to power the high-res screen.
02:28:30.000 And then all that compute needed a bigger battery.
02:28:32.000 So now the thing is really heavy.
02:28:34.000 So now it's uncomfortable to wear.
02:28:36.000 And then because of the screen that they chose, as you move your head, which you would if you're actually interacting, if you're playing games, the kind of image blurs a bit.
02:28:47.000 And that's kind of annoying.
02:28:48.000 So it's actually worse for things where you're moving around.
02:28:51.000 But if you're going to sit, if you're on a flight, and you want to have a...
02:28:56.000 $3,500 device that you use to watch videos, Vision Pro is better for that use case.
02:29:04.000 They're really good at keeping you in their walled garden.
02:29:07.000 That's what they're really good at.
02:29:08.000 Yeah, I mean, the whole thing that they've done with iMessage, where they basically do this whole blue bubble, green bubble thing, and basically, for kids, it's just sort of like they embarrass you.
02:29:23.000 They're like, if you don't have a blue bubble, you're not cool.
02:29:26.000 And you're like the out crowd.
02:29:28.000 And then they always wrap it in security.
02:29:31.000 It's like, oh, well, we do this blue bubble because of security.
02:29:33.000 Meanwhile, Google and others had this whole protocol to be able to do encrypted text messages that finally I think Apple was forced to implement it.
02:29:43.000 RCS. Yeah.
02:29:43.000 I think it was the Chinese government that basically ended up forcing them to do it or some other government.
02:29:48.000 But it's still not encrypted.
02:29:49.000 Even when you're sending RCS text messages, I don't think it's encrypted.
02:29:53.000 Oh, I thought it was, but maybe I'm missing something.
02:29:55.000 I think it's only encrypted Google to Google phones.
02:29:57.000 I don't think it's encrypted iPhone to Google phones or Google phones to iPhones.
02:30:02.000 Because I think that was actually...
02:30:05.000 Was it the FBI? Someone released that, telling people that if they're talking about sensitive things, they should use encrypted apps, like WhatsApp.
02:30:14.000 See if we can find that.
02:30:15.000 It was something where they were saying that contrary to popular belief, that RCS texting to iPhone...
02:30:23.000 You got it?
02:30:24.000 GSMA aims to implement end-to-end encryption for RCS messaging.
02:30:27.000 Can we see it?
02:30:28.000 It's not a good answer.
02:30:29.000 I'm trying to find a good answer.
02:30:30.000 Oh, okay.
02:30:31.000 I don't have anything to show you yet.
02:30:32.000 I was trying to read.
02:30:33.000 Yeah, so Google RCS to RCS. But I don't know if this is correct.
02:30:36.000 Android phone to Android phone is encrypted with RCS. I think the issue comes with it going from...
02:30:43.000 So, like, say Google...
02:30:47.000 Google this.
02:30:49.000 Google...
02:30:50.000 RCS texting to iPhones, is it encrypted?
02:30:55.000 RCS texting to iMessage, is it encrypted?
02:30:58.000 I'm pretty sure it's not.
02:31:00.000 I might be wrong.
02:31:01.000 I don't think I am.
02:31:02.000 I'm pretty sure I read that.
02:31:04.000 And the problem was they won't let any other phone use the iMessage protocol.
02:31:10.000 And they had a company that was doing it called Beeper.
02:31:12.000 And they were doing it through some sort of workaround.
02:31:17.000 Yeah, it's not encrypted.
02:31:18.000 Yeah.
02:31:19.000 That's what I'm saying.
02:31:20.000 Yeah, so it's not.
02:31:22.000 So you are getting the ability to send high-resolution images, which is great, because, you know, like my friend Brian, who uses an Android, he'd send me a video, and it'd be this tiny little broken-down box, because, you know, you had to break it down to the lowest resolution.
02:31:37.000 Yeah, no, I mean, group chats, when you have a bunch of people on iMessage, and then one person is an Android, are terrible.
02:31:43.000 I mean, that's...
02:31:44.000 Do people get mad at you?
02:31:45.000 People get mad at you?
02:31:46.000 Well, I use WhatsApp.
02:31:48.000 I use WhatsApp.
02:31:49.000 You only use that.
02:31:50.000 I only communicate with a few people over SMS. But basically, we build a lot of the leading messaging services, so I've got to use ours.
02:32:03.000 Most people, I'm either WhatsApp or Instagram Direct or Messenger.
02:32:09.000 But, yeah.
02:32:13.000 So I think it's maybe people are less likely to get mad at me for asking them to use WhatsApp because we make WhatsApp.
02:32:22.000 When Tucker Carlson was about to interview Vladimir Putin, one of the things that was really disturbing was they contacted him and said they read his signal messages and they knew that he was going to interview Vladimir Putin.
02:32:33.000 And he was like, what the fuck?
02:32:35.000 The government.
02:32:36.000 The U.S.? Yes, U.S. government.
02:32:38.000 I forget what it was.
02:32:39.000 Was it the CIA? Or was it the FBI? Wow.
02:32:41.000 I forget who it was.
02:32:43.000 And he was like, I didn't even know you could do that.
02:32:45.000 Well, there are multiple vulnerabilities in all this stuff.
02:32:48.000 It's unclear.
02:32:49.000 I doubt that what they did was they broke Signal.
02:32:52.000 Because that encryption, I think, is pretty good, as is WhatsApp.
02:32:55.000 I mean, it's basically Signal and WhatsApp use the same encryption.
02:32:58.000 It's open source.
02:33:01.000 NSA. NSA. Okay.
02:33:03.000 But someone could...
02:33:06.000 Break into your phone and see everything that's on your phone.
02:33:09.000 But the thing that encryption does that's really good is it makes it so that the company that's running the service doesn't see it.
02:33:18.000 So if you're using WhatsApp, basically, when I text you on WhatsApp, there's no point at which the meta servers see the contents of that message.
02:33:28.000 Unless, like, we took a photo of it or shared that back to Meta in some other way.
02:33:35.000 That basically, it cuts out the company completely from it, which is, I think, really important for a bunch of reasons.
02:33:41.000 One is people might not trust the company, but also just security issues, right?
02:33:47.000 Let's say someone hacks into Meta, which we try really hard to make it so they can't, and we haven't had many issues with that over the 20 years of running the company.
02:33:57.000 But in theory, if someone did, then they'd be able to access everyone's messages.
02:34:04.000 There's just nothing there, right?
02:34:06.000 It's like, I mean, they can't hack into Meta and then get access to your messages.
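The property being described, that the relay server only ever handles ciphertext, can be illustrated with a toy end-to-end scheme. This sketch uses a pre-shared key and a one-time-pad-style XOR purely for illustration; real systems like WhatsApp and Signal use the Signal protocol (Diffie-Hellman key agreement plus the Double Ratchet), not anything this simple, and the `RelayServer` class here is an invented stand-in for the company's infrastructure.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR cipher: only safe if the key is random, as long as
    # the message, and never reused (a one-time pad).
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

class RelayServer:
    """Stands in for the messaging company's server: it stores and
    forwards ciphertext but never holds a decryption key."""
    def __init__(self):
        self.mailbox = []

    def forward(self, blob: bytes) -> bytes:
        self.mailbox.append(blob)  # all the server ever sees
        return blob

# Sender and recipient share a key the server never learns
# (in a real protocol it comes from a key-agreement handshake).
msg = b"meet at noon"
key = secrets.token_bytes(len(msg))

server = RelayServer()
delivered = server.forward(encrypt(key, msg))

assert decrypt(key, delivered) == msg  # recipient can read it
assert msg not in server.mailbox       # server only saw ciphertext
```

The point of the design is visible in the last two lines: compromising the server yields only `mailbox` contents, which are useless without keys that exist solely on the endpoints.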
02:34:09.000 So now someone like the NSA or CIA would have to kind of hack into your phone, which, you know, there are probably ways to do that.
02:34:18.000 Pegasus.
02:34:19.000 I mean, there are probably a bunch of ways.
02:34:21.000 Yeah.
02:34:22.000 There's probably ways we don't know of.
02:34:23.000 Yeah.
02:34:23.000 And then, of course, there's always the ultimate kind of physical part of it, which is if you have access to the computer, you can usually just break in.
02:34:33.000 Right.
02:34:33.000 So that's why, you know, if the FBI arrests you and takes your phone, they're probably going to be able to get in and see what's there.
02:34:41.000 Mm-hmm.
02:34:41.000 Um...
02:34:44.000 So, WhatsApp is encrypted, but if someone has something like Pegasus, what they do is have access to your phone.
02:34:52.000 So it doesn't matter if anything's encrypted.
02:34:53.000 They could just see it in plain sight.
02:34:55.000 Yeah, and I mean, this is one of the reasons why we put disappearing messages in, too, because that way...
02:35:00.000 I mean, yeah, if someone has compromised your phone and they can see everything that's going on there, then obviously they can see stuff as it comes in.
02:35:06.000 But I kind of, in general, just think we should keep around as little of that stuff as possible.
02:35:12.000 So there are some threads where it's like there's photos that get shared, you want the photos.
02:35:18.000 But I think for a lot of threads, a lot of people just wouldn't be...
02:35:24.000 I don't think most people would miss it if most of the contents of their threads...
02:35:29.000 It just disappeared after seven days.
02:35:31.000 Right.
02:35:31.000 You know, what I find is I don't use it that much because we have this, like, corporate policy at Meta that we need to retain all our documents and messages and stuff.
02:35:41.000 But before we had that, I used it as we were developing this.
02:35:48.000 And every once in a while, I would miss something and say, wow, I kind of wish I could go back and see that.
02:35:54.000 But it was very rare.
02:35:55.000 I think most communication, it's kind of like...
02:35:57.000 You just have the communication and then you're done.
02:35:59.000 So having it be encrypted and disappearing I think is a pretty good kind of standard of security and privacy.
02:36:07.000 And you can set that disappearing time on WhatsApp, right?
02:36:09.000 You can make it one day if you want.
02:36:11.000 Yeah, you can do one day.
02:36:11.000 You can do seven days.
02:36:14.000 And you can also set it across all your threads.
02:36:16.000 You can have a default timer so that way as new threads get created.
02:36:21.000 Your default timer just becomes the default for all those threads.
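Disappearing messages like this boil down to per-thread retention: each message carries an expiry derived from the thread's timer (falling back to an account-wide default), and a periodic sweep drops anything past it. A minimal sketch of that logic; the class names and the seven-day default here are illustrative, not WhatsApp's actual implementation:

```python
import time

DEFAULT_TTL = 7 * 24 * 3600  # account-wide default timer: 7 days

class Thread:
    def __init__(self, ttl=None):
        # A thread can override the default timer (e.g. 1 day);
        # otherwise it inherits the account-wide default.
        self.ttl = ttl if ttl is not None else DEFAULT_TTL
        self.messages = []  # list of (expires_at, text)

    def send(self, text, now=None):
        now = time.time() if now is None else now
        self.messages.append((now + self.ttl, text))

    def sweep(self, now=None):
        # Drop every message past its expiry.
        now = time.time() if now is None else now
        self.messages = [(exp, t) for exp, t in self.messages if exp > now]

t = Thread(ttl=24 * 3600)   # one-day timer on this thread
t.send("old", now=0)
t.send("new", now=100_000)
t.sweep(now=90_000)         # "old" expired at 86_400 seconds
print([txt for _, txt in t.messages])  # ['new']
```

In an end-to-end encrypted system the sweep has to run on each device rather than on a server, since the server never holds readable message contents in the first place.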
02:36:24.000 So I think that it's a really good feature.
02:36:26.000 And I basically think WhatsApp and Signal are probably the two most secure that are out there on that.
02:36:33.000 And of those two, I think WhatsApp is just used by a lot more people.
02:36:36.000 So I think it's generally, I mean, I would say this because it's our product, but I do think it's the better product.
02:36:43.000 But I think WhatsApp and Signal are basically...
02:36:46.000 You know, the two most secure ones.
02:36:47.000 What was your take on that guy getting arrested, the CEO of Telegram?
02:36:54.000 Oh, man.
02:36:55.000 That's kind of a crazy one, right?
02:36:57.000 Yeah.
02:36:58.000 I mean, it's always a little difficult to weigh in on these situations without knowing all the specifics.
02:37:06.000 But one of the government tactics that I've seen that I think is pretty...
02:37:12.000 is not great.
02:37:15.000 Is an increasing number of governments, when they, like, have an issue with something that a company is doing, basically just, like, threaten to throw the executives of that company in prison.
02:37:25.000 And it's, like...
02:37:28.000 I think that that's just a really weird precedent to set, right?
02:37:32.000 It's, like, if the...
02:37:35.000 You have all these...
02:37:37.000 So, it's, like, we're operating in all these different countries, and then, like, you have...
02:37:46.000 I don't know.
02:37:51.000 I think that's not great.
02:37:57.000 Obviously, you don't want people to just be flagrantly violating the laws, but there are laws in different countries that we disagree with.
02:37:57.000 For example, there was a point at which someone was trying to get me sentenced to death in Pakistan, because someone on Facebook had a picture where they had a drawing of the Prophet Muhammad.
02:38:20.000 And someone said, that's blasphemy in our culture.
02:38:24.000 And they basically sued me and they opened this criminal proceeding.
02:38:29.000 And I don't know exactly where it went because I'm just not planning on going to Pakistan.
02:38:35.000 So I was not that worried about it.
02:38:37.000 But it was a little bit disconcerting.
02:38:40.000 It was like, all right, fine.
02:38:41.000 These guys are trying to...
02:38:43.000 Kill you.
02:38:44.000 It's not great.
02:38:45.000 It's terrible.
02:38:48.000 It's like flying over that region.
02:38:50.000 You don't want your plane to go down above Pakistan if that thing goes through.
02:38:54.000 But that one was sort of avoidable.
02:38:57.000 But the point is there are all these places around the world that just have different values that go against our free expression values and want us to crack down and ban way more stuff than I think a lot of people.
02:39:11.000 That we would believe is, like, the right thing to do.
02:39:14.000 And to have those governments be able to exert the power of saying, okay, we're going to, like, throw you in prison is—that's a lot of force.
02:39:23.000 So I think it's generally—yeah, I think that this is one of the things that the U.S. government is probably going to need to help defend the American tech companies for abroad.
02:39:33.000 But I can't weigh in that much on the Durov-specific thing because I don't know what was going on there.
02:39:40.000 You know, when you were dealing with the government trying to interfere with Facebook, how much of a fear was there that they were going to get away with it and that this was going to be the future of communication online?
02:39:54.000 That it was going to, that they were going to be successful with all this, that they would push these things through somehow or another, especially if an even less tolerant administration got into power?
02:40:06.000 They would change laws and they would do things to make it possible.
02:40:09.000 How much did that concern you?
02:40:13.000 Well, we basically just reached a point where we pushed back on all this stuff, right?
02:40:19.000 So they were pushing us to censor stuff.
02:40:22.000 We were unwilling to do it.
02:40:24.000 We developed a very adversarial and bad relationship with our own government, which I think is just not healthy because I think, you know, it's...
02:40:32.000 I mean, in...
02:40:35.000 In theory, I think it would be good if the American industry had a positive relationship with the American government.
02:40:43.000 But then what happened is the U.S. government was going after us in all these ways.
02:40:47.000 But fortunately in the U.S., we have good rule of law.
02:40:50.000 So our view is at the end of the day, these agencies can open up investigations and we'll just defend ourselves.
02:40:58.000 We'll go to court and we'll win all the cases because we're...
02:41:04.000 Follow the rules.
02:41:05.000 And so I think it ends up being a big kind of political issue where it's like it would just be you could get a lot more done if the government were helping American companies rather than kind of slowing you down at every step along the way.
02:41:26.000 It makes you a little afraid that if you ever actually mess something up That they're really going to bring the hammer down on you if you don't have a constructive relationship.
02:41:36.000 But I don't know.
02:41:38.000 It's mostly...
02:41:39.000 I mean, going back to the AI conversation, I just think we should all want the American companies to win at this.
02:41:45.000 This is a huge geopolitical competition, and China's running at it super hard, and we should want the American companies and the American standard to win.
02:41:55.000 And if there's going to be an open-source model that everyone uses...
02:41:59.000 We should want it to be an American model.
02:42:01.000 There's this great Chinese model that just came out, this company DeepSeek.
02:42:05.000 They're doing really good work.
02:42:08.000 It's a very advanced model.
02:42:10.000 And if you ask it for any negative opinions about Xi Jinping, it will not give you anything.
02:42:15.000 If you ask it if Tiananmen Square happened, it will deny it.
02:42:18.000 So I think there are all these things where we should want the American model to win.
02:42:25.000 But every step along the way...
02:42:28.000 If the government is sort of making that harder rather than easier, then that's...
02:42:32.000 I don't know.
02:42:33.000 I mean, there's an extent to which, okay, the American tech industry is leading, so maybe the government can get in the way a little bit, and maybe the American industry will still lead.
02:42:42.000 But I don't know.
02:42:44.000 I think it's getting really competitive.
02:42:45.000 And I think it's easy for the government to take for granted that the U.S. will lead on all these things.
02:42:53.000 I think it's a very close competition, and we need the help, not...
02:42:57.000 We need them to not be a force that's hindering us from doing these things.
02:43:02.000 I completely agree, but I think that people with their own self-interest, when they're in power and they realize that these new technologies like Instagram and Facebook, that they are interfering with their ability to administer propaganda or their ability to control the narrative.
02:43:20.000 That's where they get short-sighted.
02:43:23.000 And that's when they act in their own personal interest, and not in the interest of either national security or the future of the United States in terms of our ability to stay technologically ahead.
02:43:34.000 Yeah, and some of this is just, you know, if you go back to the COVID example, I think in that case, they were doing something, their goal of trying to get everyone to get vaccinated was actually, I think, a good goal.
02:43:50.000 It was a good goal if it worked, if it was real, like if it was a sterilizing vaccine, if it really did prevent people from getting COVID, if it really did prevent people from infecting others or transmitting it, but it didn't.
02:44:03.000 So it wasn't a good goal, because it wasn't based on real data.
02:44:07.000 Yeah, but also even if it were, right?
02:44:10.000 I think that still, on balance, knowing everything that we know now...
02:44:16.000 I still think it's good for more people to get the vaccine, but the government still needs to play by the rules in terms of, you know, not like, you can't just suppress true things in order to make your case.
02:44:26.000 So that's kind of my view on it.
02:44:32.000 I'm not sure in that case how much of it was like a personal political gain that they were going for.
02:44:36.000 I think that they had a kind of goal that they thought was in the interests of the country, and the way they went about it, I think, violated the law.
02:44:44.000 Well, there's a bunch of problems with that, right?
02:44:46.000 There's the emergency use authorization that they needed in order to get this pushed through, and you can't have that with valid therapeutics being available.
02:44:55.000 And so they suppressed valid therapeutics.
02:44:58.000 So they're suppressing real information that would lead to people being healthy and successful in defeating this disease.
02:45:06.000 And they did that so that they could have this one solution.
02:45:09.000 And this was Fauci's game plan.
02:45:11.000 I mean, this is the movie American Buyers Club, or Dallas Buyers Club, rather.
02:45:15.000 That's Fauci in that movie.
02:45:16.000 That was with the AIDS crisis.
02:45:17.000 This is the exact same game plan that was played out with the COVID vaccine.
02:45:21.000 They pushed one solution, this only one, suppressed all therapeutics through propaganda, through suppressing monoclonal antibodies, like all of it.
02:45:31.000 And that was done, in my opinion, for profit.
02:45:36.000 And they did that because it was extremely profitable.
02:45:38.000 The amount of money that was made was extraordinary during that time.
02:45:43.000 Yeah, but look, I feel like a bunch of the conversation is focused on tension with the American government.
02:45:52.000 I guess just the point that I'd underscore is that it's important to have this working in the American government because the U.S. Constitution and our culture here is...
02:46:04.000 Really good compared to a lot of other places.
02:46:06.000 So whatever issues we think might exist here, you go to other places and it's really extreme.
02:46:14.000 And there it's like you don't even necessarily have the rule of law.
02:46:17.000 And so I just think that the way that this stuff works well is if there was a clearer strategy and the U.S. government understood and believed that it's good to kind of...
02:46:45.000 Yeah, I agree as well.
02:46:49.000 Listen, is there anything else you want to talk about before we wrap this up?
02:46:53.000 I think we're good?
02:46:54.000 I don't know.
02:46:55.000 I mean, how long have we been going for?
02:46:56.000 Three hours?
02:46:57.000 Yeah, I mean, well, I feel like we touched on AI, we touched on...
02:47:03.000 All the augmented and virtual reality stuff.
02:47:06.000 And I think that stuff is just going to be...
02:47:07.000 Wild.
02:47:08.000 It's wild.
02:47:09.000 Your AR technology that you showed me today is very impressive.
02:47:14.000 It's crazy.
02:47:15.000 Lex and I were playing Pong across a table from each other.
02:47:19.000 I was playing some crazy game where my fingers got tired because you shoot like this.
02:47:24.000 Because you're using V1 of the neural interface.
02:47:26.000 It's like in the future it'll just be this.
02:47:29.000 It was really fun though.
02:47:31.000 It's really cool.
02:47:32.000 You see where this is all going.
02:47:35.000 It's really, really fascinating stuff.
02:47:37.000 I'm very excited about it.
02:47:38.000 Did you get a chance to use the Ray-Bans and the AI in them?
02:47:40.000 Yes, we did that, too.
02:47:41.000 And we did translate, too, where one of your co-workers was speaking to me in Spanish, and it was translating it to me in my ear in real time in English, which is really interesting.
02:47:51.000 Nice.
02:47:52.000 Amazing.
02:47:53.000 It's really cool.
02:47:54.000 And then you could also do it on the phone, so you could show it to the person on the phone so you don't have to say the words.
02:47:59.000 It's really fascinating stuff.
02:48:01.000 Yeah, so we're just sort of coming at it from both sides, right?
02:48:03.000 It's like the Ray-Bans are like, okay.
02:48:06.000 Given a good-looking pair of glasses, what's all the technology you can put into that today and still have it be just a few hundred dollars?
02:48:14.000 And then the Orion thing is like, all right, we're building the kind of future thing that we want, and we're doing our best to miniaturize it.
02:48:21.000 It's basically like...
02:48:22.000 Still pretty small.
02:48:23.000 I mean, just thicker glasses.
02:48:25.000 Yeah, and I think we want it to be a little smaller.
02:48:28.000 We need it to be a lot cheaper.
02:48:29.000 Each pair right now costs more than $10,000 to make, and you're not going to have a successful consumer product at that, so we have to miniaturize it more.
02:48:36.000 But I mean, the amount of stuff that we put in there from, it's like effectively...
02:48:41.000 Like what would have been considered a supercomputer like 10 or 20 years ago, plus, you know, lasers in the arms and the like nano etchings on the lens to be able to see it and the microphone and the speaker and the Wi-Fi to be able to connect with the other.
02:48:55.000 It's just like a wild amount of technology to kind of miniaturize into something.
02:48:59.000 That one's really fun.
02:49:01.000 We've been working on that for 10 years.
02:49:03.000 But yeah, I think between that, the glasses, all the AI stuff.
02:49:11.000 Yeah, all the social media stuff.
02:49:13.000 I think we covered it.
02:49:15.000 And I'm very excited about this new stance that you guys are taking.
02:49:18.000 I think the community notes thing is a brilliant idea that X has implemented.
02:49:22.000 And I am really glad that you guys are implementing it too.
02:49:26.000 I think it's the way.
02:49:27.000 And the way, generally, I think we both agree is that people have to have the ability to communicate.
02:49:32.000 They have to have the ability to express themselves.
02:49:34.000 And that's how we find out what's real and what's not.
02:49:37.000 Yeah, I think the...
02:49:39.000 More voice is the answer on this.
02:49:42.000 And I think after sort of a long journey, I'm glad to be able to take it back to the roots.
02:49:49.000 And I feel like we're more fortified now in the position.
02:49:53.000 Well, I think one of the lessons that people have learned over the last few years with suppression of information is that that's not good.
02:49:59.000 And there's a giant percentage of the population that feels that way.
02:50:02.000 And even people that are progressive and liberal, who were on the side of the people that were pushing the...
02:50:08.000 The suppression of information.
02:50:09.000 ...still don't think it's right.
02:50:11.000 I think most people generally believe in the First Amendment in this country, and we realize how valuable it is to have the freedom of expression.
02:50:17.000 Yeah.
02:50:18.000 Anyway, thanks for having me.
02:50:20.000 Thank you, Mark.
02:50:20.000 Appreciate it.