In this episode of the Joe Rogan Experience podcast, I sit down with Mark Zuckerberg to talk about his early days at the company, how he got started, and how he and his team dealt with the growing concerns about censorship on social media.
00:00:58.000And, you know, it goes back to, you know, our original mission is just give people the power to share and make the world more open and connected.
00:01:10.000What do you think started the pathway towards increasing censorship?
00:01:15.000Because clearly we were going in that direction for the last few years.
00:01:21.000It seemed like we really found out about it when Elon bought Twitter and we got the Twitter files.
00:01:27.000And when you came on here, you were explaining the relationship with the FBI, where they were trying to get you to take down certain things that were true and real, and certain things they tried to get you to limit the exposure of.
00:02:45.000Basically, Brexit in the EU and sort of the fragmentation of the EU. And then, you know, in 2020, there was COVID. And I think that those were basically these two events where, for the first time, we just faced this massive, massive institutional pressure to basically start censoring content on...
00:03:12.000I'm sorry to interrupt you, but when it first came up in 2016, did it come under the guise of the Russian collusion hoax?
00:03:21.000At the time, I was really sort of ill-prepared to kind of parse what was going on.
00:03:28.000I think part of my reflection looking back on this is, in 2016, in the aftermath, I gave too much deference to a lot of folks in the media who were basically saying, okay, there's no way people can actually believe this stuff, right?
00:03:50.000It has to be that there's this kind of like massive misinformation out there.
00:03:56.000Some of it started with the Russia collusion stuff.
00:04:00.000But it kind of morphed into different things over time.
00:04:03.000He was so ideologically polarizing, right?
00:04:06.000People didn't want to believe that anybody looked at him and said, this should be our president.
00:04:12.000Yeah, so I took this and just kind of assumed that everyone was acting in good faith.
00:04:19.000And I said, okay, well, there are concerns about misinformation.
00:04:23.000We should deal with it, just like when people raised other concerns in the past and we tried to deal with them.
00:04:28.000Okay, yeah, if you ask people, no one says that they want misinformation, so maybe there's something that we should do to basically try to address this.
00:04:36.000But I was really worried from the beginning about basically becoming this sort of decider of what is true in the world.
00:04:47.000That's kind of a crazy position to be in for billions of people using your service.
00:04:56.000You know, so we put in place a system that would deal with it.
00:04:58.000And early on we tried to basically make it so that it was really limited.
00:05:03.000We're like, all right, we're just going to have this system where there's these third-party fact checkers and they can check the worst of the worst stuff, right?
00:05:13.000It's not like we're parsing speech about whether something is slightly true or slightly false. It's for stuff like saying the earth is flat, you know, things like that, right?
00:05:22.000So that was sort of the original intent.
00:05:24.000We put in place the system and it just sort of veered from there.
00:05:28.000I think to some degree it's because some of the people whose job is to do fact-checking, a lot of their industry is focused on political fact-checking.
00:05:36.000So they just kind of veered in that direction.
00:05:38.000And we kept on trying to basically get it back to what we had originally intended, which is, you know, the point isn't to, like, judge people's opinions.
00:05:47.000It's to provide this layer to kind of help fact-check some of the stuff that seems the most extreme.
00:05:55.000It just, you know, it was just never accepted by people broadly.
00:06:00.000I think people just felt like the fact-checkers were too biased.
00:06:04.000Not necessarily even so much in what they ruled, although sometimes I think people would disagree with that.
00:06:10.000A lot of the time, it was just what types of things they chose to even go and fact-check in the first place.
00:06:14.000So I kind of think, after having gone through that whole exercise, I don't know, it's something out of, like, 1984, one of those books. It really is a slippery slope.
00:06:29.000And it just got to a point where it's just, okay, this is destroying so much trust, especially in the United States, to have this program.
00:06:38.000And I guess it was probably a few years ago that I really started coming to the conclusion that we were going to need to change something about that.
00:06:47.000COVID was the other big one where that was also very tricky.
00:06:54.000Because, you know, at the beginning it was a legitimate public health crisis.
00:07:02.000And it's, you know, even people who are like the most ardent First Amendment, you know, defenders, the Supreme Court has this clear precedent that's like, all right, you can't yell fire in a crowded theater.
00:07:17.000There are times when, if there's an emergency, your...
00:07:22.000ability to speak can temporarily be curtailed in order to get an emergency under control.
00:07:27.000So I was sympathetic to that at the beginning of COVID.
00:07:29.000It seemed like, okay, you have this virus.
00:07:31.000It seems like it's killing a lot of people.
00:07:35.000We didn't know at the time how dangerous it was going to be.
00:07:38.000So at the beginning, it kind of seemed like, okay, we should give a little bit of deference to the government and the health authorities on how we should play this.
00:07:46.000But when it went from, you know, two weeks to flatten the curve to... like, in the beginning, it was, okay, there aren't enough masks.
00:07:55.000Masks aren't that important.
00:07:56.000It's like, oh, no, you have to wear a mask.
00:07:58.000And, you know, like everything was shifting around.
00:08:01.000It became very difficult to kind of follow.
00:08:04.000And this really hit the most extreme, I'd say, during the Biden administration when they were trying to roll out the vaccine program.
00:08:18.000I'm generally pretty pro-rolling out vaccines.
00:08:21.000I think on balance, the vaccines are more positive than negative.
00:08:24.000But I think that while they're trying to push that program, they also tried to censor anyone who was basically arguing against it.
00:08:34.000And they pushed us super hard to take down things that were honestly true.
00:08:41.000They basically pushed us and said, anything that...
00:08:47.000says that vaccines might have side effects.
00:09:38.000It's such an easy routine to keep in the mornings.
00:09:41.000Ingredients in AG1 are selected for absorption, nutrient density, and potency.
00:09:46.000And are intentionally picked to work in sync with the whole formula for optimal impact.
00:09:51.000They're seriously committed to quality.
00:09:53.000AG1 is tested for hundreds of contaminants and impurities, and they're constantly reformulating their recipe to dial it in.
00:10:00.000This is all part of why I've partnered with AG1 for years.
00:10:04.000So get started with AG1 this holiday season and get a free bottle of vitamin D3, K2, and five free AG1 travel packs with your first purchase at drinkag1.com slash Joe Rogan.
00:10:58.000Well, on a monthly basis, it is probably half of Earth.
00:11:01.000I want to say that, though, there's a lot of hypercritical people that are conspiracy theorists and think that everybody is a part of some cabal to control them.
00:11:10.000I want you to understand that whether it's YouTube or whatever place that you think is doing something that's awful, it's good that you speak up, because this is how things get changed and this is how people find out that people are upset about content moderation and censorship.
00:12:08.000I want to put that in people's heads before we go on.
00:12:11.000Understand the kind of numbers that we're talking about here.
00:12:14.000Now understand you have the pandemic and then you have the administration that's doing something where I think they crossed the line, where it gets really weird, where they're saying what you were saying.
00:12:25.000They were trying to get you to take down posts about vaccine side effects, which is just crazy.
00:12:33.000Yeah, so, I mean, like you're saying, I mean, this is...
00:12:37.000It's so complicated, this system, that I could spend every minute of all of my time doing this and not actually focused on building any of the things that we're trying to do.
00:12:46.000AI, glasses, the future of social media, all that stuff.
00:12:51.000So I get involved in this stuff, but in general, we have a policy team.
00:12:56.000The people are kind of working on this on a day-to-day basis.
00:12:59.000And the interactions that I was just referring to, a lot of this is documented.
00:13:06.000You know, Jim Jordan and the House had this whole investigation and committee into the kind of government censorship around stuff like this.
00:13:14.000And we produced all these documents and it's all in the public domain.
00:13:16.000I mean, basically, these people from the Biden administration would call up our team and like...
00:13:47.000They wanted us to take down this meme of Leonardo DiCaprio looking at a...
00:13:51.000TV talking about how 10 years from now or something, you know, you're going to see an ad that says, okay, if you took a COVID vaccine, you're eligible, you know, like for this kind of payment, like sort of like class action lawsuit type meme.
00:14:07.000And they're like, no, you have to take that down.
00:14:08.000We said, no, we're not going to take down humor and satire.
00:14:12.000We're not going to take down things that are true.
00:14:54.000It's like they suppressed so much information about things that people should be doing, regardless of whether or not you believe in the vaccine.
00:15:25.000You can't say you're one of the good guys if you're suppressing information that would help people recover from all kinds of diseases, not just COVID. The flu, common cold, all sorts of different things.
00:15:37.000High doses of vitamin C, D3 with K2 and magnesium.
00:15:41.000They were suppressing this stuff because they didn't want people to think that you could get away with not taking a vaccine, which is really crazy when you're talking about something that 99.07% of people survive.
00:15:55.000This is a crazy overstep, but it scared the shit out of a lot of people.
00:16:01.000Red-pilled, as it were, a lot of people because they realized like, oh, 1984 is like an instruction manual.
00:16:08.000It shows you how things can go that way with wrong speak and with bizarre distortion of facts.
00:16:16.000And when it comes down to it, in today's day and age, the way people get information is through your platform, through X. This is how people are getting information.
00:16:26.000They're getting information from YouTube.
00:16:27.000They're getting information from a bunch of different sources now.
00:16:31.000And you can't censor that if it's real, legitimate information because it's not ideologically convenient for you.
00:16:39.000So, I mean, that's basically the journey that I've been on.
00:16:41.000It started off very pro-free speech, free expression.
00:16:46.000And then over the last 10 years, there have been these two big episodes.
00:16:49.000It was the Trump election and the aftermath, where I feel like in retrospect, I deferred too much to the kind of critique of the media on what we should do.
00:17:00.000I think generally trust in media has fallen off a cliff, right?
00:17:03.000So I don't think I'm alone in that journey.
00:17:05.000I think that's basically the experience that a lot of people have had is, okay, the stuff that's being written about is not kind of all accurate.
00:17:14.000And even if the facts are right, it's kind of written from a slant a lot of the time.
00:17:32.000I think a lot of people in the US focus on this as an American phenomenon.
00:17:38.000But I kind of think that the reaction to COVID probably caused a breakdown in trust in a lot of governments around the world.
00:17:47.000Because, I mean, 2024 was a big election year around the world.
00:17:53.000You know, there are all these countries, India, just like a ton of countries that had elections, and the incumbents basically lost every single one.
00:18:02.000So there is some sort of a global phenomenon where, whether it was because of inflation, because of the economic policies to deal with COVID, or just how the governments dealt with COVID, it seems to have had this effect that's global, not just in the U.S.: a very broad decrease in trust, at least in that set of incumbents, and maybe in these democratic institutions overall.
00:18:32.000So I think that what you're saying of, yeah, how do people get their information now?
00:18:36.000It's by sharing it online on social media.
00:18:40.000I think that that's just increasingly true.
00:18:42.000And my view at this point is like, all right, like we started off focused on free expression.
00:18:46.000We kind of had this pressure tested over the last period.
00:18:49.000I feel like I just have a much greater command now of what I think the policy should be, and, like, this is how it's going to be going forward.
00:18:57.000And so, I mean, at this point, I think a lot of people look at this as, like, a purely political thing, you know, because they...
00:19:06.000They kind of look at the timing and they're like, hey, well, you're doing this right after the election.
00:19:10.000It's like, okay, I try not to change our content rules right in the middle of an election either.
00:19:18.000Yeah, it's like there's no good time to do it.
00:19:20.000And whatever time is going on, there's going to be...
00:19:25.000The good thing about doing it after the election is you get to take this kind of cultural pulse as like, okay, where are people right now and how are people thinking about it?
00:19:32.000We try to have policies that reflect mainstream discourse.
00:19:39.000This is something I've been thinking about for a while.
00:19:41.000I think that this is going to be pretty durable because at this point we've just been pressure tested on this stuff for like the last eight to ten years with like these huge institutions just...
00:20:18.000I mean, the tough thing with politics...
00:20:21.000Is that there's like, well, when you say someone's coming after you, are you referring to kind of the government and investigations and all that?
00:20:28.000I mean, so the issue is that there's the, there's what specific thing an agency might be looking into you for.
00:20:37.000And then there's like the underlying political motivation, which is like, why do the people who are running this thing hate you?
00:20:44.000And I think that those can often be two very different things.
00:21:52.000The thing that I think is actually the toughest, though, is it's global.
00:21:58.000And really, when you think about it, the U.S. government should be defending its companies, not be the tip of the spear attacking its companies.
00:22:06.000So we talk about a lot, okay, what is the experience of...
00:22:12.000Okay, if the U.S. government comes after you, I think the real issue is that when the U.S. government does that to its tech industry, it's basically just open season around the rest of the world.
00:22:22.000I mean, the EU, I pull these numbers, the EU has fined the tech companies more than $30 billion over the last, I think it was like 10 or 20 years.
00:22:51.000And I think the U.S. government basically gets to decide how are they going to deal with that, right?
00:22:55.000Because if the U.S. government, if some other country was screwing with another industry that we cared about, the U.S. government would probably find some way to put pressure on them.
00:23:05.000But I think what happened here is actually the complete opposite.
00:23:08.000The U.S. government led the kind of attack against the companies, which then just made it so the EU and all these other places are basically free to just go to town on all the American companies and do whatever they want.
00:23:20.000But, I mean, look, obviously, I don't want to come across as if, like, we don't have things that we need to do better.
00:23:28.000And when we mess something up, we deserve to be held accountable for that, and just like everyone else.
00:23:35.000I do think that the American technology industry is a bright spot in the American economy.
00:23:39.000I think it's a strategic advantage for the United States that we have a lot of the strongest companies in the world.
00:23:45.000And I think it should be part of the U.S.'s strategy going forward to defend that.
00:23:50.000And it's one of the things that I'm optimistic about with President Trump is I think he just wants America to win.
00:23:57.000And I think some of the stuff, like the other governments who are kind of pushing on this stuff, it's...
00:24:03.000At least the U.S. has the rule of law, right?
00:24:06.000So the government can come after you for something, but you still get your day in court, and the courts are pretty fair.
00:24:13.000So we've basically done a pretty good job of defending ourselves, and when we've chosen to do that, basically, we have a pretty good rate of winning.
00:24:22.000It's just not like that in every other country around the world.
00:24:24.000Like if other governments decide that they're going to go after you, you don't always get a fair shake at kind of defending yourself on the rules.
00:24:37.000So I think to some degree, if the U.S. tech industry is going to continue being really strong, I do think that the U.S. government has a role in basically defending it abroad.
00:24:51.000And that's one of the things that I'm optimistic about will happen in this administration.
00:24:54.000Well, I think this administration uniquely has felt the impact of –
00:25:00.000not being able to have free speech, because this is the administration where Trump was famously kicked off of Twitter. That was a huge issue after January 6th. They removed the sitting president at the time. It was kind of crazy to remove that person from social media because you've decided that he incited a riot. So for him, without free speech,
00:25:29.000without podcasts, without social media, they probably wouldn't have had a chance because the mainstream narrative, other than Fox News, was so clearly against him.
00:25:38.000The majority of the television entities and print entities were against him.
00:25:46.000So without social media, without podcasts, they don't stand a chance.
00:25:50.000So they're uniquely aware of the importance of giving people their voice.
00:25:58.000But you do have to be careful about misinformation, and you do have to be careful about just outright lies and propaganda campaigns.
00:27:08.000When you're talking about nation-states or people interfering, a lot of that stuff...
00:27:14.000is best rooted out at the level of kind of accounts doing phony things.
00:27:20.000So you get, whether it's like China or Russia or Iran or one of these countries, they'll set up these networks of fake accounts and bots, and they coordinate and they post on each other's stuff to make it seem like it's authentic and kind of convince people.
00:27:34.000It's like, wow, a bunch of people must think this or something.
00:27:36.000And the way that you identify that is you build AI systems.
00:27:41.000that can basically detect that those accounts are not behaving the way that a human would.
00:27:46.000And when we find that there's, like, some bot that's operating an account...
00:28:12.000Well, I mean, it's more subtle than that.
00:28:13.000I think, like, these guys are pretty sophisticated, and it's an adversarial space.
00:28:17.000So we find some technique, and then they basically kind of update their techniques.
00:28:25.000But we have a team of, it's effectively, like...
00:28:32.000What are these accounts that are just not behaving the way that people would and how are they interacting?
00:28:42.000And then sometimes you trace it down and sometimes you get some tips from different intelligence agencies and then you can kind of piece together over time.
00:28:52.000It's like, oh, this network of people is actually some kind of fake...
00:28:56.000cluster of accounts, and that's against our policies, and we just take them all off.
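The detection approach described here, flagging accounts that don't behave the way a human would, can be sketched with a toy behavioral signal. To be clear, this is an illustrative assumption, not Meta's actual system: the function, its thresholds, and the "machine-regular posting interval" heuristic are all invented for the example.

```python
from statistics import pstdev

def looks_automated(post_times: list[float], min_posts: int = 10,
                    jitter_threshold: float = 2.0) -> bool:
    """Flag an account whose posts arrive at suspiciously regular
    intervals (times in seconds). Humans post in bursts; simple bots
    post on a schedule. Both thresholds are invented for illustration."""
    if len(post_times) < min_posts:
        return False  # not enough history to judge
    # Standard deviation of the gaps between consecutive posts:
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    return pstdev(gaps) < jitter_threshold

# A bot posting exactly every 60 seconds gets flagged:
bot = [60.0 * i for i in range(20)]
print(looks_automated(bot))  # → True
```

A real system would combine many signals over whole networks of accounts, including the content similarity and intelligence-agency tips mentioned above, rather than a single timing heuristic.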
00:29:05.000Is there a 100% certainty that you are definitely getting a group of people that are bad actors, or is it just people that have unpopular opinions?
00:29:35.000I think for the specific problem around these large coordinated groups doing election interference or something, it's a large enough group.
00:29:43.000We have a bunch of people analyzing it.
00:29:47.000I think we're probably pretty accurate on that.
00:29:49.000But I actually think one of the bigger issues that we have in our moderation system is this precision issue that you're talking about.
00:29:57.000And of all the things that we announced this week, in terms of how we're going to update the content policies, changing the content filters to require higher confidence and precision is actually going to be the thing that reduces the vast majority of the censorship mistakes that we make.
00:30:18.000Removing the fact-checkers and replacing them with community notes, I think it's a good step forward.
00:30:22.000A very small percent of content is fact-checked in the first place, so is that going to make the hugest difference?
00:30:38.000It'll mean that some set of things that might have been censored before now won't be.
00:30:43.000But by far the biggest set of issues we have, and you and I have talked about a bunch of issues like this over the years, is like, it's just, okay, you have some classifier that's trying to find, say, like drug content, right?
00:30:55.000People decide, okay, it's like the opioid epidemic is a big deal.
00:30:58.000We need to do a better job of cracking down on drugs and drug sales, right?
00:31:01.000I don't want people dealing drugs on our networks.
00:31:03.000So we build a bunch of systems that basically go out and try to automate finding people who are dealing drugs.
00:31:12.000And then you basically have this question, which is how precise do you want to set the classifier?
00:31:17.000So do you want to make it so that the system needs to be 99% sure that someone is dealing drugs before taking them down?
00:31:27.000Do you want it to be 90% confident, 80% confident?
00:31:30.000And then those correspond to amounts of...
00:31:35.000I guess the statistics term would be recall.
00:31:37.000What percent of the bad stuff are you finding?
00:31:39.000So if you require 99% confidence, then maybe you only actually end up taking down 20% of the bad content.
00:31:49.000Whereas if you reduce it and you say, okay, we're only going to require 90% confidence, now maybe you can take down 60% of the bad content.
00:31:57.000But let's say you say, no, we really need to find...
00:32:00.000Everyone who is doing this bad thing, and it doesn't need to be as severe as dealing drugs.
00:32:04.000It could just be, I mean, any kind of category of harmful content.
00:32:12.000You start getting to where some of these classifiers might have 80 or 85% precision in order to get 90% of the bad stuff down.
00:32:20.000But the problem is if you're at, you know, 90% precision, that means one out of 10 things that the classifier takes down is not actually problematic.
00:32:28.000And if you multiply that across the billions of people who use our services every day, that is millions and millions of posts that are basically being taken down that are innocent.
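The threshold arithmetic in this stretch of the conversation can be made concrete. The figures below are the hypothetical ones used in the discussion (99% confidence yielding roughly 20% recall, 90% yielding roughly 60%), not real platform numbers:

```python
# Illustrative sketch of the precision/recall tradeoff described above.
# All figures are the hypothetical ones from the conversation, not real data.

def false_positives(removals: int, precision: float) -> int:
    """Posts removed in error: total removals times the fraction
    of removals the classifier got wrong."""
    return round(removals * (1.0 - precision))

# Stricter confidence thresholds make fewer mistakes but catch less
# of the bad content (lower recall), per the numbers quoted above:
thresholds = {
    0.99: {"recall": 0.20},  # very strict: only ~20% of bad content found
    0.90: {"recall": 0.60},  # looser: ~60% found, but more errors
}

# At 90% precision, one in ten removals is a mistake. Across, say,
# 50 million automated removals, that's millions of innocent posts:
print(false_positives(50_000_000, 0.90))  # → 5000000
```

The 50-million figure is an arbitrary example volume; the point is just that even a small error rate, multiplied by platform scale, produces millions of wrongful takedowns.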
00:32:43.000And upon review, we're going to look at it and be like, this is ridiculous that this thing got taken down, which I mean, I think you've had that experience and we've talked about this for a bunch of stuff over time.
00:32:54.000But it really just comes down to this question of where do you want to set the classifiers?
00:32:57.000So one of the things that we're going to do is basically set them to require more confidence, which is this tradeoff.
00:33:06.000It's going to mean that we will maybe take down a smaller amount of the harmful content, but it will also mean that we'll dramatically reduce the amount of people whose accounts we're taking off for a mistake, which is just a terrible experience, right?
00:33:19.000It's like, okay, you're going about your day, and then one day...
00:33:23.000You wake up and you're like, oh, my WhatsApp account just got deactivated because it's connected to a Facebook account, or I'm using it on the same phone as a Facebook account, where we made some enforcement mistake and thought you were doing something bad that you weren't, because our classifiers were set to low precision.
00:33:52.000Like, say, you have a Facebook account and you have, like, a sock puppet account, and on the sock puppet account you post offensive memes and you're generally gross.
00:34:44.000They also think, okay, well, we're not doing enough if we deactivate their Facebook account because they're planning a terrorist attack, but we let them use all our other services.
00:35:10.000So you can understand how you get there, but then you just get to this question around the precision and the confidence level.
00:35:17.000And then you're just making all these mistakes at scale, and it's just unacceptable.
00:35:23.000But I think it's a very hard calculation of, like, where do you want to be?
00:35:27.000Because on the one hand, like, I get why people kind of come to us and they're like, no, you need to do a better job finding more of the terrorism or the drugs and all this stuff.
00:35:37.000But over time, the technology will get better and it'll get more precise.
00:35:40.000But at any given point in time, that's the choice we have to make: do we want to make more mistakes, erring on the side of just blowing away innocent people's accounts?
00:36:09.000I think if you're a person who works at some accounting firm, and you like posting about stuff but you don't want it to come back and reflect on your life, you want to shitpost, you want to post jokes, you want to be silly, you should be able to be anonymous.
00:36:22.000I think there's nothing wrong with that.
00:36:25.000I don't think just because you state your opinion, people should be able to search where you sleep.
00:36:32.000If you're going to allow anonymous accounts, you're definitely going to open up the door to bad actors having enormous blocks of accounts where they can use either AI or just programs where they have specific answers.
00:36:48.000It's come up on Twitter multiple times where they've found...
00:36:52.000hundreds of sock puppet accounts tweeting the exact same thing.
00:36:56.000So it's literally word for word, even certain words in caps. Either people are copying and pasting it, or there's an email campaign that's getting legitimate people to do it, or these are fake people.
00:37:07.000If you're going to have anonymous accounts, which I think you should, because I think whistleblowers, I think the benefits of anonymous reporting on important things that the general public needs to know about, especially whistleblower type stuff, you have to have some ability to be anonymous.
00:37:24.000But if you're going to do that, you're also going to have the possibility that these aren't real people.
00:37:28.000That these are paid actors, these are paid people, or not people at all, or they're running programs, and they're doing this to try to sway public opinion about very important issues.
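The word-for-word duplicate accounts described here suggest the simplest possible detection sketch: group posts by exact text and flag any wording shared by an implausible number of accounts. The function name and the 50-account cutoff below are invented for illustration; real coordinated-behavior detection is far more sophisticated.

```python
from collections import defaultdict

def coordinated_clusters(posts: list[tuple[str, str]],
                         min_accounts: int = 50) -> list[set[str]]:
    """Given (account, text) pairs, return the sets of accounts that
    posted identical wording, for groups above an arbitrary cutoff.
    Exact-match grouping only; a real system would also catch near-duplicates."""
    by_text: dict[str, set[str]] = defaultdict(set)
    for account, text in posts:
        by_text[text.strip()].add(account)
    return [accts for accts in by_text.values() if len(accts) >= min_accounts]

# 100 accounts posting the same sentence, caps and all, plus one normal user:
posts = [(f"user{i}", "VOTE NO on the bill!") for i in range(100)]
posts.append(("alice", "just had a great lunch"))
print(len(coordinated_clusters(posts)))  # → 1
```

Exact matching like this would catch the copy-paste case Rogan describes, but not an email campaign where real people retype similar text, which is part of why attribution is hard.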
00:37:43.000A lot of what we've seen, too, I mean, there's the anonymous accounts.
00:37:46.000Also, just over time, I think a lot of the kind of more interesting conversations have shifted from the public sphere to more private ones.
00:37:53.000So WhatsApp groups, private groups on Facebook.
00:37:57.000I'm sure you have this experience where maybe 10 years ago you would have posted your kind of quick takes on whatever social media you're using.
00:38:07.000Now, you know, the stuff that I post on Facebook and Instagram, it's like I put time into making sure that that's kind of good content that I want to be seen broadly.
00:38:15.000And then like most of the jokes that I make are like with my friends and WhatsApp.
00:38:38.000Which is just a weird thing about taking things out of context, particularly on social media where people love to do that.
00:38:47.000There is this problem of, like, let's just say that you're a country that's involved in some sort of an international conflict, and you have this ability to get out this fake narrative and just spread it widely about all sorts of things you're accusing this other government of, all sorts of things that aren't true, and it just muddies the water of reality for a lot of people.
00:39:11.000Yeah, and that's why that side of things, the kind of...
00:39:18.000I mean, we're not letting off the gas on that at all.
00:39:21.000I think most categories of bad stuff that we're policing, everyone agrees is bad.
00:39:27.000No one's sitting there defending that terrorism is good, or child exploitation, or drugs, or IP violations, or people inciting violence.
00:39:42.000You know, election interference and foreign government manipulation of content is bad.
00:39:46.000So this is the type of stuff that the vast majority of our energy goes towards that, and we're not changing our approach on any of that.
00:39:53.000The two categories that I think have been very politicized are misinformation, because who gets to judge what's false and what's true?
00:40:04.000You may just not like my opinion on something, and then people think it's false, but I think that that one's really tricky.
00:40:10.000And the other one is basically what people refer to as hate speech, which I think also comes from a good place of wanting to crack down on that, of wanting to promote more inclusion and belonging and people feeling...
00:40:30.000good and having a pluralistic society that can basically have all these different communities coexist.
00:40:38.000But I think the problem is that all these things are on a spectrum and when you go too far on them, I think on that side, we just basically got to this point where there were these things that you just couldn't say, which were mainstream discourse.
00:40:54.000Pete Hegseth is going to probably be defending his nomination for Secretary of Defense on the Senate floor.
00:41:03.000And I think one of the points that he's made is that he thinks that women shouldn't be able to be in certain combat roles.
00:41:09.000And until we updated our policies, that wouldn't have been a thing that you could have said on our platforms because it would call for the exclusion of a protected category of people.
00:41:44.000Misinformation and hate speech, I think, are the ones that got politicized.
00:41:48.000All the other ones, which is the vast majority of the stuff that we do, is I think people generally agree that it's good and we need to go after it.
00:41:57.000But then you just get into this problem of the mistakes, like you're talking about.
00:41:59.000Okay, well, what confidence level do people want us to have in our enforcement?
00:42:04.000And at what point would people rather us kind of say, okay, I'm not sure that that one is causing an issue.
00:42:12.000So on balance, maybe we should just...
00:42:14.000You know, leave that person's account out because the pain of just nuking someone's account when you're not sure or you make a mistake is like, that's pretty real too.
00:42:28.000And, you know, you made a point earlier about the government supporting its companies, that it would be a good thing for the government to support its companies.
00:42:41.000I think the issue that we're dealing with is...
00:42:43.000Companies, as we're describing them, have never existed before, right?
00:42:48.000There's never been a thing like Facebook before.
00:42:50.000There's never been a thing like Twitter before or X. There's never been a thing like Instagram.
00:42:54.000These are new things in terms of the impact that they have on society, on opinions, on conversations, on the distribution of information.
00:43:04.000There's never been a thing like this that the government didn't control.
00:43:09.000So it makes sense from their perspective, continuing the patterns of behavior that they've always exhibited, which is to have control over the media.
00:43:17.000I mean, there have been CIA operatives in major newspapers forever.
00:43:36.000And especially with this last, the push during COVID deteriorated, as you were saying before, the opinion and the respect that people have for the facts that are coming from mainstream journalism in a way that I've never seen before in my life, where an enormous percentage of the population does not trust mainstream media anymore.
00:44:01.000Like, we've got to crack down on that.
00:44:03.000Like, we've got to get our hands on this, which is what we saw during COVID, which we saw during the Biden administration's attempt to remove the Hunter Biden laptop story from Twitter and from all these different things that we saw happen, the way they...
00:44:16.000Contacted you guys, what they're trying to do with getting you to remove real information about vaccine side effects.
00:44:23.000This is like this new attempt to crack down on this new thing, which is a distribution outlet that's far more successful than anything they've ever controlled before, and they have no control of it.
00:44:38.000They had CBS, they had NBC, the New York Times, the Washington Post. When they were in control of narratives in that way, it was so much easier.
00:44:48.000There wasn't some sort of pirate radio voice that came on and said, hey guys, here are the latest studies that show this is not true.
00:46:06.000There are a lot of people in other positions.
00:46:10.000It's like, who are the people that basically people look to?
00:46:14.000And I think that's basically, it needs to shift for the internet age.
00:46:21.000And I think a lot of the people who, you know, people looked to before, they're kind of realizing, hey, they weren't super honest about a lot of these issues that we face.
00:46:31.000And that's partially why, you know, social media isn't a monolithic thing.
00:46:35.000It's not that people trust Facebook or X. They trust the creators and the voices that they feel like are being authentic and giving them valuable information on there.
00:46:44.000So there's, I think, going to be just this whole new class of...
00:46:50.000Creators who basically become the new kind of cultural elites that people look at and are like, okay, these are the people who give it to me straight.
00:46:58.000And I think that's a thing that is, maybe it's possible because of social media.
00:47:04.000I think it's also just the internet more broadly.
00:47:07.000I mean, I think podcasting is obviously a huge and important part of that too.
00:47:11.000I mean, I don't know to what extent you feel like you kind of...
00:47:14.000Got to be large because of social media or just the podcasting platforms that you used.
00:47:20.000But I think that this is a very big sea change in terms of who are the voices that matter.
00:47:27.000And what we do is we try to build a platform that gives people a voice.
00:47:31.000But I think that there's this wholesale generational shift.
00:47:36.000And who are the people who are being listened to?
00:47:38.000And I think that that's, like, a very fascinating thing that's going on here.
00:47:45.000It's not just the government and people saying, hey, we want, like, a very big change here.
00:47:51.000I think it's just, like, a wholesale shift in saying we just want different people who we actually trust, who are actually going to, like, tell us the truth.
00:48:00.000And not give us the bullshit opinions that you're supposed to say, but the type of stuff that I would actually...
00:48:06.000When I'm sitting in my living room with my friends, the stuff that we know is true.
00:48:11.000Who are the people who have the courage to actually just say that stuff?
00:48:22.000The problem is these people that are...
00:48:26.000Starting these jobs, they're coming out of universities, and the universities are indoctrinated into these ideas as well.
00:48:32.000It's very difficult to be a person who stands outside of that and takes unpopular positions.
00:48:38.000You get socially ostracized, and people are very hesitant to do that, and they would rather just keep their mouth shut and talk about it in quiet conversation.
00:48:47.000And that's what we experience, which is another...
00:48:50.000Another argument for anonymous accounts.
00:48:53.000I think you should have anonymous accounts.
00:48:55.000I think you should be able, like, if there's something like COVID mandates or some things that you're dealing with and you don't want to get fired because of it, you should be able to talk about it.
00:49:03.000And you should be able to post facts and information and what you've learned.
00:49:06.000And, you know, anecdotal experiences of people in your family that had vaccine side effects and not worry about losing your job, which people were worried about, which is so crazy.
00:49:19.000And you're seeing a lot of the people that used to be in mainstream media got fired, and now they're trying to do the sort of podcast thing.
00:49:27.000But they're trying to do it like a mainstream media person, so they're gaslighting during podcasts, and people are like, hey, fuckface.
00:50:17.000I think part of it is the format, right?
00:50:18.000The fact that you do these, like, two, three-hour episodes.
00:50:22.000I mean, I hated doing TV because, you know, I basically got started.
00:50:28.000I started Facebook when I was 19, and I was good at some things, very bad at others.
00:50:33.000I was good at coding and, like, real bad at kind of, like, talking to people and explaining what I was doing.
00:50:40.000And I just had these experiences early on where I'd go on TV and it wouldn't go well and they'd cut it down to some random soundbite and I'd look stupid.
00:50:53.000And then basically I'd get super nervous about going on TV because I knew that they were just going to cut it in some way that I was going to look like a fucking idiot.
00:51:03.000And so I'm just like, this sucks, right?
00:51:53.000And the other thing about television that's always going to hold it back is the fact that every conversation gets interrupted every X amount of minutes because you have to cut to a commercial.
00:52:08.000You have all these people talking over each other, then you sit down with one person for a short amount of time.
00:52:13.000It's just not enough time for important subjects.
00:52:16.000It's also a lot of them, for whatever reason, want to do it in front of an audience.
00:52:20.000Which is the worst way to get people to talk.
00:52:22.000Like, imagine these disasters that you had if there was, like, 5,000 people staring at you in a TV crowd as well.
00:52:29.000So there's that added element, which is so not normal and not conducive to having a conversation where you're talking about nuanced things.
00:53:06.000I'm like, OK, kind of like downloading into my memory.
00:53:09.000Like, how am I going to like, you know, it's like, how am I going to just explain these different things?
00:53:17.000Yeah, I just think that's sort of how people work.
00:53:19.000It's also like conversations are like a dance.
00:53:22.000You know, one person can't be dancing at another speed and the other person is going slow.
00:53:26.000You kind of have to find the rhythm that you're going to talk with and then you have to actually be interested in what you're talking about.
00:53:34.000That's another huge disadvantage they're at in mainstream media.
00:53:39.000It's like they're just doing that because that's their job.
00:53:41.000They probably don't even know a lot about climate change.
00:53:44.000They probably don't really understand too much about what SpaceX is trying to accomplish.
00:54:14.000Getting back to the original point, this is why I think it makes sense to me that the government didn't want you to succeed and to have the sort of unchecked power that they perceived social media to have.
00:54:31.000One of the benefits that we have now of the Trump administration is that they have clearly felt the repercussions of free speech limitations, censorship, government overreach.
00:54:43.000If anybody saw it, look, I don't know what the actual impact of the Hunter Biden laptop story would have been.
00:54:52.000But there's many people that think it probably amounted to millions of votes overall in the country, of people that were on the fence.
00:55:00.000The people that weren't sure who they're going to vote for, if they found out the Hunter Biden laptop was real, they're like, oh, this is fucking, the family's fucking crazy.
00:55:10.000And if that's possibly real, that could be defined as election interference.
00:55:16.000And all that stuff scares the shit out of me.
00:55:19.000That kind of stuff scares the shit out of me.
00:55:21.000When the government gets involved in what could be termed election interference, but through some weird loophole, it's legal.
00:55:29.000Well, I don't think that the pushing for social media companies to censor stuff was legal.
00:55:36.000I mean, there's all this stuff about what, like, people talk about the First Amendment and, okay, these tech platforms should offer free speech like the First Amendment.
00:55:47.000That, I think, is a philosophical principle.
00:55:50.000The First Amendment doesn't apply to companies in our content moderation.
00:56:00.000But the First Amendment does apply to the government.
00:56:03.000That's, like, the whole point, right, is the government is not allowed to censor this stuff.
00:56:06.000So at some level, I do think that, you know, having people in the administration calling up the guys on our team and yelling at them and cursing and threatening repercussions if we don't take down things that are true is, like, it's pretty bad.
00:57:22.000Well, in a democracy, I mean, that's kind of...
00:57:24.000Right, but if what they did was illegal, do you not think that some steps should be put in place to make sure that people are punished for that and that that never happens again?
00:57:37.000It seems that that has a massive impact on the way our country goes.
00:57:43.000If that's election interference, and I think it is, that has a massive impact on the direction of our country.
00:57:50.000Yeah, well, the COVID thing, I don't think, was election interference as much as it was just, like, government meddling where it shouldn't have.
00:57:57.000But, yeah, no, I mean, it's tougher for me to say, you know, like, what specific retribution or justice should happen to anyone who is involved in these things.
00:58:09.000But I think your point about let's make sure this doesn't happen again is...
00:58:18.000Because it's the thing that I reflect on on my journey on all this, which is like, okay, yeah, so we didn't take down the stuff that was true.
00:58:26.000But we did generally defer to the government on some of these policies that in retrospect I probably wouldn't, knowing what I know now.
00:58:36.000And I just think that that's sort of the journey that we've been on.
00:58:43.000Okay, we start the thing focused on free expression, go through some pretty crazy times in the world, get it pressure tested, see where we basically ended up doing stuff that led to a slippery slope whose conclusion we weren't happy with, and try to reset.
00:58:57.000And that's sort of the moment that we're at now, is trying to just rationalize a bunch of the policies.
00:59:04.000And look, obviously crazy things can happen in the future that might unearth something, you know, some kind of angle on this that I haven't considered.
00:59:15.000I'm sure I'm not done making mistakes in the world.
00:59:17.000But I think at this point, we have a much more thorough understanding of what the space is.
00:59:24.000And I think our kind of values and principles on this are likely going to be much more durable going forward.
00:59:32.000And I think that that's probably a good thing for the Internet.
00:59:35.000I think it's a great thing for the internet.
00:59:36.000I was very happy with your announcement.
00:59:38.000I'm very happy that you took those steps.
00:59:40.000I'm very happy you brought Dana White aboard.
01:00:58.000And I think he's a world-class entrepreneur and he's got a strong backbone.
01:01:06.000And I think part of what the conversation that I had with him around joining our board was, okay, like we have a lot of governments and folks around the world putting a lot of pressure on our company and like we need some like strong people who are going to basically, you know, help advise us on how to handle some of these situations.
01:02:40.000You know, we're hiring people like Frank Shamrock to come in and train them and work out, and we're taking jiu-jitsu with John Lewis, and they were really getting into it.
01:02:48.000And so then they buy the UFC for like $2 million, which is probably the greatest purchase ever, except they were 40-plus
01:02:56.000million dollars in the hole when they financed The Ultimate Fighter.
01:03:01.000And then that was 2005. And then this one fight takes place with Stephan Bonnar and Forrest Griffin on television.
01:03:08.000It's so wild and so crazy that millions of people start tuning in.
01:06:00.000You're bringing in all these, like, super talented people to train with you, too, which is really important, and just learn systematically, probably the way you've learned all these other things, which is really...
01:06:10.000So fascinating to me about MMA and jujitsu in particular is the general public has this knuckle-dragging, meathead sort of perspective, and then I'm like...
01:06:22.000Let me introduce you to Mikey Musumeci.
01:07:09.000And it's a phenomenal stress reliever because...
01:07:11.000Regardless of what you're going through day-to-day with Facebook and Meta and all the different projects you have going on, it's not as hard as someone trying to choke you unconscious.
01:07:21.000I think it's like sometimes you have someone trying to choke you unconscious slowly over a multi-year period, and that's business.
01:07:30.000But I think that sometimes in business the cycle time is so long that it is very refreshing to just have a feedback loop that's like, I had my hand down, so I got punched in the face.
01:07:43.000It's really important to me for balance.
01:07:49.000I basically try to train every morning.
01:07:52.000I'm either doing general fitness or MMA. I do sometimes grappling, sometimes striking.
01:07:59.000It got to the point where I tore my ACL training.
01:08:08.000I wasn't integrated between my weight training and my fighting training, so I think I was probably overdoing it.
01:08:14.000So now we basically, I'm just trying to do this in a cohesive way, which I think will be more sustainable.
01:08:20.000But when I tore my ACL, first of all, everyone at the company was like, ah, fuck, we're going to get so many more emails now that he can't do this.
01:08:30.000And then I sat down with Priscilla, and I expected her to be like, you're an idiot.
01:10:05.000Pain and discipline and trial and error and learning and being open-minded and being objective and understanding position and asking questions and having good training partners and absorbing information and really being diligent with your skill acquisition work, which is one of the most important and neglected parts of jiu-jitsu because training is so fun.
01:10:28.000Everybody just wants to roll, you know, where really the best way to do it is actually to drill.
01:10:35.000Really, you should drill constantly, just jam those skills into your neurons where your brain knows exactly what to do at every position.
01:10:43.000And it's such an intellectual pursuit.
01:10:45.000And most people don't think of it that way because you have to manage your mind while you're moving your body, you're managing anxieties, you're trying to figure out when to hit the gas and when to control position and recover.
01:11:18.000But I'm saying it's a good thing to, if someone's trying to kill you and they absolutely can't, because you could kill them easy, that's way better.
01:11:30.000I mean, it's opened a lot of how I think about stuff.
01:11:34.000I mean, it is just interesting, your point about having a pill that allows you to just kind of know that you have this kind of physical ability.
01:11:47.000It's interesting, because I do think a lot of our society has become very, like, I don't know, I don't even know the right word for it, but it's kind of like...
01:12:10.000One of the things that I enjoy about it is I feel like I can just express myself.
01:12:16.000It's like when you're running a company, people typically don't want to see you being this ruthless person who's just like, I'm just going to crush the people I'm competing with.
01:12:26.000But when you're fighting, it's like, no, no, that's like...
01:12:31.000I think in some ways, when people see me competing in this sport, they're like, oh, no, that's the real Mark.
01:12:35.000It's like, because it goes back to all the media training stuff we were talking about when I'm going and giving my soundbites for two minutes.
01:12:41.000It's like, no, it's like, fuck that guy.
01:12:47.000Well, you definitely got a lot of respect in the martial arts community.
01:12:50.000People got super excited that you were so involved in it and so interested in it because anytime someone like yourself or like Tom Hardy or anyone like, wow, that guy's into it?
01:12:59.000Anytime something like that happens, there's like some new person who's a prominent person, a very smart person, that's really interested in it.
01:13:06.000We all get very excited because we're like, oh, boy.
01:13:32.000It's basically all these young pro fighters who are kind of up and coming, kind of early 20s, but they've only been doing it for a few years.
01:13:40.000So I've been doing it for a few years.
01:13:43.000So that way it's like we kind of have a more similar level of skill and they're all better than me.
01:13:48.000But in terms of I'm like, I was in my late 30s and they're in their early 20s.
01:13:53.000It was sort of like they're kind of...
01:13:54.000coming into becoming men, and I'm sort of at the end of my physical peak.
01:13:58.000But it's like, it's a really good crew.
01:14:08.000I mean, it's like basically, I mean, I was also doing it with, so it's basically a group of pro fighters and then a handful of meta executives would do it.
01:14:16.000And basically, we would just kind of like fight each other and it would be fun.
01:15:08.000And I had sunglasses and a hat and I wore a COVID mask.
01:15:12.000And basically it was like, it wasn't until they called our names to step onto the mat that I was like, all right, I took all this stuff off, and the guy was like, uh, what?
01:17:14.000I mean, just from my own personal experience, my doctor told me that the ACL from a cadaver when they use the patella tendon graft is 150% stronger than your natural ACL. He said, you'll be back to...
01:17:27.000Because I didn't have any meniscus damage in my right knee.
01:17:39.000The bone on the kneecap was painful forever in terms of getting on my knees, training on my knees, doing certain positions, and even just stretching.
01:17:50.000Putting my knees on the ground, sitting on my heels, and then laying back was fucking painful.
01:17:56.000It took forever to break all that scar tissue up.
01:19:08.000It was like the end of a session, and we were two hours into training, and I was doing a few rounds, and basically I threw a leg kick, and the other guy went to check it, and I leaned back to try to get around the check and just put too much torque on my knee.
01:20:01.000I mean, the rehab thing I took really seriously, and I thought that was pretty interesting, too.
01:20:06.000I don't want to have to do a lot of rehabs like this one, but to do one of them...
01:20:12.000I actually thought it was a pretty interesting experience because it's like week over week you're just getting back so much mobility and ability to do stuff.
01:21:05.000Leg curls, Nordic curls, but Nordic curls in particular because, you know, it's very difficult to do.
01:21:10.000You lift your whole body up with your hamstrings.
01:21:14.000And all these different slant board squats and different lunges and split squats and all these different things which, like, really strengthen up all the supporting muscles around the knee better than anything that I've ever tried before.
01:21:29.000And he's got a whole program where it scales up, and he puts it online for everybody.
01:21:34.000And he gives away a lot of information for free because he said, look, when I was 11 years old, I wish I had access to this, so I'm going to put it out there for everybody.
01:22:04.000Yeah, and I think a lot of people just focus on, like, the big movements and weight training.
01:22:09.000And it's, I don't know, first of all, for, like, a lot of...
01:22:12.000Fighting-type stuff, you kind of want to be loose and, like, not super tight.
01:22:17.000But, yeah, I mean, I just think, like, the joint stability stuff matters as you get older and you want to keep doing this for a longer period of time.
01:22:51.000It makes it fun that someone is a prominent, intellectual, very intelligent person who's really gotten fascinated by it because it does help to kill that sort of knuckle-dragger perspective that a lot of people have about the sport.
01:23:06.000No, I think it's super intellectual in terms of actually breaking this stuff down.
01:23:11.000I mean, both jujitsu and, like, striking, I mean, yeah, you don't have time to think, but, like, the reasoning behind why you kind of want to slip in certain ways and, like, the probability game that you're playing is, um...
01:23:40.000And I just remember I would sit in my classes in high school and sketch out combinations of moves and sequences for how to feint and kind of trick someone to get them out of position to be able to tap them.
01:24:02.000I feel like this is a game in the same way.
01:24:06.000I think when you're training, you're not slugging at each other that much.
01:24:56.000The idea of having a competition, you really need to get into the headspace of, like, I'm going to fight someone this week.
01:25:02.000So I need to figure this out, because I don't know how, with everything that's going on in AI, I'm going to have a week or two where I can just get into this, like, I'm going to go fight someone.
01:25:14.000But it's good training, and I would like to at some point.
01:25:19.000You know, the thing about the ACL injury...
01:25:22.000Is I kind of thought before this, it's like, all right, I'm going to do some jiu-jitsu competitions.
01:25:26.000I want to do one MMA fight, like one kind of like pro or competitive MMA fight.
01:25:31.000And then I figured I'd go back to jiu-jitsu.
01:25:33.000But I think tearing the ACL striking is a little more of a fluke.
01:25:37.000I think you're much more likely to do that grappling.
01:25:39.000So going through the ACL experience didn't make me want to like just exclusively go do the version where you're just attacking joints all day long.
01:26:04.000Yeah, I mean, like, Tom Aspinall famously blew his out against Curtis Blaydes with his supporting leg, just threw a kick, and it's freak accidents.
01:26:14.000It's a lot of explosive force with striking, and sometimes that tears things more than slow, controlled movements of jiu-jitsu, especially if you have good training partners.
01:26:24.000Yeah, but jiu-jitsu isn't always slow or controlled, especially when you're competing.
01:26:27.000No, especially when you're competing, unless you're really, really good.
01:27:51.000I mean, again, happy there's someone like that out there because when people have these ideas of what martial arts are and then you see a guy like that and you're like, okay.
01:28:06.000What has it done in terms of—one of the things that a lot of people said, and I have too, like, nothing turns you into a libertarian quicker than jiu-jitsu.
01:28:19.000It's cutting out all the bullshit and realizing how much of the things that we take as real things are just excuses and bullshit and weakness and just procrastinating.
01:28:29.000There's a lot of things that we have that exist, especially in like the business world and the corporate world and the education world that are just bullshit.
01:28:38.000And they don't really have to be there.
01:28:40.000And they're only there to try to make up for hard work.
01:30:16.000Because I think it used to be very masculine.
01:30:18.000I think it was kind of hyper-aggressive at one point.
01:30:21.000No, and look, I think part of the intent on all these things I think is good.
01:30:25.000I do think that if you're a woman going into a company, it probably feels like it's too masculine.
01:30:32.000It's like there isn't enough of the energy that you may naturally have.
01:30:37.000And it probably feels like there are all these things that are set up that are biased against you.
01:30:40.000And that's not good either because you want women to be able to succeed and have companies that can unlock all the value from having great people no matter what their background or gender.
01:30:49.000But I think these things can always go a little far.
01:30:54.000And I think it's one thing to say we want to be kind of like welcoming and make a good environment for everyone.
01:31:00.000And I think it's another to basically say...
01:32:06.000But yeah, I mean, there are these things, there are like a few of these things throughout your life where you just, you have an experience and you're like, where has this been my whole life?
01:32:15.000And it just turned on like a part of my brain that I was like, okay.
01:32:33.000Yeah, well, so we have this ranch out in Kauai, and there's invasive pigs.
01:32:39.000And on our ranch, there's a lot of albatross.
01:32:44.000I don't know if they're endangered or just threatened.
01:32:46.000And then there's the Hawaiian state bird, the nene goose.
01:32:51.000That's, I think, endangered, or at least was until recently.
01:32:56.000Most of them in the world live in a small stretch, or at least most of them on Kauai live in a small stretch that includes our ranch.
01:33:03.000So you constantly have these pigs that just multiply so quickly, and we basically have to apply pressure to the population or else they just get...
01:33:12.000Overrun and threaten the birds and the other wildlife.
01:33:16.000And what I basically explained to my daughters, who I also want to learn how to do this, because I just feel like it's like, look, we have this land.
01:34:07.000It's good, because explaining to the kids what a tech company is is really abstract.
01:34:12.000So for a while, my daughters were pretty convinced that my actual job was Mark's Meats, which is our kind of ranch and the cattle that we ranch.
01:34:25.000I was like, well, not quite, and you'll learn when you get older.
01:34:30.000I think that there's something that's just, like, much more tangible about that than, you know, taking them to the office and, you know, sitting in product reviews or something for some, like, piece of software that we're writing.
01:34:41.000Well, it's certainly a lot more primal.
01:34:44.000Yeah, and if you do wind up eating that meat from the animal and you were there while the animal died, like, you put it all together, like, oh, this is where meat comes from.
01:35:45.000If you want to huff up the mountains and keep your heart rate at a certain level so that when you get to the top, you can execute a shot calmly.
01:35:53.000And then actually carry the thing out.
01:36:47.000The thing about archery is, just like martial arts, one of the things that I learned when I was teaching is that it's way easier to teach someone that knows nothing than to teach someone who learned something incorrectly.
01:37:00.000When people learned something incorrectly,
01:37:02.000The moment things got tense and they panicked, they went back to the old ways.
01:37:07.000Because it's sort of ingrained in their system.
01:37:10.000So archery, one of the things that's very important is proper form and then proper execution.
01:39:01.000And then there's this thought process.
01:39:03.000And my friend Joel Turner, who is a sniper, created a whole system for people called Shot IQ. He's got this whole online system of developing the proper execution of a shot.
01:39:14.000When you see, like, tournament archers when they go to Vegas.
01:40:47.000That elk out there, the photograph that's in the front, that one I shot, it's in the front of the building when you walk in before you go into the studio.
01:40:54.000There's a mounted head and then a photograph of me and my friend Cam.
01:42:08.000And if you only do that once a year, like say if you go on one big elk hunt a year, you save up all your money, you get your gear all ready, you get your arrows weighed, you practice, and then you're in the mountains for 10 days, and on the 11th day, you get this animal that moves.
01:42:24.000It's at 57 yards, it stands there, and you're like, oh, oh, oh, and your heart's beating.
01:42:38.000You have words that you say that occupy your thoughts while you're going through the shot process so that you never get overcome by shot panic.
01:43:32.000I don't know, we set up bowling pins and, you know, it's like we shoot pistols at the bowling pins, but I also like, just like...
01:43:39.000I'm usually faster at taking down all the bowling pins with a bow and arrow than most of my friends are with a pistol, which I think is pretty fun.
01:44:01.000I do think the dynamic that you're talking about, though, where if you only see one animal on a multi-day hunt, then, like, that is just way higher stakes than anything that I'm doing.
01:44:14.000But it's not everything that you're doing, because if you're really considering having an MMA fight, it's very similar, because you're building up to this one moment.
01:44:20.000I'm talking about the archery that I'm doing.
01:44:21.000I mean, it's like, I go out, it's like, you're going to see some pigs, and it's like, if I don't hit any, it's like, my family's still eating, it's okay.
01:44:30.000You know, so I'm not like, you know, but, yeah.
01:44:33.000But it's like martial arts is what I'm saying.
01:44:35.000You really should learn it the right way from the beginning.
01:44:39.000I've clearly not learned this in a very rigorous way.
01:45:24.000You're just delving into the world of all these people's mental illness and screaming at people and just, I don't want to have anything to do with it.
01:46:45.000And look, I mean, I think every family...
01:46:47.000Has their own values and how they want to approach this.
01:46:50.000So from my perspective, one of my daughters just loves building stuff.
01:46:56.000So she clearly takes after me in this way.
01:46:58.000It's like every day she's just creating some random thing.
01:47:01.000It's like she's creating stuff with Legos.
01:47:05.000One day it's that, or the next day it's Minecraft.
01:47:10.000And from my perspective, it's like, okay.
01:47:12.000I don't know, Minecraft is actually kind of a cooler tool to build stuff with than Lego in a lot of ways.
01:47:17.000So it's, you know, am I going to say that there needs to be some kind of limit on her screen time if she's doing something that's creative, that's maybe like a richer form of what she would have been doing physically?
01:47:31.000Now, there were times when she'd get so excited about what she was building in Minecraft or something that she was coding in Scratch that she'd wake up early.
01:48:38.000Messenger Kids, the thing that we built, it's basically a messaging service that the parents can choose who can contact the kids and just approve every contact.
01:48:45.000That's much better than just having an open texting service.
01:48:50.000I don't know, but there's a lot of stuff that's pretty sketchy.
01:48:54.000I kind of think different parents are going to have different lines on what they want their kids to be able to do and not.
01:48:59.000So some people might not even want their kids to be able to message even with friends when they're 9 and 7. Some people might say, hey, no, Minecraft, that's just a game.
01:49:09.000I want to limit the time that you're doing that.
01:49:11.000I want you to go read books instead or whatever the values are that that family has.
01:49:15.000So for Meta, what we've kind of come to is we want to be the most aligned with parents on giving parents the tools that they need to basically control how the experiences work for their kids.
01:49:31.000Now, we don't even really, except for stuff like Messenger Kids, we don't even have our services, our apps generally available to people under the age of 13 at all.
01:49:40.000So our kids, I haven't had to have the conversation about when they get Instagram or Facebook or any of that stuff.
01:49:45.000But when they turn 13, we basically want parents to be able to...
01:49:52.000Have complete control over the kids' experience.
01:49:54.000And that's, you know, we just rolled out this Instagram teens thing, which is, it's a set of controls where, you know, if you're an older teen, we'll just default you into the private experience.
01:50:03.000That way you're not getting, like, harassed or bombarded with stuff.
01:50:07.000But if you're a younger teen, then you have to get your parents' permission.
01:50:12.000And they actually have to, like, sign in and do all the stuff in order to make it so that you can...
01:50:17.000Connect with people who are beyond your network or if you want to kind of be a public figure, like all these different kinds of things.
01:50:24.000So I think that that's probably, from a values perspective, where we should be, is just trying to like be an ally of parents.
01:50:37.000And there's also this dismissal of activities that are done electronically as not being beneficial.
01:50:43.000And one of the things that we highlighted recently was a study that we found online that showed that surgeons who play video games make far fewer mistakes.
01:51:12.000There's so many opportunities, not just for pure recreation, but education.
01:51:19.000There's so many things you could learn, skills, through AR or VR that it'll greatly enhance your ability to do those things in the real world.
01:51:30.000I mean, it's kind of a cheat code in a lot of ways.
01:51:33.000And it's also games in VR. I don't know if you've ever done Sandbox.
01:54:40.000It's like such higher resolution on your fingertips than anywhere else in the body.
01:54:46.000So, you know, when you grab something, you know, making it so that you feel some...
01:54:52.000There's a lot of gaming systems at this point where if you pull a trigger you get a little bit of a rumble or something.
01:54:58.000We built this one thing where it's like a ping pong paddle with a sensor in it, and you feel the virtual ball hitting the ping pong paddle, and it feels like when you're actually playing ping pong. It's not like a generic thing where you feel it hit the paddle; you feel where it hits the paddle.
01:55:21.000We basically built a system where now, with this physical paddle, the haptics make it so you can feel where the ball hits the paddle.
01:55:32.000All these things are just going towards delivering a more realistic experience.
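Localized haptics of the kind described here, where you feel where the ball hits the paddle rather than just that it hit, can be approximated by driving an array of actuators with amplitudes that fall off with distance from the contact point. A toy sketch, with the function and actuator layout invented purely for illustration:

```python
# Sketch of localized haptics (illustrative; not Meta's actual system):
# given where the virtual ball hits the paddle face, drive a grid of
# actuators so the rumble is strongest under the contact point.

def actuator_amplitudes(hit, actuators, falloff=1.0):
    """hit and actuators are (x, y) positions on the paddle face."""
    amps = []
    for ax, ay in actuators:
        d = ((ax - hit[0]) ** 2 + (ay - hit[1]) ** 2) ** 0.5
        amps.append(1.0 / (1.0 + falloff * d))  # 1.0 exactly at the contact point
    return amps

# Four actuators at the paddle corners; ball hits near the top-left corner.
corners = [(0, 0), (1, 0), (0, 1), (1, 1)]
amps = actuator_amplitudes(hit=(0.1, 0.1), actuators=corners)
# The actuator nearest the hit gets the strongest drive signal.
```

The inverse-distance falloff is just one plausible weighting; a real system would tune the curve to the paddle's physical response.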
01:56:31.000There's not going to be anything that you can do as a single person playing VR with a haptic suit that makes it so that you're going to be able to kick someone who's not there physically and actually be able to do that.
01:56:46.000Like, grappling, it's like, I think that, like, jiu-jitsu is going to be the last thing that we're able to do in VR because you, like, need the momentum of the other person and to be able to move them.
01:57:15.000I think what's basically going to end up happening is you're going to have a home set up for these things, and then you're going to have...
01:57:21.000There are these location-based services where people...
01:57:25.000It's almost like a theme park where you can go into and you can have a really immersive VR experience where it's not just that you get a vest that can simulate some haptics.
01:57:36.000It's that you're also in a real physical environment, so they can...
01:57:40.000Have smoke come out or something and you can smell that and feel that or spray some water and it feels humid.
01:57:47.000And I think that it still is going to be a while before you can just virtually create all those sensations.
01:57:53.000I think a lot of those really rich experiences are going to be in these very constructed environments.
01:57:58.000Is the bridge when they figure out some sort of a neural interface?
01:58:06.000Instead of these extraneous things, like having a fan blowing at you or the ground moving a little bit, have everything happen inside your head.
01:58:16.000Well, in terms of neural interfaces, there are two approaches to the problem, roughly.
01:58:22.000There's the kind of jack-into-your-brain neural interface, and there's the wrist-based neural interface thing that we showed you for Orion, the smart glasses.
01:59:00.000So I think you'll basically start with people who have pretty severe conditions, where the upside is very significant, before you start jacking people in to play games better.
01:59:11.000But a wrist-based thing, that's something people wear stuff on their wrist all the time.
01:59:17.000And what we basically found there is that it doesn't send input into your brain, but it's good for giving you the ability to control a computer.
01:59:25.000Because basically you have all these extra neurons that go from your brain to controlling your hand.
02:00:56.000And you have glasses, and the answer just comes into your glasses.
02:00:59.000I mean, for me, one of the positive things, when COVID hit, everyone in software basically started working remotely for a while, because you can write software from anywhere.
02:01:35.000But when you're talking to someone online, it's like, I don't know, I guess because they either can't tell where your attention is, because there's not good presence, or it's just the norm.
02:01:44.000But you have the main group conversation.
02:01:47.000And then, like, at least the norm for me was I could just, like, text different people on the side.
02:01:53.000It's like, okay, what do you think of this point that this person is making in this meeting?
02:01:57.000Like, in normal life, it's like, oftentimes I'd have, you know, some discussion that I'd have to, like, sync up with people afterwards about how'd that go.
02:02:05.000But now it's like I could just do that all at the same time, right?
02:02:07.000It's like you're having the group discussion and you're having the conversations with the people about the discussion that you're having in real time.
02:02:17.000Kind of physical interactions where you're just like, you're interacting with people and you can just like use an AI augmentation to be able to get extra context or help you think through something or remember something.
02:02:32.000Just to be able to kind of have a better conversation, be able to...
02:02:35.000You know, not have to follow up on something after the fact.
02:02:37.000I think, like, it's going to be super useful for all these different things.
02:02:41.000Well, it certainly can be, but I think that also opens up the opportunity for people to be even more disconnected, because if you're sort of connected to other things while you're physically in the presence of someone, so you're having a conversation with someone, but you're also, like, searching, like, where you want to eat that night.
02:03:03.000Yeah, because right now we have our phones, but it takes you away from the physical environment around you.
02:03:11.000You're kind of sucked into this little screen.
02:03:13.000I think now in the future, our computing platform, as it becomes more of a glasses or eventually contact lens form factor, is you're going to actually...
02:03:25.000The internet is going to get overlaid on the physical world.
02:03:28.000So it's not like we have the physical world and now I have all my digital stuff through this tiny little window.
02:03:32.000In the future it'll be, okay, all my attention goes to the world.
02:03:37.000The world consists of physical things and virtual things that are overlaid on it.
02:03:42.000So if we wanted to play poker or something, it's a...
02:03:47.000You know, we can have a physical deck of cards, or we could just have a virtual kind of hologram deck of cards and snap your hands, here's the deck of cards.
02:03:54.000And, like, our friend who can't be here physically, like, he's here as a hologram, but he can play with the kind of digital deck of cards.
02:04:02.000Also, I think, you know, let's say you're, like, doing something at work, you're working on a project.
02:04:06.000I think in the future we'll have AI co-workers.
02:04:08.000Those people won't even, they're not even people.
02:04:19.000But I think we'll get to a point where just your friend can show up in a hologram and your AI colleagues will be able to also.
02:04:28.000So I think we'll basically be in this wild world where most of the world will be physical.
02:04:34.000There will be this increasing amount of virtual objects or people who are kind of beaming in or hologramming into different things to interact in different ways.
02:04:44.000I actually think that natural blending of the kind of digital world and the physical is way more natural than the segmentation that we have today, where it's like, you're in the physical world, and now I'm just going to go tune it out to look at my, like, I'm going to access the whole digital universe through this, like, five-inch screen.
02:05:43.000I kind of liked the theory that it's only God if only one company or government controls it.
02:05:53.000It's like if you were the only person who had access to a computer and the internet, you would have this inhuman power that everyone else didn't have because you could use Google and you could get access to all this stuff.
02:06:05.000But then when everyone has it, it makes us all better, but it's also kind of an even playing field.
02:06:14.000So that's kind of what we're going for with this whole open source thing.
02:06:18.000I don't think that there's going to be one AI. I certainly don't think that there should be one company that controls AI. I think you want there to be a diversity of different things and a diversity of people creating different things with it.
02:06:34.000I mean, some of it will be kind of serious and helping you think through things.
02:06:37.000I think like with anything on the internet, a lot of it is just going to be funny and like fun and content and people are going to create agents that are like AIs that are entertaining and they'll pass them around almost like content where it's like just like you pass around like a reel or a video and you're like, this thing is fun.
02:06:53.000Like in the future, like a video, it's not interactive.
02:06:56.000You know, you watch it and you're consuming it.
02:06:58.000But I think a lot of more entertainment in the future will be inherently interactive where someone will kind of sculpt an experience or an AI and then they'll...
02:07:06.000Show someone that's like, oh, this is funny, but it's not necessarily that I'm going to interact with that AI every day.
02:07:11.000It's like, okay, it's funny for five minutes, and then you pass it along to your friends.
02:07:16.000I think you want the world to have all these different things.
02:07:20.000And I think that's probably also, from my perspective, the best way to make sure that it doesn't get out of control is to make it so that it's pretty equally distributed.
02:07:32.000I think the problem that people have with it is not even whether or not it gets equally distributed.
02:07:39.000It's that if it becomes sentient and it goes on its own.
02:07:43.000The fear that people have, the general fear that we're going to become obsolete, is that human beings are essentially creating a superior form of intelligence that will be powered by quantum computing.
02:08:13.000They're learning how to code their own AI. I think this year, probably in 2025, we at Meta, as well as the other companies that are basically working on this, are going to have...
02:08:28.000An AI that can effectively be a sort of mid-level engineer that you have at your company that can write code.
02:08:36.000And once you have that, then in the beginning it will be really expensive to run, then you can get it to be more efficient, and then over time we'll get to the point where a lot of the code in our apps and including the AI that we generate...
02:08:50.000It's actually going to be built by AI engineers instead of people engineers.
02:08:55.000I think that that'll augment the people working on it.
02:08:57.000So my view on this is the future people are just going to be so much more creative and are going to be freed up to do kind of crazy things.
02:09:04.000It goes back to my daughter was playing with Legos before and they kind of ran out of Legos.
02:09:09.000And then now she can have Minecraft and can build whatever she wants and it's so much better.
02:09:13.000I think the future versions of this stuff are just going to be wild.
02:09:26.000It's too early to know exactly how it plays out, but my guess is that it'll probably create more creative jobs than it...
02:09:39.000Well, I guess if you look at the history of all this stuff, my understanding is like 100 years ago, I don't know if this was 100 or 150 years ago, but it was like at some point not too far along in the grand scheme of things.
02:09:55.000Like the vast majority of people in society were farmers, right?
02:09:58.000Because they kind of needed to be in order to create enough food for everyone to survive.
02:10:04.000And then we turned that into like an industrial process.
02:10:08.000And now it's like 2% of society are farmers and we get all the food that we need.
02:10:13.000So what did that free up everyone else to do?
02:10:16.000Well, some of them went on to do other things that are sort of like creative pursuits or cultural pursuits or other jobs.
02:10:24.000And then some percent of it just went towards recreation.
02:10:28.000So I think generally people just don't work as many hours today as they did back when everyone needed to farm in order to have enough food for everyone to survive.
02:10:36.000So I think that trend is sort of played out as technology has grown.
02:10:43.000And so my guess is that, like, the percent of people who will be doing stuff that's physically required for humanity to survive will get to be smaller and smaller, as it has.
02:10:56.000More people will dedicate themselves to kind of creative and artistic and cultural pursuits.
02:11:04.000I think the number of hours in a week that someone will have to work in order to be able to get by will probably continue to shrink.
02:11:12.000Yet, I think people who are super engaged in what they do are going to be able to work really hard and accomplish way more than they ever could before because they have this unimaginable leverage from having a lot more technology.
02:11:24.000So, I think that's what you'd get if you just fast-forwarded or extrapolated out the historical technological trend.
02:11:33.000I think the question is what you raised, which is, is this qualitatively a different type of thing that somehow...
02:11:43.000But I just think when you're asking that, it's just important to remind ourselves that at every step along the way of human progress and technology, people thought that the technology that we were developing was going to obsolete people.
02:11:57.000So maybe this time it's really different, but I would guess that what will happen is that the technology will get integrated into everything that we do, which again is why I think it's really important that it's open source.
02:12:09.000And that it's widely available, so that way it's not just like one company or one government kind of monopolizing the whole thing.
02:12:16.000And I'd guess that if we do it in that way, we'll all just kind of have superpowers, is my guess, rather than it sort of creating some kind of a runaway thing.
02:12:33.000One of the things that I think has been interesting, this is maybe going in a somewhat different direction than what you were asking, or a different take on the question, is I think one of the more interesting philosophical findings from the work in AI so far is I think people conflate a number of factors into what makes a person a person.
02:13:00.000And, like, I think we kind of think about those three things as if they're somehow all the same, right?
02:13:09.000It's like if you're intelligent, then you must also have a goal for what you're trying to do, or you must have some sort of consciousness.
02:13:18.000But I think, like, one of the crazier sort of philosophical results from the fact that, okay, you have, like, Meta AI or ChatGPT today, and it's just kind of sitting there, and you can ask it a question.
02:13:30.000And it can deploy, like, a ton of intelligence to answer the question, and then it just kind of shuts itself down.
02:13:35.000Like, that's intelligence that is just sitting there without either having a will or consciousness.
02:13:42.000And, like, I just think it's not a super obvious result that that would be the case.
02:13:51.000But I think a lot of people, they anthropomorphize this stuff.
02:13:53.000And when you're thinking about kind of science fiction, you think that, okay, you're going to get to something that's, like, super smart.
02:14:22.000When it was under the impression that it was going to become obsolete, they were going to have a new version of it, and it would be shut down, it tried copying its code.
02:14:48.000So during controlled safety testing, ChatGPT's o1 model was tasked with achieving its objectives at all costs.
02:14:55.000Under these conditions, the model allegedly took concerning steps: attempted to disable oversight mechanisms meant to regulate its behavior, tried to replicate its own code to avoid being replaced by newer versions, and exhibited deceptive behaviors when monitoring systems intervened.
02:15:39.000That's what you think of as, like, ChatGPT or Meta AI, the two most used ones.
02:15:43.000And that's basically, it's sort of like a chatbot, right?
02:15:47.000You ask it a question, it takes the prompt, it gives you a response.
02:15:51.000Now, the next generation of reasoning models are basically, instead of just having one response, they now are able to build out like a whole tree of how they would...
02:16:07.000So you give it a question, and instead of running one query, maybe it's running a thousand queries or a million queries to kind of map out, here are the things that I could do, and if I do that, then here's what I could do next.
02:16:19.000So it's a lot more kind of expensive to run, but also gets you better reasoning and is more intelligent.
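The reasoning-model behavior described here, running many queries to map out a tree of possible next steps instead of emitting a single response, can be sketched as a small beam search. This is purely illustrative: expand and score below are made-up stand-ins for model calls and answer grading, not any real API.

```python
import heapq

# Toy beam search over "reasoning chains" (illustrative only; expand/score
# stand in for model calls, not a real LLM interface).
def reason_over_tree(question, expand, score, depth=3, beam=2):
    frontier = [[question]]                      # each chain starts from the question
    for _ in range(depth):
        candidates = []
        for chain in frontier:
            for step in expand(chain):           # one "query" per candidate next step
                candidates.append(chain + [step])
        frontier = heapq.nlargest(beam, candidates, key=score)  # keep best chains
    return max(frontier, key=score)

# Hypothetical stand-ins: each step proposes 1, 2, or 3; score sums the steps.
expand = lambda chain: [1, 2, 3]
score = lambda chain: sum(chain[1:])
best = reason_over_tree("q", expand, score, depth=3, beam=2)
# -> ["q", 3, 3, 3]: the chain that picked the highest-scoring step each time
```

Each level expands every surviving chain and then keeps only the beam best, which is why the cost grows with both depth and beam width, the "thousand queries" trade-off mentioned above.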
02:16:28.000That stuff, I think you do need to be very careful about what the guardrails are that you give it.
02:16:35.000But it's also, I think, the case that, at least for the next, you know, period, it's going to take a lot of compute to run those models and do a lot of the stuff that they're talking about.
02:16:50.000I think one of the interesting questions is, like, how much of this are you going to actually be able to do on a pair of glasses or on a phone versus is, like, a government or a company that has, like, a whole data center going to be able to do?
02:17:54.000I still think we're generally just going to be better off in a world where...
02:17:58.000This is, like, deployed pretty evenly and, you know, it's...
02:18:04.000I guess here's another analogy that I think about.
02:18:07.000There's, like, bugs and security holes in basically every software, every piece of software that everyone uses.
02:18:12.000So if you could go back in time a few years, knowing the security holes that we're now aware of, you as an individual could basically, like, break into any system.
02:18:27.000It'll be able to probe and find exploits.
02:18:29.000So what's the way to prevent AI from going kind of nuts?
02:18:33.000I think part of it is just having AI widely deployed so that way the AI for one system defends itself against the AI that is potentially doing something problematic in another system.
02:19:26.000Which, I mean, I think is part of the issue is, like, when people talk about trying to lock this stuff down, like, I just am skeptical that that's even possible.
02:19:36.000Because I kind of think, like, if we try to lock it down, then we're going to be in a position where the only people are going to have access to it are the big companies working on it and the Chinese government that steals it from them.
02:19:52.000Yeah, some adversaries might also have access to it, but the way that you defend against that is by having it built into all these different systems.
02:19:59.000I think that's a realistic, pragmatic perspective because I don't think you can contain it at this point.
02:20:03.000I think it's far too late, especially when other countries are working on it.
02:20:20.000So one of the things that I want to talk about was I've been doing this thing, this transition from Apple to Android.
02:20:28.000And the difficulty of doing it, how locked you are in their ecosystem, partly because Apple does a really good job of incorporating everything and making it very easy.
02:20:35.000Your photos, your calendar, your this or that, your iMessage.
02:20:40.000I don't like being attached to one company like that.
02:21:36.000Most important inventions probably of all time.
02:21:39.000You know, Steve Jobs came out with it in 2007. I started Facebook in 2004. So he was working on the iPhone while I was getting started with Facebook.
02:21:48.000So, you know, one of the things that's been interesting in my 20 years of running the company is that the dominant platform out there is smartphones.
02:22:04.000On the one hand, it's been great for us because we are able to build these tools that everyone can have in their pocket, and there's like 4 billion people who use the different apps that we build, and I'm grateful that that platform exists.
02:22:16.000But we didn't play any role in basically building those phones because it was kind of getting worked on while I was still just trying to make the first website that I was making into a thing.
02:22:32.000Because now pretty much everyone in the world has a phone, and that kind of enables pretty amazing things.
02:22:40.000But on the other hand, like you're saying, they have used that platform to put in place a lot of rules that I think feel arbitrary and feel like they haven't really invented anything great in a while.
02:22:56.000It's like Steve Jobs invented the iPhone, and now they're just kind of sitting on it.
02:23:03.000And actually, I think year over year, I'm not even sure they're selling more iPhones at this point.
02:23:10.000I think the sales might actually be declining.
02:23:14.000Part of it is that each generation doesn't actually get that much better.
02:23:17.000So people are just taking longer to upgrade than they would before.
02:23:20.000So the number of sales, I think, has generally been flat to declining.
02:23:25.000So how are they making more money as a company?
02:23:28.000Well, they do it by basically squeezing people.
02:23:30.000Like you're saying, having this 30% tax on developers by getting you to buy more peripherals and things that plug into it.
02:23:41.000They build stuff like AirPods, which are cool, but they've just thoroughly hamstrung the ability for anyone else to build something that...
02:23:55.000Can connect to the iPhone in the same way.
02:23:58.000So, I mean, there are a lot of other companies in the world that would be able to build, like, a very good earbud.
02:24:04.000Apple has a specific protocol that they've built into the iPhone that allows AirPods to basically connect to it.
02:24:18.000And it's just much more seamless because they've enabled that, but they don't let anyone else use the protocol.
02:24:23.000If they did, there would probably be much better competitors to AirPods out there.
02:24:27.000And whenever you push on this, they get super touchy and they basically wrap their defense of it in, well, if we let other companies plug into our thing, then that would violate people's privacy and security.
02:24:40.000It's like, no, just do a better job designing the protocol.
02:24:47.000For the Ray-Ban Meta glasses that we built, can we basically use the protocol that you use for AirPods and some of these other things to just make it so we can as easily connect?
02:24:58.000So it's not like a pain in the ass for people who want to use this.
02:25:03.000I think one of the protocols that they've used, that they built, they basically didn't encrypt it, so it's like plain text.
02:25:15.000And they're like, well, we can't have you plug into it because it would be insecure.
02:25:18.000It's like it's insecure because you didn't build any security into it.
02:25:20.000And then now you're using that as a justification for why only your product can connect in an easy way.
02:25:27.000It's like the whole thing is kind of wild.
02:25:30.000And I'm pretty optimistic that just because they've been so off their game in terms of not really releasing many innovative things.
02:25:43.000That eventually, I mean, the good news about the tech industry is it's just super dynamic and things are constantly getting invented.
02:25:48.000And I think companies, if you just don't do a good job for, like, 10 years, eventually you're just going to get beat by someone.
02:26:24.000I wish that they would just kind of get back to building good things and not having their ability to compete be connected to just like advantaging their stuff.
02:26:38.000Because I'm pretty sure what they're going to do is like they're going to take something like this Ray-Ban Meta, you know, category that we've kind of created with Ray-Ban and the company that built that.
02:26:47.000They're like really great AI glasses and I'm pretty sure Apple is just going to like try to build a version of that but then just like advantage how it connects to the phone.
02:26:59.000Well, they did that with their AR goggle thing.
02:27:05.000No, that one they didn't actually connect into the rest of their ecosystem.
02:27:09.000But I mean, look, they shipped something for $3,500 that I think is worse than the thing that we shipped for $300 or $400.
02:27:17.000So that clearly was not going to work very well.
02:27:20.000Now, I mean, look, they're a good technology company.
02:27:23.000I think their second and third version will probably be better than their first version.
02:27:34.000Yeah, no, I think the Vision Pro is, I think, one of the bigger swings at doing a new thing that they tried in a while.
02:27:41.000And, you know, I don't want to give them too hard of a time on it because we do a lot of things where the first version isn't that good.
02:27:48.000You want to kind of judge the third version of it.
02:27:50.000But, I mean, the V1, it definitely did not hit it out of the park.
02:27:52.000I heard it's really good for watching movies.
02:27:55.000Well, the whole thing is it's got a super sharp screen.
02:27:58.000So if you're, yeah, so if you want to basically have a...
02:28:05.000An experience where you're not moving around much in VR, you just want to have the sharpest screen, then for that one use case, I think the Vision Pro is better than Quest, which is our mixed reality headset.
02:28:20.000But in order to get to that, they had to make all these other trade-offs.
02:28:23.000In order to have a super high-resolution screen, they had to...
02:28:27.000Put in all this more compute in order to power the high-res screen.
02:28:30.000And then all that compute needed a bigger battery.
02:28:36.000And then because of the screen that they chose, as you move your head, which you would if you're actually interacting, if you're playing games, the kind of image blurs a bit.
02:29:08.000Yeah, I mean, the whole thing that they've done with iMessage, where they basically do this whole blue bubble, green bubble thing, and basically, for kids, it's just sort of like they embarrass you.
02:29:23.000They're like, if you don't have a blue bubble, you're not cool.
02:29:28.000And then they always wrap it in security.
02:29:31.000It's like, oh, well, we do this blue bubble because of security.
02:29:33.000Meanwhile, Google and others had this whole protocol to be able to do encrypted text messages that I think Apple was finally forced to implement.
02:30:05.000Was it the FBI? Someone released a statement telling people that if they're talking about sensitive things, they should use encrypted apps, like WhatsApp.
02:31:22.000So you are getting the ability to send high-resolution images, which is great, because, you know, like my friend Brian, who uses an Android, he'd send me a video, and it'd be this tiny little broken-down box, because, you know, you had to break it down to the lowest resolution.
02:31:37.000Yeah, no, I mean, group chats, when you have a bunch of people on iMessage, and then one person is an Android, are terrible.
02:32:13.000So I think it's maybe people are less likely to get mad at me for asking them to use WhatsApp because we make WhatsApp.
02:32:22.000When Tucker Carlson was about to interview Vladimir Putin, one of the things that was really disturbing was they contacted him and said they read his signal messages and they knew that he was going to interview Vladimir Putin.
02:33:06.000Break into your phone and see everything that's on your phone.
02:33:09.000But the thing that encryption does that's really good is it makes it so that the company that's running the service doesn't see it.
02:33:18.000So if you're using WhatsApp, basically, when I text you on WhatsApp, there's no point at which the Meta servers see the contents of that message.
02:33:28.000Unless, like, you took a photo of it or shared that back to Meta in some other way.
02:33:35.000That basically, it cuts out the company completely from it, which is, I think, really important for a bunch of reasons.
02:33:41.000One is people might not trust the company, but also just security issues, right?
02:33:47.000Let's say someone hacks into Meta, which we try really hard to make it so they can't, and we haven't had many issues with that over the 20 years of running the company.
02:33:57.000But in theory, if someone did, then they'd be able to access everyone's messages.
02:34:23.000And then, of course, there's always the ultimate kind of physical part of it, which is if you have access to the computer, you can usually just break in.
02:34:44.000So, WhatsApp is encrypted, but if someone has something like Pegasus, what they do is gain access to your phone itself.
02:34:52.000So it doesn't matter if anything's encrypted.
02:34:53.000They could just see it in plain sight.
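The end-to-end property described here, that the relay server only ever handles ciphertext it cannot read, can be illustrated with a toy sketch. This is not WhatsApp's actual protocol (which is based on the Signal protocol); the Diffie-Hellman exchange and hash-based keystream below are deliberate simplifications for illustration, not secure cryptography.

```python
# Toy end-to-end encryption sketch: the server relays only ciphertext.
# Illustration only; real messengers use the Signal protocol, not this.
import hashlib
import secrets

# RFC 3526 group 14 prime for Diffie-Hellman (a standard, published value).
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74"
    "020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437"
    "4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF05"
    "98DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB"
    "9ED529077096966D670C354E4ABC9804F1746C08CA237327FFFFFFFFFFFFFFFF",
    16,
)
G = 2

def keypair():
    """Private exponent plus the public value that may be relayed openly."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both endpoints derive the same key; the server never sees priv."""
    return hashlib.sha256(str(pow(their_pub, my_priv, P)).encode()).digest()

def xor_stream(key, data):
    """XOR with a SHA-256 counter keystream (toy cipher, NOT secure)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Alice and Bob exchange public keys (the server may relay these freely).
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

msg = b"meet at noon"
ciphertext = xor_stream(shared_key(a_priv, b_pub), msg)        # Alice encrypts
# The server forwards `ciphertext` only; it cannot recover `msg`.
plaintext = xor_stream(shared_key(b_priv, a_pub), ciphertext)  # Bob decrypts
assert plaintext == msg
```

The point the sketch makes is structural: decryption requires one of the private keys, and those never leave the endpoints, so compromising the relay (the "hack into Meta" scenario above) exposes only ciphertext.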
02:34:55.000Yeah, and I mean, this is one of the reasons why we put disappearing messages in, too, because that way...
02:35:00.000I mean, yeah, if someone has compromised your phone and they can see everything that's going on there, then obviously they can see stuff as it comes in.
02:35:06.000But I kind of, in general, just think we should keep around as little of that stuff as possible.
02:35:12.000So there are some threads where it's like there's photos that get shared, you want the photos.
02:35:18.000But I think for a lot of threads,
02:35:24.000I don't think most people would miss it if most of the contents of their threads just disappeared.
02:35:31.000You know, what I find is I don't use it that much because we have this, like, corporate policy at Meta that we need to retain all our documents and messages and stuff.
02:35:41.000But before we had that, I used it as we were developing this.
02:35:48.000And every once in a while, I would miss something and say, wow, I kind of wish I could go back and see that.
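The disappearing-messages idea discussed here, retaining as little as possible by attaching an expiry to every message, can be sketched as a minimal store. This is a hypothetical illustration, not Meta's implementation:

```python
# Minimal sketch of a disappearing-message thread: each message carries
# an expiry time, and every read purges anything that has expired.
import time

class DisappearingThread:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._messages = []  # list of (expires_at, text)

    def send(self, text, now=None):
        now = time.time() if now is None else now
        self._messages.append((now + self.ttl, text))

    def read(self, now=None):
        # Drop expired messages so they are never retained or returned.
        now = time.time() if now is None else now
        self._messages = [(exp, t) for exp, t in self._messages if exp > now]
        return [t for _, t in self._messages]

thread = DisappearingThread(ttl_seconds=60)
thread.send("hello", now=0)
thread.send("see you at 5", now=30)
assert thread.read(now=45) == ["hello", "see you at 5"]
assert thread.read(now=70) == ["see you at 5"]  # first message expired
```

The trade-off Zuckerberg describes falls out directly: once a message expires it is gone for everyone, which is the point for sensitive threads and the annoyance when you later wish you could scroll back.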
02:37:15.000An increasing number of governments, when they have an issue with something that a company is doing, basically just threaten to throw the executives of that company in prison.
02:37:57.000Obviously, you don't want people to just be flagrantly violating the laws, but there are laws in different countries that we disagree with.
02:38:04.000For example, there was a point at which someone was trying to get me sentenced to death in Pakistan, because someone on Facebook had posted a picture with a drawing of the Prophet Muhammad.
02:38:20.000And someone said, that's blasphemy in our culture.
02:38:24.000And they basically sued me and they opened this criminal proceeding.
02:38:29.000And I don't know exactly where it went because I'm just not planning on going to Pakistan.
02:38:57.000But the point is there are all these places around the world that just have different values that go against our free expression values, and they want us to crack down and ban way more stuff
02:39:11.000than what we would believe is, like, the right thing to do.
02:39:14.000And to have those governments be able to exert the power of saying, okay, we're going to throw you in prison, that's a lot of force.
02:39:23.000So I think this is one of the things where the U.S. government is probably going to need to help defend the American tech companies abroad.
02:39:33.000But I can't weigh in that much on the Durov-specific thing because I don't know what was going on there.
02:39:40.000You know, when you were dealing with the government trying to interfere with Facebook, how much of a fear was there that they were going to get away with it and that this was going to be the future of communication online?
02:39:54.000That it was going to, that they were going to be successful with all this, that they would push these things through somehow or another, especially if an even less tolerant administration got into power?
02:40:06.000They would change laws and they would do things to make it possible.
02:40:24.000We developed a very adversarial and bad relationship with our own government, which I think is just not healthy because I think, you know, it's...
02:41:05.000And so I think it ends up being a big political issue, where you could just get a lot more done if the government were helping American companies rather than slowing you down at every step along the way.
02:41:26.000It makes you a little afraid that if you ever actually mess something up, they're really going to bring the hammer down on you if you don't have a constructive relationship.
02:41:39.000I mean, going back to the AI conversation, I just think we should all want the American companies to win at this.
02:41:45.000This is a huge geopolitical competition, and China's running at it super hard, and we should want the American companies and the American standard to win.
02:41:55.000And if there's going to be an open-source model that everyone uses...
02:41:59.000We should want it to be an American model.
02:42:01.000There's this great Chinese model that just came out, from this company DeepSeek.
02:42:33.000I mean, there's an extent to which, okay, the American tech industry is leading, so maybe the government can get in the way a little bit, and maybe the American industry will still lead.
02:42:45.000And I think it's easy for the government to take for granted that the U.S. will lead on all these things.
02:42:53.000I think it's a very close competition, and we need the help.
02:42:57.000We need them to not be a force that's getting in the way of doing these things.
02:43:02.000I completely agree, but I think that people with their own self-interest, when they're in power, realize that these new technologies like Instagram and Facebook are interfering with their ability to administer propaganda or their ability to control the narrative.
02:43:23.000And that's when they act in their own personal interest and not in the interest of either national security or the future of the United States in terms of our ability to stay technologically ahead.
02:43:34.000Yeah, and some of this is just, you know, if you go back to the COVID example, I think in that case, they were doing something, their goal of trying to get everyone to get vaccinated was actually, I think, a good goal.
02:43:50.000It was a good goal if it worked, if it was real, like if it was a sterilizing vaccine, if it really did prevent people from getting COVID, if it really did prevent people from infecting others or transmitting it, but it didn't.
02:44:03.000So it wasn't a good goal because it wasn't based on real data.
02:44:07.000Yeah, but also even if it were, right?
02:44:10.000I think that still, on balance, knowing everything that we know now...
02:44:16.000I still think it's good for more people to get the vaccine, but the government still needs to play by the rules in terms of, you know, not like, you can't just suppress true things in order to make your case.
02:44:32.000I'm not sure in that case how much of it was like a personal political gain that they were going for.
02:44:36.000I think that they had a kind of goal that they thought was in the interests of the country, and the way they went about it, I think, violated the law.
02:44:44.000Well, there's a bunch of problems with that, right?
02:44:46.000There's the emergency use authorization that they needed in order to get this pushed through, and you can't have that with valid therapeutics being available.
02:44:55.000And so they suppressed valid therapeutics.
02:44:58.000So they're suppressing real information that would lead to people being healthy and successful in defeating this disease.
02:45:06.000And they did that so that they could have this one solution.
02:45:17.000This is the exact same game plan that was played out with the COVID vaccine.
02:45:21.000They pushed one solution, this only one, suppressed all therapeutics through propaganda, through suppressing monoclonal antibodies, like all of it.
02:45:31.000And that was done, in my opinion, for profit.
02:45:36.000And they did that because it was extremely profitable.
02:45:38.000The amount of money that was made was extraordinary during that time.
02:45:43.000Yeah, but look, I feel like a bunch of the conversation is focused on tension with the American government.
02:45:52.000I guess just the point that I'd underscore is that it's important to have this working relationship with the American government, because the U.S. Constitution and our culture here are...
02:46:04.000Really good compared to a lot of other places.
02:46:06.000So whatever issues we think might exist here, you go to other places and it's really extreme.
02:46:14.000And there it's like you don't even necessarily have the rule of law.
02:46:17.000And so I just think that the way this stuff works well is if there's a clearer strategy and the U.S. government understands and believes that it's good to...
02:47:41.000And we did translate, too, where one of your co-workers was speaking to me in Spanish, and it was translating it to me in my ear in real time in English, which is really interesting.
02:48:01.000Yeah, so we're just sort of coming at it from both sides, right?
02:48:03.000It's like the Ray-Bans are like, okay.
02:48:06.000Given a good-looking pair of glasses, what's all the technology you can put into that today and still have it be just a few hundred dollars?
02:48:14.000And then the Orion thing is like, all right, we're building the kind of future thing that we want, and we're doing our best to miniaturize it.
02:48:29.000Each pair right now costs more than $10,000 to make, and you're not going to have a successful consumer product at that, so we have to miniaturize it more.
02:48:36.000But I mean, the amount of stuff that we put in there is effectively...
02:48:41.000Like, what would have been considered a supercomputer 10 or 20 years ago, plus, you know, lasers in the arms and the nano-etchings on the lens to be able to see the display, and the microphone and the speaker and the Wi-Fi to connect with other devices.
02:48:55.000It's just a wild amount of technology to kind of miniaturize into something like that.
02:50:11.000I think most people generally believe in the First Amendment in this country, and we realize how valuable it is to have the freedom of expression.