Jack Dorsey, Vijaya Gadde, and Tim Pool join me in this episode to talk about Twitter, content moderation, free speech, and much more! Recorded in Los Angeles, CA. Tweet me if you liked the episode and/or have any thoughts on any of the topics covered. Tim's AMA is linked here. If you like the show and want to become a supporter, please HIT SUBSCRIBE and leave us a rating and review on Apple Podcasts, too!
00:01:42.000And the reason why we decided to come together is we had, I thought, a great conversation last time, but there's a lot of people that were upset that there were some issues that we didn't discuss or didn't discuss in depth enough or they felt that I didn't press you enough.
00:01:55.000I talked to Tim because, you know, Tim and I have talked before and he made a video about it and I felt like his criticism was very valid.
00:02:03.000So we got on the phone and we talked about it and I knew immediately within the first few minutes of the conversation that he was far more educated about this than I was.
00:02:11.000So I said, would you be willing to do a podcast and perhaps do a podcast with Jack?
00:02:53.000His account was frozen today because of an image that he had because he's a proponent of the carnivore diet.
00:02:59.000There's a lot of people that believe that this elimination diet is very healthy for you and it's known to cure a lot of autoimmune issues with certain people, but some people ideologically oppose it because they think it's bad for the environment or you shouldn't eat meat or whatever the reasons are.
00:03:13.000This is huge in the Bitcoin community.
00:03:25.000Essentially, it's an autoimmune issue.
00:03:28.000So, because he has a photo of a lion in a header eating what looks like a wildebeest or something like that, his account was locked for violating their rules against graphic violence or adult content in profile images.
00:03:45.000And I wanted to just mention that right away.
00:03:47.000Now, whose decision is something like that?
00:03:50.000Like, who decides to lock a guy's account out because it has a nature image of, you know, natural predatory behavior?
00:03:57.000On this particular case, it's probably an algorithm that detected it and made some sort of an assessment.
00:04:02.000But as a general rule, how we operate as a company is we rely on people to report information to us.
00:04:08.000So if you look at any tweet, you can kind of pull down on the caret on the right and you can say report the tweet, and then you have a bunch of categories you can choose from of what you want to report.
00:04:17.000I think this one in particular, though, is probably an algorithm.
00:04:20.000So does he have the option to protest that or to ask someone to review it?
00:04:27.000And I'm guessing that people are already reviewing it, but there's a choice to appeal any action, and that would go to a human to make sure that it is actually a violation of the rules, or in this case, if it's not, then it would be removed.
00:05:17.000And so then they get upset at him, and they can target posts and just report them en masse, and when they do that, then this becomes an issue.
00:05:26.000I think this does reveal part of the challenges that we face as a global platform at scale.
00:05:34.000I don't know what happened in this case.
00:05:36.000Sorry, it's hard for me to talk about it.
00:05:37.000But what I would say is that it doesn't really matter if one person reports it or 10,000 people report it.
00:05:43.000We're going to review the reports and we're going to make an assessment.
00:05:46.000And we're never going to kick someone off the platform finally and forever without a person taking a look and making sure that it's an actual violation of the rules.
00:06:21.000It's a really complicated world out there, so the motivations of why people mob report are different, and it's not always under someone's control.
00:06:30.000It could even be other carnivore diet proponents who are just jerks that don't like him because he's getting all the love.
00:06:39.000The idea, though, is that it does kind of highlight a bit of a flaw in that it's good that someone can – because you might see something awful, someone doxing someone or something like that, and then you can take that and report it, and then people can see it and get rid of it and minimize the damage that's done.
00:06:56.000There's another big problem here, and that is: is the carnivore diet legitimately healthy?
00:07:30.000So I guess what I'm trying to say is, would you guys restrict someone from sharing false information about vaccines that could get someone hurt?
00:07:39.000That is not a violation of Twitter's rules.
00:07:42.000I think, I mean, I'd be interested to hear your ideas around this, but our perspective right now is around this concept of variety of perspectives.
00:07:52.000Like, are we encouraging more echo chambers and filter bubbles, or are we at least showing people other information that might be counter to what they see?
00:08:01.000And there's a bunch of research that would suggest that it further emboldens their views.
00:08:05.000There's also research that would suggest that it at least gives them a consideration about what they currently believe.
00:08:13.000Given the dynamics of our network being completely public, we're not organized around communities, we're not organized around topics, we have a little bit more freedom to show more of the spectrum of any one particular issue.
00:08:28.000And I think that's how we would approach it from the start.
00:08:32.000That said, we haven't really dealt much with misinformation more broadly across, like, these sorts of topics.
00:08:40.000We've focused our efforts on elections and, well, mainly elections right now.
00:08:46.000You know, YouTube is a different animal.
00:08:48.000You know, YouTube, someone can really convince you that the earth is flat if you're gullible and you watch a 45-minute YouTube video.
00:08:54.000You know, it's kind of a different thing.
00:08:56.000But I wanted to just kind of get into that statement you made about misinformation and whether or not you'll police it.
00:09:01.000I think that the tough part of this is really, and I'd love to have a discussion about this, is do you really want corporations to police what's true and not true?
00:09:17.000But the places that we focus on is where we think that people are going to be harmed by this in a direct and tangible way that we feel a responsibility to correct.
00:09:25.000When you say in your rules, Tim, what do you mean by that?
00:10:14.000And I will clarify, too, your rules specifically say targeted misgendering and deadnaming, I believe is correct, right?
00:10:20.000So years ago, we passed a policy that we call our hateful conduct policy, and that prohibits targeting or attacking someone based on their belonging in any number of groups.
00:10:59.000I mean, if you created an account that only was there to call the same person stupid 5,000 times, we'd probably view that as a, you know...
00:11:27.000But can we just take a step back and try to level set what we're trying to do with our policies?
00:11:32.000Because I think it's worth doing that.
00:11:34.000So as a high level, I personally, and this is my job to run the policy team, I believe that everyone has a voice and should be able to use it.
00:11:43.000And I want them to be able to use it online.
00:11:45.000Now where we draw a line is when people use their voice and use their platform to abuse and harass other people to silence them.
00:11:53.000Because I think that that's what we've seen over the years is a number of people who have been silenced online because of the abuse and harassment they've received and they either stop talking or they leave the platform in its entirety.
00:12:03.000If you look at free expression and free speech laws around the world, they're not absolute.
00:12:28.000Right, so I guess there's the obvious question of why does it always feel like your policies are going one direction politically?
00:12:34.000You say it's about behavior, you said it several times already, but I've already, I've got tons of examples of that not being the case.
00:12:40.000And you will always be able to find those examples.
00:12:42.000Yeah, examples where you guys were alerted multiple times and did nothing, like when Antifa doxed a bunch of law enforcement agents, some of the tweets were removed, but since September, this tweet is still live with a list of private phone numbers, addresses, yet Kathy Griffin...
00:13:25.000Sometimes your tweet is forced to be deleted.
00:13:27.000It's a very rare occasion where we will outright suspend someone without any sort of warning or any sort of ability to understand what happened.
00:13:36.000What did you guys do with Kathy Griffin when she was saying she wanted the names of those young kids wearing the MAGA hats at the Covington High School kids?
00:13:45.000So in that particular case, you know, our doxing policy really focuses on posting private information, which we don't consider names to be private.
00:13:52.000We consider your home address, your home phone number, your mobile phone number, those types of things to be private.
00:13:58.000So in that particular case, we took what I think now is probably a very literal interpretation of our policy and said that that was not a doxing incident.
00:14:09.000And given the context of what was going on there, that if I was doing this all over again, I would probably ask my team to look at that through the lens of what was the purpose behind that tweet?
00:14:19.000And if the purpose was, in fact, to identify these kids to either dox them or abuse and harass them, which it probably was, then we should be taking a more expansive view of that policy and including that type of content.
00:14:30.000Especially considering the fact they're minors.
00:14:32.000I mean, I would think that right away that would be the approach.
00:14:35.000So this is a trial and error, sort of learn and move on with new information sort of a deal.
00:14:50.000Even if we get better, there will always be mistakes.
00:14:52.000But we're hoping to learn from those and to make ourselves better and to catch cases like Tim's or others where we clearly may have made an error.
00:15:00.000And I'm open to having those discussions.
00:15:02.000I'm sorry, Tim, I'm not familiar with your specific cases, but I'd love to follow up with you.
00:15:11.000So it's bit.ly slash antifatweet, all lowercase.
00:15:16.000This is also an evolution in prioritization as well.
00:15:20.000One of the things we've come to recently is we do need to prioritize these efforts, both in terms of policy, enforcement, how we're thinking about evolving them.
00:15:30.000One of the things that we want to focus on as number one is physical safety.
00:15:34.000And this leads you immediately to something like doxing.
00:15:37.000And right now, the only way we take action on a doxing case is if it's reported.
00:15:44.000What we want to move to is to be able to recognize those in real time, at least in the English language, recognize those in real time through our machine learning algorithms, and take the action before it has to be reported.
00:15:55.000So we're focused purely right now on going after doxing cases with our algorithms so that we can be proactive.
00:16:03.000That also requires a much more rigorous appeals process to correct us when we're wrong.
00:16:09.000But we think it's tightly scoped enough.
00:16:12.000It impacts the most important thing, which is someone's physical safety.
00:16:15.000Once we learn from that, we can really look at the biggest issue with our system right now, which is that all the burden is placed upon the victim.
00:16:25.000We don't have a lot of enforcement, especially with more of the takedowns that are run through machine learning and deep learning algorithms.
00:16:36.000But if something is reported, a human does review it eventually, or are there a series of reports that you never get to?
00:16:43.000I mean, we prioritize the queue based on severity, and the thing that will mark severity is something like physical safety or private information or whatnot.
00:16:51.000So generally, we try to get through everything, but we have to prioritize that queue even coming in.
00:16:57.000So if someone threatened the lives of someone else, would you ban that account?
00:17:13.000I don't necessarily want to give out specific usernames because then people just point the finger at me and say, I'm getting these people banned.
00:17:21.000During Covington, this guy said multiple times he wanted his followers to go and kill these kids.
00:17:26.000And we have to look at that, but we also have to look in the context.
00:17:29.000Because we also have, I think we talked about this a little bit in the last podcast, but we have gamers on the platform who are saying exactly that to their friends that they're going to meet in the game tonight.
00:17:40.000And without the context of that relationship, without the context of the conversation that we're having, we would take the exact same action on them incorrectly.
00:18:15.000Fact check me on that, but that's basically the conversation that was had.
00:18:19.000There's a guy at Disney, he posted a picture from Fargo of someone being tossed in a wood chipper, and he says, I want all these MAGA kids done like this.
00:18:26.000You had another guy who specifically said, lock them in the school, burn it down, said a bunch of disparaging things, and then said, if you see them, fire on them.
00:19:26.000So I want to make sure that when someone violates our rules, they understand what happened and they're given an opportunity to get back on the platform and change their behavior.
00:19:36.000And so in many of these cases, what happens is we will force someone to acknowledge that their tweet violated our rules, force them to delete that tweet before they can get back on the platform.
00:19:46.000And in many cases, if they do it again, we give them a timeout, which is like seven days, and we say, look, you've done it again.
00:20:26.000I'm happy to talk about Milo, and I actually brought the tweets.
00:20:29.000So let's preface that by saying the point I want to make sure is clear is that you had somebody who actively called for the death of people.
00:21:19.000So Milo had a number of tweets that violated our rules going back to 2014, but I'm going to talk about the final three in this three strikes concept.
00:21:29.000He claimed to be a BuzzFeed reporter in his bio, and he's a verified account, so that is impersonation.
00:21:39.000Well, BuzzFeed's a left-wing thing, so he was doing parody.
00:21:42.000Potentially, but our parody rules are very specific that if you have an account that is a parody account, you need to say that it is a parody account so you don't confuse people.
00:21:50.000Everybody who knows Milo would know that he's not a BuzzFeed reporter.
00:21:54.000But people who don't know Milo will look at that verified account and say, hey.
00:22:39.000I understand why reasonable people would have different impressions of this.
00:22:43.000I'm just going through and telling you what they are just so we can have all the facts on the table and then we can debate them.
00:22:47.000And then the last one, we found a bunch of things that he posted that we viewed as incitement of abuse against Leslie Jones.
00:22:55.000So there's a bunch of them, but the one that I like to look at, which really convinced me, is he posted two doctored tweets that were supposedly by Leslie Jones.
00:25:08.000The doxing incident wasn't related to the film.
00:25:11.000I hope we all agree that doxing is something that Twitter should take action on.
00:25:17.000And it can threaten people in real life.
00:25:19.000And I take an enormous amount of responsibility for that because I fear daily for the things that are happening on the platform that are translating into the real world.
00:25:29.000So Milo is a contentious figure, and there's certainly things you can pull up that I wouldn't agree with anything he did there.
00:26:34.000Number one, we haven't done enough education about what our rules are.
00:26:38.000Because a lot of people violate our rules and they don't even know it.
00:26:40.000Like, some of the statistics that we've looked at, like, for a lot of first-time users of the platform, if they violate the rule once, almost two-thirds of them never violate the rules again.
00:26:49.000So we're not talking about, like, a bunch of people accidentally.
00:26:51.000Like, if they know what the rules are, most people can avoid it.
00:26:54.000And most people, when they feel the sting of a violation, they go, okay, I don't want to lose my rights to post.
00:27:01.000So we have a lot of work to do in education so people really understand what the rules are in the first place.
00:27:06.000The other thing we have to do to address these allegations that we're doing this from a biased perspective is to be really clear about what types of behavior are caught by our rules and what types are not.
00:27:17.000And to be transparent within the product.
00:27:19.000So when a particular tweet is found to be in violation of our rules, being very, very clear, like this tweet was found to be in violation of this particular rule.
00:27:28.000Because we think the combination of education and transparency is really important, particularly for an open platform like Twitter.
00:27:34.000It's just part of who we are, and we have to build it into the product.
00:27:37.000I'd appreciate your particular thoughts, though, on those examples that he described, when he's talking about someone saying they should throw these children into a wood chipper versus Chuck Johnson saying he should take this guy – he wants to prepare a dossier to take this guy out, or how did he say it?
00:27:52.000He said something like, I'm going to take out DeRay McKesson with – he said, I'm preparing to take out – something like that, I can't remember.
00:28:14.000It's about a pattern and practice of violating our rules.
00:28:17.000And we don't want to kick someone off for one thing.
00:28:19.000But if there's a pattern and practice like there was from Milo, we are going to have to take action at some point because we can't sit back and let people be abused and harassed and silenced on the platform.
00:28:28.000Well, so one really important thing that needs to be stated is that Twitter, by definition, is a biased platform in favor of the left, period.
00:28:36.000I understand you might have your own interpretation, but it's very simple.
00:28:40.000Conservatives do not agree with you on the definition of misgendering.
00:28:42.000If you have a rule in place that specifically adheres to the left ideology, you, by default, are enforcing rules from a biased perspective.
00:28:49.000Well, Tim, there are a lot of people on the left who don't agree with how we're doing our job either.
00:29:09.000But in this particular case, it's how the speech is being used.
00:29:12.000This is a new vector of attack that people have felt that I don't want to be on this platform anymore because I'm being harassed and abused and I need to get the hell out.
00:29:21.000Will people harass and abuse me all day and night?
00:30:00.000Of course Twitter is going to enforce the social justice aspect of their policy immediately, in my opinion, probably because you guys have PR constraints and you're probably nervous about that.
00:30:09.000But when someone actually threatens me with a crime and incites their followers to do it, nothing got done.
00:30:13.000And I'm not the only one who feels that way.
00:30:22.000Maybe there was a mistake there and I'm happy to go and correct that and we can do it offline so we don't fear any sort of reprisal against you.
00:30:48.000The reason I bring him up is that Oliver Darcy, one of the lead reporters covering Alex Jones and his content, said on CNN that it was only after media pressure did these social networks take action.
00:30:58.000So that's why I bring him up specifically because it sort of implies you are under PR constraints to get rid of him.
00:31:03.000I think if you look at the PR that Twitter went through in that incident, it wouldn't be that we looked good in it.
00:31:08.000And that's not at all why we took action on this.
00:31:11.000You have to look at the full context on the spectrum here.
00:31:13.000Because one of the things that happened over a weekend is what Alex mentioned on your podcast with him.
00:31:22.000He was removed from the iTunes podcast directory.
00:31:26.000That was the linchpin for him because it drove all the traffic to...
00:31:43.000We did not because when we looked at our service and we looked at the reports on our service, we did not find anything in violation of our rules.
00:31:53.000Then we got into a situation where suddenly a bunch of people were reporting content on our platform, including CNN, who wrote an article about all the things that might violate our rules that we looked into.
00:32:16.000We resisted just being like a domino with our peers because it wasn't consistent with our rules and the contract we put before our customers.
00:32:52.000But now it's time to act on the enemy before they do a false flag.
00:32:55.000I know the Justice Department's crippled a bunch of followers and cowards, but there's groups, there's grand juries, there's you called for it.
00:33:01.000It's time politically, economically, and judiciously, and legally and criminally to move against these people.
00:33:08.000Get together the people you know aren't traitors, aren't cowards, aren't helping their frickin' bets, hedging their frickin' bets like all these other assholes do, and let's go, let's do it.
00:33:16.000So people need to have their, and then there's a bunch of other stuff, but at the end, so people need to have their battle rifles ready and everything ready at their bedsides, and you've got to be ready because the media is so disciplined in their deception.
00:33:29.000So you're saying that this is a call to violence against the media?
00:33:32.000That's what it sounded like to us at the time.
00:33:34.000And there have been a number of incidents of violence against the media.
00:33:37.000And again, I take my responsibility for what happens on the platform and how that translates off-platform very seriously.
00:33:43.000And that felt like it was an incitement to violence.
00:33:45.000So if he only tweeted the incitement to violence, he would have been fine?
00:33:48.000If he only posted that transcript saying, get your battlefield rifles ready, you wouldn't have deleted his account?
00:34:18.000There are certain types of situations where if you were reporting on, you know, war zone and things that might be happening, we would put an interstitial on that type of content that's graphic or violent, but we didn't feel that that was the context here.
00:34:31.000Well, there's a video that's been going around that was going around a few...
00:34:35.000Four or five weeks ago, the one where the girls were yelling at that big giant guy and the guy punched that girl in the face and she was like 11 years old.
00:35:01.000So the third strike that we looked at was a verbal altercation that Alex got into with a journalist, and in that altercation, which was uploaded to Twitter, there were a number of statements using eyes of the rat, even more evil-looking person,
00:35:17.000he's just scum, You're a virus to America and freedom, smelling like a possum that climbed out of the rear end of a dead cow.
00:35:23.000You look like a possum that got caught doing some really, really nasty stuff in my view.
00:36:22.000That's the hard part of content moderation at scale on global platforms.
00:36:26.000It's not easy, and I don't think Jack or I would tell you that it's easy.
00:36:28.000It's a preposterous volume that you guys have to deal with, and that's one of the things that I wanted to get into with Jack when I first had him on, because when my thought, and I wasn't as concerned about the censorship as many people were, my main concern was, what is it like to start this thing that's kind of for fun,
00:36:47.000and then all of a sudden it becomes the premier platform for free speech on the planet Earth?
00:36:53.000It is that, but it's also a platform that's used to abuse and harass a lot of people and used in ways that none of us want it to be used, but nonetheless it happens.
00:37:02.000And I think it's an enormously complicated challenge.
00:37:37.000Then he goes on TV and says, we got him banned.
00:37:40.000Then Alex Jones confronts him in a very aggressive and mean way, and that's your justification for, or I should say, I inverted the timeline.
00:37:46.000Basically, you have someone who's relentlessly digging through stuff, insulting you, calling you names, sifting through your history, trying to find anything they can to get you terminated, going on TV even, writing numerous stories.
00:37:58.000You confront them and say, you're evil, and you say a bunch of really awful mean things.
00:38:19.000The conservatives, to an extent, probably will try and mass flag people on the left.
00:38:24.000But from an ideological standpoint, you have the actual, you know, whatever people want to call it, sect of identitarian left that believe free speech is a problem, that have literally shown up in Berkeley burning free speech signs.
00:38:36.000And then you have conservatives who are tweeting mean things.
00:38:38.000And the conservatives are less likely, I think it's fair to point out, less likely to try and get someone else banned because they like playing off them.
00:39:39.000Quillette recently published an article where they looked at 22 high-profile bannings from 2015 and found 21 of them were only on one side of the cultural debate.
00:39:47.000But I don't look at the political spectrum of people when I'm looking at their tweets.
00:40:51.000And they tell us, like, you should consider different types of rules, different types of perspectives, different...
00:40:56.000Like, for example, when we try to enforce hateful conduct in our hateful conduct policy in a particular country, we are not going to know all the slur words that are used to target people of a particular race or a particular religion.
00:41:08.000So we're going to rely on building out a team of experts all around the world who are going to help us enforce our rules.
00:41:15.000So in the particular case of misgendering, I'm just trying to pull up some of the studies that we looked at, but we looked at the American Academy of Pediatrics and looked at the number of transgender youths that were committing suicide.
00:41:29.000It's an astronomical, I'm sorry, I can't find it right now in front of me.
00:41:32.000It's a really, really high statistic that's like 10 times what the normal suicide rate is.
00:41:50.000Because we thought, and we believe, that those types of behaviors were happening on our platform, and we wanted to stop it.
00:41:58.000Now there are exceptions to this rule.
00:41:59.000We don't, and this is all, this isn't about like public figures, and there's always going to be public figures that you're going to want to talk about, and that's fine.
00:42:07.000But this is about, are you doing something with the intention of abusing and harassing a trans person on the platform?
00:42:12.000And are they viewing it that way and reporting it to us so that we take action?
00:42:17.000So I will just state, I actually agree with the rule.
00:42:21.000From my point of view, I agree that bullying and harassing trans people is entirely wrong.
00:42:35.000And he's one of the biggest podcasts in the world.
00:42:38.000So if you have all of his millions upon millions of followers who are looking at this rule saying this goes against my view of the world, and it's literally 60-plus million in this country, you do have a rule that's ideologically bent.
00:43:33.000That rule is at odds with conservatives, period.
00:43:35.000Well, I think that you're generalizing, but I think it is really important, as Jack said, to the why behind these things.
00:43:42.000The why is to protect people from abuse and harassment on our platform.
00:43:46.000I understand, but you essentially created a protected class, if this is the case, because despite these studies and what these studies are showing...
00:43:55.000There's a gigantic suicide rate amongst trans people, period.
00:44:04.000Now, whether that is because of gender dysphoria, whether it's because of the complications from sexual transition surgery, whether it's because of bullying, whether it's because of this awful feeling of being born in the wrong gender, all that is yet to be determined.
00:44:20.000The fact that they've shown that there's a large amount of trans people that are committing suicide.
00:44:25.000I don't necessarily think that that makes sense in terms of people from someone's perspective, like a Ben Shapiro, saying that if you are biologically female, if you are born with a double X chromosome,
00:44:41.000you will never be XY. If he says that, that's a violation of your policy.
00:44:48.000And you're creating a protected class.
00:44:54.000If he wants to express that opinion, he is fully entitled to express that opinion.
00:45:01.000If he's doing it in a manner that's targeted at an individual, repeatedly, and saying that, that's where the intent and the behavior comes in.
00:45:09.000You know what's going on with Martina Navratilova right now?
00:45:22.000Epic, world-class, legend tennis player, who happens to be a lesbian, is being harassed because she says that she doesn't believe that trans women, meaning someone who is biologically male, who transitions to a female, should be able to compete in sports against biological females.
00:45:42.000This is something I have personally experienced a tremendous amount of harassment because I stood up when there was a woman who was a trans woman who was fighting biological females in mixed martial arts fights and destroying these women.
00:45:55.000And I was saying, just watch this and tell me this doesn't look crazy to you.
00:46:00.000Well, my point is, you should be able to express yourself.
00:46:07.000And if you say that you believe someone is biologically male, even though they identify as a female, that's a perspective that should be valid.
00:46:17.000First of all, it's biologically correct.
00:46:22.000So we have a problem in that if your standards and your policies are not biologically accurate, then you're dealing with an ideological policy.
00:47:01.000But going into, like, Meghan Murphy, for instance, right?
00:47:04.000You can call that targeted harassment.
00:47:06.000If Meghan Murphy, who is – for those that don't know, she's a radical feminist who refuses to use the transgender pronouns – If she's in an argument with a trans person over whether or not they should be allowed in sports or in biologically female spaces, and she refuses to use their pronoun because of her ideology,
00:47:39.000My understanding, and I don't have the tweet by tweet the way that I did for the others, but my understanding is that she was warned multiple times for misgendering an individual that she was in an argument with, and this individual is actually bringing a lawsuit against her in Canada as well.
00:47:54.000So you have an argument between two people, and you have a rule that enforces only one side of the ideology, and you've banned only one of those people.
00:48:02.000We have a rule that attempts to address what we have perceived to be instances of abuse and harassment.
00:48:21.000This is not something like you can say, water is wet, you know, this is dry.
00:48:25.000This is not like something you can prove.
00:48:27.000This is something where you have to acknowledge that there's an understanding that if someone is a trans person, we all agree to consider them a woman, and to think of them as a woman, to...
00:49:00.000If you're taking those viewpoints and you're targeting them at a specific person in a way that reflects your intent to abuse and harass them.
00:49:07.000What if it's in the context of the conversation?
00:49:09.000What if she's saying that, I don't think that trans women should be allowed in these female spaces to make decisions for women?
00:49:14.000And then this person's arguing and she says, a woman is biologically female.
00:49:54.000So you're having an individual who is debating a high-profile individual in her community, and she's expressing her ideology versus hers, and you have opted to ban one of those ideologies.
00:50:04.000And it's within the context of this conversation.
00:50:06.000This is what is being debated, whether or not someone is, in fact, a woman when they were born a male.
00:50:13.000I understand that this is controversial.
00:51:23.000Because I know people who have specifically begun using insults of animals to avoid getting kicked off the platform for breaking the rules.
00:51:29.000Certain individuals who have been suspended now use certain small woodland creatures in place of slurs, so they're not really insulting you, and it's fine.
00:51:36.000But there are people who consider themselves trans species.
00:51:39.000Now, I'm not trying to belittle the trans community by no means.
00:51:42.000I'm just trying to point out that you have a specific rule for one set of people.
00:51:45.000So there are people who have general body dysphoria.
00:51:55.000And more importantly, in the context of a targeted conversation, I can say a whole bunch of things that would never be considered a rule break, but that one is, which is ideologically driven.
00:52:09.000I mean, we're, again, always learning and trying to understand different people's perspectives.
00:52:14.000And all I'll say is that our intent is not to police ideology.
00:52:19.000Our intent is to police behaviors that we view as abuse and harassment.
00:52:23.000And I hear your point of view, and it's something that I'll definitely discuss with my team.
00:52:27.000And even in this case, it wasn't just going against this particular rule, but also things that were more ban-evasive as well, including taking a screenshot of the original tweet, reposting it, which is against our Terms of Service.
00:52:55.000I understand what you're saying, but I just want to make sure I point out she was clearly doing it as an effort to push back on what she viewed as an ideologically driven rule.
00:53:01.000Well, the problem is this is a real debate in...
00:53:07.000This is a debate where there's a division, and there's a division between people that think that trans women are invading biological female spaces and making decisions that don't benefit these biological females, cisgender, whatever you want to call them.
00:53:22.000This is an actual debate, and it's a debate amongst progressive people, amongst left-wing people, and it's a debate amongst liberals.
00:53:29.000This is, I mean, I would imagine the vast majority of people in the LGBT community are, in fact, on the left.
00:53:39.000So you have a protected class that's having an argument with a woman who feels like there's an ideological bent to this conversation that is not only not accurate, but not fair.
00:53:52.000And she feels like it's not fair for biological women.
00:54:02.000They were having an argument with someone on Twitter and responded with, dude, comma, you don't know, blah, blah, blah.
00:54:08.000And they got a suspension and a lockout and had to delete the tweet because the individual, using a cartoon avatar with the name apparently was Sam… Thank you for not being offended.
00:55:03.000I just don't want to run into beating a dead horse.
00:55:06.000It's a really important thing to go over all the nuances of this particular subject because I think that one in particular highlights this idea of where the problems lie in having a protected class.
00:55:20.000And I think we should be compassionate.
00:56:16.000I think racism is looking at someone from whatever race and deciding that they are in fact less, or less worthy, or less valuable, whatever it is.
00:56:28.000That takes place across the platform against white people.
00:56:33.000Now I'm not saying white people need to be protected.
00:56:35.000I know it's easier being a white person in America.
00:56:40.000To have a policy where you can make fun of white people all day long, but if you decide to make fun of Asian folks or, you know, fill in the blank, that is racist.
00:56:51.000But making fun of white people isn't, and it doesn't get removed.
00:57:01.000My understanding is that you guys started banning people officially under these policies around 2015, and all the tweets she made was prior to that, and so you didn't enforce the old tweets.
00:57:08.000Yeah, so our hateful conduct policy, Joe, just to be clear, is across the board, meaning it doesn't just protect women.
00:57:49.000Because if you try to police every opinion that people have about different races or religions, like, obviously, that's a very different story.
00:57:56.000So this is about if you target that to somebody who belongs to that class, and that's reported to us, that is a violation of our rules.
00:59:25.000Just to address that point, and I think Jack talked about this a little bit, like this is where right now we have a system that relies on people to report it to us, which is a huge burden on people.
00:59:35.000And especially if you happen to be a high-profile person, and Tim, you would understand this, you're not going to sit there and report every tweet.
00:59:50.000So this is where we have to start getting better at identifying when this is happening and taking action on it without waiting for somebody to tell us it's happening.
00:59:57.000But using an algorithm, though, do you not miss context?
01:00:01.000I mean, it seems to me that there's a lot of people that say things in humor, you know.
01:00:05.000Or slurs within particular communities, which is perfectly reasonable.
01:00:10.000So, yes, there is a danger of the algorithms missing context, and that's why we really want to go carefully into this, and this is why we've scoped it down, first and foremost, to doxing.
01:00:20.000Which is, at least, first, it hits our number one goal of protecting physical safety.
01:00:25.000Like, making sure that nothing done online will impact someone's physical safety offline, on our platform, in this case.
01:00:32.000The second is that there are patterns around doxing that are much easier to...
01:00:39.000There are exceptions, of course, because you could dox a representative's public office phone number and email address, and the algorithm might catch that, not have the context that this is a U.S. representative and this information is already public.
01:00:57.000So essentially, it highlights how insanely difficult it is to monitor all of these posts.
01:03:10.000I think, you know, I don't want harassment.
01:03:12.000But the reason I bring this up is getting into the discussion about democratic health of a nation.
01:03:18.000So I think it can't be disputed at this point that Twitter is extremely powerful in influencing elections.
01:03:25.000I'm pretty sure you guys published recently a bunch of tweets from foreign actors that were trying to meddle in elections.
01:03:30.000So even you as a company recognize that foreign entities are trying to manipulate people using this platform.
01:03:35.000So there's a few things I want to ask beyond this, but...
01:03:39.000Wouldn't it be important then to just – at a certain point, Twitter becomes so powerful in influencing elections and giving access to even the president's tweets that you should allow people to use the platform under the norms of U.S. law.
01:03:52.000First Amendment, free speech, right to expression on the platform.
01:03:56.000This is becoming too much of a – it's becoming too powerful in how our elections are taking place.
01:04:02.000So even if – You are saying, well, hate speech is our rule, and a lot of people agree with it.
01:04:06.000If at any point one person disagrees, there's still an American who has a right to access to the public discourse, and you've essentially monopolized that, and not completely, but for the most part.
01:04:17.000So isn't there some responsibility on you to guarantee, to a certain extent, lest regulation happen, right?
01:04:23.000Like, look, if you recognize foreign governments are manipulating our elections, then shouldn't you guarantee the right to an American to access this platform to be involved in the electoral process?
01:04:35.000I'm not sure I see the tie between those things, but I will address one of your points, which is we're a platform that serves the world.
01:04:44.00075% of the users of Twitter are outside of the United States.
01:04:49.000So we don't apply laws of just one country when we're thinking about it.
01:04:55.000We think about how do you have a global standard that can meet the threshold of as many countries as possible, because we want all the people in the world to be able to participate, and also meet elections like the Indian election coming up as well.
01:05:40.000I'm not sure what you're talking about, but we did have our Vice President of Public Policy testify in front of Indian Parliament a couple weeks ago, and they were really focused on election integrity and safety and abuse and harassment of women and political figures and the likes.
01:05:55.000So my concern, I guess, is I recognize you're a company that serves the world, but as an American, I have a concern that the democracy I live in, the democratic republic, I'm sorry, and the democratic functions are healthy.
01:06:06.000One of the biggest threats is Russia, Iran, China.
01:06:10.000They're trying to meddle in our elections using your platform, and it's effective, so much so that you've actually come out and removed many people.
01:06:16.000Covington was apparently started by an account based in Brazil.
01:06:19.000The Covington scandal where this fake news goes viral.
01:06:21.000It was reported by CNN that it was a dummy account.
01:06:25.000They were trying to prop it up and they were pushing out this out of context information.
01:06:31.000You've now got a platform that is so powerful in our American discourse that foreign governments are using it as weapons against us.
01:06:39.000And you've taken a stance against the laws of the United States.
01:06:43.000I don't mean like against like you're breaking the law.
01:06:45.000I mean you have rules that go beyond the scope of the U.S. which will restrict American citizens from being able to participate.
01:06:50.000Meanwhile, foreign actors are free to do so so long as they play by your rules.
01:06:54.000So our elections are being threatened.
01:06:56.000By the fact that if there's an American citizen who says, I do not believe in your misgendering policy, and you ban them, that person has been removed from public discourse on Twitter.
01:07:04.000Right, but they don't get banned for saying they don't agree with it.
01:07:07.000They get banned for specifically violating it by targeting an individual.
01:07:11.000Let's say in protest, an individual repeatedly says, no, I refuse to use your pronouns, in like Megan Murphy's case.
01:07:17.000She's Canadian, so I don't want to use her specifically.
01:07:19.000The point I'm trying to make is, at a certain level, there are going to be American citizens who have been removed from this public discourse, which has become so absurdly powerful that foreign governments weaponize it, because you have different rules than the American government has.
01:07:32.000Just to be clear, my understanding, and I'm not an expert on all the platforms, is that foreign governments use multiple, multiple different ways to interfere in elections.
01:07:41.000It is not limited to our platform, nor is it limited to social media.
01:08:45.000That means your platform is so powerful, it's being used to manipulate elections, and you have rules that are not recognized by the government to remove American citizens from that discourse.
01:08:55.000So as a private platform, you've become too powerful to not be regulated if you refuse to allow people free speech.
01:09:03.000But I'm trying to pick apart the connection.
01:09:08.000So, yes, we do have an issue with foreign entities and misinformation.
01:09:16.000And this is an extremely complicated issue, which we're just beginning to understand and grasp and take action on.
01:09:26.000I don't think that issue is solved purely by not being more aggressive on something else that is taking people off the platform entirely as well, which is abuse and harassment.
01:09:40.000It's a cost-benefit analysis, ultimately, and our rules are designed, again...
01:09:46.000You know, they don't always manifest this way in the outcomes, but in terms of what we're trying to drive is opportunity for every single person to be able to speak freely on the platform.
01:11:41.000That means American citizens who are abiding by all of the laws of our country are being restricted from engaging in public discourse because you've monopolized it.
01:11:49.000Because these foreign governments are restricted by the same rules.
01:11:52.000So if they violate those same rules, they will be removed.
01:11:55.000So if they play within those rules, they can participate in the discourse even if they are just trying to manipulate our elections.
01:12:01.000On the other hand, if the people that are on the platform... I see what you're saying.
01:12:28.000We can see that at a certain point, Twitter is slowly gaining, in my opinion, too much control from your personal ideology based on what you've researched, what you think is right, over American discourse.
01:12:41.000If Twitter, and again, this is my opinion, I'm not a lawmaker, but I would have to assume if Twitter refuses to say, in the United States, you are allowed to say what is legally acceptable, period, then lawmakers' only choice will be to enforce regulation on your company.
01:12:57.000Actually, Tim, I spent quite a bit of time talking to lawmakers as part of my role.
01:13:02.000I spent a lot of time in D.C. I want to say that Jack and I have both spent a lot of time in D.C. And I think from the perspective of lawmakers, they, across the spectrum, are also in favor of policing abuse and harassment online and bullying online.
01:13:20.000Those are things that people care about because they affect their children, and they affect their communities, and they affect individuals.
01:13:27.000And so I don't think that, and as a private American business, we can have different standards than what an American government-owned corporation or American government would have to institute.
01:13:42.000And I understand your point about the influence, and I'm not denying that.
01:13:46.000Certainly, Twitter is an influential platform.
01:13:48.000But, like anything, whether it's the American law or the rules of Twitter or the rules of Facebook or rules of any platform, there are rules.
01:13:57.000So it is your choice whether to follow those rules and to continue to participate in a civic dialogue, or it is your choice to not do that.
01:14:36.000I think we should regulate you guys because you are unelected officials running your system the way you see fit against the wishes of a democratic republic.
01:14:44.000And there are people who disagree with you who are being excised from public discourse because of your ideology.
01:14:51.000So Tim, just so I understand, so are you suggesting that we don't have any policies around abuse and harassment on the platform?
01:14:59.000I'm trying to understand what it is you're saying because I'm not sure I'm following you.
01:15:03.000So you don't think we should have any rules about abuse and harassment?
01:15:06.000So even the threats that you received that you mentioned that we didn't...
01:15:10.000But you mentioned a number of threats that you received, and you were quite frustrated that we hadn't taken action on them.
01:15:14.000You think we shouldn't have rules that- Well, I'm frustrated because of the hypocrisy, when I see the flow go in only one direction.
01:15:22.000And then what I see are Republican politicians, who in my opinion are just too ignorant to understand what the hell's going on around them.
01:15:27.000And I see people burning signs that say free speech.
01:15:30.000I see you openly saying, we recognize the power of our platform, and we're not going to abide by American norms.
01:15:37.000I see the manipulation of Twitter in violation of our elections.
01:15:40.000I see Democratic operatives in Alabama waging a false flag campaign using fake Russian accounts.
01:15:46.000And the guy who runs that company has not been banned from your platform.
01:15:50.000Even after it's been written by the New York Times, he was doing this.
01:15:53.000So we know that not only are people manipulating your platform, you have rules that remove honest American citizens with bad opinions who have a right to engage in public discourse.
01:16:03.000And it's like you recognize it, but you like having the power?
01:16:06.000I'm not quite sure at what point— So just to get back to my point, so you believe that Twitter should not have any rules about abuse and harassment or any sort of hate speech on the platform?
01:16:20.000The point I'm trying to make is— But that is a point you're trying to make.
01:16:23.000You're asking us to comply with the U.S. law that would criminalize potential speech and put people in jail for it, and you're asking us to enforce those standards.
01:16:32.000Well, I mean, if you incite death, that's a crime.
01:17:25.000I think your team understands what they're doing.
01:17:29.000However, you get really dangerous territory if someone accidentally tweets an N and you assume they're trying to engage in a harassment campaign, which is why I said let's talk about learn to code.
01:17:38.000But we do look at coordination of accounts.
01:17:41.000Do you do that through direct messages?
01:17:49.000We don't read them unless someone reports a direct message to us that they have received.
01:17:53.000And so you read their direct message that they send to you?
01:17:56.000So if you have a direct message and someone says something terrible and then you receive a death threat and you report that to us, then we would read it because you've reported it to us.
01:18:05.000Does anyone in the company have access to direct messages other than that?
01:18:10.000Only in the context, again, of reviewing reports.
01:18:13.000Other than that, they're not accessible?
01:18:26.000So if Tim writes an N, and I write an I, and Jamie writes a G, can you go into our direct messages and say, hey, let's fuck with Jack, and we're going to write this stuff out, and we're going to do it, and let's see if they ban us.
01:18:55.000Well, I think beyond the N, like, you know, the first person with the letter, you can't prove he did it, but everybody else, you kind of can.
01:19:33.000So there was a situation, I guess about a month ago or so, where a number of journalists were receiving a variety of tweets, some containing learn to code, some containing a bunch of other coded language that was wishes of harm.
01:19:52.000These were thousands and thousands of tweets being directed at a handful of journalists.
01:19:56.000And we did some research and what we found was a number of the accounts that were engaging in this behavior, which is tweeting at the journalists with this either Learn to Code or things like Day of the Rope and other coded language, were actually ban evasion accounts.
01:20:13.000That means accounts that had been previously suspended.
01:20:15.000And we also learned that there was a targeted campaign being organized off our platform to abuse and harass these journalists.
01:20:25.000An activist who works for NBC wrote that story and then lobbied you.
01:20:29.000You issued an official statement, and then even the editor-in-chief of the Daily Caller got a suspension for tweeting Learn to Code at the Daily Show.
01:20:36.000So I have never talked to anybody from NBC about this issue, so I'm not sure.
01:20:43.000The narrative goes far and wide amongst your circles.
01:20:46.000Then all of a sudden you're seeing high-profile conservatives tweeting a joke getting suspensions.
01:20:50.000So again, some of these tweets actually contained death threats, wishes of harm, other coded language that we've seen to mean death to journalists.
01:21:02.000So it wasn't about just the learn to code.
01:21:05.000It was about the context that we were seeing.
01:21:17.000So we were looking at the context, and what was happening is there were journalists receiving hundreds of tweets.
01:21:22.000Some had death threats, some had wishes of harm, some just learned to code.
01:21:25.000And in that particular context, we made a decision.
01:21:28.000We consider this type of behavior dogpiling, which is when...
01:21:31.000All of a sudden individuals are getting tons and tons of tweets at them.
01:21:35.000They feel very abused and harassed on the platform.
01:21:37.000Can we pause this because this is super confusing for people who don't know the context.
01:21:41.000The learn to code thing is in response to people saying that people that are losing their jobs like coal miners and truck drivers and things like that could learn to code.
01:22:01.000So the first stories that came out were simply like, can miners learn to code?
01:22:05.000And the hashtag learn to code is just a meme.
01:22:09.000It's not even necessarily a conservative one, though you will see more conservatives using it.
01:22:12.000People are using it to mock how stupid the idea is of taking a person who's uneducated, who's in their 50s, who should learn some new form of vocation, and then someone says, learn to code.
01:22:24.000And so then other people, when they're losing their job or when something's happening, people would write, learn to code, because it's a meme.
01:22:32.000I would just characterize learn to code as a meme that represents the elitism of modern journalists and how they target certain communities with disdain.
01:22:41.000So to make that point, there are people who have been suspended for tweeting something like, I'm not too happy with how BuzzFeed reported the story, hashtag learn to code.
01:22:49.000Making the representation that these people are snooty elites who live in ivory towers.
01:22:57.000This is a meme that has nothing to do with harassment, but some people might be harassing somebody and might tweet it.
01:23:03.000Why would we expect to see – even still today, I'm still getting messages from people with screenshots saying I've been suspended for using a hashtag.
01:23:08.000And the editor-in-chief of The Daily Caller, he quote-tweeted a video from The Daily Show with hashtag learn to code, and he got a suspension for it.
01:25:48.000Well, the first articles weren't mean.
01:25:50.000It was just, Learn to Code kind of identified, you have these journalists who are so far removed from middle America that they think you can take a 50-year-old man who's never used a computer before and put him in a, you know.
01:26:00.000The stories, I think, were legitimate.
01:26:02.000But the point more so is it was a meme.
01:26:04.000The hashtag, the idea of Learn to Code condenses this idea, and it's easy to communicate, especially when you only have 280 characters, that there is a class of individual in this country.
01:26:14.000I think you mentioned on, was it Sam Harris, that the left, these left liberal journalists only follow each other.
01:26:19.000Yeah, in the run-up to the 2016 elections.
01:26:23.000Yeah, and so, I mean, I still believe that to be true, and I've worked in these offices.
01:26:28.000They've done the study again, the visualization, and now there is a lot more cross-pollination.
01:26:32.000But what we saw is folks who were reporting on the left end of the spectrum mainly followed folks on the left, and folks on the right followed everyone.
01:26:41.000What you were talking about earlier, that there's these bubbles.
01:26:44.000There's bubbles, and we've helped create them and maintain them.
01:26:48.000So here's what ends up happening, and this is one of the big problems that people have.
01:26:51.000With this story particularly, you have a left-wing activist who works for NBC News.
01:26:57.000I'm not accusing you of having read the article.
01:26:58.000He spends like a day lobbying to Twitter saying, Guys, you have to do this.
01:27:05.000The next day he writes a story saying that 4chan is organizing these harassment campaigns and death threats.
01:27:10.000And while 4chan was doing threads about it, you can't accuse 4chan simply for talking about it because Reddit was talking about it too, as was Twitter.
01:27:17.000So then the next day, after he published his article, now he's getting threats.
01:27:22.000And then Twitter issues a statement saying, we will take action.
01:27:25.000And to make matters worse, when John Levine, a writer for The Wrap, got a statement from one of your spokespeople saying, yes, we are banning people for saying learn to code.
01:27:35.000A bunch of journalists came out and then lied.
01:27:37.000I have no idea why, saying this is not true.
01:28:12.000You have situations like this where you can see – this journalist, I'm not going to name him, but he routinely has very left-wing – I don't want to use overly esoteric words, but intersectional dogmatic points of view.
01:28:27.000So like intersectional feminism is considered like a small ideology.
01:28:32.000People refer to these groups as the regressive left or the identitarian left.
01:28:36.000These are basically people who hold views that a person is judged based on the color of their skin instead of the content of their character.
01:28:41.000So you have the right-wing version, which is like the alt-right, the left-wing version, which is like...
01:28:47.000Intersectional feminism is how it's simply referred to.
01:28:50.000So you'll see people say things like – typically when they rag on white men or when they say like white feminism, these are signals that they hold these particular views.
01:28:59.000And these views are becoming more pervasive.
01:29:00.000So what ends up happening is you have a journalist who clearly holds these views.
01:29:03.000I don't even want to call him a journalist.
01:29:05.000He writes an extremely biased and out-of-context story.
01:29:08.000Twitter takes action in response, seemingly in response.
01:29:11.000Then we can look at what happens with Oliver Darcy at CNN. He says, you know, the people at CPAC, the conservatives are gullible eating red meat from grifters, among other things, disparaging comments about the right.
01:29:20.000And he's the one who's primarily advocating for the removal of certain individuals who you then remove.
01:29:25.000And then when Kathy Griffin calls for doxing, that's fine.
01:29:27.000When this guy calls for the death of these kids, he gets a slap on the wrist.
01:29:32.000And look, I understand the context matters, but grains of sand make a heap, and eventually you have all of these stories piling up, and people are asking you why it only flows in one direction.
01:29:40.000Because I've got to be honest, I'd imagine that calling for the death three times of any individual is a bannable offense, even without a warning.
01:29:50.000We see these, you know, people say men aren't women, though, and they get a suspension.
01:29:53.000We see people say the editor-in-chief of The Daily Caller may be the best example.
01:29:57.000Hashtag learn to code, quoting The Daily Show, and he gets a suspension.
01:30:01.000Threatening death and inciting death is a suspension, too.
01:30:04.000It feels like it's only going in one direction.
01:30:07.000Yeah, I think we have a lot of work to do to explain more clearly when we're taking action and why, and certainly looking into any mistakes we may have made in those particular situations.
01:30:16.000So would you guys agree that in tech, I think we can all agree this, I would hope you agree, tech tends to lean left.
01:31:12.000My point is that I think a lot of people that are on the right feel disenfranchised by these platforms that they use on a daily basis.
01:31:21.000I don't know what the percentages are in terms of the number of people that are conservative that use Twitter versus the number of people that are liberal, but I would imagine it's probably pretty close, isn't it?
01:31:42.000But the people that run, whether it's Google or Twitter or Facebook, any of these platforms, YouTube for sure, powerful leaning towards the left.
01:32:58.000If we were purely looking at the content, but a lot of this agent work is based on the behaviors, all the things that we've been discussing in terms of the context of the actual content itself.
01:33:26.000I definitely hear the point in terms of us putting this rule forth.
01:33:29.000But we have to balance it with the fact that people are being driven away from our platform.
01:33:34.000And they may not agree with me on that, my folks from Missouri.
01:33:38.000But I think they would see some valid argument in what we're trying to do to, again, increase the opportunity for as many people as possible to talk differently.
01:33:52.000What community is and isn't deserving of protection?
01:33:55.000Are conservatives not deserving of protection for their opinions?
01:33:57.000But I wanted to focus on individuals and increasing the absolute number of people who have opportunity to speak on the platform in the first place.
01:34:05.000So then do you need a rule for body dysphoria?
01:34:12.000And this came from a call and research.
01:34:17.000And there's disagreement as to whether this is the right outcome or not and this is the right policy.
01:34:24.000And yes, our bias does influence looking in this direction.
01:34:28.000And our bias does influence us putting a rule like this in place, but it is with the understanding of creating as much opportunity as possible for as many people to speak based on the actual data that we see of people leaving the platform because of experiences they have.
01:35:05.000And to your credit, I really do appreciate the fact that you're very open about that you have made mistakes and that you're continuing to learn and grow and that your company is reviewing these things and trying to figure out which way to go.
01:35:15.000And I think we all need to pay attention to the fact that this is a completely new road.
01:35:38.000And I know people that have been banned to them, this is a matter of ideology, this is a matter of this, this is a matter of that.
01:35:44.000There's a lot of debate going on here, and that's one of the reasons why I wanted to bring you on.
01:35:48.000Because, Tim, because you know so much about so many of these cases, because you are a journalist, and you're very aware of the implications and all the problems that have been...
01:35:58.000That maybe have slipped through my fingers.
01:36:00.000So I do want to make one thing really clear, though.
01:36:03.000I have a tremendous amount of respect and trust for you when you say you wanted to solve this problem simply because you're sitting here right now and these other companies aren't, right?
01:36:54.000We know that this very binary off or on platform isn't right, and it doesn't scale, and it ultimately goes against our key initiative of wanting to promote healthier conversation.
01:37:08.000I just don't think that's what you're doing.
01:37:18.000And we need to, the reason I'm going on all these podcasts and having these conversations, and ideally Vijaya's getting out there more often as well, is because we don't see enough and hear enough from her.
01:37:28.000We need to have these conversations so we can learn.
01:37:30.000We can get the feedback and also pay attention to where the technology is going.
01:37:35.000Before the podcast, we talked a little bit about, and I talked about it on our previous podcast and also Sam's, I think?
01:38:04.000And that is a reality that we need to pay attention to and really understand our value.
01:38:09.000And I believe a lot of our value in the future, not today, again, we have a ton of work, is to take a strong stance of like we are going to be a company that given this entire corpus of conversation and content within the world, we're going to work to promote healthy public conversation.
01:38:29.000And if you disagree with it, you should be able to turn it off and you should be able to access anything that you want as you would with the internet.
01:38:38.000But those are technologies that are just in the formative stages and presenting new opportunities to companies like ours.
01:38:45.000And there's a ton of challenges with them and a ton of things that we've discussed over the past hour.
01:38:52.000That it doesn't solve and maybe exacerbates, especially around things like election interference and some of the regulatory concerns that you're bringing.
01:39:08.000We have four indicators right now that we're working on.
01:39:13.000With an external lab. We want other labs to take it up, open source it, make sure that people can comment on it, that people can help us define it.
01:39:20.000We'll use that interpretation on our own algorithms and then push it.
01:39:37.000You want to have a healthy conversation.
01:39:39.000You want to maximize the amount of people.
01:39:40.000That means you've got to cut off all the tall grass and level everything out.
01:39:44.000So if you've decided that this one rule needs to be enforced because certain things are offensive...
01:39:49.000But can I explain what health at least means to us in this particular case?
01:39:53.000So, like, we talked a little bit about this on the previous podcast, but, like, we have four indicators that we're trying to define and try to understand if there's actually something there.
01:40:49.000Are the participants receptive to debate and to civility and to expressing their opinion?
01:40:56.000And even if it is something that might be hurtful, are people receptive to at least look at and be empathetic and look at what's behind that?
01:41:05.000This is the one we have the most measurement around today.
01:41:09.000We can determine and predict when someone might walk away from a Twitter conversation because they feel it's toxic.
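Dorsey's claim here, that Twitter can predict when someone will abandon a conversation they find toxic, can be pictured as a simple threshold model over per-reply toxicity scores. Everything in the sketch below (the scoring scale, the window, the threshold) is invented for illustration and is not Twitter's actual system.

```python
# Hypothetical sketch of a "walk-away" predictor: flag a conversation as
# likely to drive a participant off when the replies directed at them
# trend toxic. Scores, window, and threshold are invented.

def likely_to_walk_away(reply_toxicity, threshold=0.7, window=3):
    """Return True if the last `window` replies average above `threshold`.

    reply_toxicity: per-reply toxicity scores in [0, 1], oldest first,
    as a real system might get from a trained classifier.
    """
    if len(reply_toxicity) < window:
        return False
    recent = reply_toxicity[-window:]
    return sum(recent) / window > threshold

# A conversation that turns hostile trips the predictor:
print(likely_to_walk_away([0.1, 0.2, 0.8, 0.9, 0.95]))  # True
print(likely_to_walk_away([0.1, 0.2, 0.1]))             # False
```

A production system would learn the scores and the decision boundary from behavioral data rather than hard-coding them, but the measurement Dorsey describes reduces to something of this shape.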
01:43:26.000It is completely far off from where we are today.
01:43:29.000Not only have we had to address a lot of these issues that we're talking about at this table, but we've also had to turn the company around from a business standpoint.
01:43:39.000We've had to fix all of our infrastructure that's over 10 years old.
01:43:42.000We had to go through two layoffs because the company was too large.
01:43:48.000And I don't know any other way to do this than be really specific about our intentions and our aspirations and the intent and the why behind our actions.
01:43:59.000And not everyone's going to agree with it in the particular moment.
01:44:03.000So I want to point this out before I make my next statement, though, just real quick.
01:44:07.000It seems like the technology is moving faster than the culture.
01:44:09.000So I do recognize you guys are in a rock and a hard place.
01:44:12.000How do you get to a point where you can have that open source crypto blockchain technology that allows free and open speech?
01:46:19.000So the whole focus behind the temporary suspensions is to at least give people pause and think about why and how they violated our particular rules that they signed up for when they came in through our terms of service.
01:46:38.000Whether you agree with them or not, this is the agreement that we have with people.
01:47:23.000If you're up in the middle of the night and someone sends you an email that you find insulting, you type an email, go to sleep.
01:47:30.000Wake up in the morning like, I'm going to say something nice.
01:47:33.000That's how I wind up interacting with these people.
01:47:37.000But what do you think can be done for people like, let's say Megan Murphy, because she seems one of the, it's as easy to see her perspective as any.
01:47:47.000What do you think could be done for her?
01:48:00.000Right now, that's the only option that we've built into our rules, but we have every capability of changing that, and that's something that I want my team to focus on, is thinking about, as Jack said, not just coming back after some time-bound period, but also, like, what more can and should we be doing within the product itself?
01:48:16.000Early on to educate people about the rules.
01:48:18.000So one of the things that we're working on is a very, very simplified version of the Twitter rules.
01:48:23.000That's two pages, not 20. I've made sure that my lawyers don't write it and it's written in as plain English as we can.
01:48:32.000And like really taking the time to educate people.
01:48:35.000And I get people aren't always going to agree with those rules and we have to address that too.
01:48:39.000But at least simplifying it and educating people so that they don't even get to that stage.
01:48:44.000But once they do, understanding that there are going to be different contexts in people's lives, different times, they're going to say and do things that they may not agree with and they don't deserve to be permanently suspended forever.
01:48:59.000So we, this is something that actually we just had a meeting on this earlier this week with our executive team and, you know, identifying kind of some of the principles by which we would want to think about, you know, time bounding suspension.
01:49:23.000When someone reports something, instead of you having to worry about it, there would be no accusation of bias if 100,000 users were randomly selected to determine it, because Periscope does this.
01:49:35.000So Periscope has a content moderation jury.
01:49:41.000So we flag, based on the machine learning algorithms, and in some cases reports, particular replies.
01:49:49.000We send them to a small jury of folks to ask, is this against our terms of service, or is this something that you believe should be in the channel or not?
01:50:09.000It has some gaming aspects to it as well.
01:50:11.000But we do have a lot of experiments that we're testing and we want to build confidence and it's actually driving the outcomes that we think are useful.
01:50:23.000And Periscope is a good playground for us across many regards.
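The Periscope flow Dorsey describes, flagged content going to a small random sample of viewers whose majority vote decides the outcome, can be sketched in a few lines. The function names, jury size, and quorum below are all assumptions for illustration, not Periscope's actual parameters.

```python
# Hypothetical sketch of a content-moderation "jury": a flagged comment
# goes to a few randomly chosen viewers, and a majority vote decides
# whether it is hidden. All names and numbers here are invented.
import random

def empanel_jury(viewer_ids, size=5, seed=None):
    """Randomly select jurors from current viewers. Random selection is
    the point: no moderator picks who judges the comment, which is what
    deflects the accusation of bias."""
    rng = random.Random(seed)
    return rng.sample(viewer_ids, min(size, len(viewer_ids)))

def jury_verdict(votes, quorum=3):
    """votes: list of booleans (True = 'this violates the rules')."""
    if len(votes) < quorum:
        return "no quorum"
    return "hide" if sum(votes) > len(votes) / 2 else "keep"

jurors = empanel_jury(list(range(100)), size=5, seed=42)
print(jurors)  # five randomly chosen viewer ids
print(jury_verdict([True, True, True, False, False]))  # hide
```

The "gaming aspects" mentioned above would live on top of this, e.g. weighting jurors by how often their past votes agreed with the consensus.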
01:50:27.000I think, ultimately, one of the greater philosophical challenges is that you are a massively powerful corporation.
01:50:39.000Well, we're a publicly traded corporation, so anybody can buy stock, but that doesn't mean they have influence on day-to-day operations.
01:50:46.000Well, I think, depending on which political faction you ask, they'll say money is influence.
01:50:49.000So I'm not going to say that the Saudi prince who invested in Twitter – because again, it's been a while since I've read these stories – is like showing up to your meetings and throwing his weight around.
01:50:57.000But at a certain point – He's definitely not doing that.
01:51:47.000That's actually what I was on the phone with.
01:51:49.000Alex was texting me saying that he never did anything to endanger any child and that he was disputing what people were saying about a video of a child getting harmed.
01:51:58.000And so do we just trust an unelected – I mean extremely wealthy individuals, Saudi princes.
01:52:07.000Who knows where the influence is coming from?
01:52:09.000Your rules are based on a global policy.
01:52:11.000And I'm sitting here watching, wow, these people who are never chosen in this position have too much power over my politics.
01:52:17.000I think that that's why it's so important that we take the time to build transparency into what we're doing.
01:52:22.000And that's part of what we're trying to do is not just in being here and talking to you guys, but also building it into the product itself.
01:52:30.000I think one of the things that I've really loved about a new product launch, what we've done, is to disable any sort of ranking in the home timeline if you want, and you don't have to see our algorithms at play anymore.
01:52:41.000These are the kinds of things that we're thinking about.
01:52:43.000How do we give power back to the people using our service so that they can see what they want to see and they can participate the way they want to participate?
01:52:50.000And this is long term, and I get that we're not there yet, but this is how we're thinking about it.
01:53:12.000Look, I definitely understand the mistrust that people have in our company, in myself, in the corporate structure, in all the variables that are associated with it, including who chooses to buy on the public market, who chooses not to.
01:53:27.000I get all of it, and I grew up on the Internet.
01:53:31.000I'm a believer in the Internet principles, and I want to do everything in my power to make sure that we are consistent with those ideals.
01:53:38.000At the same time, I want to do everything in my power to make sure that every single person has the opportunity to participate.
01:53:57.000So even in countries where it's criminal to be LGBT, you will still ban someone for saying something disparaging to or saying something to that effect?
01:54:07.000Let's say Saudi Arabia sentenced someone to death.
01:54:10.000I don't want to call it Saudi Arabia specifically.
01:54:11.000Let's call it Iran because I believe that's the big focus right now with the Trump administration.
01:54:15.000Iran, it's my understanding, it's still punishable by death.
01:54:28.000But there are some countries where, for instance, Michelle Malkin recently got really angry because she received notice that she violated blasphemy laws in Pakistan.
01:54:37.000So you do follow some laws in some countries, but it's not a violation.
01:54:41.000I guess the question I'm asking is, in Pakistan, it's very clearly a different culture.
01:54:47.000We do have a per-country takedown, meaning that content might be non-visible within that country, but visible throughout the rest of the world.
01:54:55.000Just to add on to what Jack's saying, we actually are very, very transparent about this.
01:54:59.000So we publish a transparency report every six months that details every single request that we get from every government around the world and the content that they ask us to remove, and we post that to an independent third-party site.
01:55:10.000So you could go right now and look and see every single request that comes from the Pakistani government and what content they're trying to remove from Pakistan.
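The per-country takedown Dorsey and Gadde describe amounts to a visibility check at read time: a piece of content carries the set of countries where a government demanded removal, and it stays visible everywhere else. The data shapes below are invented for illustration; they are not Twitter's internal representation.

```python
# Hypothetical sketch of per-country withholding: content is hidden only
# where a legal demand applies and remains visible to the rest of the
# world. Field names are invented.

def is_visible(tweet, viewer_country):
    """True unless the viewer is in a country where the content
    was withheld in response to a government request."""
    return viewer_country not in tweet["withheld_in"]

tweet = {"id": 1, "text": "...", "withheld_in": {"PK"}}
print(is_visible(tweet, "PK"))  # False: hidden where the demand applies
print(is_visible(tweet, "US"))  # True: visible everywhere else
```

The transparency piece is orthogonal: the withholding record itself is what gets published in the six-monthly report and surfaced to the affected user.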
01:55:19.000I've seen a lot of conservatives get angry about this, and it's kind of confusing.
01:55:27.000Blasphemy laws, posting pictures of Muhammad.
01:55:31.000Are they angry about our transparency report?
01:55:34.000There's a perception that you sending that notice is like a threat against them for violating blasphemy laws, whereas it's very clearly just letting you know a government has taken action against you.
01:55:44.000It's saying that the government has restricted access to that content in that country.
01:55:48.000And the reason we tell users or tell people that that's happened is because a lot of them may want to file their own suit against the government, or a lot of them may be in danger if they happen to be under that particular government's jurisdiction, and they may want to take action to protect themselves if they know that the government is looking at the content in their accounts.
01:56:08.000We don't always know where you are or what country you live in.
01:56:11.000And so we just send that notice to try to be as transparent as possible.
01:56:15.000The main point I was trying to get to is...
01:56:18.000Your policies support a community, but there may be laws in a certain country that does not support that community and finds it criminal.
01:56:24.000So your actions are now directly opposed to the culture of another country.
01:56:29.000I guess the point I'm trying to make is that if you enforce your values, which are perceivably not even the majority of this country, if you consider yourself more liberal-leaning than you're half of the United States, but you're enforcing those rules on the rest of the world that use the service, it's sort of forcing other cultures to adhere to yours.
01:56:48.000So a lot of our rules are based in more of the UN Declaration than just purely US. Doesn't the UN Declaration guarantee the right of all people through any medium to express their opinion?
01:57:51.000I really appreciate the fact that you guys are so open and that you're willing to come on here and talk about this because you don't have to.
01:58:17.000Look, I think it's also important that the company is not just me.
01:58:24.000We have people in the company who are really good at this and are making some really tough decisions and having tough conversations and getting pushback and getting feedback.
01:58:51.000He's a conservative personality, but he's very, very controversial for, like, fake news or something.
01:58:59.000I don't know too much about him, so I don't want to accuse him of things because I don't know who he is.
01:59:02.000But he was in something where he tried accusing Mueller of sexual assault, and it turned out to be just completely fake, ridiculous.
01:59:11.000This is a gentleman that was in the USA Today article where he admitted that he had used tactics in the past to influence the election, and he will continue to do so using all of his channels.
01:59:25.000And so when we saw that report, our team looked at his account.
01:59:29.000We noticed there were multiple accounts tied to his account, so fake accounts that he had created that were discussing political issues and pretending to be other people from other perspectives.
01:59:38.000We would have phone numbers linking accounts together or email addresses, in some cases IP addresses, other types of metadata that are associated with accounts, so we can link those accounts together.
01:59:48.000And having multiple accounts in and of itself is not a violation of our rules because some people have their work account, their personal account.
01:59:56.000It's when you're deliberately pretending to be someone else and manipulating a conversation about a political issue.
02:00:01.000And those are exactly the types of things that we saw the Russians do, for example, in the 2016 election.
02:00:07.000So it was that playbook and that type of activity that we saw with Jacob Wohl.
02:00:12.000And that's why his accounts were suspended.
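The attribution Gadde describes, tying a cluster of fake accounts to one operator through shared phone numbers, email addresses, or IP addresses, is essentially connected-component grouping over shared identifiers. The sketch below is a minimal, hypothetical version; the account records and field names are invented, and a real system would use far more signals.

```python
# Hypothetical sketch of linking accounts through shared metadata, the
# signal described above (phones, emails, IPs tying sock-puppet accounts
# to one operator). Account data is invented for illustration.
from collections import defaultdict

def link_accounts(accounts):
    """Group account ids that transitively share any identifier."""
    by_identifier = defaultdict(set)
    for acct in accounts:
        for key in ("phone", "email", "ip"):
            if acct.get(key):
                by_identifier[(key, acct[key])].add(acct["id"])

    # Merge overlapping groups with a small union-find over account ids.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)

    for ids in by_identifier.values():
        ids = list(ids)
        for member in ids[1:]:
            union(ids[0], member)

    clusters = defaultdict(set)
    for acct in accounts:
        clusters[find(acct["id"])].add(acct["id"])
    return list(clusters.values())

accounts = [
    {"id": "real",  "phone": "555-0100"},
    {"id": "sock1", "phone": "555-0100", "ip": "10.0.0.1"},
    {"id": "sock2", "ip": "10.0.0.1"},
    {"id": "other", "email": "x@example.com"},
]
for cluster in link_accounts(accounts):
    print(sorted(cluster))
```

Note the transitivity: "sock2" shares nothing with "real" directly, but both link through "sock1", which is exactly why this kind of evidence supports "direct attribution" while a VPN plus fresh emails, as Tim raises below, can defeat it.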
02:00:44.000But it's about grains of sand making a heap in the flow of a direction where we can see Jacob Wohl has said he's done this, so you're like, we're going to investigate, we ban him.
02:00:51.000It was recently reported and covered by numerous outlets that a group called New Knowledge was meddling in the Alabama election by creating fake Russian accounts to manipulate national media into believing that Roy Moore was propped up by the Russians.
02:01:03.000Facebook banned him, as well as four other people, but Twitter didn't.
02:01:13.000So you didn't ban the guy doing it, but you banned the people.
02:01:16.000So in the case of Jacob Wohl, we were able to directly attribute through email addresses and phone numbers his direct connection to the accounts that were created to manipulate the election.
02:01:26.000If we're not able to tie that direct connection on our platform, or law enforcement doesn't give us information to tie attribution, we won't take action.
02:01:35.000And it's not because of political ideology, it's because we want to be damn sure before we take action on accounts.
02:01:39.000So someone could use a VPN, perhaps, and maybe additional email accounts, and they could game the system in that way.
02:01:45.000There are certainly sophisticated ways that people can do things to mask who they are and what accounts that they're controlling.
02:01:52.000And just the internal conversation, Tim, just to provide more light into what happens.
02:01:56.000Like, I got an email or a text from Vijaya one morning that said, we are going to permanently suspend this particular account.
02:02:05.000And it's not a, you know, what do you think?
02:03:13.000I'm getting confused about what we're talking about.
02:03:15.000Jacob Wohl, it's announced in USA Today, he says, I'm doing this.
02:03:19.000And you're like, okay, we can look at his account, we can see it, we get rid of him.
02:03:22.000With New Knowledge, you said you did take those accounts down.
02:03:25.000I believe we were able to take down a certain cluster of accounts that we saw engaging in the behavior, but we weren't necessarily able to tie it back to one person controlling those accounts.
02:03:36.000We like to have some sort of attribution that's direct that we can see.
02:03:41.000Would we just take any newspaper or any article at face value and just action them?
02:03:46.000Would you have to contact him and get some sort of a statement from him in order to take down his account?
02:03:52.000I mean, I don't think he would admit to manipulating Twitter if Twitter asked him.
02:03:56.000But if you could get the fact that he communicated with a newspaper, right?
02:04:00.000To clarify what they said, what they claimed to the New York Times was that it was a false flag.
02:04:06.000New York Times said they reviewed internal documents that showed they admitted it was a false flag operation.
02:04:11.000The guy who runs the company said, oh, his company does this.
02:04:15.000He wasn't aware necessarily, but it was an experiment.
02:04:19.000So he's been, in my opinion, kind of duplicitous, not straightforward. But at the time of this campaign, which he claims to know about, he tweeted that it was real.
02:04:29.000So during the Roy Moore campaign, he tweets, wow, look at the Russians.
02:04:32.000Then it comes out later, his company is the one that did it.
02:04:35.000So you're kind of like, oh, so this guy was propping up his own fake news, right?
02:04:39.000Then when they get busted, he goes, oh, no, it's just my company doing an experiment.
02:05:07.000I don't want to get sued and have my facts wrong.
02:05:10.000But the reason I bring this up was not to accuse you of wrongdoing, was to point out that...
02:05:15.000I don't think that the people who work at Twitter are twirling their mustaches, laughing, pressing the ban button whenever they see a conservative.
02:05:22.000I think it's just there's a bias that's unintentional that flows in one direction.
02:05:26.000So you see the news about Jacob Wall, and I think there's a reason for it too.
02:05:29.000For one, your staff is likely more – you've mentioned more likely to lean left and look at certain sources.
02:05:37.000So you're going to hear about more things more often and take action on those things as opposed to the other side of the coin.
02:05:43.000But we have to consider where the actions are taking place.
02:05:46.000I'm speaking more broadly to the 4,000 people that we have as a company versus the deliberateness that we have on Vijaya's team, for instance.
02:05:53.000I just mean when we look at a company-wide average of all of your employees and the direction they lean versus the news sources they're willing to read, you're going to see a flow in one direction, whether it's intentional or not.
02:06:25.000I knew one name and I didn't know another name.
02:06:27.000And it was because Vijaya said, you know, we're permanently banning this account.
02:06:31.000And yes, we didn't have the same sort of findings in the other particular account, which I got feedback on, passed to her, and we didn't find what we needed to find.
02:06:42.000But to be clear, the team had taken action on this stuff months ago when it actually had happened.
02:06:47.000I think, you know, a lot of what people assume is malintent is sometimes fake news.
02:06:53.000You know, I think one of my biggest criticisms in terms of what's going on in our culture is the news system is, like you pointed out, although it's changed, left-wing journalists only follow themselves.
02:07:22.000So if, you know, we hear about Jussie Smollett.
02:07:25.000We hear about how the story goes wild.
02:07:27.000But there's like 800 instances of Trump supporters wearing MAGA hats getting beaten, you know, throughout the past couple of years.
02:07:32.000We had a guy show up to a school in Eugene, Oregon with a gun and fire two rounds at a cop wearing a Smash the Patriarchy and Chill shirt.
02:07:38.000And those stories don't make the headlines.
02:07:40.000So it's, you know, when the journalists are inherently in a bubble, I hear you.
02:07:55.000I think our biggest issue and the thing that I want to fix the most is the fact that we create and sustain and maintain these echo chambers.
02:08:42.000It might change in the future, but we can't do this without a level of transparency, because we minimize something Vijaya spoke to earlier, which is speaking truth to power, holding people to account.
02:08:53.000Even things like the Fyre Festival, where, you know, you had these organizers who were deleting every single comment, moderating every single comment that called this thing a fraud, and don't go here.
02:09:05.000We can't reliably and, like, just from a responsibility standpoint, ever create a future that enables more of that to happen.
02:09:14.000And that's how we're thinking about even features like this.
02:09:17.000I'm going to jump right off to a different train cart here.
02:09:19.000Has law enforcement ever asked you to keep certain people on the platform even after they violated your rules?
02:09:29.000So then this – to the next question pertaining to bias, you have the issue of Antifa versus the Proud Boys and Patriot Prayer.
02:09:36.000And Twitter permanently excised anyone associated with the Proud Boys.
02:09:40.000Antifa accounts who have broken the rules repeatedly, branded known cells that have been involved in violence, all still active.
02:09:47.000Is there a reason – Well, with the Proud Boys, what we were able to do was actually look at documentation and announcements that the leaders of that organization had made and their use of violence in the real world.
02:10:12.000Gavin McInnes, Anthony Cumia, who was part of Opie and Anthony, now it's his own show. It happened on his show because there was a guy that was on the show, and they made a joke about starting a gang based on him because he was a very effeminate guy, and they would call him the Proud Boys.
02:10:28.000And they went into detail about how this thing became...
02:10:34.000From a joke and saying that you could join the Proud Boys and everyone was, you know, it was like being silly to people joining it and then it becoming this thing to fight Antifa and then becoming infested with white nationalists and becoming this thing.
02:10:49.000Well, in many ways it was, but it's been documented how it started and what it was and misrepresented as to why it was started.
02:11:00.000I think there's some things that should be clarified about them, but Gavin has made a bunch of statements that cross the line.
02:11:09.000He was talking to me about Antifa, that when Antifa was blocking people like Ben Shapiro's speeches and things along those lines and stopping conservatives from speaking, you should just punch them in the face.
02:11:19.000We're going to have to start kicking people's asses.
02:11:21.000I was like, this is not just irresponsible, but foolish and short-sighted and just a dumb way to talk.
02:11:26.000So then you have the Antifa groups that are engaging in the same thing.
02:11:30.000The famous bike lock basher incident where a guy showed up, hit seven people over there with a bike lock.
02:11:38.000I'm going to leave that out for the time being.
02:11:40.000You have other groups like By Any Means Necessary.
02:11:44.000You have in Portland, for instance, there are specific branded factions.
02:11:50.000There's the tweet I mentioned earlier where they doxed ICE agents and they said, do whatever inspires you with this information.
02:11:57.000And I mean, you're tagged in a million times.
02:11:58.000I know you probably can't see it, but you can actually see that some of the tweets in the thread are removed.
02:12:02.000But the main tweet itself, from an anti-fascist account linking to a website, straight up saying, like, here's the private home details, phone numbers, addresses of these law enforcement officers, has not been removed since September.
02:12:12.000So what you end up seeing is, again, to point, I think one of the big problems in this country is the media, because it was reported that the FBI designated Proud Boys an extremist group.
02:12:22.000But it was a misinterpretation: a sheriff wrote a draft saying the FBI considers them to be extremists.
02:12:29.000The media then reported hearsay from the sheriff and the FBI came out and said, no, no, no, we never meant to do that.
02:12:38.000And again, I think Gavin is a different story, right?
02:12:40.000If you want to go after the individuals who are associating with that group versus the guy who goes on his show and says outrageous things and goes on Joe's show.
02:12:48.000What I mean by that is they have specific names, they sell merchandise, and they're the ones showing up throwing mortar shells into crowds.
02:12:56.000They're the ones showing up with crowbars and bats and whacking people.
02:12:59.000I was in Boston, and there was a rally where conservatives were planning on putting on a rally.
02:13:04.000It was literally just libertarians and conservatives.
02:13:06.000Antifa shows up with crowbars, bats, and balaclavas with weapons threatening them.
02:13:12.000So I have to wonder if these people are allowed to organize in your platform.
02:13:25.000Homeland Security in New Jersey has listed them under domestic terrorism.
02:13:28.000So I understand there's a conundrum in that the general concept of anti-fascism is a loose term that means you oppose fascism.
02:13:35.000But Antifa is now – they have a flag.
02:13:38.000They've had a flag since Nazi Germany and the Soviet era, and they've brought it back.
02:13:43.000There are specific groups that I'm not going to mention by name that have specific names, and they sell merchandise.
02:13:47.000They've appeared in various news outlets.
02:13:49.000They've expressed their desire to use violence to suppress speech.
02:13:52.000Is it a centralized organization the same way that – I hear you on Proud Boys, but like where they have, like, tenets that are written out and there's a leader and like – Yeah.
02:14:27.000I should point out that they decided to call for violence based on Antifa calling for violence.
02:14:33.000Based on Antifa actually actively committing violence against conservative people, they were there to see different people speak.
02:14:39.000Well, it partly started because in Berkeley, there was a Trump rally.
02:14:43.000So actually, after Milo got chased out of the Berkeley, there was $100,000 in damages.
02:14:47.000I mean, there's a video of some guy in all black cracking someone on the back who's on the ground looking like they're unconscious.
02:14:53.000So these conservatives see this and they decide to hold a rally saying we won't back down.
02:14:57.000They hold a rally in Berkeley and then Antifa shows up again.
02:15:00.000I understand you can't figure out who these people are for the most part.
02:15:48.000So I guess the question is, how come they don't get removed?
02:15:52.000Well, in the past when we've looked at Antifa, we ran into this decentralization issue, which is we weren't able to find the same type of information that we were able to find about Proud Boys, which was a centralized, leadership-based documentation of what they stand for.
02:16:07.000But absolutely, I mean, it's something that we'll continue to look into.
02:16:09.000And to the extent that they're using Twitter to organize any sort of offline violence, it's completely prohibited under our rules, and we would absolutely take action on that.
02:16:25.000Not only that, he's disassociated himself from it and said that it completely got out of hand and he doesn't want to have anything to do with it.
02:16:31.000Yeah, and I think this is a great, again, test case for how we think about getting people back on the platform.
02:16:37.000Yeah, he's an interesting case, because he's really a provocateur, and he fancies himself sort of a punk rocker, and he likes stirring shit.
02:16:46.000I mean, when he came on my show last time he was on, he was dressed up like Michael Douglas in Falling Down.
02:17:58.000There's another – when it comes to the weaponization of rules against – like Gavin isn't creating a compilation of things he's ever said out of context and then sending them around to get himself banned.
02:18:08.000Other people are doing that to him, activists who don't like him, and it's effective.
02:18:12.000In fact, I would actually like to point out there's one particular user who has repeatedly made fake videos attacking one of your other high-profile conservatives so much so that he's had to file police reports, harassment complaints, and it just doesn't stop.
02:18:26.000If someone repeatedly makes videos of you out of context, fake audio, accusing you of doing things you've never done, at what point is that bannable?
02:18:34.000Yeah, and if it's targeted harassment and we can establish it, it's just a really hard thing with us determining whether something is fake or not.
02:18:40.000Well, it's also when things are out of context.
02:18:42.000You still have video of the person saying that.
02:18:44.000I agree that it's out of context and it's disingenuous, but it's still the person saying it and you're making a compilation of some pre-existing audio or video.
02:18:55.000So I think in the instance of Gavin, like, one of the things he said was, like, a call to violence, but he was talking about, like, it was in the context of talking about a dog and being scolded.
02:19:04.000So he was like, hit him, just hit him, and then it's like, it turns out he's talking about a dog, like, doing something wrong.
02:19:09.000And they take that and they snip it, and then it goes viral, and then everyone starts flagging, saying, you gotta ban this guy.
02:19:13.000So again, I understand, like, you know.
02:19:15.000But I guess the issue is, if people keep doing that to destroy someone's life...
02:19:19.000So I think there's a bigger discussion, I think, both of you could probably shed some important light on, too, outside of Twitter.
02:19:25.000This weaponization of content from platforms is being used to get people banned from their banking accounts.
02:19:31.000We can talk about Patreon, for instance.
02:19:33.000And again, this may just be something you could chime in on.
02:19:37.000Patreon banned a man named Carl Benjamin, also known as Sargon of Akkad.
02:22:27.000I knew he had done things that were egregious violations of the rules because, plain and simple, I didn't bring him up to go through it and try to figure out if he – but it does sound like at least the first one was meant to be a critique of your – Potentially, but there are a bunch of others if you want to hear them.
02:23:24.000He's probably going to mind his P's and Q's.
02:23:27.000Oh, so the reason I brought him up again, but we'll move on, was that activists found a live stream from eight months ago.
02:23:35.000I totally forgot why I was bringing this up because we've moved so far away from where we were.
02:23:39.000But they pulled a clip from an hour and a half or whatever into a two-hour live stream on a small channel that only had 2,000 views, sent it to Patreon, and then Patreon said, yep, that's a violation, and banned him outright without warning.
02:23:52.000Which, again, I understand is different from what you guys do.
02:24:56.000So there's also all this infrastructure that we have to fix in order to pass those through in terms of what action you took or what action someone else took to be transparent about what's happening on the network.
02:26:29.000I'm not sure about that because one of the things that I do think is that just – I'm not in favor of a lot of this heavy-handed banning and a lot of the things that have been going on, particularly a case like the Meghan Murphy case.
02:26:42.000But what I think that we are doing is we're exploring the idea of civil discourse.
02:26:50.000We're trying to figure out what's acceptable and what's not acceptable.
02:26:54.000And you're communicating about this on a very large scale.
02:26:58.000And it's putting that out there and then people are discussing it.
02:27:01.000Whether they agree or disagree, whether they vehemently defend you or hate you, they're discussing this.
02:27:08.000And I think this is how these things change.
02:27:11.000And they change over long periods of time.
02:27:13.000Think about words that were commonplace just a few years ago that you literally can't say anymore.
02:27:20.000I mean, there's so many of them that were extremely commonplace or not even thought to be offensive 10 years ago that now you can get banned off of platforms for them.
02:27:31.000But that's a good argument against banning people and for ceasing to enforce hate speech rules.
02:28:06.000Unfortunately, I don't know why, but when I did the Google search, nothing came up.
02:28:10.000What I did notice was at the bottom of the page, it said due to UK law, certain things have been removed.
02:28:16.000So I don't know if it's exactly why I couldn't pull up a video proving or tweets or anything because I think using these words gets stripped from the social platforms.
02:30:36.000To protect the original tweeter and also folks who don't want to see that.
02:30:41.000They can still see everything, they just have to do one more tap.
02:30:44.000So that's one solution, ranking is another solution, but as technology gets better and we get better at applying it, we have a lot more optionality, whereas we don't have that as much today.
02:30:55.000I feel like, you know, I'm just going to reiterate an earlier point, though.
02:30:58.000You know, if you recognize sunlight is the best disinfectant, it's like you're chasing after a goal that can never be met.
02:31:04.000If you want to protect all speech and they start banning certain individuals, you want to increase the amount of healthy conversations, but you're banning some people.
02:31:11.000Well, how long until this group is now offended by that group?
02:31:14.000How long until you've banned everybody?
02:31:16.000I don't believe a permanent ban promotes health.
02:31:18.000I don't believe that, but we have to work with the technologies, tools, and conditions that we have today and evolve over time to where we can see examples like this woman at the Westboro Baptist Church who was using Twitter every single day to spread hate against the LGBTQA community.
02:31:43.000And over time, we had, I think it was three or four folks on Twitter who would engage her every single day about what she was doing, and she actually left the church.
02:32:13.000Have you considered allowing some of these people who were permanently banned back on with some restrictions?
02:32:17.000Maybe you can only tweet twice per day.
02:32:19.000Maybe you can't retweet or something to that effect.
02:32:21.000I think we're very early in our thinking here, so we're open-minded to how to do this.
02:32:25.000I think we agree philosophically that permanent bans are an extreme case scenario, and it shouldn't be one of our, you know, regularly used tools in our tool chest.
02:32:34.000So how we do that, I think, is something that we're actively talking about today.
02:32:41.000I think that would fix a lot of problems.
02:33:14.000One of the challenges is we have the benefit in English common law of hundreds of years of precedent and developing new rules and figuring out what works and doesn't.
02:34:01.000Yeah, I don't think that's a good move.
02:34:02.000What do you think about, perhaps, instead of...
02:34:06.000Is it possible to have levels of Twitter, like a completely uncensored, unmoderated level of Twitter, and then have a rated R, and then have a PG-13?
02:34:18.000I mean, I don't think that's a bad idea.
02:34:20.000We have those levels in place today, but you don't really see them.
02:34:25.000One, we have a not-safe-for-work switch, which you can turn on or off.
02:35:03.000So these are all the questions that are on the table.
02:35:05.000You asked about timeline, and this is a challenging one.
02:35:08.000I don't know about timeline because first...
02:35:13.000We've decided that our priority right now is going to be on proactively enforcing a lot of this content, specifically around anything that impacts physical safety, like doxing.
02:35:25.000Right, but there are so many examples of you guys not doing that.
02:35:29.000I know, but that's what we're fixing right now.
02:38:35.000And you mentioned earlier layoffs and retraction.
02:38:39.000Peer review, which we mentioned, but have you just considered opening an office, even a small one, for trust and safety in an area that's not predominantly blue so that at least you can have some pushback?
02:39:39.000And so high-profile cases, cases people ask us about, like to actually publish this so that we can go through, you know, tweet by tweet just like this.
02:39:48.000Because I think a lot of people just don't understand and they don't believe us when we're saying these things.
02:39:53.000So to put that out there so people can see.
02:39:55.000And again, they may disagree with the calls that we're making, but we at least want them to see why we're making these calls.
02:40:02.000I want to at least start that by the end of this year.
02:40:05.000So I think ultimately my main criticism stands, and I don't see a solution to it, in that Twitter is unelected and, as far as I'm concerned, unaccountable when it comes to public discourse.
02:40:15.000You have rules that are very clearly at odds as we discussed.
02:40:19.000I don't see a solution to that and I think in my opinion we can have this kind of like we've toned things down.
02:40:24.000We've had some interesting conversations but ultimately unless you're willing to allow people to just speak entirely freely – You are – we have an unelected group with a near monopoly on public discourse in many capacities and I understand it's not everything.
02:40:37.000Reddit is big too and it's – what I see is you are going to dictate policy whether you realize it or not and that's going to terrify people and it's going to make violence happen.
02:40:50.000I hate bringing up this example on the rule for misgendering because I'm actually – I understand it and I can agree with it to a certain extent.
02:40:57.000I have nothing but respect for the trans community, but I also recognize we've seen an escalation in street violence.
02:41:03.000We see a continually disenfranchised large faction of individuals in this country.
02:41:08.000We then see only one of those factions banned.
02:41:10.000We then see a massive multinational billion-dollar corporation with private and foreign investors.
02:41:15.000And it looks to me like if foreign governments are trying to manipulate us, I don't see a direct solution to that problem, that you do have political views.
02:41:59.000We also have to be free to experiment with solutions and experiment with evolving policy and putting something out there that might look right at the time and evolving.
02:42:12.000I'm not saying this is it, but we look to research, we look to our experience and data on the platform, and we make a call.
02:42:22.000And if we get it wrong, we're going to admit it and we're going to evolve it.
02:42:28.000But I guess, do you understand my point?
02:42:32.000That there are American citizens abiding by the law who have a right to speak and be involved in public discourse that you have decided aren't allowed to.
02:42:37.000Yeah, and I think we've discussed, like, we don't see that as a win.
02:42:44.000We see that as not promoting health, ultimately, over time.
02:42:47.000But it's ultimately, what is your priority?
02:42:50.000Do you have it prioritized in terms of what you guys would like to change?
02:42:55.000I think Jack has said it a couple times, but the first thing we're going to do is prioritize people's physical safety because that's got to be the foundation.
02:43:02.000You already have done that pretty much, right?
02:43:42.000I mean, my opinion would be as much as I don't like a lot of what people say about me, what they do, the rules you've enforced on Twitter have done nothing to stop harassment towards me or anyone else.
02:43:52.000I swear to God, my Twitter, I mean, my Reddit is probably 50 messages from various far-left and left-wing subreddits lying about me, calling me horrible names, quote-tweeting me, and these people are blocked.
02:44:04.000And I never used to block people because I thought it was silly because they can get around it anyway, but I decided to at one point because out of sight, out of mind.
02:44:11.000If they see my tweets less, they'll probably interact with me less, but they do this, and they lie about what I believe, they lie about what I stand for, and they're trying to destroy everything about me, and they do this to other people.
02:44:25.000As they say on the internet, welcome to the internet.
02:44:27.000So to me, I see Twitter trying to enforce all these rules to maximize good, and all you end up doing is stripping people from the platform, putting them in dark corners of the web where they get worse, and then you don't actually solve the harassment problem.
02:44:39.000Reddit is hardly a dark corner of the web, right?
02:44:59.000They ban similar ideology and they're creating a parallel society.
02:45:03.000You've got alternative social networks popping up that are taking the dregs of the mainstream and giving them a place to flourish, grow, make money.
02:45:10.000Now we're seeing people be banned from MasterCard, banned from PayPal, even banned from Chase Bank because they all hold the same similar ideology to you.
02:45:18.000In some capacities, I don't know exactly why Chase does it.
02:45:22.000I assume it's because you'll get some activists who will lie.
02:46:26.000Now we're seeing people who have, like, you mentioned Westboro Baptist Church, and she's been deradicalized by being on the platform.
02:46:33.000But now we have people who are being radicalized by being pushed into the dark corners, and they're building, and they're growing.
02:46:39.000And they're growing because there's this idea that you can control this and you can't.
02:46:44.000You know, I think you mentioned earlier that...
02:46:47.000There are studies showing, and also counter-studies, but people exposed to each other is better.
02:46:51.000I found something really interesting, and because I have most, whether or not people want to believe this, all of my friends are on the left, and some of them are even, like, socialists, and they're absolutely terrified to say, to talk, because they know they'll get attacked by the people who call for censorship and try to get them fired.
02:47:07.000And when I talked to them, I was talking to a friend of mine in LA, and she said, is there a reason to vote for Trump?
02:47:14.000And I explained a very simple thing about Trump supporters.
02:47:16.000This was back in 2016. I said, oh, well, you've got a lot of people who are concerned about the free trade agreements sending jobs overseas.
02:47:22.000So they don't know much about Trump, but they're going to vote for him because he supported that.
02:47:29.000And so you have this ever-expanding narrative that Trump supporters are Nazis and the MAGA head is the KKK hood.
02:47:35.000And a lot of this rhetoric emerges on Twitter.
02:47:37.000But when a lot of these people start getting excised, then you can't actually meet these people and see that they're actually people, and they may be mean.
02:47:45.000They may be awful people, but they're still people, and even if they have bad opinions, sometimes you actually, I think in most instances, you find they're regular people.
02:47:53.000Well, there's a part of the problem of calling for censorship and banning people in that it is sometimes effective, and people don't want to be thought of as being racist, or in support of racism, or in support of nationalism, or any of these horrible things. So you feel like if you support these bannings, you support positive discourse and a good society and all these different things.
02:48:15.000What you don't realize is what you're saying, is that this does create these dark corners of the web and these other social media platforms evolve and have far...
02:48:24.000I mean, when you're talking about bubbles and about these groupthink bubbles, the worst kind of groupthink bubbles is a bunch of hateful people that get together and decide they've been persecuted.
02:48:37.000Instead of, like we were talking about with Megan Phelps, having an opportunity to maybe reshape their views by having discourse with people.
02:48:45.000Who choose to or not choose to engage with them.
02:49:30.000And it develops hate for the opposing viewpoint.
02:49:33.000You start hating people that are progressive because these are the people that, like, you and I have talked about the Data & Society report that labeled us as alt-right adjacent or whatever.
02:49:41.000And now more fake news coming out about it, right?
02:49:43.000They connected us because you and I have talked to people that are on the right or far right, as if somehow or another we were secretly far right and there's this influence network of people working together.
02:49:55.000Well, it's a schizophrenic connection.
02:49:58.000It's like one of those weird things where people draw a circle.
02:50:01.000Oh, you talk to this guy and this guy talk to that guy.
02:50:04.000So here's an expanded part of this problem.
02:50:07.000So you're probably not familiar, but a group called Data & Society published a report, which is entirely fake, labeling 81 YouTube channels as alt-right adjacent or whatever they want to call it.
02:50:16.000The channels included Joe Rogan and me.
02:50:19.000A couple dozen news outlets wrote about it as if it was fact.
02:50:22.000You believe the Proud Boys were labeled by the FBI as extremists when they actually weren't.
02:50:26.000It was a sheriff's report from someone not affiliated with the FBI, but they are activists within media who have an agenda, and we saw this with Learn to Code.
02:50:34.000It was an NBC reporter who very clearly is left-wing identitarian.
02:51:50.000You know, we've seen for the past years with Trump, we've seen Breitbart has a list of 640 instances of Trump supporters being physically attacked or harassed in some way.
02:51:59.000There was a story the other day about an 81 year old man who was attacked.
02:53:00.000A regulator's job is to protect the individual and make sure that they level the playing field and they're not pushed by any particular special interests.
02:53:09.000Like, companies like ours who might, you know...
02:53:17.000I agree that we should have an agency that can help us protect the individual and level the playing field.
02:53:29.000So I think oftentimes companies see themselves as reacting to regulation.
02:53:35.000I think we need to take more of an education role.
02:54:14.000Do you think you can hold off regulation, though?
02:54:16.000Do you think that by these approaches and by being proactive and by taking a stand and perhaps offering up a road to redemption to these people and making clear distinctions between what you're allowing, what you're not allowing, you can hold off Regulation, or do you disagree with what he's saying about regulation?
02:54:32.000No, I don't believe that should be our goal, is to hold off regulation.
02:54:34.000I believe we should participate like any other citizen, whether it be a corporate citizen or an individual citizen, in helping to guide the right regulation.
02:54:45.000So, are you familiar, and I could be wrong on this because it's been like 15 years since I've done this.
02:54:50.000Are you familiar with the Clean Water Restoration Act at all?
02:55:04.000And what was typically told to us was that all of these different companies said we're doing the right thing.
02:55:09.000But as I mentioned, the snowflake doesn't blame itself.
02:55:12.000So over time, the river was so polluted it became sludge and lit on fire.
02:55:15.000And so someone said, if all of these companies think they're doing the right thing, And they've all just contributed to this nightmare.
02:55:22.000We need to tell them blanket regulation.
02:55:24.000And so what I see with these companies like banking institutions, public discourse platforms, video distribution, I actually – I'm really worried about what regulation will look like because I think the government is going to screw everything up.
02:55:37.000But I think there's going to be a recoil of – first, I think the Republicans – because I watched the testimony you had in Congress and I thought they had no idea what they were talking about nor did they care.
02:55:46.000There was like a couple people who made good points, but for the most part, they were like, I don't know, whatever.
02:55:50.000And they asked about Russia and stuff.
02:55:51.000So they have no idea what's going on, but there will come a time when, you know, for instance, one of the great things they brought up was that by default, when someone in D.C. signs up, they see way more Democrats than Republicans.
02:56:31.000So then it's going to escalate for me.
02:56:33.000It's not going to stop with these conversations.
02:56:34.000And so we've been having a lot of talks about this, particularly around algorithms.
02:56:38.000And one of the things that we're really focused on is not just fairness and outcomes, but also explainability of algorithms.
02:56:43.000And I know, Jack, you love this stuff, so I don't know if you want to talk a little bit about our work there.
02:56:47.000Yeah, I mean, so there's two fields of research within artificial intelligence that are rather new, but I think really impactful for our industry.
02:56:54.000One is fairness in ML. Fairness in what?
02:56:58.000Fairness in machine learning and deep learning.
02:57:01.000So looking at everything from what data set is fed to an algorithm, so like the training data set.
02:57:30.000The reality is a lot of this human judgment is moving to algorithms.
02:57:34.000And the second issue with it moving to algorithms is algorithms today can't necessarily explain the decision-making criteria that they use.
02:57:42.000So they can't explain in the way that you make a decision, you explain why you make that decision.
02:57:47.000Algorithms today are not being programmed in such a way that they can even explain that.
02:57:50.000You may wear an Apple Watch, for instance.
02:57:53.000It might tell you to stand every now and then.
02:57:56.000Right now, those algorithms can't explain why they're doing that.
02:58:00.000That's a bad example because it does it every 50 minutes, but as we offload more and more of these decisions, both internally and also individually to watches and to cars and whatnot, There is no ability right now for that algorithm to actually go through and list out the criteria used to make that decision.
02:58:21.000So this is another area that we'd like to get really good at if we want to continue to be transparent around our actions because a lot of these things are just black boxes and they're being built in that way because there's been no research into like, well, how do we get these algorithms to explain what their decision is?
02:58:58.000Someone is going to take a sledgehammer to Twitter, to Facebook, to YouTube and just be like – Not understanding the technology behind it, not willing to give you the benefit of the doubt, and just saying, I don't care why you're doing it, we are mad.
03:00:01.000I would like to know all the specifics of why they chose to do that.
03:00:04.000And I would hope that they would release some sort of a statement explaining why they chose to do that.
03:00:07.000Maybe there's something we don't know.
03:00:08.000There was a reporter, and I could be getting this wrong because I didn't follow it very much, with Big League Politics, who said that after reporting on PayPal negatively, they banned him.
03:01:15.000Let's talk about the incestuous relationship that a lot of these journalists have in defending the policies you guys push.
03:01:21.000A study was done, I talked about this last time, where they found 5% of the posts on Gab were hate speech, compared to Twitter's like 2.4%.
03:01:30.000So it's a marginal increase, yet Gab is called the White Supremacy Network.
03:03:37.000Implicitly, not explicitly, to lie, to side with the audience, as it were.
03:03:41.000I've seen the narratives they push, and I've had conversations with people that I'm going to keep relatively off the record.
03:03:48.000Journalists who are terrified because they said the narrative is real.
03:03:52.000One journalist in particular said that he had evidence of, essentially, he had reason to believe there was wrongdoing, but if he talks about it, he could lose his job.
03:04:02.000And there was a journalist who reported to me that Data & Society admitted their report was incorrect.
03:04:08.000And now you've got organizations lobbying for terminating Joe and I because of this stuff.
03:04:15.000Then you see all the actions I mentioned before and all the organizations saying we're doing the right thing.
03:04:20.000And I got to say, like, we're living in a – I mean, I feel like we're looking at the doorway to the nightmare dystopia of – I just want to clarify.
03:04:28.000I don't know if we're going around saying we're necessarily doing the right thing.
03:04:33.000We're saying why we're doing what we're doing.
03:04:54.000Well, but I think it's just obvious to point out – again, I said this before, we can have the calm conversation and I can understand you.
03:05:01.000But from where I'm sitting, you hold a vastly different ideology than I do and you have substantially more power in controlling my government.
03:05:33.000I'm not trying to insinuate he's showing up to your meetings and telling you what to do, but when someone dumps a billion dollars in your company...
03:05:38.000I think it's silly to imply that they don't at least have some influence, but regardless.
03:05:42.000And unlike the internet, within a company like ours, you don't necessarily see the protocol, you don't see the processes, and that is an area where we can do much better.
03:05:51.000I guess, you know, beat it over the head a million times, beat the dead horse.
03:05:56.000I think ultimately, yeah, I get what you're doing.
03:06:03.000And we're heading down to this nightmare scenario of a future where it terrifies me when I see people who claim to be supporting liberal ideology burning signs that say free speech, threatening violence against other people.
03:06:13.000You have these journalists who do the same thing.
03:06:15.000They accuse everybody of being a Nazi, everybody of being a fascist, Joe Rogan, for Christ's sake.
03:09:36.000There have been statements from foreign security advisors, international security experts saying we're facing down high probability of civil war.
03:09:44.000It's not going to look like what you think it looks like.
03:09:46.000It may not be as extreme as it was in the 1800s.
03:09:48.000But I think it was in the Atlantic where they surveyed something like 10 different international security experts who said based on what the platforms are doing, based on how the people are responding, one guy said it was like 90% chance, but the average was really high.
03:10:02.000Well, let's look outside of the idea of physical war and let's look at the war of information.
03:10:07.000We're talking about what's happening with foreign entities invading social media platforms and trying to influence our elections and our democracy.
03:10:27.000An attempt to lie to people to strip out their ideological opponents.
03:10:31.000And it's also the woman who wrote that said that it's been proven over and over again that deplatforming is an effective way to silence people.
03:10:40.000I don't think she was saying that we should be banned.
03:10:42.000I don't think she said that I should be banned.
03:10:43.000She said something to the effect of YouTube has to take action to prevent this from, you know...
03:10:47.000Well, you know, when people see someone saying things that they don't agree with, it's very important for people to understand where silencing people leads to.
03:11:43.000Do you think we could do this again in like six months and see where you guys are at in terms of like what I think is important is the road to redemption.
03:11:49.000I think that would open up a lot of doors for a lot of people to appreciate you.
03:11:53.000We're going to need more than six months.
03:12:19.000I mean, there was an early phrase in the internet by some of the earliest internet engineers and designers, which is, code is law.
03:12:28.000And a lot of what companies like ours, and startups, and random individuals contributing to the internet do will change parts of society, some for the positive and some for the negative.
03:12:46.000I think the most important thing that we need to do is to, as we just said, shine a bunch of light on it, make sure that people know where we stand and where we're trying to go and what bridges we might need to build from our current state to the future state.
03:13:02.000And be open about the fact that we're not going to, and this is to your other point, we're not going to get to a perfect answer here.
03:13:12.000It's just going to be steps and steps and steps and steps.
03:13:18.000What we need to build is an ability to experiment very, very quickly and take in all these feedback loops that we get, some feedback loops like this, some within the numbers itself, and integrate them much faster.
03:13:31.000What's wrong with the jury system on Twitter?
03:13:46.000But again, we're a company of so many resources, finite resources, finite people, and we need to prioritize.
03:13:53.000And we've decided, you may disagree with this decision, but we've decided that physical safety and the admission of off-platform ramifications...
03:15:33.000I mean, the United States doesn't have a platform to do that.
03:15:36.000When you're talking about the internet, the United States, if they want to come up with a United States Twitter, like a solution or an alternative that the government runs, and they use free speech to govern that, good luck.
03:17:21.000Well, the problem with people like me is that I put out a lot of content, and there's millions of views, and it's impossible to moderate all the comments.
03:17:42.000If you put a YouTube video on and you have a bunch of people that say a bunch of racist things in your YouTube comments, you could be held responsible and get fucked.
03:18:10.000Look, you know, I pointed out I think the Democrats are in a really dangerous position because outrage culture, although it exists in all factions, is predominantly on one faction.
03:18:21.000And so when Trump comes out and says something really offensive, you know, grab him by the, you know what I'm talking about, the Trump supporters laugh.
03:19:07.000And that means even though YouTube did nothing wrong with these comments, it was just a creepy group of people who didn't break the rules, who figured out how to manipulate the system, YouTube ate, like, YouTube had to take that one.
03:19:18.000The advertisers pulled out, YouTube lost money.
03:19:20.000So YouTube then panics, sledgehammers comments, just wipes them out.
03:19:50.000Even if you segment it, they're going to be threatened by it, and so the restrictions are going to come from whether or not you can make money doing it.
03:20:26.000I mean, I don't think it's to the point where everyone's lost all ads, but look, you think George Carlin would be allowed to do his bit today?
03:20:40.000It's just in the process of this transformation where people are understanding that because of the internet, if you look at late night conversations, how about...
03:20:52.000Colbert saying that President Trump has Putin's dick in his mouth.
03:20:56.000How about him saying that on television?
03:20:57.000Do you really think that would have been done 10 years ago?
03:21:18.000He lost it because people were complaining.
03:21:20.000Because people who are activists were complaining that he had said some homophobic things that he had subsequently apologized for before they ever said that.
03:21:43.000I'm saying there's a lot of people out there that are complaining.
03:21:46.000But the problem is not necessarily that there's so many people that are complaining.
03:21:49.000The problem is that people are reacting to those complaints.
03:21:52.000The vast majority of the population is recognizing that there is an evolution of free speech that's occurring in our culture and in all cultures around the world.
03:22:00.000But this is a slow process when you're in the middle of it.