SNEAKO - August 26, 2022
SNEAKO Reacts To Mark Zuckerberg on Joe Rogan
Episode Stats
Words per Minute
189.42337
Summary
In this episode, the boys discuss the Mark Zuckerberg interview on Joe Rogan's show, the latest in the war between Elon Musk and Twitter, and the controversial question of who should be in charge of deciding what's controversial.
Transcript
00:00:04.000
Mark Zuckerberg answers to Facebook's moderation of controversial content.
00:00:12.800
Every time I look into his eyes, don't you see a soulless person?
00:00:19.460
That's the guy who runs everything right now, bro.
00:00:22.320
And also, dubbing Joe Rogan for getting this dude.
00:00:42.780
I don't know how they found out that 19 of 20 were fake.
00:01:24.040
Mark Zuckerberg answers to Facebook's moderation of controversial content.
00:01:30.840
Basically, they're asking, like, why did Tate get banned?
00:01:45.480
Like, say, like, these Christian Facebook pages.
00:01:48.220
I don't know how they found out that 19 of 20 were fake.
00:01:51.940
Like, but if someone just says, I am Bob Smith, and they post as Bob Smith, and they have a photograph, and they...
00:01:59.000
But really what they're doing is trying to talk shit about Joe Biden and get people to vote Republican in the midterms.
00:02:09.740
Like, this is the big argument with Elon and Twitter.
00:02:12.880
Because Elon asked Twitter, like, what percentage...
00:02:15.460
Jeff WRL, should Zuckerberg and Facebook be in charge of deciding what's controversial and delete it based on such?
00:02:23.220
...of your website is filled with bots, and they say 5%, and he says, I don't believe you.
00:02:35.180
There should be some safety and privacy things, but there should be no controversial social objective.
00:02:43.560
I believe they said that they just took 100 random Twitter pages and looked at the interaction,
00:02:49.900
and there's some sort of algorithm that applied to it.
00:02:54.760
So, I mean, I think estimating the overall prep...
00:02:58.940
This guy is so media-trained because he literally is the media.
00:03:03.440
Joe Rogan is asking him, how do you decide what's controversial?
00:03:08.020
But I think that the question of looking at a page and is this page authentic, I think
00:03:15.580
One of the things that we try to do is for large pages, we try to make sure that we know
00:03:24.460
You don't necessarily need to out yourself and say who you are running it.
00:03:27.400
But we want to make sure that we sort of have like an identity for that person on file
00:03:32.780
so that way we know, like at least behind the scenes, that that person is real.
00:03:39.940
You see how he didn't answer the question so far?
00:03:42.480
You see how he just said a bunch of stuff to just kind of like, yeah, well, here's
00:03:46.420
the answer to a question that has nothing to do with what you just said.
00:04:00.260
I think having a sense of what country they're originating from, I mean, some of that you
00:04:04.980
can do just by looking at where their server traffic comes from.
00:04:07.800
Like, is the IP address coming from Romania or, you know, is...
00:04:14.960
Why was Romania the first IP address he thought of?
00:04:18.920
Like, he's saying if you have a big page, we need to know who it is on file.
00:04:25.620
In case it's somebody that should be deleted based on his criteria.
00:04:34.660
Because if it's, like, an ad in some other country's election, then, you know, you probably
00:04:41.200
want to make sure that that ad is, you know, especially in countries that have laws around
00:04:46.480
that are, like, are coming from someone who's a valid citizen or, like, at least in that place.
00:04:50.880
So there's a bunch of, I think, I don't know, one theme in my worldview around this stuff,
00:04:57.160
when it gets to some of the stuff that we talked about before, is, like, I don't think
00:04:59.940
that this stuff is black and white or that you're ever going to have, like, a perfect...
00:05:04.900
Basically, he could decide whatever the fuck he wants to delete.
00:05:10.720
Whatever he thinks is controversial, he could delete.
00:05:21.520
I think it's all trade-offs all the way down, right?
00:05:26.800
You could build a system, and you can either be overly aggressive and capture a higher percent
00:05:32.160
of the bad guys, but then also, by accident, take out some number of good guys.
00:05:36.520
Chat, we'll watch Tate on Tucker right after this.
00:05:38.020
Or you could be a little more lenient and say, okay, no, the cost of taking out any number
00:05:43.940
of good guys is too high, so we're going to tolerate having...
00:05:45.860
See, but he doesn't even explain what good and bad are.
00:05:50.280
Do you see the level of power that this lizard has?
00:05:54.440
You know, just a little bit more, like, more bad guys on the system.
00:05:59.440
These are values questions, right, around what do you value more?
00:06:12.760
Part of what I've struggled with around this is...
00:06:16.640
I didn't get into this to basically judge those things.
00:06:21.560
I got into this to design technology that helps...
00:06:26.400
And I don't like how Joe Rogan's just not pressing him about it.
00:06:31.800
And, like, I mean, you could probably tell when we spent the first hour talking about
00:06:36.000
the metaverse and the future of basically building this whole technology roadmap to basically
00:06:46.900
Just like a politician: bob and weave, answer something to make you sound likable, turn up
00:06:54.460
The people will like when I sound good because the bad answer...
00:07:01.060
Everybody, I will put a chip in your brain for the metaverse.
00:07:06.540
I obviously have to be involved in that because this is, at some level, you know, I run the
00:07:15.120
But I also don't think that, as a matter of governance, you want all of that decision-making
00:07:25.360
So I think one of the things that, you know, our country and our government gets right is
00:07:30.560
So, you know, one of the things that I tried to create is...
00:07:34.960
It's an independent board that basically we appointed people whose kind of paramount value
00:07:40.360
is free expression, but they also balance that with things like when is there going
00:07:44.420
to be real harm to others in terms of safety or privacy or other human rights issues.
00:07:50.480
And basically, that board, people in our community can appeal cases to when they think
00:07:56.840
that we got it wrong, and that board actually gets to make the final binding decision, not
00:08:05.820
I actually think that that is a more legitimate form of governance than having just a team
00:08:11.460
internally that makes these decisions or, you know, maybe some of them go up to me, although
00:08:16.120
I don't spend a ton of my time on this on a day-to-day basis.
00:08:19.400
But, like, I think it's generally good to have some kind of separation of powers where you're
00:08:25.440
So, that way, you have different stakeholders and different people who can make these decisions,
00:08:29.840
and it's not just, like, one private company that's making decisions.
00:08:38.960
And it's crazy that he said that some of the people, some of the cases actually do make
00:08:54.520
How do you guys handle things when they're a big news item that's controversial?
00:09:00.640
Like, there was a lot of attention on Twitter during the election because of the Hunter
00:09:13.720
I mean, basically, the background here is the FBI, I think, basically came to us, some
00:09:19.580
folks on our team, and was like, hey, just so you know, you should be on high alert.
00:09:25.240
We thought that there was a lot of Russian propaganda in the 2016 election.
00:09:29.800
We have it on notice that basically there's about to be some kind of dump that's similar
00:09:42.700
What Twitter did is they said, you can't share this at all.
00:09:47.500
What we do is we have, if something's reported to us as potentially misinformation, important
00:09:54.760
misinformation, we also have this third-party fact-checking program because we don't want
00:10:00.120
And for the, I think it was five or seven days.
00:10:04.660
So, an AI decides when a real person could say, well, yeah, he said, we don't want to
00:10:12.020
decide what's true and false, so we let an AI decide what's true and false.
00:10:21.380
And we also have this third-party fact-checking program because we don't want to be deciding
00:10:29.600
So, an AI decides whether what a human is saying is true or false.
00:10:35.300
And for the, I think it was five or seven days when it was basically being determined
00:10:43.760
whether it was false, the distribution on Facebook was decreased, but people were still
00:10:54.360
So, when you say the distribution has decreased, how does that work?
00:10:58.280
Basically, the ranking in newsfeed was a little bit less.
00:11:01.040
So, fewer people saw it than would have otherwise.
00:11:06.740
I don't know off the top of my head, but it's...
00:11:11.100
But, I mean, but basically, a lot of people were still able to share it.
00:11:17.580
We got a lot of complaints that that was the case.
00:11:19.920
You know, obviously, this is a hyper-political issue.
00:11:22.480
So, depending on what side of the political spectrum, you either think we didn't censor it
00:11:27.180
But we weren't sort of as black and white about it as Twitter.
00:11:30.440
We just kind of thought, hey, look, if the FBI, which I still view as a legitimate institution
00:11:37.840
Chat, you're saying that third-party doesn't mean AI.
00:11:50.560
They come to us and tell us that we need to be on guard about something.
00:11:55.000
Did they specifically say you need to be on guard about that story?
00:11:59.740
I don't remember if it was that specifically, but it basically fit the pattern.
00:12:04.080
When something like that turns out to be real, is there regret for not having it evenly distributed
00:12:14.240
When I get a copyright claim for nudity, and when I have a community guideline strike right
00:12:21.660
now for corona misinformation, do they feel bad?
00:12:57.480
So basically it had this period where it was getting less distribution.
00:13:00.740
Um, so yeah, I mean, I, I, but I think like, I think it probably, it sucks though.
00:13:07.440
I think in the same way that probably having to go through like a criminal trial, but being
00:13:18.680
Because all this misinformation that they claim was misinformation that ends up being
00:13:25.020
And now you have all these boomers who still think that everything they heard in the beginning
00:13:32.540
And now when you're saying the truth, they still won't believe you.
00:13:34.900
And it's so divided because you suppressed the truth in the beginning.
00:13:39.500
They act like time is not one of the most valuable assets we have.
00:13:43.180
See, when a criminal goes to jail for something they didn't do for 20 years and they're like,
00:13:49.180
And they give them like $5 million and let them out.
00:13:51.340
They still lost 20 years they could have spent with their kids.
00:13:55.720
I know it's different, but it goes the same with information that they
00:14:00.560
That time that it could have been up and people could have heard it and people could
00:14:02.980
have heard the truth while you deleted it while it was going through your fucking third
00:14:13.520
Just because it gets up eventually doesn't mean that it's fine just because it should have
00:14:20.120
And the difference is that the misinformation is reversible.
00:14:23.200
Like if you wanted to push the things that are right now to the public, you have the power
00:14:28.660
They actively don't because the FBI doesn't want to or whatever their agenda is, but it
00:14:36.300
That you had to go through a criminal trial, but at the end you're free.
00:14:40.200
So it's, I don't know if the answer would have been don't do anything or don't have
00:14:48.260
You know, we still let people share it, but obviously you don't want situations like
00:14:52.860
But certainly much more reasonable than Twitter's stance.
00:14:55.300
And it's probably also the case of armchair quarterbacking, right?
00:14:59.540
Or at least Monday morning quarterbacking, I should say.
00:15:02.620
Because in the moment you had reason to believe based on the FBI talking to you that it wasn't
00:15:09.260
real and that there was going to be some propaganda.
00:15:14.720
And then if you just let it get out there and what if it changes the election and it
00:15:18.820
turns out to be bullshit, that's a real problem.
00:15:22.080
And I would imagine that those kind of decisions are the most difficult.
00:15:27.540
The decisions of like what is allowed and what is not allowed.
00:15:36.360
Well, first of all, you're dealing with the New York Post, which is one of the oldest
00:15:42.260
So I would I would say I would want to talk to someone from the New York Post and I would
00:15:51.140
Like where where are you getting the information from?
00:15:53.820
How do you know whether or not this is correct?
00:15:56.100
And then you have to make a decision because they might have got duped.
00:15:59.780
It's it's very it's hard because everybody wants to look at it after the fact.
00:16:05.140
Now that we know that the laptop was real and it was a legitimate story and there is
00:16:14.320
We think, oh, that should not have been restricted.
00:16:19.840
That should not have been banned from sharing on Twitter.
00:16:24.760
The FBI decides he's like, I think the FBI is a good institution.
00:16:27.480
They're the ones who tell him to hide anything that goes against Joe Biden.
00:16:32.140
When the election comes around, you're about to see it when the midterms come up.
00:16:34.820
The level of censorship is about to triple because they have the most power.
00:16:38.900
They said about Hunter Biden that it's Russian misinformation.
00:16:42.840
No, that guy's a crackhead and he's been running wild doing whatever he wanted for decades.
00:16:47.660
And they said that his laptop like they didn't even want to release the information on his
00:16:57.240
You knew what he was doing, but you hide it because the election was coming around and
00:17:03.160
You think any journalist in Minecraft and Minecraft, I don't actually mean that they
00:17:07.100
You think any journalistic company has integrity to like prove or just state facts like anywhere
00:17:12.180
that you would look that it's just straight facts.
00:17:14.660
It's all like alternative sites that they call conspiracy sites.
00:17:31.380
Like if something comes along and the Republicans cook up some scheme to make it look like Joe
00:17:37.040
Biden's a terrible person and they only do it so that they can win the election, but it's
00:17:44.960
You're supposed to not allow that to be distributed.
00:17:47.520
So if they think that's the case, it makes sense to me that they would try to stop it.
00:17:53.420
But I just don't think that they looked at it hard enough.
00:17:56.860
When the New York Post is talking about it, they're pretty smart about what they release
00:18:03.480
If they're going over some data from a laptop and you could talk to a person.
00:18:10.900
But again, this is just one story, one individual story.
00:18:14.260
How many of these pop up every day, especially in regards to polarizing issues like climate
00:18:20.880
change or COVID or, you know, foreign policy or Ukraine, anytime there's like a really controversial
00:18:28.000
issue where some people think that it's imperative that you take a very specific stance and you
00:18:34.880
can't have the other stance like that, those moments on social media, those trouble a lot
00:18:41.920
of people because they don't know why certain things get censored or certain things get promoted.
00:18:54.900
And that was one of the things that I really wanted to talk to.
00:19:02.440
Being in your spot must be insanely difficult to have.
00:19:09.680
Whatever decision you make, you're going to have a giant chunk of people that are upset at you.
00:19:15.020
And there might be a right way to handle it, but I don't know.
00:19:22.800
I personally don't think so, but I hear that sentiment a lot about Alex Jones and Joe Rogan.
00:19:30.580
Well, I think the right way is to establish principles for governance that try to be balanced
00:19:38.860
and not have the decision-making too centralized.
00:19:41.180
Because I think that it's hard for people to accept that, like, some team at Meta or that I personally am making all these decisions.
00:19:51.300
And I think people should be skeptical about so much concentration around that.
00:19:56.140
So that's why a lot of the innovation that I've tried to push for in governance is around things like establishing this oversight board.
00:20:03.820
So that way you have people who are luminaries around expression from all over the world, but also in the U.S.
00:20:11.020
You know, I mean, folks like Michael McConnell, who's, I mean, he's a Stanford professor, who's like, just, he was, I forget which, which Republican president appointed him.
00:20:22.120
But I mean, he was, I think, going to be considered for the Supreme Court at some point.
00:20:25.560
I mean, he's a very, very prominent and kind of celebrated free expression advocate.
00:20:35.160
And I think, like, setting up forms of governance around.
00:20:39.800
Just the fact that Tate was banned on all these platforms is a perfect example of that, bro.
00:20:43.640
But Cardi B runs wild, who openly said that she drugs and robs men.
00:20:52.660
They bring her on debate panels and walk her in the White House.
00:20:55.620
And she has publicly said, I drugged and robbed men.
00:21:00.320
Tate has been canceled for false accusations, things that didn't happen.
00:21:09.260
That are independent of us, that basically get the final say on a bunch of these decisions.
00:21:18.740
I mean, in the Hunter Biden case that you talked about before, I don't want our company to decide what's misinformation and what's not.
00:21:25.660
So don't work with third parties and basically.
00:21:30.600
So you just blame someone else when you own it.
00:21:37.760
Now, I mean, then you have the question of, are those organizations biased or not?
00:21:43.040
You see how he's just not taking accountability, like a girl in the It's Complicated videos, just no response.
00:21:50.580
But at least we're not the ones who are basically sitting here.
00:21:54.260
Ministry of truth for the world that's deciding whether everything is true or not.
00:22:06.040
You know, I think that there's it is interesting that the U.S. is actually more polarized than than most other countries.
00:22:16.140
So I think sitting in the U.S., it's easy to extrapolate and say, hey, it probably feels this way around the whole world.
00:22:24.220
And from the social science research that I've seen, that's not actually the case.
00:22:28.140
There's a bunch of countries where social media is just as prominent, but polarization is either flat or has declined slightly.
00:22:35.660
So there's something kind of different happening in the U.S.
00:22:40.520
But but for better or worse, I mean, it does seem like like like the the next several years do seem like they're set up to be quite polarized.
00:22:49.780
There are going to be a bunch of different decisions like this that that come up because of the scale of what we do.
00:22:56.380
Almost every major world event has some angle that's like the Facebook or Instagram or WhatsApp angle about how the services are used in it.
00:23:04.380
So, yeah, I think just establishing as much as possible independent governance.
00:23:08.720
You just dodged it the whole time, the whole interview.
00:23:15.260
Going back to what you said about how they brought Cardi B on for politicians, I think that's the lamest shit in the world, bro.
00:23:21.800
It's like a Fresh and Fit type beat where you just bring stupid people in to just roast them and be smarter than them.
00:23:27.280
Cardi B is just a stunt for them to be like, look how dumb liberals are because you fuck with Cardi B.
00:23:32.700
And we're just smarter than her in life because she's a stripper from the Bronx.