SNEAKO - August 26, 2022


SNEAKO Reacts To Mark Zuckerberg on Joe Rogan


Episode Stats

Length: 23 minutes

Words per Minute: 189.4

Word Count: 4,518

Sentence Count: 365

Misogynist Sentences: 5

Hate Speech Sentences: 3


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ (a usage sketch follows the summary below).

In this episode, the boys discuss the Mark Zuckerberg interview on Joe Rogan's show, the latest in the war between Elon Musk and Twitter, and the controversial question of who should be in charge of deciding what's controversial.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Okay, chat, did you see this?
00:00:02.020 Mark Zuckerberg was on Joe Rogan.
00:00:04.000 Mark Zuckerberg answers to Facebook's moderation of controversial content.
00:00:07.660 When we take down something that...
00:00:10.700 Lizard, can you spam some lizards?
00:00:12.800 Every time I look into his eyes, don't you see a soulless person?
00:00:16.600 That looks like a real human.
00:00:19.460 That's the guy who runs everything right now, bro.
00:00:22.320 And also, dubbing Joe Rogan for getting this dude.
00:00:25.600 Has he done a podcast ever?
00:00:26.720 I don't think so.
00:00:27.700 This is crazy.
00:00:28.380 This just came out, chat.
00:00:32.060 That we're not supposed to.
00:00:34.620 I mean, that is like...
00:00:36.380 I mean, that's the worst.
00:00:38.040 How do you discern?
00:00:40.340 Say, like, these Christian Facebook pages.
00:00:42.780 I don't know how they found out that 19 of 20 were fake.
00:00:46.600 Oh, chat, okay, we'll go right back to this.
00:00:47.960 Apparently, Tate's on Fox News right now.
00:00:49.800 Tate's on Fox News live right now?
00:00:52.560 Okay, if it's live right now, we'll watch it.
00:00:53.860 We'll go right back to the lizard.
00:00:55.080 Hold up, hold up.
00:00:55.580 Don't lose your attention span.
00:00:56.440 Tate, Tucker.
00:00:58.280 Is it on YouTube?
00:01:01.120 Where the fuck do you watch it?
00:01:02.880 Fox.
00:01:03.820 Fox News.com?
00:01:08.620 Go back to Google.
00:01:09.460 I said, watch Fox.
00:01:12.880 Ew, bro.
00:01:13.460 You gotta have cable or something.
00:01:19.160 Yeah, no, okay.
00:01:19.760 We can't watch that.
00:01:20.680 Let's go back to the lizard.
00:01:21.420 My bad.
00:01:22.720 We'll watch it tomorrow.
00:01:24.040 Mark Zuckerberg answers to Facebook's moderation of controversial content.
00:01:27.220 That's the worst.
00:01:28.120 I mean, that's...
00:01:28.440 Which is me.
00:01:29.660 Now they get to decide.
00:01:30.840 Basically, they're asking, like, why did Tate get banned?
00:01:33.360 Oh, my God.
00:01:36.200 The mouse just fucked up.
00:01:37.760 Oh, Zuckerberg got my computer.
00:01:39.800 Where's the mouse?
00:01:40.920 I saw it.
00:01:41.320 It was behind the window.
00:01:43.740 You discern.
00:01:44.920 Like, how...
00:01:45.480 Like, say, like, these Christian Facebook pages.
00:01:48.220 I don't know how they found out that 19 of 20 were fake.
00:01:51.940 Like, but if someone just says, I am Bob Smith, and they post as Bob Smith, and they have a photograph, and they...
00:01:59.000 But really what they're doing is trying to talk shit about Joe Biden and get people to vote Republican in the midterms.
00:02:05.240 Like, how...
00:02:06.620 What...
00:02:07.120 How do you know whether someone's real or not?
00:02:09.740 Like, this is the big argument with Elon and Twitter.
00:02:12.880 Because Elon asked Twitter, like, what percentage...
00:02:15.460 Jeff WRL, should Zuckerberg and Facebook be in charge of deciding what's controversial and delete it based on such?
00:02:23.220 ...of your website is filled with bots, and they say 5%, and he says, I don't believe you.
00:02:28.200 Well, who should be then?
00:02:29.580 Nobody.
00:02:29.980 And let's find out how you...
00:02:31.200 You should just...
00:02:31.840 Anarchy on the internet?
00:02:33.140 Yes.
00:02:34.240 Porn, everything?
00:02:34.960 No.
00:02:35.180 There should be some safety and privacy things, but there should be no controversial social objective.
00:02:41.180 Inclusion.
00:02:41.760 Yeah.
00:02:41.940 And, you know, they're...
00:02:43.560 I believe they said that they just took 100 random Twitter pages and looked at the interaction,
00:02:49.900 and there's some sort of algorithm that applied to it.
00:02:51.920 But how do you discern?
00:02:54.260 Yeah.
00:02:54.760 So, I mean, I think estimating the overall prep...
00:02:57.140 We're ready for him to dodge the question.
00:02:58.940 This guy is so media-trained because he literally is the media.
00:03:03.440 Joe Rogan is asking him, how do you decide what's controversial?
00:03:06.100 Watch him dodge it.
00:03:06.840 ...is one thing.
00:03:08.020 But I think that the question of looking at a page and is this page authentic, I think
00:03:12.700 that there's a bunch of signals around that.
00:03:15.580 One of the things that we try to do is for large pages, we try to make sure that we know
00:03:20.140 who the admin of that page is.
00:03:21.960 We don't necessarily...
00:03:22.580 You should be able to run an anonymous page.
00:03:24.460 You don't necessarily need to out yourself and say who you are running it.
00:03:27.400 But we want to make sure that we sort of have like an identity for that person on file
00:03:32.780 so that way we know, like at least behind the scenes, that that person is real.
00:03:38.820 For certain political...
00:03:39.940 You see how he didn't answer the question so far?
00:03:42.480 You see how he just said a bunch of stuff to just kind of like, yeah, well, here's
00:03:46.420 the answer to a question that has nothing to do with what you just said.
00:03:49.640 And this is what we do in our policy.
00:03:51.420 And I'm adding empathy to my voice.
00:03:53.160 Empathy points.
00:03:53.800 Three, dig, dig, dig.
00:03:54.500 This will make me sound good and likable.
00:03:56.740 Turn empathy up.
00:03:57.760 Turn empathy.
00:03:58.780 Three, B, B, B.
00:04:00.260 I think having a sense of what country they're originating from, I mean, some of that you
00:04:04.980 can do just by looking at where their server traffic comes from.
00:04:07.800 Like, is the IP address coming from Romania or, you know, is...
00:04:11.760 Why from Romania?
00:04:13.600 You hear that?
00:04:14.500 Yeah.
00:04:14.960 Why was Romania the first IP address he thought of?
00:04:17.660 I don't get why it matters.
00:04:18.920 Like, he's saying if you have a big page, we need to know who it is on file.
00:04:22.960 Why does that matter?
00:04:25.620 In case it's somebody that should be deleted based on his criteria.
00:04:30.260 Oops, oops, oops, oops, oops.
00:04:34.660 Because if it's, like, an ad in some other country's election, then, you know, you probably
00:04:41.200 want to make sure that that ad is, you know, especially in countries that have laws around
00:04:46.480 that are, like, are coming from someone who's a valid citizen or, like, at least in that place.
00:04:50.880 So there's a bunch of, I think, I don't know, one theme in my worldview around this stuff,
00:04:57.160 when it gets to some of the stuff that we talked about before, is, like, I don't think
00:04:59.940 that this stuff is black and white or that you're ever going to have, like, a perfect...
00:05:03.380 So it's a gray area.
00:05:04.900 Basically, he could decide whatever the fuck he wants to delete.
00:05:08.040 It's not a black and white.
00:05:09.520 There's no rules.
00:05:10.720 Whatever he thinks is controversial, he could delete.
00:05:13.300 That's exactly what he said.
00:05:15.940 That's kind of fucked up.
00:05:16.640 Exactly what he said.
00:05:17.660 It's a gray area.
00:05:19.740 Perfect AI system.
00:05:21.520 I think it's all trade-offs all the way down, right?
00:05:24.160 And it's...
00:05:24.920 And you could either...
00:05:26.800 You could build a system, and you can either be overly aggressive and capture a higher percent
00:05:32.160 of the bad guys, but then also, by accident, take out some number of good guys.
00:05:36.520 Chat, we'll watch Tate on Tucker right after this.
00:05:38.020 Or you could be a little more lenient and say, okay, no, the cost of taking out any number
00:05:43.940 of good guys is too high, so we're going to tolerate having...
00:05:45.860 See, but he doesn't even explain what good and bad are.
00:05:48.280 He decides what's good and bad.
00:05:50.280 Do you see the level of power that this lizard has?
00:05:52.340 I've got to watch this whole podcast, man.
00:05:53.620 Shout out Joe Rogan.
00:05:54.440 You know, just a little bit more, like, more bad guys on the system.
00:05:59.440 These are values questions, right, around what do you value more?
00:06:04.120 And those are super tricky questions.
00:06:06.760 And it's a tricky question.
00:06:08.700 Fucking answer it.
00:06:10.080 How do you decide what you delete?
00:06:11.880 Well, I don't...
00:06:12.760 Part of what I've struggled with around this is...
00:06:16.640 I didn't get into this to basically judge those things.
00:06:21.560 I got into this to design technology that helps...
00:06:23.640 So answer the fucking...
00:06:24.680 He's not going to answer the question?
00:06:26.400 And I don't like how Joe Rogan's just not pressing him about it.
00:06:28.660 He's waiting until he fully answers.
00:06:30.160 People connect, right?
00:06:31.280 It's like...
00:06:31.460 He'll press him.
00:06:31.800 And, like, I mean, you could probably tell when we spent the first hour talking about
00:06:36.000 the metaverse and the future of basically building this whole technology roadmap to basically
00:06:40.460 give people this realistic sense of presence.
00:06:42.180 It's like, that's what I'm here to do, right?
00:06:46.300 So this whole...
00:06:46.900 Just like a politician, bob and weave, answer something to make you sound likable, turn up
00:06:52.720 the empathy points, da-da-da-da-da-da.
00:06:54.460 The people will like when I sound good because the bad answer...
00:06:59.400 I think that's like...
00:07:00.060 I'm trying to correct everyone.
00:07:01.060 Everybody, I will put a chip in your brain for the metaverse.
00:07:05.180 What is okay and what is not?
00:07:06.540 I obviously have to be involved in that because this is, at some level, you know, I run the
00:07:10.960 company and I can't just abdicate that.
00:07:15.120 But I also don't think that, as a matter of governance, you want all of that decision-making
00:07:22.760 vested in one individual.
00:07:25.360 So I think one of the things that, you know, our country and our government gets right is
00:07:28.860 the separation of powers.
00:07:30.560 So, you know, one of the things that I tried to create is...
00:07:33.720 We created this oversight board.
00:07:34.960 It's an independent board that basically we appointed people whose kind of paramount value
00:07:40.360 is free expression, but they also balance that with things like when is there going
00:07:44.420 to be real harm to others in terms of safety or privacy or other human rights issues.
00:07:50.480 And basically, that board, people in our community can appeal cases to when they think
00:07:56.840 that we got it wrong, and that board actually gets to make the final binding decision, not
00:08:00.960 us.
00:08:01.300 So, in a way...
00:08:03.300 Chat, do you hear him dodge the question?
00:08:05.820 I actually think that that is a more legitimate form of governance than having just a team
00:08:11.460 internally that makes these decisions or, you know, maybe some of them go up to me, although
00:08:16.120 I don't spend a ton of my time on this on a day-to-day basis.
00:08:19.400 But, like, I think it's generally good to have some kind of separation of powers where you're
00:08:24.400 architecting the governance.
00:08:25.440 So, that way, you have different stakeholders and different people who can make these decisions,
00:08:29.840 and it's not just, like, one private company that's making decisions.
00:08:34.100 Is it not one company?
00:08:35.920 Aren't you the CEO of Meta?
00:08:38.960 And it's crazy that he said that some of the people, some of the cases actually do make
00:08:41.900 it up to him.
00:08:43.640 I mean, he is making...
00:08:44.340 He's just saying nothing right now.
00:08:45.900 About what just happened.
00:08:46.900 Hey, Jadion on the chat!
00:08:48.320 Free Jadion!
00:08:49.240 Free Jadion on the chat!
00:08:50.180 What happened?
00:08:50.640 Yo, Jadion, did you get deleted?
00:08:52.000 Do you want to get him on stream?
00:08:53.200 It's on our platform.
00:08:54.520 How do you guys handle things when there's a big news item that's controversial?
00:09:00.640 Like, there was a lot of attention on Twitter during the election because of the Hunter
00:09:06.200 Biden laptop story, the New York Post.
00:09:08.040 Yeah, we have that, too.
00:09:09.160 Yeah, so you guys censored that as well?
00:09:11.440 So, we took a different path than Twitter.
00:09:13.720 I mean, basically, the background here is the FBI, I think, basically came to us, some
00:09:19.580 folks on our team, and was like, hey, just so you know, you should be on high alert.
00:09:25.240 We thought that there was a lot of Russian propaganda in the 2016 election.
00:09:29.800 We have it on notice that basically there's about to be some kind of dump that's similar
00:09:38.280 to that, so just be vigilant.
00:09:40.340 So, our protocol is different from Twitter's.
00:09:42.700 What Twitter did is they said, you can't share this at all.
00:09:46.300 We didn't do that.
00:09:47.500 What we do is we have, if something's reported to us as potentially misinformation, important
00:09:54.760 misinformation, we also have this third-party fact-checking program because we don't want
00:09:58.760 to be deciding what's true and false.
00:10:00.120 And for the, I think it was five or seven days.
00:10:04.660 So, an AI decides when a real person could say, well, yeah, he said, we don't want to
00:10:12.020 decide what's true and false, so we let an AI decide what's true and false.
00:10:15.860 Huh?
00:10:16.080 I thought you said third-party.
00:10:18.260 Did he?
00:10:20.980 False.
00:10:21.380 And we also have this third-party fact-checking program because we don't want to be deciding
00:10:25.620 third-party fact-checking programs.
00:10:28.020 Yeah, third-party is a program.
00:10:29.600 So, an AI decides what a human is saying, truth and false.
00:10:34.340 What's true and false.
00:10:35.300 And for the, I think it was five or seven days when it was basically being determined
00:10:43.760 whether it was false, the distribution on Facebook was decreased, but people were still
00:10:50.960 allowed to share it.
00:10:51.740 So, you could still share it.
00:10:52.960 You could still consume it.
00:10:54.360 So, when you say the distribution has decreased, how does that work?
00:10:58.280 Basically, the ranking in newsfeed was a little bit less.
00:11:01.040 So, fewer people saw it than would have otherwise.
00:11:04.160 So, it definitely...
00:11:05.360 By what percentage?
00:11:06.740 I don't know off the top of my head, but it's...
00:11:08.600 You should probably know that!
00:11:09.940 It's meaningful.
00:11:11.100 But, I mean, but basically, a lot of people were still able to share it.
00:11:17.580 We got a lot of complaints that that was the case.
00:11:19.920 You know, obviously, this is a hyper-political issue.
00:11:22.480 So, depending on what side of the political spectrum, you either think we didn't censor it
00:11:25.640 enough or censored it way too much.
00:11:27.180 But we weren't sort of as black and white about it as Twitter.
00:11:30.440 We just kind of thought, hey, look, if the FBI, which I still view as a legitimate institution
00:11:36.100 in this country...
00:11:37.060 Of course you do.
00:11:37.840 Chat, you're saying that third-party doesn't mean AI.
00:11:39.660 Then what do you mean by that?
00:11:40.620 Who's fact-checking if it's not Facebook?
00:11:42.600 It's Facebook's platform.
00:11:44.080 He said third-party fact-checking program.
00:11:46.740 A program is AI.
00:11:48.340 It's like very professional law enforcement.
00:11:50.560 They come to us and tell us that we need to be on guard about something.
00:11:53.660 Then I want to take that seriously.
00:11:55.000 Did they specifically say you need to be on guard about that story?
00:11:57.900 No.
00:11:59.740 I don't remember if it was that specifically, but it basically fit the pattern.
00:12:04.080 When something like that turns out to be real, is there regret for not having it evenly distributed
00:12:12.260 and for throttling...
00:12:14.240 When I get a copyright claim for nudity, and when I have a community guideline strike right
00:12:21.660 now for corona misinformation, do they feel bad?
00:12:26.480 Do you know the answer?
00:12:27.760 Great question, Joe.
00:12:28.740 Because it's no.
00:12:29.680 They don't give a fuck about us.
00:12:32.140 They don't.
00:12:33.240 The distribution of that story?
00:12:36.000 What do you mean evenly distributed?
00:12:37.440 I mean evenly in that it's not suppressed.
00:12:40.980 It's not...
00:12:41.440 Yeah, yeah, yeah.
00:12:42.480 I mean, it sucks.
00:12:44.080 Yeah.
00:12:44.400 Yeah, I mean, because I'm going to turn...
00:12:45.720 Mark Zuckerberg said it sucks.
00:12:47.160 ...out after the fact.
00:12:48.260 Oh, yeah, receive.
00:12:48.900 Oh, no, they suck.
00:12:50.560 Hey, he's not so bad after all.
00:12:52.400 I know you're a lizard, but yeah, you're cool.
00:12:54.560 The fact-checkers looked into it.
00:12:55.300 No one was able to say it was false, right?
00:12:57.480 So basically it had this period where it was getting less distribution.
00:13:00.740 Um, so yeah, I mean, I, I, but I think like, I think it probably, it sucks though.
00:13:07.440 I think in the same way that probably having to go through like a criminal trial, but being
00:13:12.840 proven innocent in the end sucks.
00:13:14.340 Like it still sucks to have, have...
00:13:16.020 No, bro, it's not the same.
00:13:18.680 Because all this misinformation that they claim was misinformation that ends up being
00:13:22.500 true was suppressed from the greater public.
00:13:25.020 And now you have all these boomers who still think that everything they heard in the beginning
00:13:29.240 of this whole pandemic is right.
00:13:32.540 And now when you're saying the truth, they still won't believe you.
00:13:34.900 And it's so divided because you suppressed the truth in the beginning.
00:13:37.720 And they, they act like...
00:13:38.540 Before you even knew anything.
00:13:39.500 They act like time is not one of the most valuable assets we have.
00:13:43.180 See, when a criminal goes to jail for something they didn't do for 20 years and they're like,
00:13:47.840 oh, oopsie.
00:13:49.180 And they give them like $5 million and let them out.
00:13:51.340 They still lost 20 years they could have spent with their kids.
00:13:54.600 It goes the same.
00:13:55.720 I know it's different, but it goes the same with information that they
00:13:58.660 decide is misinformation.
00:14:00.560 That time that it could have been up and people could have heard it and people could
00:14:02.980 have heard the truth while you deleted it while it was going through your fucking third
00:14:06.820 party AI bullshits, people didn't hear it.
00:14:10.460 And time is important.
00:14:11.700 A lot of people need to hear it right now.
00:14:13.520 Just because it gets up eventually doesn't mean that it's fine just because it should have
00:14:18.420 been heard at the time that it was posted.
00:14:20.120 And the difference is that the misinformation is reversible.
00:14:23.200 Like if you wanted to push the things that are right now to the public, you have the power
00:14:26.560 to do that.
00:14:27.060 Mark Zuckerberg has the power to do that.
00:14:28.660 They actively don't because the FBI doesn't want to or whatever their agenda is, but it
00:14:34.200 is reversible for them.
00:14:36.300 That you had to go through a criminal trial, but at the end you're free.
00:14:40.200 So it's, I don't know if the answer would have been don't do anything or don't have
00:14:44.940 any process.
00:14:45.440 I think the process was pretty reasonable.
00:14:48.260 You know, we still let people share it, but obviously you don't want situations like
00:14:52.620 that.
00:14:52.860 But certainly much more reasonable than Twitter's stance.
00:14:55.300 And it's probably also the case of armchair quarterbacking, right?
00:14:59.540 Or at least Monday morning quarterbacking, I should say.
00:15:02.620 Because in the moment you had reason to believe based on the FBI talking to you that it wasn't
00:15:09.260 real and that there was going to be some propaganda.
00:15:12.160 So what do you do?
00:15:14.100 Yeah.
00:15:14.720 And then if you just let it get out there and what if it changes the election and it
00:15:18.820 turns out to be bullshit, that's a real problem.
00:15:22.080 And I would imagine that those kind of decisions are the most difficult.
00:15:27.540 The decisions of like what is allowed and what is not allowed.
00:15:31.300 Yeah.
00:15:31.600 Yeah.
00:15:31.760 I mean, what would you do in that situation?
00:15:33.220 I don't know what I would do.
00:15:34.120 I would have to like really thoroughly.
00:15:36.360 Well, first of all, you're dealing with the New York Post, which is one of the oldest
00:15:40.380 newspapers in the country.
00:15:42.260 So I would I would say I would want to talk to someone from the New York Post and I would
00:15:48.140 say, how did you come up with this data?
00:15:51.140 Like where where are you getting the information from?
00:15:53.820 How do you know whether or not this is correct?
00:15:56.100 And then you have to make a decision because they might have got duped.
00:15:59.780 It's it's very it's hard because everybody wants to look at it after the fact.
00:16:05.140 Now that we know that the laptop was real and it was a legitimate story and there is
00:16:10.780 potential corruption involved with him.
00:16:14.320 What we think, oh, that should not have been restricted.
00:16:19.840 That should not have been banned from sharing on Twitter.
00:16:22.520 That's a good point.
00:16:23.140 Right.
00:16:23.420 What do you decide what journalists?
00:16:24.760 The FBI decides. He's like, I think the FBI is a good institution.
00:16:27.480 They're the ones who tell him to hide anything that goes against Joe Biden.
00:16:32.140 When the election comes around, you're about to see it when the midterms come up.
00:16:34.820 The level of censorship is about to triple because they have the most power.
00:16:38.900 They said, don't... Hunter Biden, it's Russian misinformation.
00:16:42.840 No, that guy's a crackhead and he's been running wild doing whatever he wanted for decades.
00:16:47.660 And they said that his laptop like they didn't even want to release the information on his
00:16:51.760 laptop, but they knew what he was doing.
00:16:54.180 It's not misinformation.
00:16:55.200 It's not Russia.
00:16:57.240 You knew what he was doing, but you hide it because the election was coming around and
00:17:01.580 you wanted to get rid of Trump.
00:17:03.160 You think any journalist in Minecraft and Minecraft, I don't actually mean that they
00:17:06.720 rigged it.
00:17:07.100 You think any journalistic company has integrity to like prove or just state facts like anywhere
00:17:12.180 that you would look that it's just straight facts.
00:17:13.600 It's all third.
00:17:14.660 It's all like alternative sites that they call conspiracy sites.
00:17:17.620 But you think Mark Zuckerberg is above that?
00:17:19.880 Yeah.
00:17:21.020 Duh.
00:17:21.320 I think everybody agrees with that.
00:17:23.200 Even Twitter agrees with that.
00:17:24.760 But the thing is, then they didn't think that.
00:17:28.000 In the beginning, they thought it was fake.
00:17:30.160 So what do they do?
00:17:31.380 Like if something comes along and the Republicans cook up some scheme to make it look like Joe
00:17:37.040 Biden's a terrible person and they only do it so that they can win the election, but it's
00:17:40.940 really just propaganda.
00:17:42.860 What are you supposed to do with that?
00:17:44.960 You're supposed to not allow that to be distributed.
00:17:47.520 So if they think that's the case, it makes sense to me that they would try to stop it.
00:17:53.420 But I just don't think that they looked at it hard enough.
00:17:56.860 When the New York Post is talking about it, they're pretty smart about what they release
00:18:02.420 and what they don't release.
00:18:03.480 If they're going over some data from a laptop and you could talk to a person.
00:18:10.900 But again, this is just one story, one individual story.
00:18:14.260 How many of these pop up every day, especially in regards to polarizing issues like climate
00:18:20.880 change or COVID or, you know, foreign policy or Ukraine, anytime there's like a really controversial
00:18:28.000 issue where some people think that it's imperative that you take a very specific stance and you
00:18:34.880 can't have the other stance like that, those moments on social media, those trouble a lot
00:18:41.920 of people because they don't know why certain things get censored or certain things get promoted.
00:18:48.360 Yeah, I agree.
00:18:52.080 What do you mean you agree?
00:18:53.160 It's you!
00:18:53.560 It's to be in your spot.
00:18:54.900 And that was one of the things that I really wanted to talk to.
00:18:56.360 Yeah, see, why is Joe not pressing him?
00:18:57.640 I don't like this.
00:18:58.480 It's you!
00:18:59.240 What do you mean you agree?
00:19:00.900 You own it!
00:19:02.440 Being in your spot must be insanely difficult to have.
00:19:06.620 Joe must have gotten mad for this.
00:19:09.680 What decision you make, you're going to have a giant chunk of people that are upset at you.
00:19:15.020 And there might be a right way to handle it, but I don't know.
00:19:17.560 Yo, chat!
00:19:18.360 Is Joe Rogan controlled opposition?
00:19:21.700 I hear you guys say that a lot.
00:19:22.800 I personally don't think so, but I hear that sentiment a lot about Alex Jones and Joe Rogan.
00:19:26.980 Is he controlled opposition?
00:19:28.460 Is he just a puppet too?
00:19:29.680 What the fuck the right way is.
00:19:30.580 Well, I think the right way is to establish principles for governance that try to be balanced
00:19:38.860 and not have the decision-making too centralized.
00:19:41.180 Because I think that it's hard for people to accept that, like, some team at Meta or that I personally am making all these decisions.
00:19:51.300 And I think people should be skeptical about so much concentration around that.
00:19:56.140 So that's why a lot of the innovation that I've tried to push for in governance is around things like establishing this oversight board.
00:20:03.820 So that way you have people who are luminaries around expression from all over the world, but also in the U.S.
00:20:11.020 You know, I mean, folks like Michael McConnell, who's, I mean, he's a Stanford professor, who's like, just, he was, I forget which, which Republican president appointed him.
00:20:22.120 But I mean, he was, I think, going to be considered for the Supreme Court at some point.
00:20:25.560 I mean, he's a very, very prominent and kind of celebrated free expression advocate.
00:20:33.240 And he helped me set the thing up.
00:20:35.160 And I think, like, setting up forms of governance around.
00:20:39.800 Just the fact that Tate was banned on all these platforms is a perfect example of that, bro.
00:20:43.640 But Cardi B runs wild, who openly said that she drugs and robs men.
00:20:47.840 She can be on every single platform.
00:20:49.800 She's the queen of hip hop.
00:20:51.600 Politicians talk to her.
00:20:52.660 They bring her on debate panels and walk her in the White House.
00:20:55.620 And she has publicly said, I drugged and robbed men.
00:21:00.320 Tate has been canceled for false accusations, things that didn't happen.
00:21:04.260 The fact that he hurt people's feelings.
00:21:06.020 Freedom of speech is dead.
00:21:07.320 And that's the perfect example of why.
00:21:09.260 That are independent of us, that basically get the final say on a bunch of these decisions.
00:21:16.800 And that's a step in the right direction.
00:21:18.740 I mean, in the Hunter Biden case that you talked about before, I don't want our company to decide what's misinformation and what's not.
00:21:25.660 So don't work with third parties and basically.
00:21:28.600 So you don't want to be responsible.
00:21:30.600 So you just blame someone else when you own it.
00:21:33.680 It's you.
00:21:34.280 You have the final say.
00:21:35.440 Different organizations do that.
00:21:37.760 Now, I mean, then you have the question of, are those organizations biased or not?
00:21:41.160 And that's a very difficult.
00:21:43.040 You see how he's just not taking accountability, like a girl in the It's Complicated videos, just no response.
00:21:49.380 You own it.
00:21:50.580 But at least we're not the ones who are basically sitting here.
00:21:52.840 At least we're not the ones.
00:21:54.260 Ministry of truth for the world that's deciding whether everything is true or not.
00:21:57.280 Tom, Joe, how are you saying?
00:21:58.520 Run.
00:21:59.000 So I'd say this is not a solved problem.
00:22:04.820 Controversies aren't going away.
00:22:06.040 You know, I think that it is interesting that the U.S. is actually more polarized than most other countries.
00:22:16.140 So I think sitting in the U.S., it's easy to extrapolate and say, hey, it probably feels this way around the whole world.
00:22:24.220 And from the social science research that I've seen, that's not actually the case.
00:22:28.140 There's a bunch of countries where social media is just as prominent, but polarization is either flat or has declined slightly.
00:22:35.660 So there's something kind of different happening in the U.S.
00:22:40.520 But for better or worse, I mean, it does seem like the next several years do seem like they're set up to be quite polarized.
00:22:48.140 So I tend to agree with you.
00:22:49.780 There are going to be a bunch of different decisions like this that that come up because of the scale of what we do.
00:22:56.380 Almost every major world event has some angle that's like the Facebook or Instagram or WhatsApp angle about how the services are used in it.
00:23:04.380 So, yeah, I think just establishing as much as possible independent governance.
00:23:08.720 You just dodged it the whole time, the whole interview.
00:23:15.140 Yeah.
00:23:15.260 Going back to what you said about how they brought Cardi B on for politicians, I think that's the lamest shit in the world, bro.
00:23:20.500 Like they bring her in the White House.
00:23:21.800 It's like a Fresh and Fit type beat where you just bring stupid people in to just roast them and be smarter than them.
00:23:26.620 Like that's what they're doing.
00:23:27.280 Cardi B is just a stunt for them to be like, look how dumb liberals are because you fuck with Cardi B.
00:23:32.700 And we're just smarter than her in life because she's a stripper from the Bronx.
00:23:37.160 And we went to debate team our whole life.
00:23:40.940 Let me call Deion now.
00:23:42.940 Free Deion's hat.
00:23:49.000 That's it.
00:23:49.480 Is this your job?
00:23:50.700 Thank you.