Timcast IRL - Tim Pool - October 23, 2020


Timcast IRL - Facebook Whistleblower In Studio EXPOSES Election Interference And Censorship


Episode Stats

Length

2 hours and 15 minutes

Words per Minute

199.2

Word Count

26,900

Sentence Count

2,123

Misogynist Sentences

32

Hate Speech Sentences

28


Summary

We're a week away from the presidential election, and the media, big tech, and big old Joe Biden are all out to stop Donald Trump from getting re-elected, and they're doing it in a big way that we've never seen before. We're joined by Ryan Hartwig, a former content moderator for Facebook, who leaked a ton of documents to Project Veritas, and also interviewed with them and laid it all out.


Transcript

00:00:00.000 We're about a week out from one of the most consequential elections in American history
00:00:29.000 and probably the most important election in our lifetimes.
00:00:33.000 Donald Trump versus Joe Biden.
00:00:35.000 And the most important thing that any one of these candidates can do in order to get more votes is to get their message out.
00:00:41.000 Well, we've seen some of the most dramatic censorship in the past couple of weeks, notably the New York Post.
00:00:49.000 It is the oldest newspaper that's been in uninterrupted circulation, the fourth biggest in the country, and it's still, as far as I can tell, suspended on Twitter for a story we now know to be true.
00:01:01.000 Now, there's a lot in this story about Hunter Biden that we haven't necessarily confirmed, but there are emails, they're suspect.
00:01:07.000 We've got a whistleblower, a man named Tony Bobulinski, who has now come out and said he's worked with the Bidens and he knows they were doing this nefarious stuff.
00:01:17.000 Censored.
00:01:18.000 How is Donald Trump supposed to win when he has got the media and big tech set up against him?
00:01:24.000 Maybe he will.
00:01:25.000 We don't know for sure.
00:01:26.000 But we have a very special guest today.
00:01:28.000 Tonight, Ryan Hartwig, who was a content moderator for Facebook, who leaked a bunch of documents to Project Veritas.
00:01:34.000 You also interviewed with them and just basically laid it all out, exactly what you could do, what they were doing.
00:01:40.000 And I asked you just a moment ago, you would say election interference.
00:01:43.000 Yeah, I think that's a valid phrase to describe what Facebook is doing.
00:01:47.000 I mean, so just to set it up for you, in 2016 Trump won the election, and in 2017 is when Facebook contracted Cognizant for this project.
00:01:59.000 So we'll jump into it.
00:02:01.000 So obviously you know all about this stuff.
00:02:04.000 There's a bunch of documents out in front of you.
00:02:06.000 And I gotta say, the first little conversations we had before the show went live, it sounds crazier than even I realize, and scarier.
00:02:13.000 You started mentioning what qualifies
00:02:15.000 an influencer, the swears you're allowed to use against people, and why you aren't allowed to use certain swears.
00:02:20.000 This is weird stuff.
00:02:21.000 So, of course, this should be interesting, too, because, as you know, Ian Crossland's chillin'.
00:02:25.000 Hey, guys.
00:02:25.000 But as we mentioned the other day with Allum Bokhari, Ian was a co-founder of Minds.com, which is one of the most prominent... I don't like saying alternate, you know, social media platforms, but Minds is pretty big.
00:02:37.000 It's got a couple million users.
00:02:40.000 Minds.com is pretty prominent in terms of not one of the big three social media networks.
00:02:44.000 You moderated for them as well.
00:02:45.000 Yeah, I did a lot, just like Ryan.
00:02:47.000 Similar, probably, to what you did.
00:02:49.000 I think the difference is you weren't purposefully targeting people based on their politics to remove them.
00:02:53.000 I was purposefully not targeting.
00:02:54.000 Yes.
00:02:55.000 Okay, okay, so we'll get into all this.
00:02:56.000 And then, of course, we've got Sour Patch Lids producing.
00:02:59.000 Hello, I'm over here.
00:02:59.000 Hi.
00:03:01.000 But I think, look, we've got a lot of stories in the news.
00:03:03.000 I think we should just jump right in because we are about a week and a half away from this election.
00:03:09.000 About 50 million people, that's my understanding, have already voted.
00:03:12.000 These people can't change their votes.
00:03:14.000 And that's one of the craziest things.
00:03:15.000 They changed the rules.
00:03:16.000 They put up all this early voting.
00:03:18.000 They got all this mail-in voting.
00:03:19.000 I'm willing to bet there's a lot of people right now who are learning things they couldn't have.
00:03:23.000 Like it was very... Let me rephrase this.
00:03:26.000 There's a lot of people who are just now learning things.
00:03:29.000 Information that was probably suppressed.
00:03:31.000 I bet there's a lot of people who watched that debate the other night.
00:03:33.000 And when Donald Trump said, your family was making a ton of money off these deals, these interviews, these emails, these meetings, you know, people want to know your brother made millions in Iraq.
00:03:44.000 Someone probably said, whoa, what is that?
00:03:46.000 And they couldn't hear that story because Twitter and Facebook were actively censoring it.
00:03:50.000 They may have already voted.
00:03:51.000 And now they're sitting there saying, oh no, I already voted for this guy.
00:03:54.000 This is why censorship is so crazy.
00:03:56.000 And it's gonna have a huge impact in the next 10 or so days.
00:04:00.000 So, uh, we're here with Ryan Hartwig.
00:04:03.000 You were a, a, a moderator, a content moderator for Facebook.
00:04:07.000 Yeah.
00:04:07.000 What is, what did you do?
00:04:09.000 What is that?
00:04:09.000 Yeah, so as a content moderator, I mean, we see the most vile things that you can imagine that are on the internet.
00:04:15.000 So, there was a training for a month and they threw us on the production floor and we would be seeing, you know, incest videos, snuff videos.
00:04:27.000 I was working for a time the Spanish queue in Latin America, so I'd see a lot of cartel violence, beheadings, throat slittings, pornography.
00:04:36.000 So everything that's horrible on the internet, that's what I would take down and delete.
00:04:41.000 I've heard a lot of people get PTSD from watching these videos all day, every day.
00:04:46.000 I have to imagine.
00:04:47.000 That's messed up stuff.
00:04:48.000 Yeah.
00:04:50.000 At the beginning, in the Spanish queue, I'd get more gross stuff, more graphic violence, more cartel violence.
00:04:56.000 But towards the end, when I switched over to the North American side, I didn't get as much.
00:05:01.000 We'd get some child pornography as well.
00:05:03.000 But yeah, some of my coworkers, I was just talking to one the other day who has PTSD symptoms.
00:05:08.000 But they didn't have counselors on site, counselors that would be coaching us and giving us techniques.
00:05:13.000 So I don't feel like I really had too much PTSD.
00:05:15.000 Most of the time I didn't bring it home at night.
00:05:18.000 But yeah, it is a tough job.
00:05:20.000 Ian's messed up.
00:05:21.000 He's gone.
00:05:21.000 I was going crazy.
00:05:24.000 What was your schedule like?
00:05:26.000 Um, so they had quite a few different shifts.
00:05:27.000 Mine was mainly the day shift, but like, I'd hate to have like the night shifts, like having to see that kind of stuff.
00:05:33.000 Cause I think your brain's different like during the night shift, how it interprets things.
00:05:37.000 So we had, we had all shifts.
00:05:38.000 Yeah.
00:05:39.000 So you would, what would you do?
00:05:40.000 Would you delete videos, delete posts or what?
00:05:42.000 Yeah, I would delete videos, I would delete groups, pages.
00:05:44.000 I monitored the Mexican presidential election in 2018, so there was about 200 of us on the Spanish side, and we were monitoring the Mexican presidential election.
00:05:53.000 But yeah, I could take down whole groups, pages, videos, posts, comments, for Facebook and Instagram.
00:05:59.000 First simple question.
00:06:00.000 In your experience, having worked at this company, taking down groups, pages, videos, did you feel there was a political bias?
00:06:07.000 Yeah, 100%.
00:06:08.000 100%.
00:06:09.000 Against what political group?
00:06:11.000 Against conservatives.
00:06:13.000 That's simple.
00:06:14.000 Plain and simple.
00:06:16.000 There it is.
00:06:16.000 I mean, the first year, before the Covington law firm did an audit of Facebook, they were blatant.
00:06:22.000 Every time Trump gave a speech, even the State of the Union, they're like, hey guys, watch out for hate speech stemming from Trump.
00:06:28.000 Wow.
00:06:29.000 Everything he said.
00:06:30.000 And then, I mean, by May of 2019, and we can go through the timeline at some point, but yeah, I mean, I had made a list before I even decided to reach out to, like, journalists or Project Veritas.
00:06:42.000 I made a list of about 20 examples of biases I'd seen.
00:06:45.000 And then that list just grew and grew.
00:06:46.000 And so by the time I went public four months ago, I mean, the list is like I have 30 plus clear cut examples of bias.
00:06:53.000 And this is why Congressman Matt Gaetz, you know, could take the evidence I gave him.
00:06:58.000 And he was able to give that to the DOJ.
00:07:01.000 And because of that, there's a criminal referral to the DOJ for Mark Zuckerberg.
00:07:04.000 Criminal?
00:07:05.000 Yeah, criminal referral.
00:07:07.000 This is not civil, this is criminal.
00:07:08.000 Why is it criminal?
00:07:09.000 Because that referral was for alleged perjury.
00:07:14.000 Because in April 2018, Zuckerberg testified that they do not censor political speech.
00:07:19.000 And they do.
00:07:20.000 They sure do.
00:07:21.000 So what's an example?
00:07:22.000 Did you ever personally remove American conservatives?
00:07:26.000 Um, yeah, so, for example, just a quick example, like, um, there was a viral video in summer of 2018 where this Trump supporter got attacked in a restaurant.
00:07:37.000 He was a kid, like a 16-year-old kid.
00:07:39.000 Was that the splashed in the face thing?
00:07:41.000 Like a drink?
00:07:41.000 I think it might have been.
00:07:42.000 Yeah, I think he might have splashed his drink on him.
00:07:46.000 And so Facebook said, Hey, well, there was cursing in that video towards a minor.
00:07:49.000 So delete the whole thing.
00:07:50.000 And they even knew it was a viral video.
00:07:52.000 They said, Hey, we know this is a viral video showing a Trump supporter being attacked, but because there's cursing delete the whole video, which.
00:08:00.000 kind of fits the... it's kind of a gray area in the policy.
00:08:02.000 Like, we don't allow cursing at a minor, but normally that's person to person.
00:08:06.000 So if I'm on Facebook attacking a minor, cursing at the minor,
00:08:09.000 that's different than just sharing a video with a neutral caption.
00:08:12.000 And in some of those videos, the curse words were even bleeped out. Wow.
00:08:16.000 So how would you describe yourself politically? Do you... are you conservative?
00:08:20.000 Yeah, I think more conservative.
00:08:22.000 More, like, libertarian.
00:08:23.000 Yeah.
00:08:24.000 So, when you were there—well, actually, let me ask, how long were you there for doing this job?
00:08:29.000 Yeah, I was there for just under two years.
00:08:30.000 Wow, you were there for a while.
00:08:32.000 Yeah.
00:08:32.000 Didn't it get to you, you're like, I'm deleting this very important stuff that, like, is important for people to know?
00:08:38.000 Yeah, it did get to me, and that's why I started making a list on my own.
00:08:41.000 Like, hey, here's some examples that I saw.
00:08:44.000 But, like, the policy is very nuanced.
00:08:46.000 So, to the average person who's a content moderator, it might not stick out.
00:08:51.000 It might not be too obvious.
00:08:52.000 But, like, once you dig down into the policy, you're like, hey, this is baked into the cake.
00:08:56.000 It's not just a couple of rogue moderators who are deleting Trump content.
00:09:00.000 It's built into the policy.
00:09:01.000 What about leftist content that you think should have been removed that they told you not to remove or allowed to stay?
00:09:08.000 Do you see a lot of that?
00:09:12.000 There's a few examples of that.
00:09:14.000 I mean, obviously, the most clear-cut example I have is there was a post in 2017, actually.
00:09:19.000 I wasn't there at the time, but I could go back and see a post from 2017.
00:09:22.000 And they're clearly saying that Antifa is not a hate organization.
00:09:26.000 But it's funny because in the post, they're like, hey, there's a bunch of protests being organized in like nine American cities.
00:09:31.000 There are alleged ties to Antifa.
00:09:33.000 Please remember that Antifa is not a hate organization.
00:09:36.000 Wow.
00:09:36.000 So, I mean, yeah, I can definitely see them protecting leftist viewpoints when it comes to protests, topless protests when there's females protesting, or if there was a protest called Grab Them by the Ballot that showed a bunch of females naked protesting Trump, a play on the, you know, grab them by the... Right, right, right.
00:09:56.000 Yeah, the Trump line, yeah.
00:09:59.000 And so... That was allowed.
00:10:01.000 They would allow it.
00:10:01.000 So they make newsworthy exceptions whenever they want to change the policy at their whim.
00:10:06.000 So, you think they swung any elections?
00:10:09.000 You think they swung 2018's midterms, or what?
00:10:12.000 2018 was a trial run.
00:10:15.000 So, Facebook told us, and the word on the street there at Facebook was, hey, we brought all the content moderation to the US, which is very expensive, by the way.
00:10:24.000 It was like a $200 million three-year contract.
00:10:26.000 So they brought these jobs to the U.S.
00:10:28.000 so they could keep it closer on the election.
00:10:31.000 And the reason they gave was because Russia interfered in the 2016 election.
00:10:36.000 So that was the whole basis for them bringing thousands of jobs to the U.S.
00:10:40.000 was, hey, Russia interfered in 2016.
00:10:43.000 We messed up.
00:10:44.000 We're trying to fix this.
00:10:45.000 But in 2018, yeah, we had a training deck just for the 2018 midterms.
00:10:51.000 Excuse me.
00:10:52.000 Yeah, we had a training deck.
00:10:53.000 So they said, hey, flag any content that's election related.
00:10:56.000 If it meets certain criteria, flag it with a VI, which goes directly to the Facebook queue, to Facebook employees.
00:11:04.000 So, and then just this past fall, they sent us a message saying, we urgently need visibility into conversations about the Democratic debates, the Democrat debates, during the primaries.
00:11:16.000 So even stuff that's not violating, they want to know what's going on.
00:11:18.000 We're their eyes and ears.
00:11:20.000 Because without us flagging trends, like I was flagging this past January and December, I was flagging like Boogaloo and Civil War was trending.
00:11:28.000 So we flagged trends to them.
00:11:32.000 You're not just removing stuff.
00:11:34.000 You're actually like scouting intel and giving them information on what people are talking about.
00:11:38.000 Exactly.
00:11:38.000 Interesting.
00:11:39.000 Allum Bokhari, for those unfamiliar with him on the show, he's a journalist and tech reporter for Breitbart, and he's been covering a lot of this.
00:11:45.000 He mentioned there's this program where they're trying to pull people to the center.
00:11:51.000 Have you heard about this at all?
00:11:52.000 I think I heard mention of it on the interview on Tuesday.
00:11:54.000 So they're trying to pull people to the center.
00:11:56.000 Yeah.
00:11:56.000 Well, it sounds like you didn't come across anything like that in your work.
00:12:00.000 Actually, when I talked to the policy manager, Sean, I had a lot of conversations with him.
00:12:04.000 He was in charge of the, he was a cognizant employee, but he could make decisions for like the entire, you know, all the staff at the Phoenix location.
00:12:13.000 So he could make a decision for a thousand workers about the policy.
00:12:18.000 And so I asked him about that and he's like, yeah, we try to like segregate people, like like-minded people together to prevent more, yeah.
00:12:25.000 They're tribalizing people on purpose.
00:12:27.000 Yeah.
00:12:28.000 Why?
00:12:28.000 He said, like, to prevent, I don't know, maybe... Fights?
00:12:32.000 Prevent conflict or whatnot.
00:12:34.000 Because I raised up civil war to him, the trending civil war.
00:12:37.000 This was during the impeachment.
00:12:38.000 And people were talking about Boogaloo, which kind of means civil war.
00:12:41.000 Right, right.
00:12:41.000 He's like, that's great.
00:12:42.000 Like, keep on sending me these jobs.
00:12:44.000 Facebook really wants visibility and wants to know what's going on.
00:12:47.000 He's like, if Facebook had identified some of these trends in 2016, then they would have been glad to know what the trends were in 2016 when Trump won.
00:12:56.000 They were trying to figure out why people who are far right became more moderate, regular, conservative, whatever, and they wanted to find whatever content they were viewing and give them more of it.
00:13:11.000 So I'm wondering if...
00:13:12.000 Did they ever come to you and say, this content clearly breaks the rules?
00:13:17.000 Like, I understand you mentioned the leftist protest, but I'm wondering if there was other examples where they said, these things get a special exemption.
00:13:24.000 Like, straight up told you, don't get rid of this kind of content.
00:13:31.000 I'm trying to think of some examples like that, because I know they gave newsworthy exceptions. Like, if there were celebs opposing the abortion law in Alabama and they said something that violated the policy, Facebook would give them a pass. We can go back to that later if you'd like, discussing abortion. But so they gave specific newsworthy exceptions to allow promotion of leftist ideologies. But as far as what you're saying, like, is there a type of content we're looking for?
00:13:59.000 They did say, look for right-wing extremism globally that might lead to violence.
00:14:05.000 And they did call out Spain.
00:14:06.000 They said, hey, in Spain, there's a separatist movement, separatist nationalist movement involving with the Basques.
00:14:11.000 Look out for violence stemming from that.
00:14:14.000 So, in a way, they did try to, like, by asking the content moderators, like thousands of them, to look for certain things.
00:14:20.000 Like, we're going to be... We get bored.
00:14:22.000 We see hundreds of posts a day.
00:14:24.000 We probably do 100, 200 jobs a day.
00:14:26.000 So, by feeding us information, by telling us to look for certain things, it kind of sways things a certain way.
00:14:35.000 Were your co-workers progressive or leftist or what?
00:14:39.000 Some of them were.
00:14:40.000 Some of them were right-wing.
00:14:41.000 Some of them were conservative.
00:14:42.000 There's two guys I worked with who were actually in the original video with Project Veritas, Jose Moreno and his friend.
00:14:50.000 And we had conversations.
00:14:53.000 I sat with them towards the last couple months at work.
00:14:56.000 So there was a pretty diverse group of people, but all the leadership I noticed were more left-leaning.
00:15:04.000 Yeah.
00:15:05.000 So Sean Browder, for example, was a huge Bernie Sanders supporter.
00:15:08.000 The reason I ask is because, you know, when I worked for some of these media companies, it seems like their goal is to hire people who are progressive left-leaning and then let them do their thing.
00:15:18.000 You know, they don't need to tell you to go after conservatives if you're already biased, you know?
00:15:24.000 So that's why I asked.
00:15:25.000 But it doesn't seem like they were doing that.
00:15:26.000 It doesn't seem like they were hiring people based on that; it seemed like they were just hiring, you know, regular people.
00:15:32.000 I think for promotions, I think they definitely did take that into account.
00:15:35.000 What's crazy, too, is the summer after I started, that June or July, they actually made us link our personal Facebook accounts to continue working.
00:15:43.000 Wow.
00:15:43.000 Why?
00:15:43.000 Were they tracking you?
00:15:45.000 They said the excuse was so that we didn't accidentally action our friends' content.
00:15:49.000 Like, what's the probability of it, right?
00:15:51.000 Yeah.
00:15:52.000 So, it freaked a lot of people out because a lot of people didn't have a Facebook account.
00:15:55.000 But yeah, I applied for a promotion a couple of times, never got it.
00:15:59.000 I applied to the policy team, the same team with Sean Browder, and I have a degree.
00:16:04.000 And a lot of these people were young, like in their early 20s, fresh out of high school, didn't have a degree.
00:16:08.000 So, I think for promotions, they definitely did take ideology into account.
00:16:13.000 Interesting.
00:16:14.000 Do you feel like they, because of your politics or just because you didn't fit in with like their culture, they didn't give you a promotion or they held you back or what?
00:16:24.000 Yeah, I think it's probably the politics about it.
00:16:28.000 I mean, in the interviews, they said, hey, as part of the policy team, you're gonna be interfacing with the client a lot.
00:16:34.000 So, I mean, if you're a higher-up, if you're hiring someone for the policy team and you know they're gonna be interacting a lot with the client, why would you promote a right-wing or conservative person if, I mean, if you're trying to protect, yeah.
00:16:47.000 It's really similar to what I was told when I worked for Fusion, side with the audience.
00:16:51.000 Look, the people who come to us are progressive, therefore we give them, you know, we want to give them what they want.
00:16:56.000 And so that seems to make sense.
00:16:58.000 Yeah.
00:16:58.000 Based on what you were doing, I asked you already, do you think they swung an election?
00:17:05.000 Do you think what they did helped the Democrats win in the midterms in 2018?
00:17:09.000 Yeah, definitely.
00:17:10.000 A hundred percent.
00:17:11.000 I mean, just think about, it's about gathering information and intel.
00:17:16.000 So, I mean, you have, for example, just as an example, this past fall you had their Ukraine whistleblower and Facebook's guidance was to delete that.
00:17:25.000 And I was on the front lines when that happened.
00:17:27.000 You were at Facebook?
00:17:28.000 Yeah.
00:17:28.000 Yeah. So, we can't say the Ukraine whistleblower's name on YouTube right now.
00:17:32.000 This is active censorship. If we say this person's name, they will cut the feed.
00:17:36.000 I have videos on YouTube that are in this weird state that doesn't exist anywhere else on YouTube
00:17:43.000 So what happens is, if you break the rules on YouTube, they'll delete your post. If it's not a rule-breaking thing
00:17:50.000 but it's, like, borderline, they're like, well, look, they'll do what's called forced private.
00:17:56.000 Your video will change to private so only you can see it, and you can't change it back.
00:18:01.000 But you don't get a strike, it's not banned, it's just one of the steps they have.
00:18:05.000 What they did to my videos on the Ukraine whistleblower, this is the guy who started the whole impeachment process, they are almost just a graphic on the website.
00:18:14.000 When I go into my videos, I have one video on my main channel, one on my second channel, TimCast and TimCast News.
00:18:19.000 When I go into my videos, the videos are there, you can see them, but you can't click anything.
00:18:24.000 When the mouse goes over it, it doesn't change.
00:18:26.000 So weird.
00:18:28.000 When you hover over a link, it like turns into a little finger about to click it, nothing.
00:18:32.000 And I click and nothing happens.
00:18:34.000 That's what they do.
00:18:35.000 So when all this is going down, I went on Facebook, and I immediately started, you know, posting this guy's name like crazy.
00:18:42.000 Now here's the crazy thing.
00:18:43.000 I never got any warnings.
00:18:45.000 I never got any, like, notifications.
00:18:47.000 The posts would just disappear.
00:18:51.000 That was you.
00:18:52.000 Yeah, that was us.
00:18:53.000 So I actually discovered it, and we actually sent it to Facebook so Facebook finalized the policy.
00:18:57.000 So originally, and this goes back to 2018, because, you know, if Facebook can, essentially, his name was a Republican talking point.
00:19:05.000 You know, Rand Paul tried to mention his name.
00:19:07.000 It was on Fox News.
00:19:08.000 He said it in the Senate.
00:19:09.000 Yeah, exactly.
00:19:10.000 A senator named this guy because of his potential ties to Democrats, his lawyers, the statements they made.
00:19:16.000 And there was also a statement made about him that had nothing to do with impeachment or whistleblowing.
00:19:20.000 Yeah.
00:19:21.000 You could not say his name no matter what.
00:19:23.000 There was a C-SPAN video.
00:19:24.000 A senator, an American senator, on the Senate floor said to the American people, this guy and this guy were overheard saying they wanted to remove Trump.
00:19:33.000 Had nothing to do with the whistleblowing.
00:19:35.000 That video got removed from YouTube.
00:19:37.000 So it really wasn't to protect him as a whistleblower.
00:19:40.000 It was deleting a Republican talking point.
00:19:43.000 That's what it was plain and simple.
00:19:44.000 So when we first discovered it, I ran across this job and I was talking to my
00:19:49.000 coworker Skylar about it and we're like, Hey, what should we do?
00:19:52.000 This guy's, you know, a whistleblower or whatever.
00:19:55.000 And so we raised it to our local policy team, and they made an interim decision for the next six hours to delete it under our privacy policy, because they thought that he was undercover law enforcement.
00:20:08.000 And I have screenshots of that exact same policy.
00:20:10.000 Wait, wait, they thought the whistleblower was an undercover law enforcement?
00:20:14.000 Yeah.
00:20:15.000 I mean, he literally worked for the CIA.
00:20:17.000 Yeah.
00:20:17.000 He wasn't undercover.
00:20:19.000 No.
00:20:20.000 And so that was the initial decision from our local policy team.
00:20:23.000 And then six hours later, Facebook said, okay, we'll continue deleting it, but we're going to delete it under this generic part of the policy called coordinating harm.
00:20:31.000 So there's nothing in the wording of that policy at all that relates to whistleblowers.
00:20:36.000 So it was just delete, coordinating harm, other.
00:20:38.000 So it's just some generic part of it.
00:20:40.000 But I have conversations.
00:20:42.000 I was recording, filming at the time when I had those conversations with Skylar and my other coworkers.
00:20:47.000 I have a really good analysis of it written up.
00:20:49.000 But yeah, it's mind-blowing.
00:20:50.000 So you were deleting posts?
00:20:51.000 I was deleting all day.
00:20:52.000 So they feed them into our queue.
00:20:53.000 So they do a proactive pull, and they pull in so they can search whatever name it is.
00:20:57.000 They pull it into our queue, and all day I'd get like 100 jobs like that.
00:21:01.000 Just delete, delete.
00:21:02.000 Did you have a touch screen?
00:21:04.000 No, it was your mouse.
00:21:05.000 So you were literally, like, clicking, clicking, because the pace was probably crazy.
00:21:09.000 Like people were probably saying this guy's name thousands of times per hour.
00:21:12.000 Yeah.
00:21:13.000 Yeah.
00:21:14.000 And then we had shortcut keys.
00:21:15.000 It was like two, seven, seven.
00:21:17.000 Oh wow.
00:21:18.000 So much easier.
00:21:19.000 Wow.
00:21:20.000 I mean, did that shock, that worry you at all?
00:21:22.000 Were you, were you like, why am I deleting this?
00:21:23.000 Or were you just like, I'm at work and I'm going to delete all these guys, you know?
00:21:26.000 Some people enjoyed it.
00:21:27.000 I'm sure they did.
00:21:30.000 They enjoyed the power trip.
00:21:32.000 But we even had a picture of George Soros' son that people were confusing with the Ukraine whistleblower, and I raised that up.
00:21:38.000 And they asked my supervisors, and they asked Facebook actually, I think, and they said, no, still delete it because they're implying that it's him.
00:21:46.000 So it wasn't even him, it was someone else.
00:21:48.000 It was Alexander Soros.
00:21:50.000 I fooled around a bit on Facebook to see what I can get away with.
00:21:53.000 Of course you did.
00:21:54.000 And I did one post that was like, I wrote this thing, which was basically, I wrote a short paragraph saying why censorship is wrong, but the first letter of each word spelled his name going straight down.
00:22:06.000 It's pretty creative.
00:22:06.000 Yeah, I'd seen those ones.
00:22:07.000 It never got deleted.
00:22:08.000 Never got deleted.
00:22:09.000 No, why?
00:22:09.000 No, yeah, I think I still have it on, maybe they deleted it and I just didn't notice at this point.
00:22:12.000 But it was up for a really, really long time as far as I know.
00:22:15.000 And I thought it was going to be temporary.
00:22:16.000 I'm like, okay, yeah, it's trending right now, maybe in a couple months, but I checked back and that was like in October or November.
00:22:23.000 I checked back in January, and up until I left, when the project ended this past February, that guidance was still active.
00:22:28.000 You want to know what the craziest thing was?
00:22:30.000 One of my posts, we'll just call this guy John Doe.
00:22:33.000 No, I called him Voldemort.
00:22:34.000 We'll call him Voldemort.
00:22:35.000 He who must not be named.
00:22:36.000 Yeah, we can.
00:22:36.000 So, I made a post, using his real name, of course, but it said, Voldemort is a 55-year-old dental hygienist from, you know, Dubuque, Iowa.
00:22:45.000 He has a family, a wife, and five kids, and he's gonna, you know, something really benign and having nothing to do with anything.
00:22:52.000 They deleted it.
00:22:54.000 Just because that name was in it, even though it was a text post about a dental hygienist in Dubuque, totally different person, gone.
00:23:03.000 Now that was over the top.
00:23:04.000 I was like, wow.
00:23:06.000 It almost felt like it was a robot doing it, but it was people.
00:23:09.000 Yeah, there really could be someone with that name, who happens to have the same name as him, who's getting punished because of it.
00:23:16.000 And speaking of attacking innocent people, and like the average Joe citizen, you know who Caitlin Bennett is, right?
00:23:23.000 Yes.
00:23:24.000 So there was a meme that's trending about her, that's still trending, of her like passed out drunk.
00:23:30.000 Supposedly it was her, and there was like feces coming out.
00:23:33.000 It was really gross.
00:23:34.000 Right.
00:23:34.000 She made a mess.
00:23:35.000 Yeah.
00:23:35.000 So they still, to this day... Is that real?
00:23:38.000 I don't know.
00:23:38.000 Is it real?
00:23:39.000 That's what we were debating and that's the kind of discussions, we have weird conversations at Facebook.
00:23:44.000 Sounds like it.
00:23:45.000 Just a quick aside, you know that the Melania Trump nude photo, where she's crotch to crotch with another female?
00:23:52.000 We had this huge discussion about that because, per Facebook policy, if they're crotch on crotch and one of them doesn't have underwear on, if it's a guy and a girl, there could be, for lack of a better word, penetration.
00:24:06.000 I'm trying to talk about this scientifically.
00:24:07.000 Be a little family friendly.
00:24:09.000 Um, and so we were, we ended up interpreting the policy to delete that because anyways, it was just kind of weird, but it's, you know, two females next to each other.
00:24:17.000 Um, so back to Kaitlin Bennett.
00:24:20.000 So Kaitlin Bennett, yeah, there's this meme trending about her, you know, passed out drunk, but you can't see her face.
00:24:26.000 So there's some college co-ed face down, and her friends are standing around, and she's passed out drunk, and there's stuff coming out of her backside, and her skirt's pulled up halfway, like halfway up her buttocks.
00:24:37.000 And so.
00:24:39.000 There's three different policies that we'd look at in this situation.
00:24:43.000 So the first is the bullying policy.
00:24:45.000 So first of all, is she a public figure or a private individual?
00:24:47.000 We don't know.
00:24:48.000 But the policy says if we don't know if it's a public figure or private, then we default to private individual to protect the private individual.
00:24:56.000 So they should have done that from the get-go.
00:24:58.000 There's another policy called sexual exploitation of adults that covers creep shots or taking pictures of people when they're passed out, half naked.
00:25:06.000 That makes sense.
00:25:07.000 And so there's clearly a lot of things they could have deleted this for.
00:25:12.000 But the guidance was, and I have a screenshot of the guidance, they said, well, we don't know, but we kind of think that it is true, so leave it up.
00:25:22.000 You couldn't even see what it was?
00:25:23.000 You couldn't tell what it was.
00:25:24.000 And the specific guidance in the letter of the law was to default to private individual.
00:25:29.000 If you're not sure, default to private individual.
00:25:31.000 So they didn't follow their own policy, which as you can see is a trend.
00:25:35.000 Facebook not following their own rules.
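The review sequence the guest walks through above (default to private individual when status is unknown, then check the bullying and sexual-exploitation policies) can be sketched as a small decision function. This is a hypothetical reconstruction from what he says on air; the names and structure are assumptions, not Facebook's actual tooling.

```python
# Hypothetical sketch of the moderation decision described above.
# Policy names and logic are reconstructed from the conversation,
# not taken from Facebook's real rules engine.

def classify_subject(known_public_figure):
    # Stated guidance: if unsure whether someone is a public figure,
    # default to "private individual" to protect them.
    return "public" if known_public_figure else "private"

def should_delete(known_public_figure, is_degrading_image, subject_incapacitated):
    subject = classify_subject(known_public_figure)
    # Bullying policy: private individuals get stronger protection.
    if subject == "private" and is_degrading_image:
        return True
    # Sexual exploitation of adults: images of people who are passed out
    # (creep shots) are removable regardless of figure status.
    if subject_incapacitated and is_degrading_image:
        return True
    return False

# The meme described above: identity unknown, degrading, subject passed out.
print(should_delete(known_public_figure=None, is_degrading_image=True,
                    subject_incapacitated=True))  # → True: policy says delete
```

On these rules the image comes down either way, which is the guest's point: leaving it up contradicted both policies at once.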
00:25:37.000 Yeah.
00:25:38.000 And so they left it up.
00:25:40.000 So that could be some innocent 19-year-old girl who's now being made fun of nationally.
00:25:46.000 Well, yeah, that's what they always say about Kaitlin Bennett.
00:25:50.000 And I didn't know anything about what it was.
00:25:52.000 They were just, you know, they make fun of her.
00:25:54.000 So this could be just entirely made up to go after her to try and poison the well, discredit her so that she can't speak.
00:26:00.000 And Facebook allows it to happen in violation of their own rules.
00:26:02.000 Right.
00:26:03.000 Amazing.
00:26:03.000 Have you seen any other examples like that?
00:26:05.000 Or was that...
00:26:06.000 So that was a big one.
00:26:08.000 I think also with Greta Thunberg. You know, I think she does a lot of good work.
00:26:15.000 I think she's a wonderful person.
00:26:16.000 I mean, but you know, she's been ridiculed a lot.
00:26:21.000 So Facebook gave her an exception, to give her additional protections. And we know she has, you know, autism, I think, and so I want to be respectful of that as well. I think she's 17 or 18 now, I'm not sure. But anyways, people were calling her gretarded, and this is after that incident at the UN where there was an exchange between Greta Thunberg and Donald Trump.
00:26:45.000 But people were calling her gretarded.
00:26:47.000 Gre.
00:26:47.000 Gretarded.
00:26:48.000 Yeah.
00:26:48.000 Yeah.
00:26:49.000 And so it was kind of a play on her name.
00:26:51.000 Oh, I see, Greta.
00:26:51.000 Retarded, gretarded.
00:26:53.000 So minor public figures like her, they do have some protections, like you can't make sexual jokes about them because they are minors.
00:27:00.000 But you can call a minor, you can call Jojo Siwa a retard.
00:27:05.000 Like, you can call anybody else who's a minor public figure a retard.
00:27:09.000 It's not a nice thing to say, but Facebook allows it, because it's not sexual in nature.
00:27:13.000 But Facebook made an exception to disallow and delete any mention of Greta Thunberg.
00:27:18.000 And so that's when they used their proactive pull again.
00:27:20.000 Their AI scraped the system and dumped all instances of Greta Thunberg into our queue.
00:27:25.000 Even the actual post we had on our workplace that was mentioning retarded, it pulled that same post into our queue.
00:27:33.000 So every day we're deleting hundreds of jobs related to that.
00:27:36.000 So it's one example of them kind of making an exception to the rules to protect certain people.
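The "proactive pull" described above, where an AI scrape routes every mention of a protected name into moderator queues, can be illustrated with a toy version. The term list and queue here are hypothetical examples, not Facebook's system.

```python
# Illustrative sketch of the "proactive pull" described above: scan posts for
# terms on a protected list and route every match into a moderation queue.
# The term list and queue are hypothetical; this is not Facebook's tooling.

from collections import deque

PROTECTED_TERMS = {"greta thunberg", "gretarded"}  # assumed exception list

def proactive_pull(posts, queue):
    """Dump any post mentioning a protected term into the review queue."""
    for post in posts:
        text = post.lower()
        if any(term in text for term in PROTECTED_TERMS):
            queue.append(post)

queue = deque()
posts = [
    "Greta Thunberg spoke at the UN today",
    "Dental hygienists of Dubuque unite",
    "that gretarded meme again",
]
proactive_pull(posts, queue)
print(len(queue))  # 2 posts pulled for review
```

Note how the match is on the name alone, which is consistent with the earlier anecdote: a benign post gets pulled just for containing a listed term.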
00:27:40.000 Very clearly protecting the left and going after the right.
00:27:43.000 Right.
00:27:43.000 So, well, I've asked you twice now: you think they did provide a benefit to the Democrats, helped them win, essentially, in 2018?
00:27:51.000 Yeah, I did.
00:27:52.000 100%.
00:27:53.000 Do you think that's what's happening now?
00:27:55.000 100%. I mean, yeah, all the evidence I have points towards that, that they are interfering in the election. I have these notes here. So this is what they did for the 2020 elections: they created a new queue, it's called the civic harassment queue, and so they combined basically hate speech and bullying in a way. But they're saying, they said, why were the changes made to the existing guided review tree for bullying harassment?
00:28:22.000 And they said bullying harassment has been identified as a priority issue around the US 2020 election.
00:28:28.000 We acknowledge that anyone can share an opinion about the US 2020 election, but not all voices carry equally far.
00:28:34.000 nor are equally susceptible to attacks.
00:28:37.000 We want to protect not only influential figures who are vulnerable to harassment
00:28:42.000 through their status, but also ordinary folks that make themselves vulnerable by interacting with content generated
00:28:48.000 by these figures.
00:28:49.000 So they're saying there's this overlap between like hate speech and like the election.
00:28:53.000 So what happened was we ended up getting... I think they just really wanted to see what was trending.
00:28:59.000 So I saw DC Drano's Instagram account a lot.
00:29:03.000 Some of these huge Instagram influencers.
00:29:06.000 So more things got reported in those comments.
00:29:09.000 And so I think they just really wanted to see the trends.
00:29:13.000 But yeah, it was always a priority.
00:29:16.000 When they track the trends, it's giving them intelligence on what's going on and how people think and feel when it comes to elections and politics.
00:29:23.000 Yeah.
00:29:24.000 You know, I'm curious because I know that Mark Zuckerberg apparently had a private meeting with I think like Trump and Ben Shapiro and some other people or something.
00:29:31.000 I'm wondering if they realized... a lot of people were mentioning that if the Republicans win, and they're not censoring Republicans,
00:29:39.000 they'll probably be fine, because conservatives rarely want to regulate big companies.
00:29:42.000 In fact, there's a lot of conservatives right now saying, no, it's a free speech thing, we don't want to regulate.
00:29:47.000 If the left wins, even if they support them, they're going to regulate antitrust or whatever.
00:29:51.000 So a lot of people were suggesting that Mark Zuckerberg at some point realized the trend.
00:29:55.000 That free speech, you know, liberals were joining conservatives and defending, you know, free speech and these values.
00:30:03.000 And then Mark Zuckerberg switched to start defending the right.
00:30:06.000 So now we have, there's two big stories that overlap.
00:30:10.000 One is that Facebook recently was de-ranking progressive websites like ThinkProgress.
00:30:16.000 And that it's actively helping, in a way, conservative sites, and that's why conservative sites are now in the top ten most shared or engaged-with content every single week.
00:30:27.000 You can see it's like Fox News, Ben Shapiro, Dan Bongino, every single time.
00:30:31.000 So I'm curious if you saw, it doesn't sound like you saw anything, but I'm wondering if you saw any kind of shift in that capacity.
00:30:38.000 If there's any veracity to that theory.
00:30:40.000 So, from a business standpoint, obviously it makes sense to side with people who are more towards free speech or more libertarians, who want less government involvement, because the last thing Facebook wants is more government regulation, right?
00:30:54.000 I ask a lot of people at Facebook, my coworkers, like, hey, what are Facebook's motives?
00:30:59.000 Are they political?
00:31:01.000 And they're like, well, it's all about the money for them.
00:31:03.000 So I think there's some validity to that.
00:31:05.000 I know Mark Zuckerberg got a lot of flak when he said, about a year ago, that we're not going to fact-check political ads.
00:31:14.000 And the left destroyed him.
00:31:15.000 Oh, and then he met with Trump.
00:31:16.000 And now they're banning Trump's ads.
00:31:18.000 Yeah.
00:31:19.000 So they finally flip-flopped.
00:31:21.000 And I have some personal experience with that.
00:31:23.000 When I worked at Uber Corporate in 2016 as a contractor, I worked as a fraud analyst for about two months.
00:31:31.000 And we had this club meeting with a group called UberHue, like the Ebony Club from high school or whatever.
00:31:39.000 And one of the leaders was like bragging about how they got Travis Kalanick, the CEO of Uber, to retract his statement of All Lives Matter.
00:31:47.000 Wow.
00:31:47.000 And so she was like bragging about it.
00:31:49.000 And then she's like, yeah.
00:31:50.000 And then he donated this something to the fourth floor of the Sears Center in Sacramento that was dedicated to the Black Panthers.
00:31:59.000 And I'm like, So you're admitting that, like, Black Lives Matter, like, helped, like, forced him to retract his statement.
00:32:05.000 Like, that's kind of messed up.
00:32:08.000 So yeah, as far as, yeah, with Mark Zuckerberg and this trend.
00:32:11.000 So I did notice a lot of changes after the civic audit.
00:32:14.000 So the civic audit, from the Covington law firm with former Senator Jon Kyl, started in about May of 2018, right after the... Yeah, what is this? This lawsuit?
00:32:23.000 Yeah, so basically I think there was pressure from Congress, obviously, because Zuckerberg had just testified in April of 2018, and there was this claim that conservatives were being censored.
00:32:34.000 So this independent law firm, Covington, headed by former Arizona Senator Jon Kyl, basically went in and interviewed a bunch of people working at Facebook and tried to validate these claims. And so they found that, yeah, there was some validity to them.
00:32:52.000 And so I saw a change after that.
00:32:55.000 So after that point, then they began tracking.
00:32:59.000 The news were the exceptions.
00:33:00.000 They began being a little more careful about what they said.
00:33:02.000 I saw fewer of the more blatant leftist-leaning posts.
00:33:07.000 But as to your question as far as, you know, what Mark Zuckerberg's strategy is, or if he's now leaning to the right, I mean, from a business standpoint, like I say, I mean, it's possible that he's trying to play both sides.
00:33:23.000 But I think it definitely, I think there's too much pressure from these organizations on the left.
00:33:28.000 I mean, if you can have the Uber CEO retract his statement because of pressure from Black Lives Matter, I think there's too much social pressure from the left.
00:33:36.000 So it's gonna always lean that way.
00:33:40.000 Based on what you were saying, did it feel like, you know, there were a lot of people who... What was there more of that you saw?
00:33:48.000 Right-wing or left-wing content?
00:33:51.000 I would say there's more right-wing content.
00:33:53.000 I saw a lot of posts.
00:33:55.000 I did see, you know, was it Now This?
00:33:58.000 Now This Politics a lot.
00:34:00.000 And I saw DC Drano a lot.
00:34:02.000 What's DC Drano?
00:34:03.000 He's right-wing.
00:34:04.000 Oh, okay.
00:34:05.000 He's in Florida.
00:34:06.000 He's got about a million followers on Instagram.
00:34:08.000 Oh, wow.
00:34:09.000 He's running for some office, right?
00:34:11.000 I'm not sure.
00:34:12.000 He might be.
00:34:13.000 I know he's a big Trump supporter.
00:34:17.000 So yeah, there's a lot of... I saw more right-wing content.
00:34:21.000 There was a fair mix, I would say.
00:34:23.000 It seems like... I mentioned this, I think the previous episode, there's a meme that came from 4chan that any sufficiently free space will become right-wing and only through hard moderation can a space support left-wing ideas.
00:34:39.000 Something like that.
00:34:40.000 Yeah.
00:34:40.000 And what's interesting...
00:34:43.000 I guess my question to you in that capacity is, when it came to left-wing content versus right-wing content, do you see a difference between an individual's post or a corporation's post?
00:34:52.000 Like, was one side doing more corporate, one side more people, individuals?
00:34:57.000 I definitely saw more corporate posts that were leftist.
00:35:01.000 And I saw more just individual people who were right-wing.
00:35:04.000 Uh, and then a funny thing is with the, with, um, how they treat certain words.
00:35:07.000 So with, there was something called the bullying slang list.
00:35:11.000 So if I call someone, if I call Ian over here, if I say Ian, Ian, you're a Trump humper.
00:35:17.000 Fair.
00:35:20.000 And you, you report it.
00:35:21.000 I, as a moderator, can see that Ian reported that comment.
00:35:24.000 So it's called a name-face match.
00:35:26.000 Interesting.
00:35:27.000 So that gives me more power.
00:35:29.000 Now the content moderator can say, hey, Ian himself didn't like that comment.
00:35:33.000 And so, but Trump Humper, it stays up no matter what, even if you report it.
00:35:38.000 Yeah, it stays up.
00:35:39.000 So what if I call you a feminazi or a snowflake?
00:35:42.000 Oh, snap.
00:35:42.000 Really?
00:35:43.000 Really?
00:35:43.000 Snowflake, wow.
00:35:45.000 Trump-Humper is okay.
00:35:46.000 Trump-Humper is okay.
00:35:47.000 Nazi is okay.
00:35:49.000 Snowflake and Feminazi are not okay.
00:35:51.000 Feminazi is no good, but Nazi is great.
00:35:53.000 Yeah, wow.
00:35:55.000 So that gives an example of kind of what you were getting into.
00:35:57.000 I hope that answers your question.
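The asymmetry just described amounts to a lookup table: some insults are removable when reported, others stay up regardless. A toy sketch follows, using only the terms mentioned in the conversation; the structure is an assumption, not the real "bullying slang list."

```python
# Sketch of the asymmetric "bullying slang list" lookup described above.
# Only terms mentioned in the conversation are included; the structure
# is an assumption, not Facebook's actual list.

REMOVABLE_SLANG = {"snowflake", "feminazi"}    # reported => comes down
PROTECTED_SLANG = {"trump humper", "nazi"}     # stays up even if reported

def review_report(term, reported_by_target):
    """Return the action a moderator would take on a reported insult."""
    term = term.lower()
    if term in REMOVABLE_SLANG and reported_by_target:
        return "delete"   # a name-face match gives the report more weight
    if term in PROTECTED_SLANG:
        return "ignore"   # policy leaves these up no matter what
    return "escalate"     # anything else is a judgment call

print(review_report("Snowflake", reported_by_target=True))     # delete
print(review_report("Trump Humper", reported_by_target=True))  # ignore
```

The point of the sketch is that the outcome depends entirely on which list a term was placed on, not on how insulting it is.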
00:35:59.000 Yeah, you know, that's what I assumed.
00:36:02.000 That corporations are pushing the leftist narrative, and the people who follow it follow it blindly.
00:36:08.000 And then the individuals are right-wing.
00:36:09.000 There's two ways to look at it.
00:36:11.000 For one, the right is individualist.
00:36:14.000 So you've got more people on the right, they want to take care of themselves, mind their own business.
00:36:17.000 And the left is more collectivist.
00:36:19.000 So they have a hierarchical collective structure, a corporation, telling them what to do and think, and they go along with it.
00:36:26.000 But in that sense, too, it's interesting.
00:36:28.000 That's why that meme exists, that without moderation, it would all be right-wing, because the individuals would all be the ones speaking, and the left would not have any kind of group at all.
00:36:38.000 They wouldn't be doing anything.
00:36:39.000 In fact, they'd probably just become right-wing, seeing nothing but right-wing memes and ideas and things like that.
00:36:44.000 So one of the interesting aspects of that is, it seems like there may—it's not just a left-wing bias, it's a pro-corporate bias.
00:36:53.000 Yeah, I mean, we see that just recently.
00:36:55.000 We've seen that with, what was it, Expensify that sent out that email?
00:36:58.000 Oh, that's right.
00:36:59.000 Yes, yes, yes.
00:37:00.000 So Expensify sent an email out to all their consumers basically saying, literally in the subject line, vote for Joe Biden.
00:37:06.000 Or there will be a civil war.
00:37:07.000 Or there will be a civil war.
00:37:09.000 And you mentioned this collective force of the left wing, and it's like the Borg, like you mentioned on Tuesday, the Borg from Star Trek. Yeah. Resistance is futile.
00:37:17.000 Yeah, and then they were throwing, who is it, Armstrong? That CEO of that one tech company?
00:37:23.000 Under the bus, because he said, I don't want to talk about politics at work. You remember his name? I think it was Armstrong.
00:37:32.000 Yeah, the CEO's name was Armstrong.
00:37:33.000 Which company was that for?
00:37:35.000 Um, I don't remember the name.
00:37:37.000 I'll look it up for you.
00:37:38.000 Yeah.
00:37:38.000 At the top of my head.
00:37:39.000 But yeah.
00:37:40.000 So, I mean, yeah, we see that resistance is futile.
00:37:43.000 I mean, they're pressuring, they pressured Travis Kalanick to retract All Lives Matter.
00:37:47.000 And that was in like 2016.
00:37:48.000 That was before it became this big.
00:37:52.000 But right now, like, you know, if you're a corporate entity, you pretty much have to cave into the woke left mob.
00:37:58.000 Yeah.
00:37:59.000 What is Expensify?
00:38:00.000 Do you know?
00:38:02.000 I think they, it has something to do with expense reports.
00:38:05.000 Yeah, right.
00:38:06.000 Because in the article, in the email, he did like a Q&A.
00:38:10.000 He's like, well, why does a civil war matter?
00:38:12.000 Aren't you overreacting?
00:38:13.000 He's like, well, in a civil war, I wouldn't be able to bill out expense reports.
00:38:17.000 There's not going to be a lot of expense reports in a civil war.
00:38:19.000 So he's like, so the gist of this story is he is a tech CEO in San Francisco.
00:38:27.000 And he sent out an email to all of his customers saying, you must vote for Joe Biden.
00:38:33.000 I saw some people complaining saying, why is he using my private information for this?
00:38:38.000 He should not be emailing me these messages outside of the realms of what his business does.
00:38:43.000 I don't think he broke any laws doing it, but he probably pissed off a lot of customers.
00:38:47.000 But it's crazy the extent to which, I don't care what my expense report tracking company thinks.
00:38:53.000 When I hire a plumber, am I going to be like, by the way, who are you voting for?
00:38:57.000 Uh, don't worry about the toilet.
00:38:58.000 I know it's broken, but let's talk politics.
00:38:59.000 No, I say, thanks for coming.
00:39:01.000 My toilet's broken.
00:39:02.000 Have at it.
00:39:02.000 Let me know if you need anything.
00:39:03.000 Yeah.
00:39:04.000 But could you imagine if you hired a plumber and he showed up and said, before I fix your toilet, who'd you vote for?
00:39:10.000 I, I, I didn't vote.
00:39:11.000 You didn't vote.
00:39:12.000 You want a civil war?
00:39:13.000 No, man.
00:39:14.000 I took a dump and the toilets clogged.
00:39:16.000 That's what I want.
00:39:16.000 I don't know what you're talking about.
00:39:18.000 That's where we're at.
00:39:19.000 People have gone nuts.
00:39:20.000 Speaking of that collective mind, I mean, this is similar to what Zach Voorhees, the Google whistleblower, kind of uncovered.
00:39:26.000 I'm sorry, I gotta stop real quick.
00:39:28.000 I have to apologize to all plumbers.
00:39:29.000 I think anybody doing a hard job like that probably wouldn't be a leftist anyway.
00:39:33.000 Right, exactly.
00:39:34.000 Shut up.
00:39:34.000 Doing hard work.
00:39:36.000 Kind of precludes leftists.
00:39:37.000 Now I have to apologize to all the leftist plumbers.
00:39:39.000 I mean, no disrespect.
00:39:40.000 I'm just kidding, I'm just kidding.
00:39:41.000 Anyway, continue.
00:39:42.000 Zach Voorhees, who's this guy?
00:39:43.000 Yeah, so Zach Voorhees is the Google whistleblower.
00:39:45.000 So he went public about a year and a half ago.
00:39:47.000 I remember him.
00:39:49.000 He worked as a software engineer for Google for eight and a half years.
00:39:52.000 And he uncovered their algorithms that are basically just trying to shape kind of that collective mind in a way, shape humanity.
00:39:58.000 ML Fairness, is that what he uncovered?
00:40:00.000 Yeah, Fairness.
00:40:01.000 Yep, exactly.
00:40:02.000 Or algorithmic fairness.
00:40:03.000 Algorithmic fairness.
00:40:04.000 Yeah.
00:40:05.000 And it was basically this, essentially a way to influence as well the 2020 election.
00:40:09.000 And I'm also working with Dr. Robert Epstein.
00:40:12.000 Oh yeah, that's right.
00:40:15.000 He's actually my 501c3 sponsor for my foundation.
00:40:18.000 And so he uncovered the fact that Google was influencing search results in the 2016 election towards Hillary Clinton.
00:40:25.000 And this guy's a classic liberal, someone who's not- He's got photos!
00:40:28.000 So Dr. Robert Epstein.
00:40:31.000 He's got pictures of him with, like, the Clintons, and he's, like, giving a thumbs up.
00:40:33.000 He's all happy and excited.
00:40:35.000 And then he was, like, Google is swinging the election in favor of the Clintons, and this is scary.
00:40:38.000 Not excited.
00:40:39.000 Yeah.
00:40:39.000 Or in favor of Hillary.
00:40:40.000 And he has the science to back it up.
00:40:42.000 I mean, behavioral research, so.
00:40:44.000 I want to have him on.
00:40:45.000 So are we—is this our last election?
00:40:47.000 If Trump loses—and I'm not saying this to praise Trump.
00:40:51.000 I'm saying, if the Democrats win and the machine is favoring them to win, are we just under the boot of the machine if we can't stop it now?
00:40:59.000 Yeah, pretty much.
00:41:00.000 I mean, I hate to be a pessimist.
00:41:02.000 All right, I'm going fishing, guys.
00:41:03.000 Ian, take over the show.
00:41:04.000 I'm just kidding.
00:41:05.000 Let's talk about space.
00:41:06.000 Let's read sci-fi.
00:41:07.000 Let's just play PUBG.
00:41:08.000 How pessimistic are you?
00:41:09.000 Let's read sci-fi, let's just play PUBG.
00:41:12.000 Yeah, I'm very doomed.
00:41:13.000 No, no, no, but how pessimistic are you?
00:41:15.000 Do you think this is the end?
00:41:16.000 I think there's some hope.
00:41:17.000 I think there's a glimmer of hope.
00:41:19.000 There's actually a lawsuit coming up.
00:41:20.000 I talked to this guy named Jason Fick, F-Y-K, and he has a lawsuit against Facebook.
00:41:25.000 Facebook essentially stole his page, deleted his page, and tried to sell it to someone else.
00:41:31.000 What?
00:41:32.000 Yeah.
00:41:33.000 And he had like, I don't know, 50, 40 million followers or something like that.
00:41:35.000 Huge account.
00:41:36.000 Wow.
00:41:37.000 And so his lawsuit might go to the Supreme Court.
00:41:39.000 It might be appealed to the Supreme Court.
00:41:41.000 And Clarence Thomas just issued an opinion last week on Malwarebytes versus something else.
00:41:49.000 Basically, he issued a really important opinion that's talking about Section 230.
00:41:54.000 So there's a glimmer of hope, and it has to do with how this Ninth Circuit has interpreted Section 230 incorrectly.
00:42:03.000 Interesting.
00:42:04.000 So with Jason's lawsuit, basically he tried to sue them, and then Facebook's like, oh, well, we're not the publisher. But "the publisher" is different than "a publisher."
00:42:16.000 So they're like, we're not the publisher, as in we weren't the ones who originally wrote the content.
00:42:22.000 But they were acting as a developer because they were promoting or de-boosting certain content.
00:42:26.000 Right.
00:42:26.000 And so that's the argument that Jason Fick has: basically, to fix Section 230, we need to have it interpreted correctly by the Supreme Court, because it's been interpreted incorrectly by the Ninth Circuit Court of California, which has given additional protections and immunity under Section 230 C1.
00:42:49.000 Yeah, not C2, because we always talk about C2. So, well, hold on, can you explain Section 230, just easily, for people who don't know what it is? Yeah, so in 1996, Congress created a law that was supposed to protect children on the internet from bad content, so that if someone hosts a platform or a message board, that message board would not be responsible for every single comment.
00:43:17.000 But it gave immunity to these platforms.
00:43:20.000 I hate to use the word platform, but... It's like digital information site or something, I think.
00:43:24.000 It's like a language.
00:43:25.000 Yeah, like there's service providers, there's information content providers, which is you and me, and then there's another category.
00:43:33.000 But yeah, for the sake of simplicity, yes, it gave platforms immunity.
00:43:37.000 But yeah, so the way it was designed, the way it was interpreted is fine, but the way it's been reinterpreted by the 9th Circuit gives additional immunity to these platforms.
00:43:44.000 Basically, how it started was there was like a news website, I guess, and someone commented on it saying... I think it was the dude from Wolf of Wall Street, actually, who was like the subject of the suit.
00:43:56.000 Someone went on the website, commented, this dude is like a scammer or something like that.
00:44:01.000 So he sued the website saying, you published this comment.
00:44:05.000 The website said, no we didn't, it was a comment from somebody else.
00:44:08.000 So this led to the creation of this law that said, okay, newswebsite.com can't be responsible for a comment.
00:44:15.000 So we're going to pass this law that says you are not responsible for this content so long as you are not the publisher or editor of the content.
00:44:24.000 Then the website said, wait, wait, wait.
00:44:27.000 But we might want to remove content.
00:44:29.000 I mean, do we lose this protection if someone posts a picture of like a dead cat or something?
00:44:33.000 And they're like, okay, that's a good point.
00:44:34.000 Okay.
00:44:35.000 So you're not responsible for what users post and you can remove things if they're objectionable.
00:44:42.000 And they said, excellent.
00:44:43.000 This was like God tier immunity.
00:44:46.000 The lawmakers didn't realize what they had just done.
00:44:49.000 Now, literally everything's objectionable.
00:44:51.000 What does it even mean?
00:44:52.000 Nobody even knows.
00:44:53.000 So it's a good-faith effort to moderate.
00:44:55.000 You know, if it's lewd, lascivious, or objectionable.
00:44:57.000 So now you get Twitter being like, this guy tweeted, learn to code.
00:45:01.000 Well, that's objectionable.
00:45:02.000 Nuked.
00:45:03.000 And they're protected.
00:45:04.000 They have immunity.
00:45:06.000 No other company has this.
00:45:08.000 So it gets even crazier when we talk about Facebook.
00:45:12.000 Because Facebook has fact-checkers now that are a special class of people that they choose Not individuals who just sign up and do things.
00:45:21.000 No, it's their choice.
00:45:22.000 Their criteria.
00:45:24.000 And they can say, you're a liar.
00:45:26.000 They can take something you said and put a big thing over it that says, fake news.
00:45:30.000 Which is a statement of fact.
00:45:32.000 False information.
00:45:33.000 This person is wrong.
00:45:35.000 That is coming from Facebook and no one else.
00:45:38.000 So this is what's crazy to me.
00:45:39.000 If you want to make an argument that, you know, Facebook, Twitter, whoever, shouldn't be sued because a commenter said something, I'm listening.
00:45:45.000 If you tell me that Facebook can appoint people to insult and make statements about other people or defame by calling them liars, Well, Facebook's responsible for that.
00:45:53.000 Facebook's the one who's authorized that posting.
00:45:56.000 It is not the same as a random user signing up and using it.
00:46:00.000 They've said, okay, these seven people are allowed to say whatever they want.
00:46:03.000 It's like, okay, well, Facebook, when you put a card over my post, you have editorialized and you have personally published a statement.
00:46:12.000 So we need 230 reform.
00:46:14.000 I know Trump has said repeal 230, which would be a huge mistake.
00:46:18.000 These platforms wouldn't exist without it.
00:46:20.000 They do need immunity so long as they're acting in good faith and they're not just removing legal speech.
00:46:27.000 So you mentioned before you have some ideas on 230.
00:46:29.000 Yeah, exactly.
00:46:30.000 That's a good interpretation of it.
00:42:32.000 So, yeah, the way that Section 230 has been interpreted by the Ninth Circuit Court of California gives Facebook additional protections under C1.
00:46:43.000 So that whole motive part of being a good Samaritan doesn't even apply.
00:46:48.000 So in Jason Fick's case, when Facebook fought back with their response, they didn't have to argue on the basis of C2; they just argued on the merits of C1, which the Ninth Circuit had interpreted in a way that favors them.
00:47:05.000 And so, I agree with you on that.
00:47:08.000 I think that repealing it completely would be a disaster.
00:47:12.000 Now, I've heard that Ajit Pai actually has authority to reinterpret and reform Section 230.
00:47:18.000 And he should have done it a long time ago.
00:47:20.000 So forgive me if I have no faith in him or any one of these Republicans to get it done.
00:47:25.000 Can you define Section 1 and Section 2 of 230?
00:47:28.000 C1 and C2.
00:47:31.000 So I don't have the... I have what Jason Fick wrote about it.
00:47:36.000 Oh, wait, wait, we actually have it.
00:47:38.000 Yeah, I got it right in front of us.
00:47:40.000 So section C2 is the one that we're all familiar with.
00:47:43.000 It talks about... So, right, so C1.
00:47:46.000 Says treatment of publisher or speaker.
00:47:49.000 No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
00:47:56.000 Which is like the individual.
00:47:58.000 Two is civil liability.
00:47:59.000 No provider or user of an interactive computer service shall be held liable on account of A.
00:48:04.000 Any action voluntarily taken in good faith to restrict access to or availability of material the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected, or B, any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph 1.
00:48:28.000 So basically, the first one, C1, is: I'm not liable for what another person said.
00:48:34.000 C2 is: we're allowed to remove whatever we want, as long as we're doing it in good faith, which at this point basically means literally whatever.
00:48:41.000 And you could argue that if they're putting a notice out that says Tim Pool posted fake news, they become a publisher under section one, C1.
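As laid out above, the two subsections operate as separate tests, and the argument here is that a platform-authored "false information" card fails the C1 test. A toy sketch of that distinction, purely illustrative and not a statement of how courts actually apply the law:

```python
# Toy sketch of the C1 / C2 distinction as described in the conversation.
# Purely illustrative; not legal advice or how courts actually rule.

def immune_under_c1(platform_authored_the_statement):
    # C1: a platform is not treated as the publisher of content
    # provided by *another* information content provider.
    return not platform_authored_the_statement

def immune_under_c2(removed_content, good_faith):
    # C2: liability shield for good-faith removal of objectionable material.
    return removed_content and good_faith

# A user's comment the platform merely hosts: C1 applies.
print(immune_under_c1(platform_authored_the_statement=False))  # True
# A "false information" card authored by the platform itself:
# on the argument above, C1 would not apply.
print(immune_under_c1(platform_authored_the_statement=True))   # False
```

The sketch makes the shape of the argument concrete: hosting someone else's speech and authoring a label about that speech are treated as different acts.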
00:48:50.000 So what I'm saying is, well let me tell you the story.
00:48:54.000 I posted something about Bill Clinton and Epstein Island.
00:48:58.000 Everything I said was factually correct.
00:49:00.000 I said Bill Clinton was seen in these flight logs, he was seen on the plane, he was ID'd by a witness on the island, and this is not major breaking news.
00:49:10.000 I'm paraphrasing right now because I don't have the tweet in front of me.
00:49:10.000 Something like that.
00:49:14.000 Somebody screenshotted it, because it went viral, posted it to Facebook, and Facebook put a card over it saying false information or something.
00:49:21.000 Yeah.
00:49:22.000 That was Facebook doing that.
00:49:24.000 Facebook put a flag saying false information on my post.
00:49:28.000 They made a statement about me that I had lied to the public.
00:49:32.000 Facebook refused to take it down.
00:49:34.000 They become a publisher at that point, under C1.
00:49:36.000 They are the speaker!
00:49:37.000 Yeah, it seems like that.
00:49:38.000 I think that's a good argument.
00:49:39.000 Yep.
00:49:40.000 And if you look at the definition of development, which is actually in the law, it says, you know, choosing to promote, prioritize, well, this is not the law, but basically choosing to promote, prioritize, advance, boost, or increase the availability or usability of information is by definition development.
00:49:56.000 Go look up development in the web service.
00:49:57.000 So I'm reading from Jason Fick's analysis.
00:50:00.000 But yeah, I mean, the service provider, yeah, they're sponsoring ads, so the entire nature of their business violates Section 230.
00:50:08.000 Because what do they do all day?
00:50:10.000 They boost ads, they place other people's content in front of other people's content.
00:50:15.000 Oh wow, so that negates it.
00:50:17.000 Yeah, that negates the immunity.
00:50:19.000 That makes sense, because one of the arguments that we've all been saying about Twitter and Facebook for a long time is, if they choose the winners and losers, they may as well be the New York Times.
00:50:28.000 The only difference, the New York Times says, I hire you, write for me, and then they choose which to go up on the site, whereas Twitter has everybody write for free, and then we'll choose which one to go up on the site.
00:50:36.000 Yeah, and even Clarence Thomas and his opinion, and that was Malwarebytes versus Enigma Software, and that was this past Tuesday, October 13th.
00:50:45.000 Wow, so recently.
00:50:46.000 Yeah, very recent.
00:50:47.000 And so he said, the Ninth Circuit incorrectly held that subsection 230C1 does not render subsection 230C2A redundant,
00:50:57.000 as subsection 230C2 provides an additional shield from liability.
00:51:00.000 So there was some redundancy.
00:51:02.000 So that's why it needs to be reinterpreted, either by Ajit Pai
00:51:05.000 or the Supreme Court.
00:51:07.000 But I agree with you because, you know, we had that antitrust hearing a couple months ago where we had the tech CEOs testify.
00:51:13.000 I guess there's another subpoena.
00:51:15.000 But, you know, Jim Sensenbrenner, this congressman from Wisconsin, he's been in like 21 terms and he's out there saying, well, we shouldn't punish successful companies.
00:51:24.000 He's a Republican, right?
00:51:26.000 He's a Republican.
00:51:27.000 The RINOs and the DINOs have been sitting in there milking the system, suckling the teat of big business, and that's why I don't vote.
00:51:27.000 Go figure.
00:51:34.000 And that's why I'm voting now, because I think something's happening.
00:51:37.000 You've got, on the left and the right, populists.
00:51:40.000 And I think the left hates the right, the right, well, I shouldn't say hate, but they think, both sides think each other is crazy.
00:51:45.000 But I'll tell you what, I would rather, I want to see these These people who sat in office for decades doing nothing, saying the bare minimum, taking cash from big corporations and then just being like, whatever, I'm not going to do anything for you.
00:51:59.000 Who cares if the people are suffering?
00:52:00.000 You got a job, huh?
00:52:02.000 Congratulations, buddy.
00:52:03.000 I hope it was worth it.
00:52:04.000 But yeah, just to summarize Section 230, I mean, imagine, like, your public library, and you go to the public library, and, you know, the library itself is not responsible for what's in the content of those books, right?
00:52:13.000 But, whereas, if you go to Barnes & Noble, like, they're promoting certain books, there's certain books that are on special, so that's kind of the difference, and so Facebook's kind of morphed from a public library into this Barnes & Noble, so to speak.
00:52:26.000 Yeah, we have that at Minds.
00:52:28.000 If it's not legal in the United States, then it's taken down.
00:52:31.000 And it's actually state by state.
00:52:33.000 And it's a Connecticut thing, because that's where the corporation's based.
00:52:35.000 So it's not legal in Connecticut, it's taken off the site.
00:52:38.000 But if it's legal, it stays on the site.
00:52:40.000 It just goes into different buckets, depending on if it's objectionable.
00:52:43.000 And then you have to opt in to see objectionable content.
00:52:46.000 By default, everyone kind of has the rated G filter on.
00:52:50.000 That's interesting, because Jack Dorsey's talked about that.
00:52:53.000 Like, uh, some kind of system where, I think it was Jack, maybe I'm thinking of somebody else, but I think it was Jack Dorsey, that instead of getting banned you get put in basically, like, the underbelly, and people can choose to go into the dark crevice of horror Twitter and see all of the nasty pictures and photos and people, and it's still, like...
00:53:15.000 Up to discretion, what's nasty?
00:53:17.000 And that's a little weird because you put it in the terms and like objectionable is a horrible word to use because like how do you define that?
00:53:24.000 It's up to the admin, basically.
00:53:26.000 But how? I don't know. We kind of have it so our peers can judge now.
00:53:26.000 Yeah.
00:53:29.000 We've got like a 12 person jury system where if they judge that it's objectionable then it goes the objectionable thing and you can appeal that objectionable, you know.
00:53:37.000 That's a great idea.
00:53:38.000 Yeah.
00:53:39.000 So, actually, this really actually relates to something I was researching.
00:53:44.000 So, back in the 60s, well, it started in the 20s, but in England, there was this thing called the British Board of Film Censors.
00:53:51.000 So, it was basically that, like, hey, there's this consensus opinion, what's culturally appropriate?
00:53:56.000 Obviously, things that we see on TV would be, you know, obscene to someone from 100 years ago, like living in the 1920s.
00:54:04.000 So, what's acceptable publicly to be viewed?
00:54:07.000 And so this British Board of Film Censors, there's this documentary called Video Nasties.
00:54:12.000 You can search it on YouTube.
00:54:14.000 And it talks about how the British Board of Film Censors basically, you know, could make their own rules.
00:54:19.000 So this famous quote from John Trevelyan, he said, we have no rules, which I think is important.
00:54:25.000 I think it's the only way to do it.
00:54:26.000 You see, if you have your rules, you either got to stick to them or you have to interpret them.
00:54:31.000 And I think either is foolish.
00:54:33.000 So he's basically admitting, like, hey, we're in charge of the film censorship.
00:54:37.000 We determine whether the movies get approved or not, whether they don't get approved.
00:54:42.000 And we have no rules.
00:54:43.000 And it really made me think of Facebook, because sure, they have the rules, but they don't follow them half the time.
00:54:48.000 They make exceptions whenever they want to protect someone that they want to protect.
00:54:53.000 All right, so let me ask you this question.
00:54:55.000 You mentioned that you submitted for some promotions.
00:54:57.000 Yeah.
00:54:58.000 But you didn't get them.
00:55:00.000 No, I didn't get them.
00:55:01.000 What would you say to somebody who said that your leaking of this information was just because you were angry they didn't promote you?
00:55:06.000 That's a valid point.
00:55:07.000 I mean, you want to, if you have anyone whistleblower comes forward, you want to examine their motives, right?
00:55:12.000 And so, um, yeah, I mean, that's a valid point.
00:55:15.000 Something that I also mentioned in the video that came out is, um, okay, well, first you got to realize, okay, my, if I got a promotion, how much more money would I be making?
00:55:25.000 I don't know.
00:55:26.000 Like a dollar 25.
00:55:28.000 I was making 15 bucks an hour.
00:55:29.000 I was making like $28,000 a year.
00:55:31.000 And so if I got a promotion, I'd be making like $16 an hour.
00:55:34.000 Facebook promised us that we'd get like an increase, but our project ended before that.
00:55:39.000 So they were going to increase to like an $18 an hour base.
00:55:42.000 But everyone was making the same.
00:55:44.000 I even knew Spanish.
00:55:45.000 I was on the Spanish side.
00:55:46.000 I was making $15 an hour.
00:55:50.000 Yes, it would have helped with my resume, I guess.
00:55:56.000 So just to clarify, you're saying it was not a factor?
00:55:58.000 No, it wasn't a factor.
00:55:59.000 So I only ask that because the next question is, at what point were you like, I've got to do something and I've got to just release all this stuff?
00:56:08.000 So it's funny because there was actually another insider who blew the whistle on wrongdoings at Cognizant who felt that they weren't helping us enough with our mental health.
00:56:22.000 I remember that story.
00:56:23.000 So February 2019, this journalist Casey Newton, he actually came to the site in Phoenix.
00:56:23.000 Yeah, the Verge story.
00:56:31.000 And I saw him there and he walked around and he wrote an article about it.
00:56:36.000 And so in a way that might have been a little bit of inspiration for me because I had some information that was kind of bothering me that I'd seen some examples of bias.
00:56:45.000 So that May, three months later in May of 2019, I wrote a letter to Congress or to a few congressmen with about 19 examples of bias and I didn't hear back and that's when I started reaching out to a couple journalists.
00:56:58.000 Can you name them?
00:56:59.000 So, I mean, I'll just say that I ended up with Project Veritas.
00:57:03.000 Someone referred me to Project Veritas.
00:57:04.000 I'm not surprised.
00:57:05.000 You mentioned... I was gonna say, when you mentioned Casey Newton, I was like, oh, did he go in and cheer for the censorship?
00:57:12.000 He's at the verge.
00:57:13.000 I know, I know.
00:57:14.000 The funny thing is, like, since you'd think he'd be interested in another... Hey, I had an insider at this location.
00:57:20.000 Another guy from the same exact location came forward with video evidence.
00:57:24.000 You'd think he'd be interested.
00:57:26.000 But I searched their website.
00:57:27.000 There's zero mention of me or Project Veritas.
00:57:30.000 I don't think they're fans of Project Veritas.
00:57:32.000 So you don't want to mention which other organizations you reached out to?
00:57:35.000 You don't have to.
00:57:36.000 I don't want to pressure you to do it.
00:57:38.000 No, not at this time.
00:57:39.000 But it did bother me because I think I saw something, what did I see that May of 2019?
00:57:47.000 There were a lot of things that bothered me.
00:57:48.000 But one of the things that bothered me the most, I think, that I saw in 2018 was: they made an exception to allow calling straight white males filth.
00:57:58.000 So they said, Hey, every summer there's this pride month.
00:58:01.000 Um, and we're going to make an exception to allow attacks on straight white males for not supporting LGBT.
00:58:07.000 So straight white, if you say straight white males are filth for not supporting LGBT, that's allowed.
00:58:11.000 So they're calling it an exception.
00:58:14.000 Um, and then they changed the policy to allow the phrase white trash, which is almost synonymous with trailer trash.
00:58:21.000 So I kind of understand, but at the same time... Do you remember those feminists also getting banned for saying hashtag kill all men?
00:58:31.000 Were you there at the time when that happened?
00:58:32.000 I don't recall that.
00:58:33.000 I know we made an exception in South Africa when there was domestic violence against women where they allowed the phrase men are trash.
00:58:39.000 Wow, that's just a bad idea.
00:58:41.000 It's so stupid.
00:58:42.000 Sounds like morons run these companies.
00:58:44.000 Forgive me if you're friends with some of these people still, but that's the stupidest thing I've ever heard.
00:58:47.000 Hey, there's a bunch of tension and animosity between people based on identity.
00:58:51.000 Can we inflame that?
00:58:52.000 Yeah, okay.
00:58:55.000 So, did you tell James O'Keefe that he's your bronze medal?
00:59:01.000 Wait, did I say it right?
00:59:02.000 He was not your first choice or whatever?
00:59:04.000 I'm kidding, I'm kidding.
00:59:06.000 So, there was a time.
00:59:07.000 So, for the first show, around June of 2019, I first contacted Project Veritas.
00:59:12.000 And then we talked a little bit, and I think I filmed the first time around June of 2019.
00:59:17.000 I don't know.
00:59:17.000 And then I didn't respond for a little while.
00:59:19.000 I didn't check my email or whatever.
00:59:22.000 And then later that fall we started talking again.
00:59:25.000 So I didn't film.
00:59:27.000 And then we found out that the project was ending.
00:59:29.000 Cognizant chose to end the project with Facebook.
00:59:32.000 And so I started filming more regularly.
00:59:33.000 Interesting.
00:59:35.000 But yeah, so the February 2019, the Verge article came out and I didn't agree with some of it because I felt like they had, like Cognizant did a really good job of helping us with our mental health.
00:59:45.000 We had counselors on site 24-7.
00:59:47.000 We had a psychiatrist who was in charge of everything.
00:59:51.000 And so we had a number to call.
00:59:53.000 They gave us wellness time, like 10 minutes a day.
00:59:56.000 Were you right-leaning before taking this job?
00:59:58.000 I was.
00:59:59.000 I heard another story that people are getting red-pilled by moderating this content because they see so many right-wing memes that they start to be like, hey, wait a minute, you know?
01:00:08.000 Is that something you've ever experienced?
01:00:10.000 Like, not you personally, but did you see anybody who was like, hey, I saw this thing?
01:00:14.000 I don't know if, I don't know anyone personally.
01:00:17.000 I know some people were like, were open to conspiracy theories or like, you know, the flat earth theories, things like that.
01:00:25.000 So it did expose you to different viewpoints and you did get a darker sense of humor.
01:00:30.000 You got kind of like a gallows humor from working there.
01:00:32.000 Did you have like really nasty inside jokes?
01:00:35.000 Like the 12th time you saw an incest video and like everyone's like, Oh, one of those, huh?
01:00:39.000 And you guys are laughing about the horror of it all.
01:00:42.000 You know, similar types of humor.
01:00:43.000 I say this because people use humor to release the tension, right?
01:00:48.000 Yeah.
01:00:49.000 So I'm wondering if, like, you're watching, like, videos of murder and, like, all this crazy stuff and then someone's just cracking really dark jokes about it to try and bring some, like, levity to the situation, you know?
01:00:58.000 Yeah, I think that can be effective in a way.
01:01:00.000 I mean, therapy, I mean, a way of dealing with it.
01:01:04.000 One strategy that was kind of cool that one of the counselors taught me is if you're seeing something really violent, you don't really want to empathize.
01:01:10.000 You want to visualize yourself in a movie theater and then there's another you standing at the back of the theater.
01:01:16.000 So you're watching yourself in the movie seat watching the film and the film is what you're seeing in the video.
01:01:23.000 So it kind of distances and detaches yourself from the actual content.
01:01:28.000 It's weird.
01:01:29.000 It's so weird.
01:01:30.000 Yeah, I had co-workers who, man, they saw child porn and it really bothered them.
01:01:36.000 That's really bad.
01:01:37.000 But it was a small percentage.
01:01:38.000 Some of it was really funny.
01:01:39.000 I might burn everything down.
01:01:41.000 I couldn't imagine someone doing that job.
01:01:43.000 You see that stuff and then you just want to track these people down.
01:01:46.000 You see their profiles, right?
01:01:49.000 You can see their personal information.
01:01:51.000 So, not too much personal information, sometimes just their profile photo and their name, but I'm sure it's crossed through people's minds, I mean, to hunt these people down.
01:02:00.000 Oh yeah, like, wow, dude. And then, that'd be a cool superhero film, I guess it'd be too gritty. Like, a dude is a content moderator for Facebook, and every time he sees a video,
01:02:10.000 he just, like, kind of like Liam Neeson, the guy gets a Facebook message and he's like, I'm a Facebook content moderator.
01:02:19.000 I will find you.
01:02:20.000 I will destroy you.
01:02:21.000 It would be like Dexter, you know, like this dude just like has their Facebook information.
01:02:26.000 I couldn't imagine working a job like that, man.
01:02:27.000 I don't know how you did it for almost two years.
01:02:29.000 Yeah, it was tough.
01:02:30.000 There were times, but you know, luckily I wasn't able to, I didn't bring it home very often.
01:02:34.000 And I think there's, you know, there's a lot of people who struggle with it more than others.
01:02:39.000 But it was just, it was a unique job.
01:02:41.000 In a way, though, if you could get past that, it was a cushy type of job.
01:02:45.000 I mean, it was... If you could get past it.
01:02:46.000 Yeah, if you could get past it.
01:02:48.000 But we'd have just the weirdest conversations, like, you know, hey, is this... Because we had definitions for, like, I don't know if I can say this, like, for erections.
01:02:56.000 Like, is this an erection?
01:02:57.000 Is it the shape?
01:02:58.000 You know, you got really nuanced and detailed.
01:03:01.000 So you'd have conversations with your co-workers, like, hey, what do you think of this?
01:03:05.000 Oh, geez.
01:03:06.000 I'm imagining a dude walking up to a guy who's putting cream in his coffee, and it's like, so I got a video today, and this guy's got a screwdriver, right?
01:03:14.000 He's holding this other guy down while strangling him, and he lifts his arm up, and then it cuts out.
01:03:18.000 I'm wondering, is that too much?
01:03:20.000 Send me the link.
01:03:21.000 Yes, send me the link.
01:03:22.000 I'll check it out, and I'll let you know what I think.
01:03:24.000 And then it's like, hey Janet, come over here and take a look at this guy.
01:03:27.000 He's choking some guy.
01:03:28.000 He's about to...
01:03:29.000 Like, literal.
01:03:30.000 That's crazy.
01:03:31.000 Yeah.
01:03:32.000 Was that what it was like?
01:03:33.000 It was.
01:03:34.000 That's exactly what it was like.
01:03:35.000 Oh, man.
01:03:36.000 I can't help feeling that there are some parallels here between what you did and what we used to go through in the hospital.
01:03:42.000 Because we would talk about these horrible, horrifying things, and you have to make a joke about them.
01:03:47.000 You just have to.
01:03:48.000 Yeah, you just can't go with it.
01:03:50.000 Yeah, you have to deal with it somehow.
01:03:52.000 But you should know, maybe this is why I wasn't affected as much.
01:03:56.000 Like, I actually had a stint at a funeral home.
01:04:00.000 So, you know, we used to have a lot of people under us.
01:04:03.000 Under you?
01:04:04.000 That sounds bad.
01:04:06.000 See, that's the gallows humor.
01:04:08.000 I used to work with a bunch of stiffs.
01:04:09.000 I got it, yeah.
01:04:12.000 It took us a while, huh?
01:04:15.000 I don't know the inside humor.
01:04:16.000 Probably wasn't the best joke.
01:04:17.000 That's funny.
01:04:19.000 If you work at a funeral home, you'll probably start laughing.
01:04:22.000 I kind of got it, but I was like, wait, is that it?
01:04:25.000 Wait, what am I missing?
01:04:26.000 Did you guys ever watch bangedup.com back in the day?
01:04:30.000 A couple decades ago, it was just gruesome people getting killed videos.
01:04:33.000 Faces of Death.
01:04:34.000 Yeah, with stuff like that.
01:04:35.000 That's like the video series.
01:04:38.000 2001, 2, 3 years.
01:04:39.000 I used to watch that.
01:04:40.000 That kind of prepped me for the job.
01:04:41.000 Dude, I remember my friends would share the Faces of Death VHS.
01:04:45.000 Gross.
01:04:45.000 You guys don't know what that was?
01:04:46.000 It was literally just videos of people dying.
01:04:49.000 And I was just like, why would you put that on?
01:04:51.000 I was like, I've got Fast Times at Ridgemont High on VHS.
01:04:55.000 Let's put that in.
01:04:57.000 That one's got boobs in it.
01:04:58.000 I don't want to watch a guy die.
01:04:59.000 I would try and watch military headcams, like a soldier getting shot and bleeding out as he's screaming.
01:05:06.000 Because my theory was, if it happened, I should expose myself to it.
01:05:09.000 And it kind of works, but it can also drive you insane.
01:05:13.000 I actually agree.
01:05:14.000 I remember there would be these viral videos periodically that were extremely gruesome.
01:05:19.000 And people would be like, don't watch it, don't watch it.
01:05:20.000 And I'd be like, I kind of feel like you shouldn't hide from what's out there.
01:05:24.000 And I'm not saying to go around and watch every single video.
01:05:27.000 You watch one and you say, I get it.
01:05:29.000 But don't hide.
01:05:30.000 Yeah, don't hide from it.
01:05:31.000 Because I've seen stuff in real life.
01:05:33.000 I've seen people get shot and killed.
01:05:34.000 I've seen some pretty brutal stuff.
01:05:36.000 Yeah.
01:05:37.000 Blow my track in me.
01:05:37.000 That's too much.
01:05:38.000 It's funny, when you were bringing up the counseling and all that stuff, I'm like, if I watched
01:05:41.000 a video of someone getting seriously hurt or some violence or gore, I could easily handle
01:05:47.000 it.
01:05:48.000 But then you went into the child stuff and I was like, oh, that's me out.
01:05:50.000 Probably just my head would explode and I'd start smashing things.
01:05:53.000 Yeah, exactly.
01:05:54.000 It's like, you're taking off work and you're loading a 12 gauge or whatever, and they're like,
01:05:59.000 I just got some work.
01:06:00.000 I got some stuff to do.
01:06:02.000 I'll be back tomorrow.
01:06:04.000 Maybe not.
01:06:04.000 I might be in prison.
01:06:05.000 When I was watching what was happening with Comet Pizza.
01:06:10.000 I don't know if you remember the Pizzagate story.
01:06:13.000 There was somebody who went there with a gun, and I was like, I was honestly just surprised it wasn't a girl.
01:06:18.000 Because if you really believe that kids are being abused, you're going to make stuff happen.
01:06:22.000 That's what's scary, though, because that was so dumb.
01:06:25.000 Yeah.
01:06:25.000 I was like, that's not right.
01:06:26.000 This building doesn't even have a basement.
01:06:29.000 So I'll tell you what, in that regard, I do kind of understand why they want to moderate some content.
01:06:35.000 Yeah.
01:06:35.000 I mean, you were mentioning like graphic scenes, you know, we would see like car accidents, people dying, suicide videos.
01:06:41.000 Also, uh, I was going to say también, which means also in Spanish.
01:06:45.000 Sometimes I switch the two.
01:06:46.000 Um, but I mean, like, so I was going to say, so remember that, that Trumpsman viral meme, like from the Kingsman movie, there was this Trumpsman meme and we actually, yeah, we categorized that.
01:06:58.000 So it showed, so it showed a scene from this fictional movie Kingsman where there's a lot of violence and it was like killing the media.
01:07:04.000 Right.
01:07:04.000 Yeah, killing the media, and it had the real photos photoshopped in, but, like, it's graphic violence, so we actually put an interstitial on that.
01:07:12.000 They told us to mark as disturbing.
01:07:14.000 M-A-D, mad.
01:07:15.000 And so, like, I could see an argument for just leaving it up because it's fictional.
01:07:20.000 It's a fictional scene.
01:07:21.000 And wasn't Trump, like, it's like, so Kingsman's an awesome movie, and it's like this is really graphic scene where he's killing, it's in a church.
01:07:27.000 It was actually, I think Colin Firth was the actor.
01:07:30.000 And it was in the movie.
01:07:31.000 This is what's really, really funny.
01:07:33.000 I mean, kind of disturbing.
01:07:35.000 In the movie Kingsman, definitely watch it.
01:07:36.000 I love this movie.
01:07:37.000 Colin Firth goes to this, like, really extremist church.
01:07:42.000 And the villain triggers this kind of sound thing that makes everyone go nuts and become extremely aggressive.
01:07:47.000 Well, Colin Firth's character is a secret agent of immeasurable skill, and he kills every single person.
01:07:54.000 And it's gruesome.
01:07:55.000 Yes.
01:07:56.000 Now here's what's funny.
01:07:57.000 My understanding, that's okay.
01:08:00.000 Putting up a video on Facebook, a scene from a movie where a British guy in a suit, a gentleman, brutally murders a bunch of Southern churchgoers, totally fine.
01:08:09.000 But superimpose logos from media companies on it, and put Trump's face on the British guy and WHOA!
01:08:15.000 Whoa, whoa.
01:08:18.000 And the same thing happens in Brazil, too.
01:08:19.000 Like, President Bolsonaro, when he was running for president in Brazil, he actually got stabbed.
01:08:25.000 I remember that.
01:08:25.000 It was like a long knife, wasn't it?
01:08:27.000 Yeah.
01:08:28.000 I don't know the exact length of it.
01:08:29.000 Yeah, yeah.
01:08:30.000 Huge knife.
01:08:30.000 It was a life-threatening wound.
01:08:32.000 Like, it was a big deal.
01:08:33.000 That was crazy.
01:08:33.000 And so he, yeah, so he got stabbed.
01:08:36.000 And then there was memes showing, like a cartoon meme showing Bolsonaro with a knife coming out, kind of like a boomerang, and coming back, landing back on him.
01:08:45.000 So it was mocking, or like, you know, mocking the events of this, of him almost dying.
01:08:50.000 And Facebook allows that.
01:08:52.000 And so, I mean, Brazil is another animal as well.
01:08:56.000 I mean, Brazil, there's a lot of tech censorship that goes on there.
01:08:59.000 I was there like last month.
01:09:00.000 I met with a congresswoman in Brazil.
01:09:03.000 And their Supreme Court is just targeting conservatives.
01:09:07.000 Like, I met a couple people whose homes were raided by the Supreme Court because they were supporting right-wing news organizations.
01:09:14.000 Wow.
01:09:14.000 And so these people, they actually want the U.S.'s help to, uh, use the Magnitsky Act.
01:09:22.000 Magnitsky?
01:09:23.000 Magnitsky, yeah.
01:09:24.000 Act against, um, the Brazilian Supreme Court.
01:09:27.000 So, I mean, yeah, it's, it's happening there and they're looking at this election coming up and wondering what's going to happen.
01:09:33.000 Cause a lot of countries in the world are looking at the U.S., hey, if, if Trump doesn't win, things are going to get a lot worse for us.
01:09:41.000 In Brazil?
01:09:42.000 Yeah, in Brazil and everywhere else, because we're such a big strategic ally, and they look to us for leadership as far as, you know, basic freedoms and whatnot.
01:09:52.000 I think it's funny that the people who oppose Trump call themselves the resistance, and they're literally on the side of every major multinational corporation and big tech conglomerate authoritarian structure.
01:10:04.000 But Donald Trump, this one guy, I tell you what, they resist him.
01:10:08.000 If he's a dictator, he's pretty bad at it because you think by this time after four years he would have taken control of, you know, Facebook and Twitter.
01:10:17.000 It's funny that, you know, I spent my life growing up hearing from the left that the corporations are the problem.
01:10:22.000 And the government will save us.
01:10:23.000 And then I hear from the right, no, the government is bad and corporations are fine.
01:10:27.000 And I'm being very general with this.
01:10:28.000 Now you've got the left that they're basically like, corporations and government are great!
01:10:32.000 Which sounds very much like fascism.
01:10:35.000 They've got roving bands of... This is crazy.
01:10:39.000 In San Francisco.
01:10:40.000 There was a dude, there were people putting on a protest of big tech censorship.
01:10:44.000 One of the dudes was a black dude.
01:10:46.000 And Antifa punched him in the face, knocking his teeth out.
01:10:49.000 And my question is just like, wait, so you got a black dude who's protesting against multinational billionaires who are stifling his speech, so you punch him in the face?
01:10:59.000 Sounds like you're the fascist, man.
01:11:01.000 Like you're punching someone on behalf of a major corporation, billionaires.
01:11:07.000 Like you're the brown shirt man.
01:11:11.000 It's, yeah, it's pretty insane.
01:11:14.000 Yeah, the whole corporate mentality, I mean, now you have these huge corporations, like, bending to the Black Lives Matter movement, and none of these groups are hate orgs.
01:11:25.000 So the dangerous individuals and organizations policy deals with these organizations, like any criminal organization.
01:11:31.000 So we have a list of cartels that we would delete on a regular basis, terrorist organizations, obviously.
01:11:37.000 But they added something there, and it said, not allowed: we delete people notable for attacking people based on
01:11:43.000 protected characteristics. So based on race, ethnicity, gender. But it's such a broad definition. Okay,
01:11:50.000 who are these people that are notable?
01:11:51.000 Where's the list? Because we had the list of hate figures.
01:11:54.000 So we had the list where you had all the other hate figures, because it's on the list that Facebook gives us.
01:12:00.000 Oh, okay.
01:12:00.000 So Facebook created the list?
01:12:01.000 Yeah, Facebook defined them as hate figures.
01:12:05.000 Oh, so that was Paul Joseph Watson, Milo, Alex Jones, right?
01:12:10.000 So on the list that I had, I for sure saw Gavin McInnes and Tommy so-and-so.
01:12:16.000 Yeah, I don't know if we were allowed to say his name, but yeah.
01:12:19.000 Gavin McInnes, we're talking about them neutrally, so we're not praising, supporting, or representing them.
01:12:23.000 Oh, we're noobs, they're gonna ban us.
01:12:24.000 The idea that you can't say someone's name is insane.
01:12:27.000 Well yeah, so with these organizations- That's 1984, isn't it?
01:12:29.000 So bizarre.
01:12:30.000 Like you couldn't say people's names?
01:12:31.000 This is backwards, we should stop that.
01:12:33.000 No, I mean like, wasn't that literally in the book?
01:12:34.000 Where it was like a person's name wasn't allowed to be said or something?
01:12:36.000 I don't know, I think so.
01:12:37.000 What book was it?
01:12:38.000 The Brave New World, maybe?
01:12:39.000 Get it together.
01:12:39.000 Anyway.
01:12:40.000 So yeah, PSR, Praise, Support, or Represent, you cannot do any Praise, Support, or Represent of these individuals.
01:12:45.000 And so Gavin McInnes and Tommy are literally on the same list as Adolf Hitler.
01:12:53.000 I'm not shitting you.
01:12:54.000 Wow.
01:12:55.000 And so, yeah, so this is the hate figure list.
01:12:58.000 So, I mean, this is how they count, and the source they use, and I have evidence, the source they use is the ADL and the Southern Poverty Law Center.
01:13:06.000 That's what I was asking.
01:13:08.000 So, the SPLC went through its own major scandal where apparently they were being run by racists.
01:13:14.000 And there was an investigation that found decades of racism.
01:13:18.000 And that the organization itself was essentially just a bunch of racists making money off pretending to be not racist.
01:13:26.000 And then storing a bunch of that money overseas or something.
01:13:29.000 The ADL has its issues, but I gotta be honest, when you read articles from the Anti-Defamation League, like when we had Enrique Tarrio of the Proud Boys here, it was, I would say, extremely critical.
01:13:41.000 It had an extremely negative view.
01:13:44.000 But it was... a decent assessment, I should say.
01:13:48.000 Like, they said they're clearly not white supremacists, though they have had members who overlap, and I'm like, that's true.
01:13:52.000 Like, even they've talked to us about like, oh yeah, we had to kick these people out for these reasons.
01:13:56.000 And so I think the ADL is... the problem with the ADL is that their view of everything is extremely negative, and they view all these groups like...
01:14:08.000 I don't know how else to describe it other than they're very, very opinionated to an extreme degree.
01:14:15.000 You know, you do one wrong thing and they're like, ah, that person's evil, but they'll mention what you did and why they're mad about it.
01:14:20.000 They just get mad really easily.
01:14:22.000 Southern Poverty Law Center, as far as I'm concerned, just makes it all up.
01:14:25.000 Like, they had one article that included me.
01:14:27.000 Where they claimed that I went to an Iranian Holocaust deniers conference and their evidence for it was an archived version of a website that didn't exist anymore that was a Holocaust deniers website saying that I was an attendee.
01:14:41.000 I've never been to Iran.
01:14:42.000 Did you sue them for that?
01:14:43.000 So there was a suit.
01:14:45.000 They settled immediately and issued an apology.
01:14:47.000 And they said I was a leftist.
01:14:49.000 How about that?
01:14:50.000 They basically said we apologize to people who are on the left, Tim Pool, you know, so-and-so, so-and-so, or whatever.
01:14:55.000 Speaking of Iran, so I actually have evidence, a screenshot at work, of a post giving guidance about Iran.
01:15:03.000 So Facebook allowed the phrase death to Khamenei for about three months, roughly, when there was mass protests in Iran.
01:15:10.000 Interesting.
01:15:11.000 Well, whichever side you're on doesn't really matter, but it just shows you the amount of power that they have.
01:15:16.000 So on the entire Facebook platform, Facebook can just switch a lever and say, we're going to allow the phrase "death to Khamenei," when normally it's not allowed.
01:15:24.000 Well, let's have a hard question. I mean, a hard conversation.
01:15:28.000 If Facebook just allows everything, won't it go nuts?
01:15:34.000 Like, isn't it possible we'll see crazy extremism and crazy groups of people just screaming violence or whatever?
01:15:41.000 I mean, actually look at Antifa and look at Black Lives Matter violence.
01:15:45.000 140 plus days of rioting.
01:15:47.000 And isn't it because they won't check them?
01:15:49.000 They won't say stop advocating for violence?
01:15:53.000 Yeah, I think there's the potential.
01:15:54.000 I mean, if there are some changes with Facebook and in general more things are allowed... and it's funny, because the left always claims that Facebook's not doing enough to censor hate speech.
01:16:05.000 Right, right.
01:16:06.000 When there's a bunch of people on the list that are right-wing, but there's nobody on the left.
01:16:10.000 Even segregationists on the left, they're never on the list.
01:16:14.000 Yeah, I think there's a lot of crazy stuff on the internet.
01:16:17.000 I mean, I think we're barely even beginning to grasp how influential the internet is, how powerful it's become in the last 20 years.
01:16:27.000 We've seen this technology revolution, so I don't think even the leading technologists, or even the people who created the internet, understand what it's become. And that goes into the larger debate of what's allowed on the internet.
01:16:38.000 I mean, yeah, there's going to be a lot of crazy stuff on the left and the right, on both sides.
01:16:43.000 Do you think there is some kind of collusion between Democratic politicians and Facebook?
01:16:49.000 You do think so?
01:16:50.000 Yeah.
01:16:52.000 Well, I mean, so, here, let me say this.
01:16:55.000 Indirectly, at the very least.
01:16:58.000 At the most, direct collusion.
01:17:00.000 So, did you ever witness a Democratic politician coming in and talking with anybody or any kind of evidence that they were?
01:17:08.000 I do not have direct evidence.
01:17:10.000 What I can say is there's a group of about six people.
01:17:14.000 So I talked to one of my team managers, Alexis, and she had conversations, interactions with the global policy team at Facebook.
01:17:23.000 There's about six people who determine the policy.
01:17:26.000 And she was telling me, like, yeah, they all kind of have the same mindset, and they're all based in San Francisco. As far as direct collusion with Democrat politicians, I talked to a journalist with the New York Post a day or two ago, and he was finding this connection between Twitter executives who had this revolving door with the Obama administration. Well, so there was a Facebook employee who joined Joe Biden's team, and a Twitter employee who did as well.
01:17:54.000 And I think the guy's name is Andy Stone.
01:17:56.000 I'm not sure.
01:17:57.000 Facebook Communications previously worked for the Democratic Congressional Campaign Committee.
01:18:01.000 So we've seen that there is the revolving door between Facebook and Democrats and Twitter and Democrats.
01:18:08.000 They're working for Joe Biden.
01:18:10.000 This guy announced that they were going to censor the New York Post story on Hunter Biden.
01:18:14.000 Like, that's crazy.
01:18:15.000 I mean, this is a prominent American newspaper, started by Alexander Hamilton in 1801.
01:18:21.000 And he's, I believe it was 1801, and he's like, well, we're going to censor the story anyway.
01:18:25.000 This guy worked for the Democrats.
01:18:26.000 Apparently, I guess he worked for Barbara Boxer before.
01:18:29.000 So, uh, what makes you, look, I get it.
01:18:33.000 We have those stories, but, uh, you know, based on your experience working in this company, what, what made you think that there was potentially direct collusion?
01:18:42.000 As far as direct collusion, I didn't work in the Facebook offices in Menlo Park.
01:18:50.000 The people I interacted with were mainly Cognizant employees, to be frank.
01:18:57.000 Now, Sean Browder is the person who interacted constantly with the client, with Facebook.
01:19:02.000 And so what I can, you know, notice, what I've noticed from my conversations with him is, you know, first of all, he's very, he's very left-leaning, a Bernie Sanders supporter.
01:19:11.000 Um, they are very interested in, you know, me bringing up trends about right-wing extremism, like Boogaloo.
01:19:18.000 And like, for example, the Virginia gun rally in January was labeled as like, Hey, watch out for hate speech or, you know, racist groups, you know, at this Virginia gun rally.
01:19:30.000 So, I mean, the evidence I have is not really, you know, it's not like I had conversations with Facebook employees who said, yes, a Democrat politician told us to change the policy.
01:19:45.000 So I think there's more of an indirect effect from all these leftist organizations.
01:19:51.000 So just more of like an extreme bias?
01:19:53.000 Yeah, more of extreme bias.
01:19:54.000 I mean, are there politicians that are probably buddy-buddy with people at Facebook?
01:19:57.000 I'm sure there are.
01:19:59.000 I mean, we had Joshua Faustine get banned like a month ago or whatever, and the only way he could get his account back was by reaching out to Trump officials, like people he knew.
01:20:11.000 So if that's happening on that side, same thing's probably happening on the opposite end.
01:20:17.000 I mean, if people are probably giving favors or helping out Democrat politicians.
01:20:23.000 I did notice, and here's something huge that I do have evidence of, something called shields that Facebook has. It's called the fire brigade. Let me see if I have the notes on here, but basically there's different shields. So I ran across some content, I tried to action it, and it said, you cannot action this content, there is a shield on this content. Fire brigade stands for, like, PR fire. And there's different tags, like a high-pri tag, an x-check tag, and a business tag, so you cannot delete or touch certain content because it has this shield or tag associated with it.
01:21:03.000 So, it may very well be that there's accounts that should be protected, kind of like VIPs, that you cannot delete.
01:21:11.000 I ran into the same thing when I worked at Uber because we would deal with fraud and partner fraud, but we occasionally ran into VIP accounts, like really big celebs that we didn't want to piss off or have their account canceled or something.
01:21:11.000 Interesting.
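The shield mechanism described above amounts to a tag check that gates moderator actions. A minimal sketch in Python, with hypothetical tag names and messages (the real internal schema isn't public):

```python
# Hypothetical sketch of the "shield" gating described in the conversation:
# content can carry tags (e.g. high-pri, x-check, business), and any such tag
# blocks moderator action. Names and messages are illustrative assumptions.

SHIELD_TAGS = {"high_pri", "x_check", "business"}

def can_action(content_tags):
    """A moderator may action content only if it carries no shield tag."""
    return not (set(content_tags) & SHIELD_TAGS)

def action_content(content_tags, action):
    """Attempt a moderator action; shielded content is refused."""
    if not can_action(content_tags):
        return "You cannot action this content: there is a shield on this content."
    return f"Content {action}d."

print(action_content(["x_check"], "delete"))  # refused: shielded
print(action_content([], "delete"))           # goes through
```

The point of the sketch is only that the protection lives on the content as metadata, so ordinary moderators hit a hard stop regardless of what the content says.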
01:21:23.000 So, seeing everything we've seen over the past year, you know, someone like myself, I read the news all day and I have my opinions on whether or not Trump's going to win.
01:21:31.000 But with your experience with how big tech is manipulating the election, essentially, do you think Trump is going to win?
01:21:39.000 I think that Trump has a good chance of winning.
01:21:44.000 I think Facebook's doing everything they can to push people a certain way.
01:21:51.000 There was a cartoon image of Trump shooting himself and Facebook said, we're going to allow that.
01:21:57.000 Who was the guy who tried to assassinate Reagan?
01:22:00.000 He got out of jail or out of the hospital.
01:22:03.000 Hinckley?
01:22:04.000 Something like that.
01:22:05.000 Yeah.
01:22:06.000 And so he came out of the hospital and there was this meme implying that he should shoot Trump and Facebook allowed that.
01:22:12.000 Wow.
01:22:12.000 And so, yeah, I really think there's a chance that, that, you know, despite all that, there's a good chance Trump will still win, especially when you're going against someone like Joe Biden.
01:22:21.000 I wonder if it's not despite all that.
01:22:23.000 I wonder if it's because of it.
01:22:24.000 I wonder if people are sick and tired of seeing the psychotic behavior of the left, the cancel culture stuff.
01:22:30.000 Like, you don't see videos of people, you know, threatening Joe Biden.
01:22:33.000 You don't see these pictures or videos mocking him.
01:22:35.000 I have to imagine there's regular people who might see, like, remember when Kathy Griffin held up the head?
01:22:40.000 I wonder how many votes Trump got from that, from people who are just like, oh, that's disgusting.
01:22:40.000 Yeah.
01:22:44.000 What's wrong with you?
01:22:46.000 And then all of a sudden kind of just like, I don't like these people, you know?
01:22:50.000 I think it's backfired.
01:22:51.000 Yeah, I think you're right in a lot of regards.
01:22:52.000 I think it's backfired because there's so much hate against Trump that the normal individual who's not hateful is like, hey, why is there all this hate against Trump?
01:23:02.000 And so, yeah, I think that's very possible.
01:23:04.000 There's been pushback and it's kind of worked against them.
01:23:07.000 And the fact that I revealed this damaged, you know, Facebook's reputation and brought to light certain things.
01:23:12.000 So, excuse me.
01:23:15.000 If Facebook had been playing by the rules from the get-go, then I wouldn't have been able to film their discrepancies, them breaking their own rules.
01:23:22.000 Can't Facebook just be like, oh no, this is cognizant, these, these, oh darn it, these evil contractors, we had no idea.
01:23:29.000 Well, some of the posts, they can't really argue that, because, for example, when they told us to not treat abortion as a violent death, it said the FB team has given us guidance to not treat abortion as a violent death.
01:23:43.000 That reminds me of my conversation with Jack Dorsey when I was explaining to them that their rules are inherently biased.
01:23:49.000 And, you know, Vijay Gadde and Dorsey were immediately like, oh no, that's not true, what are you talking about?
01:23:53.000 And I said, you have a misgendering policy.
01:23:56.000 Like, you straight up say if you misgender someone, if you went to a conservative and said, don't misgender someone, they would assume you're saying, if someone is born male, then you call them he, him, and if born female, you know, she, her.
01:24:07.000 Whereas Twitter's perspective is the inverse.
01:24:10.000 If somebody says their identity is, you know, then you gotta use that.
01:24:13.000 So, to a conservative, what Dorsey views as misgendering, it's inverted.
01:24:18.000 And if their rules are built around a progressive understanding of these definitions and what they mean, Then yeah, they're inherently biased.
01:24:26.000 Yeah.
01:24:27.000 Yeah, because like, you can't really tell that you're biased.
01:24:30.000 If you're that biased, you can't tell that you're biased.
01:24:32.000 And there's that group think effect that we saw, you know, in 2008 when the media just fawned over President Obama.
01:24:40.000 But an example of that related misgendering is we had, so in the hate speech policy, if you're attacking someone, like a race or an ethnicity, and you have like a stick figure that represents them, then that can still violate.
01:24:50.000 So if you have a little cartoon imagery of A Christian figure, like a little stick figure, kicking a Muslim stick figure out of Europe, that violates the hate speech policy.
01:25:00.000 Because they're representing, the stick figures are representing that characteristic, the ethnicity, the race.
01:25:06.000 And so with gender, there was a meme that was calling certain genders a mental illness.
01:25:17.000 So, I think it was like non-binary or something like that.
01:25:20.000 And it said, this is a mental illness, but there was no stick figure.
01:25:23.000 It was just the symbols.
01:25:25.000 So, it's purely talking about the ideology.
01:25:28.000 So, Facebook backtracked because for a while they were like, okay, allow it.
01:25:30.000 It does not violate our policy.
01:25:32.000 But then they backtracked and they said, actually, we're going to delete this.
01:25:35.000 We know it does not violate the policy, but we're going to delete this.
01:25:40.000 Once again, more exceptions.
01:25:42.000 I have a list of like 30 examples here, but... Bro, it's human centipede.
01:25:47.000 Yeah, it's human centipede.
01:25:49.000 These Bernie bros are getting news from progressive websites, and these writers are getting their news from Facebook posts, and then it's all being recycled.
01:26:01.000 Over and over again.
01:26:02.000 You know, Twitter is probably one of the worst things that ever happened to journalism because it created a feedback loop among journalists where they just follow each other and share the same stories over and over again.
01:26:12.000 And Facebook's algorithm is essentially promoting intersectionality because it's got more keywords, more buzzwords, and it's hot content.
01:26:19.000 So long as they allow it, it will continue to get worse.
01:26:21.000 So here's what I think happens.
01:26:24.000 These people on Facebook accidentally got sucked into this vortex, where content that was about police brutality and intersectionality was socially acceptable, because racism, bad.
01:26:35.000 And it combined all these keywords, so it made money, and that was the perfect storm.
01:26:40.000 These people were reading this stuff on Facebook, believed it all, then got hired at companies to moderate and said, oh, but these things are true, I see them all the time.
01:26:47.000 That's okay.
01:26:48.000 And then the journalists see the same thing and write the stories and create this feedback loop where they're all spinning away in their little human centipede vortex off in the corner while regular people are confused as to what is going on.
01:26:58.000 Yeah.
01:26:59.000 Because regular people aren't in that world.
01:27:02.000 That's the craziest thing to me is that, like, Joe Biden struggling, struggling to find that space between regular America and the Democrats is just, it's not there.
01:27:13.000 And that's why he's trapped in this fracking thing.
01:27:15.000 So you watch the debate, I assume?
01:27:17.000 Yeah, I watch the debate.
01:27:17.000 The reason Joe Biden lied and said we're gonna ban fracking is because the activist left is in that vortex with these people at Facebook, with these journalists, and they believe insane things, and they're a tiny group of people, relatively.
01:27:31.000 So Joe Biden, if he wants to be the Democratic nominee, you gotta say the craziest thing imaginable.
01:27:36.000 And that's why even Bernie was like going off the rails, and their policies kept getting crazier.
01:27:40.000 Then once Biden won, now he's got to talk to regular America again.
01:27:44.000 And so he's like, no, no, I don't want to ban fracking, and they're like, here's a video of you saying it over and over again.
01:27:48.000 So I can't imagine him winning.
01:27:50.000 He chased.
01:27:51.000 So here's what I'm hoping.
01:27:52.000 Here's my optimism.
01:27:53.000 I think you've got the desperate Democrats chasing after Twitter, the Twitterati left, who are in this human centipede vortex, and it's going to backfire on them because regular people don't live there.
01:28:06.000 Regular people don't know what Joe Biden's on or talking about when he says these things.
01:28:10.000 And I think that's why they're so desperate, you know, to run.
01:28:13.000 I think they're running against Trump, like anti-Trump.
01:28:17.000 Joe Biden is not the candidate.
01:28:18.000 The candidate is, do you like Trump or hate Trump?
01:28:21.000 And they're doing that because Joe Biden can't possibly please the Democrats.
01:28:21.000 Yeah.
01:28:25.000 They are split up in a million different ways, and they'll never get back the moderates.
01:28:30.000 In fact, the party seems to be shrinking based on Gallup's latest polling.
01:28:35.000 The party affiliation for Democrats has gone way down.
01:28:39.000 So we'll see how that plays out, I guess.
01:28:41.000 Yeah, it's going to be interesting to see.
01:28:43.000 I mean, you know, Facebook's been involved in the elections.
01:28:47.000 They've certainly taken a central role.
01:28:50.000 And it's weird because I saw this app I have where you get like a discount on your gasoline.
01:28:55.000 It's called Get Upside.
01:28:57.000 And I saw this little notification that popped up and then disappeared.
01:28:59.000 It's like, hey, if you vote, we'll give you 10 cents off per gallon.
01:29:02.000 I'm like, No, I'm gonna take that to the Attorney General in Arizona.
01:29:06.000 Same thing with, like, Schlotzsky's.
01:29:08.000 You know Schlotzsky's, the sandwich shop?
01:29:11.000 Yeah, yeah, I'm familiar.
01:29:12.000 Were they offering money or something?
01:29:13.000 A while back, they were offering a free sandwich if you bring in your "I Voted" sticker, which violates Arizona law.
01:29:19.000 So that's been my biggest complaint is, like, this entire time.
01:29:22.000 Schlotzsky's is so good.
01:29:23.000 It's such a good sandwich.
01:29:24.000 They really should give you a free sandwich.
01:29:26.000 They should change the election law.
01:29:28.000 Yeah.
01:29:30.000 So that's my thing, is like, if Schlotzsky's is getting in trouble for that, look at what Facebook's doing on a global scale.
01:29:38.000 I was monitoring content in Latin America, in Venezuela, in Mexico, the Mexican presidential election.
01:29:44.000 I saw some, you know, content about Spain.
01:29:47.000 Facebook has training decks for Poland, for Taiwan, for every country imaginable.
01:29:53.000 In Poland, Facebook shut down their Independence Day march every year because they called it hate speech.
01:30:00.000 They're always purging these nationalist groups in Spain or in Europe.
01:30:06.000 And so I just think that, you know, the amount of power they have and influence they have is immense and they're using all of it to try to influence this election in the name of, you know, protecting against hate speech.
01:30:19.000 It's a feedback loop and it's going to implode at some point because the things they believe are getting more and more unhinged and it literally doesn't make sense.
01:30:28.000 Yeah.
01:30:29.000 Like when you see the degree to which I'll put it this way.
01:30:34.000 Ocasio-Cortez gets 400,000 concurrent viewers on her Twitch stream.
01:30:39.000 This is the future of politicians.
01:30:42.000 This woman, with all due respect, I respect that she was a bartender, and it's part of the American dream, you know, that anybody can be a politician. It's government for and by the people.
01:30:53.000 You don't have to be the smartest person in the world, you don't have to be rich.
01:30:55.000 And that's pretty incredible, but man, does she not know anything about what she's doing.
01:30:59.000 The bills that she's gotten passed are, like, renaming post offices.
01:31:02.000 And yet she still gets all these leftists screaming and cheering for her.
01:31:06.000 And so, I was talking to my friend, this guy, on air earlier.
01:31:10.000 There's a viral post by Sophia Narwitz of all of these leftist blue checks mocking Trump for saying coyotes were bringing kids over the border.
01:31:18.000 Coyote, of course, is the name of the smugglers who bring people across the border.
01:31:23.000 All of these people thought that Trump was literally talking about coyotes carrying children, like the Dingo Ate My Baby or whatever.
01:31:32.000 It was a combination of ignorance and arrogance.
01:31:35.000 And then when I see AOC, and I see her getting 400,000 Twitch stream viewers, I'm just like, our politicians of the future, it's gonna be idiocracy.
01:31:44.000 They're going to be influencers who know nothing, but man, can they get those views.
01:31:49.000 Because she won her primary.
01:31:52.000 She might not win re-election, she probably, probably will.
01:31:55.000 You know, because there are still Republicans, and maybe, I really doubt it, she's gonna win.
01:31:59.000 And that's crazy because what has she done other than rename some post offices and then offer up ridiculous policies?
01:32:06.000 No, she's an influencer.
01:32:08.000 She's got 10 million followers on Twitter.
01:32:10.000 Sure, that's part of being an influencer.
01:32:12.000 And she dances.
01:32:14.000 And you take someone who has no idea what's going on.
01:32:17.000 We are going to have politicians who are literally going to have flame wars on Twitter and that's going to be it.
01:32:21.000 No policy discussion whatsoever.
01:32:22.000 Well, they should have a term limit then.
01:32:24.000 In and out.
01:32:24.000 If you don't do anything too bad, do something with your time.
01:32:27.000 I don't think that's the issue.
01:32:27.000 Don't waste my time.
01:32:29.000 I think the issue is people don't vote based on policy.
01:32:33.000 They're voting based on orange man bad.
01:32:36.000 They're voting based on who I like or don't like.
01:32:38.000 That's how Reagan got in too.
01:32:40.000 He was famous.
01:32:41.000 He was a B-list actor.
01:32:42.000 But who was Reagan running against in '84?
01:32:45.000 Do you know? George Bush Senior? Was it Mondale? It was like a horrible campaign.
01:32:50.000 And what we've seen with Nixon as well is that when things go too far left, it does snap back.
01:32:55.000 So maybe we're secretly going to see this big Trump wave and everyone's going to be shocked and the polls are going to be wrong.
01:33:00.000 But what I kind of see happening in the future is
01:33:04.000 the left is being wound up into this psychotic space.
01:33:08.000 And it's so insane that policy can never be argued.
01:33:12.000 That's why they're running Joe Biden.
01:33:14.000 And that's why he hides in the basement.
01:33:15.000 Because if Joe Biden comes out, they're going to be like, wait a minute, I disagree with that.
01:33:18.000 So better just hide him and then say, you hate Trump though, right?
01:33:22.000 The politicians we're going to have in the future are going to be people who are like, I have no idea about policy, but, man, left!
01:33:28.000 Woo.
01:33:29.000 And they're going to be like, yeah, left.
01:33:30.000 And it's going to be literal tribalism.
01:33:31.000 It's going to be teams.
01:33:32.000 Well, it's like the Hunger Games movie, with Panem and Katniss Everdeen.
01:33:37.000 Seriously.
01:33:37.000 What was the name of Lawrence, the actress?
01:33:40.000 Jennifer Lawrence, yeah.
01:33:40.000 Jennifer Lawrence.
01:33:41.000 And, you know, you have Panem, you have Coriolanus Snow, and they're having non-stop parties in the Capitol.
01:33:47.000 They have no clue what's going on in the rest of the country.
01:33:49.000 Exactly.
01:33:50.000 Suppressing everybody.
01:33:51.000 And then they celebrate, you know, people dying.
01:33:53.000 Right.
01:33:53.000 And competing against each other, killing each other in a barbaric fashion.
01:33:58.000 Have you seen the movie Idiocracy?
01:34:00.000 No.
01:34:00.000 No.
01:34:01.000 Mike Judge is a prophet.
01:34:02.000 The president of...
01:34:03.000 So do you know what it's about?
01:34:04.000 Have you heard about it?
01:34:05.000 No.
01:34:06.000 So there's this average guy and it's played by Luke Wilson and he does this trial for
01:34:11.000 like a stasis.
01:34:12.000 He's in the army and he volunteers I guess for the stasis thing.
01:34:14.000 I don't know if he volunteered.
01:34:15.000 He was like useless.
01:34:17.000 So they were like, here's something you can do.
01:34:18.000 Sleep for a year.
01:34:19.000 So the project was to put him in this chamber, put him to sleep for a year, wake him up.
01:34:24.000 And it was him and a prostitute who got put in this program.
01:34:28.000 But, I guess he mentions, it's been a long time since I've seen the movie, because of bureaucracy and a loss of funding, they moved the stasis containers into storage and forgot about it for 500 years.
01:34:39.000 Then the machine finally kicks open, it's been 500 years, he wakes up, he's in the future, and now he's the smartest person in the world because In the beginning of the movie, they explain that evolution wasn't favoring those who were the most skilled.
01:34:53.000 Humanity had reached a point where evolution just rewarded those who reproduced the most.
01:34:57.000 And then it shows this really funny scene where, like, this football player is, like, he wins a football game, and he goes, I'm gonna do you!
01:35:04.000 And then it shows, like, his family tree getting bigger and bigger, like, babies popping up.
01:35:08.000 So anyway, in the future, the president is a wrestler named Camacho, and he has no idea what he's doing.
01:35:14.000 Everybody, like, the water fountains all have Gatorade in them.
01:35:17.000 And then they're watering their crops with Gatorade, so there's a huge famine.
01:35:22.000 And they don't know why.
01:35:23.000 And he's like, you know, when Luke Wilson, he's an average guy, he's not very smart.
01:35:28.000 He's just like, have you tried watering them?
01:35:30.000 And they're like, water?
01:35:31.000 Like, from a toilet?
01:35:33.000 And they laugh at him.
01:35:35.000 What's funny about that is, again, it's an older movie.
01:35:38.000 But seeing the people laugh at him when he says try watering the plants and they're like from a toilet you're
01:35:43.000 so dumb like that's exactly what we're seeing right now.
01:35:45.000 Yeah.
01:35:46.000 When people go on Twitter and go can you believe Trump said coyote they start laughing and high-fiving each other.
01:35:50.000 And you've got some high-profile left-wing activists that are saying these things and it's like we're an idiocracy.
01:35:58.000 It's the food supply, man.
01:35:59.000 High fructose and aspartame's doing it to people.
01:36:02.000 It's been 30 years.
01:36:03.000 That's got nothing to do with why people are stupid.
01:36:05.000 Oh, that's a big part of why people are stupid.
01:36:07.000 No, people are stupid because they're being catered to on their baser instincts by social media and by video games, and instead of going out and actually engaging with the world, we are continually... It's like we're institutionalizing our children Every generation is more and more institutionalized.
01:36:25.000 A kid goes to school, and he's told what to do, and he's given everything.
01:36:29.000 Here's your lunch, here's your homework, do it.
01:36:31.000 They come back, now they're 22, they're 24, they're 26, they're getting out of college, and they say, just tell me what to do.
01:36:37.000 I've never done anything on my own, I'll just do what you tell me.
01:36:39.000 Then they go on the internet, and they hear their stupid tribalist garbage, and they all laugh and giggle about how dumb they are, but they think they're smart.
01:36:45.000 Those are the people who are voting right now.
01:36:48.000 And they might win.
01:36:49.000 Ocasio-Cortez is the first step.
01:36:53.000 And I think, look, we've already got a World Wrestling Entertainment Hall of Famer president.
01:36:58.000 So it's like Mike Judge called it.
01:36:59.000 He called it a little too early.
01:37:00.000 Or he said it was going to happen way later.
01:37:03.000 But I see this is where we're headed.
01:37:06.000 Now the optimism in this is that perhaps Regular people aren't stupid.
01:37:11.000 I mean, people are average.
01:37:13.000 You know, George Carlin said, think about how stupid the average person is, and I realize half of them are stupider than that.
01:37:18.000 Maybe regular people aren't that stupid, and see what's going on, and are like, I don't have anything to do with this.
01:37:23.000 Yeah.
01:37:24.000 But maybe, I don't know, maybe they are dumb, and they're gonna be like, Trump is bad, and, you know, vote against him.
01:37:28.000 Now I'm not going to sit here and act like Trump is the perfect person, you know, that we need, but I'll tell you this.
01:37:34.000 When you've got Facebook and these big tech companies and they're manipulating everything and they're, and they're, uh, you know, everything they're doing.
01:37:42.000 The only thing we can do right now is hope the Republicans win and then enact some 230 reform and change this.
01:37:47.000 Otherwise we will get a government run by people like Ocasio-Cortez who don't pass anything, but stream video games on Twitch and do Instagram live streams, but clearly have no understanding of politics.
01:37:58.000 Yeah, it reminds me of the 1993 movie Demolition Man with Wesley Snipes and Sylvester Stallone, where they're cops in the future, and every time you curse or swear, it automatically fines you.
01:38:13.000 It's like Alexa, it automatically, like, fines you.
01:38:17.000 You may have just turned one on.
01:38:19.000 Alexa, stop!
01:38:21.000 Sorry if I turned on your computer.
01:38:23.000 So yeah, it's like Demolition Man.
01:38:24.000 I mean, we're going to live in a society where it's built into the system.
01:38:28.000 It's institutionalized, where any hate speech is automatically filtered out and you have no freedom of expression.
01:38:35.000 I mean, you have to understand this is not like a right versus left issue.
01:38:39.000 I don't know if it's the A.I.
01:38:40.000 that's doing it, but maybe it's us versus the A.I.
01:38:43.000 Maybe A.I.
01:38:44.000 controls Facebook.
01:38:45.000 Because people have asked me, what are Mark Zuckerberg's motives?
01:38:49.000 Is he leftist?
01:38:50.000 I don't know.
01:38:50.000 No, he's a robot.
01:38:52.000 Maybe he is a robot.
01:38:53.000 It reminds me of Terminator 2.
01:38:54.000 Have you seen Terminator 2 with Skynet?
01:38:57.000 When they meet the guy that's building the Skynet A.I.
01:38:59.000 and he's all joyous about it and jubilant.
01:39:02.000 And Zuckerberg has the same kind of oblivious optimism.
01:39:06.000 Have you seen Zuckerberg?
01:39:06.000 And he's literally building A.I.
01:39:08.000 Yeah.
01:39:09.000 He thinks he's bringing everyone together in smiling and holding hands.
01:39:13.000 Hello, my name is Mark Zuckerberg, and I have created Facebook.
01:39:16.000 Thank you for the question.
01:39:17.000 He's got to free that code.
01:39:18.000 Shout out to all the code freers out there.
01:39:20.000 That literally doesn't, you always say that, but it doesn't even mean anything.
01:39:22.000 It's because you've got to watch what the AI is doing while it's talking to itself.
01:39:25.000 Otherwise, the AI is going to go haywire.
01:39:27.000 In terms of the AI, but... He's building AI.
01:39:31.000 He's one of the proponents of AI in the world right now.
01:39:34.000 So what, are we being purposefully wrapped up in this algorithm so that we all live in idiocracy joyfully?
01:39:41.000 I think we need to localize lawmaking.
01:39:44.000 Ignorance is bliss, baby.
01:39:45.000 So I think the top-down lawmaking is part of the problem with idiocracy, this idea that we need somebody above us telling us what to do.
01:39:51.000 I think at the local level, if we could make our laws locally.
01:39:53.000 We do that.
01:39:54.000 Well, I mean, more, more.
01:39:55.000 I don't want someone to decide how much taxes I have to pay from the top.
01:39:59.000 That makes no sense.
01:39:59.000 What do you mean?
01:40:00.000 We as a community should decide how much I want to commit to my roadways in this community.
01:40:05.000 We do.
01:40:05.000 Well, we have like a governor.
01:40:07.000 You gotta vote for your comptroller, and your governor, and your president, and your foreigner people to represent the dreamers.
01:40:16.000 First, your block leader in Chicago, then your aldermen, then your sheriffs, then your county commissioners, then your city council people, then your mayor, then your state senators.
01:40:30.000 I don't want someone to have power over me, but I want to be the one that decides where my tax dollars go.
01:40:34.000 I want to just have that power.
01:40:36.000 Bro, what you don't understand is that one of the problems in this country is that people
01:40:40.000 stopped voting locally.
01:40:42.000 They're like, you know what blows my mind is people are like I'm gonna vote for Ocasio-Cortez
01:40:48.000 to fix my neighborhood.
01:40:49.000 She represents you to the federal government.
01:40:50.000 What does she have to do with the Bronx?
01:40:51.000 She can't represent anybody.
01:40:53.000 No, no, no.
01:40:53.000 She represents you to the federal government.
01:40:55.000 She tries and she fails.
01:40:57.000 Bro, what you don't understand is that when you vote for a congressperson, you're voting for someone to vote federally, not for your home.
01:41:03.000 Representative democracy is failing.
01:41:05.000 What you don't seem to understand is that when you vote for somebody... When you want your streets cleaned, you vote for your city councilman who says, I will clean the streets.
01:41:15.000 Or you run for city council on cleaning the streets.
01:41:18.000 We have all of that local stuff, but Americans don't do that anymore.
01:41:21.000 They're voting for senators, federal officials, on local issues.
01:41:25.000 It makes no sense.
01:41:26.000 They're like, this is what blows my mind about modern politics.
01:41:29.000 You'll see a politician in Congress or the Senate, and they'll say, I'm going to go to Washington and I'm going to help you here by fixing this thing.
01:41:38.000 It's like, well, hold on.
01:41:40.000 Why are we talking about fixing this city and voting for a federal representative?
01:41:44.000 That doesn't make sense.
01:41:46.000 You want to fix the city, you got to vote for the locals to do the local thing.
01:41:48.000 I don't even, I think that's, we could cut that part out.
01:41:51.000 I mean, the technology is good enough.
01:41:52.000 We don't need someone above us unless you don't want to participate and you want to give away your power.
01:41:57.000 But why, why would you, why would you hire someone to do your, to represent you when you can just represent yourself?
01:42:02.000 I guess the question is why do you think someone's above you?
01:42:04.000 Well, if you vote someone to have the power over you to do, you know, the, A lot of times it could just be menial work that you don't want to deal with.
01:42:12.000 They're public servants.
01:42:13.000 Right.
01:42:15.000 This is not the issue.
01:42:16.000 I'm a public servant.
01:42:17.000 I could be.
01:42:17.000 You could be.
01:42:18.000 We could all be voting on our local jurisdiction.
01:42:21.000 Direct democracy doesn't work.
01:42:24.000 I mean, with the right technology, maybe it would, but I don't like mob mentality.
01:42:28.000 I don't like mob mentality.
01:42:29.000 That's why it wouldn't work.
01:42:31.000 So you elect a representative who's supposed to have a better understanding and represent your best interests and say something like... And they don't.
01:42:36.000 I've got farmers over here and coal miners over here and they disagree on this fundamental issue.
01:42:41.000 Unfortunately, the one that has to be done is going to favor the coal miners.
01:42:44.000 That's what we need to happen.
01:42:45.000 So there's this idea of, there's actually a group on Facebook that I know, a few guys, and they had this idea called the Seasteading Nation.
01:42:54.000 And so basically, they want to build their own independent nation in the ocean, in international waters.
01:43:02.000 So they've built up designs on how to build their, like, the structure, whatever, and they'd have to deal with, like, international politics.
01:43:09.000 But, you know, if you're going to do that, if you're going to build a, give me one second, you know, your own little nation in international waters, you've got to be careful about who you bring out there.
01:43:18.000 Because if there's enough crazy people, then things can get chaotic, right?
01:43:22.000 It's really difficult to make a functioning society.
01:43:26.000 It's extremely difficult.
01:43:27.000 And we've done a pretty good job.
01:43:28.000 The problem is, I'm reminded of that video of the woman screaming into her phone, we're losing our democracy!
01:43:34.000 Yeah, well, she's right, but probably for the wrong reasons.
01:43:38.000 We're losing our democracy because we're voting for people, and we've done it for decades, and all they do is... Look at Joe Biden.
01:43:44.000 I'm gonna ban fracking!
01:43:46.000 Then he gets on stage.
01:43:47.000 Are you gonna ban fracking?
01:43:47.000 No, I didn't say that.
01:43:49.000 You get Hillary Clinton going on stage and speaking with a Southern accent.
01:43:52.000 You get Ocasio-Cortez going on stage and now all of a sudden she's talking with a Latina accent and she's like, but that's my first language.
01:43:56.000 Yeah, but you don't talk that way.
01:43:58.000 You're talking that way now.
01:43:59.000 We have video of you.
01:44:00.000 You're just manipulating people.
01:44:02.000 We have politicians who have decided just to say whatever they need to say to get elected.
01:44:06.000 And then what do we get?
01:44:08.000 AOC is supposed to represent this new wave of young upstarts who care.
01:44:11.000 She doesn't.
01:44:11.000 She doesn't do anything.
01:44:13.000 She renames some post offices, and then she goes and does Twitch and activism, and she talks about getting rid of farting cows.
01:44:18.000 None of which is realistic or is going to help anybody.
01:44:20.000 She tries to implement this radical social-economic change in the Green New Deal, which is more about the economy than it is about the environment, using the environment as a cudgel.
01:44:29.000 We have politicians that just don't care.
01:44:32.000 Here's something.
01:44:32.000 So Cortez makes a decision and makes a vote about something.
01:44:36.000 Then the people she represents, 750,000 people.
01:44:38.000 What if those 750,000 people all voted on an app and then that vote was tallied.
01:44:45.000 And instead of having someone like Cortez there, that vote just goes into effect.
01:44:50.000 Because that would just be direct democracy.
01:44:52.000 Well, on some level.
01:44:55.000 And I feel like that term democracy also is just being thrown out there a lot and I don't
01:44:59.000 even know if people understand what democracy means.
01:45:02.000 Because everyone's like, save democracy, don't let Trump defeat democracy, or democracy in
01:45:08.000 darkness from the Washington Post.
01:45:10.000 But to your example, I think using an app, that's a good way to interact with people
01:45:14.000 I think, but that is more direct democracy and not so much a republic.
01:45:21.000 But is it worse than having one person, Cortez, just making the decision?
01:45:25.000 Like, I'd rather give the decision to the crowd.
01:45:28.000 Why?
01:45:28.000 The crowd will then storm to some random person's house and then burn it down because they thought little girls were inside and they weren't?
01:45:33.000 Yeah, but giving one human the power over 750,000 people's lives is less effective than giving those 750,000 people the decision.
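The tally-the-district idea being argued here can be sketched in a few lines. This is a hypothetical illustration only; the function name, the ballot labels, and the tie-handling rule are invented for the sketch, not drawn from any real voting system:

```python
from collections import Counter

def cast_district_vote(ballots):
    """Tally constituents' app ballots and return the position their
    seat would cast at the federal level; ties become an abstention."""
    tally = Counter(ballots)
    if not tally:
        return "abstain"
    ranked = tally.most_common(2)
    if len(ranked) == 2 and ranked[0][1] == ranked[1][1]:
        return "abstain"  # no clear majority preference
    return ranked[0][0]

# A district of roughly 750,000 constituents voting on one bill:
ballots = ["yea"] * 400_000 + ["nay"] * 350_000
print(cast_district_vote(ballots))  # prints: yea
```

As the pushback in the conversation suggests, the tallying itself is trivial; the objections to direct democracy are about everything surrounding it, not the arithmetic.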
01:45:42.000 Ocasio-Cortez represents a district to the federal government.
01:45:44.000 She fails to represent them, by the way.
01:45:46.000 No.
01:45:46.000 Do you know what that means?
01:45:47.000 She ineffectively represents them.
01:45:48.000 Do you know what that means?
01:45:49.000 Yeah, people vote her in, and then she goes and makes decisions on their behalf.
01:45:52.000 No, she makes decisions at the federal level that affect the country, not her specific district.
01:45:58.000 On these people's behalf.
01:46:00.000 Yes.
01:46:00.000 Okay.
01:46:01.000 So what it sounds like you're not understanding is that when a politician comes out for Congress and says, I'm going to clean up our district and do good.
01:46:08.000 No, you're not.
01:46:08.000 You're going to Congress at the federal level.
01:46:11.000 You're going to vote on war and healthcare and things that will affect the greater nation.
01:46:15.000 If you want to fix your district vote locally, but people don't do this anymore.
01:46:18.000 So she's going to cast a federal vote on like something.
01:46:21.000 Why wouldn't you just take the tally of those people that she was representing and then cast that as the vote at the federal level?
01:46:27.000 Because direct democracy doesn't work, and that's why we have a constitutional republic with democratically elected representatives.
01:46:32.000 Constitutional republics don't work.
01:46:33.000 Not at this scale.
01:46:35.000 It's the best system the world has devised so far.
01:46:37.000 I agree, but we can improve it.
01:46:39.000 Sure, but how?
01:46:41.000 With technology, by using apps and stuff.
01:46:42.000 But all you're doing is saying, I found a new way to do direct democracy, which doesn't work.
01:46:45.000 Well, I mean, you still have sections.
01:46:47.000 It's like playing Among Us, and then you vote out the person, then you kill the wrong person.
01:46:50.000 Right, exactly.
01:46:51.000 The new video game called Among Us.
01:46:52.000 Well, it's actually an old video game.
01:46:53.000 You still have, like, communities that have to come together to vote.
01:46:58.000 But, you know, you'd still have, I think, locales of people.
01:47:03.000 We have local politics.
01:47:05.000 We have state-level politics.
01:47:06.000 We have city-level politics.
01:47:07.000 We have county-level politics.
01:47:08.000 We've got regional.
01:47:09.000 We've got regional coalitions between states.
01:47:11.000 And then we have the federal level.
01:47:13.000 Everyone, what's happening in this country right now is the internet has wiped localities out.
01:47:18.000 Local news is gone.
01:47:18.000 They're gone.
01:47:20.000 Why?
01:47:20.000 Because I'm just going to go to the New York Times and I'm going to read about Trump.
01:47:23.000 Well, did you know that a water main broke on 37th Street in your small town?
01:47:27.000 And now it's flooding and you're stuck.
01:47:29.000 People are going to national sources for news.
01:47:32.000 They're not talking to their neighbors anymore.
01:47:33.000 They're not voting for local politicians.
01:47:35.000 Corruption will run amok.
01:47:37.000 It's at the local level.
01:47:39.000 The internet has kind of digitized, in a sense, our political world, so that instead of focusing on my neighbor, you know, threw poop in my yard, ah, I'm angry about this, we should make that illegal, he shouldn't be allowed to do that, so I'm gonna vote for a guy who's gonna pass that law to make that illegal.
01:47:55.000 Instead, what's happening is they're like, someone's polluting in my backyard, it's my neighbor, and he dumped a big ol' bucket of poop, so I'm gonna vote for a federal politician to go and vote on whether we should go to war with it, you know, in Afghanistan.
01:48:05.000 It has nothing to do with what's happening in your backyard.
01:48:08.000 So what ends up happening is, in California, you get one party control because people are like, Democrat.
01:48:13.000 And the Democrats at the local level do literally nothing.
01:48:16.000 Then they vote for federal level politicians to go do things at the federal government level who don't do anything for California.
01:48:22.000 So San Francisco, for instance, you got Nancy Pelosi.
01:48:25.000 She represents San Francisco to the federal government.
01:48:28.000 She's not going to clean up San Francisco.
01:48:29.000 Her city's in ruins because people don't care.
01:48:32.000 They don't do anything.
01:48:33.000 You got a bunch of people that just don't know, don't care, and they're voting D or voting R. And that's where we are headed.
01:48:38.000 That's it.
01:48:39.000 Yeah.
01:48:39.000 Facebook contributes to that.
01:48:41.000 Yeah.
01:48:41.000 Facebook can influence on a local level.
01:48:42.000 So here's an example.
01:48:44.000 So on September 27th, 2019, there was a post that was direction to it given to us as content moderators, and it was having to do with cops.
01:48:53.000 So, if I have a photo of a cop, I post a photo of a cop doing an arrest, alright?
01:48:57.000 I'm standing there watching, taking a photo of him, post it on Facebook, and I put the caption, this cop is a pig.
01:49:04.000 Should that be allowed on Facebook?
01:49:06.000 I don't know.
01:49:06.000 But, Facebook's policy says, if there's a private individual, this cop's not a public figure, they're not famous, if there's a private individual, I have a photo of them, I cannot compare them to an animal.
01:49:16.000 So, that gets deleted no matter what.
01:49:18.000 Really?
01:49:19.000 Yeah, so if I put a photo of someone who's not famous, like my neighbor, and I say, this person's a pig.
01:49:25.000 If I compare them to an inferior animal, that gets deleted.
01:49:28.000 What's inferior?
01:49:30.000 What animals are inferior?
01:49:31.000 I don't know.
01:49:32.000 What if I say it was a grizzly bear?
01:49:33.000 That's probably not inferior.
01:49:35.000 This guy's a grizzly bear.
01:49:36.000 Those are superior.
01:49:37.000 Yeah, those are superior.
01:49:38.000 Are they?
01:49:38.000 Yeah, they are.
01:49:39.000 I'm actually from Alaska, so I take offense to that statement.
01:49:42.000 This guy's a moose!
01:49:44.000 Is a moose inferior?
01:49:46.000 Mooses?
01:49:47.000 I think it's a moose.
01:49:49.000 Yeah, I should know that.
01:49:51.000 What about a giraffe?
01:49:52.000 No, because think about it, like, it's silly, right?
01:49:54.000 Yeah.
01:49:55.000 Would that be considered an insult?
01:49:56.000 I don't know, giraffes, they're pretty cool.
01:49:58.000 It's up to the moderator.
01:49:58.000 And that's how nuanced the policy is. There's a lot of gray area, and we try to all align and action things a certain way. But this exception, they said: now, moving forward, as of September 27th. So we're talking about Facebook preventing hate on the platform, right?
01:50:14.000 But here they're allowing more hate.
01:50:16.000 They're saying, we're going to allow calling cops pigs.
01:50:18.000 And the only reason why was, quote, that is because of how the term is used in the NA market, North American market.
01:50:25.000 So that was the justification for the decision to allow more attacks on cops.
01:50:29.000 The only way that post on Facebook of a cop getting called a pig could get taken down is if that cop went to every single post and reported it himself with the name of FaceMatch.
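The rule and the carve-out described here can be sketched as a decision function. This is purely illustrative; the names, the animal list, and the exception table are invented to mirror the description above, not Facebook's actual code or policy text:

```python
# Animals treated as "inferior" comparisons (assumed examples).
INFERIOR_ANIMAL_SLURS = {"pig", "dog", "rat"}

# Market-specific carve-outs, e.g. the North American "cops are pigs" exception.
MARKET_EXCEPTIONS = {("NA", "cop", "pig")}

def should_delete(animal, target_is_private, target_role, market):
    """True if a photo caption comparing a person to an animal violates the rule."""
    if not target_is_private:
        return False  # public figures fall under a different policy
    if animal not in INFERIOR_ANIMAL_SLURS:
        return False  # "grizzly bear", "moose", etc. are left to moderator judgment
    if (market, target_role, animal) in MARKET_EXCEPTIONS:
        return False  # the September 27th carve-out described above
    return True

print(should_delete("pig", True, "neighbor", "NA"))  # prints: True
print(should_delete("pig", True, "cop", "NA"))       # prints: False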
01:50:40.000 Well then, how about we do, uh, we're gonna go Super Chats.
01:50:44.000 Okay.
01:50:44.000 Cause we went a little, yeah, well, you know.
01:50:46.000 Gotta flesh this local democracy thing out.
01:50:48.000 Oh yeah.
01:50:49.000 That's right.
01:50:50.000 Not Curtis says, I want to vote Trump mainly in spite of the media and Democrats, but also because of the peace deals.
01:50:55.000 However, I can't stand him when words are coming out of his mouth.
01:50:58.000 Oh, I'm so with you, dude.
01:50:59.000 Not like my vote matters.
01:51:00.000 I live in California.
01:51:01.000 Your vote definitely matters in California.
01:51:03.000 If every single Republican in every jurisdiction voted, they would probably beat every Democrat.
01:51:09.000 This is what people don't realize about AOC.
01:51:11.000 When she won, I think she got like 180,000 votes and there's something like 200,000 Republicans.
01:51:17.000 If every Republican went out and voted, they'd win.
01:51:19.000 And then the whole district would be like, how did this happen?
01:51:21.000 Well, it's because people voted, but people don't actually vote.
01:51:24.000 So, you know, we'll see how things play out.
01:51:26.000 I tell you what, uh, I hear you vote if you want to vote, but your vote matters wherever you are.
01:51:33.000 The only reason your vote doesn't matter is because people think it doesn't matter.
01:51:35.000 So they don't vote.
01:51:36.000 You gotta be enfranchised.
01:51:40.000 El Diablo says early voting is for chumps.
01:51:42.000 Change my mind.
01:51:43.000 Sure, but Democrats are winning right now and it's going to rile them up.
01:51:46.000 Dude, maybe.
01:51:47.000 Biden said that he's going to end the oil industry.
01:51:49.000 He did.
01:51:50.000 And all those people that voted for him that are like, what?
01:51:53.000 What?
01:51:54.000 Imagine you voted early in Pennsylvania and you're like, I don't like Trump.
01:51:58.000 And then Biden comes out a week later.
01:52:00.000 By the way, you should have waited for the debate because I'm going to end the oil industry.
01:52:03.000 And you're like, well, I'm going to lose my job.
01:52:05.000 Can I change my vote?
01:52:06.000 Apparently people were Google searching change vote like crazy.
01:52:09.000 And they were Google searching change vote to Republican like crazy.
01:52:12.000 Wow.
01:52:13.000 Joe Biden said he's going to end the oil industry.
01:52:16.000 He said a lot of stupid stuff last night.
01:52:17.000 That's crazy.
01:52:18.000 Yeah.
01:52:18.000 I think Trump did a good job in playing to, you know, trying to get Biden riled up and towards the last half hour Biden was, I think he was losing some steam.
01:52:27.000 He was stuttering a little bit more Rio Grande.
01:52:30.000 I think he was stuttering in that Rio Grande.
01:52:32.000 So yeah, I think, um, yeah, I think a lot of people are going to want to change their votes.
01:52:37.000 Hydro says, Tim, you say you get more views than CNN and other mainstream media, and if that is so, why would so many people not know about Hunter's laptop?
01:52:45.000 I have never said that.
01:52:46.000 I said I get around 55% of CNN's views on YouTube only, but CNN, with their TV views and YouTube views combined, gets like three or four times the views I get.
01:52:55.000 Still, you know, it's not bad for my channels.
01:52:59.000 GoAway says, Tim, look up the Great Reset.
01:53:01.000 Biden's talking points are coming from the World Economic Forum and Klaus Schwab.
01:53:04.000 I do not know who that is, but I'll check it out.
01:53:06.000 Katie says, Tim, we don't always agree, but at least you are a journalist that is researching both sides to get the truth.
01:53:11.000 We need more journalists like you.
01:53:12.000 Appreciate it.
01:53:13.000 Let's see.
01:53:14.000 There's a, you are amazing emoji.
01:53:15.000 I like that.
01:53:18.000 All right.
01:53:18.000 Let's see.
01:53:18.000 Aaron Freeman says, PA court sides with the Democrats.
01:53:21.000 Signatures don't have to match.
01:53:23.000 Opening door to full fraud, like Al Franken and the pizza box missing votes that made their way for Obamacare.
01:53:29.000 Yup.
01:53:30.000 We are going to have one heck of a wild ride in about 10 days.
01:53:36.000 Also, don't forget to smash the like button if you haven't already.
01:53:38.000 I really appreciate it.
01:53:40.000 Let's see.
01:53:41.000 Zurg says, Biden, you should have told Americans the truth.
01:53:44.000 Don't panic.
01:53:45.000 Trump, do you remember the toilet paper?
01:53:47.000 What, the panic?
01:53:49.000 Here we go.
01:53:49.000 The Flaming Gamer says, Caitlin Bennett meme appears to be a lie.
01:53:53.000 The oldest post I can find of the image is on Reddit in a random post completely unrelated to Bennett.
01:53:58.000 Yeah.
01:53:59.000 That's why you just, like, I tell people, like, just don't care about what they say because they're making things up.
01:54:03.000 Like, what are you going to do?
01:54:03.000 People are going to post fake stuff.
01:54:04.000 Let them post fake stuff.
01:54:05.000 Yeah.
01:54:06.000 Charming person says, I had a vision.
01:54:08.000 Biden and Putin were attempting to resurrect Stalin, but failed.
01:54:11.000 The spirit of Marx talking through AOC told them they'd been lacking two ingredients.
01:54:16.000 Two bin seed and Trump's blood.
01:54:17.000 Thank you for that.
01:54:18.000 I appreciate it.
01:54:19.000 That's a great, great dream.
01:54:22.000 Dan Larkin says, Tim, you're wrong about conservative prominence on Facebook.
01:54:26.000 Jeremy Boring, CEO of Daily Wire, did a video about this just two days ago on the Daily Wire YouTube channel.
01:54:31.000 I highly suggest you all watch it.
01:54:33.000 I'll check it out.
01:54:33.000 Jeremy's smart.
01:54:34.000 He didn't go to high school or he didn't graduate, I don't think.
01:54:37.000 Let's see.
01:54:37.000 Timothy Barsotti says, censorship has seemed to backfire because of the Streisand effect.
01:54:42.000 So why do they keep censoring posts?
01:54:44.000 Are they too stupid, arrogant, or are they actively helping some of these stories gain legs?
01:54:49.000 That's a good point.
01:54:50.000 That's a valid point.
01:54:51.000 What do you think?
01:54:53.000 Like, let's ban this so that people hear about it.
01:54:56.000 Like when they banned InfoWars, it was like an emergency update.
01:54:58.000 Yeah, who knows?
01:55:03.000 Noni Perry says, the problem with the Expensify email is that as an employee, I have to use that.
01:55:09.000 My employer mandates it.
01:55:11.000 So when I get that email, the hand of my own employer is complicit in the political message.
01:55:15.000 Yeah, I'd be livid.
01:55:16.000 I'd quit.
01:55:17.000 If I worked for a company and they were like, we're gonna issue a big message about Biden, I'd be like, I'll walk out that door and then do it.
01:55:22.000 Okay, later and I'll walk out the door.
01:55:24.000 Whatever, man.
01:55:25.000 What's funny, Tim, is one of my trainers when I first started as a content moderator, one of my trainers was open about our political views.
01:55:32.000 She said that Obama was her patronus charm.
01:55:36.000 What?
01:55:37.000 Read another book.
01:55:38.000 Oh my gosh, I hate it so much.
01:55:40.000 Millennials are like, I read Harry Potter once.
01:55:42.000 And The Hunger Games.
01:55:42.000 That totally ruins Harry Potter for them.
01:55:44.000 Seriously.
01:55:45.000 What is this?
01:55:47.000 Royal Canadian Moose says, prepare to put on your tinfoil hats, folks.
01:55:50.000 What if book sales and speaking engagements are how they launder money to the corrupt politicians?
01:55:54.000 Let's be real.
01:55:55.000 How many people would read, let alone pay, for something Joe Biden wrote?
01:55:59.000 I mean, that's not even a conspiracy theory.
01:56:01.000 You have warehouses full of books.
01:56:04.000 They'll write a book, someone will buy, you know, $500,000, and they'll say it's for an event or something, they'll put it in a warehouse, and then that money goes to the publisher and you get a percentage.
01:56:13.000 Yep.
01:56:13.000 And they can charge like whatever they want for speeches.
01:56:16.000 I think Hillary Clinton was like $100,000 a speech or something.
01:56:19.000 She was.
01:56:19.000 I think she's gotten down a lot in value.
01:56:21.000 And by the way, Tim, I am working on a book.
01:56:23.000 It's going to be called the Behind the Mask of Facebook.
01:56:26.000 And so it should be coming out in the next couple months.
01:56:28.000 Oh, nice.
01:56:28.000 That's going to be awesome.
01:56:29.000 You mentioned an organization you had too.
01:56:31.000 What is that?
01:56:31.000 Yeah, so the Hartwig Foundation for Free Speech is an Arizona non-profit corporation I formed last month.
01:56:38.000 And so I'm trying to get 501c3 status.
01:56:41.000 But you can go to ryanhartwig.org and learn more about it.
01:56:45.000 But yeah, I'm looking to just be more active and be an activist as far as big tech goes and censorship because I feel like this, even if whoever wins in the next couple weeks, Whether it's Trump or Biden, we still need people to speak out about censorship.
01:57:02.000 Definitely.
01:57:03.000 Justin Gunning says, actually you can change your vote.
01:57:07.000 If you vote again, it just deletes the first vote cast.
01:57:09.000 Look into this, encourage voters to change to Trump.
01:57:12.000 I believe that is 100% incorrect and do not vote twice.
01:57:15.000 That's super illegal.
01:57:16.000 Yeah.
01:57:17.000 Yeah.
01:57:17.000 Super illegal.
01:57:18.000 Don't do that.
01:57:18.000 You should look into it, though, if you're interested.
01:57:20.000 Yeah, we'll figure out how you need to go about the rules properly to make sure your vote is accurately cast.
01:57:25.000 If, you know, whatever happens, just talk to your people, but don't vote twice.
01:57:29.000 Consult your local election officials.
01:57:30.000 Yeah, exactly.
01:57:31.000 Don't take advice from us or anybody in the Super Chats.
01:57:34.000 Or Facebook.
01:57:34.000 Yeah, exactly.
01:57:35.000 Claymore says, if President Trump were to get the Nobel Prize and you were invited to the dinner, would you wear a suit?
01:57:41.000 I'm betting I'm not.
01:57:42.000 I would not wear a suit.
01:57:44.000 Would you wear a beanie?
01:57:45.000 Yes, I would.
01:57:45.000 I would go wearing my clothes.
01:57:47.000 And I would say, dude, I got invited to the Clinton Foundation Gala.
01:57:50.000 Black tie affair.
01:57:51.000 And I said, I'm not wearing a black tie.
01:57:53.000 And they said, you can't come.
01:57:53.000 I said, I'm not coming.
01:57:54.000 And I did not go.
01:57:55.000 Could you imagine that?
01:57:56.000 They invited me to that thing.
01:57:58.000 And I was like, nah.
01:57:59.000 You can't make me do it.
01:58:00.000 Come on, man.
01:58:01.000 Talk about, like, I wonder if I could've got, you know, I would've made tons of connections, tons of high-profile personalities, schmoozed with all these bigwigs and millionaires, probably would've given me money, and like, do these things.
01:58:12.000 I'm like, okay, I'm not gonna wear your clothes.
01:58:14.000 Get out of here.
01:58:16.000 TheAfrican says, Hey Chads and Lass, out of everything from last night's debate, there was one thing that stuck with me.
01:58:22.000 Trump's been busting his butt making peace deals across the globe.
01:58:25.000 What the hell is Biden thinking saying that all that crap he did about North Korea?
01:58:28.000 Does he want war?
01:58:29.000 Yes, he compared Kim Jong-un to Hitler.
01:58:32.000 So now if he gets elected, we're going to have no relationship with North Korea at all.
01:58:36.000 Talk about making everything worse.
01:58:38.000 Not only that, Biden said we had a good relationship with Hitler.
01:58:43.000 Yeah.
01:58:44.000 Are you nuts?
01:58:45.000 Talk about a moron.
01:58:46.000 They tried to before he invaded the Sudetenland.
01:58:49.000 They gave it to him, Neville Chamberlain did.
01:58:51.000 They called it appeasement.
01:58:52.000 Sure, sure.
01:58:52.000 They were trying to be on his good—actually, he was Time Man of the Year, Hitler.
01:58:55.000 Yeah.
01:58:56.000 And was it 33, I think?
01:58:57.000 Something like that?
01:58:58.000 Yeah, it was 33.
01:58:58.000 Yeah, the world loves him at first.
01:59:00.000 Oh, yeah.
01:59:01.000 Alright, let's see, what is this?
01:59:03.000 Kyle says, anyone checking out their local Trump donators?
01:59:05.000 I'm two doors down from a proud 9K woman.
01:59:09.000 9K, like, you can't donate that much.
01:59:10.000 Oh, actually, yeah, over the years you can, yeah.
01:59:12.000 Thinking of dropping off some flowers and a friendly warning about the website.
01:59:15.000 IDK about presentations since I don't want to spook are probably a bad idea.
01:59:19.000 If people's private information gets released, you should not be acting upon it in any way.
01:59:24.000 I would not appreciate that.
01:59:24.000 Keep reading your thoughts, I guess.
01:59:25.000 That's something that I also concur with.
01:59:27.000 So there are some people in my video release, I obviously filmed my co-workers, and their names were, first and last names were released.
01:59:36.000 And one of my co-workers actually, sorry.
01:59:39.000 One of my co-workers actually was talking about how she wanted to accept the Iran bounty, the $180 million from Iran to assassinate Trump.
01:59:48.000 And she was speaking about that and, you know, people, I don't encourage doxxing at all.
01:59:52.000 So, I mean, it's horrible that she, you know, she got some hate for that, but I think that's understandable.
02:00:00.000 But in any other country, if you were speaking out against someone like the president, the state might take action.
02:00:06.000 So I don't know if the Secret Service investigated those claims, but you know,
02:00:10.000 I think privacy is very important. We shouldn't dox people.
02:00:12.000 That's also part of Facebook's policy.
02:00:13.000 It's written into their policy.
02:00:15.000 Trunk Driver says, Tim, how could you support Trump when he tried to coup
02:00:18.000 Venezuela and he is sanctioning Iran and staying in Syria for the oil and
02:00:22.000 printing money like a madman and expanding Big Brother? So my
02:00:26.000 understanding about the Venezuela coup thing is, I think you're referring to
02:00:30.000 those guys who got captured.
02:00:32.000 Like, they showed up on some boats and got caught?
02:00:35.000 Is he talking about Juan Guaido?
02:00:37.000 So, Juan Guaido was supposedly going to run against Maduro in the election.
02:00:40.000 But you need to know that every single Latin American country, except for maybe Mexico, supported Juan Guaido as the legitimate president of Venezuela.
02:00:50.000 Venezuela is a disaster.
02:00:52.000 It's an absolute disaster.
02:00:53.000 Like, I've been to Venezuela.
02:00:55.000 They got no food.
02:00:56.000 I went there, and there's wealthier areas.
02:00:58.000 But man, some of these pro-Venezuela journalists put up these Potemkin Village pieces, and I'm like, dude, I went to the malls there.
02:01:04.000 It was crazy.
02:01:05.000 Just empty stores everywhere.
02:01:06.000 Yeah, the theory is that the American sanctions caused the government to crumble, and then they blamed it on Maduro.
02:01:13.000 Yes, they did. They nationalized, I think not recently, but they nationalized, like, the airlines, and all the airlines leave. Dude, if you've got a company doing a service and you're like, oh, by the way, we're taking that company, they're like,
02:01:26.000 I'm out, later.
02:01:27.000 And then here's what happens every time with socialists. You get a bunch of farmers, right, and they're like, farmers shouldn't own farms, so we're seizing your farm, and we're gonna put the workers in charge.
02:01:37.000 And the workers are like, I don't know how to run a farm.
02:01:39.000 And then they run out of food.
02:01:40.000 That's what happens.
02:01:41.000 They're like, we're gonna take over the oil industry, that way the government gets all the profits.
02:01:46.000 Good luck running the oil industry, you have no idea how to do it.
02:01:49.000 So they don't, and then... Yeah.
02:01:50.000 Anyway, I want to answer this, not to get into Venezuelan politics.
02:01:55.000 I don't know enough about the coup of Venezuela, but I'm not a fan if that's the case, and that's a good point of criticism.
02:02:01.000 Sanctioning Iran, I'm not opposed to sanctions.
02:02:03.000 I think sanctions are an excellent way of going about putting pressure on foreign countries instead of going to war.
02:02:09.000 Staying in Syria for the oil, as far as I know, Trump tried leaving completely, and he got attacked by the left and the right, Democrats and Republicans, trying to stop him from doing it.
02:02:19.000 So then he just blatantly was like, okay, we're gonna keep him there to guard the oil, and I think he did that as kind of a smack in the face to the establishment.
02:02:26.000 I'm gonna let everybody know exactly what we're doing there.
02:02:29.000 That I like.
02:02:30.000 Though I would like our troops to be gone from there.
02:02:32.000 He's printing money like a madman, and that's horrible!
02:02:35.000 And expanding Big Brother.
02:02:37.000 I don't know what reference expanding Big Brother is, but I'll tell you this.
02:02:41.000 How can I vote for him?
02:02:42.000 Banning critical race theory.
02:02:44.000 Shutting down the violent leftist riots.
02:02:46.000 Four historic peace agreements.
02:02:48.000 Withdrawing troops from the Middle East.
02:02:50.000 I'll take it over Joe Biden, who would do... Listen.
02:02:54.000 Trump's not great.
02:02:55.000 We've got a lot of bad things about him.
02:02:56.000 But, uh, Joe Biden.
02:02:57.000 We had eight years of Obama.
02:02:58.000 And what did we get?
02:02:59.000 More war.
02:03:00.000 An escalation.
02:03:01.000 And then they handed off this conflict to Donald Trump.
02:03:05.000 Kids in cages?
02:03:06.000 Obama.
02:03:07.000 So, so, so, listen.
02:03:09.000 I don't like voting for the lesser of two evils.
02:03:09.000 If, if, listen.
02:03:11.000 And I've said over and over again that I wouldn't vote for Trump if that was the case.
02:03:14.000 But I don't think Trump is the lesser of two evils at this point.
02:03:16.000 I think he's kinda okay.
02:03:17.000 I'll take four historic peace agreements.
02:03:19.000 I'm happy with that.
02:03:20.000 Can I interject my two thoughts?
02:03:23.000 My two thoughts, my two cents.
02:03:24.000 I literally have two thoughts and then we can kind of wrap up.
02:03:26.000 The reason I'm voting for Trump is because I don't care about the deficit anymore.
02:03:31.000 That used to be one of my biggest issues.
02:03:33.000 My biggest issue is the pursuit of human life, which is very, very important to me.
02:03:37.000 And that extends to the Middle East, which is Tim's deal.
02:03:40.000 So I just want to let you guys know, if you are concerned about the deficit and the cost, it's more important that we have a country to worry about the deficit for than we don't.
02:03:47.000 Yeah, I really, I think we're facing dire straits no matter what.
02:03:53.000 But banning critical race theory, these are some of the things that need to happen that I think will start reversing the problem.
02:03:58.000 So I think Trump will be a net positive in the long run.
02:04:01.000 Lawson Harrison says, get Ryan Dawson on your show ASAP.
02:04:04.000 Don't buy into the crap spewed about him being a loon or whatever.
02:04:07.000 He is a wealth of knowledge in the Middle East and everything relevant about politics.
02:04:10.000 I do not know who he is.
02:04:12.000 I will look him up.
02:04:13.000 Regan says, Hey Tim, thank you for keeping journalism alive.
02:04:16.000 Don't always agree, but we both know.
02:04:18.000 Uh, and I can't say that name.
02:04:21.000 Unar says, AZ boys are in the house.
02:04:24.000 On another note, the downhill skate scene is super far left because there are a bunch of college kids who complain about capitalism, and a few individuals who think we are freaking nuts doing 60 miles an hour downhill on pieces of wood with wheels.
02:04:37.000 I'll tell you this, it seems like pro skateboarders either don't care or are right-leaning.
02:04:43.000 Like, I've got people hitting me up.
02:04:46.000 Yeah, man.
02:04:47.000 Skateboards.
02:04:47.000 Yeah, I think you were talking about that in one of the previous shows.
02:04:49.000 You were talking about... You know, it's not that they're right-leaning.
02:04:51.000 It's that whatever's happening right now, you've got people who are left-libertarian joining the ranks of the right.
02:04:57.000 That's why I'm really annoyed when everyone's like, Antifa's left-libertarian.
02:05:00.000 No, they aren't.
02:05:01.000 They're violent authoritarians.
02:05:03.000 They go around beating people to instill their will on them.
02:05:06.000 That's not libertarian at all.
02:05:06.000 Yeah.
02:05:09.000 No, the left libertarians are like, leave me alone, man.
02:05:11.000 I wanna go skate.
02:05:13.000 Like, dude, skateboarders will work minimum wage jobs so they can rent a one-bedroom apartment with five people living in it so that they can all work one day a week and skate the rest of the week.
02:05:23.000 That's super left libertarians, as hippie as you get.
02:05:27.000 These people are all like, wow, these people are crazy.
02:05:30.000 They should leave me alone.
02:05:30.000 I just wanna skate.
02:05:32.000 So they're like, I'm getting hit up by these people like crazy.
02:05:34.000 Yeah.
02:05:35.000 Let's see.
02:05:36.000 Alternative.
02:05:37.000 JK says, Biden mentioned in debates while VP, China was defending South Korea from North Korea.
02:05:43.000 If true, this raises questions if President Xi is in ties with the South Korean president.
02:05:48.000 In fact, the South Korean government is proposing their Green New Deal.
02:05:51.000 Interesting.
02:05:53.000 Trunk Driver says, what about Trump betraying Assange?
02:05:55.000 What about Stephen Miller having ties to VDARE?
02:05:57.000 What do you think of Stephen Miller and VDARE, bro?
02:06:00.000 I don't know a whole lot about VDARE at all.
02:06:02.000 I don't know... Yeah, I was just going to say, well, Ecuador betrayed Assange, right?
02:06:07.000 Because they... Oh, yeah, absolutely.
02:06:09.000 Because the new president came in and then they...
02:06:15.000 Yes, you are correct.
02:06:18.000 I wouldn't say Trump betrayed Assange.
02:06:19.000 I would say Trump is enacting standard foreign policy against the man, which is wrong, and he should pardon him.
02:06:25.000 But I think Trump's perspective on this is Julian Assange knows exactly what happened with WikiLeaks and the Democrat emails, and whether or not a particular individual was a source.
02:06:37.000 I think Trump made the wrong move.
02:06:40.000 I think Trump should have just pardoned Assange and then hoped for the best.
02:06:45.000 Instead, what we know, at least according to a lot of this testimony and stories that
02:06:50.000 have come out, Trump wanted Assange to say who was behind the leaks because Trump knows
02:06:56.000 it wasn't Russia, and Assange could prove it wasn't Russia, and because Assange doesn't
02:07:02.000 want to compromise his organization and his sources, won't reveal that information.
02:07:07.000 And think about, I think Trump's view is, if Julian Assange came out and just said who
02:07:12.000 the source was, it would cripple the establishment politicians in this country and their entire...
02:07:18.000 I don't think it would.
02:07:18.000 I think they'd be like, he's lying!
02:07:20.000 And then the media would just be like, he's lying, he's lying, he's lying.
02:07:23.000 Yeah, probably.
02:07:24.000 And he'd bury his organization for no reason.
02:07:27.000 That's why Trump played the wrong move.
02:07:29.000 And Trump should pardon him.
02:07:33.000 I don't know much about Stephen Miller or whatever.
02:07:37.000 Daniel Irving says, love the show, Tim.
02:07:38.000 Please shout out the fundraiser I'm helping produce tomorrow.
02:07:40.000 It's for DMD.
02:07:41.000 Details on Facebook at Inspiration on Wheels.
02:07:44.000 Six-hour live show.
02:07:45.000 Thank you.
02:07:46.000 There you go.
02:07:47.000 Okay, we're getting a bunch of superchats popping in now.
02:07:50.000 We will read just a couple more.
02:07:52.000 UberChat!
02:07:53.000 UberChat?
02:07:54.000 How do you say chat in German?
02:07:56.000 Let's find out.
02:07:56.000 I don't know.
02:07:57.000 Elia says, support you Tim.
02:07:59.000 Thank you for your daily segments.
02:08:00.000 Trump 2020.
02:08:01.000 Appreciate it.
02:08:02.000 JathTech says, I wrote an article for you.
02:08:04.000 Here's an excerpt.
02:08:05.000 The polls are a way to put a blinder over the eyes of the American people so they are
02:08:08.000 shocked.
02:08:09.000 They gave them a catastrophe they never saw coming and they went to war.
02:08:14.000 Austin Trammell says, my friend just told me she's voting for Biden because Trump bad.
02:08:17.000 She can't name a thing he has done bad other than media talking points.
02:08:21.000 How can I possibly change her mind?
02:08:23.000 I don't want her to vote away her rights.
02:08:25.000 Let me tell you all, persuasion 101.
02:08:27.000 In persuasion, the first thing you never do is approach someone as an adversary.
02:08:32.000 The first thing you need to do is rapport.
02:08:34.000 So when you're talking to someone, like Ryan here, you're right leaning.
02:08:38.000 Are you going to vote for Trump?
02:08:39.000 Yeah.
02:08:40.000 So if I didn't want you to vote for Trump, the first thing you do, it's the basics of persuasion, is rapport, extreme, and turn.
02:08:45.000 The first thing is, I would be like, yeah bro, high five, like I'm all about it, you know, Trump 2020 maga, all that stuff.
02:08:52.000 Rapport immediately makes the other person feel comfortable around you, like they're like me and I'm safe.
02:08:57.000 It's a psychological tribalist function of the human mind or whatever.
02:09:01.000 The second thing you do is called the extreme.
02:09:03.000 After you agree with them, you offer up a positive proposition that is too extreme for them to agree with.
02:09:08.000 You say something like, the reason you're voting for Donald Trump is because he committed atrocity, and you love that he committed atrocity.
02:09:15.000 And then, this other person will be like, oh, no!
02:09:19.000 I'm not all about that.
02:09:21.000 Then you give them the turn, which is you say, Okay, well, I mean, fine, I guess that one's bad.
02:09:28.000 But he's still pretty good.
02:09:29.000 What you've done there is, as a friend, you've gotten them to reject their own opinion.
02:09:35.000 Or at least a portion of it.
02:09:36.000 You can never convince someone overnight.
02:09:37.000 These people have been inundated by media over and over again and propaganda.
02:09:41.000 So that will never just change their mind outright.
02:09:44.000 But that's like, it's part of sales.
02:09:46.000 When we used to do fundraising for non-profits, the smart people who are good at it, they understood these concepts.
02:09:51.000 It's one of the reasons I hated doing this job, because it just became plastic.
02:09:54.000 You're not actually talking to people and explaining what you think and what you want to do.
02:09:58.000 But the gist of it is, you tell them, you respect them, yeah, Biden's great, all that good stuff, then you say something that Biden's done, and you say either, you know what, I'm glad that he was cutting deals with these Chinese companies, you know?
02:10:11.000 I mean, like, think about it.
02:10:12.000 If our president is indebted to the Chinese Communist Party, there won't be a war.
02:10:17.000 And you guys read about Thucydides' trap, right?
02:10:20.000 I understand they're, you know, they're torturing the Uighur Muslims and all that stuff.
02:10:23.000 So that kind of extreme position, and you can get really extreme with it, will make the average person be like, I don't agree with that.
02:10:30.000 And then you say, okay, well, I guess Biden's not that great, but I mean, I still like him.
02:10:34.000 And then you make the other person change their mind.
02:10:37.000 It's a manipulation thing.
02:10:38.000 I'm not actually recommending doing it.
02:10:39.000 It's just something I often explain because it's what the nonprofits do.
02:10:43.000 That's like the key function of how non-profits fundraise.
02:10:46.000 I used to be a director at one of these companies.
02:10:49.000 At a bunch of them, actually.
02:10:52.000 I do not.
02:10:54.000 I think everyone's getting all riled up just like they did last time with Russiagate, and it's going to result in nothing.
02:11:00.000 Let's see, we got a couple more, let's see.
02:11:02.000 Jason Savorn says, why not address today's media strategy of propaganda being similar to the strategies used by Joseph Goebbels in ushering Hitler to power?
02:11:10.000 I mean, is it?
02:11:11.000 Goebbels.
02:11:11.000 Goebbels?
02:11:12.000 Yeah.
02:11:12.000 Is it similar?
02:11:13.000 I don't know.
02:11:14.000 Goebbels.
02:11:14.000 Oh, I said that he invaded the Sudetenland, I want to take that back.
02:11:18.000 Neville Chamberlain gave Hitler the Sudetenland, and then he invaded Poland, everybody declared war on him.
02:11:23.000 All right, let's see.
02:11:24.000 Nate says, Hey Tim and crew, long time fan since Occupy.
02:11:26.000 I would love to see Dylan Radigan on your show.
02:11:28.000 I would too, but he retired.
02:11:29.000 He's a farmer now.
02:11:30.000 Oh, man.
02:11:31.000 Damn it.
02:11:31.000 TrumpDriver says, Thanks for answering and not running.
02:11:33.000 Much respect for that.
02:11:34.000 VDARE is a straight up neo-Nazi newsletter.
02:11:37.000 Trump's top aide has ties to VDARE and Richard Spencer, another one of Trump's people.
02:11:41.000 Julia Hahn also has ties.
02:11:43.000 You know, I don't know anything about that.
02:11:46.000 What about a completely verified social platform, where only identified people can open accounts, but can have whatever handle, and only one account?
02:11:52.000 I'm down for whatever, but I mean, like, it's all about critical mass for these platforms.
02:11:58.000 Otherwise no one wants to come on them.
02:12:02.000 So it's like linked to your government ID, basically?
02:12:05.000 Yeah.
02:12:06.000 Like you can only have one account?
02:12:08.000 Yeah.
02:12:09.000 I don't know.
02:12:10.000 I don't want the government having access to those.
02:12:11.000 Because at some point the government would have to have access to your social media account and then that's just kind of a slippery slope.
02:12:17.000 We were talking about an anonymous social media thing where you have peer verification. If enough people can say, like, oh, yeah, this account likes dogs, this account likes video games, then when you say this is who you are, they can see that, you know, 70 of your peers have acknowledged that you are who you say you are, without you ever having to reveal who you are.
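The peer-verification idea floated here could be modeled roughly like this: peers attest to attributes of a pseudonymous account, and a claim counts as verified once enough distinct peers have vouched for it, with no real-world identity ever stored. This is only a toy illustration of the concept; `PeerVerifiedAccount`, the threshold, and all names are hypothetical, not any real platform's design.

```python
# Toy sketch of peer verification for a pseudonymous account:
# distinct peers vouch for claims, and a claim is "verified"
# once the number of attesting peers reaches a threshold.
from collections import defaultdict

class PeerVerifiedAccount:
    def __init__(self, handle, threshold=70):
        self.handle = handle                  # pseudonymous handle only
        self.threshold = threshold            # attestations needed per claim
        self.attestations = defaultdict(set)  # claim -> set of peer handles

    def attest(self, peer, claim):
        """A peer vouches that this account matches a claim."""
        if peer != self.handle:               # accounts can't vouch for themselves
            self.attestations[claim].add(peer)

    def is_verified(self, claim):
        """A claim is verified once enough distinct peers attest to it."""
        return len(self.attestations[claim]) >= self.threshold

acct = PeerVerifiedAccount("skate_dog_42", threshold=3)
for peer in ["alice", "bob", "carol"]:
    acct.attest(peer, "likes dogs")

print(acct.is_verified("likes dogs"))   # True: three distinct peers attested
print(acct.is_verified("likes cats"))   # False: no attestations
```

The point of the sketch is that verification comes from the count of independent attestations, not from any identity document, which matches the "70 of your peers" framing above.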
02:12:37.000 OK.
02:12:38.000 All right.
02:12:39.000 Here's the last one.
02:12:41.000 Let's see.
02:12:41.000 What is that one?
02:12:43.000 Is it the last one?
02:12:43.000 The last super chat.
02:12:45.000 Man Spider says Aiden Paladin needs to come on the show.
02:12:48.000 I know Aiden.
02:12:48.000 I will reach out to Aiden Paladin and we'll see what happens.
02:12:50.000 Very cool.
02:12:50.000 That being said, hey Ryan, thanks for joining us.
02:12:52.000 Yeah, thanks for having me.
02:12:53.000 Yeah, great conversation.
02:12:54.000 We didn't go through most of your notes.
02:12:56.000 You got a ton, but it was good anyway.
02:12:57.000 Something in particular you wanted to mention?
02:13:01.000 I saw you looking at your notes at one point.
02:13:02.000 I mean, I came a little bit overprepared.
02:13:04.000 There's so much there.
02:13:05.000 I mean, there's so much evidence.
02:13:06.000 I mean, the video you saw with Project Veritas was just like scratching the surface.
02:13:11.000 And there were some conversations I wanted to include in the video that didn't make it into the video.
02:13:16.000 But just the last thing I want to say, it's kind of funny, is this is a post from October 17th.
02:13:21.000 So Zuckerberg, Mark Zuckerberg, gave a speech at Georgetown University this past October, or a year ago.
02:13:27.000 And so they gave us instructions, like, heads up, Mark Zuckerberg live speech. He's gonna underscore the company's commitment to giving people a voice, dot dot dot. And then, in that same paragraph, they're telling us, due to the nature of this commentary and feedback, we may see escalations or an increase in user reports of hate speech, and wanted to provide a heads up on this. Free speech is bad.
02:13:47.000 Zuckerberg's gonna talk about giving people a voice, make sure you delete any hate speech.
02:13:51.000 So I mean, it just shows you where their heart is.
02:13:54.000 They're really not concerned about giving people a voice.
02:13:56.000 Free speech is bad.
02:13:59.000 So the left would report him for saying free speech.
02:14:01.000 Yeah.
02:14:02.000 Yeah.
02:14:03.000 Do you have a social media you want to mention or anything?
02:14:05.000 So Twitter is at RealRyanHartwig, and then also on Instagram, the same handle. And then I'm also on Gab and Parler, and RyanHartwig.org is my domain.
02:14:17.000 Right on man, thanks for joining us.
02:14:18.000 Thanks.
02:14:19.000 And, of course, you can follow me on YouTube at youtube.com slash TimCast and youtube.com slash TimCastNews, my other two channels.
02:14:25.000 I'm also on Twitter, Instagram, and Parler at TimCast, of course.
02:14:29.000 You can follow at Ian Crossland.
02:14:30.000 Yes, at Ian Crossland.
02:14:32.000 And basically everywhere.
02:14:33.000 Everywhere.
02:14:33.000 Everywhere.
02:14:33.000 Pretty much everywhere.
02:14:35.000 And at Sour Patch Lyds.
02:14:37.000 That's our petulance.
02:14:38.000 L-Y-D-S.
02:14:39.000 L-Y-D-S.
02:14:40.000 So we do the show Monday through Friday live.
02:14:41.000 We'll be back Monday.
02:14:43.000 Make sure you smash that like button on the way out.
02:14:44.000 Subscribe.
02:14:45.000 And we'll have clips up from this show all throughout tomorrow.
02:14:49.000 Very shareable segments.
02:14:50.000 So we'll hit key points.
02:14:51.000 That's the point of the clips for the most part.
02:14:53.000 You can be like, hey, here's a thing you need to see.
02:14:55.000 And share with your friends if you think this stuff's important because we got a dude sitting right here basically saying, yeah, they're interfering in our elections, man.
02:15:02.000 But anyway, thanks for hanging out.
02:15:03.000 We'll see you all Monday at 8 p.m.