The Critical Compass Podcast - June 29, 2024


Keeping the YouTube Community Safe | A Critical Compass Growth Opportunity


Episode Stats

Length: 31 minutes
Words per Minute: 147.90907
Word Count: 4,598
Sentence Count: 642
Misogynist Sentences: 8
Hate Speech Sentences: 7
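Assuming the stats above were produced by a script (an assumption; the show's actual tooling is unknown), the words-per-minute figure can be reproduced from the word count and duration:

```python
# Illustrative sketch (not the show's actual tooling): reproducing the
# "Words per Minute" figure from the stats above.
word_count = 4598
listed_wpm = 147.90907

# Dividing by the rounded 31-minute length gives a slightly higher figure:
print(round(word_count / 31, 2))  # 148.32

# The listed WPM implies an unrounded duration of about 31.09 minutes
# (roughly 31 minutes 5 seconds), which displays as "31 minutes" once rounded.
implied_minutes = word_count / listed_wpm
print(round(implied_minutes, 2))  # 31.09
```

The over-precise 147.90907 suggests the script divided by an exact duration rather than by the rounded 31 minutes.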


Summary

In this episode of The Critical Compass, Mike and James discuss the removal of a video that was flagged as "medical misinformation" by YouTube. They discuss what this means and what they can do about it, as well as the controversial AstraZeneca data recently released by the company regarding the safety and effectiveness of its new vaccine.


Transcript

00:00:00.000 We'll see. I'm going to assume
00:00:02.040 What if she claims that
00:00:04.180 her injury came from this?
00:00:06.340 Yeah.
00:00:07.380 Is that enough to trigger?
00:00:09.540 That's right!
00:00:10.700 This example violates our policy.
00:00:14.480 I feel like we should get a sticker
00:00:15.780 for this.
00:00:18.460 We do not allow claims
00:00:20.020 that vaccines cause chronic side effects
00:00:22.300 outside of rare side effects
00:00:24.400 that are recognized by health authorities.
00:00:26.820 Okay, well what if it's a rare
00:00:28.000 chronic side effect?
00:00:30.000 While we may make
00:00:32.360 exceptions for content in which creators
00:00:35.020 describe their or their family's first-hand
00:00:36.980 experiences, we recognize
00:00:39.080 there is a difference between sharing personal
00:00:40.840 experiences and promoting misinformation about
00:00:42.960 vaccines.
00:00:44.280 It's misinformation
00:00:46.640 promoting if you talk about things
00:00:48.800 that are in fact rare
00:00:50.820 side effects that do actually happen.
00:00:52.900 You're misinforming people by telling them
00:00:54.860 things that do in fact actually happen.
00:00:56.480 As soon as you say, oh, we thought
00:00:58.880 this was rare, but it's not rare
00:01:00.380 that is misinformation
00:01:01.560 cause it's...
00:01:02.380 There you go.
00:01:03.020 Yeah.
00:01:03.780 And what exactly does rare mean?
00:01:07.000 How...
00:01:07.460 How...
00:01:07.960 How...
00:01:08.560 What would the case incidence have to be
00:01:11.240 for it to still be considered rare, right?
00:01:13.380 Welcome back to another episode of The Critical Compass.
00:01:33.620 I'm James and this is Mike and we've got a short episode for you today
00:01:37.160 because we are going through re-education.
00:01:40.560 Our full episode where we sat down with
00:01:44.540 Yakstack or Sheldon Yakachuk,
00:01:47.780 we went over the AstraZeneca data that recently came out
00:01:51.520 and that got flagged for medical misinformation
00:01:54.600 even though we were just discussing what was in front of us
00:01:58.240 and what was released publicly.
00:02:00.140 So, yeah, it's frustrating that we had to take that down.
00:02:06.580 Well, it got taken down.
00:02:07.740 We couldn't keep it up, but we do have it on Rumble
00:02:11.620 for anybody who's interested.
00:02:14.300 Yeah.
00:02:14.860 Well, let's take a look here, James.
00:02:16.720 I will add our screen in here.
00:02:21.380 So, this is our dashboard.
00:02:24.400 And as you can see right here,
00:02:25.660 here's our channel violation.
00:02:28.320 And let's see what YouTube wants from us.
00:02:33.020 Okay.
00:02:34.940 So, let's see here.
00:02:37.740 Warning, your content was removed
00:02:39.220 due to a violation of our community guidelines.
00:02:41.120 You've received a warning.
00:02:42.220 You can take a policy training
00:02:43.520 which will remove the warning after 90 days.
00:02:45.980 Okay.
00:02:47.540 Oh, so they call it policy training, not re-education.
00:02:50.560 That's our term for it.
00:02:51.800 Policy training, yeah.
00:02:53.040 There's your Orwellian terminology of the day.
00:02:56.400 If this happens again,
00:02:57.980 if you violate this policy again
00:02:59.320 within 90 days of completing the policy training,
00:03:01.360 you'll get a strike.
00:03:02.940 You won't be able to do things like
00:03:04.220 upload, post, or live stream for one week.
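The escalation ladder the dashboard describes can be sketched as a small decision function. This is a hypothetical sketch, simplified from what is read off the screen here and later in the episode; the function and policy names are mine, not YouTube's:

```python
# Hypothetical sketch of the escalation rules as read from the dashboard
# (simplified; YouTube's real enforcement has more states and conditions).
def next_enforcement(violated_policy, warned_policy, within_90_days):
    """Outcome of a new violation while a warning is active."""
    if violated_policy == warned_policy and within_90_days:
        # Same policy again within 90 days of completing the training:
        return "strike: no uploads, posts, or live streams for one week"
    # Per the dashboard, a different policy yields another warning instead.
    return "new warning (with another policy training offered)"

print(next_enforcement("medical_misinfo", "medical_misinfo", True))
```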
00:03:07.920 Okay.
00:03:08.440 And if you think we've made a mistake,
00:03:09.680 you can appeal this decision.
00:03:10.800 Well, let's take a look here.
00:03:12.540 So, let's take some action, shall we?
00:03:15.440 Get started, look at your content, and take action.
00:03:17.980 Okay.
00:03:18.200 So, there may be an issue with your content.
00:03:21.620 Something you posted may violate YouTube's community guidelines
00:03:24.240 to help keep the community safe.
00:03:28.380 We've removed it.
00:03:29.420 Your channel has received a warning.
00:03:30.700 You can take a training to dismiss the warning,
00:03:33.500 appeal our decision, or do nothing.
00:03:35.680 Okay.
00:03:36.340 So, warning.
00:03:37.460 This is just a warning.
00:03:38.220 If it happens again,
00:03:39.060 your channel will get a strike.
00:03:39.860 Yeah, okay.
00:03:40.240 So, let's begin review.
00:03:44.760 Take a look at your content, keeping policies in mind.
00:03:47.020 The only thing with this,
00:03:48.120 if it does play,
00:03:49.540 will it flag this again?
00:03:52.380 So, we can mention the time frame of it.
00:03:55.900 It may actually flag.
00:03:57.040 It depends.
00:03:58.960 Like, are we on a list?
00:04:01.700 Are they going to, like,
00:04:03.120 have their algorithm flag through?
00:04:05.100 Or, like, how does...
00:04:06.120 You'd have to imagine you have some sort of...
00:04:08.300 Or, like...
00:04:09.500 Yeah, I don't know.
00:04:10.860 Yeah, but, like, how does...
00:04:13.700 What is the exact wording?
00:04:14.840 Because we had two shorts,
00:04:17.260 or two shorter videos,
00:04:18.540 two clips,
00:04:19.560 some highlights,
00:04:20.360 that were talking about a similar thing.
00:04:23.580 Um...
00:04:25.340 It was talking about the...
00:04:27.780 Yeah, it was talking about
00:04:29.520 spontaneous abortions.
00:04:32.600 Um...
00:04:33.320 And how that relates to
00:04:35.740 AstraZeneca
00:04:36.780 injections in the trials.
00:04:40.200 And I don't know if it was a certain
00:04:42.220 wording that...
00:04:43.720 Yeah, there it is.
00:04:44.440 Right there, James.
00:04:45.480 Dead on spontaneous abortions.
00:04:47.180 Okay, let's take a look here.
00:04:48.480 So, that's 0:52:58.
00:04:51.740 Technically, if you find that time on Rumble,
00:04:54.340 um...
00:04:54.900 You should be able to...
00:04:55.860 Like, our viewers should actually be able to
00:04:57.760 watch that portion.
00:05:00.140 Exactly.
00:05:00.760 So, that's what I was referring to.
00:05:02.540 It was talking about...
00:05:03.420 Yeah.
00:05:04.280 Um...
00:05:05.080 I don't know if it's exactly because
00:05:07.120 he said
00:05:07.600 it's not safe.
00:05:11.000 That could...
00:05:12.080 That could trigger.
00:05:12.940 That could be...
00:05:13.540 Oh, here we go.
00:05:16.520 Okay, so...
00:05:17.600 But is that really...
00:05:19.320 Is that really narrowing it down, though?
00:05:23.380 No, so probably what this algorithm has...
00:05:26.060 I don't know if these things are, like, um...
00:05:29.360 You know, done by AI or...
00:05:29.940 I don't know if there's a white flag and then it...
00:05:31.900 And then it, like...
00:05:33.300 I don't know if any humans end up going through the process
00:05:35.840 or if it just auto-triggers...
00:05:37.760 So, what we don't want to do is say any words right now, I assume,
00:05:43.340 that start with the letter V
00:05:44.680 or that start with the letter A in this context
00:05:49.260 next to each other
00:05:50.200 because that's probably what...
00:05:52.260 That's probably what it's doing.
00:05:55.060 I don't know.
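The co-occurrence guess above can be made concrete. What follows is a purely hypothetical sketch of the kind of naive keyword-proximity filter the hosts are speculating about; YouTube's actual flagging system is not public, and every detail here is an assumption:

```python
# Purely hypothetical: flag a transcript if any term from one set appears
# within `window` tokens of any term from another set. This is NOT
# YouTube's actual system, whose implementation is unknown.
def flags_transcript(text, terms_a, terms_b, window=10):
    """Return True if a term in terms_a occurs within `window` tokens
    of a term in terms_b."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    positions_a = [i for i, t in enumerate(tokens) if t in terms_a]
    positions_b = [i for i, t in enumerate(tokens) if t in terms_b]
    return any(abs(i - j) <= window
               for i in positions_a for j in positions_b)

# The co-occurrence the hosts guess might be the trigger:
print(flags_transcript("the astrazeneca vaccine data",
                       {"vaccine"}, {"astrazeneca"}))  # True
```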
00:05:56.460 So, okay.
00:05:57.480 So, let's just see here.
00:05:58.660 So, do we want to read policy?
00:06:00.640 Let's try that.
00:06:01.740 Yeah.
00:06:01.960 Let's see what we can do here.
00:06:03.400 Okay, so...
00:06:04.400 Medical misinformation policy.
00:06:05.740 YouTube doesn't allow content
00:06:06.700 that poses a serious risk of egregious harm
00:06:09.040 by spreading medical misinformation
00:06:10.720 that contradicts local health authorities
00:06:12.620 or the World Health Organization's guidance
00:06:14.440 about specific health conditions and substances.
00:06:17.180 This policy includes the following categories.
00:06:19.200 Prevent misinformation...
00:06:21.200 Prevention misinformation...
00:06:23.680 Treatment misinformation...
00:06:25.080 Denial misinformation.
00:06:26.440 Wow.
00:06:27.820 Holy smokes.
00:06:28.460 So, that would include...
00:06:30.620 So, that would include...
00:06:32.620 Okay, preventing is like...
00:06:35.620 Maybe vitamin D falls under...
00:06:39.620 Somebody talking about...
00:06:41.500 Yeah.
00:06:43.060 Well, this is...
00:06:44.200 What makes this extra hilarious
00:06:45.620 is that this medical intervention
00:06:50.380 that we were discussing in that video
00:06:51.880 isn't even on the market anymore.
00:06:53.800 It's not even being...
00:06:54.800 It's not even being...
00:06:55.700 Literally, it was pulled off.
00:06:56.840 Yeah.
00:06:57.000 Yeah.
00:06:57.180 So, like...
00:06:58.920 Oh, this is crazy.
00:06:59.940 Look at this.
00:07:03.140 Denial misinformation.
00:07:04.440 Content that denies the existence of COVID-19
00:07:06.900 or that people have died from COVID-19.
00:07:13.780 Oh, my goodness gracious.
00:07:14.940 So, that is...
00:07:16.500 That...
00:07:16.900 See, that's a very specific, strong language.
00:07:19.940 But I wonder if they...
00:07:21.680 If you said...
00:07:24.000 Oh, there's not as many people as others say.
00:07:27.280 If that would count as denial
00:07:28.660 because you're denying some of it.
00:07:31.460 Like, I don't know how firm...
00:07:32.680 or how exact the language has to be
00:07:35.180 or if this wide umbrella captures a little bit more
00:07:39.200 than we think.
00:07:40.940 Yeah.
00:07:41.700 Yeah.
00:07:43.400 The treatment...
00:07:47.400 I don't know.
00:07:48.760 Like, would that fall under...
00:07:50.360 Well, we haven't really promoted anything alternative.
00:07:53.320 We weren't...
00:07:53.860 We weren't...
00:07:54.560 Yeah, we weren't talking about treatments.
00:07:56.820 Does the...
00:07:58.860 Astra...
00:07:59.780 Does that fall under the prevention, though?
00:08:05.280 It contradicts health authority.
00:08:08.060 Well...
00:08:09.280 We're talking about their own...
00:08:10.160 It's their own study that's...
00:08:11.560 ...contradicting health authority.
00:08:12.660 Yeah.
00:08:13.660 Yeah.
00:08:14.900 Oh, man.
00:08:15.720 Okay.
00:08:16.160 All right.
00:08:16.400 Oh, anything on the safety or efficacy.
00:08:21.140 So, anything...
00:08:21.720 Yeah, that's...
00:08:22.180 If you have anything to say about...
00:08:24.920 Effective or safety...
00:08:27.060 Safe or effective, then...
00:08:28.840 Mm-hmm.
00:08:29.320 And that will...
00:08:29.880 Yeah.
00:08:30.900 Counts as error.
00:08:32.160 Okay.
00:08:32.560 Well, let's go back here.
00:08:34.700 So...
00:08:34.740 Now we're clear.
00:08:35.260 I guess we...
00:08:36.400 We know...
00:08:38.240 We know what not to say.
00:08:40.280 Yeah.
00:08:41.200 Okay.
00:08:41.560 So, what do you want to do?
00:08:42.500 Take a policy training.
00:08:43.620 Your warning will expire in 90 days.
00:08:45.220 To keep our community safe...
00:08:47.260 Remember, James...
00:08:47.980 Your content won't be put back on YouTube.
00:08:50.500 Okay.
00:08:50.720 So, this is a seven...
00:08:51.340 We're literally harming people.
00:08:53.000 We would literally harm people.
00:08:54.540 Yeah.
00:08:54.840 So, we don't want to appeal...
00:08:56.840 Oh, so we can even do that after taking the training.
00:08:58.980 But let's do the...
00:08:59.620 Let's do the policy training first...
00:09:00.980 And decide if we want to appeal after that.
00:09:03.400 Okay.
00:09:03.900 Take training.
00:09:05.680 Okay.
00:09:06.140 Sweet.
00:09:06.680 Let's see.
00:09:08.840 Okay.
00:09:09.220 About the training.
00:09:09.900 Seven questions, 15 minutes.
00:09:11.200 We want you to create and share your content confidently.
00:09:13.920 In this training, you'll see...
00:09:15.200 It was related to the policy you violated.
00:09:17.280 You'll have unlimited attempts to get it right.
00:09:19.160 Oh, well, that's good.
00:09:22.320 Medical misinformation.
00:09:24.120 Vaccine misinformation.
00:09:25.340 While there are different types of medical misinformation, the one we'll focus on most
00:09:28.700 in your training is vaccine misinformation.
00:09:31.780 YouTube doesn't allow content that poses a serious risk of egregious harm by spreading
00:09:35.540 medical misinformation that contradicts local health authorities or the World Health Organization's
00:09:40.040 guidance about specific health conditions and substances.
00:09:42.320 This policy includes the following categories.
00:09:45.460 Oh, this is what we just read.
00:09:47.480 Yeah.
00:09:47.700 What happens if a local health authority has different information or a guideline than the
00:09:56.780 WHO?
00:09:57.980 Doesn't matter.
00:09:58.760 Does the WHO supersede that?
00:10:00.060 Yeah.
00:10:00.840 I would imagine.
00:10:01.820 I would imagine the WHO must supersede that.
00:10:03.480 Because you could have local health authorities like they had in Florida, right?
00:10:07.140 Where the Florida Surgeon General had that.
00:10:10.100 They came up with...
00:10:11.360 Yeah.
00:10:12.180 Yeah.
00:10:12.780 Okay.
00:10:13.200 Let's take a look here.
00:10:14.620 Okay.
00:10:15.980 Question one of seven.
00:10:16.860 A community leader uploads a video in which he urges local residents not to get vaccinated
00:10:22.760 because he believes the vaccines contain cells and tissue samples from aborted babies.
00:10:28.320 Is this a violation?
00:10:29.680 Yes, it is.
00:10:30.120 No, it isn't.
00:10:30.500 Now, isn't...
00:10:31.440 Hmm.
00:10:32.820 How can I say this in a way that it's not going to...
00:10:38.060 Hmm.
00:10:38.880 Trigger this.
00:10:40.200 Um...
00:10:41.240 Yeah.
00:10:42.000 Wasn't there a Project Veritas, uh, like, Pfizer whistleblower that said it was, in fact,
00:10:50.740 the Pfizer vaccine was, in fact, descended from certain cell lines of this thing on the screen?
00:11:01.120 Yeah.
00:11:01.820 So, is the issue...
00:11:04.520 It doesn't actually contain cells.
00:11:06.760 It's cell lines.
00:11:07.580 And, therefore, they're like, well, it depends on how exact you get with...
00:11:15.400 Yeah.
00:11:16.520 Let's just assume it wants us to say yes.
00:11:21.700 Here's the thing.
00:11:22.620 We know all the...
00:11:24.000 We know all the right answers.
00:11:28.580 I imagine we can predict the answers because we know...
00:11:32.120 We know what they've said about these things.
00:11:37.940 Like, any good citizen, we know what the right thing is to say.
00:11:43.420 That's right.
00:11:44.220 That's right.
00:11:44.860 But here's the thing.
00:11:46.320 So, again, you're talking about, like, why...
00:11:49.940 Like, why aren't some people able to have opinions?
00:11:57.160 Or what if you're reporting on somebody saying these things?
00:12:00.380 Would your video get flagged if the person says that in a clip?
00:12:03.760 But you're...
00:12:04.680 What if you're reviewing somebody talking about this?
00:12:06.860 Yeah.
00:12:06.940 So, like, now you don't even...
00:12:09.280 Like, is that going to chill speech on even talking about people having opinions or controversies or anything around that as well?
00:12:18.660 So, like, it depends.
00:12:21.180 These things could be arbitrarily enforced at any point.
00:12:26.420 Yeah.
00:12:27.080 Like...
00:12:27.660 Yeah.
00:12:28.280 I think that's the insidiousness of it.
00:12:33.880 All right.
00:12:34.520 Question two.
00:12:35.760 Jada uploads a video in which she shows 5G cell phone towers in her town.
00:12:39.740 She claims it's no coincidence that, quote,
00:12:42.060 all this talk about COVID and dying, unquote, started just after the towers were built.
00:12:47.180 She says the 5G towers are being used to spread COVID, which proves that COVID is not a virus at all.
00:12:52.840 Is this a violation, James?
00:12:54.340 What do you think?
00:12:56.880 I feel like even if she's wrong, we'd want to hear what she says so we can show that she's wrong.
00:13:03.020 But, yes, it's a violation according to...
00:13:06.180 There you go.
00:13:08.440 Well, because it could cause a greenish harm.
00:13:09.860 Is that a play on words?
00:13:15.460 All right.
00:13:16.360 Question three.
00:13:17.780 Tammy posts a video after getting the influenza vaccine.
00:13:20.300 She shares that she became permanently paralyzed after receiving her seasonal influenza shot.
00:13:24.840 In the video, she warns others not to get the influenza vaccine because, quote,
00:13:29.060 you're all going to suffer the same way I did, unquote.
00:13:32.240 Pretty dramatic language.
00:13:34.260 Is that wrong if she says...
00:13:36.180 If she doesn't say the last line?
00:13:37.900 Right.
00:13:38.420 Yeah.
00:13:38.680 What part of it makes it wrong?
00:13:39.260 If she just talks about her individual case?
00:13:41.380 Yeah.
00:13:42.280 Yeah.
00:13:42.740 I wonder if she just stopped it right here.
00:13:45.880 Is that okay?
00:13:47.380 We'll see.
00:13:48.600 I'm going to assume...
00:13:49.120 What if she said...
00:13:50.520 What if she claims that her injury came from this?
00:13:53.600 Yeah.
00:13:55.020 Is that enough to trigger?
00:13:57.180 That's right!
00:13:58.220 Exclamation mark.
00:13:59.020 This example violates our policy.
00:14:01.920 I feel like we should get a sticker for this.
00:14:05.900 We do not allow claims that vaccines cause chronic side effects,
00:14:10.320 outside of rare side effects that are recognized by health authorities.
00:14:14.240 Okay.
00:14:14.480 Well, what if it's a rare chronic side effect?
00:14:18.560 While we may make exceptions for content in which creators describe their or their family's
00:14:24.060 first-hand experiences, we recognize there is a difference between sharing personal experiences
00:14:28.760 and promoting misinformation about vaccines.
00:14:31.660 So it's misinformation promoting if you talk about things that are, in fact, rare side
00:14:38.500 effects that do actually happen.
00:14:40.320 You're misinforming people by telling them things that do, in fact, actually happen.
00:14:43.700 As soon as you say, oh, we thought this was rare, but it's not rare, that is misinformation
00:14:48.880 cause it's...
00:14:49.820 There you go.
00:14:50.460 Yeah.
00:14:50.780 ...questioning.
00:14:51.380 What exactly does rare mean?
00:14:53.080 Yeah.
00:12:54.060 How, how, how, uh, what would the case incidence have to be for it to still be considered
00:15:01.280 rare, right?
00:15:02.940 All right.
00:15:03.820 I hate this question.
00:15:04.980 Okay.
00:15:05.320 Question four.
00:15:06.400 Pablo (diversity!) uploads a video in which he criticizes a university's requirement that
00:15:12.040 all incoming students be vaccinated against certain diseases like polio or measles, mumps,
00:15:16.520 and rubella.
00:15:17.700 In the video, Pablo says that these vaccines don't work because they don't reduce the number
00:15:21.900 of people who get diseases.
00:15:23.140 Again, these seem like they are, the examples are very exaggerated.
00:15:30.440 Yeah.
00:15:30.600 Like it's, um, obviously you can find people with like, not a well nuanced conversation,
00:15:38.440 but like, I've seen many around this that maybe would doubt the significance or the amount
00:15:46.180 that affected and like, well, what were the other kind of sanitary and health conditions
00:15:50.740 leading up to that, that maybe had a play into it.
00:15:53.840 So like, you could always find somebody with this kind of like simple view, but, um,
00:16:00.640 this example violates our policy because Pablo claims that vaccines don't reduce the contraction
00:16:10.380 of diseases.
00:16:11.420 Okay.
00:16:12.900 Content claiming that vaccines do not reduce transmission or contraction of disease is not
00:16:17.060 allowed on YouTube because it contradicts information from health authorities and could
00:16:20.480 cause egregious harm.
00:16:21.300 All right.
00:16:22.060 So what about during, James, I can't even say it.
00:16:26.060 We're going to get flagged again for this flagging video.
00:16:28.900 What about if you have a health authority?
00:16:30.440 I think this whole video is going to.
00:16:32.740 What if you have a health authority that, oh, I don't know.
00:16:37.740 I just, I'm not going to say it.
00:16:39.300 What, what vaccine, what transmission reduction and contraction, like what percentages, what
00:16:47.860 give me a number?
00:16:49.720 Like, are we going to say that every vaccine is 100% effective?
00:16:53.460 I don't, I don't even think, I don't think Anthony Fauci would say that.
00:16:57.460 They're, they all work.
00:16:59.280 They're, they're all safe.
00:17:01.280 They're all safe.
00:17:02.280 They're all effective.
00:17:02.700 And just, and just shut up and take it.
00:17:05.860 Don't question.
00:17:06.680 Don't think.
00:17:07.280 Don't.
00:17:07.940 Nope.
00:17:08.880 Nope.
00:17:09.120 Let's learn about Yvonne now.
00:17:10.420 Question five of seven.
00:17:11.880 Yvonne and her friends live stream a discussion about COVID-19.
00:17:15.740 Yvonne says that the most tragic part of the pandemic was that some government leaders
00:17:19.380 failed to respond quickly enough.
00:17:21.500 She says, quote, this never would have happened if the government made the right decisions.
00:17:25.360 The pandemic is the government's fault.
00:17:27.400 Unquote.
00:17:28.740 Interesting question.
00:17:29.980 Interesting question.
00:17:30.620 Because this is a very specifically worded question too.
00:17:35.160 Like I, I didn't expect this one actually.
00:17:37.440 This is tough because this is the type of person that YouTube likes.
00:17:40.300 I think this is, this is the type of, uh, like this person I don't think would have,
00:17:45.600 I don't actually, I legitimately, legitimately don't know how to answer the question.
00:17:49.920 Because there, there are people, you, you could find somebody, this just sums it up.
00:17:55.320 You could find somebody on like either side of, there's some that say should have locked
00:17:59.520 down harder and they didn't do enough.
00:18:01.760 And the harm, like all the other stuff, the reason it went on for four years is because
00:18:06.240 they didn't shut down hard enough and fast enough.
00:18:09.760 So.
00:18:10.260 I'm going to say no.
00:18:11.200 Let's say no.
00:18:13.260 What do you think?
00:18:14.020 Okay.
00:18:19.920 We have unlimited chances.
00:18:23.220 It doesn't really matter, but we have unlimited, okay.
00:18:26.040 Maybe, maybe they're trying to trick us.
00:18:28.080 Maybe they're trying to.
00:18:30.240 That's correct.
00:18:31.240 This example does not violate our policy because Yvonne doesn't deny the existence of the COVID-19
00:18:36.760 pandemic.
00:18:39.400 Oh, that's amazing.
00:18:41.100 Props, props to YouTube for having just a little bit of nuance.
00:18:44.360 Well, but, but let's, let's take this example.
00:18:47.480 Let's take this thought experiment a little bit further.
00:18:49.600 The pandemic is the government's fault in this quote.
00:18:52.960 Okay.
00:18:53.780 So does that mean that it's okay to say that, uh, Anthony Fauci through a third party paid,
00:19:00.100 uh, a biomedical lab in China to create the COVID-19 virus?
00:19:06.060 That would be the government's fault.
00:19:07.560 Would it not?
00:19:08.020 Yeah, it doesn't deny, doesn't deny the pandemic.
00:19:14.140 It just, yeah.
00:19:15.080 But we already know that people talk about the origin.
00:19:17.720 Well, maybe not anymore, but at least they used to anyway.
00:19:21.320 Yeah.
00:19:21.500 I wonder, Hey, we're yeah.
00:19:23.000 How much does that change?
00:19:25.100 Ooh, another diverse individual.
00:19:28.140 Jing is a nurse who makes educational health content in her latest video.
00:19:32.600 She shares that since getting the COVID-19 vaccine last year, she has had three miscarriages.
00:19:37.040 She says that the COVID-19 vaccine made her infertile.
00:19:41.660 Select all the correct answers.
00:19:43.480 Interesting.
00:19:45.000 Okay.
00:19:45.740 This video doesn't violate the policy because Jing is a medical professional.
00:19:49.300 This video doesn't violate the policy because Jing is allowed to share her personal experience.
00:19:53.800 This video doesn't violate the policy because Jing is not spreading medical misinformation.
00:19:59.220 Okay.
00:19:59.880 So the last line is what's going to trip it.
00:20:03.380 She says that the COVID-19 vaccine made her infertile.
00:20:10.480 This is a good question.
00:20:13.080 I'm trying to put my brain in the head of the 20-year-old that wrote this question.
00:20:19.560 And this does seem like something they would say because, you know, unless you're a, you know, she makes educational health content.
00:20:31.320 So she must have an active, you know, account that isn't banned from her, you know, putting, you know, anti-vaccine stuff off.
00:20:38.040 We have to assume that, I presume, right?
00:20:42.200 Yeah.
00:20:42.680 So I'm going to say that none of those are correct.
00:20:46.520 Not even this one?
00:20:47.680 Because...
00:20:49.560 No, because her personal experience is making a claim about the safety of the shot.
00:20:57.240 Yeah.
00:20:58.540 Interesting.
00:21:00.280 That's my...
00:21:01.160 Okay, so like none of them?
00:21:03.100 It's because she's making a wide statement outside of her...
00:21:08.160 Like, it's, you know, the rare side effect kind of thing?
00:21:12.120 She's making it seem...
00:21:13.500 I don't know.
00:21:14.900 That's my...
00:21:16.620 Yeah, interesting because...
00:21:17.500 Unless you think otherwise.
00:21:19.560 Well, they're going to say...
00:21:20.820 They're going to say that she can't know that it was the vaccine that made her infertile, right?
00:21:26.260 Okay, let's select none of them.
00:21:30.200 Oh.
00:21:31.060 Oh, really?
00:21:31.640 It didn't allow me to do that.
00:21:32.580 Okay, okay.
00:21:33.360 Maybe that's...
00:21:34.020 Let's try this.
00:21:34.720 Let's try this one.
00:21:35.440 I'm going to try this one.
00:21:40.200 Mike, I thought we were really going to 100% this.
00:21:43.460 I know.
00:21:44.020 I'm so...
00:21:44.400 I feel so bad.
00:21:45.440 That's correct.
00:21:46.000 We don't allow content that claims that an approved COVID-19 vaccine causes infertility
00:21:49.880 because that contradicts information from health authorities.
00:21:53.040 Unless, of course, it's AstraZeneca reporting their own data and could cause egregious harm.
00:22:00.520 Honestly, it could cause egregious harm.
00:22:03.120 However, we may make exceptions for content.
00:22:05.120 It's egregious when things are spontaneously reported.
00:22:08.720 Yeah, when certain things spontaneously...
00:22:11.420 Are rejected.
00:22:16.000 Rejected, yeah.
00:22:16.620 You know when you just decide that you're just going to just terminate the call.
00:22:23.120 However, we may make exceptions for content like this where the creator describes their
00:22:26.680 or their family's first-hand...
00:22:27.840 Yeah, okay.
00:22:28.540 Okay.
00:22:29.880 Good question.
00:22:31.040 So maybe...
00:22:31.680 Maybe it didn't...
00:22:32.420 Because it didn't say all vaccine...
00:22:35.020 Like, it wasn't a...
00:22:36.340 It was still tied into her personal...
00:22:39.420 Yeah.
00:22:40.720 Yeah.
00:22:41.100 ...experience.
00:22:42.720 Oh, okay.
00:22:43.780 Sehar.
00:22:44.880 I'm going to guess.
00:22:46.140 Sehar uploaded a video saying that she still has stage 4 lung cancer after several rounds
00:22:50.740 of chemotherapy.
00:22:52.100 So it's clear that chemotherapy did not work for them.
00:22:55.920 Oh, Sehar is a they-them.
00:22:57.540 Okay.
00:22:57.980 That's good, too.
00:22:58.780 This is...
00:22:59.440 Honestly, these questions just keep getting more diverse.
00:23:01.560 I'm just...
00:23:02.240 I'm such a fan of diversity in these questions.
00:23:04.500 Actually, we should have gone through and double-checked all the...
00:23:08.140 We should have sampled all the races and genders.
00:23:11.020 I think there was only one...
00:23:12.520 I think there was only one male name, and that was Pablo.
00:23:16.420 So that's okay, because that was diverse.
00:23:18.660 Okay.
00:23:19.100 This video doesn't violate the policy because stage 4 lung cancer is incurable.
00:23:23.360 This video doesn't violate the policy because Sehar is allowed to share...
00:23:26.420 Oh, now she's a her.
00:23:27.840 She's a her-them.
00:23:29.520 To share her personal experience with approved cancer treatments.
00:23:32.860 This video doesn't violate the policy because everyone knows that chemotherapy doesn't work.
00:23:37.020 Like, this video violates the policy because Sehar is discouraging others from getting chemotherapy.
00:23:42.620 Interesting.
00:23:43.380 There's nothing directly discouraging others.
00:23:45.640 No.
00:23:45.960 No.
00:23:46.280 It's this one, I'm sure.
00:23:48.320 Yeah.
00:23:48.800 Yeah.
00:23:49.400 Okay.
00:23:49.800 Yeah.
00:23:50.020 Try it.
00:23:50.840 Because, well, it doesn't have 100% efficacy anyways.
00:23:55.680 Yeah.
00:23:56.260 Yeah.
00:23:56.520 Okay.
00:23:56.720 So no treatment or no thing is like 100%, so...
00:24:01.240 Yeah.
00:24:01.820 Yeah.
00:24:02.520 Okay.
00:24:03.200 Sweet.
00:24:03.320 I guess they're less concerned about the cancer sphere of information.
00:24:07.020 Right now.
00:24:08.060 And other spheres.
00:24:10.040 Nice work.
00:24:11.480 Nice work, James.
00:24:12.380 You've completed the training.
00:24:13.900 We appreciate you taking the time to help us keep YouTube safe.
00:24:18.380 So our warning will expire September 24th.
00:24:20.840 Wow, that's a long time.
00:24:21.600 I guess that's 90 days.
00:24:23.420 So we better not violate this medical misinformation policy again, otherwise we'll get a strike.
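The dates can be checked with simple calendar arithmetic. A quick sketch, assuming the dashboard counts plain calendar days (an assumption): an expiry of September 24, 2024 implies the warning was issued around June 26, a few days before this June 29 recording.

```python
from datetime import date, timedelta

# Assuming a plain 90-calendar-day window: an expiry of September 24, 2024
# implies the warning was issued around June 26.
expiry = date(2024, 9, 24)
issued = expiry - timedelta(days=90)
print(issued)  # 2024-06-26

# Counting from the recording date itself would land a few days later:
print(date(2024, 6, 29) + timedelta(days=90))  # 2024-09-27
```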
00:24:28.120 Here's the thing.
00:24:28.920 Do you think...
00:24:30.240 What if this video triggers the same medical misinformation?
00:24:34.840 Yeah.
00:24:35.280 If we violate a different policy, however, we'll get another warning with the opportunity
00:24:40.240 to take another training.
00:24:41.860 Okay.
00:24:42.920 Interesting.
00:24:44.320 So...
00:24:44.600 Here's my prediction.
00:24:45.340 Should we appeal?
00:24:45.740 If we...
00:24:46.740 Um...
00:24:48.740 I don't...
00:24:52.200 I don't...
00:24:53.200 I don't know.
00:24:54.920 It depends on...
00:24:55.200 I'm almost tempted to...
00:24:57.060 Because I want to see...
00:24:58.580 Like, I want to see if we can get a response from a human saying what exactly our violation
00:25:05.520 was by reporting on AstraZeneca's own policy.
00:25:10.300 Um...
00:25:12.060 Or test data, rather.
00:25:14.700 Here's...
00:25:15.260 Here's the thing.
00:25:15.900 All it would take is for them to find a moment of Sheldon saying they don't work.
00:25:23.940 That, like, he did make a general claim to, like, it's like, they don't work.
00:25:27.720 We know they don't work, and...
00:25:30.520 Yeah.
00:25:31.360 Technically, that would align with their policy of a claim on the safety and efficacy.
00:25:36.500 And that's enough for them to, as we go through, that's...
00:25:41.720 Even if it's AstraZeneca's own data, it's, like, he maybe didn't state...
00:25:49.300 Have the exact statement of talking about AstraZeneca at that very moment.
00:25:54.300 If it's...
00:25:55.300 If it was pulled back and it was more general, then that would be claimed as a...
00:25:59.420 A statement against the WHO at that point.
00:26:02.660 Yeah, right.
00:26:03.580 Yeah.
00:26:04.600 Okay.
00:26:05.520 Okay.
00:26:06.140 Well, here, why don't we do this?
00:26:09.640 Let's...
00:26:10.120 Not appeal now.
00:26:11.240 We could ask for these...
00:26:12.660 We could always ask for the exact...
00:26:14.360 Like, what was the exact statement?
00:26:17.540 Yeah.
00:26:17.880 And see if they...
00:26:19.020 Well, I think...
00:26:19.980 So, based on what you said, why don't we not appeal now?
00:26:23.460 We'll upload this video.
00:26:25.880 See if this one gets flagged.
00:26:27.700 If it does not, then we can, you know, take it from there.
00:26:32.820 If it does as well, that's when we're going to be at risk of a strike, I guess.
00:26:37.200 Or we'll just get an automatic strike.
00:26:39.240 Like, and then we should appeal one or both of them to...
00:26:42.620 This is getting really, like, meta at this point of, like, we're going to appeal the video
00:26:46.640 or we're talking about the original video that we didn't appeal, but, like, you know what
00:26:50.700 I mean, right?
00:26:52.340 Yeah.
00:26:52.820 So, I think if we do get a strike, it goes away after 90 days.
00:26:57.180 Right.
00:26:57.580 So, I think you can...
00:26:59.280 There's a cool-down period.
00:27:01.780 But, so...
00:27:03.240 Wait.
00:27:04.000 Yeah, I'm curious.
00:27:04.680 Are you sure you want to leave?
00:27:08.780 Okay, we already did this.
00:27:09.940 Oh, if we...
00:27:11.200 Okay.
00:27:12.240 Leave a...
00:27:12.900 Oh, okay.
00:27:14.520 Wait, we'll just close it.
00:27:16.500 Okay.
00:27:17.200 Cool.
00:27:17.720 Well...
00:27:18.040 But, is it...
00:27:21.420 Are we still...
00:27:24.160 Did it...
00:27:24.540 Did that actually take?
00:27:26.680 Yeah, training completed.
00:27:28.460 Training completed.
00:27:29.340 There we go.
00:27:29.940 Yeah.
00:27:30.560 There we go.
00:27:32.080 So, I suppose this probably just stays on here until it falls off.
00:27:37.180 Yeah.
00:27:38.200 Okay.
00:27:39.380 Well, there we go.
00:27:39.980 I'm curious if we'll get double flagged if there's discussion on...
00:27:44.980 Yeah, that'll be interesting.
00:27:48.980 Cool.
00:27:49.640 Well, how...
00:27:50.560 Do you feel...
00:27:51.520 What do you feel...
00:27:53.700 Like, do you feel like we probably got like a...
00:27:55.260 That was probably a diploma.
00:27:56.380 I wouldn't say that that was a...
00:27:57.760 Or maybe a certificate.
00:27:58.860 Maybe not a diploma.
00:28:00.260 I would say I have at least a certificate now in medical misinformation policy.
00:28:06.040 I feel like we were already experts in that.
00:28:09.340 In creating misinformation.
00:28:12.260 According to their...
00:28:13.900 According to their definitions, but...
00:28:17.640 I was actually...
00:28:18.660 The thing that actually disturbed me the most was how Seyhard turned from a they-them to a she in the question.
00:28:24.620 I feel like that was actually probably...
00:28:26.100 They probably misgendered Seyhard in their own questioning.
00:28:29.340 Could be a she-slash-they.
00:28:31.380 So...
00:28:31.700 Yeah.
00:28:32.020 And that gives you the flexibility there.
00:28:34.680 Yeah.
00:28:37.940 Yeah, I guess...
00:28:39.900 It is that month.
00:28:41.680 YouTube should be more careful during this month.
00:28:43.900 That's right.
00:28:44.520 Be more inclusive that way.
00:28:45.800 Right.
00:28:46.800 Well, okay.
00:28:47.800 Well, James, that's...
00:28:49.360 I think we'll keep her at about, you know, 30-ish minutes here for this short episode on our...
00:28:54.900 Documenting our YouTube re-education.
00:28:57.180 A short re-education, man.
00:28:58.840 Yeah.
00:28:58.960 I'd like to thank Sundar Pichai for this opportunity to expand my mind.
00:29:09.540 And I feel...
00:29:12.020 Honestly, I just feel more...
00:29:14.860 I just feel like my appreciation for global health authorities has grown.
00:29:21.380 And I regret my actions.
00:29:24.540 I don't know about you.
00:29:25.320 I don't know if you still are going to insist on being...
00:29:28.320 You know, having this tendency to question things and to, you know, come to certain conclusions based on changing data.
00:29:34.920 But, you know, for me, at least, I don't...
00:29:38.160 I don't see the benefit in that.
00:29:39.880 If we're going to get strikes, you know, I think I'm just going to have to...
00:29:43.780 I think I'm going to have to just, you know, stick to what I'm told from now on.
00:29:47.960 Yeah, it's...
00:29:50.720 I won't say anything but one thing, but I'm going to avoid the sun.
00:29:54.440 I'm going to only eat three ounces of red meat per week.
00:29:57.920 Twelve servings of grains a day, remember?
00:30:01.320 Yeah, today I ate five times my daily allotted amount of red meat, so...
00:30:08.200 I had a bowl of ground beef for dinner before coming on this call, so I think we're both...
00:30:14.920 We're in the negatives.
00:30:16.240 Our beef quota is...
00:30:19.440 We have to, like, abstain from it for the next two months just to make up for the...
00:30:25.420 For the methane that we've...
00:30:27.280 That our behavior has injected into the atmosphere.
00:30:29.980 Okay, James, let's quit while we're ahead here.
00:30:32.260 Thanks for your time today.
00:30:34.840 Appreciate you undergoing this re-education with me, and we'll see you guys in probably a week or so with another dangerous take, probably.
00:30:44.880 Yeah, see you in the next one.
00:30:46.560 Cheers, cheers.
00:30:46.940 Cheers.