The Critical Compass Podcast - March 20, 2024


Bill C-63, Winnipeg Lab Leak Updates | A Critical Compass Discussion


Episode Stats

Length

59 minutes

Words per Minute

138.4

Word Count

8,269

Sentence Count

366

Misogynist Sentences

2

Hate Speech Sentences

8


Summary

In this week's episode, we discuss Bill C-63, a new bill that would make online service providers accountable for hate speech, child abuse material, revenge porn, and other offences committed using someone's likeness. We also talk about how the bill could change the way hate speech is defined and policed.


Transcript

00:00:00.000 So if there are truly bad ideas, we'd want to know what these bad ideas are, who's advocating for them, and we wouldn't want people with bad ideas just hiding in a corner.
00:00:11.060 We want them in the public sphere.
00:00:15.380 Yeah, that's the thing is that bad ideas don't go away by being banned on Twitter.
00:00:22.080 They just go underground and they fester and they become even more potentially influential movements than if they were just allowed to be in the public sphere so people can accurately assess them and the people who are voicing them.
00:00:38.280 Well, let's make a little equation.
00:00:39.940 Bad ideas plus isolation plus resentment equals some very bad reactions in the end.
00:00:47.400 So I think if you exclude people like that, now they have a purpose for, or proof of, their bad ideas as well.
00:01:14.060 Hello, welcome to this week's Critical Compass.
00:01:16.720 This is the week of March the 4th, and we're recording on March the 5th, talking about some news that happened last week, and early this week, actually, too.
00:01:28.540 There's been some pretty notable things developing in Canadian politics.
00:01:36.520 So we've got a couple main topics today.
00:01:39.000 James would like to lead us off with some news regarding a new bill, Bill C-63, I believe it is, and James has been looking into this, and why don't you take it away, sir?
00:01:53.420 Yeah, so Bill C-63, it's a new bill.
00:01:57.860 Well, there's a lot there, and there's kind of two pillars we have to kind of understand.
00:02:05.220 It's almost like a Trojan horse for some bad ideas, and there are things that would need to be changed if this were to pass, or it's going to create problems; it would not be a good use of our parliamentary system to have this bill as written.
00:02:23.680 Hold on, James. Sorry, I've got to interrupt you here. Are you suggesting that the current Trudeau liberals would introduce a piece of legislation that has ulterior motives?
00:02:35.340 What I'm trying to say is that if you have a few things you really want to get in, you would not introduce a bill with some extremely heavy-handed other elements.
00:02:49.920 So, like, you would want it to go through, so wouldn't you make something as, like, clear and focused as possible?
00:02:57.680 Like, that's kind of what I'm asking here, what I'm suggesting.
00:03:02.020 So, the first pillar, I think, is something that pretty much everybody can agree on is the child abuse material distribution, revenge porn, and amendments for, like, AI-generated content using somebody's likeness.
00:03:20.880 All currently criminal acts in some form or another; what's changing is the enforcement.
00:03:28.940 This bill will make online service providers have to report, and it gives a way of taking down this material and or keeping these service providers accountable.
00:03:43.700 So, putting some accountability, some responsibility on the platform hosts.
00:03:51.420 Yeah.
00:03:52.120 So, they can't really hide behind this notion of, currently, most platforms, they say that they're not a publisher because they're not the editors of the content.
00:04:06.300 They are providing a space for people to upload.
00:04:09.440 So, since they don't create it directly, that has been the excuse for certain platforms to let some material slide.
00:04:18.400 Or maybe they have, like, a lapse period in how long it takes for something to actually be reported or taken down.
00:04:25.440 So, that part of the bill, I think, is excellent.
00:04:30.020 Like, we should definitely give more tools to help control and punish and actually follow up and prevent this sort of content from existing.
00:04:44.180 Especially with the advent of AI-powered tools where you can take a single photo and AI can generate video from that.
00:04:53.960 So, you can see how that can supercharge revenge porn like we haven't seen before.
00:05:05.160 Easily.
00:05:06.180 Easily, yeah.
00:05:07.480 So, that side of the bill is not controversial.
00:05:11.040 The other side is making amendments to hate speech.
00:05:16.760 And the problem with hate speech is it really depends on how you classify hate speech.
00:05:23.960 If it's – if you're using the definition of detestable language, I'm like, well, that's a very – that's a large umbrella.
00:05:34.800 And if that was the case, then –
00:05:36.960 Yeah, detestable to whom?
00:05:39.020 Yeah, there's a subjective nature.
00:05:40.440 So, the people advocating for these amendments to hate speech don't realize that they can easily be somebody who is spreading hate.
00:05:55.400 So, even one idea is that, well, during the pandemic, you had some very hateful statements against people who maybe had doubts about the efficacy of lockdowns or who refused to take any of the COVID shots.
00:06:18.080 It doesn't take that long to see even, like, news articles with demonizing language.
00:06:25.620 There's even some repositories, like, showing dozens and dozens of hateful published news stories.
00:06:31.840 So, if that's detestable, if that's focusing on a group, and if that's classifying them as subhuman, under this bill, they could be charged with, like, $20,000 in fines or greater for those instances of hate speech.
00:06:48.480 So, I think the people supporting that part of the bill don't realize how it can be turned against them.
00:06:57.020 Another aspect of it is the focus on targeted – like, on specific groups.
00:07:04.160 It's the blaming groups for – part of that hate speech would be blaming a specific group for the ills of society.
00:07:16.520 Here's the article you're talking about, I believe.
00:07:18.480 Oh, that is the – yeah, that's one of the – Toronto Star has a few.
00:07:24.480 I have no empathy left for the willfully unvaccinated.
00:07:27.280 Let them die.
00:07:29.360 So, the thing with that, that article, if you look into it, they took – they went through Twitter and – like, Twitter and social media and found statements, like, the most outrageous statements,
00:07:46.300 and then posted it on there, almost like it was a popular, like, commonly held kind of ideas of the majority.
00:07:57.700 Yeah.
00:07:58.700 So, it wasn't even that those were, like, a good representation of that.
00:08:04.460 So, the other part of this hate speech is that making targeted statements about the group – about a group being responsible for the ills of society.
00:08:19.560 Well, like, that has a lot of historical parallels.
00:08:23.900 Pretty much any massive violent event in history – the spearhead of that is a focus on one group, having a scapegoat for the problems,
00:08:40.380 where these problems are often multifactorial and more complex than that.
00:08:48.180 I think it requires – that's part of what it requires, actually, to have a kind of a social mob reaction to a – to that sort of – that sort of input.
00:09:00.600 I think you need that, actually.
00:09:01.700 Yeah.
00:09:02.700 Yeah.
00:09:03.700 The problem is with this bill and this language is the focus on a group and blaming them for society's ills.
00:09:12.100 Well, we've talked about DEI training, diversity, equity, and inclusion, and the cornerstone of that is blaming society's problems on the – like, on white people and men and able-bodied people and straight people.
00:09:34.360 And men, again.
00:09:35.780 Second time.
00:09:36.120 And men, again.
00:09:36.840 Yeah.
00:09:37.240 Double up on men.
00:09:37.960 And so, in this case, DEI training would technically be hate speech under this new law.
00:09:46.880 Like, I don't know that anybody would actually go against it – well, maybe they wouldn't be the first ones to be targeted with this.
00:09:58.460 They'd find some way to pretzel their way out of it.
00:10:00.960 Yeah.
00:10:01.560 But the other part of this legislation is anonymous reporting.
00:10:05.300 So, it's enabling anonymous reporting for a wide umbrella of hate speech, with massive fines.
00:10:15.080 And even if –
00:10:15.900 And jail time, too, I think I remember reading, right?
00:10:17.560 There's – for calls to genocide or something like that, right?
00:10:21.440 Yeah, there can be jail time.
00:10:24.380 There's also a component of it for future crime.
00:10:31.480 Like, if somebody is suspected to spread hate or – if they're going to spread hate in the future, if they're suspected, then they are – they can be sentenced to house arrest and a curfew.
00:10:51.660 And this language is, like, absolutely –
00:10:56.660 Orwellian.
00:10:58.240 Yeah, it is – it's extreme.
00:11:02.780 And so, what I don't know is that – are the people submitting this bill, do they truly believe this is a good wording and this is a – this is the kind of language that will provide clarity in our society for actually dealing with some of these problems?
00:11:22.480 Or are they overreaching that it gets pulled back to something still Orwellian but, like, not as crazy?
00:11:31.580 Yeah, that's interesting you say that because there's a – kind of a – Chris Voss, the guy who wrote Never Split the Difference, he's the – he's – I don't know if you know him, the former FBI –
00:11:45.660 Is that a negotiating book?
00:11:47.400 Yeah, yeah, yeah.
00:11:48.320 He was a former FBI hostage negotiator who's turned into – his second career now is kind of like a business consultant advisor.
00:11:56.960 Uh, that's a – that's a technique that he talks about that is used a lot – I don't – he didn't invent it or anything but he describes it in the book where, um, in a negotiation one – one technique to kind of get to where you want is to, um, throw out a – kind of a ridiculous number.
00:12:15.640 Uh, say, you know, in a – in a monetary negotiation, you throw out a ridiculous number to recalibrate your opponent's expectations subconsciously, even though you both know you're not going to end up there.
00:12:27.000 And then when you settle, you know, at a more reasonable number, it's still unreasonable because you've been recalibrated by the initial overreach.
00:12:34.600 So, it's interesting, uh, interesting note.
00:12:38.240 It's the same principle of why there's $100 bottles of wine sitting next to $30 bottles of wine.
00:12:44.140 Or you're like, well, that $30 bottle of wine doesn't seem so much when you compare it to the one right next to it.
00:12:50.120 Yeah.
00:12:50.260 Uh, so it does reframe.
00:12:51.800 It's, like, an inflation of – of these ideas and how extreme these ideas are.
00:12:58.320 And I think we're slowly getting pushed into the way that we're, like – these extreme ideas are slowly being normalized in a way that they – they shouldn't.
00:13:08.540 The – even the idea of being fully canceled or having social media completely wiped or deleted, that used to be quite rare and reserved for the most extreme cases.
00:13:22.060 And it doesn't take much for somebody to get, uh, deleted off one of these platforms.
00:13:26.820 Um, part of it is, like, obviously, you have top-down approaches from whatever laws are implemented by a government in a region.
00:13:39.080 But then you've got this other layer.
00:13:42.220 It adds a little bit of complexity: a platform like Facebook or Instagram, Twitter, or even YouTube has its own code of conduct,
00:13:52.740 which is partially informed by the major countries they operate out of, and they are kind of self-policing and self-censoring based on those policies as well.
00:14:06.320 And then you have self-censorship.
00:14:08.340 Like, the bottom layer is self-censorship of – of people censoring themselves through – maybe you think of it as a – there's a social layer to it of, like,
00:14:18.420 well, I don't want to offend people, so they're self-censoring that way.
00:14:22.120 I don't want to get canceled, so they're self-censoring that way.
00:14:25.260 Um, and even, let's say somebody's a creator on a platform, they're going to self-censor because they're going to align with the monetary incentive.
00:14:33.820 So, even if they're not worried about offending anybody, there's still going to be a tweak, and maybe they're not saying certain words,
00:14:44.100 or they're changing the way that they approach things just to stay on a platform.
00:14:48.200 Yeah, and if – and if legislation like this has the effect that we just talked about, um, really what – what'll happen is that'll just kind of further shrink an already very small Overton window.
00:15:00.620 And when you combine that with what you say, you've got a perhaps noble intent of not wanting to cause offense,
00:15:11.480 and then a more selfish intent of not wanting to be canceled, with a third aspect: if you're a creator,
00:15:23.160 or if you're somebody with any sort of public presence at all on any social media, not wanting to have your
00:15:30.600 potential income restricted. So it's a perfect storm on all fronts, if that's what your goal is.
00:15:38.720 Yeah, it's – it's not – it's not setting up a space where citizens feel comfortable talking about any specific issue.
00:15:48.900 Um, because identity politics are interwoven pretty much into every policy and issue now,
00:15:56.220 it's difficult to say something that couldn't be interpreted as, as hate.
00:16:03.180 So, for example, there are parents that do not want to enable their child to go through a medical intervention at a young age for cosmetic reasons.
00:16:17.220 Um, this kind of dovetails into that – into some of the discussions we've already had on – on – on the transgender debate.
00:16:27.100 Um, but the fact that parents have lost custody of their children because they haven't affirmed their children's identity,
00:16:37.260 um, where in this case, they're maybe saying, let's go, uh, let's try some counseling, let's try to wait it out,
00:16:46.180 let's try to look at other options instead of, like, a hormonal or surgical intervention.
00:16:54.320 In those cases, they would be deemed under this law hateful and/or transphobic
00:17:00.720 for not affirming their child's identity.
00:17:04.860 Yeah.
00:17:05.340 So, as the umbrella gets wider and wider,
00:17:13.560 the chances of you committing hate speech get higher and higher, like, every day.
00:17:22.780 Oh, it reaches nearly 100% if – if you want to be involved in any of these discussions at all,
00:17:27.160 which, uh, Jordan Peterson said this years ago, you know, in – in order to be able to think,
00:17:33.700 you have to be able to – you have to be willing to risk causing offense.
00:17:38.400 How – how else is anyone ever supposed to be able to sort out the good from the bad ideas
00:17:43.500 if you can't, you know, risk having a bad idea or, uh, you know, having your idea challenged at all,
00:17:50.800 which is what the, you know, really what – what the goal of – of a lot of people is in this situation,
00:17:57.380 is to not have any of their current ideas challenged.
00:18:01.360 Either because they know that they're bad ideas or they know that they can't withstand any criticism.
00:18:07.240 These are the same people who view all of society through power dynamics –
00:18:15.020 oppressor versus oppressed –
00:18:20.660 they're viewing everything through that, which means their policies and what they're advocating is
00:18:27.880 through the same power dynamics.
00:18:31.440 It's a reversal of power dynamics.
00:18:34.040 And I think when you get a case where they truly believe what they're doing is right,
00:18:40.040 why would they leave room for people to disagree with that?
00:18:46.180 They – they – like –
00:18:48.300 Because they're right.
00:18:49.660 They – they have the –
00:18:50.800 They have a moral duty to shut down dissent.
00:18:53.400 They have a moral imperative, yeah.
00:18:54.320 That's right, yeah.
00:18:55.460 I was reading a quote today.
00:18:57.140 I was – I don't know why it came to my head, but I was trying to find the source of the –
00:19:00.220 of the quote,
the road to hell is paved with good intentions.
00:19:04.420 And it's – long story short, it's very hard to nail down the actual source of this quote.
00:19:11.320 It's been edited and changed a little bit over the years,
00:19:14.700 and different religions and print copies in different languages have different versions of it,
00:19:21.420 but the intent kind of stays the same.
00:19:23.820 So that's – that's what this sort of feels like to me anyway.
00:19:27.100 Yeah.
Well, it is the thing where it's hard to discern how much is good intentions,
00:19:36.840 and then there is always a malice element that can appear,
00:19:40.680 or a revenge element of, like, well, we're the good guys, they're the bad guys,
00:19:46.080 they did us dirty in this way,
00:19:49.280 so we have to fix it with a heavy-handed approach.
00:19:58.640 The problem is the people who support these heavy-handed approaches
00:20:03.640 or these kind of amendments that shut down debate
00:20:08.720 don't realize that it can quickly become weaponized against them.
00:20:16.080 And history shows that it does, too.
00:20:18.520 Yeah, they never think about the – what's going to happen when the social winds shift
00:20:25.540 and these rules are put in place.
00:20:28.080 I've had this discussion with people before regarding – you know, this is a –
00:20:33.940 maybe will lead us into our next topic, actually,
00:20:35.980 but when the COVID mandates came out,
00:20:42.060 the QR codes and other things associated with being – needing to show proof of certain medications,
00:20:51.080 you know, or medical interventions, anyway, to enter certain establishments.
00:20:55.800 You know, you've got to think about it in the way of, well, you might be in favor of this one,
00:21:02.020 but what happens when something else comes along that you're not in favor of?
00:21:05.100 What happens when you want to have a discussion of maybe regarding the government's role to play in abortion,
00:21:15.460 for example?
00:21:16.060 And if you open the door for the government being able to demand certain things or restrict certain things from you
00:21:24.480 because of their particular view on a particular medical intervention at a particular time,
you don't have so much of a leg to stand on, logically, when it comes to something that you disagree with, do you?
00:21:37.860 Like, that's hard to – that's hard to get people to reframe like that.
00:21:43.040 Yeah, the last four years have really shifted what things may be drawn upon in the future.
00:21:53.460 So lockdowns are potentially a tool that may be used again.
00:22:00.720 They may try the QR codes again for entry into places.
00:22:07.400 So that's something –
00:22:09.720 People laugh when you talk about things like this, but it's like this –
00:22:14.900 right now that's happening in China.
00:22:16.880 And it was happening prior to COVID too, but even more so now.
00:22:20.520 Like, it's not an unreasonable thing to think because it is literally –
00:22:24.280 like, people call you a conspiracy theorist or look at you funny when you talk about social credit scores.
00:22:29.200 But literally right now in China, people have social credit scores,
00:22:33.380 and they are restricted or allowed to do certain things because of those scores.
00:22:38.680 Yeah, they're publicly shamed for a low credit score.
00:22:43.800 They cannot get either – like, maybe they can't get a rental agreement or a mortgage or certain jobs.
00:22:52.120 Or even public transportation, I think.
00:22:54.200 Public transportation, they can't – there's restrictions on mobility.
00:22:57.380 So there are certain checkpoints that – well, during COVID, they were keeping people in certain areas
00:23:07.280 and not letting travel.
00:23:08.840 And I think the justification was, well, we're controlling the spread,
00:23:13.100 but you're also just, again, treating people like cattle, and you're boxing them in.
00:23:20.340 And in North America, like, we take our mobility rights quite seriously.
00:23:25.560 And that was already – we weren't quite at the degree of China, but we were starting down that road.
00:23:32.560 And for anybody who's not familiar with what's happening in China, take a look.
00:23:39.340 There's even, like, news articles exploring the social credit system.
As early as 2018, I think, they had a few pilot projects, and then it's expanded.
00:23:52.040 So when you give power to authoritarians, it's, like, not surprising that it would unfold in this way.
00:24:04.880 But, yeah, and they're quite hesitant to ever give it back, despite what they may say.
00:24:10.260 Yeah, as soon as you're tied into a system, as soon as you're banking,
00:24:15.200 as soon as, like, multiple aspects of your life are tied into a system,
00:24:19.400 if any part of that is compromised, your basic, like, amenities become very difficult to even –
00:24:28.120 like, if you can't use your credit card, and let's say there's no cash,
00:24:33.340 you're locked out from buying food, or you just –
00:24:35.880 maybe you have to get somebody to buy food for you.
00:24:39.620 These things are not –
00:24:42.060 it's not a pretty picture, which is why we're trying to highlight some of this.
The question is, I guess,
00:24:52.360 just from what you notice with friends or family,
00:24:56.300 how much have you mentioned kind of the social credit system
00:25:02.020 and kind of the risks of some of these policies in Canada
00:25:08.300 and how we may or may not get to that kind of dystopian point?
00:25:17.300 Well, I don't – to be honest, I try not to bring it up anymore,
00:25:21.280 for the most part, when most of my –
00:25:26.300 friends and family have kind of –
00:25:29.240 I mean, they know my opinions on these things.
00:25:32.300 But most of them are not –
00:25:34.300 they are not ready to accept that there are actors in our institutions
00:25:43.640 who either, A, don't know at all what they're doing
00:25:50.140 and are just kind of going through the motions of an agenda
00:25:53.620 that they don't quite understand,
00:25:55.720 or, B, are actively trying to push policy into society
00:26:03.460 that will lead down those roads,
00:26:05.920 because whatever their intentions are,
00:26:09.420 if they're the people who –
00:26:11.140 if they're more of the same kind of person
00:26:12.760 who truly believes that they have, you know,
00:26:15.280 the moral arc of history on their side,
00:26:18.280 or if – whatever the rationale is,
00:26:21.220 if there's some monetary incentive in it for them,
00:26:23.300 or both, as it may turn out,
00:26:25.160 they just aren't really ready to accept that,
00:26:27.280 because Canada has had –
00:26:31.220 I mean, this sort of changes by the era,
00:26:34.440 but by and large, we've sort of had the luxury
00:26:36.780 of having a relatively above-board,
00:26:41.280 relatively trustworthy media apparatus
00:26:44.460 and generally a fairly innocuous political history.
00:26:52.160 You know, we don't have a whole lot of scandals
00:26:55.880 and, you know, very big, earth-shattering politics here,
00:27:01.900 you know, over the years.
00:27:03.240 Like, we haven't had that sort of –
00:27:05.560 nothing like what happened in 2020, 2021, you know?
00:27:09.560 So it's hard for people to reframe when they see –
00:27:13.620 when sort of the – I don't know the phrase,
00:27:15.780 but when the wolf sort of sheds his sheep's clothing,
00:27:19.620 you know what I mean?
00:27:21.400 It's hard to accept that.
00:27:22.520 So it's easier for people to accept that people like us
00:27:26.580 who discuss these things and worry about these things
00:27:28.500 and try to actively avoid these things perpetuating,
00:27:32.280 it's easier for them to believe that, you know,
00:27:34.540 we're just the kind of the outliers,
00:27:36.180 kind of the nuts ones,
00:27:37.840 sort of the conspiracy theory-minded ones,
00:27:39.700 and the institutions are still fine.
00:27:41.800 You know, that's more intellectually palatable for them.
00:27:46.540 So I don't know.
00:27:47.740 What's your experience been with that?
00:27:49.600 Well, I definitely also made the observation
00:27:53.220 that it does seem like Canada used to be
00:27:56.540 a more boring place on the international stage.
00:27:59.200 Like, we didn't have as much as maybe the last few years
00:28:04.340 where things are really starting to –
00:28:07.160 maybe I'm just noticing it more,
00:28:09.420 or maybe it has shifted as well,
00:28:13.940 but I'm starting to see –
00:28:15.740 I think a little bit of both.
00:28:17.360 Yeah.
00:28:17.540 Yeah, like, I'm starting to see more and more of the corruption
00:28:22.660 because we've had scandals,
00:28:24.340 but again, they haven't –
00:28:28.340 I guess they haven't been in the public consciousness
00:28:32.540 to the same degree.
00:28:34.740 I don't think people realize just how much damage
Trudeau has done to Canada's reputation on the world stage.
00:28:39.960 Like, it is very obvious for anyone who pays attention
00:28:43.440 any time he shows up at a, you know, a G7 event
00:28:47.460 or some kind of –
00:28:48.460 The body language.
00:28:49.740 Yeah, like, no one has any time for him.
00:28:52.780 Like, no one takes him seriously.
00:28:54.260 And that reflects on us, right?
00:28:55.760 That reflects on Canada's, you know,
00:28:59.220 general standing in the world at large
00:29:01.560 when we just keep re-electing the guy, you know?
00:29:05.980 Like, no one can stand him.
00:29:09.080 It's not just, like, you know, Trump or whoever else,
00:29:12.720 you know, or, you know, Xi Jinping.
00:29:17.540 It's not just those guys.
00:29:18.920 It's – you can tell, like, no one wants to have their –
00:29:23.400 his stink on them, basically, right?
00:29:26.160 Well, I think when it comes to the idea that, like,
00:29:31.340 our institutions are –
00:29:33.240 I think friends and family,
00:29:35.740 they believe the institutions aren't corrupt,
00:29:39.440 but they obviously have their bad guys
00:29:42.340 that they're fighting against within the institutions,
00:29:44.740 but they trust in the actual institutions.
00:29:48.900 Itself, yeah.
00:29:50.120 Yeah, because I think the idea that
00:29:52.760 the institution itself is corrupt
00:29:55.140 is such a scary idea that, like, we –
00:29:58.760 like, well, if nobody's driving the ship,
00:30:00.740 like, who –
00:30:03.640 like, how are we –
00:30:04.960 like, are we going to go off a cliff?
00:30:06.480 Like, they would be so worried about the consequences
00:30:10.200 of somebody not driving the ship
00:30:12.120 that it's scary to know –
00:30:15.740 like, it's scary to even consider that.
00:30:17.720 So I think it's a protective mechanism that way.
00:30:19.720 I do have discussions where –
00:30:23.720 I'm fairly patient in discussions,
00:30:26.860 and I've just been sharing little things,
00:30:30.060 and I'm always curious on, like, well,
00:30:32.500 you're never going to –
00:30:35.720 you're never just going to say one sentence
00:30:39.860 and convince somebody and just flick a switch.
00:30:42.840 Yeah.
00:30:43.260 But I'm curious if you make somebody think
00:30:47.500 about something that they didn't consider,
00:30:49.900 and one entry point is even before –
00:30:53.960 even before the pandemic,
00:30:55.160 I was deep in kind of the nutrition sphere
00:30:59.540 and looking at how our nutritional studies were set up
00:31:03.900 and the issues with observational trials
00:31:07.160 and the limitations
00:31:07.960 and how a good chunk of our –
00:31:11.500 these claims of risk and cause and effect
00:31:15.020 are really just based on observing patterns
00:31:18.060 based on self-reported food questionnaires of people,
00:31:22.100 and they don't differentiate between eating meat
00:31:25.300 as a burger and fries and a large Coke
00:31:28.280 versus eating a home-cooked steak.
00:31:30.480 You don't have anything to kind of filter out
00:31:34.440 these healthy user biases.
00:31:36.040 So I went into the pandemic with an understanding
00:31:40.260 of how modeling and how these studies
can be steered to create whatever result you want.
00:31:49.540 You can hack the p-values.
00:31:51.580 You can design a study to say what you want,
00:31:55.480 and there's a lot of ways of manipulating this
00:31:57.720 to either align with how you're getting funded
00:32:01.340 because there's a big conflict of interest right there
00:32:04.900 or maybe of an ideological motivation of
00:32:08.200 if a researcher is plant-based,
00:32:11.320 are they going to get more of a favorable response
00:32:14.280 to studies or the data that confirms their own ideas?
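The "hack the p-values" point above can be illustrated with a quick simulation (a generic sketch, not anything from the studies discussed; the group sizes and number of comparisons are arbitrary): run enough comparisons between groups that genuinely do not differ, and some of them will still clear the conventional p < 0.05 bar by chance. Report only those, and you have manufactured a "significant" finding.

```python
import math
import random
from statistics import NormalDist, mean, stdev

def null_study(rng, n=200):
    """One 'study' comparing two groups drawn from the SAME distribution.
    Any 'significant' difference it finds is pure noise."""
    a = [rng.gauss(0, 1) for _ in range(n)]
    b = [rng.gauss(0, 1) for _ in range(n)]
    # Two-sample z statistic and its two-sided p-value.
    se = math.sqrt(stdev(a) ** 2 / n + stdev(b) ** 2 / n)
    z = (mean(a) - mean(b)) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

rng = random.Random(42)
p_values = [null_study(rng) for _ in range(100)]
hits = sum(p < 0.05 for p in p_values)
# Roughly 5 of 100 null comparisons are expected to look 'significant'
# by chance alone; selectively reporting them is the p-hacking at issue.
print(f"{hits} of 100 null comparisons were 'significant' at p < 0.05")
```

This is why pre-registration and corrections for multiple comparisons matter: without them, a researcher who tests many food-and-outcome pairings can almost always find something that clears the threshold.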
00:32:21.140 So I find it useful often to point out
00:32:24.600 like the fact that the sugar industry
00:32:27.720 literally paid off scientists to shift the blame
to fat for causing heart disease 50 years ago.
00:32:36.500 So that's a known thing,
00:32:40.100 and we know that there's corruption in these spaces.
00:32:42.120 So if you can point to other examples of corruption,
00:32:44.240 I think that helps people entertain the ideas
00:32:48.060 that like, well, maybe we should be
00:32:49.880 a little bit more critical on these current institutions.
00:32:54.500 Yeah.
00:32:55.380 And a good way as well to do that is
00:32:57.560 Peter Boghossian talks about this in his book.
00:33:01.640 I think it was co-written with James Lindsay,
00:33:03.600 actually, How to Have Impossible Conversations.
00:33:05.860 He talks about how one of the most,
00:33:08.560 how one of the most effective ways
00:33:10.680 of changing somebody's mind
is to show them
00:33:14.220 somebody that they already respect,
00:33:16.480 having the opinion that you are trying to show them
00:33:21.720 as an alternative opinion.
00:33:23.800 So it will be another thing as well
00:33:26.380 where it'll take some public figures
00:33:29.080 who either may have been silent
00:33:31.040 or may have been hesitant
00:33:32.260 or even frightened of being cancelled
00:33:34.820 sharing their opinions on certain political
00:33:38.900 or social matters
00:33:39.700 that otherwise would have a following
00:33:43.040 that may not have associated them
00:33:45.020 with those opinions.
00:33:47.380 It'll take some of those people getting brave
00:33:49.080 and voicing their actual thoughts
00:33:53.040 rather than what they're supposed to be thinking
00:33:57.400 about certain things.
00:33:58.300 And that'll help move the needle a little bit too.
00:34:01.340 And that only happens when we have a clear space for the exchange of ideas.
00:34:05.420 So that's the whole thing with this Bill C-63: are we giving ideas the chance to be out in the open, to be dissected?
00:34:18.900 Like, sunlight is the best disinfectant.
00:34:21.600 So if there are truly bad ideas, we'd want to know what these bad ideas are, who's advocating for them, and we wouldn't want people with bad ideas just hiding in a corner.
00:34:32.260 Like, we want them in the public sphere.
00:34:36.940 Yeah.
00:34:37.580 Yeah, that's the thing: bad ideas don't go away by being banned on Twitter.
00:34:45.140 They just go underground and they fester, and they become even more potentially influential movements than if they were just allowed to be in the public sphere, so people can accurately assess them and the people voicing them.
00:35:00.420 Well, let's make a little equation: bad ideas plus isolation plus resentment equals some very bad reactions in the end.
00:35:10.800 So I think if you exclude people like that, now they have a purpose, or proof of their bad ideas as well.
00:35:19.780 Yeah, and people who do have truly very dangerous ideas don't get discouraged or have their minds changed by being censored, right?
00:35:32.860 That just further motivates them.
00:35:35.340 So, yeah.
00:35:36.780 Well, I think we can maybe segue a little bit into our second main topic of today.
00:35:45.660 I'm going to share some screen caps here, and this will be the first one.
00:35:51.440 So this is the actual article about what happened yesterday.
00:36:00.140 The article headline I don't necessarily love, but essentially what happened is that there was a committee formed to discuss the firing of some scientists from a microbiology lab in Winnipeg.
00:36:16.060 So this happened during the COVID pandemic, and there has been some, I don't want to say conspiracy around it, because it really isn't a conspiracy at this point, but there have been some very important questions being asked about what exactly those scientists were doing in that lab in Winnipeg in 2019, and what ramifications that might have had in relation to the COVID pandemic.
00:36:48.460 So I'm going to pull up another article here.
00:36:55.320 And this is just a couple of days ago.
00:37:02.260 So the article headline from the National Post is: fired scientists at Winnipeg lab worked closely and covertly with Chinese government, CSIS report says.
00:37:09.840 This is from February 28th,
00:37:11.820 so just a few days ago.
00:37:13.820 And yesterday, a committee determined that this is not worth investigating, even with this report.
00:37:19.960 This is a quote from CSIS: the service assesses that Ms., I don't want to mispronounce her name, I think it's Qiu, Ms. Qiu developed deep cooperative relationships with a variety of People's Republic of China institutions and has intentionally transferred scientific knowledge and materials to China in order to benefit the PRC government, reads a letter from January 2021 recommending her security clearance be revoked.
00:37:44.840 So a committee in the House of Commons determined that, even with that very unambiguous CSIS statement, this is not worth investigating.
00:37:56.700 Why would CSIS even say that if it wasn't of, like, grave importance?
00:38:06.280 It's the same kind of thing.
00:38:07.860 You have standards and procedures within these places, like, this is the type of communications we have, or this is what we're sharing for results, because there is collaboration between different labs, but there are obviously security concerns when it comes to matters of, like, a biolab like this.
00:38:27.260 So whatever policies they have, if these individuals broke those policies, then you'd want them investigated.
00:38:34.420 So it's pretty simple.
00:38:36.340 It's very simple.
00:38:37.320 I'm going to read you another portion from that article.
00:38:40.840 Qiu was found to have shipped sensitive materials outside of the National Microbiology Lab without approval, including antibodies to the China National Institute for Food and Drug Control, as well as some others to the United Kingdom and the United States for testing.
00:38:56.340 She also admitted to having sent several emails to research associates through personal email accounts on Gmail or Yahoo instead of her corporate account.
00:39:04.600 As for Cheng, this is the second scientist, he had not only used his personal email accounts and unauthorized external drives to conduct government business, but also breached security policies regarding students under his supervision by giving them access to the scientific network through a computer in the laboratory.
00:39:20.000 So Qiu, the main scientist here, the woman, from what I understand, she kind of rose to prominence by developing quite an interesting, I don't know the science behind it, but it's like a multifaceted approach using monoclonal antibodies to essentially cure, not COVID, sorry, to cure Ebola in a laboratory setting.
00:39:50.380 So there were initially some worries that there may have been really potentially lethal substances being shipped overseas without proper procedures.
00:40:02.200 Apparently, she was not necessarily known for following the letter of the law when it came to transferring samples.
00:40:13.540 I'm going to share another article here, which even further complicates the matter of, like, why they would say this is not something worth investigating, because apparently, in June of 2021, Patty Hajdu said:
00:40:31.000 In this particular case, the information requested has both privacy and national security implications.
00:40:37.980 Compliance with the order without proper safeguards in place would put sensitive information at the risk of public release.
00:40:43.620 So this is something that apparently is so sensitive that they can't discuss it publicly, but at the same time, it's also not worth investigating.
00:40:57.840 So here it shows what I was just saying here.
00:41:05.260 Qiu had earlier been responsible for a shipment of Ebola and, I don't know what this is, henipaviruses to China's Wuhan Institute of Virology.
00:41:14.820 That's very interesting.
00:41:16.380 But even though they don't know what it is and haven't investigated it, there is no connection between the transfer of viruses cited in this order and the subsequent...
00:41:23.920 They know for sure it's not that.
00:41:25.440 No, and there's definitely no link to COVID-19.
00:41:28.560 They definitely know that.
00:41:30.200 So these are perfect examples of our government officials talking out of both sides of their mouths.
00:41:38.480 But it doesn't stop there, because actually, in that same year, 2021, just a little bit later, and you'll have to forgive me for all the links I have here: Canadians must know what our lab's role was in COVID's origins.
00:41:56.940 China resists more investigation within its borders, but one strand of the trail leads to Winnipeg, writes author and journalist Elaine Dewar.
00:42:05.380 Now, you can't see the full article on this view, but if I go to a reader view, there is a section where... let's see if I can find... this section... here we go.
00:42:23.320 This is the Winnipeg lab.
00:42:24.800 The NML is Canada's only high-containment lab for the study of human pathogens.
00:42:29.540 Its top level-four lab, the sort devoted to the most dangerous microbes, was led by, again, I'm sorry, I think, Xiangguo Qiu, recipient in 2018 of the Governor General's Innovation Award for an antibody cocktail against Ebola.
00:42:44.720 Xiangguo Qiu and her husband, Keding Cheng, were marched out of that lab in 2019 by the RCMP, their security clearances and access to data allegedly withdrawn.
00:42:55.800 The federal government claimed this was nothing serious, just something to do with administrative matters.
00:43:00.840 Yet it occurred just months after Qiu sent 15 strains of Ebola, again this Nipah, and other glycoproteins to the WIV.
00:43:09.240 Those samples helped seed the WIV's own newly opened level-four lab.
00:43:13.200 What do we know, James, about the Wuhan Institute of Virology's level-four lab?
00:43:19.800 We know it was denied as having anything to do with the last four years.
00:43:27.860 Mm-hmm.
00:43:28.960 Just like these scientists being escorted out by the RCMP and having CSIS reports written about them, just like that has nothing to do with this either.
00:43:39.840 And you're a conspiracy theorist if you think otherwise, sir.
00:43:42.080 Well, even just taking it at face value, you listed not only just, like, the sharing of a result of something, but the physical sending of lab samples and the breaching of security protocols.
00:43:59.820 You can't accidentally use the wrong email address.
00:44:05.200 Like, there's no pleading ignorance to this.
00:44:09.060 Like, these are very clear examples of a breach of policy, a breach of conduct, and, like, why are they not investigating this, other than to cover up any other potential corruption?
00:44:26.820 That would be my conclusion there: that looking at this further would implicate enough of the people around it that it would create too much collateral damage, therefore they don't want to look into it further.
00:44:43.420 Yeah.
00:44:43.940 Yeah.
00:44:45.680 There's another article here too that's a good read, if a little bit, maybe, wishy-washy in its unwillingness to find a link where one very obviously exists.
00:45:06.960 It's a long-form article by Maclean's.
00:45:11.900 A brilliant scientist was mysteriously fired from a Winnipeg virus lab. No one knows why.
00:45:17.800 No one.
00:45:18.700 Not a single person knows why.
00:45:21.980 It's quite a long read.
00:45:23.560 There is a section here that discusses, and it's going to take me just a second here, but here it is: the materials transfer agreement.
00:45:33.760 So this is apparently what her biggest issue was: she just didn't like following procedure when it came to these things.
00:45:50.300 Personally, I don't agree, I don't believe in MTAs for these materials.
00:45:53.400 You know, there was a lot of, like, kind of playing a little bit fast and loose with some very deadly pathogens, and if you're going to do that with something like Ebola, I mean, who's to say you wouldn't do that with other viruses that maybe you'd be working on in an unofficial context.
00:46:13.260 And I can't remember if it's that article or the National Post one, but there are definitely ties with China for the scientist.
00:46:23.960 She was, yeah, oh, it was the National Post article.
00:46:28.820 So I'm reading another quote here.
00:46:31.580 The service assesses that Ms. Qiu developed deep cooperative relationships... oh, I think I already read this quote.
00:46:38.500 But yeah, she definitely had relationships with Chinese officials, and there was even some implication that there may have been, you know how there exist in China certain, like, work programs that send Chinese people to other countries to, infiltrate is the wrong word, but to sort of become influential in certain institutions, with the ultimate goal of bringing that information and that experience back to China.
00:47:12.020 So that's sort of the idea of what she was doing, potentially, on behalf of the PRC.
00:47:19.320 Well, that opens up kind of the question of, if you look at China, they don't let just anybody into their labs.
00:47:29.760 They've got quite a tight lid on that.
00:47:35.580 And it would make sense that if other countries were quite lax in who they let in, like, if the security clearance can be passed by pretty much any scientist, then why wouldn't they send people over?
00:47:54.260 Like, that doesn't seem that crazy to me.
00:47:58.380 Yeah.
00:47:58.920 Yeah.
00:48:00.380 I found the portion here.
00:48:05.120 I'll share it again.
00:48:05.760 CSIS discovered Qiu had applied for, and was likely to receive, a position under China's Thousand Talents program, a government-sponsored program to recruit Chinese experts, which also allows them to keep jobs in Western countries.
00:48:25.380 Qiu's position came through the Wuhan Institute of Virology, and CSIS determined that the Thousand Talents program offers researchers up to $1 million in research subsidies and better access to visas and Chinese health care.
00:48:37.820 Ain't that something?
00:48:40.080 So with that information, if we ask the questions of, like, how many of these scientists should we trust, or what screening methods should we use, like, how do we vet from a security aspect?
00:48:57.840 Or are we vetting at all?
00:48:59.280 Are we vetting?
00:49:00.120 Like, if you ask those questions, that would be a hateful thing, because it's targeting Chinese scientists, and we can't have any group targeted.
00:49:11.100 Yeah, that would be racist.
00:49:11.860 So you can see how, I imagine even writing these articles, some of these writers would be dancing around the implication of their words.
00:49:23.280 That's maybe why they don't feel confident in making any definitive statements or links.
00:49:29.560 Like, they're just like, well, here's a quote, here's a quote, here's a quote.
00:49:34.960 Like, I know it's good to stay completely, like, objective and non-biased at the same time.
00:49:45.160 Yeah, being too impartial to the point that you can't even... this is the kind of case where, like, an opinion piece or two, getting somebody in to write and sum up some of these ideas...
00:49:59.600 Yeah, somebody taking a chance.
00:50:02.060 Might be good in this case.
00:50:05.020 Well, and I mean, it's something that, I don't know, we're going to put all these links in the show notes here, but in one of those articles, I'm pretty sure they talked about how there was some thought that after 2018, I believe it was, when Canadian officials apprehended that Huawei executive who was under investigation in the US, there was some theorizing that, with how relations between China and Canada really cooled off after that, some of what happened in the Winnipeg lab may have been an attempt to, not smooth over those relations, but just kind of look the other way on things that were pretty clearly breaches of security.
00:51:03.680 Because either Canadian officials didn't want to rock the boat, or they were told to not, you know, make a stink about this, even though there were some policies and procedures being ignored or thwarted, or however it happened.
00:51:20.520 That was a theory about why this could have been allowed to happen.
00:51:26.880 Yeah, it's difficult, even in the public eye, with our goldfish memory, where we have more and more scandals, more and more things that should make the average person just say, like, what's going on?
00:51:44.340 These things don't seem to be ringing the alarm bells of the average person.
00:51:49.980 And, like, we've seen, I guess Twitter, X, is not the most neutral sample of opinions, but when having discussions with people and you share a link that's, like, clearly... one example would be, like, the idea that the drought and the warm weather are the sole cause of all the fires in the last few years.
00:52:21.140 We'll link to this discussion, too, that we recently had on our Twitter.
00:52:24.180 That was a trip.
00:52:24.720 Yeah, and maybe you share information about the arson, and then they say, well, it's a small minority of fires.
00:52:32.100 Like, well, you didn't actually revise your view; like, you still think your original thought is true.
00:52:39.180 It's a small minority of the 60% of human-caused fires.
00:52:44.500 Yeah, so even in cases where you can point to clear data, I think some people aren't, maybe the willingness isn't there to kind of change their mind, or to, again, think that institutions can be corrupt.
00:53:01.840 I think they would look at this and be like, well, that individual at the lab just maybe didn't follow the policy to the letter.
00:53:13.980 Yeah, they're a wild card, they're a renegade.
00:53:17.440 But that still doesn't explain why you wouldn't investigate, and why you wouldn't revise policy, and why you wouldn't ensure that this thing can't happen again.
00:53:27.040 Yeah, well, why, in 2021 or 2022, whenever that article was, would Patty Hajdu say this is of such national security import that it can't be publicly discussed at this time, and then two years later, it's not going to be investigated at all?
00:53:44.860 It literally cannot be both.
00:53:48.780 Yeah, why ring the bell?
00:53:50.300 Like, why was the alarm rung then?
00:53:52.800 Yeah.
00:53:53.660 And now it doesn't matter.
00:53:56.860 So what was the change within that time?
00:54:01.700 But, like, the facts of the case haven't changed; like, the situation hasn't changed in that time.
00:54:10.980 So it's not like they discovered... like, what context would make that right?
00:54:16.700 Yeah.
00:54:17.060 Well, it's, you know, some conservative MPs on that committee have said that this is very obviously a move by the Liberal and NDP members of this committee to avoid potential embarrassment.
00:54:34.920 But, again, if this is such a serious national security concern, you would think that the embarrassment factor is not as important as the potential ramifications of essentially admitting on the world stage, because this is a public committee in the House of Commons, essentially admitting that when Canada faces issues or problems that are literally stated to be national security concerns, we don't investigate them.
00:55:13.160 That's the embarrassing part.
00:55:15.760 What does that say about any of these others, like, any time a scandal comes up, are we just going to sweep it under the rug?
00:55:25.260 Yeah.
00:55:25.940 Is that our standard of practice?
00:55:28.060 Well, and what is CSIS to think of this too?
00:55:30.560 Like, what are the officials of, essentially, the equivalent of our CIA to think?
00:55:36.820 What does this mean for the future of how they act in relation to whatever government is in power at the time, if they don't believe that their investigations will be taken seriously?
00:55:51.140 well does that
00:55:51.680 then open you up
00:55:52.360 to even more
00:55:53.160 problems regarding,
00:55:54.720 you know,
00:55:55.460 related to potential
00:55:56.540 like vigilante
00:55:57.840 investigations
00:55:58.600 or,
00:55:59.360 you know,
00:56:00.060 willful,
00:56:00.940 like you saw,
00:56:02.240 We saw with the Trump presidency where essentially the FBI wasn't giving information to Trump because they didn't feel it could be trusted with him.
00:56:11.660 He was firing FBI agents and leaders pretty indiscriminately, and you can't have that sort of impasse between two essential institutions of a country if you want that country to function reliably.
00:56:30.060 Yeah, how is this benefiting Canada if our institutions are becoming more siloed in that way?
00:56:43.240 If there's a fear of speaking up or highlighting these problems within institutions, they're not going to get better.
00:56:52.100 If there's no feedback mechanism to solve these problems, then that's what we would classify as institutional rot, where you get problems being hidden just to protect somebody's job or protect the overall image of a department.
00:57:10.720 There's almost an incentive.
00:57:13.580 This is where you get this idea that as governments get larger, they require more reasons to justify all these departments that they have.
00:57:24.760 They become more bloated, and then the department heads, or whoever's in control of that department, they want to keep a job, they want to keep funding, they want to keep their employees, and it doesn't naturally get smaller without top-down reforms.
00:57:46.480 So I think there are natural incentives: for anything that might hurt the position of any specific department within the government, there's a self-preservation instinct that will always take over, absent a good structure that allows reflection.
00:58:09.840 But as we just make a larger government, the problem doesn't go away; it will increase until you specifically downsize certain parts of it.
00:58:22.880 So I think that's a risk over time, but we're talking about people's trust in these institutions. I don't think we're giving people more reasons to trust them. I just hope this is another example that maybe helps a few people realize we shouldn't blindly trust these institutions.
00:58:48.100 Agreed.
00:58:49.640 Well James, we're at about an hour here. Was there anything else that you wanted to chat about?
00:58:56.120 I think that's a perfect place to wrap it up.
00:58:59.140 Okay, okay.
00:59:00.880 Well, thank you as always, sir. Always a pleasure chatting and catching up on the week.
00:59:06.940 We'll of course have links in the description to everything we talked about, everything we showed, always recommended reading for sure, so you know that we're not just making stuff up as we go along.
00:59:20.980 And yeah, we will see you again shortly.
00:59:24.940 Thanks guys.
00:59:25.900 Thanks for watching.
00:59:27.120 All right.
00:59:27.660 Cheers.
00:59:27.920 Cheers.