Rebel News Podcast - December 03, 2022


EZRA LEVANT: The Media Party throws itself a pity party — and surprise: they want to ban you.


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

160.87

Word Count

12,066

Sentence Count

907

Misogynist Sentences

69

Hate Speech Sentences

15


Summary

A panel discussion at Carleton University's journalism school had no violence to describe, just the violence of receiving mail from angry viewers. And of course, Trudeau had a cabinet minister there, very eager to call the police on any conservative letter writers. I'll let you see with your own eyes what happened.


Transcript

00:00:00.000 Hello, my rebels. Today, I take you through a panel discussion at a journalism school.
00:00:04.600 No, don't click away. It's not that boring. In fact, it's quite revealing.
00:00:08.060 They want to silence you. They want to stop anyone from emailing them or writing mean tweets to them.
00:00:13.660 They're comparing their tweets to violence and rape. It was a panel on violence, but they had
00:00:19.200 no violence to describe, just the violence of receiving mail from angry viewers. It's quite
00:00:26.160 something. And of course, Trudeau had a cabinet minister there very eager to call the police on
00:00:31.520 any conservative letter writers. I'll let you see with your own eyes. That's today's show. But first,
00:00:35.840 let me invite you to become a subscriber to Rebel News Plus. You can see the video version of the
00:00:39.780 show that way. Go to rebelnewsplus.com, click subscribe. It's eight bucks a month. You get my
00:00:45.140 show that's on every weeknight. Plus, we have other shows on a weekly basis. Eight bucks a month. It's
00:00:50.640 half the price of Netflix. We give you a ton of interesting stuff you can't find elsewhere. And
00:00:55.740 we need the dough to survive because we don't get any money from Trudeau. So we
00:01:00.200 need to rely on our viewers. Thanks very much. Here's today's podcast.
00:01:19.100 Tonight, the media party throws itself a pity party and surprise, they want to censor you.
00:01:25.140 It's December 2nd, and this is the Ezra Levant Show.
00:01:30.600 Shame on you, you censorious thug.
00:01:42.600 Well, you don't get more inside Ottawa, inside baseball, media party, Laurentian elite than the
00:01:50.260 Carleton University's journalism school in Ottawa. And they had a panel discussion the other day,
00:01:57.000 where they were talking about threats and violence, actual violence against Canadian journalists. Now,
00:02:04.320 I don't know about threats, but I do know about violence. Our reporters have faced more violence than
00:02:11.120 all other media in the country combined. And I say that with complete certainty because I actually don't
00:02:19.900 know of a single other case of violence against any reporter in Canada in a decade.
00:02:27.780 There was a reporter for a Sikh media outlet in BC some years ago who was the subject of violence, and it was an
00:02:38.520 internecine battle about Sikh issues in that community. And in no way do I downplay the violence against him. That was
00:02:46.160 extremely real. But I'm talking about mainstream journalists covering the news, allegedly under threat
00:02:53.400 from the right. Although, when you think about the last few years, it's the Antifa and Black Lives Matter
00:02:58.840 types who have engaged in violence. Do you know of a single journalist in Canada who has been punched,
00:03:07.160 kicked, beaten, shot? I don't. Other than, well, half of our Rebel News team. Look at this
00:03:16.900 montage. I played this recently at our Rebel News live conferences. This isn't even all of it. Take a look.
00:03:22.960 Yeah, that's what I'm f—ing.
00:03:24.700 Yeah, that's what I'm f—ing.
00:03:29.160 We don't understand, um.
00:03:32.700 Whoa, whoa!
00:03:34.700 Calm down.
00:03:36.700 Not in here.
00:03:38.700 Are you kidding me?
00:03:40.700 Take an egg. Get out.
00:03:43.700 Get out of here.
00:03:45.700 Get out of here. It's a public park.
00:03:47.700 It's— Don't—
00:03:49.560 Come on!
00:03:51.500 Don't touch me again, buddy.
00:03:56.500 Here are the thugs! Here are the thugs!
00:03:58.500 Here are the thugs!
00:04:00.500 Shame! Shame! Shame! Shame! Shame!
00:04:06.500 What are you doing? Get off me!
00:04:08.500 Hey, this is assault. I'm on a sidewalk.
00:04:11.500 What is this?
00:04:12.500 I'm on a sidewalk.
00:04:17.500 I'm on a sidewalk.
00:04:18.500 What is this? You cannot touch me!
00:04:21.500 I'm not rushing over you.
00:04:23.500 Hey!
00:04:25.500 My f...
00:04:27.500 Are you kidding?
00:04:28.500 Are you kidding?
00:04:29.500 I don't...
00:04:31.500 Oh!
00:04:32.500 Ow!
00:04:40.500 What?
00:04:41.500 I'm with Rebel News. I'm on media Jewish.
00:04:43.500 Hello, Rebel News.
00:04:45.500 Hey!
00:04:46.500 What's up?
00:04:47.500 What's up?
00:04:49.500 So why are we getting such a tough time?
00:04:51.500 We've been through this so many times with you guys.
00:04:53.500 I need to hold my camera, man.
00:04:54.500 Hey, what's going on? I need to hold my camera.
00:04:56.500 Hey!
00:04:58.500 The reality is, organizations...
00:05:02.500 Organizations like yours...
00:05:04.500 Yours.
00:05:05.500 ...that continue to spread misinformation and disinformation.
00:05:09.500 I won't call it a media organization.
00:05:11.500 Your group of individuals need to take accountability,
00:05:16.500 polarization that we're seeing in this country.
00:05:19.500 It is disappointing to see the conservatives engage in peddling...
00:05:23.500 Engage in peddling...
00:05:24.500 Rebel media.
00:05:25.500 Rebel media.
00:05:26.500 Conspiracy theories.
00:05:27.500 Conspiracy theories.
00:05:28.500 I shared my perspective on your organization yesterday.
00:05:31.500 There's nothing to say.
00:05:33.500 You censorious thug.
00:05:34.500 You've physically assaulted me.
00:05:36.500 And so did you.
00:05:37.500 And so did this.
00:05:38.500 The likes of CBC and the other members of the media party,
00:05:41.500 they're in there.
00:05:42.500 They're reporting in the warmth,
00:05:43.500 but they can't even give us the opportunity
00:05:46.500 to ask one bloody question on a public sidewalk.
00:05:49.500 Why the f*** are you here?
00:05:51.500 We're working, sir.
00:05:52.500 Because you're shutting down the bridge.
00:05:53.500 We want to know why.
00:05:54.500 Why don't you f*** off?
00:05:55.500 Here, there's for your f***ing membership.
00:05:57.500 There's for your rebel news.
00:05:58.500 Go f*** yourself, lady.
00:06:00.500 Go f*** yourself.
00:06:01.500 Go f*** yourself.
00:06:02.500 Would you like me to turn it up?
00:06:03.500 Here you go.
00:06:04.500 Go f*** yourself.
00:06:05.500 You're no f***ing friend to native people.
00:06:07.500 Go f*** off.
00:06:08.500 I am native.
00:06:09.500 F*** off.
00:06:10.500 F*** off playing that f***ing card.
00:06:12.500 F*** you.
00:06:13.500 You're telling me I'm playing a race card?
00:06:15.500 Yeah, you are.
00:06:16.500 Get the f*** out of here.
00:06:17.500 F*** off.
00:06:18.500 F*** off.
00:06:19.500 Oh, no.
00:06:35.500 Are you playing hockey here?
00:06:37.500 No, I'm just coming to check on our facility.
00:06:39.500 So I'm going to...
00:06:40.500 We're going to check you.
00:06:41.500 You're not supposed to be here, actually.
00:06:42.500 Okay.
00:06:43.500 Okay, I'll rest you.
00:06:44.500 I have my...
00:06:45.500 We already spoke to Darcy Henson.
00:06:46.500 He said there should be no problem.
00:06:47.500 Come in.
00:06:48.500 Sorry, why is that?
00:06:49.500 Sorry, why?
00:06:50.500 Thank you.
00:06:51.500 Now this is an administrative penalty notice.
00:06:52.500 Oh, for what?
00:06:53.500 Yeah.
00:06:54.500 This is for shaking hands with the public.
00:06:55.500 No way.
00:06:56.500 Failure to maintain a distance of at least two meters from another person.
00:06:57.500 I'm going to release you.
00:06:58.500 You go back in the province.
00:06:59.500 You have to wear a mask.
00:07:00.500 If you don't, you're going to be detained.
00:07:01.500 Do you understand?
00:07:02.500 Again, like this?
00:07:03.500 No.
00:07:04.500 You go to the police station.
00:07:05.500 To jail, basically.
00:07:06.500 To jail, yeah.
00:07:07.500 So yeah, if you're having a conference about journalists in Canada being roughed up,
00:07:11.500 beat up, threatened and violence, but you don't invite Rebel News, sounds to me like you're
00:07:18.500 not actually interested in violence against journalism.
00:07:21.500 You're interested in mean tweets against liberals.
00:07:24.500 Well, I wasn't there in Ottawa, but luckily they put the event on YouTube and I spent a couple
00:07:43.500 hours going through it today.
00:07:45.500 It was just so long I gave up before going straight to the end.
00:07:48.500 Listen, I suffered enough for the craft.
00:07:51.500 I don't recommend watching it.
00:07:52.500 You can find it online pretty easily, but let me take you through it.
00:07:55.500 I can't help but chuckle about this because this is the finest journalism school in the
00:08:00.500 country.
00:08:01.500 Just ask them.
00:08:02.500 And their very first screen on their YouTube, they spelt the word battlefield wrong and that
00:08:08.500 made me chuckle a little bit.
00:08:11.500 They really think that they're soldiers on a battlefield.
00:08:14.500 That's a thing they came to again and again, that they are the true victims and heroes in
00:08:20.500 society.
00:08:21.500 They also have a hashtag, not okay, by which they mean criticizing them is not okay.
00:08:30.500 Now the event was hosted by Allan Thompson, a former journalist who's now at the university
00:08:35.500 there.
00:08:36.500 And he is, as you will quickly see, a white man.
00:08:42.500 And he spent some time apologizing for that.
00:08:45.500 And of course, an aboriginal land territory acknowledgement.
00:08:49.500 And it went on for quite some time.
00:08:51.500 I think the most he had to say was basically his self-hatred, apologizing for being a white
00:08:59.500 guy.
00:09:00.500 Take a look.
00:09:01.500 And I'm the head of Carleton's journalism program.
00:09:04.500 And on behalf of the School of Journalism and Communication, I want to thank you for taking
00:09:08.500 the time to join us for this critical discussion, either in person or on YouTube.
00:09:15.500 And right off the top, I need to warn you that you're going to hear some harsh, disturbing
00:09:22.500 language this evening.
00:09:24.500 There really isn't any other way to have this conversation.
00:09:29.500 Let me begin with the land acknowledgement.
00:09:33.500 We hear these more and more often these days.
00:09:36.500 But I'm increasingly conscious as a fifth generation settler and someone who is called
00:09:43.500 upon to make these land acknowledgements, that they sometimes seem to be much more for
00:09:49.500 the white audience to justify or maybe soothe our racial guilt instead of the original intended
00:09:56.500 purpose to acknowledge and honor people and the treaty agreements and unceded lands that we
00:10:02.500 occupy.
00:10:03.500 So please, let this not be just another box to check off in our meeting agenda.
00:10:09.500 As journalists, or journalists in training, we want to use this moment as a reminder to
00:10:15.500 prioritize our individual commitments to challenge and dismantle white supremacist colonial mindsets,
00:10:22.500 which we have internalized, both collectively and individually, in the School of Journalism itself
00:10:28.500 and in the wider journalism industry in Canada.
00:10:32.500 Now, he warned us that there would be harsh and disturbing language.
00:10:36.500 I guess he wants people to know that there's some microaggressions to come.
00:10:41.500 There were a lot of people wearing masks.
00:10:43.500 I listened to two hours of it, and there were some mean tweets that were read, but it was incredible
00:10:50.500 that journalists have to be told that there's going to be some mean words in case they want
00:10:55.500 to go into the comfort room and just calm down a bit and maybe have some mental health workers.
00:11:00.500 Of course, he describes himself as a fifth generation settler.
00:11:05.500 If he really is upset, if he really feels that he's stealing someone's land, why doesn't he leave?
00:11:11.500 I mean, if you do feel guilty about it, what good does a land acknowledgement do?
00:11:17.500 How about give back the land if you really feel you stole it?
00:11:20.500 I find those land acknowledgements to be a kind of fraud.
00:11:24.500 Anyways, their conclusion is that there is an upsurge of hate, of online abuse,
00:11:32.500 especially against women journalists and racialized journalists.
00:11:37.500 They say this without any statistics, they just say it.
00:11:41.500 Online hate and violence. Look at that word, violence.
00:11:46.500 We are here this evening as part of an effort to grapple with the upsurge in online hate targeting journalists.
00:11:53.500 Here at Carleton, our starting point is the view that more must be done
00:11:59.500 to combat the targeted online abuse increasingly suffered by journalists,
00:12:05.500 especially women and racialized journalists.
00:12:09.500 A year ago, Carleton partnered with the Canadian Association of Journalists
00:12:13.500 to host a roundtable on journalists and online hate.
00:12:17.500 That resulted in a call to action to our industry, government, and the broader public
00:12:22.500 through a publication, Poisoned Well.
00:12:25.500 More recently, the CBC's Omayra Issa dedicated a part of her Kesterton lecture at Carleton in this room
00:12:31.500 to her own experience with relentless online hate and death threats.
00:12:37.500 We know online harassment affects journalists, and we have evidence to prove it.
00:12:42.500 Yeah, except I watched the first two hours of this panel.
00:12:47.500 And they actually didn't show any violence.
00:12:50.500 They blur hate, which is a human emotion that everyone feels.
00:12:55.500 If you never feel hate, you don't have a full personality.
00:12:58.500 They blur hate with violence.
00:13:00.500 And anyone who watches the CBC can feel their palpable hate when talking about subjects they don't like,
00:13:07.500 when talking about Christians, when talking about conservatives.
00:13:10.500 Some of the most hateful things I've seen in Canada are on the CBC.
00:13:15.500 They publish others who hate.
00:13:17.500 Justin Trudeau and his raw hatred against those who are not vaccinated.
00:13:21.500 Should we even tolerate them?
00:13:23.500 That statement was made on a French CBC program.
00:13:26.500 There's a lot of hate in the media, but I guess their side of hate is fine.
00:13:31.500 They call people who hate them, they call that violence.
00:13:35.500 Here's another clip talking about people who are harassed or threatened.
00:13:39.500 Take a look.
00:13:40.500 The landmark Taking Care survey and report produced earlier this year
00:13:45.500 by my Carleton colleague Matthew Pearson and CBC News journalist and wellbeing champion Dave Seglins.
00:13:52.500 That survey found that 56%, more than half, of Canadian media workers reported being harassed or threatened on social media,
00:14:01.500 and 35%, more than a third, said they also experienced harassment while working in the field.
00:14:08.500 In the coming months, Professor Trish Audette-Longo is coordinating a series of workshops
00:14:14.500 and discussions for journalism faculty and students to build their digital security knowledge and skills.
00:14:21.500 We must incorporate these skills into the way we teach journalism.
00:14:26.500 Now, it is true that there are mean things that come through in tweets and Facebook posts and emails every day.
00:14:34.500 I have thousands of those, and I guess I deal with it in the way that social media companies tell you to deal with it,
00:14:42.500 that the tech companies tell you to deal with it.
00:14:45.500 You can block people.
00:14:46.500 You can block email addresses from emailing you.
00:14:49.500 You can block phone numbers from calling and texting you. On Twitter or Facebook,
00:14:54.500 you can block someone from connecting with you.
00:14:58.500 I do it without even a thought.
00:15:00.500 But I think that's because I want to do journalism.
00:15:03.500 I don't want to obsess about being the victim of mean tweets.
00:15:06.500 This panel was a pity party where everyone was trying to outdo each other by how mean their tweets were that they obsess over.
00:15:15.500 They read every word, thus encouraging and empowering the mean tweets.
00:15:20.500 Now, I'm not saying that there is no such thing as a written or digital communication that rises to the level of an actual crime.
00:15:29.500 Of course there is.
00:15:30.500 You can commit a crime with a pen and paper, a death threat.
00:15:34.500 You can say it verbally.
00:15:35.500 You can utter a death threat.
00:15:37.500 It's in the criminal code, has been for years.
00:15:39.500 Simply applying it to a digital communication is a small novelty that is actually about 30 years old already.
00:15:46.500 It is already against the law to utter a death threat, whether it's by words, by radio, by CB, by walkie talkie, by cell phone or by email.
00:15:54.500 But what we see here, at least in the two hours that I subjected myself to this video, are people who have clearly not received real death threats.
00:16:04.500 I mean, I won't deny some of the communications they received are genuinely mean, genuinely rude.
00:16:12.500 Some of them are racist and sexist, absolutely.
00:16:15.500 But none of them rise to the level of a crime.
00:16:19.500 And an underlying theme in this was to get police to criminalize what right now is just a political disagreement.
00:16:30.500 There was one other white male on the panel, Marco Mendicino, the public safety minister.
00:16:36.500 And he repeatedly said he was open to using police.
00:16:40.500 I thought that was actually the most terrifying part of it.
00:16:43.500 Here, let's play the next clip.
00:16:45.500 In a moment, Joyce will introduce our platform guests, Catherine Tait, President and CEO of the CBC,
00:16:52.500 Sonia Verma, Editor-in-Chief of Global News.
00:16:54.500 They'll both be making some opening remarks from the podium, then take some questions from Joyce and from the audience
00:17:02.500 before we turn to the second part of the evening.
00:17:06.500 I'm also aware that Catherine has another commitment and may not be able to stay with us throughout the entire evening.
00:17:12.500 There will be a panel discussion involving three journalists who've had direct experience with online hate.
00:17:19.500 Joyce is going to make all the proper introductions.
00:17:22.500 And they'll be followed by another guest, Public Safety Minister, Marco Mendicino.
00:17:27.500 After they make some brief comments and have a chance to discuss, we will open the floor to questions.
00:17:34.500 So that's a bit of a roll call.
00:17:36.500 You can see who is at this panel.
00:17:38.500 They had all the diversity, every point of view from A to B.
00:17:42.500 Every single person there is on Trudeau's payroll, either directly, like Catherine Tait of the CBC,
00:17:49.500 or through grants like Sonia Verma of Global News, Joyce Napier of CTV, Rachel Gilmore of Global.
00:17:56.500 Mendicino, of course, is a cabinet minister.
00:17:59.500 Erica Ifill, these are all people who benefit from Trudeau.
00:18:04.500 There was no independent journalist there.
00:18:06.500 There was no one who had a different point of view there.
00:18:09.500 I want to show you this next clip.
00:18:11.500 Again, it comes back to the word hate.
00:18:13.500 Hate and love are human emotions, neither of which are illegal, or at least they ought not to be.
00:18:19.500 If someone hates a journalist, if you hate the liberals, if you hate Trudeau, maybe that's not an attractive emotion,
00:18:26.500 and maybe you shouldn't express it in a certain way, but hate is not the same thing as violence.
00:18:33.500 It just isn't.
00:18:34.500 And I tell you, in the two hours that I watched, there was not a single example of violence,
00:18:38.500 except for one from Sonia Verma that I'll come back to in a minute.
00:18:42.500 Here, take a look at this next clip.
00:18:44.500 Journalists and online hate.
00:18:46.500 So, yeah, that's hate.
00:18:48.500 That means threats.
00:18:49.500 You have death threats, too.
00:18:51.500 Harassment, bullying, racist, misogynistic.
00:18:55.500 Women are the favorite targets of this hate, especially racialized women.
00:19:01.500 So, think about it just for one second.
00:19:04.500 This is happening now, here in Canada.
00:19:08.500 And, you know, to many that are not here, perhaps today, it is something that is a remote topic that some people talk about,
00:19:18.500 but it is getting bigger and bigger and more and more dangerous.
00:19:23.500 So, these kind of panels will help, you know, us all understand, what is this?
00:19:31.500 And the minister is here to tell us.
00:19:34.500 How do we counter it?
00:19:36.500 The minister is here to tell us how to fix it.
00:19:39.500 You know, Marco Mendicino was the weakest cabinet minister who appeared at the Public Order Emergency Commission, the inquiry into the Emergencies Act.
00:19:49.500 Not only were his answers slippery and vague, but there was this one document where his own staff were revealed to be writing reports and sharing them with others in the government and not even bothering to pass them by Mendicino himself.
00:20:06.500 Even his own staff knew he was not a decision maker, knew he was not a decider or a thinker or influential.
00:20:13.500 They ignored him and they laughed at him.
00:20:16.500 He was sent to the commission as an empty suit.
00:20:19.500 And it just is perfect that of all the people who could have attended on behalf of the government, it's Mendicino.
00:20:25.500 But he's going to be there to tell them how to fix it.
00:20:27.500 Why would you call the government in to fix a problem that the media has with credibility?
00:20:33.500 Because the reason that people are angry at the media, there's a lot of reasons and I'm sure I don't know all of them, but one big one is that people don't trust the media anymore.
00:20:43.500 And a reason for that is that so many media are on the government payroll.
00:20:47.500 So if you're trying to win back trust, if you're trying to stop people from clapping back at you on social media, and one of the knocks on you is that you're in the pocket of Trudeau financially and politically, how is inviting a chummy public safety minister going to disabuse anyone of the notion that you are in full collusion with Trudeau?
00:21:09.500 Anyway, Catherine Tait flew in from her home in New York City.
00:21:14.500 I find that just absolutely amazing that the head of the CBC can live in New York and commute.
00:21:20.500 She says that the Internet is the worst place.
00:21:24.500 And she praises the panel as being diverse.
00:21:28.500 Is this a diverse panel?
00:21:30.500 Here, take a look.
00:21:31.500 Reporters Without Borders found that an overwhelming majority of journalists agreed that the Internet was the most dangerous place for journalists.
00:21:41.500 Nearly half of women journalists said they self-censored to avoid exposing themselves to violence.
00:21:48.500 Another 21 percent had resigned or were considering not renewing their contracts.
00:21:54.500 All news organizations, and you've got us pretty well, a lot of us represented here tonight, in Canada, across Canada and around the world, have been alarmed by the galloping increase in vitriolic online harm that targets disproportionately, as Joyce has said, women, women of color, and racialized journalists.
00:22:16.500 You know, they're emphasizing that women are the victims.
00:22:18.500 Well, actually, studies show that men are as much the victims of harassment online as women are.
00:22:24.500 But they're trying to fit this in their critical theory that it's about women and about minorities.
00:22:30.500 Here's the truth.
00:22:32.500 Everyone and anyone on social media can be harassed.
00:22:35.500 I think the most criticized and harassed man on social media right now is Elon Musk.
00:22:41.500 I want to show you one more statement from Catherine Tait where she says that the people on this panel are fiercely independent, that they're strong and they're free.
00:22:54.500 Do we have a fiercely independent and strong free press?
00:22:57.500 Take a look.
00:22:58.500 I know that the sole purpose of this vile form of harassment is to silence these voices, to silence these journalists, and in so doing, to undermine the foundational pillar in our open societies, in our democracies.
00:23:17.500 So I'm here tonight to say to all of you, stay the course.
00:23:21.500 I know that may seem contradictory.
00:23:23.500 We need fiercely independent, outspoken journalists more than ever before.
00:23:30.500 Journalists who speak truth to power.
00:23:32.500 Journalists who reflect a wide range of voices and perspectives.
00:23:38.500 Without a strong, free, and diverse press, democracies cannot function.
00:23:43.500 So we worked together with the Toronto Star, CTV News, Global, La Presse, APTN, the Canadian Association of Journalists, and others to develop a newsroom guide for managing online harm.
00:23:55.500 I urge you all to take a look at it.
00:23:56.500 It's available on hashtag not okay.
00:23:59.500 And the idea was to at least, and this, by the way, this isn't written by CBC.
00:24:04.500 This is written by a collective.
00:24:05.500 So I don't know how many pens were in it, but a lot.
00:24:08.500 And the idea was to provide advice on what to do before, during, and after incidents of online harassment or abuse.
00:24:18.500 And basically, we've had to adapt many of the same practices that we've used when we send journalists overseas to war zones or when we send them to natural disasters.
00:24:31.500 This is a really important point.
00:24:34.500 Social media can be as dangerous an environment as hostile physical environments.
00:24:41.500 We take precautions when we send reporters to Ukraine.
00:24:44.500 And we need to take precautions when our people, our journalists, are similarly exposed to danger in the digital world.
00:24:53.500 She talks about working with the Toronto Star on harassment issues.
00:24:58.500 Really?
00:24:59.500 Well, I think they would know.
00:25:01.500 Do you remember that astonishing, world-famous, front-page cover photo they did in the Toronto Star,
00:25:08.500 where they had threats and demonization of the unvaxxed?
00:25:12.500 People calling for the unvaxxed to be banned from hospitals.
00:25:15.500 People laughing at the unvaxxed dying.
00:25:17.500 That was the front page of the Toronto Star.
00:25:20.500 And that's who the CBC is partnering with to fight back against harassment.
00:25:25.500 Well, I guess they do know about harassment.
00:25:27.500 But that one line, that this is as serious as war.
00:25:32.500 And they keep coming back to that.
00:25:34.500 Look at this clip, where they compare being a journalist to a soldier with post-traumatic stress disorder.
00:25:42.500 Comparing themselves as victims and as heroes.
00:25:47.500 Take a look at this.
00:25:48.500 We know from the preliminary findings that journalists who are harassed online have significantly more symptoms of anxiety, depression, and post-traumatic distress.
00:26:01.500 Post-traumatic stress.
00:26:04.500 Take a moment to consider that.
00:26:07.500 For journalists today, the battlefield is everywhere.
00:26:11.500 Oh, this next one was just incredible.
00:26:13.500 Imagine comparing people on social media criticizing you.
00:26:19.500 Imagine comparing this as a problem online tantamount to child pornography.
00:26:26.500 You know, child pornography involves the actual rape of children.
00:26:32.500 And these journalists say that the actual rape of children is comparable to what they face every day as journalists by having mean tweets.
00:26:42.500 Seriously, how dare they?
00:26:43.500 We're encouraging police across the country to treat online abuse the same way they treat child pornography.
00:26:50.500 As a cyber crime that crosses police jurisdictions and so requires a coordinated response.
00:26:58.500 Take a look at this next clip.
00:27:00.500 How society needs to change.
00:27:03.500 Everyone in society needs to change.
00:27:05.500 We have to look at talking back to powerful people like journalists and politicians.
00:27:10.500 We have to compare that with drunk driving and human trafficking.
00:27:14.500 Again, prostituting and raping women is comparable to daring to speak out against powerful journalists.
00:27:20.500 Take a look.
00:27:21.500 Society will need to change to ensure that this behavior is simply deemed unacceptable.
00:27:28.500 Just as human trafficking and drunk driving are.
00:27:33.500 This will require a huge shift in societal attitude and behavior.
00:27:39.500 And I do believe that attitudes about online harassment can shift.
00:27:43.500 The more we all talk about it, the more we flag it as an issue.
00:27:48.500 Oh, and of course, we need penalties.
00:27:52.500 We need the government to weigh in.
00:27:54.500 Just incredible.
00:27:55.500 Take a look.
00:27:56.500 But there must be penalties also for those who refuse to stop.
00:28:01.500 And that's why we need legislation to ensure online safety.
00:28:05.500 So we'll be working with other media colleagues, many of whom are here tonight, but across the country, to better support and protect our journalists.
00:28:14.500 Because until you are safe, we will not stop talking about this.
00:28:19.500 We simply cannot afford to lose your voices.
00:28:23.500 Now, I mentioned earlier that in the entire two hours that I watched this panel, there was not a single incident of violence described, which is quite something, given that the panel was supposedly about violence.
00:28:36.500 If they would have invited anyone from the rebel, myself, David Menzies, Alexa Lavoie, really half a dozen of our team, we would have told them about actual violence towards us.
00:28:46.500 But they did manage to get a story about violence, but it was not on social media.
00:28:53.500 It's a story of one of the speakers here, Ms. Verma, from Global News.
00:28:59.500 The Globe and Mail at the time.
00:29:00.500 And she was in Egypt, in Cairo, in Tahrir Square, 11 years ago.
00:29:06.500 You might recall terrible things happened there.
00:29:09.500 It was rallies and protests and riots against the government.
00:29:15.500 There were protests going on for months, in fact.
00:29:18.500 Terrible things happened there.
00:29:21.500 Women who were uncovered were molested.
00:29:24.500 A terrible thing happened to Lara Logan, the 60 Minutes reporter.
00:29:29.500 Here's a quick clip about that.
00:29:31.500 Nearly three months ago, our Lara Logan was beaten and sexually assaulted by a mob in Cairo, Egypt, while covering the celebrations after Hosni Mubarak stepped down as president.
00:29:41.500 Now, for the first time, she's speaking publicly about the attack, which she says was merciless.
00:29:47.500 In a 60 Minutes interview, Lara tells Scott Pelley she thought she would die.
00:29:52.500 Our camera battery went down.
00:29:59.500 And we had to stop for a moment.
00:30:01.500 And suddenly Baha looks at me and says, we've got to get out of here.
00:30:06.500 Baha is not happy here.
00:30:08.500 He's Egyptian. He speaks Arabic and he can hear what the crowd is saying.
00:30:11.500 Yeah.
00:30:12.500 He understands what no one else in the crew understands.
00:30:14.500 That's right.
00:30:15.500 I thought, not only am I going to die here, but it's going to be just a torturous death that's going to go on forever and ever.
00:30:21.500 Ever and ever and ever.
00:30:22.500 When I thought, I'm going to die here.
00:30:24.500 And my next thought was, I can't believe I just let them kill me.
00:30:29.500 That was as much fight as I had.
00:30:32.500 That I just gave in.
00:30:34.500 And I gave up on my children so easily.
00:30:37.500 How could you do that?
00:30:38.500 How could you do that?
00:30:39.500 I thought you were stronger than that.
00:30:41.500 Well, thank God that didn't happen to the speaker from the Globe and Mail, Ms. Verma, who's now at Global News.
00:30:48.500 She was not raped, thank God.
00:30:50.500 But she was physically assaulted.
00:30:53.500 Listen to her describe what she went through when she encountered a mob of men on the streets who didn't like the fact that she was uncovered.
00:31:06.500 They swarmed her, surrounded her, and bruised her, beat her up, until a janitor nearby grabbed her and her Globe and Mail colleague and pulled them into an apartment.
00:31:17.500 And this is a terrible story that could have been horrendous as it was for Lara Logan.
00:31:22.500 I was gripped by this story.
00:31:24.500 But the whole time I was thinking, we're talking about violence in Canada on social media.
00:31:31.500 And the only story you have is violence in Egypt 11 years ago from a physical mob.
00:31:39.500 Take a look.
00:31:40.500 And all of a sudden, I was completely surrounded by these men who were grabbing me and pushing me, grabbing my notebook, grabbing my pen, pulling Patrick Martin, my colleague, away from me.
00:31:51.500 I felt totally alone, totally overwhelmed, and totally powerless.
00:31:58.500 And it was an incredibly scary feeling.
00:32:01.500 My whole body was covered in bruises that I hadn't felt at the time, but were there and lasted, you know, a few days until they healed.
00:32:10.500 I'm telling this story because I think that I was able to sort of get away from that very scary situation, that very scary crowd.
00:32:20.500 I look at my colleagues right now, who are the topic and the subject and the target of online hate, and I feel that they don't have that safety anywhere.
00:32:31.500 I feel like every day in my job right now as Editor-in-Chief of Global News, I see, you know, incredibly violent messages come through our email filter systems that are forwarded to me by Rachel, by others.
00:32:47.500 And I think there are many ways every day that are just as damaging and I think just as wounding as what I experienced at that time.
00:32:56.500 So what's the solution?
00:32:58.500 Well, everyone in the room here has the same solution.
00:33:02.500 Not the solution to actually being groped by men in Tahrir Square, but the solution to the non-crimes and the non-violence in Canada.
00:33:09.500 The solution was for law enforcement, for police to stop the hate. Take a look.
00:33:15.500 We're not in a position to protect our journalists from this hate.
00:33:20.500 And that's why I think the conversation has to be a bigger conversation.
00:33:24.500 It has to be something that's talked about at a policy level, at a law enforcement level.
00:33:28.500 You know, we don't have the tools to actually, you know, go after these people.
00:33:33.500 I just have to end with this one last clip from Ms. Verma of Global News.
00:33:38.500 She says that Global News, unlike citizen journalists, is not biased.
00:33:45.500 They're not biased people. They are fact-based. Here, take a look.
00:33:51.500 As you all know, we're governed by, you know, principles.
00:33:55.500 We have something at Global News called the JP&P, right?
00:33:58.500 So that sort of governs the fact that, you know, our news is unbiased.
00:34:03.500 It's fact-based. You know, there's accountability there that we have to ourselves.
00:34:08.500 There's accountability to our audiences.
00:34:10.500 With some of the social media platforms, you know, those same obligations simply aren't there.
00:34:18.500 Oh, you bet.
00:34:19.500 I'll give you an example of one of their star journalists, David Akin,
00:34:22.500 heckling Pierre Poilievre in a fact-based, nonviolent way. Remember this?
00:34:26.500 Thank you very much. I appreciate it. I appreciate your presence here today.
00:34:31.500 Before I begin, let me just say that...
00:34:37.500 Thank you very much.
00:34:39.500 I'm being heckled here by the...
00:34:42.500 Thank you very much for your congratulations.
00:34:45.500 Thank you very much for your questions.
00:34:47.500 I'm going to begin my remarks now.
00:34:49.500 But will you take some questions afterwards?
00:34:50.500 That's usually...
00:34:51.500 Justin Trudeau is out of touch and Canadians are out of money.
00:34:56.500 The cost of government is driving up the cost of living.
00:35:00.500 A half a trillion dollars of inflationary deficits have bid up the cost of the goods we buy and the interest that Canadians pay.
00:35:12.500 The cost for workers and businesses to produce the goods that we buy.
00:35:17.500 On top of that, Trudeau proposes yet more spending to bid up costs even further.
00:35:23.500 The more he spends, the more things cost.
00:35:28.500 It is just inflation.
00:35:30.500 Their homes and to buy a home in the very first place.
00:35:34.500 Hold my hand.
00:35:35.500 The reason that...
00:35:36.500 Look...
00:35:37.500 Yeah.
00:35:38.500 So we have basically a liberal heckler who snuck in here today to...
00:35:45.500 I'm a liberal heckler.
00:35:46.500 I work for Global News.
00:35:48.500 I'm the chief political correspondent of that organization.
00:35:50.500 Are you going to let me make my statement?
00:35:51.500 You may remember me from the guy who actually reported first on the Prime Minister breaking the law.
00:35:56.500 Are you going to let me make my statement?
00:35:57.500 We'd just like to ask a question.
00:35:58.500 Say...
00:35:59.500 I've never...
00:36:00.500 I've actually never seen you heckling the Prime Minister.
00:36:02.500 I've done this before.
00:36:03.500 Ask Minister Barrett back in the day.
00:36:04.500 I've never seen you heckling the Prime Minister.
00:36:05.500 Look, bottom line is this.
00:36:06.500 I'm going to take some questions at the end of this statement.
00:36:08.500 Yes, I'm taking...
00:36:09.500 I will take in two questions at the very end.
00:36:10.500 Thank you very much.
00:36:11.500 Thank you very much.
00:36:12.500 The...
00:36:13.500 So I'm going to start my statement again.
00:36:14.500 Yeah.
00:36:15.500 Not...
00:36:16.500 Fact-based and unbiased.
00:36:18.500 They started talking about...
00:36:20.500 There were some strange questions coming from the audience about how to stop ordinary people from saying
00:36:25.500 mean things.
00:36:26.500 And Catherine Tait, the New York-based boss of the CBC, boasted about eliminating the comments section on the websites, on the CBC,
00:36:38.500 simply banning people from making comments.
00:36:40.500 That was how they handled comments they didn't like.
00:36:43.500 But incredibly, that's just what we see on the outside.
00:36:46.500 They offer medical...
00:36:47.500 Sorry, mental health support for CBC-ers who feel micro-aggressed because of the comments section, because of people clapping back.
00:36:56.500 Take a look at this.
00:36:57.500 We're just a lot smarter about...
00:36:59.500 You know, there used to be a comment section on CBC website where people could correct typos or make comments about our content.
00:37:06.500 Yeah.
00:37:07.500 And all sorts of vile stuff was coming in through that door.
00:37:10.500 So we closed that door, you know, and we're...
00:37:13.500 It's like a little whack-a-mole, but we're constantly looking.
00:37:16.500 We've put some resources against security.
00:37:20.500 So we now have people who are, like Dave Seglins, who are working on helping our journalists through the process when they're attacked, providing mental health support.
00:37:33.500 There was another reporter who joined via Zoom.
00:37:36.500 I didn't quite catch where she was from.
00:37:38.500 The journalists who ran the thing didn't do...
00:37:40.500 They didn't put her name up.
00:37:41.500 It wasn't very easy to understand.
00:37:43.500 But I understood that she was originally from Pakistan.
00:37:46.500 And she starts off by saying that one of the reasons she fled Pakistan was because of a violent assault.
00:37:52.500 And that's terrible to hear.
00:37:53.500 And I know Pakistan can be a dangerous place.
00:37:55.500 But then she said the same exact thing here happened in Canada.
00:37:59.500 Take a look.
00:38:00.500 Safe has now become an alien word to me.
00:38:03.500 I can't remember what safe looks or feels like, which is ironic because safety is what I came to this country for.
00:38:13.500 I left Pakistan after my reporting on human rights abuses and state complicity led to a horrific organized online hate campaign against me, just like it did here.
00:38:23.500 I was doxxed, I was vilified, assaulted by misogynistic, ethnophobic, violent abuse, just like here.
00:38:32.500 The police didn't believe me.
00:38:34.500 They didn't help me.
00:38:35.500 Just like here.
00:38:36.500 Was she really subject to a violent assault in Canada?
00:38:39.500 Maybe.
00:38:40.500 Maybe.
00:38:41.500 She didn't give details of it.
00:38:43.500 Was she attacked in Canada like she was back in Pakistan?
00:38:46.500 I think I would have heard about it.
00:38:47.500 I mean, maybe not.
00:38:49.500 Or is she using the word attack, violent attack, to mean mean tweets?
00:38:55.500 There was another guest, another reporter named Erica Ifill, and there's Marco Mendicino and CTV and Global and CBC, all the big shots.
00:39:04.500 And then some people who I think were chosen to give the pretense of diversity, although, like I say, there was no ideological diversity.
00:39:11.500 And they only talked about mean tweets from the right.
00:39:15.500 There was no mention of violence from the left, including real violence from Antifa.
00:39:19.500 Take a look at this talking about how all the problems on social media are from the far right convoy and convoy adjacent people, white supremacists.
00:39:29.500 Take a look at that.
00:39:30.500 I think I think I think what I experience, what we experience is a continuation of threats that have come from work that we've done on on far right and the rise of the far right.
00:39:45.500 And the fact that we've gotten into how many minutes, almost an hour into this and we haven't talked about the far right is like a huge problem because that is the context within which this is happening.
00:39:58.500 Right.
00:39:59.500 These are all either convoy people or convoy adjacent people or white supremacists or something like that.
00:40:07.500 And the fact that we haven't characterized it as that kind of tells me that we have a long way to go to getting people in positions of power to really understanding what is going on.
00:40:20.500 Minister, do you understand what's going on?
00:40:23.500 I certainly have, I think, a very much more sober appreciation of the experiences that you're going through every day.
00:40:33.500 And I think I can't do anything except start by expressing the gratitude for the candor and the bravery that you show every day.
00:40:43.500 I think of you having had to crash through barriers to crack into your profession, and to then have to be inundated day in and day out with intimidation, harassment, overt racism, obviously criminal conduct.
00:40:59.500 And the fact that you still go out there and write stories, which is for the benefit of all Canadians, is I think a real testament to each and every one of you.
00:41:09.500 And so I would just begin by saying that. Thank you for what you do.
00:41:15.500 It is important that we call it for what it is, Erica, and I, you know, it is racist.
00:41:29.500 It is misogynistic. It is criminal. It is against the law.
00:41:35.500 And it is intentional that it targets disproportionately women, racialized indigenous and other minority communities.
00:41:45.500 I think there is no doubt in my mind that the goal is to crowd you all out and to preserve or restore some delusional sense of what the status quo was like.
00:42:00.500 Isn't it odd that all the government-funded inquiries into hate are only looking for hate on the right?
00:42:09.460 I haven't seen condemnations of anti-Semitism on the left or violence on the left, Antifa violence or Black Lives Matter.
00:42:17.260 Why is it, how is it possible that the only social media harassment that they're discussing is from what they call the right?
00:42:26.520 And, you know, Marco Mendicino was up next. He had a few thoughts himself on Tahrir Square.
00:42:34.280 And, you know, I think, Sonia, your metaphor at the beginning of describing how you felt in Tahrir Square when you were assaulted and you didn't even realize it because of the trauma in the moment was very apt.
00:42:47.700 And what we are seeing online is every day at Tahrir Square where nobody is safe.
00:42:53.600 And the trauma that that causes to professionals, journalists who have a response, who are trying to fulfill a, like, a democratically essential responsibility to tell stories from perspectives that have not historically been told.
00:43:12.140 Trauma, eh? Talking about trauma.
00:43:15.060 How about this trauma?
00:43:16.620 Take a look at what happened to our Alexa Lavoie at the hands of the government.
00:43:20.460 Trauma!
00:43:22.060 Watch this!
00:43:22.600 You just fight on purpose!
00:43:24.660 You just fight on purpose!
00:43:26.800 Oh!
00:43:35.020 You all right?
00:43:36.020 You got shot.
00:43:37.020 Take care.
00:43:40.020 Bring her out.
00:43:41.020 Bring her out.
00:43:41.520 Come on.
00:43:41.900 Oh!
00:44:00.540 That took a lot of bravery.
00:44:02.720 So this lady said that the haters were convoy people,
00:44:06.180 although I'm not sure how they know that,
00:44:07.460 since the online social media haters were anonymous.
00:44:10.340 How would you know that they're your favorite enemy?
00:44:13.220 But my favorite line was Ifill saying,
00:44:15.280 good for you, just to prove that journalists are neutral and objective.
00:44:19.300 You're talking about exactly what I wanted to bring up,
00:44:21.920 which Erica brought up at the far right.
00:44:24.600 And these are convoy people.
00:44:26.600 And I was an active counter protester against the convoy
00:44:29.660 up until now.
00:44:31.200 And I'm here because you're welcome.
00:44:33.180 And I'm here because I know about the threats to Erica,
00:44:37.900 to Rachel, which is why I'm here.
00:44:41.100 I had never heard of this Ifill before, this Erica Ifill.
00:44:44.080 And remember, this was a forum against hate.
00:44:49.100 But I guess she couldn't keep it together for long
00:44:51.120 because about an hour and a half into things,
00:44:53.700 she, well, I think she engaged in hateful racism.
00:44:56.600 She said she simply refuses to talk to white men.
00:45:00.820 And she's rather glad that the queen is dead.
00:45:04.940 Take a look at that.
00:45:06.180 Till this day, I make it a point not to,
00:45:09.680 when I'm doing a story or when I'm seeking experts,
00:45:12.900 I do not seek white men.
00:45:15.060 Because I want to change what expertise looks like
00:45:18.940 on television and in my stories.
00:45:21.060 Because that's my job.
00:45:23.220 My job is not to placate power.
00:45:26.220 It's to challenge power, right?
00:45:29.020 And it is, for me, my job is to center marginalized people
00:45:33.940 who do not get a voice, right?
00:45:37.360 Because I'm sorry, in traditional structures,
00:45:40.200 we didn't have a voice.
00:45:42.440 And many times, we don't have a voice now.
00:45:44.980 I can give you an example.
00:45:47.700 Let's look at the queen coverage, for example,
00:45:50.960 when the queen died on CBC.
00:45:53.540 They interviewed her glove maker,
00:45:56.340 but I didn't see many voices that wrote like I did
00:46:00.260 who said, well, I'm glad the queen is dead.
00:46:02.480 And let me tell you why.
00:46:03.980 Because there are many of us
00:46:05.380 who come from colonialist backgrounds
00:46:08.580 who are sitting there being like,
00:46:11.640 oh, God, do we have to do this again?
00:46:13.200 Where is this diversity of voices
00:46:15.800 that we were promised two years ago
00:46:18.320 after George Floyd?
00:46:19.840 I don't see it.
00:46:21.060 That's weird.
00:46:23.100 But apparently, that's the good kind of hate,
00:46:25.780 and that's not something we have to worry about
00:46:27.540 in a mean tweet.
00:46:28.600 I just want to show you one more video
00:46:29.820 from that lady from Pakistan, originally,
00:46:32.160 who was joining via Zoom.
00:46:33.800 Remember that front page in the Toronto Star?
00:46:35.780 Let's put it up just one more time.
00:46:37.980 So hateful.
00:46:39.420 The Toronto Star claims these were tweets
00:46:41.860 that they found or people they interviewed
00:46:43.920 that they put online,
00:46:45.180 although that later crumbled under close examination.
00:46:48.580 Listen to how this journalist describes those.
00:46:52.420 First of all, she says they were opinions
00:46:54.240 from medical experts.
00:46:56.420 Just listen to how she describes them
00:46:59.160 and flips it around.
00:47:01.080 That hateful, harassing front page
00:47:03.900 of the Toronto Star,
00:47:05.520 well, that wasn't the bully.
00:47:08.460 The victim was the Toronto Star
00:47:11.460 and the author of it,
00:47:12.780 who were bombarded for weeks.
00:47:15.020 To this day, that haunts us.
00:47:16.840 Take a look at that.
00:47:17.500 I remember that there was an article for the Star
00:47:20.040 that one of our reporters did,
00:47:21.840 and it talked about, you know,
00:47:23.380 people who are anti-vaccination,
00:47:25.440 and then it just kind of brought together
00:47:26.960 opinions from different medical experts
00:47:29.000 talking about whether that's dangerous,
00:47:31.060 what's the scientific data behind it.
00:47:32.360 Now, somebody did a page design in such a way
00:47:36.320 where, you know, it was a design choice,
00:47:38.720 and it's been clarified since.
00:47:40.620 It's been taken back since.
00:47:41.880 But all of the things that those medical experts
00:47:44.420 and people who were talking about anti-vaccinated
00:47:46.360 vaxxers was put up,
00:47:48.980 like the quotes were put up as a design choice.
00:47:52.380 And that journalist,
00:47:54.500 the story was taken completely out of context.
00:47:57.040 That was journalist as a racialized journalist.
00:47:58.740 She was bombarded for weeks with hate,
00:48:03.040 with vitriol, with the worst kind of abuse.
00:48:05.620 All of the things that was, you know,
00:48:07.420 some page designers, you know, choice,
00:48:09.960 they were just like attributed to her,
00:48:11.820 that she said all these things,
00:48:12.860 that the Toronto Star is saying all these things.
00:48:14.660 Nobody read the article.
00:48:16.780 Nobody looked at the context.
00:48:18.620 And to this day, that haunts us.
00:48:21.660 The screenshot of that page design,
00:48:24.640 that headline is used
00:48:26.940 every time I speak out about,
00:48:29.160 you know, the hate that I'm facing.
00:48:30.780 Other colleagues,
00:48:31.680 it's just used as some kind of proof
00:48:33.440 that, you know,
00:48:34.180 this is what these guys are saying.
00:48:35.860 There were some students there,
00:48:36.940 this being a journalism school,
00:48:38.680 and one student stood up
00:48:40.060 and quickly showed her virtue
00:48:42.240 by saying she was against Twitter.
00:48:46.160 And the irony here was too much.
00:48:47.740 She denounced Twitter
00:48:48.940 and was answered by Rachel Gilmore,
00:48:51.300 the TikTok journalist for Global News.
00:48:54.880 Twitter, an American company
00:48:57.460 that's embracing free speech,
00:48:59.100 was far worse, apparently,
00:49:00.960 than TikTok,
00:49:02.220 the Chinese spyware.
00:49:04.420 Take a look at this.
00:49:05.740 As journalists,
00:49:06.960 how,
00:49:07.700 is there a way that we can decrease
00:49:09.220 our reliability
00:49:10.080 on platforms like this?
00:49:11.760 Like, you know,
00:49:12.320 platforms that have recently undergone
00:49:13.940 ownership change
00:49:14.920 that have become
00:49:15.360 a much more toxic environment.
00:49:17.120 Without naming names.
00:49:18.380 Yeah.
00:49:18.640 It has become very toxic,
00:49:21.260 and there's been a,
00:49:22.120 like, a compliant kind of nature
00:49:23.800 towards these far-right comments,
00:49:25.340 towards these hate comments.
00:49:26.400 And so I'm wondering,
00:49:27.760 do you foresee a future
00:49:29.380 where we can decrease
00:49:30.840 our reliability
00:49:31.440 on platforms like this?
00:49:33.100 Do you see that there's other ways
00:49:34.600 that we can kind of network
00:49:36.100 and get our, share our work?
00:49:37.820 How do you,
00:49:38.080 what do you guys think?
00:49:38.840 Rachel,
00:49:39.320 what do you think about that?
00:49:40.900 Well,
00:49:41.300 I think one of the inherent issues
00:49:46.100 with the work that we do
00:49:49.620 in the modern context
00:49:51.400 is that we are communicating.
00:49:53.220 We're communicating the news
00:49:54.620 with people,
00:49:55.220 which means we have to find them
00:49:56.480 where they are.
00:49:57.880 And that makes it really difficult
00:50:00.440 when the places
00:50:02.180 where you can communicate best
00:50:03.940 with these,
00:50:04.900 with the public,
00:50:05.940 are also the places where,
00:50:07.860 you know,
00:50:08.820 you face this kind of abuse.
00:50:10.760 All right.
00:50:11.000 I'm going to stop
00:50:11.640 with the videos.
00:50:12.180 I apologize for subjecting
00:50:13.560 you to it so much.
00:50:14.460 I watched this for two hours.
00:50:16.020 I just couldn't force myself
00:50:17.340 to watch through the rest.
00:50:18.660 Let me sum it up for you
00:50:19.660 if you're still with me.
00:50:22.340 Everyone there
00:50:23.180 was on Trudeau's payroll,
00:50:24.760 either directly in cabinet
00:50:26.320 like Marco Mendocino,
00:50:27.840 directly like Catherine Tate,
00:50:29.420 the CBC,
00:50:30.600 or getting subsidies
00:50:31.880 like the women
00:50:33.100 from Global and CTV.
00:50:36.100 It was an all-women panel
00:50:38.100 plus one self-hating man
00:50:39.760 and then, of course,
00:50:40.400 the cabinet minister.
00:50:41.740 But it was never
00:50:42.840 diverse enough.
00:50:44.180 You're never done
00:50:45.460 when it comes
00:50:46.320 to intersectionality
00:50:47.740 because the black woman
00:50:49.640 was furious
00:50:50.280 about the white women
00:50:51.800 not being considered
00:50:53.160 enough to black women.
00:50:54.500 Well, of course,
00:50:55.700 what's next
00:50:56.960 is a non-binary black woman
00:50:59.040 who's raging
00:51:00.220 against the black woman
00:51:01.580 for being blind
00:51:02.660 to gender fluidity.
00:51:04.880 Once you start
00:51:06.180 to judge people
00:51:07.300 based on race
00:51:08.340 or sex
00:51:08.940 or any other characteristic
00:51:10.260 other than merit,
00:51:11.160 you will never be done
00:51:13.020 apologizing.
00:51:15.160 I mean,
00:51:15.980 I tell you,
00:51:16.700 it was quite something
00:51:17.380 to see the professor
00:51:18.540 just self-abnegate
00:51:20.180 for five minutes
00:51:21.020 and that's really
00:51:21.580 all he had to say.
00:51:22.660 It's really showing you
00:51:23.640 where academia is going.
00:51:25.800 There was not one second
00:51:27.660 of self-reflection,
00:51:30.360 not one second
00:51:31.600 to why have we
00:51:33.580 engendered such hate?
00:51:34.980 Now, by the way,
00:51:35.600 I don't promote hate.
00:51:37.460 I don't think
00:51:38.040 we should express
00:51:39.180 hate negatively.
00:51:40.500 I think we should
00:51:41.140 transmogrify it
00:51:42.180 and turn it into
00:51:42.820 positive energy,
00:51:44.100 try and do something
00:51:44.740 positive with it,
00:51:46.020 try and vent it
00:51:47.060 in a healthy way,
00:51:48.000 not in an unhealthy way.
00:51:49.280 I don't believe in violence.
00:51:51.180 But hate is a natural
00:51:52.040 human emotion
00:51:52.580 that comes from
00:51:53.120 a genuine sense
00:51:53.920 of grievance.
00:51:55.160 If someone has a grievance
00:51:56.220 that's not being met,
00:51:57.800 that grievance
00:51:58.320 will not be solved
00:51:59.140 by telling them
00:51:59.680 to shut up.
00:52:01.280 And I think
00:52:01.960 one of the reasons
00:52:02.520 so many people
00:52:03.220 don't trust the media
00:52:04.120 and are showing
00:52:05.140 their rage to the media,
00:52:06.560 I mean,
00:52:06.700 I don't know
00:52:07.220 who wrote those
00:52:07.900 anonymous tweets,
00:52:08.960 those mean tweets,
00:52:09.960 but I'm guessing
00:52:10.720 that some of it
00:52:11.460 has to do
00:52:12.000 with the perception
00:52:12.720 that the media
00:52:13.800 is biased,
00:52:14.760 and we saw that
00:52:15.560 in these clips,
00:52:16.840 and that the media
00:52:17.500 is in bed
00:52:18.120 with the government,
00:52:18.820 which they clearly are,
00:52:19.820 even on this panel.
00:52:21.040 But there wasn't
00:52:21.740 one second of asking:
00:52:23.440 are we doing
00:52:24.260 anything wrong?
00:52:25.820 No, it's not us.
00:52:27.260 It's our audience
00:52:28.020 that's wrong.
00:52:28.660 We have to silence them.
00:52:30.300 The CBC has silenced
00:52:31.380 the comment sections.
00:52:32.620 Now they want the police
00:52:33.620 to silence the rest of you.
00:52:35.520 And another thing
00:52:36.380 I was left with was,
00:52:37.620 my God,
00:52:38.360 are they drama queens?
00:52:39.980 Comparing themselves
00:52:40.800 to soldiers with
00:52:41.940 post-traumatic stress disorder
00:52:43.220 coming back
00:52:43.980 from Afghanistan?
00:52:45.820 My heart goes out
00:52:46.900 to that lady
00:52:47.400 for her experience
00:52:48.360 in Tahrir Square.
00:52:49.100 That's terrifying.
00:52:49.780 But you're seriously
00:52:51.040 comparing mean tweets
00:52:52.400 to physically being
00:52:53.920 beaten up
00:52:54.520 by a mob of men
00:52:55.420 in a foreign country?
00:52:56.640 You're comparing
00:52:57.640 mean tweets
00:52:58.280 to child rape?
00:52:59.720 That's outrageous.
00:53:01.980 And what we see
00:53:02.920 finally is the merger.
00:53:05.020 The merger of
00:53:05.920 big media,
00:53:07.060 Global,
00:53:07.660 CTV,
00:53:08.260 CBC.
00:53:09.380 Big tech,
00:53:10.720 they want the social
00:53:11.760 media companies
00:53:12.540 to censor everything.
00:53:13.840 And big government
00:53:15.040 that enforces it all.
00:53:16.780 That is what we're up
00:53:18.620 against at Rebel News.
00:53:20.040 And that is why
00:53:21.520 they didn't dare invite
00:53:22.680 any of our real reporters
00:53:24.060 to talk about
00:53:24.980 the real violence
00:53:25.860 against them,
00:53:26.640 all of which has come
00:53:27.900 from the government
00:53:28.680 or from the left.
00:53:30.820 Stay with us for more.
00:53:43.360 Well, I think
00:53:44.140 the personification
00:53:45.060 of the government
00:53:47.480 response
00:53:48.480 to the pandemic,
00:53:49.620 not just in the United States,
00:53:50.800 but around the world,
00:53:51.780 including here in Canada,
00:53:53.220 has been Dr. Anthony Fauci.
00:53:55.360 He's ubiquitous.
00:53:56.660 He loves the media
00:53:57.620 and they love him.
00:53:59.120 There's a lot of
00:53:59.940 unanswered questions
00:54:00.860 about Dr. Fauci
00:54:01.920 and sometimes
00:54:02.780 he's pinned down
00:54:03.940 for a few moments
00:54:04.800 in congressional hearings.
00:54:06.560 Rand Paul has given him
00:54:07.500 a run for his money
00:54:08.160 and promises more.
00:54:09.540 But I think
00:54:10.540 the first time
00:54:11.320 he really was grilled
00:54:12.920 under oath
00:54:14.220 by lawyers
00:54:15.220 asking prickly questions,
00:54:17.740 not softballs
00:54:18.780 from the mainstream media,
00:54:20.740 happened last week
00:54:22.180 when a number
00:54:23.760 of freedom-oriented lawyers,
00:54:25.480 including the solicitor
00:54:26.320 general from Missouri,
00:54:28.640 deposed Dr. Fauci
00:54:30.960 in his office
00:54:34.600 about a very specific thing:
00:54:35.700 the government
00:54:35.700 of the United States
00:54:36.620 trying to censor
00:54:38.100 or throttle
00:54:38.920 or silence
00:54:40.360 voices that were
00:54:42.100 skeptical
00:54:42.820 of Dr. Fauci's
00:54:44.720 approach.
00:54:45.640 Voices that
00:54:46.680 one would call
00:54:47.960 a second opinion,
00:54:49.420 as we used to believe.
00:54:50.960 And one of our friends,
00:54:52.520 Jenin Younes,
00:54:53.420 a lawyer for the
00:54:54.260 New Civil Liberties Alliance,
00:54:55.720 well, she was part
00:54:56.720 of that legal team
00:54:57.660 and she was there
00:54:58.760 in the NIH headquarters
00:55:00.780 in Bethesda
00:55:01.680 as Dr. Fauci
00:55:03.360 was deposed.
00:55:04.040 And she joins us now.
00:55:05.240 Today she's in Annapolis.
00:55:07.120 Well, Janine,
00:55:07.620 it's great to see you.
00:55:08.740 I hope I got that right.
00:55:10.000 It was just last week
00:55:11.180 you were there
00:55:12.520 examining,
00:55:13.940 or deposing
00:55:14.820 would be the technical term,
00:55:16.540 Dr. Fauci.
00:55:17.200 Just set the scene
00:55:18.560 for us a little bit.
00:55:19.840 Where was it?
00:55:20.740 What does he look like?
00:55:21.620 Who was in the room?
00:55:22.520 How many folks were there?
00:55:23.820 Tell us what it was like,
00:55:25.580 the personal details.
00:55:26.880 We'll get to the substance
00:55:27.680 in a minute,
00:55:28.340 but that must have been
00:55:29.320 quite exciting.
00:55:30.860 Yeah, it was.
00:55:31.860 So it was the day
00:55:32.620 before Thanksgiving,
00:55:33.660 which frankly,
00:55:34.420 I don't think
00:55:34.780 is an accident.
00:55:35.600 A lot of the higher level
00:55:36.840 officials that we're
00:55:37.560 deposing in this case
00:55:38.540 keep scheduling
00:55:39.180 their depositions
00:55:41.560 right before major holidays.
00:55:43.300 I think so that
00:55:44.080 there isn't so much
00:55:45.700 reporting on them.
00:55:46.820 Let's put it that way.
00:55:48.220 So it took place
00:55:49.200 at the NIH headquarters
00:55:50.240 where Dr. Fauci works
00:55:51.700 in a rather large
00:55:52.960 conference room.
00:55:54.240 And it was him
00:55:56.260 and then a lot of lawyers
00:55:57.260 and a court reporter
00:55:58.700 and a videographer.
00:56:00.100 So there is a video
00:56:01.260 recording
00:56:01.720 and a transcript
00:56:02.920 that can't be released
00:56:04.420 right now
00:56:04.760 for various reasons.
00:56:05.500 Although I think
00:56:06.020 the transcript
00:56:06.480 will come out next week,
00:56:07.640 not the video.
00:56:09.920 And then
00:56:11.080 my colleague
00:56:12.280 John Vecchione and I
00:56:13.040 from NCLA
00:56:14.320 were representing
00:56:14.940 private plaintiffs
00:56:15.720 in the case.
00:56:16.620 The attorneys general
00:56:17.820 of Missouri
00:56:18.480 and Louisiana
00:56:19.060 who are also
00:56:20.420 bringing the case
00:56:21.060 on behalf
00:56:21.560 of the citizens
00:56:22.740 of their states.
00:56:23.740 The solicitor general
00:56:24.580 of Missouri
00:56:25.060 who's leading
00:56:25.580 the depositions.
00:56:26.560 And then two plaintiffs:
00:56:26.920 one of our clients,
00:56:28.380 Jill Hines,
00:56:29.200 and Jim Hoft
00:56:30.980 with his lawyer
00:56:31.740 John Burns.
00:56:33.000 Jim Hoft
00:56:33.780 founded the Gateway Pundit
00:56:35.820 and he's
00:56:36.680 the only private
00:56:37.560 plaintiff that's being
00:56:38.240 represented by someone
00:56:39.020 other than us.
00:56:40.000 And then there were
00:56:40.440 about eight,
00:56:41.100 I'm going to say
00:56:41.700 eight lawyers
00:56:42.200 for the other side
00:56:42.920 for Dr. Fauci.
00:56:44.100 Really?
00:56:45.320 Eight lawyers?
00:56:46.260 I mean,
00:56:46.420 I can imagine two
00:56:48.020 and maybe even
00:56:49.160 a couple of students
00:56:50.860 or young helper-outers,
00:56:52.700 but eight lawyers.
00:56:54.860 Why did they have
00:56:55.820 eight lawyers?
00:56:56.500 Did all eight weigh in?
00:56:57.760 Were they all representing
00:56:59.140 different government agencies
00:57:00.900 or does he really
00:57:01.660 have eight lawyers?
00:57:03.380 I think so.
00:57:05.520 There was only one lawyer
00:57:06.760 who was objecting.
00:57:07.780 He's been at almost
00:57:09.160 all of the depositions
00:57:10.060 or I think all of the
00:57:10.860 depositions we've done
00:57:11.680 in this case.
00:57:13.120 I think it was probably
00:57:14.260 interest.
00:57:15.140 Like there were more
00:57:15.900 lawyers from our side
00:57:17.160 present than there
00:57:18.580 have been at the other
00:57:19.340 depositions because we're
00:57:20.180 deposing a number of
00:57:21.080 high-ranking government
00:57:21.880 officials in this case.
00:57:23.180 So I think people were
00:57:24.080 just interested in seeing
00:57:25.440 Fauci's deposition
00:57:26.240 because he's such
00:57:27.260 a big figure
00:57:28.340 in the COVID era.
00:57:29.900 Yeah, big figure.
00:57:30.560 I'm not sure if that's
00:57:31.300 an ironic statement.
00:57:32.080 What's he like physically?
00:57:33.240 I mean, what's his
00:57:34.340 comportment like?
00:57:35.320 Did he seem happy
00:57:36.700 and jovial?
00:57:37.680 Was he impatient?
00:57:39.620 Was he businesslike,
00:57:41.000 detached?
00:57:41.640 What was he like in person?
00:57:43.240 I mean, obviously,
00:57:43.700 it's not a fun way
00:57:44.900 to spend a few hours.
00:57:46.420 And by the way,
00:57:46.980 how long was it?
00:57:48.200 So we got
00:57:49.480 seven hours to depose him,
00:57:51.140 and that's what the law
00:57:51.920 allows unless the judge
00:57:53.600 says otherwise.
00:57:54.760 And you take breaks.
00:57:56.080 So it's, you know,
00:57:57.160 they keep careful
00:57:58.080 track of the time.
00:57:58.820 So every time it breaks,
00:57:59.640 the time stops.
00:58:00.380 So we were there for,
00:58:01.140 I would say,
00:58:01.360 almost 10 hours
00:58:02.300 because you stop
00:58:03.660 almost every hour.
00:58:04.400 Questioning somebody,
00:58:05.560 you know,
00:58:06.320 for the person
00:58:07.800 who's doing the
00:58:08.320 questioning and the one answering,
00:58:09.500 it takes a lot of energy.
00:58:10.440 So you break a lot.
00:58:12.440 He was very,
00:58:13.320 he was composed.
00:58:14.520 He's very together.
00:58:15.780 He would obviously
00:58:16.960 get irritated
00:58:17.660 sometimes when pressed,
00:58:19.040 but he knew
00:58:20.680 how to answer questions
00:58:21.880 and he's quite,
00:58:24.520 he's on the short side.
00:58:26.100 He's, I think,
00:58:26.860 shorter than he looks on the internet.
00:58:28.140 I was just making fun.
00:58:29.200 Listen, I'm not
00:58:29.640 the tallest guy around either.
00:58:31.020 I just, listen,
00:58:32.480 after what he's done
00:58:34.020 to millions of Americans
00:58:36.080 and people around the world,
00:58:37.160 I have no problem
00:58:38.080 making the odd
00:58:38.900 personal jab back at him.
00:58:40.280 I know I shouldn't.
00:58:41.680 So just to refresh
00:58:43.080 the memory of our viewers,
00:58:45.040 I mean, we've jumped
00:58:45.600 right into the details
00:58:46.520 of Anthony Fauci himself,
00:58:47.700 but the purpose
00:58:49.060 behind this lawsuit
00:58:50.040 is that the government
00:58:52.420 was secretly interfering
00:58:54.960 with directing
00:58:55.800 big tech companies
00:58:56.760 and actually having
00:58:58.480 sort of a veiled threat
00:58:59.620 that the big tech companies
00:59:01.020 had better do it,
00:59:02.180 that they had better censor
00:59:03.940 contrary opposition views
00:59:07.260 on the pandemic,
00:59:08.320 including the Great
00:59:09.380 Barrington Declaration,
00:59:10.660 which was a bunch
00:59:11.160 of scientists and doctors
00:59:12.100 saying, whoa,
00:59:12.900 we disagree with the government's
00:59:14.480 heavy-handed approach.
00:59:15.660 So this was trying to get at
00:59:17.420 the government
00:59:18.700 secretly working with
00:59:20.820 and secretly pushing
00:59:22.320 big tech companies
00:59:23.660 to censor other voices.
00:59:24.820 Am I right?
00:59:25.700 That's exactly right.
00:59:26.740 Yeah.
00:59:27.000 So, you know,
00:59:28.180 the government
00:59:28.540 had made public threats.
00:59:29.760 And when I say the government,
00:59:30.860 specifically Joe Biden himself,
00:59:34.080 Surgeon General Vivek Murthy,
00:59:35.660 Biden's former press secretary,
00:59:37.240 Jen Psaki,
00:59:37.860 they had all made statements
00:59:38.760 on the record
00:59:39.320 saying that tech companies
00:59:40.900 had better censor
00:59:42.140 so-called misinformation
00:59:43.140 about COVID.
00:59:43.920 And they cite misinformation
00:59:45.680 as things like questioning
00:59:47.120 whether masks work,
00:59:48.360 you know,
00:59:48.700 suggesting the vaccines
00:59:49.520 don't stop transmission,
00:59:51.180 questioning whether social distancing
00:59:52.400 and lockdowns are a good idea.
00:59:54.700 So they had been making these threats,
00:59:56.600 telling the tech companies
00:59:57.380 they better censor
00:59:58.280 so-called misinformation
00:59:59.760 on these topics,
01:00:00.740 or they would face repercussions
01:00:02.520 in the form of regulation
01:00:03.820 or other legal consequences.
01:00:05.160 And the tech companies
01:00:06.200 have long feared
01:00:07.280 the repeal of Section 230,
01:00:09.080 which protects them
01:00:10.240 from liability
01:00:10.920 for content
01:00:11.620 on their platforms.
01:00:12.560 So, you know,
01:00:14.460 that's really important
01:00:15.840 for the tech companies functioning
01:00:17.020 because otherwise
01:00:17.660 they have to sort of
01:00:18.440 act as publishers
01:00:19.400 if they can be held responsible
01:00:22.280 for what people say.
01:00:24.700 So, you know,
01:00:25.480 threatening them
01:00:26.340 with repealing this
01:00:27.220 is a very heavy-handed approach
01:00:28.720 and had a real impact.
01:00:30.300 The companies did start
01:00:31.400 censoring people
01:00:32.680 you know,
01:00:33.940 in response to this.
01:00:34.940 And that's a First Amendment violation
01:00:36.200 because the government
01:00:36.860 can't use private companies
01:00:38.220 to accomplish
01:00:39.540 what it otherwise couldn't.
01:00:40.920 And the government
01:00:41.320 cannot censor people
01:00:42.500 for expressing
01:00:43.720 certain viewpoints.
01:00:45.600 You know,
01:00:45.780 and censorship
01:00:46.180 can take a number of forms.
01:00:47.480 I think it was literally
01:00:48.240 just this week
01:00:49.080 when Twitter announced
01:00:50.560 it would stop,
01:00:51.480 quote,
01:00:53.000 fact-checking
01:00:53.880 or fact-shaming
01:00:55.200 or counter-posting
01:00:57.380 on Twitter people
01:00:58.340 who had alternative views
01:00:59.400 on the pandemic.
01:01:00.560 This coincided with,
01:01:02.060 I think,
01:01:02.280 Pfizer dropping
01:01:03.220 some of their ad campaign.
01:01:05.140 It just,
01:01:05.720 it looks too on the nose.
01:01:08.620 There are different ways
01:01:09.640 to censor.
01:01:10.140 You can throttle someone.
01:01:11.040 You can shut down
01:01:11.840 or suspend their account.
01:01:13.200 You can also
01:01:13.880 append to their statement
01:01:16.380 an official statement
01:01:18.180 that they're wrong.
01:01:19.600 In our country,
01:01:20.660 Janine,
01:01:20.860 I don't know if you heard
01:01:21.480 about this,
01:01:22.260 a Christian pastor
01:01:23.360 was given an order
01:01:24.320 by a judge
01:01:25.460 that anytime he said
01:01:27.480 something in public
01:01:28.340 or private
01:01:28.920 on Facebook
01:01:30.180 in a sermon
01:01:30.840 in a media interview
01:01:31.700 that contradicted
01:01:33.600 the government line,
01:01:34.320 he had to immediately
01:01:36.260 read out
01:01:37.060 a little statement
01:01:38.640 written by the judge
01:01:40.560 that basically renounced
01:01:41.460 what he had just said:
01:01:42.760 'what I have said is...'
01:01:42.760 and that was later
01:01:44.180 struck down
01:01:44.700 by the Court of Appeal.
01:01:45.900 But imagine telling a human
01:01:47.040 that they're ordered
01:01:48.660 by the courts
01:01:49.740 to personally renounce,
01:01:50.340 in a struggle session,
01:01:51.220 what they just said:
01:01:52.040 'What I said was wrong,
01:01:53.500 I do not believe
01:01:55.160 what I said.'
01:01:58.820 Like,
01:01:59.420 that's an extreme version,
01:02:01.860 but imagine doing that
01:02:03.580 en masse
01:02:04.280 to millions
01:02:05.220 or billions
01:02:05.900 of people
01:02:06.540 using the machines
01:02:08.960 of Twitter,
01:02:09.980 YouTube,
01:02:10.380 Google,
01:02:10.740 Facebook,
01:02:11.580 Instagram.
01:02:12.360 And that's what
01:02:12.900 this suit's about.
01:02:14.020 That is,
01:02:14.880 yeah.
01:02:15.540 And also,
01:02:16.420 this suit is broader
01:02:17.380 than just COVID
01:02:18.160 information,
01:02:19.740 so we're representing
01:02:21.260 plaintiffs who were
01:02:22.300 censored for so-called
01:02:23.060 COVID misinformation,
01:02:24.420 not really misinformation,
01:02:25.480 but that's not really
01:02:26.820 the point.
01:02:27.220 But the lawsuit itself
01:02:29.480 actually also covers
01:02:30.740 the Hunter Biden
01:02:32.020 laptop story,
01:02:33.120 misinformation
01:02:34.620 quote-unquote
01:02:35.360 about the 2020 election,
01:02:36.980 climate change.
01:02:38.600 So there is
01:02:39.940 sort of a large-scale
01:02:40.820 effort by the government
01:02:41.860 to censor views
01:02:43.920 that depart from,
01:02:45.140 you know,
01:02:45.420 now the Biden administration's
01:02:46.920 perspective
01:02:48.360 on most of these subjects.
01:02:50.680 Are you at liberty
01:02:51.780 to tell us
01:02:52.620 some of the substantive
01:02:53.820 questions and answers
01:02:54.980 you got,
01:02:55.500 or are these
01:02:56.260 under some sort
01:02:57.260 of confidentiality
01:02:58.340 for the time being?
01:02:59.620 No,
01:02:59.960 I'm able to talk about it.
01:03:01.140 But so I would say
01:03:04.660 Fauci was questioned
01:03:05.860 extensively about
01:03:07.260 his involvement
01:03:07.800 in any censorship
01:03:08.600 of the Great Barrington
01:03:09.460 Declaration.
01:03:09.880 So the Great Barrington
01:03:10.900 Declaration,
01:03:11.460 which two of my clients
01:03:12.340 wrote,
01:03:12.740 was a short treatise.
01:03:15.380 The clients are
01:03:16.220 epidemiologists
01:03:17.220 from Harvard
01:03:19.660 and Stanford,
01:03:20.240 and there was a third
01:03:21.280 one who's not
01:03:21.800 in this suit,
01:03:22.680 but she's from Oxford.
01:03:24.400 And they,
01:03:24.820 you know,
01:03:24.940 very esteemed
01:03:25.540 epidemiologists
01:03:26.320 who thought
01:03:27.380 that lockdowns
01:03:28.060 were a really bad idea,
01:03:28.880 that they would harm
01:03:29.740 young people,
01:03:31.600 the working class,
01:03:33.760 you know,
01:03:34.240 while failing
01:03:34.780 to actually protect
01:03:35.560 the people from COVID
01:03:36.480 who needed protection,
01:03:37.480 namely the older,
01:03:38.600 more vulnerable people,
01:03:39.860 medically vulnerable people.
01:03:41.240 So they encapsulated
01:03:44.260 their views
01:03:44.640 in this treatise,
01:03:45.860 and in October of 2020
01:03:47.760 they were surprised
01:03:48.620 that it was very rapidly
01:03:50.120 and heavily censored
01:03:51.060 on social media.
01:03:52.640 So I think Facebook
01:03:53.580 took their page down.
01:03:55.640 Google,
01:03:56.140 it became very hard
01:03:57.060 when you search for it
01:03:58.040 to find it.
01:03:58.960 Now,
01:03:59.140 we know at the same time
01:03:59.880 there are emails
01:04:00.400 between Fauci
01:04:01.220 and Francis Collins
01:04:02.040 at the NIH
01:04:02.660 where Francis Collins
01:04:04.160 wrote to Fauci
01:04:05.500 and said,
01:04:06.180 you know,
01:04:09.400 we needed to orchestrate
01:04:10.500 a swift and devastating takedown
01:04:11.960 of the Great Barrington Declaration.
01:04:11.960 And Fauci called it
01:04:13.160 dangerous nonsense.
01:04:14.620 So we don't have
01:04:15.500 direct evidence
01:04:16.440 that they were responsible
01:04:17.580 for the social media censorship,
01:04:19.300 but we have this sort
01:04:19.920 of circumstantial evidence
01:04:20.960 and we were really trying
01:04:21.820 to get to the bottom of it.
01:04:23.160 Now,
01:04:23.340 there wasn't any smoking gun,
01:04:24.760 which I had suspected.
01:04:25.920 I mean,
01:04:26.080 Fauci's not going to admit,
01:04:27.680 you know,
01:04:27.860 I told Mark Zuckerberg
01:04:29.120 to take down
01:04:30.700 the Facebook page
01:04:32.180 of the Great Barrington Declaration
01:04:33.260 and maybe he didn't.
01:04:34.480 We do know he talked
01:04:35.280 to Zuckerberg,
01:04:35.960 but we certainly don't have
01:04:36.900 any direct evidence of that.
01:04:39.180 But he did say
01:04:40.260 a number of other
01:04:41.220 interesting things.
01:04:42.460 You know,
01:04:42.640 the questioning sort of
01:04:43.460 ended up
01:04:44.560 becoming much more broad.
01:04:46.820 Huh.
01:04:47.440 So he did talk
01:04:48.640 to the head of Facebook.
01:04:50.580 Were there any notes
01:04:51.580 on that call
01:04:52.340 or was it,
01:04:53.040 was it by phone?
01:04:54.240 No.
01:04:55.120 No.
01:04:55.420 So it took place
01:04:56.220 over the phone.
01:04:56.900 We know there were
01:04:57.300 a couple of phone calls.
01:04:58.340 He and Zuckerberg
01:04:59.040 had each other's numbers.
01:05:00.580 We know that
01:05:01.200 from the discovery
01:05:01.860 we obtained in this case.
01:05:03.380 And they had phone calls.
01:05:05.880 We don't know exactly
01:05:06.480 what was said.
01:05:06.920 He claims he can't recall.
01:05:08.060 He can never recall anything.
01:05:09.620 That was the theme
01:05:10.540 of this deposition.
01:05:11.400 Someone counted it up.
01:05:13.000 He said
01:05:14.460 'I can't recall'
01:05:15.420 something like 180 times.
01:05:17.480 And that doesn't even include variations.
01:05:20.120 The person who was telling me
01:05:21.020 they had counted it
01:05:21.580 said they weren't
01:05:22.120 counting variations
01:05:23.200 like
01:05:24.060 'I can't quite recall'
01:05:25.800 or 'I don't remember.'
01:05:27.080 So that's how he sort of
01:05:29.300 deals with difficult questions.
01:05:31.300 Yeah.
01:05:31.540 I wonder if that's credible.
01:05:34.340 I understand,
01:05:35.360 and I'm not sure where I saw this,
01:05:36.580 that he was saying,
01:05:38.720 oh,
01:05:39.100 come on,
01:05:40.100 the idea that I,
01:05:41.880 Dr. Fauci,
01:05:43.200 who's responsible
01:05:45.980 for the multi-billion-dollar
01:05:47.240 National Institutes of Health,
01:05:47.240 the idea that I
01:05:48.140 would be concerned
01:05:50.020 by some little,
01:05:50.620 what's it called again?
01:05:51.660 What's that thing called again?
01:05:52.520 Oh,
01:05:52.700 the Great Barrington Declaration.
01:05:53.560 I almost forgot.
01:05:55.080 I gave it so little concern.
01:05:56.660 Meanwhile, the internal emails say,
01:05:58.580 we've got to have a
01:05:59.740 speedy and devastating rebuttal.
01:06:02.620 So he's obviously seized by it.
01:06:04.360 He's,
01:06:04.580 I mean,
01:06:05.060 listen,
01:06:05.960 you can be right
01:06:07.460 whether you're a king
01:06:08.340 or a peasant,
01:06:08.940 but the fact that you said
01:06:10.320 Harvard and Stanford
01:06:11.580 and Oxford,
01:06:12.680 these are not dummies.
01:06:13.820 These are people
01:06:14.460 with reputations
01:06:15.380 and pedigrees.
01:06:16.700 These are smart folks.
01:06:18.020 I'm sure it did bother him.
01:06:20.240 Frankly,
01:06:21.160 I'm sure he thought
01:06:22.160 it was persuasive.
01:06:22.960 Otherwise,
01:06:23.300 he wouldn't have cared.
01:06:24.180 So for him to suddenly say
01:06:25.420 and confirm if this is true,
01:06:27.220 oh,
01:06:27.600 I didn't,
01:06:28.500 I didn't trouble myself
01:06:29.920 with such trivial matters.
01:06:31.380 I was busy saving the world.
01:06:32.800 I didn't have time for this.
01:06:33.900 What's that called again?
01:06:35.520 Is that what it was like?
01:06:36.880 That was more or less
01:06:37.860 a good paraphrase.
01:06:38.820 He said there were two answers.
01:06:40.100 One was:
01:06:44.040 I have a very busy day job
01:06:45.320 running a $6 billion institute.
01:06:48.440 I didn't have time
01:06:49.000 to be worried about things
01:06:49.880 like the Great Barrington Declaration.
01:06:51.420 And the other was:
01:06:52.300 I was too busy developing a vaccine
01:06:54.620 that saved millions of lives
01:06:55.960 to be concerned about
01:06:56.720 what happens on social media.
01:06:58.140 So he claims
01:06:59.080 he doesn't really know
01:06:59.700 how social media works.
01:07:00.700 He's not on it.
01:07:02.280 You know,
01:07:02.840 so that whenever he was questioned
01:07:04.140 about social media,
01:07:05.020 that was what he said.
01:07:06.000 You know,
01:07:06.900 I didn't know that
01:07:07.600 he developed these vaccines.
01:07:08.740 Well,
01:07:08.860 the guy should get a Nobel Prize.
01:07:10.440 He just,
01:07:11.060 is there nothing
01:07:11.860 that Anthony Fauci can't do?
01:07:14.080 So let me ask you,
01:07:15.020 in Canada,
01:07:15.460 when we have depositions
01:07:16.860 or examinations
01:07:17.820 or whatever they're called,
01:07:18.960 sometimes if the person
01:07:21.940 being asked questions
01:07:22.840 doesn't have the information
01:07:26.240 at hand,
01:07:27.340 they give an undertaking
01:07:28.460 to either get the information later
01:07:30.680 or provide documents later.
01:07:33.120 Did that apply here?
01:07:34.660 Did Fauci say,
01:07:35.540 okay,
01:07:35.860 well,
01:07:36.040 I'll look this up
01:07:36.740 and get back to you?
01:07:37.700 Does that happen
01:07:38.440 in these depositions?
01:07:39.600 No.
01:07:39.960 No.
01:07:40.760 I mean,
01:07:41.320 it could,
01:07:42.040 but it didn't happen here.
01:07:43.940 He has too busy of a day job
01:07:47.300 to have time for that sort of thing.
01:07:48.500 Of course,
01:07:48.880 of course,
01:07:49.360 the lawsuits are
01:07:50.120 for the little people.
01:07:51.620 Yeah.
01:07:52.800 So we have one side
01:07:54.900 of the debate.
01:07:56.620 We have the Fauci side,
01:07:58.900 the government side.
01:08:00.240 Will we ever hear
01:08:01.420 from the Zuckerberg side
01:08:02.820 and from the Twitter side,
01:08:04.140 the other side
01:08:06.040 of these phone calls?
01:08:07.240 It wouldn't surprise me
01:08:08.580 if notes were taken,
01:08:10.120 if not by Zuckerberg himself,
01:08:11.640 then maybe by someone
01:08:12.840 one notch lower
01:08:14.240 on the org chart.
01:08:15.300 Like,
01:08:15.920 I'm sure Zuckerberg
01:08:17.080 and Fauci talked a few times,
01:08:18.680 but then they delegated
01:08:20.260 an errand
01:08:21.220 to a chief of staff,
01:08:22.500 or,
01:08:23.900 I don't know,
01:08:24.900 to an assistant
01:08:25.540 to do whatever
01:08:27.380 the real work was.
01:08:28.860 It wouldn't surprise me
01:08:30.380 if there are some records,
01:08:31.820 if not on the Fauci side,
01:08:33.120 on the Facebook side.
01:08:34.760 I mean,
01:08:35.280 if you're interacting
01:08:36.780 with the government,
01:08:37.380 and by the way,
01:08:37.860 if there are hundreds
01:08:38.440 of millions of dollars
01:08:39.360 in grants
01:08:40.640 for ads
01:08:41.960 for social media,
01:08:42.760 I mean,
01:08:43.220 there was a lot of money
01:08:45.160 behind propaganda
01:08:46.520 and censorship,
01:08:48.080 whether from the vaccine companies
01:08:50.000 or from the government.
01:08:51.100 I bet there is a paper trail
01:08:53.020 in these companies.
01:08:54.620 Will you,
01:08:55.140 or have you,
01:08:55.760 had access to that?
01:08:58.040 So we haven't had access
01:08:59.840 to internal communications
01:09:01.400 between tech employees,
01:09:02.800 which we would like.
01:09:04.260 What I will say is
01:09:05.760 the question this case
01:09:07.120 really presents
01:09:07.800 is whether the tech companies
01:09:09.220 were working with the government
01:09:10.200 voluntarily
01:09:10.820 or whether they were doing it
01:09:12.200 solely because they felt coerced
01:09:14.120 by the threats
01:09:14.720 I mentioned earlier.
01:09:15.940 In my opinion,
01:09:17.100 either of those scenarios
01:09:18.120 is a First Amendment violation.
01:09:19.960 When the government
01:09:21.700 and private companies
01:09:22.560 are working together
01:09:23.420 to censor certain viewpoints,
01:09:24.940 that's a First Amendment violation.
01:09:26.420 Now,
01:09:26.620 if it's because of coercion,
01:09:28.260 it's really the government
01:09:29.300 who's to blame
01:09:30.400 and the government
01:09:31.100 who's responsible
01:09:31.720 for the First Amendment violation.
01:09:33.300 But if the companies
01:09:34.080 are doing this voluntarily,
01:09:35.460 then they're effectively
01:09:36.540 state actors.
01:09:38.000 Right.
01:09:38.140 And so
01:09:38.680 if
01:09:39.520 they were coerced,
01:09:41.460 they may want to start
01:09:42.700 revealing information
01:09:43.760 to that effect
01:09:45.420 so that they can't
01:09:46.420 be held responsible.
01:09:47.660 Because I think
01:09:48.300 if you had
01:09:49.600 a bigger account
01:09:50.240 like Alex Berenson's,
01:09:51.720 you know,
01:09:51.900 he,
01:09:52.180 for people who don't know,
01:09:53.260 he was suspended by Twitter.
01:09:54.840 It was clearly
01:09:55.360 at the behest of the government.
01:09:56.500 He got internal documents
01:09:58.400 that showed it,
01:09:58.400 where they were saying,
01:09:59.480 you know,
01:10:00.140 the White House
01:10:00.980 is demanding
01:10:01.420 we kick him off.
01:10:03.040 We're feeling a lot of pressure.
01:10:04.500 And then a couple days later,
01:10:05.380 weeks later,
01:10:05.820 he's kicked off.
01:10:07.000 Even though,
01:10:07.640 you know,
01:10:07.860 they're saying
01:10:08.260 we don't think
01:10:08.660 he's violated anything.
01:10:09.700 And he had been told
01:10:10.660 specifically
01:10:11.200 he wasn't
01:10:12.100 violating anything.
01:10:13.200 So he actually
01:10:15.280 sued Twitter
01:10:16.020 and he's getting
01:10:17.460 monetary damages
01:10:18.380 for that
01:10:18.800 because he made
01:10:19.500 a lot of money
01:10:19.920 off of his Twitter account.
01:10:21.240 So I think
01:10:22.180 for large accounts
01:10:23.120 who are kicked off
01:10:23.740 and can show
01:10:24.260 that they suffered
01:10:24.740 financial losses,
01:10:26.300 if it's this theory
01:10:27.320 of coordination
01:10:28.260 and collusion
01:10:28.940 rather than coercion,
01:10:30.580 the tech companies
01:10:31.320 themselves are on the hook.
01:10:32.360 So they have a motivation
01:10:33.320 to blame it
01:10:34.020 on the government.
01:10:34.900 So that might be
01:10:35.860 a reason
01:10:36.480 that we would get access.
01:10:37.940 Very interesting.
01:10:38.640 So what's next?
01:10:39.440 You've examined
01:10:40.760 or deposed Fauci.
01:10:42.140 Are there any other
01:10:43.660 officials
01:10:45.360 yet to be
01:10:46.900 examined?
01:10:48.660 And what's next
01:10:49.860 in the lawsuit?
01:10:52.580 So, yeah,
01:10:53.440 there are other depositions.
01:10:54.700 Actually,
01:10:55.080 it was on,
01:10:55.500 I believe,
01:10:55.940 Thursday.
01:10:56.820 Elvis Chan
01:10:57.440 from the FBI
01:10:58.180 was deposed.
01:10:59.440 He had,
01:11:00.660 he was the FBI agent
01:11:02.260 who apparently
01:11:04.000 was responsible
01:11:04.720 for suppression
01:11:05.560 of the Hunter Biden
01:11:06.260 laptop story
01:11:07.100 on social media.
01:11:08.180 Wow.
01:11:08.660 That was a very
01:11:09.580 interesting deposition,
01:11:11.400 which I wasn't able
01:11:12.660 to watch a lot of.
01:11:14.240 It was over Zoom
01:11:15.160 because I couldn't
01:11:16.100 go to San Francisco.
01:11:17.600 But I think
01:11:19.620 you'll hear more
01:11:20.600 about that later,
01:11:21.200 but we have some
01:11:21.700 others coming up.
01:11:22.720 The most interesting
01:11:23.580 ones are Surgeon General
01:11:24.580 Vivek Murthy
01:11:25.300 and Press Secretary
01:11:26.360 Jennifer Psaki.
01:11:27.460 They are fighting
01:11:28.300 those depositions in a higher court,
01:11:29.300 the Fifth Circuit,
01:11:30.360 but they're currently
01:11:33.360 scheduled for various dates in December.
01:11:35.460 And so we'll see
01:11:36.140 if those go forward.
01:11:37.700 In any event,
01:11:39.540 once the depositions
01:11:40.780 wrap up,
01:11:41.640 we should have a hearing
01:11:42.520 on the preliminary injunction.
01:11:43.740 We're still in the
01:11:44.260 preliminary injunction
01:11:45.020 phase of this case
01:11:46.180 and then hopefully
01:11:47.420 get a decision
01:11:47.960 from the judge.
01:11:48.620 And I believe
01:11:49.040 it will be a favorable
01:11:49.800 one given the evidence
01:11:50.960 that we've uncovered.
01:11:52.420 Wow.
01:11:52.760 I think Jen Psaki,
01:11:54.000 and I remember
01:11:54.300 when we spoke about
01:11:54.960 that a week or two ago,
01:11:56.060 I think she is
01:11:57.420 the key person.
01:11:58.840 I really think
01:11:59.380 that she was more
01:12:00.120 than just a press secretary.
01:12:01.740 I think she was
01:12:02.500 a strategist.
01:12:03.220 I think she was a doer,
01:12:05.540 a kind of executive.
01:12:07.660 And it's no wonder
01:12:08.940 that she's fighting
01:12:09.660 so hard to avoid this.
01:12:11.420 Well, this is very exciting.
01:12:12.920 Congratulations to you.
01:12:14.300 I look forward to it.
01:12:15.180 Please do come back
01:12:16.300 to give us an update
01:12:17.560 on Jen Psaki
01:12:18.980 or just as this lawsuit
01:12:20.820 moves forward.
01:12:21.840 I think it will have
01:12:22.760 some ramifications
01:12:23.580 in Canada too.
01:12:25.040 We're treated like
01:12:25.940 a 51st state
01:12:27.220 in many ways.
01:12:28.000 I mean,
01:12:28.260 we're about the same size
01:12:29.940 as California.
01:12:31.360 And I think
01:12:32.540 our government
01:12:33.540 is so eager
01:12:34.840 to please
01:12:35.340 not just the tech companies
01:12:36.500 but Joe Biden too.
01:12:38.440 I bet there was pressure here too,
01:12:39.060 and I know
01:12:40.020 that we here
01:12:40.540 at Rebel News
01:12:41.180 have been censored.
01:12:42.460 We were demonetized
01:12:43.620 not for showing
01:12:45.180 any violence
01:12:45.880 or obscenity
01:12:46.820 or anything like that.
01:12:48.220 We were demonetized
01:12:49.260 for politics.
01:12:50.400 Wouldn't surprise me
01:12:51.200 if it was a phone call
01:12:52.180 from Trudeau.
01:12:52.960 We might find that out
01:12:53.900 one day.
01:12:54.920 Anyways,
01:12:55.400 for now I can only speculate
01:12:56.640 but maybe we'll see
01:12:58.640 what's revealed
01:12:59.900 in your lawsuit.
01:13:00.620 Keep it up, Janine.
01:13:01.200 It's great to catch up
01:13:02.240 with you.
01:13:03.120 Thank you so much.
01:13:04.020 It was great to catch up.
01:13:05.680 Right on.
01:13:06.140 There you have it.
01:13:06.540 Jenin Younes.
01:13:07.180 She's a lawyer
01:13:07.720 with the New
01:13:08.220 Civil Liberties Alliance
01:13:09.200 taking on
01:13:10.780 the Big Tech
01:13:12.220 Big Government
01:13:13.080 Censorship Project.
01:13:14.920 Stay with us.
01:13:15.640 More ahead.
01:13:16.160 Well, I hope I wasn't
01:13:28.760 too jumbled up
01:13:29.800 on those clips
01:13:30.680 from the journalism
01:13:32.420 panel in Ottawa.
01:13:34.040 It was very poorly
01:13:34.900 attended.
01:13:36.300 I followed along.
01:13:37.200 There was a reporter
01:13:37.860 from True North there
01:13:38.680 who was live-tweeting it
01:13:39.520 and then I watched
01:13:40.360 two hours of their
01:13:41.240 live stream today.
01:13:42.520 I think what's interesting
01:13:45.120 is they're raging
01:13:46.340 against ordinary people
01:13:47.940 talking back to them
01:13:49.040 and they always have
01:13:50.520 because they're losing
01:13:51.900 their monopoly power.
01:13:53.280 They think journalism
01:13:54.540 should be a guild,
01:13:55.960 an elite club.
01:13:56.780 In fact, when we've applied
01:13:57.780 to join the
01:13:58.300 Parliamentary Press Gallery
01:13:59.220 they've refused us.
01:14:00.680 Well, all right.
01:14:01.660 Our viewers came to us
01:14:02.820 instead of them.
01:14:03.860 We had more viewers
01:14:04.720 at Rebel News
01:14:05.240 during the trucker convoy
01:14:06.360 than the media party did.
01:14:08.260 And they're losing
01:14:09.400 ordinary readers too
01:14:10.960 just because
01:14:11.860 they refuse to engage
01:14:13.420 in the participatory
01:14:14.500 conversation about democracy.
01:14:16.860 It started with
01:14:17.840 talk radio,
01:14:19.060 call-in radio
01:14:20.120 in the 80s and 90s.
01:14:21.920 People felt great
01:14:22.700 that they could finally
01:14:23.440 talk back to journalists.
01:14:25.060 Then the internet,
01:14:26.560 of course,
01:14:27.600 exploded it,
01:14:28.640 with citizen journalists
01:14:28.640 and smartphones.
01:14:30.980 I think what we see here
01:14:32.280 is an old media guard.
01:14:34.500 Even though some of them
01:14:35.260 are young,
01:14:35.960 they want an old system
01:14:37.420 that rigs the rules.
01:14:38.980 They want to shut down
01:14:40.080 their competitors
01:14:40.720 and they want to shut up
01:14:42.180 their viewers.
01:14:44.040 The terrifying thing,
01:14:45.180 of course,
01:14:45.460 is that the Liberal Party
01:14:46.220 will go along with it.
01:14:47.580 I just hope that
01:14:48.400 independent citizen journalists
01:14:49.780 and our independent viewers
01:14:50.940 are resilient enough
01:14:52.400 to resist this.
01:14:53.700 Well, that's our show for today.
01:14:54.700 On behalf of all of us
01:14:55.660 here at Rebel World Headquarters,
01:14:57.420 to you at home,
01:14:58.140 good night
01:14:58.460 and keep fighting for freedom.