Western Standard - December 10, 2025


HANNAFORD: Get kids off their phones!


Episode Stats

Length

24 minutes

Words per Minute

166

Word Count

4,043

Sentence Count

236

Hate Speech Sentences

3


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

When the kids are wrapped up in their phones, they're not up to mischief anywhere else. This can't be good, and there are people trying to do something about it. Joining me today is Robyn Schirk, a volunteer mom with Unplugged Canada, a movement helping families prevent digital harm by delaying smartphones and supporting a social media age minimum.

Transcript

Transcript generated with Whisper (turbo).
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Good evening, Western Standard viewers, and welcome to Hannaford, a weekly politics show
00:00:22.020 of the Western Standard. It is Thursday, December the 4th. One of the clichés of our age is the
00:00:28.400 teenager focused on their cell phone to the exclusion of all else, including friends and
00:00:33.240 family. In fact, much of the time, they're not even teenagers. How often does mom's phone do
00:00:39.620 double duty as a babysitter? This can't be good, and there are people trying to do something about
00:00:46.480 it. Joining me today is Robyn Schirk. She's a volunteer mom with Unplugged Canada, a movement
00:00:52.640 helping families prevent digital harm by delaying smartphones and supporting a social media age
00:00:59.900 minimum. Welcome to the show, Robyn. Thanks for having me. Oh, glad to have you here. It's a bit
00:01:07.940 of an issue, isn't it? Robyn, how bad can this be? When the kids are wrapped up in their phones,
00:01:13.680 they're not up to mischief anywhere else, like turning over garbage cans in the back alleys.
00:01:18.100 So, why do we care? What's Unplugged Canada, and what is it that you are trying to accomplish?
00:01:24.120 Yeah. So, broadly, Unplugged Canada, it's a movement of parents across the country
00:01:28.660 that are really focused on preventing digital harm in children. We're doing that through local
00:01:35.100 support and helping parents delay giving their kids a smartphone if they choose to,
00:01:40.920 and also with national advocacy, asking for Canada to have a social media age minimum.
00:01:46.520 And all this is about protecting childhood and helping kids have a healthy, age-appropriate
00:01:53.420 development.
00:01:55.860 Now, look, every age has its own difficult things, right? And, you know, when I was 10 years old,
00:02:04.320 I was always in a book. And so, I'd get this disembodied voice coming from somewhere else in the house,
00:02:12.140 get out of the book and come and peel potatoes or go make your bed or something like that, you know.
00:02:18.020 So, 70 years ago, it was books. Today, it's a smartphone. Has anything really changed?
00:02:25.900 Yeah, I think I would, first of all, I'd separate smartphone from social media because there can be,
00:02:31.800 you know, there's differences there. And so, if I was thinking about the kids' interaction,
00:02:36.900 reading books or watching TV versus being on social media, those are just fundamentally different.
00:02:43.780 You know, we didn't know what we released, but when you look at social media now with the social
00:02:47.940 comparisons and the, you know, the predator outreach and the manipulation through the algorithms,
00:02:54.680 it really is creating this compulsive use and these mental health harms that have become quite clear
00:03:00.700 and just did not, you know, you did not see that happen when kids spent hours reading books or
00:03:06.200 watching television, for instance. So, we are seeing that this does feel quite different.
00:03:11.880 Do you have a, you know, an example of somebody's family where things went spectacularly wrong
00:03:18.760 and it was the smartphone that ultimately proved to be the problem?
00:03:22.720 Yeah, I think the hard thing is, is there's so many examples, it's hard to pick just one.
00:03:29.760 When you think of, you know, teenage girls in Canada today, 12% of all teenage girls show signs of addiction.
00:03:36.560 It's things like withdrawal, loss of control. So, I think that's what we found is...
00:03:41.380 Now, let me stop you there. When you're saying they're showing signs of addiction,
00:03:45.660 are you saying that they're addicted to their phones or that the phones make them addicted to something else?
00:03:51.160 Yes, drugs, whatever.
00:03:52.900 So, addicted to the social media app itself.
00:03:55.800 So, when you think about what social media is, it's designed to create this compulsive use,
00:04:00.800 to maximize your engagement, as they might call it.
00:04:03.840 And we see that. A third of teenage girls today, 31%, spend more than five hours a day on social media.
00:04:11.140 So, what about teenage boys? Are they part of the problem, too?
00:04:14.140 Mm. So, teenage boys are impacted. They're impacted slightly differently.
00:04:18.020 So, they do show those compulsive rates. So, it's, I believe it's about one in five overall for teenagers,
00:04:25.120 have that compulsive use. But for teenage boys, some of the harms are a bit different as well.
00:04:30.900 They're more likely to be exposed to excessively violent content, for instance,
00:04:35.580 and groomed into, you know, some of those violent ideologies.
00:04:39.400 So, it's, you know, the harms are a little bit different between boys and girls, but they're both harmful.
00:04:44.020 Hmm. So, is this something that people grow out of?
00:04:49.860 You know, one thing that we've seen is when you have harms, like mental health challenges,
00:04:55.400 like depression or anxiety, is those can carry with you. Those can linger.
00:04:59.860 Same things if you see really explicit content that's terrifying to you or disturbing.
00:05:05.600 That lingers. Once you see that, you cannot unsee that.
00:05:08.340 So, in that way, I would say it lingers.
00:05:11.280 Yes. You can't unsee the death of Charlie Kirk, can you?
00:05:15.140 And I understand that's something a lot of kids go looking for, which is, you know, a frightening thought in its way.
00:05:22.440 All right. So, the kids have the phones.
00:05:25.980 Why does anybody give a 10-year-old a smartphone?
00:05:34.620 Yeah, I think, you know, there's smartphones and there's some of the harmful content that can come through unfettered access to the Internet.
00:05:41.000 So, if someone's going to give their child a phone, I understand you want to be able to message them.
00:05:46.840 Maybe you want them to be able to have a map to know where they're going.
00:05:50.380 And there's certain circumstances where you might need it for medical reasons as well.
00:05:54.580 And that's very understandable.
00:05:57.080 You know, we are seeing new types of phones come onto the market that don't have that unfettered access to the Internet,
00:06:03.200 but do let kids have access to the services they need.
00:06:05.900 So, for instance, Wisephone is one example, or Pinwheel, for instance.
00:06:10.940 So, I think it's reasonable for kids to want to be connected to their parents or to each other.
00:06:15.700 But it's this unfettered, unsupervised access to these age-inappropriate platforms in the Internet is really where we're focused.
00:06:23.940 Well, of course, you're always going to have a child who has the phone for the kind of valid reasons you've just talked about.
00:06:32.040 But then, in any group of a dozen kids, there's probably one whose parents see things differently,
00:06:37.860 and they can wander the Internet at will.
00:06:40.640 And you see this little cluster of children gathered around the one who's actually got the phone,
00:06:45.080 and they're all looking at something that intrigues and interests all of them and not necessarily in a good way.
00:06:51.380 So, is it enough to get parents to, like, how does your organization see extending this so that scenario doesn't occur?
00:07:04.820 Yeah.
00:07:05.300 So, we recognize this is a complex challenge that requires a lot of angles.
00:07:10.800 So, whether that's education in schools or just in the home, to be aware of the risks and the types of harms that can occur,
00:07:19.840 whether it's parent involvement and community support, helping parents group together and say,
00:07:24.860 you know, we'd like to delay giving our child this device until, you know, they're of the right age.
00:07:29.720 But that's also why we're focused on the national advocacy as well,
00:07:32.520 because there's some things that, you know, just are not age-appropriate for children.
00:07:37.180 Just as you agree, children shouldn't be going to nightclubs or driving or gambling in casinos,
00:07:43.400 there's some interactions online where the compulsive use design, the harms and the risks,
00:07:48.820 just, you know, warrant the same level of youth protections.
00:07:51.720 And I would put social media in that category.
00:07:55.060 So, that's why we're advocating for that, some of these, I'll call it national age minimums,
00:08:00.860 alongside the local community involvement and education awareness.
00:08:04.440 Sorry, national internodes, did you say?
00:08:08.740 Oh, national advocacy.
00:08:10.680 So, we're asking for Canada to have a social media age minimum,
00:08:15.220 keep up with, you know, proposals that are, you know, like the bipartisan Senate bill in the U.S. right now,
00:08:21.020 or the suggestions for the EU to have an EU-wide minimum.
00:08:24.880 So, you see our...
00:08:26.380 What age would you set, then?
00:08:28.240 If somebody said, yep, that's the answer, you're on to a good thing,
00:08:32.160 we're going to say it should be at this age.
00:08:35.220 What would you recommend as the age when a child should be allowed to get onto the internet
00:08:41.400 and wander wherever they will?
00:08:42.680 Yeah, so I think it's really all about developmental readiness.
00:08:46.340 And what the research is showing right now is, for social media specifically,
00:08:52.360 girls are most vulnerable to harm around the ages of 11 to 13,
00:08:56.520 whereas boys are most vulnerable between ages of 13 to 15.
00:08:59.700 So, that's why you're seeing countries internationally set that age at 16,
00:09:03.120 and that's why we would recommend 16 as well,
00:09:05.380 is to make sure you're kind of past those most vulnerable years
00:09:09.160 before having these interactions.
00:09:10.560 I hate to be a naysayer, but...
00:09:16.240 And actually, I'm not.
00:09:17.160 I wish you all the luck in the world.
00:09:18.400 But do you think that it is going to fly?
00:09:22.980 I mean, there are going to be some people.
00:09:24.800 There's obviously an organization that you're working with,
00:09:28.520 and I'm going to ask you about that organization in just a moment.
00:09:32.840 But in any group of 100 parents,
00:09:36.380 you know that there are going to be some who say,
00:09:38.320 yep, we've got to keep the phones out of the hands of the children,
00:09:41.740 and then there'll be others who, for whatever reason,
00:09:44.820 and it may well be that they don't want the fight.
00:09:49.260 It may be nothing more than that, but they're not going to go with you.
00:09:51.700 And then you've got some who are on and some who are off,
00:09:54.060 and here's a whole new source of teenage tension.
00:09:57.740 What do you think?
00:09:58.680 What are your chances?
00:09:59.320 I think, well, I would separate the having a phone versus a social media age minimum.
00:10:05.740 For having a phone, you know, I'd support every family and where they're at,
00:10:08.600 and I recognize there's different circumstances there.
00:10:11.560 So you're right.
00:10:12.600 You're never going to have consensus on how families want to approach that.
00:10:16.120 There will be differences.
00:10:18.060 Now, for a national age minimum for social media,
00:10:21.340 that actually has widespread support.
00:10:24.620 You know, Ipsos, you know, a polling firm,
00:10:27.840 actually surveyed 30 countries around the world, including Canada,
00:10:31.080 and they found that the majority of people in every country they surveyed
00:10:35.040 supported an age minimum, a government-enforced strong age minimum.
00:10:39.560 And among parents, you know, that support was three in four,
00:10:42.760 it was the majority.
00:10:44.400 You used one of the buzzwords that gets us alarmed,
00:10:48.060 and that is government enforcement.
00:10:49.800 What do you think the government could or should do about this?
00:10:56.560 You know, just like how we don't let kids buy cigarettes
00:11:01.120 or buy alcohol or gambling casinos, it's the same way.
00:11:05.300 It's putting the responsibility on the platform
00:11:07.520 to have some sort of age assurance.
00:11:11.180 And to be clear, that does not require a government ID.
00:11:15.220 If you look at what they did in the UK
00:11:17.880 when they implemented, you know, age enforcement
00:11:20.860 for going to explicit sites like pornography sites,
00:11:24.240 there they found there's multiple ways
00:11:25.780 you could verify someone is of age
00:11:28.020 without needing a government ID.
00:11:30.480 For example, using a $0 charge on a credit card
00:11:33.700 or sharing an email.
00:11:35.640 And if that email address has existed for so long,
00:11:37.820 you know that digital footprint
00:11:39.660 means that person is obviously of age.
00:11:41.380 So there's multiple ways to do this
00:11:44.720 that are very privacy-preserving.
00:11:47.020 And I think the onus should be on the platform.
00:11:49.680 It's not on the person.
00:11:51.340 And what I'd say is it was just this past summer
00:11:53.540 that Canada itself launched a new national standard
00:11:57.420 for age assurance technology.
00:11:59.040 So we have the framework in place now
00:12:00.600 in a way that we just didn't a couple of years ago.
00:12:03.020 So we have, you know, we have the guardrails,
00:12:05.820 we have the design,
00:12:06.480 and we've seen it rolled out in other industries
00:12:08.240 and other countries to know that it can be done
00:12:10.240 safely and effectively
00:12:11.820 without having to, you know, share too much information.
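[Editor's note: the "age assurance" idea described here can be sketched roughly in code. Everything below is illustrative only: the function names, the 16-year threshold, and the two signals (email account age, a $0 credit card authorization) are assumptions drawn from this conversation, not any platform's or provider's actual implementation.]

```python
from datetime import date

MIN_AGE = 16  # assumed social media age minimum, per the interview

def email_account_age_ok(created: date, today: date, min_years: int = MIN_AGE) -> bool:
    """An email address that has existed for many years implies an adult owner."""
    return (today - created).days >= min_years * 365

def card_check_ok(zero_charge_succeeded: bool) -> bool:
    """A $0 authorization on a credit card implies the holder is an adult."""
    return zero_charge_succeeded

def age_assured(signals: dict) -> bool:
    # The onus is on the platform: it runs these privacy-preserving checks
    # itself, and no government ID is collected. Any one passing signal
    # is enough to allow account creation.
    return (email_account_age_ok(signals["email_created"], signals["today"])
            or card_check_ok(signals["card_ok"]))

# A long-lived email address alone clears the check:
print(age_assured({"email_created": date(2005, 1, 1),
                   "today": date(2025, 12, 10),
                   "card_ok": False}))  # → True
```

In practice, platforms delegate this to third-party age-assurance providers with many more signals; the sketch only illustrates the "multiple signals, no government ID" principle Robyn describes.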
00:12:16.040 You were describing a technology that we have
00:12:18.760 that works in other circumstances
00:12:22.120 that can be made to work
00:12:23.680 for verifying the age of children
00:12:28.720 so that they're not on the internet.
00:12:32.480 What is that?
00:12:33.840 Can you describe that again, please?
00:12:35.600 Yeah, so it's called age assurance technology.
00:12:38.240 And what that means is, you know,
00:12:41.240 you're not checking a specific ID,
00:12:43.660 but you have other ways that you can validate
00:12:45.740 someone is of age.
00:12:47.420 So, you know, there's multiple ways that can be done.
00:12:49.920 You're seeing that being used in the UK right now
00:12:53.200 for, you know, for their explicit sites
00:12:56.080 and the age checks there,
00:12:57.960 or in Australia for joining a social media account.
00:13:01.220 So they explicitly say,
00:13:02.280 you do not need to use a government ID
00:13:04.200 in order to have your age validated.
00:13:06.120 So...
00:13:07.120 So the child would be trying to log onto a site
00:13:12.720 and it would just give them a blank screen
00:13:16.160 or, you know, sorry, you're not verified
00:13:18.580 or something like that.
00:13:19.500 Is that what this would look like?
00:13:21.640 Yeah.
00:13:21.940 So it would block you at the account setup level
00:13:24.500 so you could see the links from the site,
00:13:26.380 but you wouldn't be able to make your own profile.
00:13:28.120 And making your own profile,
00:13:30.600 that's the reason why that matters
00:13:32.480 is when you have your own profile,
00:13:34.220 that's when predators can reach out to you.
00:13:36.180 That's when you can have those social comparisons,
00:13:37.980 you know, tracking the likes, all of those things.
00:13:41.780 And it also lets the algorithm really collect your data
00:13:44.380 and lock into you and become more addictive
00:13:46.600 because you're sharing so much information
00:13:49.200 and it can trace it all back to you.
00:13:50.940 So it's this idea that you should be,
00:13:53.600 you know, of an age to consent
00:13:56.240 to sharing all that data
00:13:57.440 and consent to understanding
00:13:58.900 how you're being manipulated
00:13:59.940 before you're put in that circumstance.
00:14:02.720 How exactly are children being manipulated?
00:14:05.700 What are the levers that get pulled?
00:14:08.200 Yeah.
00:14:08.620 So there's what I'll call content rewards
00:14:11.000 and social rewards.
00:14:12.560 So the manipulation on the content front
00:14:14.760 is that's the algorithmic feed,
00:14:17.560 feeding you content to keep you engaged.
00:14:20.080 And the more emotionally charged that content is
00:14:22.800 and the more kind of, you know,
00:14:24.780 wow, attention getting that content is
00:14:27.180 the more likely you are to stay hooked.
00:14:28.980 It's almost like pulling a slot machine lever
00:14:32.860 to see what pops up.
00:14:35.380 So that's the emotional manipulation there
00:14:38.240 is it can put you in-
00:14:39.440 And this ultimately is about
00:14:40.560 selling product to children, right?
00:14:42.480 This has got to be about the cash.
00:14:44.760 Yeah, exactly.
00:14:46.580 These are ad serve platforms.
00:14:47.920 Their entire design is to sell,
00:14:50.020 you know, to sell advertisements
00:14:51.360 and keep you on there longer
00:14:52.920 to be able to sell more advertisements to you.
00:14:55.440 So whether it's, you know,
00:14:57.520 manipulating the content
00:14:58.600 to try and hook your emotions
00:14:59.780 and keep you there longer
00:15:00.960 or the social rewards,
00:15:02.500 looking at the number of likes
00:15:04.320 or followers or comments
00:15:05.700 on different types of content
00:15:07.600 that you're interested in.
00:15:09.140 And all those notifications as well,
00:15:11.640 you know, someone telling you to come back in.
00:15:13.400 All of those things are designed
00:15:14.920 to catch and hook your attention.
00:15:17.460 And for adults, you know, that's fine.
00:15:19.080 We have the impulse controls
00:15:20.380 and the critical thinking skills developed.
00:15:22.720 But for young children,
00:15:23.760 they haven't developed that yet.
00:15:24.920 So it's really preying
00:15:25.920 on some of their vulnerabilities
00:15:27.380 and just, you know,
00:15:29.980 they just haven't had
00:15:30.880 the developmental maturity
00:15:31.880 to be able to handle these situations
00:15:34.660 the way that adults can.
00:15:36.660 Okay. Tell me a little bit
00:15:37.820 about your organization.
00:15:39.100 How long have you been around
00:15:40.080 and what kind of pickup are you getting?
00:15:42.300 And then I want to take you
00:15:43.440 to the petition
00:15:44.840 that you're asking people to sign.
00:15:46.860 So first, what about your organization?
00:15:48.980 So Unplugged Canada,
00:15:50.420 it was officially formed last year
00:15:53.080 and it was formed by Jenny Perez
00:15:55.140 out of Vancouver.
00:15:56.560 And what it is,
00:15:57.700 is it's a group of volunteers
00:15:59.580 across the country.
00:16:01.040 So we've got chapters
00:16:01.900 from Nanaimo to Nova Scotia
00:16:04.300 and we're focused
00:16:05.640 on preventing digital harm in kids.
00:16:07.440 And that's through supporting families
00:16:09.020 in our local communities
00:16:10.020 if they do choose
00:16:11.640 to want to delay getting
00:16:12.580 their child a smartphone.
00:16:13.860 It's to help parents band together
00:16:15.260 and alleviate that peer pressure
00:16:16.860 to have one
00:16:17.480 until they feel that their kids
00:16:19.500 are of age to be able to handle that.
00:16:22.080 And then the other piece
00:16:24.580 is the national advocacy.
00:16:26.320 And this is the piece
00:16:27.520 that is asking the government
00:16:29.120 to put in a social media age minimum
00:16:31.000 of 16
00:16:32.160 and also have stronger
00:16:33.720 user data privacy rights
00:16:35.140 for under the age of 18.
00:16:36.720 So once they are old enough
00:16:38.180 to be on these platforms,
00:16:39.220 these companies are not allowed
00:16:41.080 to monetize
00:16:41.940 and share
00:16:42.600 and sell their data
00:16:43.540 until these kids
00:16:44.580 are old enough
00:16:45.500 to recognize
00:16:46.160 what they're giving away.
00:16:48.160 Now, you have already presented
00:16:51.000 a petition
00:16:51.840 to the federal government
00:16:53.620 on this matter.
00:16:54.980 We have.
00:16:56.860 Is there any further possibility
00:16:59.280 for people to engage
00:17:01.060 by signing another one?
00:17:04.040 Absolutely.
00:17:04.860 I think in terms of engaging,
00:17:06.740 I would encourage folks
00:17:08.300 to sign the call to action
00:17:09.360 on unpluggedcanada.com
00:17:11.380 and also the direct participation.
00:17:14.360 Writing a letter to your MP
00:17:16.080 and saying that you would want
00:17:17.600 to have these child protections
00:17:19.140 put in place.
00:17:20.800 We've been talking
00:17:21.880 with dozens of MPs
00:17:23.260 across the country
00:17:24.100 and I've been to Ottawa
00:17:25.880 three times in the last month
00:17:27.100 back and forth for meetings.
00:17:28.760 So it is gaining
00:17:30.060 incredible momentum
00:17:31.000 and we're gaining cross-party,
00:17:33.600 multi-party support as well.
00:17:34.980 I noticed you got a response
00:17:37.160 to your first petition
00:17:39.100 from Mr. Guilbeault.
00:17:42.000 He was, you know,
00:17:44.180 one of the liberal ministers.
00:17:46.880 He was one of the liberal ministers.
00:17:48.760 He's resigned now.
00:17:50.320 But there's not a lot
00:17:52.220 of common ground
00:17:53.200 that we find with Mr. Guilbeault
00:17:54.880 here at Western Standard.
00:17:57.760 How did you rate his response
00:18:00.500 to the petition
00:18:01.860 when he received it?
00:18:03.120 You know, I always know
00:18:05.640 that tabling a petition
00:18:06.760 is, that's the start.
00:18:08.860 That's not the finish
00:18:09.560 of this effort.
00:18:11.100 So, yeah, getting the attention
00:18:13.260 of the government
00:18:14.060 is important.
00:18:15.620 And, you know,
00:18:17.140 moving forward
00:18:17.920 with multi-party legislation
00:18:19.660 to prevent harm to children
00:18:21.600 and protect children,
00:18:23.000 I think everyone wants
00:18:24.000 to see that happen.
00:18:25.020 And we certainly do as well.
00:18:26.940 So, you know,
00:18:27.920 his response was a bit broader
00:18:29.200 and it referenced several pieces
00:18:30.480 of legislation.
00:18:32.980 But I want to see action today.
00:18:34.760 I don't want to wait.
00:18:35.920 So, that would be,
00:18:37.080 that would be my response
00:18:38.020 is I want to see us move yesterday.
00:18:40.760 So, let's see what we can do here.
00:18:42.660 Right.
00:18:43.340 Well, you know,
00:18:44.060 I took a look at his response
00:18:45.920 and yours is a very particular concern:
00:18:51.080 keeping younger children
00:18:53.920 off social media.
00:18:57.020 He took the opportunity
00:18:58.920 to explain all about Bill C-9,
00:19:03.920 which they couch
00:19:05.400 in terms of protecting children.
00:19:07.320 So, it wasn't completely off topic.
00:19:09.520 But it's also a bill
00:19:11.360 that is about to be used,
00:19:14.240 it seems,
00:19:15.240 the reports indicate that,
00:19:17.380 to restrict religious freedom as well.
00:19:20.420 I won't ask you to comment
00:19:21.660 on that specifically,
00:19:23.080 but, you know,
00:19:24.480 how confident do you really feel
00:19:26.340 that Heritage Canada,
00:19:29.640 which is the agency you're addressing,
00:19:32.920 understands what you're talking about
00:19:35.320 as opposed to using you
00:19:37.060 as an excuse to talk about
00:19:38.680 what they want to talk about?
00:19:41.100 Yeah.
00:19:41.760 I think to answer that broadly,
00:19:43.980 I have no interest
00:19:45.000 in watching my kids
00:19:45.860 be held hostage.
00:19:47.280 I want to see legislation
00:19:48.460 to prevent harm
00:19:49.960 and protect my children.
00:19:51.700 And I think we need
00:19:53.100 to focus on elements
00:19:54.840 that have multi-party support
00:19:56.580 that can get passed quickly.
00:19:58.800 And so, that's really
00:19:59.740 where I'm emphasizing
00:20:00.540 and that's where I would focus.
00:20:03.140 Yes, you can talk about
00:20:04.400 broader initiatives
00:20:05.240 and broader efforts.
00:20:06.760 And, you know,
00:20:07.480 I'm not going to comment on that
00:20:08.920 because where I'm really focused on
00:20:11.120 and where my effort is
00:20:12.600 is on getting
00:20:13.420 a social media age minimum.
00:20:14.820 And this is a targeted ask.
00:20:17.160 It has multi-party support
00:20:18.560 and I see no reason to delay it.
00:20:21.780 Have you had any
00:20:22.340 face-to-face discussions
00:20:24.060 with government officials
00:20:25.140 on this yet?
00:20:26.440 Member of Parliament
00:20:27.420 or somebody within
00:20:29.100 Heritage Canada, perhaps?
00:20:30.580 I have.
00:20:31.260 Yeah, I did.
00:20:31.840 I did get to talk quickly
00:20:33.340 with Global as well.
00:20:34.880 But I did talk,
00:20:36.440 I have talked
00:20:37.280 with four different
00:20:38.040 ministers' offices so far.
00:20:39.900 And as I mentioned,
00:20:41.540 a couple dozen MPs
00:20:43.280 and I've met senators as well.
00:20:44.800 So, we are seeing
00:20:46.380 a lot of traction
00:20:47.200 and a lot of movement.
00:20:48.740 And I do think
00:20:49.620 we can make this happen,
00:20:50.700 but we just have to show
00:20:52.140 that this is what Canadians want.
00:20:54.460 And we see it
00:20:55.000 in the polling numbers,
00:20:56.080 but emailing your MP
00:20:57.400 to ask for this
00:20:58.340 is really important
00:20:59.740 to show that broad
00:21:00.960 national support.
00:21:02.240 Mm-hmm.
00:21:02.860 Okay.
00:21:03.420 How many,
00:21:04.080 this is the last question,
00:21:05.640 we're almost out of time.
00:21:07.160 I'm sorry,
00:21:07.640 but how many parents
00:21:10.300 have responded so far
00:21:12.440 and actually signed
00:21:14.220 the petition?
00:21:15.860 Yeah.
00:21:16.340 So, what I would say is,
00:21:17.460 so the petition
00:21:18.000 was only open for 30 days
00:21:19.360 and it was a paper one.
00:21:20.860 That was always kind of symbolic
00:21:22.200 to get this piece going.
00:21:24.000 So, what I would really focus on
00:21:25.620 is the polling numbers.
00:21:27.180 It doesn't matter
00:21:27.660 what poll you look at,
00:21:29.520 the vast majority
00:21:30.600 of Canadians want this.
00:21:32.200 How big is vast?
00:21:33.680 70%, 80%?
00:21:35.880 80%.
00:21:36.520 So, 81% said
00:21:38.000 they want to see
00:21:38.520 a government-enforced age minimum
00:21:40.220 for joining social media.
00:21:43.640 And, you know,
00:21:44.320 there's different polls
00:21:45.500 asked different age numbers.
00:21:47.080 There was one that said
00:21:48.580 84% of Canadians
00:21:50.700 agreed that kids
00:21:51.420 should be 16 to join.
00:21:53.500 But when you looked at,
00:21:54.680 you know,
00:21:55.000 the Quebec Select Committee report,
00:21:57.240 it had 90% agreement
00:21:58.480 in their consultation,
00:22:00.380 including 76% agreement
00:22:03.320 among teenagers.
00:22:04.240 So, we see Canadians want this
00:22:06.500 and even teens want this.
00:22:08.080 They, you know,
00:22:09.700 we're finding a lot of folks
00:22:10.700 join because their peers
00:22:11.680 are on it
00:22:12.020 and they don't want
00:22:12.480 to be excluded,
00:22:13.180 but that doesn't mean
00:22:14.160 they want to be fed
00:22:14.820 all this other stuff
00:22:15.620 and get all this content
00:22:16.820 that can be quite explicit.
00:22:18.200 It's very interesting
00:22:18.880 because I hear informally
00:22:21.360 from people
00:22:22.060 who have younger children
00:22:24.940 that kids aren't necessarily
00:22:27.840 as into it today
00:22:29.900 as they were even a year ago.
00:22:32.660 And anecdotally,
00:22:34.720 you seem to be saying
00:22:36.200 the same thing.
00:22:38.040 Yeah, I think
00:22:38.600 you're seeing kids
00:22:39.960 are starting to recognize
00:22:40.760 the harms.
00:22:42.100 We saw that
00:22:42.920 when we went out
00:22:43.720 collecting petition signatures
00:22:45.440 for the petition we had.
00:22:48.140 What surprised me the most,
00:22:49.700 it was like the 18,
00:22:50.940 19, 20-year-olds
00:22:52.040 were the ones
00:22:52.960 that would take the pen
00:22:53.680 out of my hand
00:22:54.240 and just start signing
00:22:55.080 and then start sharing
00:22:56.020 their short stories
00:22:56.920 of what had happened
00:22:57.620 to their friend,
00:22:58.260 what had happened to them,
00:22:59.620 what they had seen.
00:23:01.080 So, yes,
00:23:02.040 I think kids
00:23:02.800 that have been through it
00:23:03.800 and have had experiences
00:23:05.360 realize, you know,
00:23:06.880 maybe you shouldn't be
00:23:08.080 giving that
00:23:08.420 to middle school students.
00:23:09.380 Robyn,
00:23:11.520 we are out of time,
00:23:13.380 but I must say
00:23:14.600 that I really find myself
00:23:16.100 in 100% sympathy
00:23:17.760 with the aims
00:23:18.380 of your organization.
00:23:19.640 That's Unplugged Canada.
00:23:21.220 If anybody wants
00:23:22.380 to get in contact
00:23:24.020 with you,
00:23:24.560 how would they do that?
00:23:26.240 Yeah, I would
00:23:27.160 go to Unplugged Canada.
00:23:28.980 You can contact
00:23:29.820 the email there
00:23:30.520 or it's robyn
00:23:31.340 at unpluggedcanada.com
00:23:32.880 as well.
00:23:33.560 Using your smartphone,
00:23:34.680 of course.
00:23:35.920 You know,
00:23:37.040 there's a lot of good
00:23:38.220 that comes from
00:23:38.760 these devices.
00:23:39.380 It just has to be
00:23:40.260 an age-
00:23:40.580 It just has to be
00:23:41.280 channeled,
00:23:41.720 doesn't it?
00:23:42.320 Yeah, I'm so with you.
00:23:44.140 Robyn Schirk,
00:23:45.300 volunteer with
00:23:46.740 Unplugged Canada
00:23:47.660 talking about
00:23:48.800 trying to take
00:23:50.720 our kids back
00:23:51.560 out of the grip
00:23:52.680 of the smartphone
00:23:54.840 social media revolution
00:23:57.100 that's taken us
00:23:57.900 over in the last decade.
00:23:59.800 Robyn,
00:24:00.240 thank you very much
00:24:00.960 for being with us.
00:24:01.780 It's been a pleasure
00:24:02.180 to have you.
00:24:03.300 Yeah, thanks for having me.
00:24:04.660 Appreciate it.
00:24:05.440 For the Western Standard,
00:24:06.600 I'm Nigel Hannaford.
00:24:08.200 Thank you.