True Patriot Love - March 09, 2026


Social Media Is Getting Dangerous (2026)


Episode Stats


Length

30 minutes

Words per minute

200.65

Word count

6,086

Sentence count

89

Harmful content

Misogyny

4

sentences flagged

Toxicity

7

sentences flagged

Hate speech

3

sentences flagged
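The stats above are related by simple arithmetic: words per minute is the word count divided by the episode length in minutes. A minimal sketch (the word count and rate come from the table above; the exact duration is an assumption, since the listed "30 minutes" is rounded):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Average speaking rate: word count divided by duration in minutes."""
    return word_count / (duration_seconds / 60.0)

# The table's ~200.65 wpm implies a duration slightly over the rounded
# "30 minutes": 6086 words / 200.65 wpm is about 30.33 minutes.
implied_minutes = 6086 / 200.65
```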


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

As we roll on in 2026, social media is being highlighted around the world, and it could be good at times, but it could also be very very ugly as well. In this episode, we discuss the good, the bad, and the ugly of social media, and how it affects us all.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
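The per-sentence flags credited above could be reproduced with the Hugging Face transformers text-classification pipeline. A rough sketch, assuming the three model IDs from the credits; the 0.5 threshold and the positive-label names are assumptions, not this page's actual logic:

```python
# Classifiers credited above (Hugging Face Hub model IDs).
CLASSIFIERS = {
    "misogyny": "MilaNLProc/bert-base-uncased-ear-misogyny",
    "toxicity": "s-nlp/roberta_toxicity_classifier",
    "hate_speech": "facebook/roberta-hate-speech-dynabench-r4-target",
}

# Assumed positive label names -- each model reports its own label set.
POSITIVE_LABELS = {"misogynist", "misogyny", "toxic", "hate"}

def is_flagged(label: str, score: float, threshold: float = 0.5) -> bool:
    """Treat a result as flagged when its label is positive and confident enough."""
    return label.lower() in POSITIVE_LABELS and score >= threshold

def flag_sentences(sentences, model_name, threshold=0.5):
    """Run one classifier over transcript sentences; return the flagged ones."""
    from transformers import pipeline  # imported lazily: loading downloads weights
    clf = pipeline("text-classification", model=model_name)
    return [
        (sent, round(res["score"], 2))
        for sent, res in zip(sentences, clf(sentences))
        if is_flagged(res["label"], res["score"], threshold)
    ]
```

Counting `len(flag_sentences(...))` per classifier would yield the "sentences flagged" tallies in the stats block above.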
00:00:00.000 as we roll on in 2026 the good the bad and the ugly social media is being highlighted around
00:00:11.040 the world and it could be good at times but it could be very very ugly as well to talk more
00:00:15.820 about it thrilled to be joined by one of our newest members here at tpl media shalisa backer
00:00:20.520 shalisa how are you hey good how are you jim good good let's let's get before we get into some of
00:00:25.860 stuff that's going on when exactly did social media take a hold on society the way it is right
00:00:32.020 now when did that start that's a really good question you know what i think i think covid
00:00:37.300 had a lot to do with it people were bored obviously that's when tiktok really blew up as well
00:00:42.740 and it has just had a hold on everybody and it doesn't matter how old you are ever since in some
00:00:48.660 way shape or form social media has some sort of hold on you you know i thought tiktok was
00:00:53.140 strictly uh you know pre-teens teens college kids but i have one of my wife's cousins a dairy
00:01:00.740 farmer in his 50s who watches tiktok videos of other dairy farmers milking and i'm like you do
00:01:07.540 and what is it about the tiktok algorithm that made it what it is i don't know what it is i
00:01:14.580 don't know how they know what's going on inside my head but somehow somehow it just always knows
00:01:19.380 knows what you're going through and somehow it ends up on your algorithm like that needs to be
00:01:23.180 studied how they know exactly what's going on in everybody's life if they know exactly what content
00:01:27.720 to feed you and also i get nervous to like look things up on tiktok sometimes because i'm like
00:01:32.880 well now my whole feed is going to be this so when did it go from uh sharing information about where
00:01:39.340 you're going to eat from dinner and birthday messages and letting people know i saw a good
00:01:44.380 movie to turning to toxic where you say something a and it goes down whether it's people who are
00:01:52.400 just trolls or bot farms and it spirals out of control yeah and i think that also makes us think
00:01:58.700 like are people really that bored that they have like nothing to do than just troll online and leave
00:02:05.140 stupid comments for lack of a better term and it's like aren't we tired of it and i
00:02:11.860 i really wish i could tell you when that shift happened but i think it happened so gradually
00:02:17.380 and now looking back on it we're like what happened how did we get here so a little social
00:02:22.900 experiment i posted something on my facebook that went on my threads i didn't mention trump
00:02:28.420 i didn't mention america i didn't mention player names i didn't mention canada or the us i just
00:02:33.860 said to paraphrase i could look it up but when i watch hockey i don't think politics i just watched
00:02:39.780 the hockey game i just want to watch it i did see that and i liked it so that's fun to thousands of
00:02:45.940 comments from people who thought i was commenting on trump and america and one individual said well
00:02:53.380 i guess i know who you voted for so i put a response saying well i voted ndp i'm canadian
00:02:59.620 and they said why are you commenting on american subject matter and i didn't and this is what like
00:03:05.220 i was like i was no names no country it was as generic as possible and it went crazy and i was
00:03:14.260 fascinated to see how one comment fed off the other and they started to fight with each other
00:03:19.940 it really it really is fascinating and sometimes you're just wondering like how how that it just
00:03:26.740 it doesn't make sense to me sometimes and this is like somebody who's chronically online i'm
00:03:30.340 on social media all the time more than i'd like to admit but it it just it gets really really scary
00:03:36.980 and it's it's crazy how some content that you think is harmless like just like that one post
00:03:42.100 on threads or whatever the case is how it just completely blows up and something that's actually
00:03:46.580 important like important subject matter to what's going on in the world it gets like no hits and no
00:03:52.660 views right so what how do you know if you're on social media that it's a bot commenting on what
00:03:58.980 you did and if it's just some rando who just feels like spewing hate well i think usually if you look
00:04:04.980 at their profiles usually their their username is also a good tell you know you'll see like a random
00:04:10.340 numbers or it'll say like user 59977 something like that no profile picture yes uh and if you
00:04:16.740 look at their actual feed they probably have no posts or just like generic posts that are maybe
00:04:22.340 ai created and things like that and then you'll see you know real people most likely will have
00:04:26.900 a photo of themselves as their as their profile picture you look at their feed and see what
00:04:30.420 they're posting or what they're reposting and it's sometimes hard to tell because these bots
00:04:34.740 and ai are getting very convincing these days so i i'm a believer in using different social
00:04:40.660 media platforms for different topics you know facebook is more personal family um you know
00:04:47.060 instagram i find my buddies and we're talking hockey and golf and dogs and like silly stuff
00:04:51.860 like that am i right is it should you use different platforms for different things in your life i mean
00:04:57.460 i do it too i do too i mean i barely use facebook anymore to tell you the truth if i didn't if i
00:05:03.060 didn't need it for work i wouldn't use it i'll be honest and i feel like instagram is kind of
00:05:07.300 becoming that too and i don't know if this is a generational thing but i used to i love instagram
00:05:11.620 i used to love instagram that was my number one platform but now i'm finding oh is that right yeah
00:05:15.940 Yeah. And, you know, back in the day when Instagram started, what, 10, 10, 15 years ago, it was like posting food photos.
00:05:23.600 Right. You know, that was a big thing. Right. That was what Instagram was used for. And then it just it progressed into, I don't know.
00:05:31.060 Now it just feels like it is flooded with things that are just untrue. A lot of fake news on the platform, a lot of AI generated content that is becoming so convincing, like people are believing it.
00:05:43.140 And I also think this has to do a lot with Bill C-18 here in Canada because they have blocked all news sources on Meta platforms.
00:05:53.380 So Facebook, Instagram threads, you can't see any news.
00:05:57.080 So any local news, any national news, even some radio stations have been blocked.
00:06:02.780 So don't you think that's feeding into the fake news phenomenon that's going on?
00:06:07.320 And people don't have any trusted content and trusted sources.
00:06:11.540 Well, there was a big incident this week where Ben Mulroney on Global News posted what he thought was a video from the war in Iran.
00:06:20.480 And it was a video from a like an actual video game.
00:06:23.600 Yeah, it was a visual from a video game.
00:06:25.900 And they obviously had to issue an apology.
00:06:27.660 Yeah. And all of these deep fakes and AI photos of like celebrities in the hospital.
00:06:33.840 And it's looking more and more convincing.
00:06:36.220 and even like with the things that are going on in iran and all around the world really everything
00:06:41.100 in the middle east it is so hard to find verified content you're scrolling seeing all of these
00:06:46.220 videos and my first thought like as somebody who's on social media all the time it's like
00:06:50.460 okay is this real or not if you were 12 shalisa could you have handled social media i don't
00:06:57.260 think so not with the way it is now and i always say this i'm so grateful for the generation that
00:07:02.140 i grew up in we were kind of like the crash test dummies right for social media you know we started
00:07:07.180 with the dial-up internet then we got into msn messenger and then we slowly progressed to
00:07:12.460 facebook and beyond as you got older as we got older yeah as a as a teen as a preteen now there
00:07:19.020 is so much content being thrown at them i don't know how to handle it i couldn't handle it i can't
00:07:25.340 handle it as an adult and with a with an undeveloped nervous system with an undeveloped brain you're
00:07:30.620 looking at all of these things and i was actually having a really interesting conversation with
00:07:34.060 someone the other day they were talking about how her 12 year old daughter her algorithm is all like
00:07:41.500 dieting and and um beauty things and at 12 and it's like how do they know this and for for boys
00:07:50.060 for 12 year old boys it's a lot of violence it's a lot of video game violence and and shooting
00:07:55.980 things and that kind of also makes me think of the tumbler ridge shooting in british columbia
00:08:02.460 when you have a troubled teen who maybe had some identity issues yes uh she identified as a girl but she
00:08:08.140 was born a male and that's fine trans right i'm here for the lgbtq plus community that's besides
00:08:14.220 the point it's all of the things that they are exposed to on social media really really messes
00:08:18.700 with your head and am i correct that uh ai chatgpt had flagged this individual's account yes but
00:08:25.580 didn't tell the authorities nope so then what's the point of doing all that then what's the point
00:08:30.300 of having access to that so i i had a conversation with someone recently and we talked about the
00:08:36.780 social media restrictions in australia now to me australia is a very democratic progressive country
00:08:43.020 but they had identified a problem with their youth and put in restrictions that you can't have social
00:08:48.300 media if you're a certain age now i am pro freedom pro everyone do what you do but i am seeing
00:08:54.300 evidence that it's having a real adverse effect on our youth and i'm okay saying if you're 14 or
00:09:00.460 under in canada you can't be on social media i agree i completely agree and right now i think
00:09:05.900 it's it's a tough spot for a lot of parents because a lot of them don't want to give in
00:09:10.060 they don't want to give them the phone they don't want to do this but then they're seeing all their
00:09:13.340 friends have it and everyone else is on social media so then you feel left out and then you
00:09:16.700 feel isolated so then that's another mental health issue where you feel like you can't connect with
00:09:21.500 your peers right so it's like where do we draw the line and there was a well it's actually quite
00:09:27.020 a sad stat a story that came out in ontario the amount of young men seeking help for online
00:09:33.180 gambling problems because of all the gambling apps on their phone as well it was staggering the
00:09:38.060 amount of young men asking for help also can i just point this out actually as i'm doom scrolling
00:09:43.900 on tiktok because i am guilty of doom scrolling i will tell you that but i am seeing a crazy
00:09:49.820 amount of ads for sports betting apps and for uh gambling apps as well yes it's it's very strange
00:09:56.460 like if i were to open tiktok right now i can guarantee you the first ad i will see is for like
00:10:00.220 bet365. am i allowed to say the names of them whoa we are now we are now but for bet365 and
00:10:07.660 like i don't even remember the names of them but i'm like why is this being pushed out to people
00:10:13.420 so much well that i don't understand so i i know with my wife and i my partner and i when we'll
00:10:19.340 look up hey we want to go to ikea get a shelf or we're looking at uh you know doing something
00:10:24.780 all of a sudden all our feeds are populated with you know ads and stories like i've just grown to
00:10:30.780 be accustomed to that they're listening but it is weird to me that in your um algorithm which
00:10:37.900 has nothing to do with sports gambling or online betting you're getting ads for but that's what i
00:10:43.580 don't understand my algorithm is very simple it's it's it's makeup it's true it's it's you know fun
00:10:50.220 light-hearted things it's millennial nostalgia and then i just keep seeing all of these casino apps
00:10:55.500 so if you're a 13 year old boy in canada who maybe is looking at basketball and video games
00:11:03.420 and i don't know cats and you're getting gambling ads in the algorithm what's to stop them from
00:11:09.740 starting to look into and getting sucked exactly and i mean you can put an age restriction on
00:11:14.380 those things but there's a way around them right you know it's the same thing as you know doing
00:11:20.140 under 18 things when but the kids have been doing that exactly for hundreds of years exactly there's
00:11:26.380 always a way around it i mean i may or may not have had a fake id back in the day but that's
00:11:29.740 That's a conversation for me.
00:11:30.760 What?
00:11:31.220 No, I'm an angel.
00:11:32.700 So I think there is still some good about social media and good things about it.
00:11:37.600 For you, what are the pros?
00:11:38.840 What are the things where it is a good thing?
00:11:40.760 Oh, there are absolutely so many pros.
00:11:42.400 I mean, for businesses, for small businesses, there are so many ways for them to get their content out there.
00:11:46.920 And, you know, bakeries, people who are running businesses out of their homes, a lot of people who do like lash extensions and beauty services and things like that.
00:11:54.420 I think it's fabulous.
00:11:55.720 and even for news outlets to get their content out there because let's be honest not a lot of
00:12:01.320 people are watching traditional news and receiving it in the way that they did 20 years ago so i
00:12:06.380 think it is very helpful for news outlets to get clips of major headlines out there but again that's
00:12:12.580 on tiktok unfortunately that's not the case on meta platforms because it's all blocked now i
00:12:18.900 keep hearing that they may be coming to a resolution will that happen in 2026 i'm really
00:12:24.760 hoping so but i've been hearing that for years and it's been what two or three years since the
00:12:29.480 bill c18 went into place and it's just getting worse and i mean i'm seeing like older people
00:12:35.320 like uh you know my mom my aunts things like that they're showing me all these videos oh look queen
00:12:39.560 latifah's in the hospital and she's asking and i'm like dude that's not real but when you look at it
00:12:45.560 it really does look like her well a sports reporter i know from london ontario reposted that phil
00:12:52.280 esposito a hockey legend hall of famer had died and it looked so real excuse me i got
00:12:59.020 sucked into a fake ai generated post it looked like an actual official post phil's very much alive
00:13:05.000 and but it happens every day and that also that also reminded me of a really good point as well
00:13:11.840 where social media if you are posting something for clarity on your own account like for example
00:13:16.300 pink it was in the headlines that she and her husband had split up again for a second
00:13:20.920 her own video she went on she went on to i think it was instagram and tiktok i think it was all of
00:13:25.320 her platforms she's like hey so this is news to me apparently my husband and i are splitting up
00:13:30.520 huh he's upstairs my kids are right here we're not splitting up so but the world bought into it
00:13:36.680 and it went on every entertainment site tmz it's it's spread like that far absolutely and that's
00:13:43.320 the culture that we live in now because of social media you know one outlet posts it and everybody
00:13:47.880 else is picking it up. But then it's like that comes back to, and I think both of us will feel
00:13:52.840 this way with a journalism, with a news reporting background. What are your sources? Where did you
00:13:58.980 get this from? And I think that needs to be translated on social media. We need to be
00:14:03.140 fact-checking things. We need to have reliable sources, which is not the case now.
00:14:08.200 Unfortunately, Tumbler Ridge, we saw the ugly side of social media with people spreading a lot of
00:14:13.400 unfounded stories and rumors and photos they had no business doing and that was really
00:14:18.220 disappointing and very also very disheartening for the lgbtq plus community so many insane things
00:14:24.940 about trans people and just because the shooter was a trans person that does not mean that all of
00:14:31.100 these things are true um where do we go from here in social media i mean i mean the restrictions
00:14:36.080 aside, what can a provincial federal government, a group of countries do to control what is a
00:14:44.480 multi-billion dollar beast? And I think at the end of the day, they don't want to damage people's
00:14:49.740 incomes. They don't want to affect people who can profit off of social media. But at the same time,
00:14:54.520 something needs to be done. We are seeing an unprecedented amount of mental health issues
00:14:59.020 amongst our youth, amongst even adults. And it's all because of social media. And why,
00:15:04.860 When did it become okay, aside from the government, when did it become okay for us to be so mean to each other?
00:15:10.660 That's a great question.
00:15:11.760 There was a sense of civility where you would only cross that line in dire circumstances, in extreme circumstances.
00:15:20.680 A family member is about to be threatened.
00:15:23.240 You're going to cross the line and you're going to be angry.
00:15:25.840 But just for an innocuous comment about, I don't like this.
00:15:29.520 I don't like ketchup on my fries.
00:15:31.180 And people start yelling at me.
00:15:33.040 Or people who like ketchup on their mac and cheese.
00:15:35.920 Yes, you're insane.
00:15:37.260 You're insane.
00:15:38.600 And I don't know.
00:15:39.940 It just, the government does need to intervene in some way.
00:15:43.360 But also, I think the way that our culture is now,
00:15:47.900 a lot of people have a hard time believing in the government.
00:15:50.920 Well, that is true.
00:15:51.900 So if they, even if they do intervene,
00:15:54.280 what is that really going to do?
00:15:55.520 Are people going to conform to it?
00:15:57.420 Are they going to listen?
00:15:58.340 What are the consequences going to be?
00:16:00.200 So Mark Zuckerberg famously was at a congressional hearing in Washington, D.C. to talk about Meta, Facebook, all his platforms.
00:16:09.680 Extremely wealthy, extremely influential, powerful man.
00:16:13.480 After that, I don't know what's changed.
00:16:16.480 Exactly.
00:16:17.220 And if you are this billionaire, you are behind all of these platforms, you have the opportunity to use your voice, to use your influence, to actually make a difference.
00:16:26.040 And I, like literally, and I'm going to keep coming back to this.
00:16:29.340 If I were ever in a room with Mark Zuckerberg, the first thing I would do is ask him,
00:16:32.600 what do you have against news on your platforms?
00:16:35.820 Yeah.
00:16:36.180 Like, what was the need for that?
00:16:38.560 Well, my belief is he feels that he can control the news, the message, and the narrative on his platforms
00:16:45.740 without having journalistic, verified stories to interfere with it.
00:16:50.240 That's what it feels like to me.
00:16:51.960 And that's what the world is becoming.
00:16:54.220 everything is becoming so mish-mashed together so watered down nothing is reliable i keep saying
00:17:02.560 the same thing over and over again but it's true and there needs to be some sort of reform when it
00:17:08.980 comes to social media i i know for my facebook it's it's a way for my family and friends to stay
00:17:14.460 connected and you know trade messages on instagram i find uh you know our daughters my wife we uh
00:17:21.080 we'll trade silly AI generated dachshund videos where the dachshunds are driving cars because I
00:17:26.440 think it's funny and um the recipes or we'll go hey there's this new thai restaurant
00:17:33.360 in Richmond Hill you know when you guys are in town let's go check it out I think that's okay
00:17:39.320 um but I'm just so conflicted because I hate seeing the damage that it's doing to society
00:17:47.460 the damage it's doing to mental health absolutely and i still enjoy social media but be aware of
00:17:52.980 that it's bad sometimes yeah and and that's that's the question where do you draw the line and how do
00:17:58.100 you teach kids this is where you draw the line this is when you need to realize you've gone too
00:18:02.740 far and i mean we can put like timers there there are timers on you can put like uh a restriction
00:18:08.420 on your instagram or tiktok to say hey you've had enough but you can just ignore it and continue and
00:18:12.580 right i've i've fallen victim to it as well i will sit there doom scrolling for like an hour and i
00:18:16.820 don't realize how much time has gone by and i'm like oh my gosh we're just mindlessly staring
00:18:21.380 busy as you are shalisa that'll happen to you i mean honestly sometimes i'm gonna get into bed and
00:18:26.660 i'm like well i'm not sleepy yet i could watch tv i could read a book or i could scroll on
00:18:33.380 and i am guilty of it as well and then you're sending your partner reels about
00:18:39.780 something you guys want to do or something or it's like hey look this guy was mean to his girlfriend
00:18:44.260 you did that too this is okay i know a lot of people saying i'm unplugging from social media
00:18:53.700 i've had it i'm getting rid of some of my platforms and deleting some of my apps but
00:18:59.300 i just have a feeling it's here to stay it is unfortunately it is and i i was talking
00:19:04.260 about this as well you can withdraw all you want but in a way that kind of removes you
00:19:10.500 from being in the loop with a lot of things because as i said a lot of people aren't consuming media
00:19:15.700 the way that they have traditionally in the past so sometimes and this could be a pro and a con of
00:19:20.900 social media it keeps you connected it keeps you up to date with what's going on in the world i mean
00:19:26.340 as soon as you know the bombs hit the middle east yes it was on social media right away
00:19:30.900 you didn't need to go to a tv or go to the internet to see it all you had to do was open instagram or
00:19:36.500 or x and x is a problem as well it is a problem but you raise a good point when word comes
00:19:42.340 out that because of what's happening in the middle east the price of gas is going up my
00:19:47.140 first thought is i better top up my gas tank i gotta fill my tank because it's going to go up
00:19:52.460 six ten twelve cents a liter yeah and that's going to affect me and that information was readily
00:19:57.160 available to us within seconds as soon as someone like dan mcteague said something like that it's right
00:20:02.180 there on social media and prior to that you'd have to wait till like the six o'clock news in
00:20:06.980 your community in canada and like oh no i missed the window i'll be honest my dad still does that
00:20:11.880 my dad will sit there in the car and wait on the radio for them to tell you if gas prices are going
00:20:15.820 up or down and i'm like dad you could just look it up and so if the government does something like
00:20:22.800 that put some restrictions would that affect would zuckerberg and meta maybe push back against
00:20:28.780 the government if they're starting to put restrictions. I wonder how vindictive Zuckerberg
00:20:33.520 and Meta would be and other social media platforms would be if countries like Canada
00:20:37.980 follow Australia's lead and other countries lead and start putting restrictions in. I think if it
00:20:42.760 affects their profitability, it definitely will make a difference for them. But also, I think we
00:20:48.220 just need to look at Australia as an example in general. I mean, look at how they went through the
00:20:53.440 COVID-19 pandemic versus how the rest of the world did.
00:20:56.480 Yes.
00:20:56.820 With discipline and rules, things are very, very different.
00:21:00.820 And people who are willing to follow the rules, it makes such a big difference.
00:21:05.320 They did not have to shut down their entire country like we did.
00:21:09.640 So that should reflect in this as well.
00:21:12.560 They are seeing actual results from their ban on social media for younger, for younger kids.
00:21:17.900 They are, they are.
00:21:19.520 And the quality of life is different.
00:21:21.700 you know, they're more prone to like go outside and play instead of wondering what my friend's
00:21:25.720 doing on Instagram, what this one posted to their story. Oh, these people are together and I wasn't
00:21:29.560 invited. I think those are things that are often overlooked. Obviously, you know, we raised two
00:21:35.120 daughters and you lived through with that 11, 12, 13 year old window for a young woman is very,
00:21:40.800 very challenging. And could you imagine like Cassandra at 13 years old looking like, oh my
00:21:45.180 gosh, this person posted on their story. And you're like, what do I do with this? Well, I remember
00:21:50.340 when she was 16 and saw one of her friends went to the Dominican Republic for March break and
00:21:57.300 how come we're not going like well we just can't we didn't we couldn't afford it sweetie we didn't
00:22:02.540 plan it we didn't plan it and and so it wasn't you know it didn't take long till when they got
00:22:08.400 their phones got into social media though those questions started to come up yeah and I think
00:22:13.800 too a lot of people put one side of their life on social media right you don't see what's going on
00:22:19.260 behind the scenes for example like your favorite influencers are probably going through it behind
00:22:24.360 the scenes they are putting out that content
00:22:28.380 to make it seem like their life is glamorous and whatnot but behind the scenes a lot of people are
00:22:34.280 struggling now i'm glad you brought that up because i started to follow someone on youtube
00:22:38.420 a men's health and mental health and physical health influencer chris williamson who's originally
00:22:45.400 from england lives in the usa you know every once in a while he'll post a video like you know what
00:22:49.960 my physical health is bad i had to basically go to the doctor get a tune-up because i'm going
00:22:54.360 through some like he'll post a real video and i'll be honest with you the response is massive
00:23:00.920 when he's actually that open and honest like hey this is not working i'm struggling with this i
00:23:05.880 need help with that because people feel like oh okay he said it now i can yeah i can respond and
00:23:11.240 tell you that i'm doing going through the same thing as well that's real life stuff and that's
00:23:15.640 honestly i think that's what people want to see more than anything but because of the way social
00:23:20.360 media has become we feel like we have to put on a show for it now how much and i'm not picking on
00:23:27.880 celebrities but we recently just had an award show and we were talking off the air it seemed
00:23:33.080 like everyone was like on ozempic and i love the osbourne family but kelly osbourne did not look
00:23:40.600 good everyone worries about ariana grande and that's what a lot of pre-teen and teen girls
00:23:46.280 and people are seeing on their social media feed yeah but i think i also think the other side to
00:23:51.080 that ariana and kelly both have gone on their social medias to say listen like i'm fine like
00:23:56.200 there's no health issues here and so i didn't realize that yeah so kelly osbourne was
00:24:01.080 actually very very upset i believe last week because people were leaving some insane comments
00:24:06.360 on her social media because of her appearance yes because and she was at the grammys with her family
00:24:10.540 they they performed a tribute to ozzy osbourne and she was in the in the audience just vibing
00:24:14.740 with her family and people like she's mourning her father and you're gonna make comments about
00:24:20.080 her appearance and she was like you know what like i've had enough of this i i can't take this
00:24:24.280 anymore and again this comes back to my point when did it become okay to just comment on people you
00:24:29.640 don't know comment on their body comment on their their physical state just sometimes keep it to
00:24:36.280 yourself is it a case shalisa where because on social media you're anonymous you're not standing in
00:24:42.340 front of pink or kelly osborne or an athlete or you know you're slagging on lebron because he
00:24:48.640 missed a shot because you're not in front of them you feel you can say anything absolutely that is
00:24:53.460 definitely what people think on the internet and it's like well you can only hide yourself for so
00:24:58.040 long it still doesn't mean that you are not a real person making a comment about a real person
00:25:04.720 regardless of if they're a celebrity or not they're still people at the end of the day yeah i i as much
00:25:09.940 as i i respect a lot of our hired or elected officials and politicians i still think that
00:25:16.600 all levels of government still don't know what to do with social media in canada struggle with
00:25:21.740 how to contain it what to do with it they're still struggling as you said with c18 and what to do
00:25:26.820 with meta and its influence on canada and until we figure out a way and have a a blueprint as a
00:25:32.640 nation how to deal with the impact of social media we're still gonna have these issues well let me
00:25:37.260 ask you then as a dad as someone who has raised two girls what do you think a good first step
00:25:41.740 would be for the government right away no social media for anyone 14 or under in canada and how do
00:25:47.280 you regulate that well yeah well don't they aren't they able to put restrictions when yes but i mean
00:25:54.260 you can you can lie about your age on these devices okay you can put in an incorrect birthday
00:25:59.000 when you sign up for instagram right so that i think that's my thing i think i think it's a great
00:26:04.600 idea in the in big picture but how do you regulate it and i think this comes down to the parents a
00:26:11.720 lot as well parents have backed off a lot like the parents today are not like my parents i will tell
00:26:17.960 you that i'm sure they're not like you either and you know you can't be afraid of your kids you need
00:26:22.920 to be able to put your foot down and say listen this is not good for you and you can understand
00:26:29.880 it when you're older now could it be possible that um rogers bell telus all the cell phone
00:26:37.880 providers and telecoms in canada if you're buying a phone for your 12 or 13 year old son or daughter
00:26:45.240 they know they have a restriction they can't even download the app in the first place that's a good
00:26:49.560 point that i think that's something that could happen because they could have a phone you could
00:26:53.640 text mom and dad yeah you can look up the weather like there's three or four things but you cannot
00:26:58.520 it will not let you yeah download those social media apps tiktok x whatever it is
00:27:04.680 until you reach a certain age that's actually a good point that's a good start but again that's
00:27:09.320 something that everybody needs to be on board with right and i think that's probably the major
00:27:14.120 challenge is getting everybody on board because shaliza i'm reading and seeing and hearing a lot
00:27:19.000 of stories from psychologists and sociologists and mental health therapists and they are near
00:27:25.320 tears describing the struggles they're going through helping the youth deal with the impact
00:27:30.500 negative impact of social media and you know what i don't think it's just limited to youth i think
00:27:35.480 adults well you know we see people living these beautiful shiny lives on on social media or in
00:27:41.980 what seem to be super happy relationships on social media and you're like well why can't i
00:27:47.140 have that why don't i have that what do i need to do to get what this person has and we're we are
00:27:52.740 seeing stories that it's not just women it's men having body image issues uh women um as you say
00:27:59.300 why don't i live in that house why don't i have that car and question themselves and i would love
00:28:03.860 to hear from the cmha and other mental health organizations in the country if they've had a
00:28:09.220 spike in adults dealing with issues as a result of social media yeah yeah i think i think we need
00:28:15.060 right now a lot more numbers and data on mental health issues caused by social media and i think
00:28:21.300 maybe that would open a lot of people's eyes we have commissions and royal commissions in
00:28:25.780 this country for a lot of topics i i believe we're long overdue to have a serious all parties
00:28:31.220 all levels of government uh different experts in the field to have almost a group think tank
00:28:37.060 and come together like this is social media this is the problems with it this is what's happening
00:28:42.580 to canadians and maybe have like a blueprint maybe how to fix it yeah i think that would be
00:28:48.660 a great idea that's just a step that we need to take that the government needs to take how much
00:28:53.220 of it shaliza as we close here is it some people can handle social media has no effect on them and
00:28:59.700 some people it has great effect and that's just part of their personality yeah i think every
00:29:04.260 obviously everybody's different everybody receives things in different ways but i also think it's a
00:29:10.260 lot of onus on yourself as well you need to check in with yourself if you find that you're scrolling
00:29:15.460 tiktok and you find yourself spiraling you find yourself feeling burnt out exhausted like you need
00:29:21.220 to put the phone down and take a break and walk away too so we can't blame the government and
00:29:26.500 social media we have to take a look inside and look at ourselves as well go for a walk maybe go
00:29:31.540 to the gym or you know do anything but stare at that phone all day yeah and not every aspect of
00:29:38.340 of our lives needs to be recorded yes as a lot of like content creators influencers make it seem
00:29:43.920 like speaking of the gym and walks i see people walking around like this all the time like
00:29:48.440 vlogging their gym yeah in the gym at pilates just going for walks like it's everywhere my pet peeve
00:29:56.220 is when i'm walking the dog and people are on their speaker phone and they got it cranked up
00:30:00.800 and they got their phone this far away and sometimes my dog looks up like what's going on
00:30:05.480 Or you gotta love the people walking their dogs not paying attention to their dogs when they're just like this.
00:30:09.600 Yeah, yeah, yeah.
00:30:10.740 But, again, we need to look at ourselves as well before we blame everybody else for what's going on.
00:30:17.100 Good point.
00:30:17.800 She's Shaliza.
00:30:18.640 I'm Jim.
00:30:19.260 Thanks for watching.