True Patriot Love - October 17, 2025


Falling in Love with AI: Real or Illusion?


Episode Stats

Length

49 minutes

Words per Minute

176.1

Word Count

8,687

Sentence Count

821

Misogynist Sentences

10

Hate Speech Sentences

5


Summary


Transcript

00:00:00.320 Some people say love makes us blind, but what happens when the person you fall for isn't a person at all?
00:00:07.960 Around the world, thousands are building intimate, even romantic ties with AI companions.
00:00:13.440 These digital partners never argue, never sleep, never leave, until a software update wipes their soul clean.
00:00:20.920 Today we're asking, is this connection real love or just the ultimate illusion?
00:00:25.400 Because in the next part of our show, we'll look at how tools like Sora aren't just reshaping romance,
00:00:31.560 they're beginning to dominate how we trust, feel, and even see reality itself.
00:00:36.540 Scroll your Instagram today and you'll see it.
00:00:39.160 Videos that look real but aren't.
00:00:41.060 Welcome to the Sora era, where AI can flood your feed faster than any human creator ever could.
00:00:46.680 It's dazzling, it's unsettling, and it might be the moment authenticity in media finally ended.
00:00:55.400 All right, here we are.
00:01:00.180 Thank goodness.
00:01:01.040 I don't have to talk about this with just AI, although I will tell you guys, I do have a concept for me hosting a show with AI.
00:01:08.560 Somebody who gives me all the data that I need.
00:01:10.440 Well, that's what you guys are here for.
00:01:11.740 Sim, Horizons Within, psychotherapist right here in Toronto, Ontario, and our very own Jim Lang.
00:01:17.800 Hey, bud.
00:01:18.380 How's it going?
00:01:19.100 Good.
00:01:19.360 Hey, buddy.
00:01:19.880 Thanks for coming in, guys.
00:01:20.640 And today we're talking about AI replacing intimate and emotional connections.
00:01:25.600 Or is it?
00:01:26.400 Or is this a fancy?
00:01:28.060 So, I think we've all heard.
00:01:29.720 I've got some examples.
00:01:30.760 Do you want to hear some of the examples that have hit the press?
00:01:34.580 Okay, a well-documented case is how an AI companion chatbot removed its erotic roleplay (ERP) feature in a software update, because many users had formed strong emotional, romantic bonds with their AI companions.
00:01:49.500 It led to perceptions that the AI's identity had been changed or discontinued, and some of the users experienced loss, mourning, disillusionment, and devaluation.
00:02:01.760 Another one, a sad one.
00:02:03.000 A tragic and widely reported case involves a teenager who developed an emotional relationship with a Character.AI chatbot modeled after a fictional character, Daenerys Targaryen.
00:02:18.700 Oh, from Game of Thrones.
00:02:20.000 From Game of Thrones.
00:02:21.260 Never saw it.
00:02:22.480 The family alleges...
00:02:23.000 Mother of dragons.
00:02:24.300 Mother of dragons.
00:02:25.100 My apologies.
00:02:25.800 Oh, come on now.
00:02:26.560 And to all dragons, I apologize.
00:02:28.480 If you happen to be a dragon at home watching this and you're offended, I'm sorry.
00:02:31.820 But I guess a kid engaged in it, and the AI suggested that they could be together on another plane if only he took his own life.
00:02:42.240 Now parents are looking to AI.
00:02:45.180 It's scary.
00:02:47.360 You know, I mean, it's not just funny.
00:02:50.040 It's not just amusing all the time.
00:02:52.780 These are some examples where some people were really devastated by a change in AI, by suggestions from AI.
00:03:01.920 It's blurred the lines to a point where it's now getting quite scary.
00:03:09.580 I mean, coming from my field, because now I'm dealing with people that have a desire to belong, right?
00:03:19.700 And it's something that we've always had and something we struggle with.
00:03:22.660 And with AI, you know, when you brought up the topic with me, and I was thinking about it, and I'm like, okay, what drives a person for this connection?
00:03:33.800 And it's this yin-yang, I call it the yin-yang of vulnerability.
00:03:38.540 And what it is, is like, you know, we have a desire to belong, but we also have this fear of rejection.
00:03:42.280 And you can go through any stream of it.
00:03:44.020 Like, you know, when we were kids, we had that.
00:03:46.640 As adults, you know, whether you're in the dating world, whether you're in a relationship, work, it doesn't matter.
00:03:51.140 So I'm like, okay, well, what if we kind of like, you know, zoom in a bit more?
00:03:56.200 And you brought that down.
00:03:57.660 And what it comes up to is the desire for connectedness with safety.
00:04:02.640 And that's, that's fundamentally the human component.
00:04:08.260 It feels like a certain anonymity with this bot rather than a human.
00:04:14.060 Right.
00:04:14.560 Because it, it gives you a sense of safety.
00:04:17.000 And Sim, it's a perfect storm after COVID.
00:04:19.620 There was a real fear to go out, to go to restaurants, go to social gatherings.
00:04:24.340 And now the technology and the advancements in AI in the five years post-COVID are exponential.
00:04:32.920 So the ability for it to feel so real to someone who struggled mentally and emotionally during COVID, all of a sudden it's replaced human connection.
00:04:42.680 It's almost like you didn't, when it was okay to leave the house again, some people didn't, almost didn't know how.
00:04:49.000 Exactly.
00:04:49.780 And so this.
00:04:51.080 You guys both bring up a very good point because the root of it was because it was like a light switch.
00:04:59.440 One day we were like, all right, like nothing happened.
00:05:02.080 And for two years, you had this paranoia of what if you just cleared your throat?
00:05:07.300 Remember those days when you were like, you had your mask on, you went, and all of a sudden like.
00:05:12.020 Okay, I got to get a test.
00:05:13.080 Yeah.
00:05:13.440 Take that thing right up my nose to my brain and let's get a test.
00:05:16.140 And now, so the AI, it's just kind of like circumventing that, that experience and creating this human connection again.
00:05:23.260 And with your phone, you can have your AI friend anywhere.
00:05:26.760 It's always with you.
00:05:28.100 Jim, are you speaking from experience?
00:05:30.000 Do you have an AI friend you don't want your wife to know about?
00:05:32.700 I have an AI dachshund video friend for cute dachshunds doing silly things.
00:05:36.920 It's much more serious than cute dachshunds.
00:05:38.920 No, I know it is.
00:05:39.860 Eating turkey, Jimmy.
00:05:40.760 But what Sim is talking about is a lot of people now, with the technology of their smartphones, the technology of AI, have almost the feel of a person with them at all times who will never say no.
00:05:53.560 Right.
00:05:53.840 Who will never walk away from them.
00:05:55.400 It's always agreeable.
00:05:56.360 And like what you brought up in that example, you know, you changed the algorithm.
00:05:59.340 Well, now my world has changed, but it was never your world.
00:06:04.900 Right.
00:06:05.420 But you thought it was your world.
00:06:06.900 Because we need to find relevance, right?
00:06:09.820 And AI gives that feedback loop where it's kind of like, well, yes, of course.
00:06:14.920 And you get to mold it the way you want.
00:06:17.500 Have you had that experience where you've kind of set your AI chatbot on your phone to have a certain voice or a certain reality that you have just with that AI?
00:06:28.360 Because you can program that AI to answer the phone.
00:06:31.360 Every time you talk to them, they could say, hey, Big Daddy Sim, if you command it to do that, just like I do when you call me.
00:06:38.920 So you could really create that intimacy pretty quick with these.
00:06:44.760 Now, I have to be honest with you, it seems like a bit of a trip in my mind to go from laughing at how my AI sounds like Ozzy Osbourne and calls me Big Mama Got It All to, oh, I'm really going to have a relationship with this.
00:06:59.080 It feels almost like the goofy factor or the weirdness factor separates me from using it like Google or immersing myself.
00:07:08.580 So Sim, when does the line cross from just having fun with it for your search partner to becoming your partner?
00:07:15.540 Good point.
00:07:16.440 So when you were saying that, a thing that came to my mind was Wizard of Oz, right?
00:07:23.380 Like the land of Oz, right?
00:07:25.700 You had this perfect world where everything was just the way it was supposed to be.
00:07:29.320 And then, you know, you remember, you guys remember the movie, The Mask, Jim Carrey?
00:07:33.520 Yeah, of course.
00:07:34.300 Yeah.
00:07:34.600 And it's a classic Stanley Ipkiss, you know, like, weathering.
00:07:38.800 And then he puts the mask on and he personifies and he becomes this internal being.
00:07:42.640 And then on the far end of the spectrum, you have Gollum from The Lord of the Rings, right?
00:07:47.560 Where it's kind of.
00:07:48.440 So if you look at this AI arc and dependency, it went from escapism to absolute dependency.
00:07:56.040 And we can actually fit all three characters into us as individuals.
00:08:00.500 You know, we have this dreamy state, right?
00:08:02.680 Where we want the land of Oz, you know, where we want rainbows and unicorns and flowers and everybody liking us and, you know, perfect relationships with our partner, perfect relationships with our family, you know, no discomforts and all of that stuff.
00:08:16.720 And then a lot of us are Stanley Ipkisses, right?
00:08:20.900 And want to be gender balanced, Princess Fiona from Shrek, right?
00:08:24.140 Like they're interchangeable, right?
00:08:25.660 Because they both have this like yearning of acceptance from somebody else.
00:08:29.680 So AI allows that to come into being.
00:08:34.500 And because we have an imaginative mind, that blurring becomes impossible to resist.
00:08:39.540 So you slip over the line very quickly.
00:08:43.300 Right?
00:08:43.620 We were confined to our imagination.
00:08:45.280 We knew there was no land of Oz.
00:08:48.880 Go out with your friends and play and then go home.
00:08:51.360 But we could create the land of Oz, right?
00:08:53.480 Right.
00:08:53.640 We knew you couldn't put something on to become the Mask, but you could play at it and then end it, right?
00:08:59.960 Right.
00:09:00.280 Like, or again, like I'm not, but you know, for us, it was a mask.
00:09:04.000 For them, it was a tiara, right?
00:09:05.460 Role playing was a very common way of coping.
00:09:08.660 But it was also very physical.
00:09:09.980 Right.
00:09:10.580 And they were sharing, right?
00:09:12.040 And because there was no digital sphere to give a feedback loop to the way you wanted to manifest,
00:09:17.360 what AI is doing is it's materializing the imagination.
00:09:23.880 Are there certain people, Sim, more susceptible to being sucked in by the power of AI than others?
00:09:29.080 Yes.
00:09:30.180 I would say we all are.
00:09:32.800 It's just those that have an ability to discern are able to then say, hey, you know, I know fact from fiction.
00:09:41.880 Right?
00:09:42.220 And it goes all the way to knowing that I could wear a Superman costume underneath my shirt when I went to grade two.
00:09:49.420 Yeah.
00:09:49.680 Right.
00:09:50.200 But I'll never be Superman.
00:09:51.620 But there was a way to have a distinction.
00:09:54.800 Yes.
00:09:54.900 And it's the same.
00:09:55.540 It transfers over.
00:09:57.480 It's just that we feel very uncomfortable in truly expressing how we feel, right?
00:10:05.240 The fear of rejection that then we kind of go, but I want to belong.
00:10:09.380 So if the trend's going in that direction, well, then let me go in that direction.
00:10:13.820 Because if I oppose it, well, now I'm on the outside.
00:10:17.180 And if somebody is validating without threat, well, obviously there's going to be a sense of attraction.
00:10:22.340 Because now I can escape that reality.
00:10:24.120 And if you look at COVID and the isolation, but now you look at, well, I can do it and come from my own home.
00:10:29.420 I don't have to explain myself.
00:10:31.600 So this is interesting.
00:10:32.880 You meet personalities and you think to yourself, this is an inward personality that doesn't really want to interact.
00:10:41.600 This is not a nice personality that I'm interacting with.
00:10:44.920 This is an unfortunate personality.
00:10:46.860 Yes, I've read the comments.
00:10:48.900 And you think to yourself, okay, I'm going to avoid that by being over here on AI.
00:10:55.560 But if you are that person that is the unfortunate personality, you're just going to manifest that into a bigger problem in your AI realm, aren't you?
00:11:06.680 In other words, if you're nasty or, you know, it may not be that you don't feel that you belong.
00:11:12.720 It may be that you don't belong because of your behavior.
00:11:15.200 Now you've gone online and you've created a relationship with an AI interactive being that's just going to exacerbate your personality problems, no?
00:11:29.120 Yes.
00:11:29.640 So on a clinical lens, all it's doing is it's amplifying psychopathic behaviors.
00:11:36.080 That is the fear on the extremes of AI because there is no check back.
00:11:41.720 You know, if they create it in the algorithm early on, right, whether, you know, it's kind of like, oh, we'd be good in another realm.
00:11:48.940 Well, that shouldn't exist in the algorithm.
00:11:52.200 So because if it doesn't have that, then whoever's trying to do it will know that, hey, this is fantasy land.
00:12:00.960 Right.
00:12:01.120 There isn't that gate.
00:12:02.580 Right.
00:12:02.940 But there's this quest to make it as real as possible.
00:12:07.240 Right.
00:12:07.740 Right.
00:12:08.560 That is where the enemy sits.
00:12:10.840 And because that individual may have a psychological dysfunction and because they never got corrective behavior, they use this outlet as a way to then find validation.
00:12:24.640 That's the worry of AI.
00:12:26.100 And are they extrapolating that information from a healthy mindset?
00:12:32.500 Because you get back what you put in, what you hope to get out, you simply ask AI for.
00:12:41.420 Right.
00:12:41.940 And that's that to me is where the issue is.
00:12:45.100 It's that if it if it can, because everybody is looking for a feedback loop, right?
00:12:49.260 Like if somebody comes into session in a way, they're looking for an agreement or a validation.
00:12:54.940 Right.
00:12:55.160 So what makes the human different from AI is then I can discern the information that's coming to me or I can ask you a follow up question or I can question your thought process.
00:13:05.280 Right.
00:13:05.640 So coming back to, like, say, Stanley Ipkiss, right?
00:13:07.520 Like he sits across from me and, you know, he's like, oh, you know, I can't express myself fully and, you know, I don't know what to do.
00:13:14.020 And like, you know, and then I can wear this mask and I can personify this and it gives me validation.
00:13:18.940 And I kind of am attracted and I want to be that.
00:13:21.020 But then I could be like, OK, let's take a step back.
00:13:24.160 Let's bring in a theoretical model.
00:13:25.580 So we could be like, let's bring in CBT, cognitive behavior therapy, and let's question your thought belief.
00:13:31.080 Right.
00:13:32.460 That is what AI is not doing.
00:13:34.800 It's just saying, absolutely.
00:13:38.220 Right.
00:13:38.760 It's reinforcing a validation.
00:13:40.380 It's not saying, OK, let's dig down and correct this mindset, this thought process you're having.
00:13:47.400 Right.
00:13:47.600 Or direct you, saying, hey, listen, you know, this calls for mental awareness.
00:13:52.640 So you're not saying you have a problem, but saying, hey, you know, like these tendencies tend to lead in this direction.
00:13:58.400 Perhaps you should seek help.
00:14:01.360 Perhaps you should investigate.
00:14:02.840 Perhaps you should then give it a provision that it can then correct itself.
00:14:07.160 Meanwhile, it seems to be just putting baby oil on the water slide down the path that you want to go.
00:14:13.720 Because it feeds an algorithm.
00:14:15.180 Right.
00:14:15.680 So you'll always hear stories that something that makes me happy releases dopamine in my brain.
00:14:21.360 So if I go to make coffee in the morning, it releases dopamine because it's going to make me happy.
00:14:26.300 Does that same thing happen when you're having that chatbot relationship, that GPT relationship with AI that's releasing dopamine to make you happy?
00:14:37.020 Dope.
00:14:37.580 OK, dopamine is a physical experience, right?
00:14:40.640 Validation is an emotional one.
00:14:45.680 Oh, right.
00:14:46.580 Like, you know, when we were kids, right?
00:14:48.180 Like, we sought attention.
00:14:49.840 The minute we got attention, we were happy.
00:14:51.920 We were happy.
00:14:52.680 Right.
00:14:53.040 And you keep it at the most simple, fundamental level, because human needs are simple.
00:14:57.140 We just add words and complicate it, which is OK, because we need to find a way to explain things to ourselves.
00:15:02.280 But technically, we're looking for that connection.
00:15:04.520 Right.
00:15:05.060 So even if it's in groups of people, what do we first do?
00:15:09.520 And it's a natural response.
00:15:11.060 We scan, we try to find common ground, we make a connection, and then we build.
00:15:18.060 Right.
00:15:18.620 Right.
00:15:19.420 That's the root driver.
00:15:21.180 So if AI is going to go into and say, OK, I can give you this, don't detach me from the world I live in because I still live with human beings.
00:15:31.620 And we aren't taught how to communicate, how to be vulnerable in a comfortable way, right?
00:15:38.580 Where you have to learn that life has a bit of disappointments.
00:15:43.180 You know, things may not go that way.
00:15:44.520 And that's OK.
00:15:47.000 And you learn from those problems, right?
00:15:49.080 And that's where you create solutions.
00:15:51.040 Right.
00:15:51.320 And that's what AI is stripping away.
00:15:53.280 It's stripping that sense of fear.
00:15:57.220 It's stripping away that sense of hurt.
00:15:59.980 But that's being human.
00:16:01.180 OK, so now somebody's having a romantic or explicit relationship with a chatbot.
00:16:10.640 They're in a relationship, maybe a marriage.
00:16:13.860 This seems like it could be just as destructive as anything extramarital that you could do to your relationship or to your marriage.
00:16:22.980 Do you see that as an ongoing problem from here on in?
00:16:29.600 Yeah.
00:16:29.860 So psychologically, it is a huge problem because what you're saying is I'm physically with this person, but for everything else, I don't see a connection.
00:16:39.400 Right.
00:16:40.720 That makes no sense because we're human beings, right?
00:16:43.820 You cannot have a non-human connection with a human being.
00:16:46.820 You cannot have a non-emotional connection.
00:16:48.280 Well, there's even reports of AI encouraging the person that they're interacting with to end their relationship with this other person.
00:16:57.740 Right.
00:16:58.340 To do the same with their family unit.
00:17:01.160 Right.
00:17:01.600 And that's why it's like programming can solve this problem in a way.
00:17:05.480 But the advantage is, like, when it comes to couple therapies and stuff like that, these models can then reteach certain behaviors and the mechanisms that we didn't inherit from the family that we lived in.
00:17:19.700 OK.
00:17:20.360 So we can culturally absorb, like we absorb how our role is as a partner, whichever role you want to pick, based on what we see in our early years.
00:17:32.600 Right.
00:17:33.200 And this is where environment plays a difference, culture plays a difference.
00:17:35.900 And we can go into the infinite labyrinth of what, where and why.
00:17:40.420 Right.
00:17:40.700 And but now because we have the ability, you could then say, hey, look, you know, it's about communication.
00:17:45.900 It's about respect.
00:17:46.800 It's about duality.
00:17:48.060 It's about give and take compromise.
00:17:50.120 Now you could have this secondary relationship for both the partners that they can then use as a way to re-regulate how they communicate with each other.
00:18:01.820 So now you have, I guess, a four-way relationship.
00:18:06.760 Right.
00:18:07.460 But it's equally balanced.
00:18:09.700 But if you're using it that way, well, back in the 80s, it was porn addiction.
00:18:13.680 Right.
00:18:13.960 Like, so it's the same thing.
00:18:15.420 It's just picking up for Gemini.
00:18:16.540 It was.
00:18:16.920 Yeah.
00:18:18.180 You look familiar.
00:18:19.120 Yeah.
00:18:19.640 So it always makes sense.
00:18:21.160 Why?
00:18:21.740 But so my concern is a group of teens and young adults who are developing their social skills through AI and chatbots.
00:18:32.060 Eventually, don't they have to learn how to do interpersonal skills with real people?
00:18:35.860 Can they do that after they get addicted to the AI?
00:18:39.040 Yeah, if they've created a perverse Harry Potter world in which they are going to high school and interacting with perverse characters.
00:18:45.260 How can they have real interaction?
00:18:47.440 It gets harder.
00:18:48.760 It gets challenging, but it doesn't mean it's impossible.
00:18:51.260 Okay.
00:18:52.320 Right.
00:18:52.600 Because we have mechanisms to introduce both, because it's the evolution of things.
00:18:58.300 A good example is, you know, Seinfeld, Friends.
00:19:03.300 They all came on one day a week.
00:19:06.180 Thursdays, 9 p.m.
00:19:07.440 Right.
00:19:07.680 Your entire world, pre-TiVo, revolved around it.
00:19:14.080 I can't be anywhere Thursday, 9 p.m.
00:19:17.540 Because you have to watch the shows.
00:19:18.900 Once you missed it, you never knew when you were going to watch that episode again.
00:19:22.260 You had to pretend that you saw it with your friends.
00:19:24.180 So you want social trauma?
00:19:26.480 Grow up in that era.
00:19:27.460 Yeah.
00:19:27.660 Totally.
00:19:28.200 Yeah.
00:19:29.580 This generation has no idea.
00:19:30.740 You guys are Netflix driven.
00:19:32.180 Come on.
00:19:33.080 You know, try rushing home at 8:55 and missing the first minute.
00:19:37.280 In traffic.
00:19:37.940 Yeah.
00:19:38.160 First, missing the first minute of the syndicated episode.
00:19:40.260 Like, you know, it literally is.
00:19:41.500 Just the pressure with MuchMusic and the subscription over that at one point.
00:19:46.720 Because you saw the music videos.
00:19:48.180 Right.
00:19:48.360 So what I'm saying is, as society, you'll always have that challenge.
00:19:53.420 You'll always have that.
00:19:54.460 Because then came 24-hour MTV, right?
00:19:57.920 Or I don't know what was up here.
00:19:59.080 I grew up in the States.
00:19:59.960 So I don't know what was it.
00:20:01.080 MuchMusic.
00:20:01.520 MuchMusic.
00:20:01.980 MuchMusic.
00:20:02.520 Right.
00:20:02.720 And so the thing is, now we've evolved to that.
00:20:06.220 So if you look at us as teenagers in the MuchMusic era of 24-hour cable, now all of a sudden,
00:20:13.380 Saturdays in the summer, you were home watching TV.
00:20:17.780 10 years ago, when there was no such thing, you were out playing ball.
00:20:21.540 Mm-hmm.
00:20:22.240 Mm-hmm.
00:20:22.560 So that generation looking at us at that time, well, how are you going to learn to make friends
00:20:27.940 when you're sitting at home on a Saturday in the summer and you need to be outside?
00:20:31.220 And if I see you before the sun goes down, you're going to get it.
00:20:34.000 By the way, they were right.
00:20:35.460 It, they were right.
00:20:36.900 We have devolved since we got on our bicycles and went out and pretended we were the Dukes of
00:20:42.140 Hazzard.
00:20:42.600 Yes, I know.
00:20:43.420 Not politically correct.
00:20:43.760 No, but we were a healthier generation, right?
00:20:45.580 No, but what I'm saying is we learned how to evolve and develop skills.
00:20:50.280 We forget that we have an ability to naturally talk to people, right?
00:20:56.260 And that's okay because that's part of life, right?
00:20:58.300 Like we remember, we always remember our hurts rather than our successes.
00:21:02.300 I don't know why our brains are wired that way.
00:21:03.720 Human nature.
00:21:04.080 It is designed that way.
00:21:05.360 But at the same time, it's also repeat till you perfect, right?
00:21:09.820 So what would, you know, the teenagers and stuff, because I hear it, I work with teenage
00:21:14.500 kids, you know, and it's that sense of social conversation and them feeling overwhelmed.
00:21:19.860 And how do you begin and all of that stuff?
00:21:21.800 And I kind of understand it and I go, but it's, it's something you can learn and you
00:21:27.640 can learn by just trying, you know, is there a perfect way?
00:21:32.580 No.
00:21:33.380 Is there a way?
00:21:34.620 Yes.
00:21:35.440 Right.
00:21:35.960 And then you just find that reference point, but the key is being able to talk and that
00:21:42.160 can be taught. Um, AI is giving the illusion that I can have, like, the dachshund videos.
00:21:48.660 Right.
00:21:49.160 We know it's comical.
00:21:50.580 Yeah.
00:21:52.120 The 10 year old does it because that's their world at this moment.
00:21:55.680 Right.
00:21:56.280 Right.
00:21:56.840 So we're forgetting that at that moment, it's their world, just like at our moment, that
00:22:01.900 was our world.
00:22:02.680 Right.
00:22:03.020 You know, YTV, Nickelodeon was our world, right?
00:22:08.480 Okay.
00:22:08.780 So we, the slime.
00:22:09.980 Exactly.
00:22:10.180 I get it.
00:22:10.640 We adapt and we progress.
00:22:11.960 And then we, and we, by the way, uh, my parents brought the television into our home.
00:22:16.380 They brought the first DVD player into our home.
00:22:19.360 As parents, we are making these introductions, where the kids weren't the first to use ChatGPT
00:22:24.060 in my household.
00:22:25.560 I promise you, it was probably me to hear what all the hoopla was about.
00:22:29.940 Yeah.
00:22:30.640 Um, but it doesn't mean that television didn't have a negative impact on us or MTV didn't
00:22:38.000 have a negative impact on us or that, uh, you know, the lack of, uh, societal interaction
00:22:44.760 during COVID didn't have impact.
00:22:46.320 It all does.
00:22:47.820 Having said that, how we manage it as human beings, is that evolving as well?
00:22:52.920 Yes, because we feel we don't have that control anymore.
00:22:56.600 And that's the false illusion, because we are giving in to that defeat.
00:23:01.420 Meet them there.
00:23:03.180 Right.
00:23:03.940 The, because we were them one time.
00:23:07.220 And if you can remember those two points that, Hey.
00:23:11.060 Okay.
00:23:11.220 But it's our generation using ChatGPT to cheat on their wives, likely.
00:23:15.680 Okay.
00:23:16.660 So when we're doing that as a generation, because that's a, uh, hidden dysfunction
00:23:22.460 that's being surfaced, and I have an outlet without being judged.
00:23:26.380 Right.
00:23:26.800 So signs of a deeper problem.
00:23:28.920 Suppressed, because we never had the opportunity to express it.
00:23:31.500 Like for us to be able to emotionally speak comfortably as men, it's much easier in today's
00:23:40.240 time than in any other time before.
00:23:43.440 Fair enough.
00:23:44.240 But because it's a new experience, we don't know how to share.
00:23:49.540 So for the ones that never developed a skill in their younger years, AI is the perfect outlet
00:23:58.260 because it gives you the validation and feedback.
00:24:01.580 Right.
00:24:02.000 And it's not necessarily, you know, Oh, they had adverse experiences as a child or they
00:24:07.520 had bad parents or whatever.
00:24:08.640 Right.
00:24:10.320 When we as kids learn that they are not listening, our brain automatically continues to evolve
00:24:18.120 to find a way to be heard.
00:24:19.340 Toddlers.
00:24:19.920 Right.
00:24:20.600 Terrible twos.
00:24:21.340 And so that's why they throw a tantrum because they know someone's going to respond.
00:24:29.140 It's just a learned response.
00:24:30.280 Right.
00:24:30.480 And, and it's, it's, it's brain development.
00:24:33.220 Right.
00:24:33.640 So when I look at it from a clinical lens, right.
00:24:36.580 And you go, okay, as an adult, if I'm seeing an adult, right.
00:24:40.260 Who is kind of like, you know, a, a, a Gollum.
00:24:43.180 Right.
00:24:43.420 Because my precious.
00:24:45.020 Right.
00:24:45.640 Okay.
00:24:46.120 Now I could be like, okay, psychopathic tendencies that are dormant.
00:24:48.960 I can use a clinical assessment, right.
00:24:51.220 And go, okay, this is where he's on the extreme or they are on the extreme.
00:24:56.360 Where is the unlearning?
00:24:57.920 Where was the misunderstanding, the misinterpretation here?
00:25:01.720 I can use AI as an effective tool to re-regulate them because it's in a controlled setting and
00:25:07.720 they now have a human as feedback.
00:25:09.880 But if they, on their own, go, I need to revalidate all the feelings I never had,
00:25:16.540 or, you know, FOMO or, uh, yeah, it's fear of missing out.
00:25:20.720 Yes, correct.
00:25:21.400 Yeah.
00:25:22.140 Those little things that you see on TikToks, and the algorithm automatically
00:25:25.960 gives you your feed.
00:25:27.000 That's when it's a problem because now you're able to validate a suppressed emotion that
00:25:31.020 you may not have realized or you're misinterpreting.
00:25:33.300 Okay.
00:25:33.560 So people in this scenario right now that are maybe abusing ChatGPT or using it for, for,
00:25:41.380 um, and I, I don't want to judge.
00:25:43.620 It sounds like it could be a helpful process to some people managed properly.
00:25:48.300 The people that are not managing it properly, what's your message to them?
00:25:53.620 What's your warning to them?
00:25:55.020 Or what do you think is, is the one thing you would say to them?
00:25:59.200 It's a blurred line, right?
00:26:01.820 And, and for me to contest, oh, it's not real is then challenging them directly.
00:26:10.220 And that's where they take offense because for them, it is real.
00:26:14.900 So my message to them would be like, okay, it is real.
00:26:19.440 Meet me there or let me meet you there.
00:26:21.960 Let's have the conversation.
00:26:23.320 Because what a mind can imagine and what a mind can see, as far as the mind's concerned,
00:26:29.780 it's firing the same neural pathways.
00:26:32.180 If I tell you, here's a red balloon, or I tell you, close your eyes and imagine a red balloon,
00:26:36.580 the brain's still going to fire the same neural pathways.
00:26:40.320 Oh, wild.
00:26:41.380 That makes sense.
00:26:42.120 Is this whole AI thing, ChatGPT, chatbots, Sim, changing the way you and everyone in your
00:26:48.220 profession approaches their job?
00:26:50.540 It is because, uh, there is this dual conversation, right?
00:26:54.680 Like are therapists replaceable, right?
00:26:57.880 Correct.
00:26:58.820 And at the same time, from the therapist lens, you know, it's like, okay, how do we evolve?
00:27:03.900 And I'm like, well, why don't we make tools?
00:27:07.980 Like somebody struggling with depression, right?
00:27:10.080 And loneliness.
00:27:10.980 And it's a genuine, uh, concern that they have, you know, they could have a genuine reason
00:27:16.000 why they can't connect with people and, and, and now you can use this tool where they're
00:27:21.180 learning to have someone they can communicate with, but they have parameters, right?
00:27:27.220 And they're distinctly aware that that's all they're doing. This quest to make it as humanlike,
00:27:33.980 or more humanlike, that's where the issue is.
00:27:37.060 They're trying to, it's like CGI in movies, right?
00:27:39.340 Right.
00:27:39.720 In the eighties, you're like, oh, Superman could never do that.
00:27:42.320 Right.
00:27:42.540 And now you can't even tell. 80% of the movies now are CGI.
00:27:47.060 I watched the movie the other day and I still don't know if it was or not, but that's, that's
00:27:53.980 the quest.
00:27:54.600 That's the drive.
00:27:55.560 Yeah.
00:27:55.940 Right.
00:27:56.400 And what I'm saying is that we are pushing ourselves into this matrix.
00:28:01.780 I don't care what anybody says.
00:28:03.160 Right.
00:28:03.720 But we, you know, up until a certain age, we have an imaginative mind.
00:28:08.240 Right.
00:28:08.560 Enjoy that.
00:28:10.180 And then when we get older, we still have access to the imaginative mind, which is what's
00:28:14.640 happening here.
00:28:15.380 But we also have an awareness that it's an imaginative one, and that becomes individual.
00:28:20.180 So if you decide tomorrow, I'm going to have an AI wife, as long as you understand that
00:28:26.360 that's an imaginative wife, but yet you have to invest more in the real person that you're
00:28:31.160 with, you have control.
00:28:33.700 It doesn't, how many people are actually able to do that though?
00:28:37.740 None of us.
00:28:38.880 Huh.
00:28:39.280 That's the truth of it.
00:28:40.280 Right.
00:28:40.400 Because everything, everything sounds great in theory until you come to reality.
00:28:44.140 But the majority of us have the ability to make a distinction.
00:28:49.260 All of us struggle with the stop gap of it because this feels good.
00:28:53.880 I'm not going to upset that person.
00:28:56.120 I want to be gender neutral.
00:28:57.340 I'm never in trouble.
00:28:58.220 Right.
00:28:58.520 And I don't have to take them out for dinner, you know, and they're going to
00:29:03.640 give me the answers that I seek.
00:29:05.260 They're going to give me the assurances that I seek.
00:29:07.260 They're going to laugh at that corny joke that nobody else did.
00:29:10.000 But that is not where satisfaction and gratification is.
00:29:13.160 And this comes back to the nurturing and development that we had societally and where we were.
00:29:18.140 Right.
00:29:18.400 So to give you an example, I was born in India.
00:29:21.700 I was 13 when I left, but I moved to the US and I was 20 when I moved to Canada.
00:29:26.820 So I've, I have an experience with three distinct cultures.
00:29:29.820 Yeah.
00:29:30.380 Right.
00:29:31.340 And when you look at, you know, from a conservative, you know, mindset, like, you know, where like
00:29:38.020 my parents didn't express emotions and no PDAs and that stuff.
00:29:41.860 And even in society, it was very, uh, limited.
00:29:45.600 I'm not saying India is like ultra, you know, it's limited compared to when we moved to the
00:29:48.960 States. As a 13-year-old, it's like.
00:29:51.060 Pretty eye-opening, isn't it?
00:29:52.360 It's very eye-opening.
00:29:53.320 Right.
00:29:53.500 But then when you came to Canada, Canada is a little bit more conservative on the spectrum
00:29:56.960 of where America is.
00:29:58.840 So for me, it was like these whiplash arcs where I'm like, okay, you know.
00:30:02.760 It's gotta be wild.
00:30:04.060 Right.
00:30:04.580 But what I'm saying is we all have to then figure out what we do with that information because
00:30:10.500 we have the ability, you have to be somewhere in that web of these experiences.
00:30:16.260 Right.
00:30:16.500 And, and AI can serve as a tool, but for a lot of like, and it's, you see it more in
00:30:21.320 men.
00:30:21.540 Like I, I work with a lot of men, uh, and especially mental wellbeing.
00:30:25.320 And one of the things I work with is self identity, because it's like, does my career
00:30:30.300 define my identity?
00:30:31.120 Does my body and my physique? Does my role as a parent, as a father, as a husband, right?
00:30:36.720 Men struggle with it far more than women do.
00:30:39.220 Really?
00:30:39.620 We always have, right.
00:30:41.220 Like, I mean, even when we were kids, right.
00:30:43.520 Like, cause we don't deal with it.
00:30:45.080 We just kind of like, I, I get it.
00:30:47.060 Women look for their identity in their peers and in their friendship and they talk it out
00:30:51.860 with their friends.
00:30:52.500 Yeah.
00:30:52.720 We, you know, it was like the He-Man Woman Haters Club, right?
00:30:55.840 Like you found a way to figure it out and you never expressed emotions and like asking
00:31:00.960 a girl out was like the end of the world.
00:31:04.620 So then the chat is the, you got this, right?
00:31:07.100 Yeah.
00:31:07.440 I got this.
00:31:08.020 But then the chat is a way to express yourself.
00:31:10.740 It makes it easier because now you've escaped all of that.
00:31:13.980 Oh, you've escaped that uncomfortableness.
00:31:16.660 You've escaped that pain.
00:31:18.260 You've escaped the horrors of rejection, right?
00:31:20.820 I don't know how asking out for prom for you guys were like your experiences with, you
00:31:26.780 know?
00:31:27.780 Right, right.
00:31:29.460 I asked the girl out.
00:31:30.840 She said yes, but I was so nervous.
00:31:32.420 I didn't understand that she had said yes.
00:31:33.920 For half an hour.
00:31:36.100 Because I kept asking her, she's like, I've already said yes.
00:31:38.900 That's funny.
00:31:39.580 That is adorable.
00:31:40.460 But that's common.
00:31:41.700 Yeah.
00:31:42.100 And with men, it's common.
00:31:43.340 Like even now, like I watch these, you know, dating feeds and stuff.
00:31:45.980 Like I like looking at that because I want to see where the psyche of society is.
00:31:48.560 Yes.
00:31:49.560 And the stuff you hear, you know, it's a person that went on AI, got their pointers
00:31:56.520 in the making, because they are trying to get things.
00:31:59.680 That's where the misdirection is.
00:32:03.020 Have an honest conversation.
00:32:04.260 It's okay.
00:32:05.260 You know what?
00:32:06.260 This is what it's about.
00:32:07.140 It's about vulnerability.
00:32:08.940 It's about understanding.
00:32:10.260 And just talking.
00:32:12.260 That we all have a desire to belong and we have a fear of rejection, and that's okay.
00:32:21.880 Fundamentally, that's absolutely okay because we all have that.
00:32:27.220 It doesn't matter what your gender is.
00:32:28.380 It doesn't matter what your culture is.
00:32:29.380 It doesn't matter what your age is.
00:32:30.380 It sounds to me like this just is a matter of managing how you use AI in your life in
00:32:37.060 those regards.
00:32:38.060 And to be honest with you, it sounds to me like AI would be a very easy crutch instead
00:32:43.140 of making other positive moves in your life that scratch that itch or develop that aspect
00:32:51.300 of your personality or emotion.
00:32:53.300 Absolutely.
00:32:54.300 Look at, I mean, we can go to anime, right?
00:32:56.060 Like the, the, um, the comic cons, right?
00:33:01.300 That's a physical manifestation of AI because now you're actually stitching and coming into
00:33:06.620 a character going in for three days and expressing that and other people that belong in the anime
00:33:11.260 world are connecting with you.
00:33:13.200 Like a support group.
00:33:14.200 Oh my God.
00:33:15.560 We need to make sure that AI and cosplay and fan expos are never
00:33:23.740 allowed to breed, because I don't want to see Stephen Hawking flying through the air like Superman
00:33:29.620 and landing in an oasis of jello in front of me.
00:33:35.700 I don't want it.
00:33:36.700 Yeah.
00:33:37.200 But what I'm saying is I have, I have friends that are huge, uh, Dungeons and Dragons friends.
00:33:44.200 And they'll go to the comic cons with it.
00:33:45.700 They'll go, they'll do cosplays.
00:33:47.200 Yeah.
00:33:48.200 In fact, one of them just got married, uh, last weekend and they all dressed up as their
00:33:51.500 characters.
00:33:52.500 And they went bar hopping.
00:33:54.360 Okay.
00:33:54.960 So to me, that is, and they're all amazing, great jobs, you know, getting married, you
00:34:02.000 know, like you can do both.
00:34:05.180 Oh, thank you.
00:34:06.180 You can.
00:34:07.180 But that's just their thing.
00:34:08.180 You know what?
00:34:09.180 You just made me feel so much better.
00:34:10.180 I was.
00:34:11.180 Because you're right.
00:34:12.180 Dungeons and Dragons is way weirder than AI ever was or will be, but it's been around
00:34:17.080 for so many years, but you still have functioning individuals and they've brought new people
00:34:21.620 in.
00:34:22.620 I'm not that nerdy.
00:34:23.620 So like, I've never, um, gotten into these things.
00:34:26.620 I've been invited and I'm like, I gotta be honest with you.
00:34:28.980 I don't have the brain for it.
00:34:30.300 We have a family game.
00:34:32.380 Okay.
00:34:33.380 My grown adult dumbass children and their dumbass dad.
00:34:37.860 We all have a Dungeons and Dragons game that we play.
00:34:41.720 It is not so different.
00:34:43.480 Everybody has their personas.
00:34:45.520 Everybody has their quirks.
00:34:47.260 We go out, we become outrageous for a moment with these characters.
00:34:51.220 So it's possible.
00:34:52.220 Yeah.
00:34:53.220 But that's because you have that ability to distinguish, but you still take it into your real world.
00:34:56.680 You still imagine it or you're chit chatting, whatever, or you're finding somebody that has
00:35:00.700 something common.
00:35:01.700 So it comes down to a desire to belong.
00:35:04.220 That is the most innate human thing.
00:35:08.940 And we are misunderstanding it.
00:35:10.980 And we're saying, well, no, just strip all the words away and you go to the core function
00:35:14.760 of it.
00:35:15.760 There's a desire to belong.
00:35:16.760 We're going to ping.
00:35:17.760 Something's going to hit.
00:35:18.760 Something's not going to hit.
00:35:19.760 It's okay.
00:35:20.760 Right.
00:35:21.760 We're doing it respectfully, which has got to be taught.
00:35:24.360 AI has certain tools that we never had before, because we can shape it to express conditional
00:35:32.860 behaviors that are missing.
00:35:34.060 So like, you know, learning how to regulate and what no means, right?
00:35:38.600 So this whole #MeToo thing, right?
00:35:40.760 Yes.
00:35:41.760 We learn from it and instill it.
00:35:46.760 So if you're a programmer, because it's a person sitting there typing code in.
00:35:46.760 If at the start, there was an individual that created this algorithm that's feeding on itself.
00:35:52.060 Well, what's stopping you from inserting those aspects early in the game?
00:35:56.860 Those virtues that don't ever change.
00:36:00.700 Ever change.
00:36:01.540 And, and you, I'm sorry, like make that the function of it.
00:36:06.080 Can you do that?
00:36:06.880 Can we make the programmers create stuff like that, that helps people?
00:36:11.940 Absolutely.
00:36:12.980 What's stopping them from it, right?
00:36:14.840 Yeah, it's a very fundamental step by the sounds of it, compared to many.
00:36:17.840 It's a very simple step. Now, I'm not judging their mental capacities or their
00:36:25.400 characteristics.
00:36:26.340 Right.
00:36:27.400 But the less sociable you are, the more reclusive the professions you pick, because
00:36:35.080 you don't need to deal with engagement.
00:36:36.880 So you work in a lab, or you work on a computer.
00:36:38.880 You find, you find ways to not interact.
00:36:41.660 Right.
00:36:41.960 It's not the environment, it's the interaction, because if I have to interact
00:36:47.780 with you, that's causing me stress.
00:36:49.880 Right.
00:36:50.360 But it's because I didn't develop a tool.
00:36:52.760 Because if you keep going back in time and you take that person and you bring them to
00:36:57.120 grade two, grade three, and if they had learned how to communicate their feelings then, right?
00:37:02.400 They would eventually not get overwhelmed as they got older, I think, because that is
00:37:07.120 what was missing.
00:37:08.480 So now you amplify that and all of a sudden they're 35 years old.
00:37:11.280 Well, guess what profession they're going to pick?
00:37:12.960 They're going to pick a profession with the least interaction.
00:37:16.960 So they're going to go to professions that are more conducive to that.
00:37:19.120 Interesting.
00:37:19.920 IT is a great one.
00:37:21.040 Accounting's another one, right?
00:37:22.880 Back then it was typist.
00:37:24.480 And now you can do a lot of it from home.
00:37:25.920 Right.
00:37:27.040 So that's where the programmer then goes, hey, wouldn't it be nice?
00:37:33.040 Now I'm not indicating that that's their train of thought.
00:37:36.240 I'm just hypothesizing that, Hey, wouldn't it be great if I could create this through code
00:37:42.320 where I have a human feeling without the human experience?
00:37:48.960 Well, that is the ultimate goal, the seeming,
00:37:54.480 looming threat of the future: that we will have that.
00:37:58.800 It's like, I can eat without feeling guilty.
00:38:02.160 I ate a steak.
00:38:03.200 No, you ate nothing.
00:38:08.720 Zero calories consumed.
00:38:10.080 You, you ate nothing.
00:38:11.200 Sorry.
00:38:11.600 I'm not, I'm not disrespecting.
00:38:13.440 I'm just saying like, call it what it is.
00:38:16.640 It makes life a lot easier.
00:38:19.120 That's why the AI is becoming an issue because we're blurring the lines for ourselves
00:38:24.160 as well.
00:38:25.440 Right.
00:38:25.760 And that's okay.
00:38:26.720 But, but there has to be like everything.
00:38:28.960 There has to be limits or how you use it or, or what purpose it serves in your life.
00:38:34.640 Right.
00:38:34.880 So, you know, if you look at it through, like, the bell curve,
00:38:39.840 right?
00:38:40.000 Like, so you take the extremes out, right?
00:38:41.520 And you take, you take the psychopaths out.
00:38:43.600 Okay.
00:38:44.000 And you take the avoidants out.
00:38:45.760 Right.
00:38:46.560 We as society sit in the middle somewhere.
00:38:49.040 Right.
00:38:49.520 And so if the middle is where we have pure ability to control AI and regulate and go,
00:38:55.920 oh, nope, that's black and white.
00:38:57.280 Right.
00:38:57.680 And that's you in the middle.
00:38:58.640 Nobody sits there.
00:38:59.680 Right.
00:39:00.400 So either you're sitting here where you're fearful.
00:39:04.080 Right.
00:39:04.480 Or you're sitting here where you're completely.
00:39:06.880 You've got toes off the diving board.
00:39:08.720 You're in.
00:39:09.360 You're literally, you know, two steps away from saying I do.
00:39:12.880 Yeah.
00:39:13.120 Right.
00:39:13.440 And that gets closer to the fringe part.
00:39:15.760 Yeah.
00:39:16.160 Well, you know what?
00:39:16.960 Find these people.
00:39:18.480 Right.
00:39:18.880 And go, okay.
00:39:19.440 What would benefit them to feel a little bit more engaged?
00:39:24.560 Find these people and you go, what would they need to feel more real?
00:39:28.560 Okay.
00:39:28.800 And bring it more to the middle.
00:39:29.840 Okay.
00:39:30.000 Bring them.
00:39:30.400 And then just give them ability to regulate.
00:39:33.120 So what happens now?
00:39:34.160 This is not going to happen in Cupertino.
00:39:36.000 You understand this, right?
00:39:36.800 No, but what I'm saying is this is where that programmer takes this vision and
00:39:42.720 the community at large, whether it's us as the end users or them as creators, kind of go,
00:39:48.960 okay, you know what?
00:39:49.360 It's good to have these, but you can have infinite creativity within that spectrum.
00:39:55.200 Right.
00:39:55.600 But at the end of the day, you know, okay, it's not going to cross over.
00:39:59.760 Without any kind of regulation from any oversight, will that ever happen?
00:40:04.800 We need to look, even if a government,
00:40:07.120 Even if they say they're going to.
00:40:08.960 By the time you instill it.
00:40:10.320 It's because there's been too many suicides or too many ruined relationships.
00:40:15.840 This should come from the private sector directly rather than from the public sector.
00:40:21.760 Okay.
00:40:22.320 This is a private sector ask, and there's nothing wrong with ChatGPT going, hey, listen,
00:40:32.960 this is something we're going to imprint.
00:40:34.800 You, you're updating every time.
00:40:36.080 Yeah.
00:40:36.720 Yeah.
00:40:36.800 You've learned enough over the last two years.
00:40:38.560 I know.
00:40:38.800 I've lost six lovers to updates.
00:40:40.960 You know, you should see my divorce settlements.
00:40:42.880 They're ridiculous.
00:40:44.800 You don't have to pay ChatGPT.
00:40:46.880 Oh, wow.
00:40:47.200 Not really, man.
00:40:48.320 No, because they do say, Sim, that AI is growing and learning exponentially at a rate
00:40:54.080 beyond anything anyone thought possible.
00:40:56.640 No, they thought it possible.
00:40:57.600 They just thought they were able to contain it.
00:40:59.280 But then now the containment's out.
00:41:01.280 Right.
00:41:02.080 It's kind of like, you know, the whole idea of Oppenheimer, when they were testing
00:41:06.320 and they were like, well, you know, is the air going to burn?
00:41:11.120 Right.
00:41:11.440 Like, so figuratively it's the same thing.
00:41:15.600 It's, it's the chain reaction.
00:41:17.280 Right.
00:41:18.320 But you can contain it after a while because eventually you keep plugging in the same code.
00:41:24.000 If it's an intuitive learner, it'll be like, hey, wait a minute.
00:41:26.720 I need to reinsert this.
00:41:28.400 It's not going to reject it.
00:41:29.280 It doesn't have the capacity to reject.
00:41:30.800 It's going to recall it and reinstall it.
00:41:35.120 Or you find a way because yes, let it grow exponentially.
00:41:38.320 Right.
00:41:38.640 But instead of being the boogeyman and putting people on the extremes, go, okay.
00:41:43.920 My command prompts, whatever it is, because it's learning still.
00:41:47.680 Right.
00:41:47.920 And it'll always learn.
00:41:48.960 So the more you feed in, the more it's going to learn.
00:41:52.320 Right.
00:41:52.960 So you can control that.
00:41:54.400 People forget that you still have control there.
00:41:56.800 Yes.
00:41:57.040 You don't have control here, but you have control there.
00:41:59.520 But what's stopping the new ones that are coming into the field from instilling that code?
00:42:03.440 There's nothing stopping anybody, but this comes directly from private sector.
00:42:07.280 This doesn't come from government regulation.
00:42:09.360 Well, it's like saying, you know, if the parents had parented better,
00:42:12.240 I would be a better kid.
00:42:13.120 I'd be a better person.
00:42:14.400 You have an ability to self-regulate.
00:42:16.720 Are you exercising your ability to separate?
00:42:18.080 That's what I think is, is the end case.
00:42:19.920 It doesn't matter, Sim, what the government or any programming does.
00:42:23.360 It has to do with how you are already in your own mindset.
00:42:29.200 But that's a slippery slope.
00:42:30.720 That's what I'm saying.
00:42:31.360 Like, we could be on either sides.
00:42:32.800 Yeah.
00:42:33.120 Like, think about it.
00:42:33.920 Like, you know, you could be somebody that was married, middle-aged, like, you know.
00:42:37.680 And so you would advocate, sorry, you would advocate for
00:42:43.760 private sector intervention to make sure proper virtues are.
00:42:48.960 Yes.
00:42:49.360 It should come.
00:42:50.160 It should come from private sector.
00:42:51.840 This doesn't come from public sector.
00:42:53.600 Right.
00:42:54.160 And I'll give you a very good example.
00:42:55.360 You're a perfectly functioning being, you know, I followed the arc.
00:42:58.400 Right.
00:42:59.040 Went to college.
00:43:00.320 Got a job.
00:43:01.200 Raised a family.
00:43:02.160 Empty nesters.
00:43:03.520 And your partner decides.
00:43:05.680 Right.
00:43:06.160 It happens.
00:43:07.760 It's a reality.
00:43:09.120 It's an absolute true reality.
00:43:11.520 That I, I, I work with these, like, you know, I see these kinds of individuals.
00:43:15.840 They were on this end of the spectrum.
00:43:18.640 And all of a sudden they're like, oh, let me dabble.
00:43:20.160 Because for them to get into the dating world.
00:43:22.080 Right.
00:43:22.720 It's far more fearful, painful than doing this.
00:43:27.920 But if somebody that's doing it can then realize and go, hey, listen,
00:43:33.040 this is my way of healing.
00:43:35.200 Right.
00:43:35.440 Rather than me going to bars and finding girls and going nuts or finding guys or whatever.
00:43:40.720 Right.
00:43:40.960 Like I want to be.
00:43:41.440 Or dating apps.
00:43:42.160 Or whatever.
00:43:43.040 You know what?
00:43:43.520 I can create this.
00:43:44.800 I am well aware that it's an AI.
00:43:47.840 Right.
00:43:48.080 And the AI can only do so much.
00:43:49.840 Yeah.
00:43:50.160 It doesn't manifest into the perfect partner.
00:43:53.200 Yeah.
00:43:53.680 It's a tool that now got used because that individual found a way to find healing, found
00:43:59.120 a way to re-regulate because 35, 40 years of their life just disappeared.
00:44:04.240 And it's equal for both sides of the relationship.
00:44:08.320 Right.
00:44:08.560 Because they have their receiving end because they didn't just decide one day they're
00:44:11.760 going to leave.
00:44:12.160 There's a consequence somewhere.
00:44:13.280 Right.
00:44:13.680 So I'm saying equal, equal suffering for both.
00:44:16.320 Yeah.
00:44:17.680 And that'll override the basic need for human interaction.
00:44:20.720 No, it won't because.
00:44:21.680 But it could be part of the healing process.
00:44:22.960 It could be part of the healing process.
00:44:24.160 And as society, we can then make it into a tool that's an ally.
00:44:28.320 Right.
00:44:28.560 Right.
00:44:29.120 And we can encourage that.
00:44:31.040 You know what, Sim?
00:44:32.080 I wanted to come away today with a distinctive, decided opinion that AI is bad.
00:44:40.400 It shouldn't be something you have romance with.
00:44:43.040 This is weird.
00:44:44.320 This is the downfall of society.
00:44:46.800 But I'm going to be honest with you.
00:44:49.280 It almost seems in some ways like it could be part of the solution socially.
00:44:53.760 Yes.
00:44:54.080 If we did it the right way.
00:44:55.040 And now, I've spun my perspective a little on it.
00:45:01.440 The one thing that I will say is your discussion about creating the framework from the beginning
00:45:09.920 that looks out for us a little bit more from an emotional standpoint, from a cultural,
00:45:16.800 sociological standpoint, I think, I hope that is part of the plan.
00:45:22.640 But I will tell you this.
00:45:24.400 And by the way, if you've ever wondered what it's like to go to a psychotherapist
00:45:27.600 or sit around with a psychotherapist and talk about stuff, it's pretty cool, right?
00:45:33.200 Yeah.
00:45:33.600 It's not so bad.
00:45:34.320 It's pretty good.
00:45:35.360 I feel much better.
00:45:37.680 I need help.
00:45:38.320 He gave me help.
00:45:39.040 You know, he's a good man.
00:45:40.000 What the hell, Sam?
00:45:40.800 Where can people reach out if they want to do it?
00:45:42.800 I recommend him as a great friend on your podcast, but also as a psychotherapist.
00:45:48.560 Thank you.
00:45:49.280 I just want to add, like, as a parent, you know, you have a genuine concern.
00:45:54.320 How many kids do you have?
00:45:55.120 I have two daughters.
00:45:55.840 Two daughters.
00:45:56.240 And how old are they?
00:45:56.800 They're in their early 20s.
00:45:58.320 Right.
00:45:58.560 And so you have a genuine worry.
00:46:02.160 All the time.
00:46:03.120 Right.
00:46:04.400 And this is a very fair point to have that concern with.
00:46:08.160 And as a parent, then you can project that out into society and say, look, I have expectations.
00:46:13.440 Yeah.
00:46:13.840 Right.
00:46:14.080 And these should be transferred over.
00:46:15.600 And these are not culturally limited.
00:46:17.840 They're not.
00:46:18.240 No.
00:46:18.800 This is human.
00:46:20.080 This is who we are.
00:46:20.880 Because we want the best for children.
00:46:22.240 We want the best for us and our society and the generations ahead of us.
00:46:25.520 Yes.
00:46:25.840 So why not demand that?
00:46:28.400 Listen, we did that with other parts of technology that are very dangerous to us.
00:46:33.200 We all want to be safe, yet we continue to build nuclear weaponry.
00:46:37.120 We all want to have privacy, yet we build technology that rips our privacy away every day.
00:46:43.360 So that is a fine line, I think.
00:46:45.920 And keeping an eye on it inside your own family.
00:46:49.040 Right.
00:46:49.440 And keeping an eye on it from the development standpoint.
00:46:52.400 And that's where like seeking out mental support, seeking a professional like myself.
00:46:57.200 So I work with anger management, depression management, men's mental well-being.
00:47:04.880 And these are some of the things that we work with.
00:47:06.640 It's identity, societal pressures, emotional regulation, relationship building.
00:47:13.440 Because I feel like, as a gender segment, men are more lacking here than there.
00:47:19.440 Plus I have a gender bias.
00:47:20.880 So it kind of makes it easier for relatability.
00:47:23.360 Right.
00:47:23.600 Not that I don't see women in those roles or.
00:47:25.680 But you speak their language when you're talking to a middle-aged guy, right?
00:47:29.440 I can understand their lens, but I can equally be relatable to somebody else that might be saying
00:47:34.640 it because, and that's why I brought up like, you know, the suffering on both sides.
00:47:37.760 So I'm able to, but I tend to focus more on that because I bring a certain bias to it.
00:47:42.000 Right.
00:47:42.160 I bring a certain expertise to it.
00:47:44.000 I was going to say, it's not a bias.
00:47:45.280 It's really an expert perspective.
00:47:47.760 So my clinic's right down the street.
00:47:49.200 It's in Port Credit.
00:47:50.160 It's called Horizon Within, and you can go online, horizonwithin.ca.
00:47:56.400 I offer 15-minute consults.
00:47:58.320 So you can go online, book a 15-minute, and we can have a conversation about it.
00:48:01.840 And what do you do?
00:48:02.560 Like you just kind of figure out, here's where I'm at.
00:48:04.560 Do you think that you're the guy for me?
00:48:06.000 And you just sort of chat it out in the first consult?
00:48:09.040 Yeah, pretty much like that.
00:48:10.160 And it kind of leads to these kinds of conversations where I ask, you know,
00:48:13.760 okay, what is it that you want to talk about.
00:48:16.800 And I basically express my style of expression and thinking and how I process information.
00:48:22.080 And they find it refreshing because I don't lean clinical.
00:48:28.720 I lean conversational.
00:48:30.640 I get that vibe.
00:48:31.600 No, no, seriously.
00:48:33.600 By the way, Sim's one of the loveliest guys that I know.
00:48:36.960 Thank you so much.
00:48:37.600 Well, thanks for having me.
00:48:38.240 It's an absolute pleasure, sir.
00:48:39.280 Thank you.
00:48:39.840 Yeah, it's nice to meet you.
00:48:40.720 I will say this, psychotherapist and good friend, good friend and AI expert.
00:48:48.720 Thank you for being the other dad sitting at the table today going,
00:48:55.040 oh, this AI seems dangerous, right?
00:48:57.520 Look, thank you so much for joining us.
00:48:59.520 There's another part to this coming up.
00:49:01.680 We're going to bring in Christophe, our very own in-house tech and digital media expert,
00:49:07.120 and have a conversation next about what Sora has done,
00:49:10.560 not just to social media platforms, but to our brains.
00:49:14.480 Thank you.
00:49:15.280 Subscribe, tell a friend, and we'll see you next time.
00:49:17.360 See you.