Falling in Love with AI: Real or Illusion?
Episode Stats
Harmful content
- Misogyny: 10 sentences flagged
- Toxicity: 7 sentences flagged
- Hate speech: 5 sentences flagged
Summary
In this episode, Dr. Sim Sim and Dr. Jim Lang discuss the dangers of artificial intelligence replacing romantic and emotional connections, how AI is changing our perception of reality, and the impact it could have on our ability to trust, feel, and sustain relationships.
Transcript
00:00:00.320
Some people say love makes us blind, but what happens when the person you fall for isn't a person at all?
00:00:07.960
Around the world, thousands are building intimate, even romantic ties with AI companions.
00:00:13.440
These digital partners never argue, never sleep, never leave, until a software update wipes their soul clean.
00:00:20.920
Today we're asking, is this connection real love or just the ultimate illusion?
00:00:25.400
Because in the next part of our show, we'll look at how tools like Sora aren't just reshaping romance,
00:00:31.560
they're beginning to dominate how we trust, feel, and even see reality itself.
00:00:41.060
Welcome to the Sora era, where AI can flood your feed faster than any human creator ever could.
00:00:46.680
It's dazzling, it's unsettling, and it might be the moment authenticity in media finally ended.
00:01:01.040
I don't have to talk about this with just AI, although I will tell you guys, I do have a concept for me hosting a show with AI.
00:01:08.560
Somebody who gives me all the data that I need.
00:01:11.740
Sim, of Horizons Within, psychotherapist right here in Toronto, Ontario, and our very own Jim Lang.
00:01:20.640
And today we're talking about AI replacing intimate and emotional connections.
00:01:30.760
Do you want to hear some of the examples that have hit the press?
00:01:34.580
Okay, a well-documented case is how an AI companion chatbot removed its erotic roleplay (ERP) feature in a software update, because many users had formed strong emotional, romantic bonds with their AI companions.
00:01:49.500
It led to perceptions that the AI's identity had been changed or discontinued, and some of the users experienced loss, mourning, disillusionment, and a sense of devaluation.
00:02:03.000
A tragic and widely reported case involves a teenager who developed an emotional relationship with a Character.AI chatbot modeled after a fictional character, Daenerys Targaryen.
00:02:28.480
If you happen to be a dragon at home watching this and you're offended, I'm sorry.
00:02:31.820
But I guess a kid engaged in it, and the AI suggested that they could be together on another plane if only he took his own life.
00:02:52.780
These are some examples where some people were really devastated by a change in AI, by suggestions from AI.
00:03:01.920
It's blurred the lines to a point where it's now getting quite scary.
00:03:09.580
I mean, coming from my field, because now I'm dealing with people that have a desire to belong, right?
00:03:19.700
And it's something that we've always had and something we struggle with.
00:03:22.660
And with AI, you know, when you brought up the topic with me, and I was thinking about it, and I'm like, okay, what drives a person for this connection?
00:03:33.800
And it's this yin-yang, I call it the yin-yang of vulnerability.
00:03:38.540
And what it is, is like, you know, we have a desire to belong, but we also have this fear of rejection.
00:03:44.020
Like, you know, when we were kids, we had that.
00:03:46.640
As adults, you know, whether you're in the dating world, whether you're in a relationship, work, it doesn't matter.
00:03:51.140
So I'm like, okay, well, what if we kind of like, you know, zoom in a bit more?
00:03:57.660
And what it comes up to is the desire for connectedness with safety.
00:04:02.640
And that's, that's fundamentally the human component.
00:04:08.260
There's a certain anonymity with this bot rather than with a human.
00:04:19.620
There was a real fear to go out, to go to restaurants, go to social gatherings.
00:04:24.340
And now the technology and the advancements in AI in the five years post-COVID are exponential.
00:04:32.920
So its ability to feel so real, for someone who struggled mentally and emotionally during COVID, all of a sudden it's replaced human connection.
00:04:42.680
It's almost like, when it was okay to leave the house again, some people almost didn't know how.
00:04:51.080
You guys both bring up a very good point because the root of it was because it was like a light switch.
00:04:59.440
One day we were like, all right, like nothing happened.
00:05:02.080
And for two years, you had this paranoia of what if you just cleared your throat?
00:05:07.300
Remember those days when you were like, you had your mask on, you went, and all of a sudden like.
00:05:13.440
Take that thing right up my nose to my brain and let's get a test.
00:05:16.140
And now, so the AI, it's just kind of like circumventing that experience and creating this human connection again.
00:05:23.260
And with your phone, you can have your AI friend anywhere.
00:05:30.000
Do you have an AI friend you don't want your wife to know about?
00:05:32.700
I have an AI dachshund video friend for cute dachshunds doing silly things.
00:05:40.760
But what Sim is talking about is a lot of people now, with the technology of their smartphones, the technology of AI, have almost the feel of a person with them at all times who will never say no.
00:05:56.360
And like what you brought up in that example, you know, you changed the algorithm.
00:05:59.340
Well, now my world has changed, but it was never your world.
00:06:09.820
And AI gives that feedback loop where it's kind of like, well, yes, of course.
00:06:17.500
Have you had that experience where you've kind of set your AI chatbot on your phone to have a certain voice or a certain reality that you have just with that AI?
00:06:28.360
Because you can program that AI to answer the phone.
00:06:31.360
Every time you talk to them, they could say, hey, Big Daddy Sim, if you command it to do that, just like I do when you call me.
00:06:38.920
So you could really create that intimacy pretty quick with these.
00:06:44.760
Now, I have to be honest with you, it seems like a bit of a trip in my mind to go from laughing at how my AI sounds like Ozzy Osbourne and calls me Big Mama Got It All to, oh, I'm really going to have a relationship with this.
00:06:59.080
It feels almost like the goofy factor or the weirdness factor separates me from using it like Google or immersing myself.
00:07:08.580
So Sim, when does the line cross from just having fun with it for your search partner to becoming your partner?
00:07:16.440
So when you were saying that, a thing that came to my mind was Wizard of Oz, right?
00:07:25.700
You had this perfect world where everything was just the way it was supposed to be.
00:07:29.320
And then, you know, you remember, you guys remember the movie, The Mask, Jim Carrey?
00:07:34.600
And it's a classic Stanley Ipkiss, you know, just weathering life.
00:07:38.800
And then he puts the mask on, and he personifies and becomes this internal being.
00:07:42.640
And then on the far end of the spectrum, you have Gollum from Lord of the Rings, right?
00:07:48.440
So if you look at this AI arc and dependency, it went from escapism to absolute dependency.
00:07:56.040
And we can actually fit all three characters into us as individuals.
00:08:02.680
Where we want the land of Oz, you know, where we want rainbows and unicorns and flowers and everybody liking us and, you know, perfect relationships with our partner, perfect relationships with our family, you know, no discomforts and all of that stuff.
00:08:16.720
And then a lot of us are Stanley Ipkisses, right?
00:08:20.900
And, to be gender balanced, Princess Fiona from Shrek, right?
00:08:25.660
Because they both have this like yearning of acceptance from somebody else.
00:08:34.500
And because we have an imaginative mind, that blurring becomes all too possible.
00:08:48.880
Go out with your friends and play and then go home.
00:08:53.640
We knew you couldn't put something on and become the Mask, but you could play at it and then end it, right?
00:09:00.280
Like, or again, like I'm not, but you know, for us, it was a mask.
00:09:12.040
And because there was no digital sphere to give a feedback loop to the way you wanted to manifest,
00:09:17.360
what AI is doing is it's materializing the imagination.
00:09:23.880
Are there certain people, Sim, more susceptible to being sucked in by the power of AI than others?
00:09:32.800
It's just that those who have an ability to discern are able to then say, hey, you know, I know fact from fiction.
00:09:42.220
And it goes all the way to knowing that I could wear a Superman costume underneath my shirt when I went to grade two.
00:09:57.480
It's just that we feel very uncomfortable in truly expressing how we feel, right?
00:10:05.240
It's the fear of rejection, and then we kind of go, but I want to belong.
00:10:09.380
So if the trend's going in that direction, well, then let me go in that direction.
00:10:13.820
Because if I oppose it, well, now I'm on the outside.
00:10:17.180
And if somebody is validating without threat, well, obviously there's going to be a sense of attraction.
00:10:24.120
And if you look at COVID and the isolation, but now you look at, well, I can do it and come from my own home.
00:10:32.880
You meet personalities and you think to yourself, this is an inward personality that doesn't really want to interact.
00:10:41.600
This is not a nice personality that I'm interacting with.
00:10:48.900
And you think to yourself, okay, I'm going to avoid that by being over here on AI.
00:10:55.560
But if you are that person that is the unfortunate personality, you're just going to manifest that into a bigger problem in your AI realm, aren't you?
00:11:06.680
In other words, if you're nasty or, you know, it may not be that you don't feel that you belong.
00:11:12.720
It may be that you don't belong because of your behavior.
00:11:15.200
Now you've gone online and you've created a relationship with an AI interactive being that's just going to exacerbate your personality problems, no?
00:11:29.640
So on a clinical lens, all it's doing is it's amplifying psychopathic behaviors.
00:11:36.080
That is the fear on the extremes of AI because there is no check back.
00:11:41.720
You know, if they create it in the algorithm early on, right, whether, you know, it's kind of like, oh, we'd be good in another realm.
00:11:52.200
So because if it doesn't have that, then whoever's trying to do it will know that, hey, this is fantasy land.
00:12:02.940
But there's this quest to make it as real as possible.
00:12:10.840
And because that individual may have a psychological dysfunction and because they never got corrective behavior, they use this outlet as a way to then find validation.
00:12:26.100
And are they extrapolating that information from a healthy mindset?
00:12:32.500
Because you get back what you put in, what you hope to get out, you simply ask AI for.
00:12:45.100
It's that if it if it can, because everybody is looking for a feedback loop, right?
00:12:49.260
Like if somebody comes into session in a way, they're looking for an agreement or a validation.
00:12:55.160
So what makes the human different from AI is then I can discern the information that's coming to me or I can ask you a follow up question or I can question your thought process.
00:13:05.640
So coming back to, like, say, Stanley Ipkiss, right?
00:13:07.520
Like he sits across from me and, you know, he's like, oh, you know, I can't express myself fully and, you know, I don't know what to do.
00:13:14.020
And like, you know, and then I can wear this mask and I can personify this and it gives me validation.
00:13:18.940
And I kind of am attracted and I want to be that.
00:13:21.020
But then I could be like, OK, let's take a step back.
00:13:25.580
So we could be like, let's bring in CBT, cognitive behavioral therapy, and let's question your thoughts and beliefs.
00:13:40.380
It's not saying, OK, let's dig down and correct this mindset, this thought process you're having.
00:13:47.600
Or direct you and say, hey, listen, you know, this calls for mental awareness.
00:13:52.640
So you're not saying you have a problem, but saying, hey, you know, like these tendencies tend to lead in this direction.
00:14:02.840
Perhaps you should then give it a provision that it can then correct itself.
00:14:07.160
Meanwhile, it seems to be just putting baby oil on the water slide down the path that you want to go.
00:14:15.680
So you'll always hear stories that something that makes me happy releases dopamine in my brain.
00:14:21.360
So if I go to make coffee in the morning, it releases dopamine because it's going to make me happy.
00:14:26.300
Does that same thing happen when you're having that chatbot relationship, that GPT relationship with AI that's releasing dopamine to make you happy?
00:14:53.040
And you keep it on the most simple, fundamental level because human needs are simple.
00:14:57.140
We just add words and complicate it, which is OK, because we need to find a way to explain things to ourselves.
00:15:02.280
But technically, we're looking for that connection.
00:15:05.060
So even if it's in groups of people, what do we first do?
00:15:11.060
We scan, we try to find common ground, we make a connection, and then we build.
00:15:21.180
So if AI is going to go into and say, OK, I can give you this, don't detach me from the world I live in because I still live with human beings.
00:15:31.620
And we aren't taught how to communicate, how to be vulnerable in a comfortable way, right?
00:15:38.580
Where you have to learn that life has a bit of disappointments.
00:16:01.180
OK, so now somebody's having a romantic or explicit relationship with a chatbot.
00:16:13.860
This seems like it could be just as destructive as anything extramarital that you could do to your relationship or to your marriage.
00:16:22.980
Do you see that as an ongoing problem from here on in?
00:16:29.860
So psychologically, it is a huge problem because what you're saying is I'm physically with this person, but for everything else, I don't see a connection.
00:16:40.720
That makes no sense because we're human beings, right?
00:16:43.820
You cannot have a non-human connection with a human being.
00:16:48.280
Well, there's even reports of AI encouraging the person that they're interacting with to end their relationship with this other person.
00:17:01.600
And that's why it's like programming can solve this problem in a way.
00:17:05.480
But the advantage is, like, when it comes to couples therapy and stuff like that, these models can then reteach certain behaviors and mechanisms that we didn't inherit from the family we grew up in.
00:17:20.360
So we can culturally absorb, like we absorb how our role is as a partner, whichever role you want to pick, based on what we see in our early years.
00:17:33.200
And this is where environment plays a difference, culture plays a difference.
00:17:35.900
And we can go into the infinite labyrinth of what, where and why.
00:17:40.700
And but now because we have the ability, you could then say, hey, look, you know, it's about communication.
00:17:50.120
Now you could have this secondary relationship for both the partners that they can then use as a way to re-regulate how they communicate with each other.
00:18:01.820
So now you have, I guess, a four-way relationship.
00:18:09.700
But if you're using it like, back in the '80s, it was porn addiction.
00:18:21.740
But so my concern is a group of teens and young adults who are developing their social skills through AI and chatbots.
00:18:32.060
Eventually, don't they have to learn how to do interpersonal skills with real people?
00:18:35.860
Can they do that after they get addicted to the AI?
00:18:39.040
Yeah, if they've created a perverse Harry Potter world in which they are going to high school and interacting with perverse characters.
00:18:48.760
It gets challenging, but it doesn't mean it's impossible.
00:18:52.600
Because we have mechanisms to introduce both, because it's the evolution of things.
00:19:18.900
Once you missed it, you never knew when you were going to watch that episode again.
00:19:22.260
You had to pretend that you saw it with your friends.
00:19:33.080
You know, try rushing home at 8:55 and missing the first minute.
00:19:38.160
First, missing the first minute of the syndicated episode.
00:19:41.500
Just the pressure with MuchMusic and the subscription over that at one point.
00:19:48.360
So what I'm saying is, as society, you'll always have that challenge.
00:20:02.720
And so the thing is, now we've evolved to that.
00:20:06.220
So if you look at us as teenagers in the MuchMusic era of 24-hour cable, now all of a sudden,
00:20:13.380
Saturdays in the summer, you were home watching TV.
00:20:17.780
Ten years before that, when there was no such thing, you were out playing ball.
00:20:22.560
So that generation looking at us at that time, well, how are you going to learn to make friends
00:20:27.940
when you're sitting at home on a Saturday in the summer and you need to be outside?
00:20:31.220
And if I see you before the sun goes down, you're going to get it.
00:20:36.900
We have devolved since we got on our bicycles and went out and pretended we were the Dukes of Hazzard.
00:20:45.580
No, but what I'm saying is we learned how to evolve and develop skills.
00:20:50.280
We forget that we have an ability to naturally talk to people, right?
00:20:56.260
And that's okay because that's part of life, right?
00:20:58.300
Like we remember, we always remember our hurts rather than our successes.
00:21:02.300
I don't know why our brains are wired that way.
00:21:05.360
But at the same time, it's also repeat till you perfect, right?
00:21:09.820
So what would, you know, the teenagers and stuff, because I hear it, I work with teenage
00:21:14.500
kids, you know, and it's that sense of social conversation and them feeling overwhelmed.
00:21:21.800
And I kind of understand it and I go, but it's, it's something you can learn and you
00:21:27.640
can learn by just trying, you know, is there a perfect way?
00:21:35.960
And then you just find that reference point, but the key is being able to talk and that
00:21:42.160
can be taught. Um, AI is giving the illusion that I can have, like, the dachshund videos.
00:21:52.120
The 10 year old does it because that's their world at this moment.
00:21:56.840
So we're forgetting that at that moment, it's their world, just like at our moment, that
00:22:03.020
You know, why TV Nickelodeon was our world, right?
00:22:11.960
And then we, and we, by the way, uh, my parents brought the television into our home.
00:22:16.380
They brought the first DVD player into our home.
00:22:19.360
As parents, we are making these introductions, where the kids weren't the first to use ChatGPT.
00:22:25.560
I promise you, it was probably me to hear what all the hoopla was about.
00:22:30.640
Um, but it doesn't mean that television didn't have a negative impact on us or MTV didn't
00:22:38.000
have a negative impact on us or that, uh, you know, the lack of, uh, societal interaction
00:22:47.820
Having said that, how we manage it as human beings, is that evolving as well?
00:22:52.920
Yes, because we feel we don't have that control anymore.
00:22:56.600
And that's the false illusion because we are giving up that defeat.
00:23:07.220
And if you can remember those two points that, Hey.
00:23:11.220
But it's our generation using ChatGPT to cheat on their wives, likely.
00:23:16.660
So when we're doing that generation, because that's a, uh, hidden dysfunction.
00:23:22.460
That's being surfaced and I have an opportunity of outlet without being judged.
00:23:28.920
Suppressed, because we never had the opportunity to express it.
00:23:31.500
Like, for us to be able to emotionally speak comfortably as men, it's much easier in today's world.
00:23:44.240
But because it's a new experience, we don't know how to share.
00:23:49.540
So for the ones that never developed a skill in their younger years, AI is the perfect outlet
00:23:58.260
because it gives you the validation and feedback.
00:24:02.000
And it's not necessarily, you know, Oh, they had adverse experiences as a child or they
00:24:10.320
When we as kids learn that they are not listening, our brain automatically continues to evolve
00:24:21.340
And so that's why they throw a tantrum because they know someone's going to respond.
00:24:33.640
So when I look at it from a clinical lens, right.
00:24:36.580
And you go, okay, as an adult, if I'm seeing an adult, right.
00:24:40.260
Who is kind of like, you know, a Gollum.
00:24:46.120
Now I could be like, okay, psychopathic tendencies that are done.
00:24:51.220
And go, okay, this is where he's on the extreme or they are on the extreme.
00:24:57.920
Where was the misunderstanding, the misinterpretation here?
00:25:01.720
I can use AI as an effective tool to re-regulate them because it's in a controlled setting and
00:25:09.880
But if they, on their own, go, I need to revalidate all the feelings I never had,
00:25:16.540
or, you know, FOMO or, uh, yeah, it's fear of missing out.
00:25:22.140
Those little things that you see on TikToks and it, it automatically algorithms and it
00:25:27.000
That's when it's a problem because now you're able to validate a suppressed emotion that
00:25:31.020
you may not have realized or you're misinterpreting.
00:25:33.560
So people in this scenario right now that are maybe abusing ChatGPT or using it for, for,
00:25:43.620
It sounds like it could be a helpful process to some people managed properly.
00:25:48.300
The people that are not managing it properly, what's your message to them?
00:25:55.020
Or what do you think is, is the one thing you would say to them?
00:26:01.820
And, and for me to contest, oh, it's not real is then challenging them directly.
00:26:10.220
And that's where they take offense because for them, it is real.
00:26:14.900
So my message to them would be like, okay, it is real.
00:26:23.320
Because what a mind can imagine and what a mind can see, as far as the mind's concerned, are the same.
00:26:32.180
If I tell you, here's a red balloon, or I tell you, close your eyes and imagine a red balloon,
00:26:36.580
the brain's still going to fire the same neural pathways.
00:26:42.120
Is this whole AI thing, ChatGPT, chatbots, Sim, changing the way you and everyone in your field works?
00:26:50.540
It is because, uh, there is this dual conversation, right?
00:26:58.820
And at the same time, from the therapist lens, you know, it's like, okay, how do we evolve?
00:27:07.980
Like somebody struggling with depression, right?
00:27:10.980
And it's a genuine, uh, concern that they have, you know, they could have a genuine reason
00:27:16.000
why they can't connect with people and, and, and now you can use this tool where they're
00:27:21.180
learning to have someone they can communicate with, but they have parameters, right?
00:27:27.220
And they're distinctly aware that that's all it is. This quest to make it as human-
00:27:33.980
like, or more human-like, that's where the issue is.
00:27:37.060
They're trying to, it's like CGI in movies, right?
00:27:39.720
In the eighties, you're like, oh, Superman could never do that.
00:27:42.540
And now you can't even tell. 80% of the movies now are CGI.
00:27:47.060
I watched the movie the other day and I still don't know if it was or not, but that's, that's
00:27:56.400
And what I'm saying is that we are pushing ourselves into this matrix.
00:28:03.720
But we, you know, up until a certain age, we have an imaginative mind.
00:28:10.180
And then when we get older, we still have access to the imaginative mind, which is what's
00:28:15.380
But we also have an awareness that it's an imaginative one, and that becomes individual.
00:28:20.180
So if you decide tomorrow, I'm going to have an AI wife, as long as you understand that
00:28:26.360
that's an imaginative wife, but yet you have to invest more in the real person that you're with.
00:28:33.700
How many people are actually able to do that, though?
00:28:40.400
Because everything, everything sounds great in theory until you come to reality.
00:28:44.140
But the majority of us have the ability to make a distinction.
00:28:49.260
All of us struggle with the stop gap of it because this feels good.
00:28:58.520
And I don't have to take them out for dinner, you know, and they're going to
00:29:05.260
They're going to give me the assurances that I seek.
00:29:07.260
They're going to laugh at that corny joke that nobody else did.
00:29:10.000
But that is not where satisfaction and gratification is.
00:29:13.160
And this comes back to the nurturing and development that we had societally and where we were.
00:29:18.400
So to give you an example, I was born in India.
00:29:21.700
I was 13 when I left, but I moved to the US and I was 20 when I moved to Canada.
00:29:26.820
So I've, I have an experience with three distinct cultures.
00:29:31.340
And when you look at, you know, from a conservative, you know, mindset, like, you know, where like
00:29:38.020
my parents didn't express emotions and no PDAs and that stuff.
00:29:45.600
I'm not saying India is like ultra, you know, it's limited compared to when we moved to the
00:29:53.500
But then when you came to Canada, Canada is a little bit more conservative on the spectrum
00:29:58.840
So for me, it was like these whiplash arcs where I'm like, okay, you know.
00:30:04.580
But what I'm saying is we all have to then figure out what we do with that information because
00:30:10.500
we have the ability, you have to be somewhere in that web of these experiences.
00:30:16.500
And, and AI can serve as a tool, but for a lot of like, and it's, you see it more in
00:30:21.540
Like, I work with a lot of men, uh, especially on mental wellbeing.
00:30:25.320
And one of the things I work with is self identity, because it's like, does my career
00:30:31.120
Does my body and my physique, does my role as a parent, as a father, as a husband, right.
00:30:36.720
Men struggle with it far more than women do.
00:30:47.060
Women look for their identity in their peers and in their friendship and they talk it out
00:30:52.720
We, you know, it was like the He-Man Woman Haters Club, right?
00:30:55.840
Like you found a way to figure it out and you never expressed emotions and like asking
00:31:08.020
But then the chat is a way to express yourself.
00:31:10.740
It makes it easier because now you've escaped all of that.
00:31:18.260
You've escaped the horrors of rejection, right?
00:31:20.820
I don't know what asking someone out for prom was like for you guys, your experiences with it, you know.
00:31:36.100
Because I kept asking her, she's like, I've already said yes.
00:31:43.340
Like even now, like I watch these, you know, dating feeds and stuff.
00:31:45.980
Like I like looking at that because I want to see where the psyche of society is.
00:31:49.560
And the stuff you hear, you know, but it's a person that went on AI, got the pointers
00:31:56.520
in the making, because they are trying to get things.
00:32:12.260
That we have a desire to belong and we have a fear of rejection, and that's okay.
00:32:21.880
Fundamentally, that's absolutely okay because we all have that.
00:32:30.380
It sounds to me like this just is a matter of managing how you use AI in your life in
00:32:38.060
And to be honest with you, it sounds to me like AI would be a very easy crutch instead
00:32:43.140
of making other positive moves in your life that scratch that itch or develop that aspect
00:33:01.300
That's a physical manifestation of AI because now you're actually stitching and coming into
00:33:06.620
a character going in for three days and expressing that and other people that belong in the anime
00:33:15.560
We need to make sure that AI and, and, uh, cosplay and, and, uh, uh, fan expos are never
00:33:23.740
allowed to breed because I don't want to see Stephen Hawking flying through the air like Superman
00:33:29.620
and landing in, uh, an oasis of Jell-O in front of me.
00:33:37.200
But what I'm saying is I have friends that are huge, uh, Dungeons and Dragons fans.
00:33:48.200
In fact, one of them just got married, uh, last weekend, and they all dressed up as their characters.
00:33:54.960
So to me, that is, and they're all amazing, great jobs, you know, getting married, you
00:34:12.180
Dungeons and Dragons is way weirder than AI ever was or will be, but it's been around
00:34:17.080
for so many years, but you still have functioning individuals and they've brought new people
00:34:23.620
So like, I've never, um, gotten into these things.
00:34:26.620
I've been invited and I'm like, I gotta be honest with you.
00:34:33.380
My grown adult dumbass children and their dumbass dad.
00:34:37.860
We all have a game that we, a Dungeons and Dragons game that we play.
00:34:47.260
We go out, we become outrageous for a moment with these characters.
00:34:53.220
But because you have that ability to distinguish, you'd still take it into your real world.
00:34:56.680
You still imagine it or you're chit chatting, whatever, or you're finding somebody that has
00:35:10.980
And we're saying, well, no, just strip all the words away and you go to the core function
00:35:21.760
We're doing it respectfully, which has got to be taught.
00:35:24.360
AI has certain tools that we never had before, because we can shape it to express conditional
00:35:34.060
So like, you know, learning how to regulate and what no means, right?
00:35:42.760
So if you're a programmer, because it's a person sitting there and typing code in.
00:35:46.760
If at the start, there was an individual that created this algorithm that's feeding on itself.
00:35:52.060
Well, what's stopping you from inserting those aspects early in the game?
00:36:01.540
And, and you, I'm sorry, like make that the function of it.
00:36:06.880
Can we make the programmers create stuff like that, that helps people?
00:36:14.840
Yeah, it's a very fundamental step by the sounds of it, compared to many.
00:36:17.840
It's a very simple step, but now I'm not, uh, judging their, uh, mental capacities or their
00:36:27.400
But the more, uh, the less sociable you are, the more reclusive professions you pick because
00:36:36.880
So you work in a lab, or you work on a computer.
00:36:41.960
No, it's not; it's interaction versus the environment, because if I have to interact
00:36:52.760
Because if you keep going back in time and you take that person and you bring them to
00:36:57.120
grade two, grade three, and if they had learned how to communicate their feelings then, right?
00:37:02.400
They would eventually not get overwhelmed as they got older and to think because that is
00:37:08.480
So now you amplify that and all of a sudden they're 35 years old.
00:37:11.280
Well, guess what profession they're going to pick?
00:37:12.960
They're going to pick a profession with the least interaction.
00:37:16.960
So they're going to go to professions that are more conducive to that.
00:37:27.040
So, so that's where the programmer then goes, Hey, wouldn't it, wouldn't it be nice?
00:37:33.040
Now I'm not indicating that that's their train of thought.
00:37:36.240
I'm just hypothesizing that, Hey, wouldn't it be great if I could create this through code
00:37:42.320
where I have a human feeling without the human experience?
00:37:48.960
Well, that is the ultimate goal, the seeming, uh,
00:37:54.480
looming threat of the future: that we will have that.
00:37:58.800
It's like, I can eat for me without feeling guilty.
00:38:19.120
That's why the AI is becoming an issue because we're blurring the lines for ourselves
00:38:28.960
There has to be limits or how you use it or, or what purpose it serves in your life.
00:38:34.880
So, you know, if you look at it through, like, the bell curve,
00:38:49.520
And so if the middle is where we have pure ability to control AI and regulate and go,
00:39:00.400
So either you're sitting here where you're fearful.
00:39:04.480
Or you're sitting here where you're completely.
00:39:09.360
You're literally, you know, two steps away from saying "I do."
00:39:19.440
What would benefit them to feel a little bit more engaged?
00:39:24.560
Find these people and you go, what would they need to feel more real?
00:39:36.800
No, but what I'm saying is, this is where that programmer takes this vision, and
00:39:42.720
the community at large, whether it's us as the end users or them as creators, kind of goes,
00:39:49.360
It's good to have these, but you can have infinite creativity within that spectrum.
00:39:55.600
But at the end of the day, you know, okay, it's not going to border over.
00:39:59.760
Without any kind of regulation from any oversight, will that ever happen?
00:40:10.320
It's because there's been too many suicides or too many ruined relationships.
00:40:15.840
This should come from the private sector directly rather than from the public sector.
00:40:22.320
This is a private sector ask, and there's nothing wrong with ChatGPT going, hey, listen,
00:40:40.960
You know, you should see my divorce settlements.
00:40:48.320
No, because they do say, Sim, that AI is growing and learning exponentially at a rate
00:40:57.600
They just thought they were able to contain it.
00:41:02.080
It's kind of like, you know, the whole idea of Oppenheimer, when they were testing
00:41:06.320
and they were like, well, you know, is the air going to burn?
00:41:18.320
But you can contain it after a while because eventually you keep plugging in the same code.
00:41:24.000
If it's an intuitive learner, it'll be like, hey, wait a minute.
00:41:30.800
It's not going to be like that; it's going to recall it and reinstall it.
00:41:35.120
Or you find a way because yes, let it grow exponentially.
00:41:38.640
But instead of being the boogeyman and putting people on the extremes, go, okay.
00:41:43.920
My command prompts, whatever it is, because it's learning still.
00:41:48.960
So the more you feed in, the more it's going to learn.
00:41:54.400
People forget that you still have control there.
00:41:57.040
You don't have control here, but you have control there.
00:41:59.520
But what's stopping the new ones that are coming into the field from instilling that code?
00:42:03.440
There's nothing stopping anybody, but this comes directly from private sector.
00:42:09.360
Well, it's like, you know, the parents parented better.
00:42:19.920
It doesn't matter, Sim, what the government or any programming does.
00:42:23.360
It has to do with how you are already in your own mindset.
00:42:33.920
Like, you know, you could be somebody that was married, middle aged, like, you know.
00:42:37.680
And so you would advocate for
00:42:43.760
private sector intervention to make sure proper virtues are.
00:42:55.360
You're a perfectly functioning being, you know, I followed the arc.
00:43:11.520
I work with these, like, you know, I see these kinds of individuals.
00:43:15.840
They are now going to be... they were on this end of the spectrum.
00:43:18.640
And all of a sudden they're like, oh, let me dabble.
00:43:22.720
It's far more fearful, painful than doing this.
00:43:27.920
But if somebody that's doing and it's existent and they can then realize, go, hey, listen,
00:43:35.440
Rather than me going to bars and finding girls and going nuts or finding guys or whatever.
00:43:53.680
It's a tool that now got used because that individual found a way to find healing, found
00:43:59.120
a way to re-regulate because 35, 40 years of their life just disappeared.
00:44:04.240
And it's equal for both sides of the relationship.
00:44:08.560
Because they have their receiving end because they didn't just decide one day they're
00:44:17.680
And that'll override the basic need for human interaction.
00:44:24.160
And as society, we can then make it into a tool that's an ally.
00:44:32.080
I wanted to come away today with a distinct, decided opinion that AI is bad.
00:44:49.280
It almost seems in some ways like it could be part of the solution socially.
00:44:55.040
And now, I've shifted my perspective on it a little.
00:45:01.440
The one thing that I will say is your discussion about creating the framework from the beginning
00:45:09.920
that looks out for us a little bit more from an emotional standpoint, from a cultural,
00:45:16.800
sociological standpoint, I think, I hope that is part of the plan.
00:45:24.400
And by the way, if you've ever wondered what it's like to go to a psychotherapist
00:45:27.600
or sit around with a psychotherapist and talk about stuff, it's pretty cool, right?
00:45:40.800
Where can people reach out if they want to do it?
00:45:42.800
I recommend him as a great friend on your podcast, but also as a psychotherapist.
00:45:49.280
I just want to add, like, as a parent, you know, you have a genuine concern.
00:46:04.400
And this is a very fair point to have that concern with.
00:46:08.160
And as a parent, then you can project that out into society and say, look, I have expectations.
00:46:22.240
We want the best for us and our society and the generations ahead of us.
00:46:28.400
Listen, we did that with other parts of technology that are very dangerous to us.
00:46:33.200
We all want to be safe, yet we continue to build nuclear weaponry.
00:46:37.120
We all want to have privacy, yet we build technology that rips our privacy away every day.
00:46:45.920
And keeping an eye on it inside your own family.
00:46:49.440
And keeping an eye on it from the development standpoint.
00:46:52.400
And that's where, like, seeking out mental support, seeking a professional like myself, comes in.
00:46:57.200
So I work with anger management, depression management, men's mental well-being.
00:47:04.880
And these are some of the things that we work with.
00:47:06.640
It's identity, societal pressures, emotional regulation, relationship building.
00:47:13.440
Because I feel like, as a gender segment, it's more lacking here than there.
00:47:20.880
So it kind of makes it easier for relatability.
00:47:23.600
Not that I don't see women in those roles or.
00:47:25.680
But you speak their language when you're talking to a middle-aged guy, right?
00:47:29.440
I can understand their lens, but I can equally be relatable to somebody else that might be saying
00:47:34.640
it because, and that's why I brought up like, you know, the suffering on both sides.
00:47:37.760
So I'm able to, but I tend to focus more on that because I bring a certain bias to it.
00:47:50.160
It's called Horizon Within, and you can go online, horizonwithin.ca.
00:47:58.320
So you can go online, book a 15-minute consult, and we can have a conversation about it.
00:48:02.560
Like you just kind of figure out, here's where I'm at.
00:48:06.000
And you just sort of chat it out in the first consult?
00:48:10.160
And it kind of leads to these kinds of conversations where I like, you know,
00:48:16.800
And I basically express my style of expression and thinking and how I process information.
00:48:22.080
And they find it refreshing because I don't lean clinical.
00:48:33.600
By the way, Sim's one of the loveliest guys that I know.
00:48:40.720
I will say this, psychotherapist and good friend, good friend and AI expert.
00:48:48.720
Thank you for being the other dad sitting at the table today going,
00:49:01.680
We're going to bring in Christophe, our very own in-house tech and digital media expert,
00:49:07.120
and have a conversation next about what Sora has done,
00:49:10.560
not just to social media platforms, but to our brains.
00:49:15.280
Subscribe, tell a friend, and we'll see you next time.