Episode 3093 - The Scott Adams School 02/12/26
Episode Stats
Length
1 hour and 12 minutes
Words per Minute
160
Summary
On this episode of Scott Adams Locals, we're joined by returning guest Brian Ramelli and special guest Owen Gregorian to talk about how to deal with things that don't always go your way. Plus, we have a simultaneous sip.
Transcript
00:00:00.160
With the RBC Avion Visa, you can book any airline, any flight, any time.
00:00:15.980
Switch and get up to 55,000 Avion points that never expire.
00:00:30.000
Stuck in that winter slump? Try Dove Men+Care Aluminum Free Deodorant.
00:00:36.020
All it takes is a small change to your routine to lift your mood.
00:00:39.380
And it can be as simple as starting your day with the mood-boosting scents of Dove Men+Care Aluminum Free Deodorant.
00:00:44.940
It'll keep you feeling fresh for up to 72 hours.
00:01:33.960
We have Sergio right there and Marcella, our beautiful Marcella.
00:01:45.340
You guys, I just want to remind you as always, this is the Scott Adams School, which is completely
00:01:51.160
different than Coffee with Scott Adams, which lives on its own in thousands of hours of content.
00:02:00.120
So please know that Coffee with Scott Adams, you can go back and watch as many streams as you like.
00:02:07.020
And we encourage you to subscribe to Scott Adams Locals channel, where we have some exclusive content.
00:02:15.220
You're going to be very excited about some of them.
00:02:21.760
The guest next to me is in one of them, in an in-depth conversation about something very important.
00:02:27.120
So before we get to that, we're going to have a simultaneous sip.
00:02:32.480
And I just want to say it's a little throwback to when things just don't always go right with
00:02:48.740
I did not realize that you wanted me to do that.
00:03:56.920
Brian, did you resist doing the simultaneous sip initially?
00:04:09.960
A number of people have asked me about that, but I finally got it, I think, maybe a couple
00:04:15.780
of weeks in because I wasn't sure where he was going with it.
00:04:22.580
Well, he kind of introduced it, at least eventually, as like, this is my way of hypnotizing you.
00:04:30.960
I just, I had that natural response, I think, that probably a lot of people had that were
00:04:35.780
But I think there are probably at least a few people that just never did it.
00:04:43.160
It's neuro-linguistic programming, and I detected it pretty early on.
00:04:53.560
You guys, this is the perfect sip for what's happening right now.
00:04:58.920
I'll hit play, you guys hit mute, and we're going to play Scott.
00:05:27.040
When I tapped my papers, did the picture just disappear?
00:05:57.680
I mean, we're into some statistically impossible situation here.
00:06:01.780
It can't be that all the apps died at the same time.
00:06:08.220
I know there's a massive incompetence problem, but that's pretty impressive.
00:06:14.740
If you'd like to take your experience today up to levels that nobody can even understand
00:06:32.960
Because the starting assumption is that things went right.
00:06:40.560
Let's just do the, let's just do the side of the video soon.
00:06:48.860
Shall we surrender to the fact that everything's just going to go wrong today?
00:06:53.980
Just absolutely everything's going to go wrong.
00:07:06.900
How perfect was that for what happened just before?
00:07:10.560
I thought, Oh my gosh, that's a little wink and a nod right there.
00:07:14.160
But we did love, we did love when things went wrong because it was just always funny.
00:07:21.320
So you guys, um, I did send a link out on X yesterday, um, for Brian's series.
00:07:27.940
And a lot of you said that you have been reading it and enjoying it.
00:07:33.040
Some of us are frightened and scared and there is a whole special section on Scott also, and
00:07:42.520
And I was just looking, cause I'm going to botch the name.
00:07:45.340
It's 5,000 Days, help me, Brian, to the End of Work as We Know It.
00:07:56.220
So with that, we welcome back Brian and you guys, Sergio is going to be watching your
00:08:01.980
comments on YouTube and Marcella is going to be watching locals to see if you have any
00:08:06.780
questions for Brian as we talk, but you guys really loved him last time and everything
00:08:12.540
So we said, you have to come back again and again and again.
00:08:15.740
So Owen, I'm going to let you start because I know you had something ready.
00:08:21.700
Well, my first question I think relates to some of the earlier posts in the series where
00:08:26.740
you're saying essentially a lot of what people need to do is work through grief and go through
00:08:37.460
And it was all focused on emotional recovery, essentially like processing trauma and things
00:08:44.140
And that surprised me just cause I guess as a technologist, I kind of expected, okay, here's
00:08:52.640
And instead you're kind of taking this left turn into, you can't really do anything until
00:09:00.180
Can you talk a little bit about why you think that's so important?
00:09:13.400
So I don't believe that we can get through the next 5,000 days.
00:09:21.740
And this is when the old king dies and a new king needs to be crowned.
00:09:25.760
And it's the only word I can find in this middle space where the end point is going to
00:09:33.680
It's not going to be utopia, but it's going to be abundance.
00:09:37.460
And I'll just briefly touch upon it economically and philosophically, and then I'll get into
00:09:46.380
Philosophically, what's going to happen is there's going to be scaling of AI and robotics
00:09:53.380
to such a level where AI is building AI and robots are building robots.
00:09:58.360
Now, let's get away from the Terminator and negative side of it.
00:10:05.680
The center line is everything's going to become inordinately less expensive, even the robot.
00:10:12.000
At some point, and I know the initial reaction, mine, everybody, is going to be, I'm never
00:10:25.360
This is why we're seeing the wheels come off the cart.
00:10:29.140
There's a lot of reasons why the world is the way it is.
00:10:32.100
And people in power are concerned because the control mechanisms are no longer there.
00:10:40.160
We lived through a world of scarcity, but we had a world of abundance at one time.
00:10:47.980
And what I really mean by that is if you were hungry, you went to a tree and you pulled an apple.
00:11:05.200
And it's very much if you go biblically and you go to the Garden of Eden, the story there
00:11:12.200
And the knowledge is what got you out of that abundance.
00:11:20.720
But where we are heading is the things we thought that were really expensive are going
00:11:31.140
Because as AI continues to build more AI, it's going to solve many more of the problems or
00:11:41.660
And either we didn't know it or it wasn't widely available for all of us to know it.
00:11:46.140
And that's going to come whether or not people in power want that to happen, because this
00:11:53.020
is a democratization of cognitive intelligence.
00:11:56.760
And it's a democratization of robotics and labor.
00:12:03.520
No, we barely have a bipedal humanoid robot, but it will happen.
00:12:09.820
And so a lot of people ask me, how do I know the future?
00:12:14.360
You find the point in the future and you work your way backwards.
00:12:19.900
So you're never looking in the rearview mirror.
00:12:22.300
You're looking forward and then you're coming back.
00:12:26.000
It's the middle point that is always going to be random.
00:12:31.780
And a lot of people like me were raised on dystopian movies.
00:12:38.260
And so we always tend to look at the car crash.
00:12:41.180
We slow down and say, oh, that's built into our programming.
00:12:46.280
And Scott was really, you know, you read his books, very much into why we do that.
00:12:54.980
When it comes down to our fear, our fear is the robot's going to take over.
00:13:02.600
Yeah, variations of that are going to happen.
00:13:06.340
But the end result is there will be an equilibrium.
00:13:14.160
Psychopaths and sociopaths are always going to exist.
00:13:22.900
But the beautiful thing about where we're going is we are dethroning them by the democratization of intelligence.
00:13:33.940
And if they're using it as a hammer to bang people over the head, fine.
00:13:37.220
If they're using the fire to burn people, fine.
00:13:39.860
We, the majority, are going to use it to build a house, to build a structure.
00:13:45.020
We use the hammer as a tool to let us not have to suffer the consequences of what happens
00:13:53.920
You know, we get a little cold and we want to be in a home.
00:14:03.000
Now, there's a lot more to it than just a hammer.
00:14:05.680
But again, anything can be formed into a weapon.
00:14:08.980
Do not fall for the reframe that this is something that's going to get you because your fear is
00:14:21.280
Because from zero to eight, most of us have experienced some form of trauma.
00:14:33.040
It defines the trajectory of our life, whether we like it or not.
00:14:37.520
And when you are going through trauma, you have to deal with the trauma you had first
00:14:43.600
because that's the format you use to deal with the future.
00:14:48.160
And the traumatic period that we're going through right now, people are losing their jobs.
00:14:54.460
People who have worked their entire careers to make a line that looks beautiful.
00:14:59.220
That line can now be duplicated by an AI in 30 seconds.
00:15:13.440
We in the Industrial Revolution have defined ourselves by what we do.
00:15:25.280
All of these different, you know, blacksmith, goldsmith, all of these different
00:15:32.940
What if all of a sudden you can't make a living from the things that you spent your life doing?
00:15:44.380
And some of the people who are talking about it, unfortunately, are indoctrinating you into
00:15:54.280
You know, come back for your next half hour therapy session.
00:15:57.000
And we'll work through this grief and then we'll do it next week and then next week and
00:16:04.380
Trauma has to be dealt with by facing it full on.
00:16:08.440
And I, in part one, I open up a series of books that you can use to face trauma.
00:16:15.360
I think one of the most universal, and it's hard to take, is Teal Swan.
00:16:22.280
Teal is a victim of trauma and she's overcome her trauma tremendously.
00:16:29.020
It's very apropos with what's going on with Epstein files and things of that nature.
00:16:36.200
I've gotten to sit through seminars and I've gone through her training sessions to try to
00:16:43.020
Let me tell you, it is not normal psychotherapy.
00:16:50.620
And what we do as humans is we encase our trauma into a little ball and there are layers of onion
00:16:56.740
and we hold it deep inside and we don't look at it.
00:17:00.200
And that's the ghost that's always going to chase you the rest of your life.
00:17:03.080
A lot of us guys who choose technology and mostly guys that do this, we want to rationalize
00:17:13.880
I, as a kid, wanting to go to Princeton, wanting to become a subatomic particle physicist, I
00:17:26.940
Don't give me your fuzzy, stupid explanations of woo-woo garbage.
00:17:33.600
I thought I understood what a fact was and what the truth was.
00:17:40.040
And when you really, really dive into subatomic particle physics and understand quantum mechanics,
00:17:47.220
you realize you don't understand quantum mechanics and you don't understand physics.
00:17:56.400
And your observation is only as good as the tools that you use to perform that observation.
00:18:10.040
If I went up to you in 1760 and I said, hey, guys, there's trillions of creatures on my hand.
00:18:20.860
I might even have evidence that there are trillions of creatures on my hand.
00:18:24.280
But it's not evidence that the crowd would approve of.
00:18:29.940
And then an invention takes place, the invention of the microscope.
00:18:34.640
And literally, the person that discovered this almost committed suicide.
00:18:41.260
The whole idea of pasteurization, the whole idea of a surgeon washing his hands before he
00:18:48.340
goes and delivers a baby after dealing with gangrene.
00:18:51.720
You've got to remember, if you study history, a surgeon thought you were woo-woo out of this
00:18:59.140
earth, crazy, that you would dare to tell him to wash his hands after dealing with one
00:19:20.980
When we become arrogant, we always think that this is a generation that knows everything.
00:19:28.200
And everything that is in the past, well, they were just stupid and now we're smarter.
00:19:34.460
Everything that we think is a fact today is probably going to be laughed at in less than
00:19:41.460
Because new tools of observation will allow us.
00:19:48.120
Now, getting back to the emotions, we think that we have our emotions in check.
00:19:54.620
Just don't rub me the wrong way and everything.
00:19:57.160
We become reactive because we have coping skills.
00:20:01.300
Our coping skills are developed when we're children.
00:20:06.720
We have to develop coping skills or we wouldn't be here right now.
00:20:10.400
We are the definition of a survivor of coping with whatever we thought was trauma.
00:20:17.600
Now, your trauma and my trauma is going to be markedly different.
00:20:21.100
My trauma might be, I stuck my finger in the light socket and that really messed me up.
00:20:28.780
To me, that trauma is just as big as the trauma of being tortured.
00:20:33.360
Or parents that didn't care, who were drug addicts, who put their cigarettes out on me.
00:20:44.300
It doesn't change in the context of the human body.
00:21:07.240
When we look at somebody else's problems and we look at ours and say, well, mine are worse.
00:21:11.780
And then we find somebody in the street who has no legs and they're barely getting by and say, well, theirs are worse.
00:21:18.020
And yes, of course, you know, there are different metrics that we use universally.
00:21:26.340
If you don't deal with them, and I mean really deal with them, when you face what we're facing...
00:21:31.820
And I don't want to be doom and gloom because it's not about that.
00:21:36.380
It's just everything we thought was solid is going to change.
00:21:47.380
Because it's the only thing that we can hold on to that makes sense of a crazy world.
00:21:51.780
Like when I wanted the world to be ones and zeros and be logical and, you know, come on, you guys.
00:22:04.940
And an atom 12 billion miles away can be affected by an atom right here faster than the speed of light.
00:22:14.580
You know, superluminal is how they would term it.
00:22:21.960
And then when you realize it is true, then you realize how much you don't know.
00:22:28.900
And the people who are in control of access to our psyche have a meter and a credit card slot.
00:22:42.480
He was showing us patterns to understand and ways to see the world,
00:22:47.420
not by painting a turd gold, but by reframing things into the proper context.
00:22:57.360
Everything that ever was made was an imagination in somebody's mind before it was made.
00:23:04.860
So part of the way you fix yourself is absolutely accepting the fact that you face some sort of trauma.
00:23:13.200
Now, that doesn't mean you victimize and label yourself.
00:23:15.800
I am diametrically opposed to victimizing and labeling.
00:23:21.820
I think you have those tendencies which you can change.
00:23:26.760
Labeling is a disempowering system and tool that I think, you know, Scott would reframe into saying,
00:23:33.860
well, no, that's how I make myself stronger by, you know, I had this impediment with my voice.
00:23:39.920
Well, I found a way to reframe it so that now I can speak.
00:23:47.180
But what's different this time than any other point in history,
00:23:50.000
and myself and Owen, I'm sure all you guys study history quite a bit,
00:23:53.880
is we've never had this happen at such a massive scale all at once
00:23:59.400
with the backdrop of absolutely chaotic economics, philosophies, and just direction of life.
00:24:13.000
So, the only thing that most people had to hold on to that they thought was solid was their career
00:24:20.080
And I'm going to tell you very clearly, you need to hear it.
00:24:25.280
And I don't care if an AI can do it 10 times better.
00:24:32.000
And the metric that you're using is wrong, right?
00:24:36.740
I did not fall apart when my slide rule became obsolete. I learned to use a slide rule, right?
00:24:42.660
I can do differentials, I can do any type of equation on a slide rule, within reason.
00:24:50.480
When a calculator and the spreadsheet came along, I didn't fall to pieces.
00:25:00.960
And I said to myself, well, I don't need to use this anymore.
00:25:05.640
And there was some value that was achieved by me by learning that skill.
00:25:11.460
And in the post-scarcity world, in the abundance world, which we are going to get to,
00:25:20.580
I say it's 5,000 days when we finally see that.
00:25:36.160
And it's designed for people to actually focus on the fact that no matter how many hand grenades
00:25:42.020
are going off around them, because you're going to see people in power freak out.
00:25:47.220
Because freedom of speech, control of information, hierarchies of ivory towers,
00:26:00.500
I have local AIs, so that if the whole internet comes down,
00:26:05.720
I can still solve most of the world's problems within a model that fits into my laptop.
00:26:11.700
And very soon, a Raspberry Pi that costs me 60 bucks.
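As a concrete illustration of the kind of local, offline AI Brian is describing, here is a minimal sketch using the llama-cpp-python library with a downloaded open-weights GGUF model; the model path and prompt are placeholders, not anything from the episode:

```python
# Minimal sketch: run an open-weights model entirely offline with
# llama-cpp-python. The model file is a placeholder -- any GGUF model
# already downloaded to disk will work, no internet required.
from llama_cpp import Llama

llm = Llama(model_path="models/llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

# Inference runs locally on the CPU (or GPU, if compiled with support).
out = llm("Q: How do I purify water in an emergency?\nA:", max_tokens=200)
print(out["choices"][0]["text"])
```

Quantization is what makes the laptop claim plausible: a 4-bit 8B-parameter model fits in a few gigabytes of memory, and smaller models can run, slowly, on Raspberry Pi-class hardware.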
00:26:20.140
But that doesn't necessarily solve all the problems.
00:26:23.480
You and I, for at least 25 years, had access to all the information in the world.
00:26:31.420
And when I was a kid, and if you were to look at all of the people we stand on their shoulders
00:26:37.560
that built knowledge, they prayed for the day that they could have access to all the information
00:26:48.120
In fact, if you really want to look at the world, it's gotten worse.
00:26:52.340
Over the last 30 years, with all of this information and all of this connectivity
00:26:57.500
and all this socialization electronically, society has gotten, by every metric, worse.
00:27:04.120
I thought you said not to be scared about that.
00:27:10.040
Because we're like the cats with infinite lives.
00:27:16.280
Humanity is always going to overcome the trauma that's in front of us.
00:27:25.340
I can go into hours of explaining why that's true.
00:27:36.580
And I believe that with every fiber of my being.
00:27:42.640
I'm not saying that everything in between will work.
00:27:49.500
You're the sum total product of all of your ancestors that have gone through unimaginable
00:27:56.400
traumas to make sure that you are here right now at this moment.
00:28:01.600
So whenever somebody gets a little banged up and a little depressed, look backwards at all
00:28:07.440
those people that made you become here right now.
00:28:11.360
Now, they might not know you, but they knew that that was what they were sacrificing for.
00:28:19.740
And we can sit here and we can get so angry at the world saying, look how bad it is.
00:28:28.920
The majority of us love each other, care about each other.
00:28:33.740
We might have a minor 1% difference in the way we might view the world.
00:28:37.880
But we all want to be left alone, to live a happy, healthy life, to raise children, and
00:28:46.040
And that's the universal directive, if you will.
00:28:53.500
That's not going to change no matter how many eugenicists or cyborg-loving people think that
00:29:00.720
they're going to splice themselves 50-50 with computer and human biology.
00:29:11.780
And you're going to tell me you're going to cyborg yourself into the singularity, and
00:29:19.280
Being locked up inside a robotic silicon world is the very definition of hell.
00:29:28.680
If you want to get biblical, that's what hell looks like.
00:29:35.660
So coming back, trying to reel myself in, we have to deal with our trauma.
00:29:42.120
And the way we're dealing with the world right now is a direct reflection of how we dealt with our trauma.
00:29:49.360
And again, you're not a victim, but you must start looking at the structures you use to solve
00:30:07.260
Or do you say, I need to adjust the way I think about this to move forward?
00:30:16.400
It's made us become so hyperreactive that, oh, I'm going to make this comment right now,
00:30:21.200
this libertard or this right wing, whatever your flavor is, right?
00:30:33.620
And you don't ever achieve anything but a slight hit off your crack pipe.
00:30:39.620
Because what it's doing is it's a neurotransmitter cascade that's dealing with a fear that you
00:30:46.360
have and you're projecting it outward to something very real.
00:30:50.540
I mean, there are incredibly bad things going on.
00:30:55.120
And we're finding more and more every couple of hours, right?
00:31:03.580
I don't think we're ready to really know what really was going on.
00:31:08.780
And if you don't adjust your ability to cope with this, I'm not saying accept it.
00:31:17.980
I'm saying channel your energy, understand why it's disempowering for you to fly off the
00:31:24.800
handle, understand why you see a news program having all these blinking lights and banners
00:31:44.880
I have more VHS tapes of news programs than probably anybody.
00:31:49.600
And, you know, it wasn't until Ted Koppel's Nightline and the Iranian crisis that we ever
00:31:55.340
had these urgent every night, day 200 of the hostage crisis.
00:32:01.300
And it became a serial, addictive doom loop cycle.
00:32:06.200
And it became the archetype of what all CNN and all news programs have become to constantly
00:32:29.160
If you're thinking, oh my God, what's Brian saying?
00:32:38.620
Well, AI is going to help us do that because it's going to help us create, again, it's the
00:32:45.200
But at the very least, use Grok more than any other platform because it comes closer
00:32:49.880
to whatever you would want with a truth-seeking AI platform coming from a corporation.
00:32:57.800
But we will have these tools that are going to empower you and will detect the different
00:33:04.560
techniques that are being used to excite your neurotransmitter release of urgency, you know,
00:33:17.920
Aggression, you know, all of these different things.
00:33:20.500
If we don't get out of that state, we will never make rational decisions in our life.
00:33:25.640
And we've been victimized by it for quite a long time.
00:33:36.280
When you look at people that you objectively can say are hypnotized, they are, right?
00:33:43.520
They are reading into something and de-rationalizing it to fit a narrative for a team.
00:33:56.000
Just like I rejected physics initially because I wanted everything to be black and white,
00:34:03.420
The same is true with human personality and human goals.
00:34:06.700
When we were growing up, probably everybody here, when we had political discussions,
00:34:13.700
You know, oh, yeah, you got some liberal opinion there, buddy.
00:34:16.480
Or you sound like a curmudgeon conservative cigar chomper.
00:34:21.160
You know, today, it's you're ready to go to war.
00:34:23.940
And there's reasons for that because the teams have been co-opted by people in power
00:34:33.980
And the people who are victimized by it on all sides are playing the part to make sure
00:34:40.900
that the dust is going back and forth and you don't see the left hand of the magician
00:34:49.460
So those are the things that we're going to be facing.
00:34:53.940
I want to get to the good part at the end, or at least what you think the path forward is.
00:35:00.560
But, you know, the next part I think that I remember is going through the stages of grief.
00:35:06.520
What I took away was that you think we all need to go through that in the sense of losing
00:35:10.160
our identity as it relates to having a career or having work be a core part of what our
00:35:17.540
Um, and, you know, both dealing with trauma and going through the stages of grief, I think
00:35:29.160
Um, but let's, let's assume we do that, you know, and we come out the other side, we've
00:35:36.340
Um, and by that, I think what you were saying is you need to accept that work isn't your purpose
00:35:44.120
And that, um, you know, there's, you, you have to find a different purpose essentially,
00:35:50.320
but that, you know, you, you can't depend on your contribution as, at least as you knew
00:35:57.820
it before in the sense of doing routine tasks or, you know, that your production was kind
00:36:05.380
of your measure of value, that that's going to be gone.
00:36:35.480
So whatever you think is going to happen in the future, you can invest in it at Wealthsimple.
00:36:46.800
Um, I urge everybody to read The Hero with a Thousand Faces by Joseph Campbell.
00:36:54.600
I know I told everybody to read The User Illusion.
00:36:57.000
I hope some people are reading that, but Joseph Campbell is going to open up your mind
00:37:00.680
about the archetypes that we all have in our, in our subconscious.
00:37:04.700
They are called Jungian archetypes.
00:37:10.940
And Joseph Campbell looked at all the mythologies and myths of all these cultures and tied them
00:37:17.660
together into the monomyth and the hero's journey, which we are all on, whether we know it or not.
00:37:24.720
And I've tied the, uh, 5,000 days series to the monomyth so that you can actually have
00:37:32.360
So Elisabeth Kübler-Ross, um, and the five stages of grief is a really good mythology and a really
00:37:41.520
good method, uh, 'cause it's been around since before her. You have to understand that you're a
00:37:49.540
caterpillar and the caterpillar goes into a cocoon and its entire body is eaten alive by acid.
00:37:58.380
And it comes out, and what comes out? It comes out as a butterfly.
00:38:04.100
And when you, when you actually look at that and you say, okay, that's the metamorphosis,
00:38:09.360
that's a transformation, you know, everybody goes through this in their life to some degree
00:38:17.540
But this is happening across all cultures, all, all at once.
00:38:22.640
If you live in India right now and you were working at, um, a data center, AI just took
00:38:30.780
your job, and that was your leg up.
00:38:34.380
Now, none of us are living in India right now and we don't work at a data center, but, um,
00:38:42.680
And right now, most of the top-line AI, from Claude Code, Grok, uh, ChatGPT, they can code
00:38:57.460
Now that guy is going to go home one day to his wife and kids, built his entire life on
00:39:08.340
Now there are already suicides in India because of this.
00:39:21.120
It's one of the many themes that I've been studying.
00:39:37.940
And if that means you're going to be a farmer, a plumber, because I also in the series talk
00:39:43.640
about blue collar jobs, taking off like a rocket and people are going to choose to do
00:39:51.740
Now I have some friends who are, who are in these trades and say, I'm never going to love
00:39:59.180
And if they had to stop doing it, they would feel lost.
00:40:03.740
But if you're doing a knowledge job, AI is going to replace you.
00:40:10.740
Brian, I just want to say that that solves a problem.
00:40:13.540
I mean, so maybe that's a weird way of solving.
00:40:25.860
Maybe that's a way of like the universe solving a massive crisis we have where we don't have
00:40:31.700
people doing blue collar jobs and we can't build, we can't grow the infrastructure, everything.
00:40:38.580
So maybe it's going to suck for a while for people, but maybe it will show people like
00:41:00.480
You get on a train and you've spent a lot of money for the ticket and you're going to
00:41:06.380
And that ticket's $100,000, $200,000, $300,000, $400,000.
00:41:11.080
And then while you're on the train, the destination disappears because law, radiology, medical,
00:41:21.220
um, all of these, all of these jobs that as parents, Oh, I can't wait till my kid becomes
00:41:33.640
And the people who are on that train right now, they're screwed and there's nobody coming
00:41:41.780
By the way, do we want the government to come say, I'm sorry, I'll fix it.
00:41:48.060
They don't even, they don't know what we know right now.
00:41:53.880
They, I have conversations with people in government who are saying they're in kindergarten.
00:41:58.280
They're not even able to comprehend the economic impact of what this means.
00:42:03.440
Because the economic impact means money becomes worthless.
00:42:06.800
And I don't, we don't have time to talk about that, but all of the Kings that are sitting
00:42:10.900
on these great thrones of money, they realize that that money is going to become devalued.
00:42:18.600
And the, the levers of control are going to change whether we like it or not.
00:42:25.100
Now, the question is, is it going to happen in the West or is it going to happen in China?
00:42:30.520
So when you are becoming Ned Ludd, the Luddite, and taking out your baseball bat to take out the
00:42:36.200
next Tesla robot, because you don't want these clankers in your world, good.
00:42:41.040
China just won another score because they are doing that.
00:42:44.520
They are openly embracing this and oh, good for them.
00:42:48.960
Well, when they send 20 million of these over clanking across America and yeah, well, I'm
00:42:59.960
The reality is if you choose to become a Luddite, that means you choose your future and
00:43:07.220
the Luddites didn't make it through that future, right?
00:43:13.400
The candle makers said, I'm not going to give up my candle.
00:43:19.080
Their house might've burned down when the Edison light came.
00:43:28.100
Don't get mad at me that I'm giving some realities.
00:43:39.480
I feel some guilt because I've always been in tech and I've been cheering on tech, but
00:43:46.740
So the question is get out of the denial phase, right?
00:43:51.040
The five stages of grief, Elisabeth Kübler-Ross.
00:43:53.880
We have to get out of the denial phase as quickly as we can.
00:43:59.020
I mean, it's sad because I'm using that to deal with Scott leaving us, right?
00:44:17.520
You need real people face-to-face that you can see and hang out with.
00:44:25.580
I really think you need to align yourself into some philosophical system if you don't.
00:44:31.100
You know, all I know is anybody who gets deep enough in science becomes religious.
00:44:36.780
They may not openly say that, but that's what happens.
00:44:43.640
You have to do this because it won't be done for you.
00:44:49.780
Mother bird is not going to chew the food for you and feed you.
00:44:56.140
A lot of our friends who are very conservative are going to beg for government intervention to stop this.
00:45:06.060
And somebody in China is laughing their ass off.
00:45:14.820
Now we have to choose how we are going to deal with it.
00:45:23.220
I tried to do it 10 years ago, and I look even more bizarre.
00:45:31.720
And it's incumbent upon you to help the people around you and to have grace because they're not ready for it just like we aren't.
00:45:39.300
But you are already more expert in it, just from this conversation, than most people are, because they're in complete denial.
00:45:46.500
They don't even know that the body is dead and it hasn't fallen over yet.
00:45:51.240
They just think it's still alive and the worms are controlling it and it's a zombie, you know?
00:45:56.740
What I took away from your latest post, I think, was that one path is to become the conductor of the AI, to really embrace the tools, become an expert in using them, and also focus on the human aspects that AIs are not good at.
00:46:11.020
Meaning being a tastemaker, being a curator, being able to maybe run the AI 10 times and say, that's the one out of the 10 that's the good one.
00:46:19.620
You know, it's similar to what Scott would say about how he can just intuitively know when something's funny and he might have a lot of ideas, but it was, you know, he would be the one that says, this is the good one.
00:46:32.620
And that's something that, you know, not everyone's good at, but you might have some natural talent in a particular field or in a particular area where you've developed that intuition and you can know what the right thing is.
00:46:44.780
And if you have that expertise, then that may be a durable skill that you could leverage with AI and use a bunch of AI workers to go carry out your tasks, but you'd be the one making the conducting decisions saying, okay, this is the task that should be done now.
00:47:00.800
And these are the good results, these are the bad results, and becoming kind of the top of the food chain.
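As a rough illustration of that conductor pattern, here is a minimal sketch: generate N candidates and let the human pick the keeper. The generate function below is a hypothetical stand-in for any model call, local or hosted:

```python
# Minimal sketch of best-of-N curation: the AI produces candidates,
# the human taste-maker picks the winner.
import random

def generate(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM call; returns a toy
    # variation so the sketch runs on its own.
    return f"{prompt} -> draft #{random.randint(1000, 9999)}"

def best_of_n(prompt: str, n: int = 10) -> str:
    candidates = [generate(prompt) for _ in range(n)]
    for i, text in enumerate(candidates, 1):
        print(f"[{i}] {text}")
    pick = int(input(f"Which is the good one (1-{n})? "))  # human judgment
    return candidates[pick - 1]

keeper = best_of_n("Write a one-line joke about meetings.")
print("Keeper:", keeper)
```

The design point is that the model supplies volume while the scarce ingredient, the developed intuition for which output is actually good, stays with the person.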
00:47:08.340
And then the other one was more of what I think of as kind of like just a bridge, but, you know, go become an electrician or a plumber, and then it's going to take longer for it to get there.
00:47:19.960
But, you know, ultimately that may only last for another, let's say, 30 years, but at least it's something that's also a lot harder for AI to do because they may never build a robot that can go under your sink and, you know, fix your particular sink because it's unique in the world.
00:47:36.360
Are there other paths or are those the two primary ones you see?
00:47:41.800
You know, again, we have to define what gives value in our life.
00:47:46.600
You want to know what everybody can do right now?
00:47:49.700
Become a better dad, become a better mom, become a better friend, become a better lover, spouse.
00:47:55.760
These are human skills that you are now going to have more time for, because that's what humanity did for 99.9% of our existence.
00:48:05.140
Before the Industrial Revolution, we were better parents.
00:48:11.800
We were better at being a village that cared for each other.
00:48:16.600
Not in some manifesto Karl Marx socialist type of way, but in a real loving way because love is the theme that has been the cohesive force that makes humanity stay together.
00:48:29.720
And we just have hallmarked it and commercialized it to make it almost repulsive, like a really bad perfume.
00:48:39.780
It's like, ah, it would smell good in the drop, but man, you know, this is what we need to be able to do.
00:48:46.620
And it's going to be hard for some folks because they're so used to putting their nose to the grindstone.
00:49:07.240
When you are in your bed in your last minutes, you're not going to care about that report that you did.
00:49:14.940
You're not going to care about, you know, that gear that you made so perfect.
00:49:19.440
You're going to care about the people that have touched you and you have touched.
00:49:23.740
And do you want to be the richest person in the graveyard?
00:49:34.160
Everybody's name that you know today that is famous will not be known a thousand years from now.
00:49:40.280
So whatever we've been driven by to make ourselves, it's "I'll show them," because most of us guys, we have to prove our worth.
00:49:57.840
This is going to be a crisis for a lot of guys because it's built into our DNA.
00:50:02.420
I don't want to get into the sexist kind of thing, but it is the reality.
00:50:07.420
And all of a sudden you take away our purpose for existence.
00:50:22.040
And Owen points out, you know, can you become a plumber if you're a lawyer?
00:50:33.000
The problem was that we made it dishonorable to begin with.
00:50:38.960
And the ivory tower world, remember, they created this world where you have these hierarchies of you spent a few hundred thousand dollars, you get this credential.
00:50:59.860
You go in through it, then you get out of it, and you get recognized.
00:51:03.640
For 99% of our existence, that's not how we judged a hierarchy.
00:51:15.740
The communities that didn't work, they didn't get to reproduce.
00:51:21.000
We are the victors by the very reality that we exist.
00:51:26.380
The ones that didn't have the right programming, they didn't exist.
00:51:30.480
The ivory tower will tell you it's survival of the fittest.
00:51:49.640
It is not I'm going to elbow this guy and elbow this guy to get ahead.
00:51:53.300
It is the ability to adapt to a changing environment.
00:51:58.480
We are the only species that I know of that are born naked.
00:52:11.220
When it's born, it's already born into its environment.
00:52:13.700
If we are born in most of the world and we don't find invention and creativity, we will either starve or freeze.
00:52:22.020
We, as humans, our directive from God is to go out there and invent and to build safety.
00:52:32.280
The very first thing we do is build a wall around ourselves to protect us from the outside.
00:52:38.800
Anybody telling you not to do that is anti-human and does not understand history.
00:52:45.800
You need protection because we are vulnerable because we are born naked.
00:52:53.520
One of the very first things we did, mostly guys, first was to get a loincloth for freak's sake.
00:53:01.240
Because we could get ourselves in trouble running around the woods without a loincloth.
00:53:18.340
Ooh, that makes our food taste a little better.
00:53:29.440
All of these things were going on long before we ever came about.
00:53:44.160
It wasn't the 1950s, really; it was World War II that brought the idea of breaking up the family unit as a solid structure.
00:53:55.280
And that emergency we call World War II said, okay, Rosie the Riveter.
00:54:00.680
Now, I'm not talking about the philosophy of suffragism and Bernays and what he did with his freedom sticks.
00:54:11.080
I'm talking about the most solid unit that humanity has ever created was a family.
00:54:16.880
And the thing that almost everybody in power wants to do is make sure the family doesn't work.
00:54:27.560
And all I'm doing is looking at the objective truth of how we got here.
00:54:31.700
We did not get here by the philosophical systems that we're using today.
00:54:37.040
In fact, it's objectively proven that those philosophies have failed because we are now going into de-evolution.
00:54:47.640
When you start de-evolving, you realize that the philosophical underpinnings that you're using are no longer valid.
00:55:08.340
I can't tell you who you are, but I can tell you that there will never be another person like you ever born.
00:55:27.780
We only have a few minutes left, and we promised people we were going to ask questions.
00:55:32.180
But you were just cooking so well, and we wanted to hear everything you said.
00:55:37.480
I know Marcella said she wrote down questions for you for next time, and I think Sergio did too.
00:55:42.960
And Sergio, Marcella, if you forgive me, I just want to ask Brian, because this is very important to Scott, his estate, to us, to our future,
00:55:51.100
that we just take the last few moments to talk about something that's happening right now that a lot of you know,
00:55:58.240
and that's about using somebody's likeness, intellectual property, intelligence, everything for AI that you don't own.
00:56:09.140
So Brian can speak more eloquently about this, but I just wanted to use this last bit of time.
00:56:17.100
So you guys know that there's just been, you know, people across the board that want to make a clone of Scott or Scott's son or Scott's dog or whatever.
00:56:27.100
And Brian has written about this and thought about this extensively, and I just think it's important to listen to someone who is on the forefront of this.
00:56:40.080
And like he said, he already thought about the problem and worked it backwards.
00:56:43.300
Um, so he, he said he would speak to us about that for the last few minutes and he's coming back again and again and again,
00:56:51.080
and we're going to do a long form interview with Brian in, um, on locals exclusively.
00:57:03.240
Um, Sergio, did you want to chime in at all before we get there?
00:57:36.900
Uh, we have to ask them next time though, but unfortunately, yeah, next time, but I'm going
00:57:43.040
Uh, and I just wanted to say that we have good questions and I'll pass them on.
00:57:51.600
Um, everyone, um, Sergio, will you drop Brian's handle in the YouTube chat and we'll do it here.
00:57:57.960
And please feel free to message Brian also, because maybe he can answer this.
00:58:04.780
I'm going to send him the questions and I think I'm going to interview him myself.
00:58:08.020
I'm going to have an interview just to ask him questions, you know, because I have like
00:58:14.920
Well, we're going to do a long form here on Scott's locals.
00:58:17.660
I mean, like I'm talking to him on the side, you know, maybe, I don't know.
00:58:22.900
Um, so, all right, we might not get Brian back you guys.
00:58:26.500
And I know he's bummed because he wanted to talk about this.
00:58:35.020
We, um, we want to make sure we can go long sometimes and we'll set up for that.
00:58:42.080
And we like to make sure the guests know that they have an in and an out time in case they
00:58:49.500
You know, for Scott, he could just go on as long as he wanted.
00:58:53.100
Um, but we will have Brian back on, um, maybe he can come back on next week and we'll start
00:59:02.520
And that way we could even just do like a question and answer show.
00:59:22.440
We're, we're committing you to like extra long shows for questions and answers all over
00:59:30.360
Um, so I was telling them we're going to do a longer, Oh, we're, we lost him again,
00:59:35.160
that we're going to do a longer form interview.
01:00:04.260
I just, um, wanted to let you know, I committed you to a long form interview on locals, um,
01:00:10.800
specifically because we have so many questions.
01:00:13.500
Um, so we're going to, we're going to schedule you in for that.
01:00:18.500
Cause we're going to just do questions and answers on locals.
01:00:21.660
Um, but if you could just take five minutes to talk about the AI.
01:01:05.440
Um, I really think we, as a society need to understand who owns our likeness, who owns
01:01:14.500
our face, our body, our DNA, our voice, uh, even our gut microbiome.
01:01:23.180
Um, I, I think it's vital that we have this conversation and do it as soon as possible,
01:01:28.000
because if we don't own ourselves, then who are we?
01:01:38.960
And if you don't have that, and you don't have it organized within the structure of a
01:01:47.540
And I think recent events with AI and likenesses are really important to start thinking in these
01:01:57.500
And I invite everybody to think in those terms because the world doesn't get to own us.
01:02:07.420
Um, and if you want to do the math and the logic about it, any other way does not work.
01:02:16.660
Um, and, um, that's, I think I can go into a lot more of it, but in a very short period
01:02:23.820
of time, I wrote a declaration, uh, about five, six years ago, called Who Owns Me.
01:02:28.540
You can look in my Twitter feed, X feed, um, just type in who owns you, and you can kind
01:02:35.840
Uh, I even, like I said, wrote a preamble and declaration of self-ownership, um, in the
01:02:47.560
And it's really easy to use this cut and paste technology that we have right now with
01:02:52.220
even just the last five years, let alone generative AI.
01:02:58.540
You don't want that world ahead unless you can claim ownership and rights to your own likeness.
01:03:04.760
Uh, I think, uh, it's important for people to understand that these are, these are just puppets.
01:03:10.560
There are puppets that look and sound like someone, but there's a puppet master feeding
01:03:20.480
Imagine, imagine Brian isn't with us one day and think how brilliant Brian is and all the
01:03:27.180
work and energy he put into creating all of this.
01:03:31.080
And then he's not with us one day and someone's like, well, there's, you know, hundreds and
01:03:35.060
hundreds of hours of him and I've got his voice cloned and I'm just going to make him
01:03:40.180
And what if it changes everything Brian's ever worked for?
01:03:43.660
You know, that, that just cannot be allowed to stand.
01:03:48.520
Such an important thing because, um, our likeness is going to be used whether we like it or not.
01:03:56.440
Uh, but there needs to be a structure that allows us to address this and it has to be
01:04:04.480
Um, I mean, there are some laws where people, uh, make pornography of people and, you know,
01:04:17.940
We form governments to protect individuals from groups and groups from individuals, right?
01:04:27.260
So I'm not asking for an oppressive government.
01:04:31.860
I'm asking for the reason why we organize as a government to do these things and for somebody
01:04:41.140
Number one, that's a big problem, especially for a well-known individual.
01:04:49.200
Cause I can tell you right now, AI is good, but unless you are really good at training
01:04:54.180
an AI, it's going to say stuff that a person would never actually say.
01:05:07.840
It's a monumental thing where you just are saying your answers in your own voice into
01:05:13.300
your own recorder that never goes on the internet.
01:05:15.780
I urge everybody to start doing that today because your wisdom is valuable.
01:05:22.700
And I say, it's time to not give it to, uh, you know, social media and anybody else record
01:05:30.600
And at the very least, if you don't use it in an AI project, that's local, you could share
01:05:40.720
Cause I want you to really answer those questions with all the intensity and emotions that you
01:05:45.760
should, because that's what you're trying to capture.
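For anyone who wants to act on that suggestion, here is a minimal sketch of a local-only recorder, assuming the Python sounddevice and soundfile libraries; the filename and duration are placeholders, and any offline recorder works just as well:

```python
# Minimal sketch: record an answer from the microphone straight to a
# local WAV file. Nothing here touches the network.
import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 44_100   # samples per second, mono
SECONDS = 120          # time allotted per answer

print("Recording... answer the question out loud.")
audio = sd.rec(int(SECONDS * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
sd.wait()  # block until the recording finishes

sf.write("question_001_answer.wav", audio, SAMPLE_RATE)  # stays on disk
print("Saved question_001_answer.wav")
```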
01:05:48.140
And in the process of doing that, you discover yourself and it's very important.
01:05:52.000
And it's apropos to the 5,000 days because you really got to get to know who the heck
01:05:55.940
you are because we're so busy trying to put food in our belly and a roof over our head.
01:06:01.520
We lost the compass of who we really are and we need to get back to that.
01:06:10.020
And I hope maybe in the future we could talk about the implications of somebody stealing
01:06:15.320
the likeness of an individual and what it really means for all of us.
01:06:20.020
And we also need to discuss that, you know, somebody could be paying somebody who knows
01:06:27.740
how to do it, to put words in that person's mouth and to change the course of history of
01:06:37.860
So, you know, you said it perfectly before when we were speaking, you know, I said this
01:06:42.700
also on here that, you know, oh, it starts with, oh, isn't this fun?
01:06:47.820
And I miss this person and it's making jokes and, oh, I like, you know, I like hearing it
01:06:52.520
and, oh, look, it's talking to me and whatever.
01:06:55.220
And then the next thing you know, it starts talking about elections or talking about wars
01:07:01.900
And now you're so used to this thing that you're thinking that this is really that person's
01:07:07.700
And it's easy to see how quickly you can become brainwashed.
01:07:14.300
And I think somebody like Scott actually knew this very well.
01:07:19.560
If you read enough of his books, you realize that you can get hypnotized into believing this.
01:07:24.200
And I believe if you look at the nuances of what he said in the past, he was predicting
01:07:31.840
And my interactions with him over X and such with AI, he was very concerned over what this
01:07:40.180
Sometimes you just throw your hands up and you say, oh, well.
01:07:43.180
But I think deep down inside, all of us have to start thinking, well, we don't want to be
01:07:50.760
And, again, I really think we need to, as a group, be very thoughtful in the way this
01:08:04.300
Yeah, there are provisions that allow that already.
01:08:06.980
But for the outright ownership and the outright redoing of what somebody has done in their
01:08:14.120
life and slightly shifting the words to the point they are 180 degrees from where they were.
01:08:23.840
Unless the people who really care about it have been entrusted.
01:08:27.220
Like, if you entrust it, the way Disney entrusted his legacy.
01:08:33.080
Do you think Walt Disney would be very happy with the Disney he'd see today?
01:08:37.300
No matter what you may think of Disney in the past, he is a very interesting character.
01:08:41.600
But certainly the Disney of 1960 is not the Disney of 2026.
01:08:51.480
This is going to happen magnified by a thousand as AI likenesses start taking over.
01:09:00.440
Thank you for taking the extra time to talk about that.
01:09:03.420
And we'll definitely get into it more as we go on.
01:09:06.040
And then, yeah, where do we find the thousand questions?
01:09:17.960
It'll be interesting to really get to know yourself.
01:09:28.340
And it's been so, um, it's been so, I can't even think of the word, like even more enlightening
01:09:38.240
And over the years, you know, I've, I've always communicated with you on X and, you know, it's
01:09:44.180
just so fun getting to talk to you now and getting to know you better.
01:09:50.680
Like there are such smart people and they're always craving knowledge and to know more.
01:09:58.500
And we're so thankful and we're so thankful that you're willing and agreeable to come back
01:10:12.540
Uh, I think we're all part of a really big group.
01:10:23.180
So I, I, I really got to thank you guys because, um, I don't know what the world would look
01:10:30.280
like if this group didn't exist, if what Scott did and the group that surrounded him did not
01:10:36.560
exist, because sometimes, well, always, it takes a single light to give a dark room
01:10:44.420
light, and all the darkness in the world can't put that light out, and you guys are the light.
01:10:53.900
Well, we will schedule our next one soon because we do want to do a Q&A with you.
01:11:00.080
And, um, until then I'll be chatting with you, but thank you so much, Brian and everybody
01:11:06.180
like Brian and I will do a closing sip and, um, please go out there and be useful.
01:11:12.020
Take the thousand question quiz, get to know yourself and then be prepared to be patient
01:11:18.440
while you help teach other people what's coming in 5,000 days or less now.