Real Coffee with Scott Adams - February 12, 2026


Episode 3093 - The Scott Adams School 02/12/26


Episode Stats

Length

1 hour and 12 minutes

Words per Minute

160.1

Word Count

11,540

Sentence Count

999

Misogynist Sentences

9

Hate Speech Sentences

6


Summary

On this episode of Scott Adams Locals, we're joined by returning guest Brian Ramelli and special guest Owen Gregorian to talk about how to deal with things that don't always go your way. Plus, we have a simultaneous sip.


Transcript

00:00:00.160 With the RBC Avion Visa, you can book any airline, any flight, any time.
00:00:06.500 So start ticking off your travel list.
00:00:08.960 Grand Canyon? Grand.
00:00:10.960 Great Barrier Reef? Great.
00:00:13.380 Galapagos? Galapago!
00:00:15.980 Switch and get up to 55,000 Avion points that never expire.
00:00:20.540 Your idea of never missing out happens here.
00:00:23.560 Conditions apply. Visit rbc.com slash avion.
00:00:30.000 Stuck in that winter slump? Try Dove Men plus Care Aluminum Free Deodorant.
00:00:36.020 All it takes is a small change to your routine to lift your mood.
00:00:39.380 And it can be as simple as starting your day with the mood-boosting scents of Dove Men plus Care Aluminum Free Deodorant.
00:00:44.940 It'll keep you feeling fresh for up to 72 hours.
00:00:48.580 And when you smell good, you feel good.
00:00:51.300 Visit dove.com to learn more.
00:00:54.140 All right, let's get going.
00:00:56.320 Make sure everyone goes live.
00:00:58.760 I see locals.
00:01:00.000 All right.
00:01:01.380 Sergio, tell me when you see YouTube live.
00:01:04.060 I see Gracie.
00:01:05.340 I see Montreal Galaxy.
00:01:07.160 Aw, we love you.
00:01:09.720 Good morning.
00:01:11.580 YouTube is live.
00:01:13.020 YouTube's live.
00:01:14.140 Rumble, you're in the house.
00:01:16.840 YouTube is not live.
00:01:18.420 Oh, we're waiting for YouTube.
00:01:20.080 We miss you, Paul Collider.
00:01:21.640 I see one YouTube user on.
00:01:23.960 Oh, wait, wait.
00:01:24.700 No, we're live.
00:01:25.440 We're good.
00:01:25.920 Okay, good.
00:01:27.020 You guys, welcome.
00:01:27.900 Hi, it's me, Erica.
00:01:30.060 I'm joined today by Owen Gregorian, the voice.
00:01:33.960 We have Sergio right there and Marcella, our beautiful Marcella.
00:01:39.760 We have returning guests today.
00:01:41.840 One of our fan favorites, Brian Ramelli.
00:01:45.340 You guys, I just want to remind you as always, this is the Scott Adams School, which is completely
00:01:51.160 different than Coffee with Scott Adams, which lives on its own in thousands of hours of
00:01:56.840 videos.
00:01:57.220 Thank God he gave us all that wisdom.
00:02:00.120 So please know that Coffee with Scott Adams, you can go back and watch as many streams as
00:02:05.080 you want.
00:02:05.940 Soak in the material.
00:02:07.020 And we encourage you to subscribe to Scott Adams Locals channel, where we have some exclusive
00:02:12.920 interviews lined up in the future.
00:02:15.220 You're going to be very excited about some of them.
00:02:17.600 You might even see this guy.
00:02:20.580 How do I point?
00:02:21.760 Next to me in one of them also in an in-depth conversation about something very important.
00:02:27.120 So before we get to that, we're going to have a simultaneous sip.
00:02:32.480 And I just want to say it's a little throwback to when things just don't always go right with
00:02:37.160 Scott.
00:02:37.680 So Shelly, take it away for us.
00:02:47.520 I'm sorry, Erica.
00:02:48.740 I did not realize that you wanted me to do that.
00:02:51.920 I didn't put it on.
00:02:53.120 I didn't get one from you.
00:02:55.620 Oh, okay.
00:02:57.140 Let, you know what?
00:02:58.160 See how perfect, how apropos.
00:03:00.180 Let me pull it up.
00:03:02.580 That's so funny.
00:03:03.520 See, things don't always go right.
00:03:04.860 And that's what it was today.
00:03:07.860 Let's go.
00:03:09.140 So in the meantime, let's get it here.
00:03:17.260 Who wants to riff?
00:03:21.860 Does everybody have their flask ready?
00:03:25.820 Everybody's ready.
00:03:27.480 Yay.
00:03:28.360 Somebody find some papers to bang on a desk.
00:03:30.860 Yeah.
00:03:31.140 Someone bang some papers.
00:03:33.520 I think that's the book.
00:03:36.940 Oh my gosh.
00:03:38.620 Okay.
00:03:39.140 Here we go.
00:03:41.080 Let's search.
00:03:42.660 No pressure.
00:03:43.180 No pressure.
00:03:44.120 No, I don't feel any pressure.
00:03:46.000 No.
00:03:47.000 Gmail just closed.
00:03:48.580 It's all good.
00:03:49.320 It's all good.
00:03:49.380 You guys.
00:03:52.040 Da, da, da.
00:03:53.380 I feel excited.
00:03:54.200 Bum, bum, bum.
00:03:56.920 Brian, did you resist doing the simultaneous sip initially?
00:04:01.760 Yes.
00:04:02.160 But I got it.
00:04:05.860 I did too for a while.
00:04:07.400 You did?
00:04:08.620 Yeah.
00:04:09.020 It's funny.
00:04:09.960 A number of people have asked me about that, but I finally got it, I think, maybe a couple
00:04:15.780 of weeks in because I wasn't sure where he was going with it.
00:04:19.620 Same thing, Owen, for you?
00:04:21.980 Yeah.
00:04:22.580 Well, he kind of introduced it, at least eventually, as like, this is my way of hypnotizing you.
00:04:28.240 And so I'm like, well, I'm not going for that.
00:04:30.960 I just, I had that natural response, I think, that probably a lot of people had that were
00:04:34.880 like, I'm not doing it.
00:04:35.780 But I think there are probably at least a few people that just never did it.
00:04:39.420 And I think they missed out.
00:04:41.460 Yeah.
00:04:41.680 I'm addicted.
00:04:43.160 It's neuro-linguistic programming, and I detected it pretty early on.
00:04:47.860 To the fact that everything's just good.
00:04:49.120 I always found it interesting.
00:04:50.020 Mm-hmm.
00:04:51.340 Okay.
00:04:51.840 I got it.
00:04:53.560 You guys, this is the perfect sip for what's happening right now.
00:04:57.280 Okay.
00:04:57.620 Ready?
00:04:58.920 I'll hit, you guys hit mute, and we're going to play Scott.
00:05:02.100 And here we go.
00:05:12.040 That's a terrible touchy.
00:05:15.880 Come on.
00:05:16.380 Come on.
00:05:16.480 Come on.
00:05:17.860 Come on.
00:05:20.020 Did you just disappear?
00:05:27.040 When I tapped my papers, did the picture just disappear?
00:05:32.840 Because it did throw me out of the app.
00:05:37.060 How could everything go wrong?
00:05:39.740 How's that even possible?
00:05:40.840 All I did was this.
00:05:45.500 I won't do it again.
00:05:48.700 And I turned off the app.
00:05:50.100 This is so beyond, this is so beyond possible.
00:05:57.680 I mean, we're into some statistical impossible situation here.
00:06:01.780 It can't be that all the apps died at the same time.
00:06:05.200 They couldn't all be broken at once.
00:06:07.720 Could they?
00:06:08.220 I know there's a massive incompetence problem, but that's pretty impressive.
00:06:14.360 Anyway.
00:06:14.740 If you'd like to take your experience today up to levels that nobody can even understand
00:06:25.040 with their tiny, shiny human brain.
00:06:26.800 None of it makes sense today, does it?
00:06:29.900 The preamble doesn't make any sense.
00:06:32.960 Because the starting assumption is that things went right.
00:06:38.780 So the preamble doesn't make sense.
00:06:40.560 Let's just do the, let's just do the, let's just do the side of the video soon.
00:06:46.740 Let's just surrender.
00:06:48.860 Shall we surrender to the fact that everything's just going to go wrong today?
00:06:53.980 Just absolutely everything's going to go wrong.
00:06:57.680 Sip to that.
00:06:59.160 Sometimes it goes that way.
00:07:06.900 How perfect was that for what happened just before?
00:07:10.560 I thought, Oh my gosh, that's a little wink and a nod right there.
00:07:14.160 But we did love, we did love when things went wrong because it was just always funny.
00:07:19.840 And I'll sip to that.
00:07:21.320 So you guys, um, I did send a link out on X yesterday, um, for Brian's series.
00:07:27.940 And a lot of you said that you have been reading it and enjoying it.
00:07:33.040 Some of us are frightened and scared and there is a whole special section on Scott also, and
00:07:41.140 on his influence.
00:07:42.520 And I was just looking, 'cause I'm going to botch the name.
00:07:45.340 It's 5,000 Days, help me, Brian, to the End of Work as We Know It.
00:07:54.620 No way.
00:07:55.220 Okay.
00:07:55.440 As you know it.
00:07:56.040 Okay.
00:07:56.220 So with that, we welcome back Brian and you guys, Sergio is going to be watching your
00:08:01.980 comments on YouTube and Marcella is going to be watching Locals to see if you have any
00:08:06.780 questions for Brian as we talk, but you guys really loved him last time and everything
00:08:11.720 he had to say.
00:08:12.540 So we said, you have to come back again and again and again.
00:08:15.740 So Owen, I'm going to let you start because I know you had something ready.
00:08:19.320 You wanted to ask Brian.
00:08:21.700 Well, my first question I think relates to some of the earlier posts in the series where
00:08:26.740 you're saying essentially a lot of what people need to do is work through grief and go through
00:08:35.880 like rediscovering play.
00:08:37.460 And it was all focused on emotional recovery, essentially like processing trauma and things
00:08:43.600 like that.
00:08:44.140 And that surprised me just cause I guess as a technologist, I kind of expected, okay, here's
00:08:50.240 the plan.
00:08:50.680 Here's what you need to do.
00:08:51.480 Let's go do it.
00:08:52.640 And instead you're kind of taking this left turn into, you can't really do anything until
00:08:58.400 you've healed yourself.
00:09:00.180 Can you talk a little bit about why you think that's so important?
00:09:03.620 Oh, and thank you.
00:09:04.380 Brilliant.
00:09:04.780 And thank you for having me back.
00:09:07.140 I'm so honored.
00:09:08.360 I so miss Scott and I miss the audience.
00:09:11.840 So, okay.
00:09:13.400 So I don't believe that we can get through the next 5,000 days.
00:09:18.240 I call it the interregrum.
00:09:20.060 It's a Latin word.
00:09:21.740 And this is when the old king dies and a new king needs to be crowned.
00:09:25.760 And it's the only word I can find in this middle space where the end point is going to
00:09:32.380 be abundance.
00:09:33.680 It's not going to be utopia, but it's going to be abundance.
00:09:37.460 And I'll just briefly touch upon it economically and philosophically, and then I'll get into
00:09:43.680 the trauma.
00:09:46.380 Philosophically, what's going to happen is there's going to be scaling of AI and robotics
00:09:53.380 to such a level where AI is building AI and robots are building robots.
00:09:58.360 Now, let's get away from the Terminator and negative side of it.
00:10:03.220 Let's just look at the center line.
00:10:05.680 The center line is everything's going to become inordinately less expensive, even the robot.
00:10:12.000 At some point, and I know the initial reaction, mine, everybody, is going to be, I'm never
00:10:17.680 going to afford that.
00:10:18.580 They are going to control it.
00:10:20.320 It's going to be so inexpensive.
00:10:22.680 There's no control anymore.
00:10:25.060 Right?
00:10:25.360 This is why we're seeing the wheels come off the cart.
00:10:29.140 There's a lot of reasons why the world is the way it is.
00:10:32.100 And people in power are concerned because the control mechanisms are no longer there.
00:10:37.880 That is scarcity.
00:10:40.160 We lived through a world of scarcity, but we had a world of abundance at one time.
00:10:45.620 Most of human existence was abundance.
00:10:47.980 And what I really mean by that is if you were hungry, you went to a tree and you pulled an
00:10:52.080 apple off.
00:10:53.320 Right?
00:10:53.740 If you're hungry, you pulled up a root.
00:10:55.960 If you're thirsty, you went to the water hole.
00:10:58.420 Now, I'm not saying there wasn't adversity.
00:11:00.300 Adversity is always going to exist.
00:11:02.100 But we had a mentality of abundance.
00:11:05.200 And it's very much if you go biblically and you go to the Garden of Eden, the story there
00:11:10.000 is really about abundance.
00:11:12.200 And the knowledge is what got you out of that abundance.
00:11:16.660 Right?
00:11:17.020 And I don't want to go too deep into that.
00:11:18.920 We can maybe at some point.
00:11:20.720 But where we are heading is the things we thought that were really expensive are going
00:11:26.220 to become less expensive.
00:11:28.360 And this includes even energy.
00:11:30.660 Right?
00:11:31.140 Because as AI continues to build more AI, it's going to solve many more of the problems or
00:11:38.980 surface the answers that we already had.
00:11:41.660 And either we didn't know it or it wasn't widely available for all of us to know it.
00:11:46.140 And that's going to come whether or not people in power want that to happen, because this
00:11:53.020 is a democratization of cognitive intelligence.
00:11:56.760 And it's a democratization of robotics and labor.
00:12:01.320 Now, is that going to happen tomorrow?
00:12:03.520 No, we barely have a bipedal humanoid robot, but it will happen.
00:12:09.820 And so a lot of people ask me, how do I know the future?
00:12:13.200 It's very easy.
00:12:14.360 You find the point in the future and you work your way backwards.
00:12:18.440 You never work forward.
00:12:19.900 So you're never looking in the rearview mirror.
00:12:22.300 You're looking forward and then you're coming back.
00:12:25.360 Okay.
00:12:26.000 It's the middle point that is always going to be random.
00:12:28.840 The end point is always going to be clear.
00:12:30.520 We will get to that point.
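
Brian's "find the point in the future and work your way backwards" method can be sketched as backward chaining over a dependency graph. Below is a minimal Python sketch; the milestone names are hypothetical examples for illustration only, not taken from the episode.

# A toy model of working backwards: start at the end point ("abundance"),
# recurse into its prerequisites, and emit milestones in the order they
# must arrive. All milestone names here are hypothetical.
requires = {
    "abundance": ["cheap energy", "cheap robots"],
    "cheap energy": ["AI-designed power systems"],
    "cheap robots": ["robots building robots"],
    "AI-designed power systems": [],
    "robots building robots": [],
}

def plan_backwards(goal, requires):
    """Depth-first from the goal; returns milestones ordered present-to-goal."""
    ordered, seen = [], set()
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for dep in requires.get(node, []):
            visit(dep)
        ordered.append(node)
    visit(goal)
    return ordered

print(plan_backwards("abundance", requires))
# -> ['AI-designed power systems', 'cheap energy',
#     'robots building robots', 'cheap robots', 'abundance']

In this toy model the goal node is fixed and only the traversal of the intermediate nodes varies, which mirrors his point that the middle is random while the end point is clear.
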
00:12:31.780 And a lot of people like me were raised on dystopian movies.
00:12:38.260 And so we always tend to look at the car crash.
00:12:41.180 We slow down and say, oh, that's built into our programming.
00:12:46.280 And Scott was really, you know, you read his books, very much into why we do that.
00:12:51.780 Why are we fixated on certain things?
00:12:54.980 When it comes down to our fear, our fear is the robot's going to take over.
00:13:01.080 It's going to be Terminator.
00:13:02.600 Yeah, variations of that are going to happen.
00:13:06.340 But the end result is there will be an equilibrium.
00:13:10.680 It's that process of reaching equilibrium.
00:13:14.160 Psychopaths and sociopaths are always going to exist.
00:13:17.780 They are a very small portion of society.
00:13:20.800 They tend to get into power.
00:13:22.900 But the beautiful thing about where we're going is we are dethroning them by the democratization
00:13:29.160 of these tools.
00:13:29.900 That's why you don't shun the tools.
00:13:32.680 You use them.
00:13:33.940 And if they're using it as a hammer to bang people over the head, fine.
00:13:37.220 If they're using the fire to burn people, fine.
00:13:39.860 We, the majority, are going to use it to build a house, to build a structure.
00:13:45.020 We use the hammer as a tool to let us not have to suffer the consequences of what happens
00:13:52.980 with weather.
00:13:53.920 You know, we get a little cold and we want to be in a home.
00:13:56.060 So we build a home.
00:13:56.920 We use tools to serve humanity.
00:14:00.400 AI is a tool and it's a co-collaborator.
00:14:03.000 Now, there's a lot more to it than just a hammer.
00:14:05.680 But again, anything can be formed into a weapon.
00:14:08.980 Do not fall for the reframe that this is something that's going to get you because your fear is
00:14:16.400 your dispowerment.
00:14:17.540 Now, why do I get into trauma?
00:14:21.280 Because from zero to eight, most of us have experienced some form of trauma.
00:14:26.120 Some have it as external scars.
00:14:28.800 Some have it as internal scars.
00:14:31.140 But everybody experiences that.
00:14:33.040 It defines the trajectory of our life, whether we like it or not.
00:14:37.520 And when you are going through trauma, you have to deal with the trauma you had first
00:14:43.600 because that's the format you use to deal with the future.
00:14:48.160 And the traumatic period that we're going through right now, people are losing their jobs.
00:14:54.460 People who have worked their entire careers to make a line that looks beautiful.
00:14:59.220 That line can now be duplicated by an AI in 30 seconds.
00:15:05.080 Who are you if you no longer are your job?
00:15:09.260 You have to face that existential definition.
00:15:13.440 We in the Industrial Revolution have defined ourselves by what we do.
00:15:18.240 In fact, our very names were what we did.
00:15:22.540 Cooper, a barrel maker, right?
00:15:25.280 All of these different, you know, blacksmith, goldsmith, all of these different
00:15:28.640 professions literally were our name.
00:15:32.940 What if all of a sudden you can't make a living from the things that you spent your life doing?
00:15:39.340 That is a crisis.
00:15:41.140 And nobody is talking about it.
00:15:42.940 I chose to talk about it.
00:15:44.380 And some of the people who are talking about it, unfortunately, are indoctrinating you into
00:15:49.640 a system of medication and forever therapies.
00:15:54.280 You know, come back for your next half hour therapy session.
00:15:57.000 And we'll work through this grief and then we'll do it next week and then next week and
00:16:01.360 forever.
00:16:02.240 That's not how you solve trauma.
00:16:04.380 Trauma has to be dealt with by facing it full on.
00:16:08.440 And I, in part one, I open up a series of books that you can use to face trauma.
00:16:15.360 I think one of the most universal, and it's hard to take, is Teal Swan.
00:16:22.280 Teal is a victim of trauma and she's overcome her trauma tremendously.
00:16:29.020 It's very apropos with what's going on with Epstein files and things of that nature.
00:16:33.500 Her trauma was tremendous.
00:16:36.200 I've gotten to sit through seminars and I've gone through her training sessions to try to
00:16:41.060 understand it.
00:16:43.020 Let me tell you, it is not normal psychotherapy.
00:16:47.520 It is about facing trauma head on.
00:16:50.620 And what we do as humans is we encase our trauma into a little ball and there are layers of onion
00:16:56.740 and we hold it deep inside and we don't look at it.
00:17:00.200 And that's the ghost that's always going to chase you the rest of your life.
00:17:03.080 A lot of us guys who choose technology and mostly guys that do this, we want to rationalize
00:17:11.960 an irrational world.
00:17:13.880 I, as a kid, wanting to go to Princeton, wanting to become a subatomic particle physicist, I
00:17:20.840 wanted the world to be black and white.
00:17:22.560 I wanted it to be logical, one or zero.
00:17:25.200 I wanted it to be very clear.
00:17:26.940 Don't give me your fuzzy, stupid explanations of woo-woo garbage.
00:17:31.200 I wanted it to be factual.
00:17:33.600 I thought I understood what a fact was and what the truth was.
00:17:40.040 And when you really, really dive into subatomic particle physics and understand quantum mechanics,
00:17:47.220 you realize you don't understand quantum mechanics and you don't understand physics.
00:17:51.960 What you have is an observation of something.
00:17:56.400 And your observation is only as good as the tools that you use to perform that observation.
00:18:02.800 On my hand is trillions of creatures.
00:18:08.100 Do you see them?
00:18:09.600 No.
00:18:10.040 If I went up to you in 1760 and I said, hey, guys, there's trillions of creatures on my
00:18:18.680 hand, they'd lock me up.
00:18:20.860 I might even have evidence that there are trillions of creatures on my hand.
00:18:24.280 But it's not evidence that the crowd would approve of.
00:18:29.940 And then an invention takes place, the invention of the microscope.
00:18:34.640 And literally, the person that discovered this almost committed suicide.
00:18:41.260 The whole idea of pasteurization, the whole idea of a surgeon washing his hands before he
00:18:48.340 goes and delivers a baby after dealing with gangrene.
00:18:51.720 You've got to remember, if you study history, a surgeon thought you were woo-woo out of this
00:18:59.140 earth, crazy, that you would dare to tell him to wash his hands after dealing with one
00:19:06.660 patient and another.
00:19:07.440 What are you talking about?
00:19:08.340 There's nothing on my hands.
00:19:10.040 Well, there's a little smear of blood.
00:19:11.460 Let me take that off.
00:19:12.340 That's fine.
00:19:13.040 Oh, the pus?
00:19:13.660 Okay, yeah, it's gone.
00:19:15.000 Now I'm good.
00:19:15.860 Let's go deliver a baby.
00:19:18.480 This is how dumb we are.
00:19:20.980 When we become arrogant, we always think that this is a generation that knows everything.
00:19:28.200 And everything that is in the past, well, they were just stupid and now we're smarter.
00:19:33.460 Well, guess what?
00:19:34.460 Everything that we think is a fact today is probably going to be laughed at in less than
00:19:39.880 50 years.
00:19:41.460 Because new tools of observation will allow us.
00:19:44.040 So that makes you humble.
00:19:45.060 And that's the humbling prospect of it.
00:19:48.120 Now, getting back to the emotions, we think that we have our emotions in check.
00:19:52.380 Look at me.
00:19:52.880 I'm doing fine.
00:19:53.680 Everything's okay.
00:19:54.620 Just don't rub me the wrong way and everything.
00:19:57.160 We become reactive because we have coping skills.
00:20:01.300 Our coping skills are developed when we're children.
00:20:04.300 It is not rocket science.
00:20:06.720 We have to develop coping skills or we wouldn't be here right now.
00:20:10.400 We are the definition of a survivor of coping with whatever we thought was trauma.
00:20:17.600 Now, your trauma and my trauma is going to be markedly different.
00:20:21.100 My trauma might be, I stuck my finger in the light socket and that really messed me up.
00:20:27.480 But it doesn't matter.
00:20:28.780 To me, that trauma is just as big as the trauma of being tortured.
00:20:33.360 Or parents that didn't care, who were drug addicts, who put their cigarettes out on me.
00:20:38.840 However bad you want to imagine.
00:20:41.600 That is trauma.
00:20:42.940 But it's very real.
00:20:44.300 It doesn't change in the context of the human body.
00:20:48.360 And that's very important because we...
00:21:03.360 We tend to be scorekeepers.
00:21:07.240 When we look at somebody else's problems and we look at ours and say, well, mine are worse.
00:21:11.780 And then we find somebody in the street who has no legs and they're barely getting by and say, well, theirs are worse.
00:21:18.020 And yes, of course, you know, there are different metrics that we use universally.
00:21:23.500 But internally, they're very real.
00:21:26.340 If you don't deal with them, and I mean really deal with them, when you face what we're facing...
00:21:31.820 And I don't want to be doom and gloom because it's not about that.
00:21:36.380 It's just everything we thought was solid is going to change.
00:21:43.040 And we, as we get older, resist change.
00:21:47.380 Because it's the only thing that we can hold on to that makes sense of a crazy world.
00:21:51.780 Like when I wanted the world to be one and zeros and be logical and, you know, come on, you guys.
00:21:57.300 Up, down, strange quarks.
00:21:58.740 What are you talking about?
00:21:59.660 If I see it, it's there.
00:22:00.860 And if I don't see it, it's not there.
00:22:02.500 And there's a collapse of a wave function.
00:22:04.940 And an atom 12 billion miles away can be affected by an atom right here faster than the speed of light.
00:22:14.580 You know, superluminal is how they would term it.
00:22:17.440 I try not to use nomenclature.
00:22:18.820 Um, what are you, crazy?
00:22:21.960 And then when you realize it is true, then you realize how much you don't know.
00:22:26.860 And we don't know enough about the psyche.
00:22:28.900 And the people who are in control of access to our psyche have a meter and a credit card slot.
00:22:37.020 Folks like Scott democratized this.
00:22:40.800 He democratized it in his books.
00:22:42.480 He was showing us patterns to understand and ways to see the world,
00:22:47.420 not by paying a ton of gold, but by reframing things into the proper context.
00:22:53.520 Because what your mind believes, you conceive.
00:22:57.360 Everything that ever was made was an imagination in somebody's mind before it was made.
00:23:03.480 And this includes yourself.
00:23:04.860 So part of the way you fix yourself is absolutely accepting the fact that you face some sort of trauma.
00:23:13.200 Now, that doesn't mean you victimize and label yourself.
00:23:15.800 I am diametrically opposed to victimizing and labeling.
00:23:19.740 I don't think anybody is a whatever.
00:23:21.820 I think you have those tendencies which you can change.
00:23:26.760 Labeling is a disempowering system and tool that I think, you know, Scott would reframe into saying,
00:23:33.860 well, no, that's how I make myself stronger by, you know, I had this impediment with my voice.
00:23:39.920 Well, I found a way to reframe it so that now I can speak.
00:23:43.000 And we're going to all have to face that.
00:23:47.180 But what's different this time than any other point in history,
00:23:50.000 and myself and Owen, I'm sure all you guys study history quite a bit,
00:23:53.880 is we've never had this happen at such a massive scale all at once
00:23:59.400 with the backdrop of absolutely chaotic economics, philosophies, and just direction of life.
00:24:13.000 So, the only thing that most people had to hold on to that they thought was solid was their career
00:24:18.360 or their skill set.
00:24:20.080 And I'm going to tell you very clearly, you need to hear it.
00:24:23.060 It's still extremely valuable.
00:24:25.280 And I don't care if an AI can do it 10 times better.
00:24:28.980 It can't do it like you, right?
00:24:32.000 And the metric that you're using is wrong, right?
00:24:36.740 I did not fall apart when my slide rule, I learned to use a slide rule, right?
00:24:42.660 I can do differential, I can do any type of equation on slide rule, within reason.
00:24:48.240 You know, I need a chalkboard too.
00:24:50.480 When a calculator and the spreadsheet came along, I didn't fall to pieces.
00:24:55.160 I had a skill set that I learned, mechanical.
00:24:58.400 I had a round slide rule I like to use.
00:25:00.960 And I said to myself, well, I don't need to use this anymore.
00:25:04.300 But I still have that skill.
00:25:05.640 And there was some value that was achieved by me by learning that skill.
00:25:11.460 And in the post-scarcity world, in the abundance world, which we are going to get to,
00:25:20.580 I say it's 5,000 days when we finally see that.
00:25:24.940 Approximately, you know, 2040.
00:25:28.280 I pick 5,000 days because I'm a marketer.
00:25:30.200 And, you know, it sounds cool.
00:25:32.360 No, it's the best that one could imagine.
00:25:36.160 And it's designed for people to actually focus on the fact that no matter how many hand grenades
00:25:42.020 are going off around them, because you're going to see people in power freak out.
00:25:47.220 Because freedom of speech, control of information, hierarchies of ivory towers,
00:25:54.040 you can't pronounce that.
00:25:55.860 We have control over the information flow.
00:25:58.380 That's what's breaking down.
00:26:00.500 I have local AIs that I can, if the whole internet comes down,
00:26:05.720 I can still solve most of the world's problems within a model that fits into my laptop.
00:26:11.700 And very soon, a Raspberry Pi that costs me 60 bucks.
00:26:15.760 So the world can end.
00:26:17.180 And I still have the sum total of knowledge.
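
For readers wondering what "local AI" looks like in practice: a minimal sketch, assuming the open-source llama-cpp-python package and a quantized GGUF model file already saved to disk. The file path below is a hypothetical placeholder, not Brian's actual setup. Once the weights are local, this runs with no internet connection.

# Query a language model entirely offline from a local weights file.
from llama_cpp import Llama  # pip install llama-cpp-python

llm = Llama(
    model_path="models/example-7b.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,      # context window
    verbose=False,
)

result = llm("List three ways to purify drinking water.", max_tokens=200)
print(result["choices"][0]["text"])

Small quantized models of this kind already run on ordinary laptops, and the same library builds on ARM single-board computers, which is the direction of the Raspberry Pi remark.
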
00:26:20.140 But that doesn't necessarily solve all the problems.
00:26:23.020 Guess what?
00:26:23.480 You and I, for at least 25 years, had access to all the information in the world.
00:26:29.620 It was called the internet.
00:26:31.420 And when I was a kid, and if you were to look at all of the people we stand on their shoulders
00:26:37.560 that built knowledge, they prayed for the day that they could have access to all the information
00:26:43.580 and abundance we have access to right now.
00:26:46.860 Why hasn't it changed?
00:26:48.120 In fact, if you really want to look at the world, it's gotten worse.
00:26:52.340 Over the last 30 years, with all of this information and all of this connectivity
00:26:57.500 and all this socialization electronically, society has gotten, by every metric, worse.
00:27:04.120 I thought you said not to be scared about that.
00:27:06.340 It's an observation.
00:27:10.040 Because we're like the cats with infinite lives.
00:27:14.260 We're always going to land on our feet.
00:27:16.280 Humanity is always going to overcome the trauma that's in front of us.
00:27:20.700 That is an absolute certainty.
00:27:22.900 That's my guiding light working backwards.
00:27:25.340 I can go into hours of explaining why that's true.
00:27:31.100 Give me some grace to say maybe he's right.
00:27:34.940 But that's the end point.
00:27:36.580 And I believe that with every fiber of my being.
00:27:40.120 I'm talking about the grand arc of humanity.
00:27:42.640 I'm not saying that everything in between will work.
00:27:45.760 And I will say at least this.
00:27:47.820 Are you here right now?
00:27:49.180 Yes.
00:27:49.500 You're the sum total product of all of your ancestors that have gone through unimaginable
00:27:56.400 traumas to make sure that you are here right now at this moment.
00:28:01.600 So whenever somebody gets a little banged up and a little depressed, look backwards at all
00:28:07.440 those people that made you become here right now.
00:28:11.360 Now, they might not know you, but they knew that that was what they were sacrificing for.
00:28:17.420 That is humanity.
00:28:19.500 Right.
00:28:19.740 And we can sit here and we can get so angry at the world saying, look how bad it is.
00:28:24.300 Look how ugly.
00:28:25.220 You want to know something?
00:28:26.460 That is a small minority.
00:28:28.920 The majority of us love each other, care about each other.
00:28:31.960 We might speak different languages.
00:28:33.740 We might have a minor 1% difference in the way we might view the world.
00:28:37.880 But we all want to be left alone, to live a happy, healthy life, to raise children, and
00:28:44.240 to let them live a happy, healthy life.
00:28:46.040 And that's the universal directive, if you will.
00:28:50.760 And that's how we got here.
00:28:52.380 And that's how we get there.
00:28:53.500 That's not going to change no matter how many eugenicists or cyborg-loving people think that
00:29:00.720 they're going to slap themselves 50-50 with a computer and human biology.
00:29:07.400 Guess what?
00:29:08.060 We don't even know what the brain is.
00:29:09.820 We don't know where consciousness comes from.
00:29:11.780 And you're going to tell me you're going to cyborg yourself into the singularity, and
00:29:15.920 you don't even understand what that is.
00:29:17.420 I'll tell you what.
00:29:19.280 Being locked up inside a robotic silicon world is the very definition of hell.
00:29:28.680 If you want to get biblical, that's what hell looks like.
00:29:33.800 And I can get into the reasoning of that.
00:29:35.660 So coming back, trying to reel myself in, we have to deal with our trauma.
00:29:42.120 And the way we're dealing with the world right now is a direct reflection of how we dealt with
00:29:47.540 our childhood.
00:29:49.360 And again, you're not a victim, but you must start looking at the structures you use to solve
00:29:57.640 crises that are in front of you.
00:30:00.260 How not to become so reactive?
00:30:04.800 Meaning, do you shrivel up in a ball and cry?
00:30:07.260 Or do you say, I need to adjust the way I think about this to move forward?
00:30:11.580 Exactly.
00:30:12.540 All of the above.
00:30:13.860 And look what social media has done to us.
00:30:16.400 It's made us become so hyperreactive that, oh, I'm going to make this comment right now,
00:30:21.200 this libertard or this right wing, whatever your flavor is, right?
00:30:27.540 That is a disempowering state of mind.
00:30:31.420 That's a reptilian state of mind.
00:30:33.620 And you don't ever achieve anything but a slight hit off your crack pipe.
00:30:39.620 Because what it's doing is it's a neurotransmitter cascade that's dealing with a fear that you
00:30:46.360 have and you're projecting it outward to something very real.
00:30:50.540 I mean, there are incredibly bad things going on.
00:30:55.120 And we're finding more and more every couple of hours, right?
00:30:58.440 You know, oh, wow, they did that?
00:31:00.080 Okay.
00:31:01.100 You know, this is hard to deal with.
00:31:03.580 I don't think we're ready to really know what really was going on.
00:31:08.120 Right?
00:31:08.780 And if you don't adjust your ability to cope with this, I'm not saying accept it.
00:31:14.500 I'm not saying nullify it.
00:31:16.420 I'm not saying don't get mad.
00:31:17.980 I'm saying channel your energy, understand why it's disempowering for you to fly off the
00:31:24.800 handle, understand why you see a news program having all these blinking lights and banners
00:31:30.860 going by and special alert.
00:31:33.940 Why is it like that?
00:31:35.260 Why is it today?
00:31:36.560 If you go back, I'm an archivist.
00:31:38.960 I have probably more VHS tapes.
00:31:40.920 You're the best.
00:31:42.420 No, I'm trying to do the best I can.
00:31:44.880 I have more VHS tapes of news programs than probably anybody.
00:31:49.600 And, you know, it wasn't until Ted Koppel's Nightline and the Iranian crisis that we ever
00:31:55.340 had these urgent every night, day 200 of the hostage crisis.
00:32:01.300 And it became a serial, addictive doom loop cycle.
00:32:06.200 And it became the archetype of what all CNN and all news programs have become to constantly
00:32:13.880 have us in the most disempowered state.
00:32:17.120 And that is a reactive state.
00:32:19.460 Right?
00:32:20.000 Always reacting.
00:32:21.040 Not acting.
00:32:21.960 Always reacting.
00:32:23.060 Always stumbling into something.
00:32:25.260 From one crisis to the next.
00:32:27.340 We are all guilty of it.
00:32:29.160 If you're thinking, oh my God, what's Brian saying?
00:32:31.960 I am guilty.
00:32:33.060 We are all guilty of it.
00:32:34.340 So that's one of the elephants in the room.
00:32:37.440 How do we stop doing that?
00:32:38.620 Well, AI is going to help us do that because it's going to help us create, again, it's the
00:32:42.960 right AI.
00:32:43.640 I am more into local AI.
00:32:45.200 But at the very least, use Grok more than any other platform because it comes closer
00:32:49.880 to whatever you would want with a truth-seeking AI platform coming from a corporation.
00:32:57.800 But we will have these tools that are going to empower you and will detect the different
00:33:04.560 techniques that are being used to excite your neurotransmitter release of urgent, you know,
00:33:14.780 oh, serotonin, you know, what am I doing here?
00:33:17.920 Aggression, you know, all of these different things.
00:33:20.500 If we don't get out of that state, we will never make rational decisions in our life.
00:33:25.640 And we've been victimized by it for quite a long time.
00:33:28.460 Most definitely the last five years.
00:33:30.620 We're a chemistry experiment.
00:33:32.260 And we've proven that that experiment works.
00:33:36.280 When you look at people that you objectively can say are hypnotized, they are, right?
00:33:43.520 They are reading into something and de-rationalizing it to fit a narrative for a team.
00:33:51.480 And we've been bifurcated into two teams.
00:33:54.320 And that's not how the world works.
00:33:56.000 Just like I rejected physics initially because I wanted everything to be black and white,
00:34:01.920 one or zero.
00:34:03.420 The same is true with human personality and human goals.
00:34:06.700 When we were growing up, probably everybody here, when we had political discussions,
00:34:11.300 we had different opinions.
00:34:13.700 You know, oh, yeah, you got some liberal opinion there, buddy.
00:34:16.480 Or you sound like a curmudgeon conservative cigar chomper.
00:34:21.160 You know, today, it's you're ready to go to war.
00:34:23.940 And there's reasons for that because the teams have been co-opted by people in power
00:34:29.640 to obscure what they're actually doing, right?
00:34:33.980 And the people who are victimized by it on all sides are playing the part to make sure
00:34:40.900 that the dust is going back and forth and you don't see the left hand of the magician
00:34:46.100 while the right hand is playing upon it.
00:34:49.460 So those are the things that we're going to be facing.
00:34:52.140 Okay.
00:34:53.000 So I...
00:34:53.940 I want to get to the good part at the end, or at least what you think the path forward
00:34:59.040 really looks like for people.
00:35:00.560 But, you know, the next part I think that I remember is going through the stages of grief.
00:35:05.040 And as I...
00:35:05.940 Yeah.
00:35:06.520 What I took away was that you think we all need to go through that in the sense of losing
00:35:10.160 our identity as it relates to having a career or having work be a core part of what our
00:35:16.820 value is.
00:35:17.540 Um, and, you know, both dealing with trauma and going through the stages of grief, I think
00:35:25.360 are difficult things.
00:35:26.720 So you're asking a lot of people.
00:35:28.300 Yeah.
00:35:29.160 Um, but let's, let's assume we do that, you know, and we come out the other side, we've
00:35:34.160 dealt with it and we've gotten to acceptance.
00:35:36.340 Um, and by that, I think what you were saying is you need to accept that work isn't your purpose
00:35:43.720 anymore.
00:35:44.120 And that, um, you know, there's, you, you have to find a different purpose essentially,
00:35:50.320 but that, you know, you, you can't depend on your contribution as, at least as you knew
00:35:57.820 it before in the sense of doing routine tasks or, you know, that your production was kind
00:36:05.380 of your measure of value, that that's going to be gone.
00:36:08.820 But let's say we do accept that.
00:36:11.120 Um, where do we go from there?
00:36:13.160 Investing is all about the future.
00:36:15.660 So what do you think is going to happen?
00:36:17.680 Bitcoin is sort of inevitable at this point.
00:36:20.180 I think it would come down to precious metals.
00:36:22.760 I hope we don't go cashless.
00:36:24.860 I would say land is a safe investment.
00:36:27.440 Technology companies.
00:36:28.600 Solar energy.
00:36:29.580 Robotic pollinators might be a thing.
00:36:32.180 A wrestler to face a robot.
00:36:33.900 That will have, that'll have to happen.
00:36:35.480 So whatever you think is going to happen in the future, you can invest in it at Wealthsimple.
00:36:40.900 Start now at Wealthsimple.com.
00:36:44.120 Wonderful questions.
00:36:44.580 And then we'll take some questions.
00:36:46.100 Yeah.
00:36:46.800 Um, I, I urge everybody to read The Hero with a Thousand Faces by Joseph Campbell.
00:36:51.880 It's very thick.
00:36:53.440 It's a hard book.
00:36:54.600 I know I told everybody to read The User Illusion.
00:36:57.000 I hope some people are reading that, but Joseph Campbell is going to open up your mind
00:37:00.680 about the archetypes that we all have in our, in our subconscious.
00:37:04.700 They are called Jungian archetypes.
00:37:06.880 They cross all generations.
00:37:09.080 They cross all cultures.
00:37:10.940 And Joseph Campbell looked at all the mythologies and myths of all these cultures and tied them
00:37:17.660 together into the monomyth and the hero's journey, which we are all on, whether we know it or not.
00:37:24.720 And I've tied the, uh, 5,000 days series to the monomyth so that you can actually have
00:37:30.860 a guide and map.
00:37:32.360 So Elizabeth Kübler-Ross, um, and the five stages of grief is a really good mythology and a really
00:37:41.520 good method, uh, 'cause it's been around before her, is that you have to understand that you're a
00:37:49.540 caterpillar and the caterpillar goes into a cocoon and its entire body is eaten alive by acid.
00:37:58.380 And it comes out, it comes out as what comes out as a butterfly.
00:38:04.100 And when you, when you actually look at that and you say, okay, that's the metamorphosis,
00:38:09.360 that's a transformation, you know, everybody goes through this in their life to some degree
00:38:16.460 anyway, right?
00:38:17.540 But this is happening across all cultures, all, all at once.
00:38:22.640 If you live in India right now and you were working at, um, a data center, AI just took
00:38:30.780 your job and that was your, that was your leg up.
00:38:34.380 Now, none of us are living in India right now and we don't work at a data center, but, um,
00:38:39.360 a few hundred million people do.
00:38:42.680 And right now, most of the top-line AI, from Claude Code, Grok, uh, ChatGPT, they can code
00:38:52.140 better than mid-grade programmers in India.
00:38:56.060 Boom, gone.
00:38:57.460 Now that guy is going to go home one day to his wife and kids, built his entire life on
00:39:03.360 the narrative he was given.
00:39:06.660 His job's gone.
00:39:08.340 Now there's already our suicides in India because of this.
00:39:12.600 It's going to climb.
00:39:15.420 I have met people who have gone through this.
00:39:17.920 I spent my life trying to study this.
00:39:19.560 So it's not a new thing to me.
00:39:21.120 It's one of the many themes that I've been studying.
00:39:22.980 Cause I knew where we're going to get here.
00:39:24.960 Um, you have to grieve.
00:39:28.900 I'm not saying you're not going to work.
00:39:30.340 We're going to work for the rest of our lives.
00:39:31.880 That's fine.
00:39:32.700 We're going to choose the work that we do.
00:39:35.760 That is going to honor our existence.
00:39:37.940 And if that means you're going to be a farmer, a plumber, because I also in the series talk
00:39:43.640 about blue collar jobs, taking off like a rocket and people are going to choose to do
00:39:47.720 that because they love doing it.
00:39:49.200 Not because it's putting food on their table.
00:39:51.740 Now I have some friends who are, who are in these trades and say, I'm never going to love
00:39:55.000 doing it until you really talk to them.
00:39:57.360 And then you realize that they love doing it.
00:39:59.180 And if they had to stop doing it, they would feel lost.
00:40:03.740 But if you're doing a knowledge job, AI is going to replace you.
00:40:08.100 And that's a reality.
00:40:10.160 If you're a coach.
00:40:10.740 Brian, I just want to say that that solves a problem.
00:40:13.540 I mean, so maybe that's a weird way of solving.
00:40:19.040 Oh, we lost.
00:40:20.480 Yep.
00:40:20.720 There we go.
00:40:21.660 Maybe that's, um, are you back, Brian?
00:40:24.720 Yeah.
00:40:25.420 Yeah.
00:40:25.640 Okay.
00:40:25.860 Maybe that's a way of like the universe solving a massive crisis we have where we don't have
00:40:31.700 people doing blue collar jobs and we can't build, we can't grow the infrastructure, everything.
00:40:38.580 So maybe it's going to suck for a while for people, but maybe it will show people like
00:40:45.240 you better pick a different direction.
00:40:47.180 Maybe college isn't the answer.
00:40:49.020 And maybe learning a trade is the way to go.
00:40:52.020 You're absolutely right, Erica.
00:40:53.600 So imagine, and this is going to suck.
00:40:57.740 I have kids that are college age, right?
00:41:00.480 You get on a train and you've spent a lot of money for the ticket and you're going to
00:41:05.320 go to this destination.
00:41:06.380 And that ticket's a hundred thousand, two hundred, three, $400,000.
00:41:11.080 And then while you're on the train, the destination disappears because law, radiology, medical,
00:41:21.220 um, all of these, all of these jobs that as parents, Oh, I can't wait till my kid becomes
00:41:27.120 a doctor or a lawyer, whatever.
00:41:28.380 That's gone.
00:41:31.800 That is absolutely gone.
00:41:33.640 And the people who are on that train right now, they're screwed and there's nobody coming
00:41:38.720 for them.
00:41:39.560 Do, do we really want somebody to come for us?
00:41:41.780 By the way, do we want the government to come say, I'm sorry, I'll fix it.
00:41:46.100 They're not going to fix it.
00:41:48.060 They don't even, they don't know what we know right now.
00:41:51.060 I'm being very plain with you.
00:41:53.880 They, I have conversations with people in government who are saying they're in kindergarten.
00:41:58.280 They're not even able to comprehend the economic impact of what this means.
00:42:03.440 Because the economic impact means money becomes worthless.
00:42:06.800 And I don't, we don't have time to talk about that, but all of the Kings that are sitting
00:42:10.900 on these great thrones of money, they realize that that money is going to become devalued.
00:42:16.400 Much more than inflation ever did.
00:42:18.600 And the, the levers of control are going to change whether we like it or not.
00:42:24.060 It's happening.
00:42:25.100 Now, the question is, is it going to happen in the West or is it going to happen in China?
00:42:29.380 That's the real question.
00:42:30.520 So when you are becoming Ned Ludd, the Luddite, and taking out your baseball bat to take out the
00:42:36.200 next Tesla robot, because you don't want these clankers in your world, good.
00:42:41.040 China just won another score because they are doing that.
00:42:44.520 They are openly embracing this and oh, good for them.
00:42:48.960 Well, when they send 20 million of these over clanking across America and yeah, well, I'm
00:42:56.240 going to get my shotgun out.
00:42:57.240 I'll get EMP.
00:42:58.080 Fine.
00:42:58.320 Whatever you think, man.
00:42:59.960 The reality is if you choose to become a Luddite, that means you choose your future and
00:43:07.220 the Luddites didn't make it through that future, right?
00:43:11.220 They, they fell behind.
00:43:13.400 The candle makers said, I'm not going to give up my candle.
00:43:16.240 I'm going to candle light my house forever.
00:43:19.080 Their house might've burned down when the Edison light came.
00:43:22.320 We need to be able to face this.
00:43:24.720 I'm not saying I love this folks.
00:43:26.700 I'm not saying that.
00:43:28.100 Don't get mad at me that I'm giving some realities.
00:43:30.820 The reality is the wave is coming.
00:43:32.400 You choose.
00:43:33.120 It's Dawn Patrol.
00:43:34.200 You get on the surfboard or you wash out.
00:43:37.320 I didn't choose the wave.
00:43:39.480 I feel some guilt because I've always been in tech and I've been cheering on tech, but
00:43:44.780 we're now facing the wave.
00:43:46.740 So the question is get out of the denial phase, right?
00:43:51.040 The five stages of grief, Elizabeth Kübler-Ross.
00:43:53.880 We have to get out of the denial phase as quickly as we can.
00:43:57.300 I don't want anybody to grieve the wrong way.
00:43:59.020 I mean, it's sad because I'm using that to deal with Scott leaving us, right?
00:44:04.720 It's a process.
00:44:06.640 You don't need a professional.
00:44:09.100 You might need a group.
00:44:10.620 If you don't have a group, get a group.
00:44:13.780 And not just online.
00:44:15.740 I like the fact there are meetups.
00:44:17.520 You need real people face-to-face that you can see and hang out with.
00:44:22.380 Do it now in your church, in your community.
00:44:25.140 I don't care.
00:44:25.580 I really think you need to align yourself into some philosophical system if you don't.
00:44:31.100 You know, all I know is anybody who gets deep enough in science becomes religious.
00:44:35.900 That's a fact.
00:44:36.780 They may not openly say that, but that's what happens.
00:44:39.620 So if you're already there, thank you.
00:44:41.380 If you're not, get there.
00:44:43.640 You have to do this because it won't be done for you.
00:44:49.780 Mother bird is not going to chew the food for you and feed you.
00:44:53.720 And what is going to happen?
00:44:56.140 A lot of our friends who are very conservative are going to beg for government intervention to stop this.
00:45:02.480 Tax the heck out of those damn robots.
00:45:05.120 That'll fix it.
00:45:06.060 And somebody in China is laughing their ass off.
00:45:09.800 Sorry.
00:45:10.720 They're saying, oh, yeah, it works.
00:45:13.200 This is the world we're in.
00:45:14.820 Now we have to choose how we are going to deal with it.
00:45:19.040 I'm giving you 5,000 days.
00:45:20.480 I'm giving you the early warning.
00:45:22.140 I rang the bell.
00:45:23.220 I tried to do it 10 years ago, and I look even more bizarre.
00:45:26.220 But you guys are the early half of 1%.
00:45:31.720 And it's incumbent upon you to help the people around you and to have grace because they're not ready for it just like we aren't.
00:45:39.300 But you are already more expert in it, just from this conversation, than most people are, because they're in complete denial.
00:45:46.500 They don't even know that the body is dead and it hasn't fallen over yet.
00:45:51.240 They just think it's still alive and the worms are controlling it and it's a zombie, you know?
00:45:56.740 What I took away from your latest post, I think, was that one path is to become the conductor of the AI, to really embrace the tools, become an expert in using them, and also focus on the human aspects that AIs are not good at.
00:46:11.020 Meaning being a tastemaker, being a curator, being able to maybe run the AI 10 times and say, that's the one out of the 10 that's the good one.
00:46:19.620 You know, it's similar to what Scott would say about how he can just intuitively know when something's funny and he might have a lot of ideas, but it was, you know, he would be the one that says, this is the good one.
00:46:32.620 And that's something that, you know, not everyone's good at, but you might have some natural talent in a particular field or in a particular area where you've developed that intuition and you can know what the right thing is.
00:46:44.780 And if you have that expertise, then that may be a durable skill that you could leverage with AI and use a bunch of AI workers to go carry out your tasks, but you'd be the one making the conducting decisions saying, okay, this is the task that should be done now.
00:47:00.800 And these are the good results, these are the bad results, and becoming kind of the top of the food chain.
00:47:08.340 And then the other one was more of what I think of as kind of like just a bridge, but, you know, go become an electrician or a plumber, and then it's going to take longer for it to get there.
00:47:19.960 But, you know, ultimately that may only last for another, let's say, 30 years, but at least it's something that's also a lot harder for AI to do because they may never build a robot that can go under your sink and, you know, fix your particular sink because it's unique in the world.
00:47:36.360 Are there other paths or are those the two primary ones you see?
00:47:39.140 Oh, and you bring some great, great points.
00:47:41.800 You know, again, we have to define our lives that gives value in our life.
00:47:46.600 You want to know what everybody can do right now?
00:47:49.700 Become a better dad, become a better mom, become a better friend, become a better lover, spouse.
00:47:55.760 These are human skills that you are now going to have more time to spend on, because that's what humanity did for 99.9% of our existence.
00:48:05.140 Before the Industrial Revolution, we were better parents.
00:48:09.520 We were better spouses.
00:48:11.800 We were better at being a village that cared for each other.
00:48:16.600 Not in some manifesto Karl Marx socialist type of way, but in a real loving way because love is the theme that has been the cohesive force that makes humanity stay together.
00:48:29.720 And we just have hallmarked it and commercialized it to make it almost repulsive, like a really bad perfume.
00:48:39.780 It's like, ah, it would smell good in the drop, but man, you know, this is what we need to be able to do.
00:48:46.620 And it's going to be hard for some folks because they're so used to putting their nose to the grindstone.
00:48:53.680 I got other things.
00:48:54.700 I got my career.
00:48:55.660 Well, you don't got a career anymore.
00:48:57.740 Okay, now what?
00:48:59.300 My legacy.
00:49:00.500 Your legacy is how you touch people.
00:49:03.680 Your legacy is how you interact.
00:49:05.940 What time did you spend?
00:49:07.240 When you are in your bed in your last minutes, you're not going to care about that report that you did.
00:49:14.940 You're not going to care about, you know, that gear that you made so perfect.
00:49:19.440 You're going to care about the people that have touched you and you have touched.
00:49:23.740 And do you want to be the richest person in the graveyard?
00:49:29.080 Cool.
00:49:30.880 Big.
00:49:31.680 That's great.
00:49:33.480 Guess what?
00:49:34.160 Everybody's name that you know today that is famous will not be known a thousand years from now.
00:49:40.280 So whatever we've been driven by to make ourselves, I'll show them because most of us guys, I'll show them because we have to prove our worth.
00:49:49.500 It is our programming.
00:49:50.740 When we're born, we pop out.
00:49:52.700 We're not perfect.
00:49:53.560 We're not beautiful.
00:49:54.260 We got to go and prove our worth.
00:49:55.540 Now that's being taken away from a lot of men.
00:49:57.840 This is going to be a crisis for a lot of guys because it's built into our DNA.
00:50:02.420 I don't want to get into the sexist kind of thing, but it is the reality.
00:50:05.620 It's built into the DNA.
00:50:07.420 And all of a sudden you take our purpose for existence.
00:50:10.220 We are lost souls.
00:50:12.620 That means that our purpose was wrong.
00:50:15.280 What was our purpose for 99% of our existence?
00:50:19.560 Guess what?
00:50:20.300 We're going to start rediscovering that.
00:50:22.040 And Owen points out, you know, can you become a plumber if you're a lawyer?
00:50:27.700 Damn straight you can.
00:50:29.680 That's right.
00:50:30.200 Is it less honorable?
00:50:31.740 No.
00:50:32.680 No.
00:50:33.000 The problem was that we made it dishonorable to begin with.
00:50:36.840 That's right.
00:50:37.380 It's the most honorable thing.
00:50:38.960 And the ivory tower world, remember, they created this world where you have these hierarchies: you spend a few hundred thousand dollars, you get this credential.
00:50:50.320 Now you get the gold ring.
00:50:55.480 It's because it's the world that they created.
00:50:57.320 It's their credit card system.
00:50:59.860 You go in through it, then you get out of it, and you get recognized.
00:51:03.200 Guess what?
00:51:03.640 For 99% of our existence, that's not how we judged a hierarchy.
00:51:09.180 We judged it by empirical results.
00:51:12.260 How are our communities working?
00:51:15.740 The communities that didn't work, they didn't get to reproduce.
00:51:18.600 They didn't get to survive.
00:51:21.000 We are the victors by the very reality that we exist.
00:51:26.380 The ones that didn't have the right programming, they didn't exist.
00:51:30.480 The ivory tower will tell you it's survival of the fittest.
00:51:34.680 It is survival of the most able to adapt.
00:51:38.200 That is what Charles Darwin said.
00:51:40.840 And talk about reframing.
00:51:42.740 We've been victimized by that concept.
00:51:46.360 It is not dog eat dog.
00:51:47.920 It is not the struggle.
00:51:49.640 It is not I'm going to elbow this guy and elbow this guy to get ahead.
00:51:53.300 It is the ability to adapt to a changing environment.
00:51:58.480 We are the only species that I know of that are born naked.
00:52:02.500 And really understand this.
00:52:04.060 We are not of our environment.
00:52:06.640 We are the changers of our environment.
00:52:09.420 A rabbit is a rabbit.
00:52:11.220 When it's born, it's already born into his environment.
00:52:13.700 If we are born in most of the world and we don't find invention and creativity, we will either starve or freeze.
00:52:22.020 We, as humans, our directive from God is to go out there and invent and to build safety.
00:52:32.280 The very first thing we do is build a wall around ourselves to protect us from the outside.
00:52:37.700 That is natural.
00:52:38.800 Anybody telling you not to do that is anti-human and does not understand history.
00:52:45.800 You need protection because we are vulnerable because we are born naked.
00:52:50.360 It is not rocket science.
00:52:52.600 Right?
00:52:53.520 One of the very first things we did, mostly guys, was to get a loincloth, for freak's sake.
00:53:01.240 Because we could get ourselves in trouble running around the woods without a loincloth.
00:53:06.640 So we built clothes.
00:53:08.880 That was one of our first inventions.
00:53:10.700 We didn't invent fire.
00:53:12.400 We captured fire, like in the Prometheus story.
00:53:15.180 We saw fire and said, ooh, that keeps us warm.
00:53:18.340 Ooh, that makes our food taste a little better.
00:53:21.680 Or maybe not have so many maggots in it.
00:53:24.340 Right?
00:53:25.320 It makes flesh easier to eat.
00:53:27.900 Or should we be eating flesh?
00:53:29.060 I don't know.
00:53:29.440 All of these things were going on long before we ever came about.
00:53:36.040 Right?
00:53:36.880 We're going back to that operating system.
00:53:39.700 We're going back to our traditional roles.
00:53:43.620 Right?
00:53:44.160 People point to the 1950s, but really, it was World War II that brought the idea of breaking up the family unit as a solid structure.
00:53:55.280 And that emergency we call World War II said, okay, Rosie the Riveter.
00:54:00.680 Now, I'm not talking about the philosophy of suffragism and Bernays and what he did with his freedom sticks.
00:54:09.780 There's a sideline to this.
00:54:11.080 I'm talking about how the most solid unit that humanity has ever created was the family.
00:54:16.880 And the thing that almost everybody in power wants to do is make sure the family doesn't work.
00:54:23.100 And we're all hypnotized by that function.
00:54:27.560 And all I'm doing is looking at the objective truth of how we got here.
00:54:31.700 We did not get here by the philosophical systems that we're using today.
00:54:37.040 In fact, it's objectively proven that those philosophies have failed, because we are now headed toward de-evolution.
00:54:45.140 We are going in reverse.
00:54:47.360 Right?
00:54:47.640 When you start de-evolving, you realize that the philosophical underpinnings that you're using are no longer valid.
00:54:54.960 And somebody has to say, the emperor is naked.
00:54:59.140 Right?
00:55:00.180 And that's where we are.
00:55:02.040 AI is just going to magnify that.
00:55:03.500 So, redefine who you are.
00:55:08.340 I can't tell you who you are, but I can tell you that there will never be another person like you ever born.
00:55:15.140 Amen.
00:55:16.060 Brian, we only have...
00:55:19.840 And if you're an absolute miracle...
00:55:20.680 Oh, sorry.
00:55:23.120 Did I interrupt you?
00:55:23.800 I'm not giving you a Hallmark speech there.
00:55:25.820 Oh.
00:55:27.780 We only have a few minutes left, and we promised people we were going to ask questions.
00:55:32.180 But you were just cooking so well, and we wanted to hear everything you said.
00:55:37.480 I know Marcella said she wrote down questions for you for next time, and I think Sergio did too.
00:55:42.960 And Sergio, Marcella, if you forgive me, I just want to ask Brian, because this is very important to Scott, to his estate, to us, to our future,
00:55:51.100 that we just take the last few moments to talk about something that's happening right now that a lot of you know,
00:55:58.240 and that's about using somebody's likeness, intellectual property, intelligence, everything for AI that you don't own.
00:56:09.140 So Brian can speak more eloquently about this, but I just wanted to use this last bit of time.
00:56:15.060 My computer is dying.
00:56:16.480 Oh, okay.
00:56:17.100 So you guys know that there's just been, you know, people across the board that want to make a clone of Scott or Scott's son or Scott's dog or whatever.
00:56:27.100 And Brian has written about this and thought about this extensively, and I just think it's important to listen to someone who is on the forefront of this.
00:56:40.080 And like he said, he already thought about the problem and worked it backwards.
00:56:43.300 Um, so he, he said he would speak to us about that for the last few minutes and he's coming back again and again and again,
and we're going to do a long form interview with Brian, um, on locals exclusively.
00:56:58.600 And there will be so much participation.
00:57:00.680 Like we'll all be able to ask questions.
00:57:03.240 Um, Sergio, did you want to chime in at all before we get there?
00:57:08.520 You're on mute.
00:57:09.380 Still, Sergio's mute is undefeated.
00:57:18.480 You're on mute on rumble.
00:57:20.520 All right.
00:57:21.040 He got it.
00:57:22.020 Okay.
00:57:23.460 Oh, Sergio, we don't hear you.
00:57:26.500 We have great questions on YouTube.
00:57:33.660 Amazing.
00:57:34.360 Um, questions from Annie.
00:57:36.900 Uh, we have to ask them next time, though, unfortunately. Yeah, next time. But I'm going
00:57:41.900 to pass them to you guys.
00:57:43.040 Uh, and I just wanted to say that we have good questions and I'll pass them on.
00:57:46.500 Yeah.
00:57:47.520 Uh, so yeah.
00:57:49.520 Do you want me to?
00:57:50.040 Oh, you know what?
00:57:50.780 To you guys?
00:57:51.600 Um, everyone, um, Sergio, will you drop Brian's handle in the YouTube chat and we'll do it here.
00:57:57.960 And please feel free to message Brian also, because maybe he can answer this.
00:58:04.100 Yeah.
00:58:04.540 Yeah.
00:58:04.780 I'm going to send him the questions and I think I'm going to interview him myself.
00:58:08.020 I'm going to have an interview just to ask him questions, you know, because I have like
00:58:12.080 so many questions.
00:58:13.480 Oh, don't we all?
00:58:14.920 Well, we're going to do a long form here on Scott's locals.
00:58:17.660 I mean, like I'm talking to him on the side, you know, maybe, I don't know.
00:58:21.580 Okay.
00:58:22.900 Um, so, all right, we might not get Brian back, you guys.
00:58:26.500 And I know he's bummed because he wanted to talk about this.
00:58:29.440 So why don't we have him on again?
00:58:32.400 We could, Brian, I see your question.
00:58:34.560 You know what?
00:58:35.020 We, um, we want to make sure we can go long sometimes and we'll set up for that.
00:58:40.360 Um, but we didn't do that today.
00:58:42.080 And we like to make sure the guests know that they have an in and an out time in case they
00:58:47.940 need to plan the rest of their day.
00:58:49.500 You know, for Scott, he could just go on as long as he wanted.
00:58:53.100 Um, but we will have Brian back on, um, maybe he can come back on next week and we'll start
00:59:01.000 with your questions.
00:59:02.520 And that way we could even just do like a question and answer show.
00:59:06.860 Would that be fun for everybody?
00:59:08.560 We'll just do questions and answers.
00:59:10.160 Um, yeah, Brian's very interesting.
00:59:13.940 Yeah.
00:59:14.040 I think that would be great too.
00:59:15.020 Cause so many of us have questions there.
00:59:18.520 Oh, Brian, did you hear anything we just said?
We're committing you to, like, extra-long shows for questions and answers all over
00:59:27.380 the place.
00:59:28.000 I don't know what happened.
Um, so I was telling them we're going to do a longer, oh, we lost him again,
00:59:35.160 that we're going to do a longer form interview
00:59:37.200 with you, if you can hear me.
00:59:39.220 So, and let's see if he comes back, you guys.
00:59:43.000 Oh, good.
00:59:43.300 Thanks for following him.
00:59:44.360 Y'all.
00:59:47.580 Uh, okay.
00:59:49.360 Oh, Brian's back.
00:59:50.420 Okay.
00:59:50.720 Brian.
00:59:52.460 Oh, look at, he's our technological genius.
00:59:55.020 There he is.
00:59:55.520 Can you hear us?
00:59:58.780 Did I crash this system?
01:00:00.220 What's going on?
01:00:01.280 I mean, it's your energy.
01:00:02.620 It's your aura.
01:00:04.260 I just, um, wanted to let you know, I committed you to a long form interview on locals, um,
01:00:10.800 specifically because we have so many questions.
01:00:13.500 Um, so we're going to, we're going to schedule you in for that.
01:00:18.180 Okay.
01:00:18.500 Cause we're going to just do questions and answers on locals.
01:00:21.660 Um, but if you could just take five minutes to talk about the AI.
01:00:25.840 I'd be honored.
01:00:27.220 Thank you.
01:00:27.720 Okay, go ahead.
01:00:32.380 Um, okay.
01:00:33.100 So, okay.
01:00:35.640 Quite a long time ago.
01:00:37.160 Can you hear me?
01:00:39.580 Yeah.
01:00:43.120 No.
01:00:45.440 Hello.
01:00:46.520 Oh, now we can.
01:00:48.180 I know.
01:00:49.980 Scott can't hear me.
01:00:51.440 Okay.
01:00:52.580 Are we good?
01:00:53.960 Yes.
01:00:54.360 You appear to be frozen, but we can hear you.
01:00:59.100 All right.
01:00:59.260 Now you're moving.
01:00:59.940 Okay.
01:01:02.080 Good.
01:01:02.740 Okay.
01:01:03.660 All right.
01:01:04.300 I'm frozen a lot.
01:01:05.440 Um, I really think we, as a society, need to understand who owns our likeness, who owns
01:01:14.500 our face, our body, our DNA, our voice, uh, even our gut microbiome.
01:01:23.180 Um, I, I think it's vital that we have this conversation and do it as soon as possible,
01:01:28.000 because if we don't own ourselves, then who are we?
01:01:33.980 Um, otherwise that's known as being a slave.
You have to have self-sovereignty.
01:01:38.960 And if you don't have that, and you don't have it organized within the structure of a
01:01:42.860 legal system, um, you have immense chaos.
01:01:47.540 And I think recent events with AI and likenesses are really important to start thinking in these
01:01:57.140 terms.
01:01:57.500 And I invite everybody to think in those terms because the world doesn't get to own us.
01:02:06.340 You own yourself.
01:02:07.420 Um, and if you want to do the math and the logic about it, any other way does not work.
01:02:12.460 It does not serve society.
01:02:13.900 It does not serve any single person.
01:02:16.660 Um, and, um, I think I can go into a lot more of it, but in the very short period
01:02:23.820 of time we have: I wrote a declaration, uh, about five or six years ago, "Who Owns Me?"
01:02:28.540 You can look in my Twitter feed, X feed, um, just type in who owns you, and you can kind
01:02:33.880 of see my whole rant about that.
01:02:35.840 Uh, I even, like I said, wrote a preamble and declaration of self-ownership, um, in the
01:02:43.120 most ironic way.
01:02:44.400 Um, we got to really think about that.
01:02:47.560 And it's really easy to do with the cut and paste technology that we've had for
01:02:52.220 even just the last five years, let alone generative AI.
01:02:58.540 You don't want that world ahead unless you can claim ownership and rights to your own,
01:03:03.320 your own being.
01:03:04.760 Uh, I think, uh, it's important for people to understand that these are, these are just
01:03:09.800 puppets.
01:03:10.560 They are puppets that look and sound like someone, but there's a puppet master feeding
01:03:16.160 them words.
01:03:16.800 Okay.
01:03:17.200 It's not the words of the person.
01:03:18.760 And that is very dangerous.
01:03:20.480 Imagine, imagine Brian isn't with us one day and think how brilliant Brian is and all the
01:03:27.180 work and energy he put into creating all of this.
01:03:31.080 And then he's not with us one day and someone's like, well, there's, you know, hundreds and
01:03:35.060 hundreds of hours of him and I've got his voice cloned and I'm just going to make him
01:03:39.280 say what I want.
01:03:40.180 And what if it changes everything Brian's ever worked for?
01:03:43.660 You know, that, that just cannot be allowed to stand.
It has to be a legal...
01:03:48.520 Such an important thing because, um, our likeness is going to be used whether we like it or not.
01:03:56.440 Uh, but there needs to be a structure that allows us to address this and it has to be
01:04:03.200 very clearly defined.
01:04:04.480 Um, I mean, there are some laws where, when people, uh, make pornography of people,
01:04:11.340 there are legal repercussions.
01:04:13.620 That's not enough.
01:04:14.640 It's got to go into detail.
01:04:17.940 We form governments to protect individuals from groups and groups from individuals, right?
01:04:25.260 That's why we do that.
01:04:27.260 So I'm not asking for an oppressive government.
01:04:29.420 I'm not asking for oppressive laws.
01:04:31.860 I'm asking for the very reason we organize as a government: to do these things.
01:04:39.580 And for somebody to commercialize your likeness?
01:04:41.140 Number one, that's a big problem, especially for a well-known individual.
01:04:45.240 Um, or for somebody to have editorial control.
01:04:49.200 'Cause I can tell you right now, AI is good, but unless you are really good at training
01:04:54.180 an AI, it's going to say stuff that a person would never actually say.
01:05:00.200 And I'll leave you with this one notion.
01:05:02.260 I have something called savewisdom.org.
01:05:04.740 This is a thousand questions.
01:05:07.840 It's a monumental thing where you just say your answers in your own voice into
01:05:13.300 your own recorder, and it never goes on the internet.
01:05:15.780 I urge everybody to start doing that today because your wisdom is valuable.
01:05:20.300 The ownership of your wisdom is valuable.
And I say, it's time to not give it to, uh, you know, social media or anybody else. Record
01:05:30.400 it.
01:05:30.600 And at the very least, even if you don't use it in an AI project that's local, you could share
01:05:35.640 it with your loved ones.
01:05:36.620 You could share it with your family.
01:05:37.780 And I wouldn't share it while you're doing it.
01:05:40.720 Cause I want you to really answer those questions with all the intensity and emotions that you
01:05:45.760 should, because that's what you're trying to capture.
01:05:48.140 And in the process of doing that, you discover yourself and it's very important.
And it's apropos to the 5,000 days, because you really got to get to know who the heck
01:05:55.940 you are. We're so busy trying to put food in our bellies and a roof over our heads
01:06:01.520 that we've lost the compass of who we really are, and we need to get back to that.
01:06:07.080 Yes, I agree.
01:06:08.300 Thank you.
01:06:08.720 Those two things are interconnected.
01:06:10.020 And I hope maybe in the future we could talk about the implications of somebody stealing
01:06:15.320 the likeness of an individual and what it really means for all of us.
01:06:20.020 And we also need to discuss that, you know, somebody could be paying somebody who knows
01:06:27.740 how to do it to put words in that person's mouth and to change the course of history,
01:06:34.960 the future, and what that person's thoughts were.
01:06:37.860 So, you know, you said it perfectly before when we were speaking, and I said this
01:06:42.700 also on here: it starts with, oh, isn't this fun?
01:06:47.820 And, I miss this person, and it's making jokes, and, oh, I like hearing it,
01:06:52.520 and, oh, look, it's talking to me, and whatever.
01:06:55.220 And then the next thing you know, it starts talking about elections or talking about wars
01:07:00.040 or talking about politics.
01:07:01.900 And now you're so used to this thing that you're thinking that this is really that person's
01:07:07.340 opinion.
01:07:07.700 And it's easy to see how quickly you can become brainwashed.
01:07:12.340 It is instantaneous.
01:07:14.300 And I think somebody like Scott actually knew this very well.
01:07:19.160 Yes.
01:07:19.560 If you read enough of his books, you realize that you can get hypnotized into believing this.
01:07:24.200 And I believe if you look at the nuances of what he said in the past, he was predicting
01:07:29.400 that this is going to be somewhat of a crisis.
01:07:31.840 And my interactions with him over X and such with AI, he was very concerned over what this
01:07:39.180 really means.
01:07:40.180 Sometimes you just throw your hands up and you say, oh, well.
01:07:43.180 But I think deep down inside, all of us have to start thinking, well, we don't want to be
01:07:48.340 victimized by this.
01:07:49.300 We need to start making decisions.
01:07:50.760 And, again, I really think we need to, as a group, be very thoughtful in the way this
01:07:59.400 is composed.
You know, listen, parody and things like that.
01:08:04.300 Yeah, there are provisions that allow that already.
01:08:06.980 But for the outright ownership and the outright redoing of what somebody has done in their
01:08:14.120 life, slightly shifting the words to the point where they are 180 degrees from where they
01:08:20.160 started, because that's what happens.
01:08:22.020 It's a slow drift.
01:08:23.840 Unless the people who really care about it have been entrusted with it.
Like, Disney entrusted his legacy.
01:08:31.280 And look what happened to Disney.
01:08:33.080 Do you think Walt Disney would be very happy with the Disney he'd see today?
01:08:37.300 No matter what you may think of Disney in the past, he was a very interesting character.
01:08:41.600 But the Disney of 1960 is certainly not the Disney of 2026.
01:08:47.340 That's right.
01:08:47.840 Right?
01:08:48.460 So look at what happened there.
01:08:51.480 This is going to happen magnified by a thousand as AI likenesses start taking over.
01:08:57.980 So we really need to understand it.
01:09:00.440 Thank you for taking the extra time to talk about that.
01:09:03.420 And we'll definitely get into it more as we go on.
And then, oh, where do we find the thousand questions?
01:09:11.840 Okay.
01:09:12.040 Marcella put it in the comments.
01:09:14.340 So that's something to consider.
01:09:15.640 You guys take the time to do that.
01:09:17.840 Yeah.
01:09:17.960 It'll be interesting to really get to know yourself.
01:09:20.280 You think, you know, but do you?
01:09:21.480 So Brian, thank you so, so much.
01:09:24.880 And we, um, are forever grateful to you.
01:09:28.340 And it's been so, um, it's been so, I can't even think of the word, like even more enlightening
01:09:36.740 than everything I knew about you.
01:09:38.240 And over the years, you know, I've, I've always communicated with you on X and, you know, it's
01:09:44.180 just so fun getting to talk to you now and getting to know you better.
01:09:47.180 And I think this chat is, uh, learning a lot.
01:09:50.680 Like there are such smart people and they're always craving knowledge and to know more.
01:09:56.240 And you're really bringing that to this group.
01:09:58.500 And we're so thankful and we're so thankful that you're willing and agreeable to come back
01:10:04.220 again and again and again.
01:10:06.980 Well, Erica, I'm honored.
01:10:09.360 I love you guys.
01:10:11.180 I love Scott's audience.
01:10:12.540 Uh, I think we're all part of a really big group.
01:10:15.940 Uh, let's call it, uh, Scott's tribe.
01:10:18.460 I don't know.
01:10:18.980 Yeah.
Uh, the bottom line is the tribe.
01:10:21.480 We've been saying "our tribe," right?
So I, I really got to thank you guys, because, um, I don't know what the world would look
01:10:30.280 like if this group didn't exist, if what Scott did and the group that surrounded him did not
01:10:36.560 exist. Because sometimes, well, always, it takes a single light to give a dark room
01:10:44.420 light, and all the darkness in the world can't put that light out. You guys are the light,
01:10:48.380 and I really appreciate it.
01:10:49.920 Thank you.
01:10:50.580 And to you as well, same thing.
01:10:52.300 Same exact thing.
01:10:53.900 Well, we will schedule our next one soon because we do want to do a Q and a with you.
And, um, until then I'll be chatting with you. But thank you so much, Brian. And everybody,
01:11:06.180 Brian and I will do a closing sip, and, um, please go out there and be useful.
01:11:12.020 Take the thousand-question quiz, get to know yourself, and then be prepared to be patient
01:11:18.440 while you help teach other people what's coming in 5,000 days or less now.
01:11:24.980 All right, you guys. To Scott.
01:11:27.620 Thank you, Shelly.
01:11:28.700 And we will see you guys tomorrow. To Scott.
01:11:32.080 To Scott.
01:11:33.800 Thank you.
01:11:34.480 Thanks, Brian.
01:11:41.660 Oh.
01:11:41.860 Thank you.
01:11:42.600 Thank you.
01:11:44.420 Yeah.
01:11:45.000 Thanks, Brian.
01:12:00.240 Awesome.
01:12:02.100 Thanks, Brian.