Episode 3075 - The Scott Adams School 01⧸19⧸26
Episode Stats
Length
1 hour and 2 minutes
Words per Minute
165.7
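For reference, words per minute is just total word count divided by running time. A minimal sketch in Python; the word count used below is hypothetical, since the transcript doesn't state it:

```python
def words_per_minute(word_count: int, minutes: float) -> float:
    """Speaking rate: total words divided by running time, to one decimal."""
    return round(word_count / minutes, 1)

# Hypothetical numbers: a 62-minute episode containing 10,000 words.
print(words_per_minute(10_000, 62))  # → 161.3
```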
Summary
In this episode of Coffee with Scott Adams, host Scott Adams talks about what it's like to miss out on his favorite morning ritual: a cup of coffee with a special guest. Today's guest is Scott Adams.
Transcript
00:00:14.860
You guys, let's just make sure Locals is live, especially before we do.
00:00:42.780
Okay, let's get ready for the simultaneous sip.
00:00:53.520
And welcome to the highlight of the entire civilization.
00:01:00.760
And if you didn't think it could get any better, surprise.
00:01:07.160
Yes, we will have a whiteboard in which I'll connect the seemingly different fields of politics,
00:01:13.900
artificial intelligence, the simulation, and Twitter.
00:01:19.200
And in order for you to be primed and ready for that, this mind-blowing experience that is the simultaneous sip and coffee with Scott Adams.
00:01:29.540
And all you need to be ready for this amazing, amazing experience is a cup or mug or a glass, a tankard, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:02:12.880
Well, if you have and you've bought more than one thing, you may have run into a situation I run into often.
00:02:21.840
As in, I think I'm buying a big old bag of something, and it shows up like it's a free sample.
00:02:34.660
You know, you buy a chair for your living room, and it shows up like it's a Barbie chair.
00:02:39.340
You're like, you know, that looked like a real chair.
00:02:49.680
Well, this brings me to my range of purchase, which should have been about this tall, not this wide, the big one.
00:03:00.740
But when you look at the little picture, it looks exactly the same.
00:03:04.800
And so I suggest the following human interface improvement for Amazon.
00:03:09.420
Jeff Bezos, if you're listening, I suggest this.
00:03:12.600
In any situation in which there might be any potential ambiguity about the size and scale of an object, it is not good enough to include it only in the description, which you must click.
00:03:27.280
You must also have a human hand in the picture, preferably the same human hand, because if this had a human hand, I would know exactly how big it was every single time.
00:03:38.200
And how hard is it to put a hand in the picture?
00:03:41.100
You can even digitally add it, just nearby hand, picture.
00:03:45.500
So, please, user interface developers at Amazon, who are, by the way, some of the best in the world.
00:03:57.100
But that one thing, that one thing bites me in the ass about one time in five, probably, literally.
00:04:13.340
I can't believe it's another, you know, that we're into another week now.
00:04:24.080
How do you guys quickly, you know, feel it's not easier?
00:04:34.360
Because you realize the weekend, we didn't have coffee with Scott Adams.
00:04:58.560
I saw a post or a reply online saying, oh, you know, I'm going to be sad when people forget Scott.
00:05:11.240
And we're not going to allow people to forget Scott Adams.
00:05:22.840
We have Sergio below with the beanie, the Scott Adams beanie, which is back for sale, you guys, by popular demand.
00:05:35.960
Sergio, you wanted to say something about that?
00:05:45.980
You know, well, I'm a pro at this, unfortunately.
00:05:48.760
You know, I'm a pro and I've been through it with the strongest ones.
00:05:53.760
And I've got to say that Scott's advice of putting yourself into service, being useful, and honoring that person's legacy and transition, too,
00:06:08.920
it's the most important thing to do to stay sane.
00:06:22.220
He has a beautiful song on this that I played over and over again when that happened.
00:06:28.940
Marcella, do you not remember the name of the song?
00:06:39.200
He's a Navy SEAL who went through the Afghanistan and Iraq wars.
00:06:51.180
Jocko Willink is the best at the end of the year.
00:06:54.620
So I think we are, it's been helping a lot, you guys, you know, like, you know, talking to you guys.
00:07:08.380
Kevin and Kevin has been very supportive with me for a long time, too.
00:07:23.220
And I keep reminding myself of some of Scott's reframes on death and grieving and loss.
00:07:37.760
And just using his own advice on him has been just sort of a saving grace, I'll say.
00:07:51.820
Well, I mean, I'm sure it certainly hit me hard and I'm still going through it, but I would, I was going to say the same thing that I, I think SJV just posted one of Scott's segments on getting rid of anxiety, but it really focused on getting ready for a loss.
00:08:11.560
And like a loss, you know, is coming; in his case, he was focusing on like the death of a pet because he knew that he would have to deal with Snickers' death coming up.
00:08:19.320
And, um, so he talked about how you could prepare for that.
00:08:23.500
And a lot of it was the reframe about it being an honor and a privilege to be part of it.
00:08:33.040
I think I reposted that from SJV on my timeline this morning, but, um, you know, that, that sort of thing is helping me.
00:08:40.240
I mean, that is really how I'm thinking about it is that it's been an honor and a privilege to be part of the process and just to be, you know, helping.
00:08:54.080
You guys on all the platforms listening, if you hear us mention someone's name, like Jocko or SJV, um, if you guys would in the chat, can you at them?
00:09:05.100
So people can say, Oh, who's Jocko or who's SJV or whatever we're talking about.
00:09:09.420
That would really help the others that are watching the chat to know who these people are and perhaps follow along.
00:09:15.880
So I like that, you know, you guys shared that I was having a really bad night last night.
00:09:21.260
Some of, you know, um, but if this is like a great time.
00:09:28.840
So listen, we can reframe anything as Scott has taught us.
00:09:33.660
And all you have to do is think about reframing whatever situation you're in and create it.
00:09:39.880
If there isn't one written in his book, create it.
00:09:44.380
He's going to tell us, um, what page in your Reframe Your Brain book and let's get a reframe.
00:09:51.340
And then we're going to go right to the news with Owen and Marcella.
00:09:58.820
So, um, we're going to do the wanting versus deciding reframe today.
00:10:03.380
And this is like the third page of chapter two.
00:10:09.420
Um, in my own little hierarchy of reframes, this is like a big daddy reframe upon which
00:10:20.120
Uh, so I, I absolutely love this wanting versus deciding.
00:10:25.060
If you want something, you might be willing to work hard to get it within reason, but if
00:10:30.580
you decide to have something, you will do whatever it takes.
00:10:42.380
If I were to make a list of all the business startups and other money-making schemes I've
00:10:46.680
worked on during my career and then divide that long list into what worked out well and
00:10:57.200
The efforts that failed were all ones I wanted to succeed.
00:11:03.460
I wasn't doing all-nighters or risking money I couldn't afford to lose, but I put in great
00:11:14.200
Luckily, several business projects did work out.
00:11:21.460
I wrote several best-selling books and I was one of the highest paid speakers in the country
00:11:27.860
To be fair, I was only able to do all of that because the original Dilbert comic strip took
00:11:33.440
And there is one thing that stands out about that initial Dilbert success.
00:11:45.200
When I was offered a syndication contract in 1988, the ultimate big break for a cartoonist,
00:11:51.340
I made a promise to myself that no matter what happened, I would never allow myself to look
00:11:55.920
back and say I didn't work hard enough to make it a success.
00:11:59.900
I knew I was entering a field in which the odds of making it big were around 1% even after
00:12:05.880
getting the syndication contract, which was already insanely unlikely.
00:12:10.740
The syndication company sells comics to newspapers and web platforms and splits the money with
00:12:17.800
I had a contract but zero newspaper clients on day one.
00:12:21.560
By the end of the first year of selling Dilbert comics to newspapers, only a few small newspapers
00:12:28.000
At that point, seeing no hope of a big hit, the salespeople moved on to the next comic that
00:12:33.820
If Dilbert was going to succeed, I would need to make it happen on my own.
00:12:39.840
And so, I worked my day job while also writing and promoting the Dilbert comic for several
00:12:48.520
For over 10 years, I had the equivalent of three full-time jobs.
00:12:52.780
I worked seven days a week, including holidays.
00:12:55.320
I did everything I could do to promote the comic, putting 100% of my mind and body into it.
00:13:03.920
For many of those years, I answered hundreds of emails per day from fans.
00:13:08.220
By the way, I sent an email to him and he did respond within a day.
00:13:14.640
But 30 years ago, I traveled the country for book signings and autograph sessions that would
00:13:21.200
I did photo shoots and interviews several times per week for a decade.
00:13:25.840
Dilbert was the first syndicated comic to be published on the internet, which also took
00:13:32.420
On paper, my workload from those years looks impossible.
00:13:39.100
If I had merely wanted to succeed, I don't think I could have lasted.
00:13:48.260
And once you decide, the psychology of the situation changes.
00:13:55.360
I reminded myself that almost any cartoonist would want to trade places with me.
00:13:59.500
It was never easy and it was never painless, but I was unstoppable because I had decided.
00:14:08.300
If you are wondering how you can know if the thing you desire is a want or a decision, I
00:14:34.600
I think about this constantly and I think about it, I try to, generally I'm an optimist
00:14:42.200
and I try to think about it very positively, but I have had family members who have done
00:14:49.360
And, you know, you can talk yourself into just any reality you like.
00:14:57.060
So for me, you know, this informs the sort of that talk track you have going on in your
00:15:03.580
head and the decisions you make, the way you prioritize, you know, you are what you think
00:15:09.540
And, uh, for me again, it's, it's a big daddy reframe and, uh, I love this one.
00:15:16.880
I need to remind myself about this too, because sometimes like, Oh, I want to do that.
00:15:25.360
You guys, I'm bringing the reality to the engineer minds.
00:15:28.700
I'm not like, uh, I must be like a shock system for a lot of people, but anyway, um, I live
00:15:35.420
like in this Libra balancing act on top of a fence all the time.
00:15:41.800
If you ask me, do I want to go to this restaurant or this one, I will vacillate for an hour about
00:15:46.780
Um, so I definitely teeter between wanting and deciding a lot, and I have to push myself
00:15:56.360
Um, because if I stay in the want, I often have regrets.
00:16:01.240
So I I'm glad that you brought this up today because there's a few things I need to really
00:16:08.020
And I'm going to remember this and I'm going to think about, um, Scott up until the end,
00:16:16.340
like everything you read about his career and how he did Dilbert and for the amount of
00:16:21.720
years and working a full-time job and this and that, like, yeah, he made a decision and
00:16:26.540
he also made a decision to keep going till the very end with us.
00:16:30.780
And he could have not, he could have stopped anytime he wanted and any one of us would have
00:16:37.440
So yeah, I think that's a great lesson for me today and maybe for you guys.
00:16:42.720
Yeah, I think, I think also when you are a decider and become known as a decider and
00:16:54.020
People look at you a little differently instead of, you know, Oh, you know, I, I want this.
00:17:02.200
You, you, you develop a reputation with yourself.
00:17:04.660
You know, you start, you know, thinking, Oh, you know, this guy means business and, or
00:17:15.440
this girl, you know, and, uh, the more you do those.
00:17:18.880
And one guy that reminds me of that is Trump too.
00:17:21.340
So Trump, uh, this, uh, if you read The Art of the Deal, uh, you know, that is when he
00:17:27.440
goes from wanting to deciding basically when he says Greenland might be nice, that, that
00:17:37.220
I wanted to inject that because, uh, it's happening right now.
00:17:48.160
Does, does anyone else want to expound on the, um, on the reframe?
00:17:52.560
I think there's a lot of deciders within the group, because this is a very, you know,
00:17:59.580
intelligent, successful group of people Scott has cultivated, but I think we, in some aspects
00:18:08.520
of our life, we are still sitting on the want instead of deciding.
00:18:12.320
So maybe, maybe everybody today, your homework could be like, what's something I've been wanting
00:18:18.240
And then either make a decision that is worth the decision or let it go and, and then like
00:18:26.440
figure out what, what else you need to do to, to get where you want to be.
00:18:30.940
I actually have a couple of things I'm going to take off and then a couple of things I'm
00:18:40.240
Hey, actually that assignment publicly sharing those commitments, as we know, um, sure, uh,
00:18:48.240
And I like, yeah, I mean, I think, I think one thing I would add to it is just, you need
00:18:54.980
I mean, cause the frame we're talking about here is a real commitment.
00:18:59.920
Like you're going to do whatever it takes to make this happen.
00:19:02.560
And so you can't do that with everything, you know, you can't just be like, well, now
00:19:10.500
You have to, you have to be selective about probably only one thing at a time, really, of
00:19:16.160
saying, I'm going to make this thing happen and I'm going to put aside whatever else and
00:19:20.200
sacrifice whatever else I need to, to make that happen.
00:19:26.580
Um, and I think Scott has talked about that too, um, where you need to be willing to make
00:19:31.560
You need to be willing to, you know, put off other things or prioritize other things.
00:19:35.800
And then, you know, Kevin, to your point, this leads into a lot of other reframes or
00:19:39.440
a lot of other things Scott has talked about in terms of once you've decided to do something,
00:19:45.360
And, and that would be how you would get that going.
00:19:48.720
You would say, okay, what, what needs to be true or what needs to happen?
00:19:54.720
And I know Scott has talked about that in the same story of his career, saying how he
00:20:02.940
And, you know, we've seen it every day where he has his routine, he does his thing every
00:20:09.140
Um, but that's the result of a very honed system.
00:20:18.000
How do I make all these things happen so that I can always achieve the result that I need
00:20:22.700
And so I think, um, it definitely ties into a lot of his other wisdom that he shared with
00:20:30.380
When, when I agree with that, um, I agree with that on, but one of the things that he says
00:20:36.360
that Kevin read is not only do you have to decide, but you have to work.
00:20:42.520
For your position, like, I can't just decide I'll be a billionaire and just sit around.
00:20:59.020
And I think all of us, uh, want to, want to be like that.
00:21:03.260
And, uh, it's, it takes work and it takes focus because that's one thing that he said
00:21:09.600
in what Kevin read is like, I, I did this nonstop interviews, nonstop work.
00:21:17.320
And when I felt tired, I realized everybody would want to be in my place.
00:21:31.840
I think, I mean, it's certainly, it is all about taking the actions.
00:21:35.000
And I think that ties into his talk about affirmations because that focuses your mind
00:21:41.900
And he's talked about that saying, you can't do affirmations for lots of different stuff.
00:21:45.420
You have to pick one thing and just, you know, write it a hundred times, say it a hundred
00:21:49.000
times, you know, do it over and over again until it really sinks in.
00:21:52.560
And that focuses your mind on that one outcome or that one thing that you're trying to make
00:21:56.940
And, and I think it ties directly into this deciding and, um, but that's where like the
00:22:03.000
Secret, if anybody remembers that book, was wrong, because they had the same affirmation
00:22:08.520
sort of message, but they said what you were just saying, like, oh, you can just say, I'm
00:22:13.960
And then the universe is going to make it happen for you.
00:22:16.520
And it's like, they only gave you half the answer.
00:22:19.460
It's like, and it, and I would also say, it doesn't mean you can just ask for whatever
00:22:23.900
I mean, I, you know, Scott has said a lot of very amazing things have happened to him
00:22:28.260
So I'm not trying to limit that, but at the same time, he, he says, you know, don't be
00:22:34.140
Like, don't say I want a billion dollars, just say, I want more money or I want to be successful
00:22:39.000
or something that's more general so that it can be a lot of different things that would
00:22:42.540
fit that description as opposed to maybe something impossible.
00:22:45.680
And he said certain things like winning the lottery, you know, it's not going to work.
00:22:49.080
But, um, but if you, if you make that decision and you make those affirmations to focus on
00:22:58.300
You need to do the things that come into your head that says, okay, if I want this to happen,
00:23:10.720
It's not, it's not just the thought it's the actions you take as a result of that.
00:23:15.860
Well, and, oh, and like, this is exactly where, like you said, systems for your goals.
00:23:21.960
So you got to put in the place of systems, probably work on your talent stack.
00:23:26.120
You might fail along the way, but, you know, fall down seven, get up eight.
00:23:31.120
So you just keep going until like, if he, and if you exhaust all of it and you're like,
00:23:40.800
Um, not, I mean, Scott has shown us how many ideas in different business ventures he started
00:23:47.760
and, and had to stop, but he really put us all into it.
00:23:51.560
And, um, I think we're like better armed now than Scott was because he taught us all the
00:23:58.160
things that he did to be able to achieve the big goal.
00:24:01.240
So now we know how to do it, how to reframe and how to put in systems and, and yeah, like
00:24:10.400
Like everybody, you know, everything is a resource.
00:24:16.300
So my God, if I, if I owned any of the businesses I've owned in the past now with today's technology,
00:24:30.700
I'm going to move on to the news, you guys, cause we're at like the halfway point of anything,
00:24:35.400
but stick around Kev and, um, take an interstitial sip, you guys, and we're going to move on to
00:24:41.940
Owen and Marcella have picked out some news for us and, um, definitely drop your chat in
00:24:59.140
Well, I would say Marcella go pick some, pick out some of the stories you were looking at from
00:25:05.220
Um, so I picked out a few things from your feed.
00:25:17.720
Um, I wanted to bring out a few stories, but I don't know which one you would like to talk about
00:25:26.940
I think we typically start with something that's more in the science realm, either science or
00:25:31.480
psychology or technology, and then, and then we'll get into the politics.
00:25:35.380
I have a favorite story, Marcella, you can start with.
00:25:43.860
Um, so there was, um, a story on PsyPost by Eric Dolan, uh, new research links Trump
00:25:58.920
support and masculine insecurity to valuing large penises.
00:26:03.780
Um, they act as if large penises are a bad thing.
00:26:07.680
Um, new research published in Psychology of Men & Masculinities finds that men who feel
00:26:13.440
insecure about their masculinity are more likely to place high value on having a large penis,
00:26:20.160
viewing it as a symbol of status and dominance.
00:26:23.220
The study author, Cindy Harmon-Jones, a senior lecturer at Western Sydney
00:26:29.340
University, explained the motivation: men seem to have a lot more admiration for large
00:26:37.200
Uh, anecdotally, I observed the men who admire large penises seem angry and hostile.
00:26:45.860
What do you think Scott would have said about this?
00:26:48.820
I think there's, there's, there's, there's three words that he would have said.
00:27:36.800
Um, the Trump is the large penis metaphor for national revival, strong borders, strong
00:27:43.380
economies, strong military, no apologies for that.
00:27:49.660
Um, and I think that he would have seen it as a feature.
00:27:56.520
Um, the fact that Cindy Harmon Jones, who I don't know, thinks that it's hostile and also
00:28:06.580
Um, why would you have to study something that you could just ask everyone?
00:28:14.780
Yeah, I wonder how, how that grant proposal looked and what they were thinking when they
00:28:19.580
Um, it definitely seems like, I mean, the, the phrase I remember that was big for a while
00:28:27.800
Um, and, and I think that's probably part of the, at least the mean part of this that
00:28:35.520
Um, it seems, it does seem hilarious to me that like people who are not supporting Trump
00:28:41.220
and are, you know, somehow not valuing large penises, I kind of wonder what that says
00:28:48.940
But, um, you know, I, I also understand that like men would obviously care a lot more about
00:28:55.000
that than women, because they're the ones that have the penis and they only
00:28:59.860
And so it's natural to compare against other men and, um, you know, worry about that and
00:29:07.420
Um, so it would be kind of obvious to me that women wouldn't care as much about that.
00:29:13.660
Um, but you know, I, I guess that's, it depends on the woman how much they care about that.
00:29:19.580
But, um, you know, it's, uh, it may be that I, I would wonder if they went as far as correlating
00:29:26.460
the actual sizes of Trump supporters versus non-Trump supporters too.
00:29:30.860
Well, like somebody just wrote in the locals chat who funded the study and then like, imagine
00:29:43.760
And someone just wrote like, you know, today in backwards science, I mean, yeah, it's all
00:29:55.140
Well, uh, yeah, you had to get me into the deep conversation here.
00:30:02.140
I think that, uh, uh, this is, uh, important, uh, because right now, um, Trump right now he's
00:30:13.860
He's deploying, um, flotillas everywhere, armadas and, um, and it's just, uh, you know,
00:30:21.220
taking control of the security, making sure that everything is safe.
00:30:24.460
Um, just imagine a neighborhood and, and he's the, the head guy of the neighborhood and making
00:30:34.320
I would just also add, there was, there was a story that I posted might've been a couple
00:30:38.940
of years ago and it was a study about firearm ownership and penis size and it got the highest
00:30:46.440
engagement of probably almost any story I've ever posted on X.
00:30:50.340
And I think it might've been because as you might guess, firearm owners have larger penises
00:30:55.800
than non-firearm owners, which I guess it is counterintuitive to what the researcher was
00:31:01.620
probably hoping to find out because they were saying, oh, this kind of debunks the idea
00:31:05.180
that people have firearms because they were, you know, worried about their small penises,
00:31:11.380
So, yeah, it's, uh, it's interesting when they, when the science turns their attention
00:31:25.260
Let's just say eggplant parmesan is one of our favorite dishes here in, uh, in our house.
00:31:36.880
Um, the other story that I saw and a lot of us are, are going to be concerned.
00:31:43.040
Um, we, people that have jobs, you know, people, uh, the majority of us have jobs.
00:31:49.140
Um, Jonathan Ross, the founder and CEO of an AI chip company, um, named GROQ, uh, is that
00:32:00.340
Yeah, it is, but it's spelled differently, obviously, than the G-R-O-K.
00:32:04.740
A, and basically what he said in his interview is that AI won't destroy jobs.
00:32:09.580
It'll create a massive labor shortage instead of the opposite.
00:32:14.140
Um, and he defined, he pointed out three things, um, huge deflation, um, people working less
00:32:27.900
Um, I think that people, when they argue about AI changing the economy and changing, uh, factory
00:32:37.560
work and, um, even possibly getting rid of lawyers, which I doubt, um, or any kind of other white
00:32:47.380
collar work is that when we had the internet, uh, explosion of the dot-com era, there was
00:32:59.140
They were, they were different type of jobs, different type of things, but it does go to
00:33:05.680
the point that there could be more explosive job creation.
00:33:10.300
Um, and what he does, um, highlight, and I don't know what your thoughts are on this, everybody,
00:33:17.860
but he says, we're not going to have enough people, which I guess can go to Elon Musk and
00:33:26.940
all the other points of population and depopulation.
00:33:30.180
I was reading, uh, the Wall Street Journal this morning, and one of the stories was that China
00:33:38.260
is shrinking in, um, sorry, I'm going back to the first story.
00:33:44.240
Um, China, um, is, it's declining in population and there is a huge issue because there's a lot
00:33:54.180
more older people than there are young, yet they have robots.
00:34:04.840
Um, I'm not sure if it's because of AI will lead to more jobs, which is why we will have
00:34:12.080
not enough people, or is it just because of the population going down?
00:34:18.980
So thoughts, well, I think, I mean, go ahead, sorry.
00:34:24.960
Well, I think there's, there's an argument to be made for what he's saying that I certainly
00:34:31.040
think as AI and robots make things, you know, as they gain capabilities to do work, they
00:34:41.040
I mean, there is some debate about how much it costs to do all this AI and robot stuff,
00:34:44.960
but I think generally speaking, if you have a robot working 24/7, you can probably
00:34:49.200
produce a lot more widgets for the same price as if you hired a person to do it.
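The unit-cost intuition here can be sketched with a couple of lines; all the rates and wages below are made-up illustration numbers, not figures from the episode:

```python
def unit_cost(cost_per_day: float, widgets_per_day: int) -> float:
    """Cost to produce one widget: daily cost spread over daily output."""
    return cost_per_day / widgets_per_day

# Hypothetical figures: a worker paid $240/day makes 80 widgets in an
# 8-hour shift; a robot costing $240/day to run makes 240 widgets in 24 hours.
worker = unit_cost(240, 80)   # $3.00 per widget
robot = unit_cost(240, 240)   # $1.00 per widget
assert robot < worker  # same daily cost, more output -> cheaper goods (deflationary)
```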
00:34:55.840
And so I think that will be deflationary in terms of how, how, you know, a lot of things
00:35:02.760
Um, I don't think that'll happen to everything like real estate would be the obvious exception
00:35:08.460
So, you know, that may be even more scarce, but, um, but I think that even that might be
00:35:14.700
alleviated by it, you know, if we can get robots doing construction of homes, we might be able
00:35:18.640
to build a lot more homes and make those cheaper.
00:35:20.480
But, um, I do find a little bit of inconsistency when he says people will work less because that
00:35:27.800
to me would imply there's less jobs or less work to be done by people.
00:35:32.680
Um, so I, I, I get a question mark in my head when I hear that, but, um, I, I do think
00:35:38.740
there's a possibility that he's right in the sense that when you, when you make things
00:35:43.760
cheaper to build or to produce and, or to deliver a service, that means more things
00:35:53.380
Like you can make money doing more things than you could in the past.
00:35:55.920
Whenever you look at a startup, for example, you might say, okay, does, is this thing ever
00:36:00.660
Can I, you know, are people actually going to pay the price that they would need to pay
00:36:03.840
for this to be profitable, to be sustainable as a business?
00:36:09.900
Like, you know, are customers willing to pay the price for this product or service?
00:36:14.260
And so the lower you can make that price and still be profitable, the more businesses you
00:36:22.980
And so I think there would be a natural, uh, trend that would say businesses, you might've
00:36:31.180
You might say, you know what, now it would work.
00:36:33.840
And, um, so more things would potentially be possible in terms of starting new businesses and
00:36:39.480
having additional products and services that wouldn't have been profitable before, but
00:36:42.980
now that we can automate a lot of it, it is profitable.
00:36:45.780
So I do think there's a lot of potential for that.
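The viability argument above, that lowering unit costs lets more business ideas clear the break-even bar, can be sketched like this; every business name and number below is hypothetical:

```python
# A business idea is viable if what customers will pay covers the unit cost.
# All figures are hypothetical, for illustration only.
ideas = {
    "custom newsletters": {"willing_to_pay": 5.0, "unit_cost": 8.0},
    "on-demand logos": {"willing_to_pay": 20.0, "unit_cost": 30.0},
    "niche tutoring": {"willing_to_pay": 40.0, "unit_cost": 35.0},
}

def viable(idea: dict, cost_multiplier: float = 1.0) -> bool:
    """True if the price customers will pay covers the (possibly reduced) unit cost."""
    return idea["willing_to_pay"] >= idea["unit_cost"] * cost_multiplier

before = [name for name, idea in ideas.items() if viable(idea)]
# Suppose automation cuts unit costs in half:
after = [name for name, idea in ideas.items() if viable(idea, cost_multiplier=0.5)]
assert len(after) >= len(before)  # cheaper production -> more viable businesses
```

With these numbers only one idea is viable at full cost, but all three clear the bar once costs are halved, which is the "more businesses become possible" point.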
00:36:51.620
Um, but I would point out that it would probably still shift the types of jobs that would be
00:36:57.300
available because if I make all these assumptions about how things are going to go, a lot of types
00:37:02.980
of jobs like manual labor and repetitive tasks and simpler things, even writing and some
00:37:08.560
of the things AI can do, um, you're not going to need people to do those things anymore.
00:37:12.640
And so you'll, you'll need people who know how to use AI and who know how to come up with
00:37:17.840
the ideas and to guide the AI and to review the output of the AI and to just, you know,
00:37:23.220
just have, do all the things like the creative things, the humor things, all the things that
00:37:31.040
And I don't know if they ever will be able to do.
00:37:36.220
And that means some people might be very busy because they have those skills.
00:37:39.760
Other people might have a hard time finding a job because they, all they could do is manual
00:37:43.980
So I don't know if it's going to work for everybody, but it'll probably work for some
00:37:49.400
I mean, Greg Gutfeld just said the other day to Scott that he had AI write the summary of
00:38:03.580
It's going to, I think that's a way that all the creative minds could really have a job
00:38:13.200
And by the way, Sergio, you've created an army of memers and you guys, they're so good.
00:38:24.640
Well, Kevin, I think you wanted to jump on this too.
00:38:26.820
I, so I, I work in this domain very closely and I, I, I think it's, it's going to be
00:38:35.540
It is going to be a disruption, but in many, many ways we're going to be more productive.
00:38:47.260
New tech, it's going to enable people to be more productive, right?
00:38:54.020
You might be able to do what took three or four or 10 people to do.
00:38:59.700
Um, but I think it's, it's, it's, I, I take the David Sacks sort of glass-half-full kind
00:39:08.140
It's, it's going to be, um, a profound, uh, positive, um, uh, advancement.
00:39:15.100
I do think just like you should question, always question numbers or what you're reading, you
00:39:22.060
know, people need to cultivate this critical assessment of, oh, this AI has given me this
00:39:30.900
So I think it's going to be important that in, in education, young folks are taught to,
00:39:37.620
to think critically about the results that they're getting.
00:39:39.920
Um, I know personally, you know, with code generation, you don't rely on a single tool,
00:39:46.080
You're going to, you're actually relying on a couple of tools and, and you're weighing
00:39:50.280
how these different tools, which, which tool is yielding the best results.
00:39:54.800
And when it comes to software, you know, a newbie software developer doesn't have sort
00:40:01.320
of the experience, uh, to understand, Hey, yeah, I got the, the basic functionality to
00:40:07.600
work, but there's, you know, all of the robustness, uh, the reliability, perhaps security things
00:40:13.860
that an experienced developer would be able to frame out, you know, in a, um, in working
00:40:22.020
Uh, I think it's going to be an age of immense productivity improvement.
00:40:31.420
It reminded me of something that Erica said right now, reminded me of something Scott would
00:40:36.480
have said, um, and it was regarding art and music and how we would deal with, with it coming
00:40:46.700
from AI, would it be the same or would these people still have jobs?
00:40:51.160
Because when you have an artist, you have a musician, you have a movie or something else.
00:40:59.180
If it comes from a human, then it's different than if it comes from an artificial intelligence.
00:41:06.440
And so I thought that maybe those were one of the few jobs that would stay around.
00:41:11.640
Because, you know, Scott liked to play drums, and he talked about drummers
00:41:19.120
and how he realized that the best drummers didn't follow any of the rules.
00:41:25.280
So they weren't perfect per se, but they just created their own rules.
00:41:32.200
That's not something that AI does. Maybe AI will eventually do that, but it's just different.
00:41:39.160
I think you're right, but I would even extend it beyond just artists.
00:41:43.760
Like I do think that for music and art and other things, people are still going to
00:41:48.240
have those jobs, and people are going to value human work in those areas.
00:41:55.540
And I think a lot of other white collar jobs and other types of jobs will also,
00:42:01.400
like even manual labor jobs will still have a place for the artisans.
00:42:10.920
I mean, I, you know, I went to Europe once and I looked through a museum of some of the
00:42:15.000
furniture that was in this castle and I was just blown away by it because I saw these desks
00:42:20.460
with all this inlaid stuff and all these complicated things in it.
00:42:23.460
And I was like, I couldn't believe it.
00:42:26.640
Like I was like, that not only took many years to build just that one desk, but it took somebody
00:42:32.900
probably 30 years to get good enough to be able to build a desk like that.
00:42:36.920
Like that, that was someone's life work in that desk.
00:42:44.440
I mean, you, you might be able to duplicate it if you really, you know, had some laser 3d
00:42:48.640
thing to try and replicate it, but it probably still wouldn't be perfect.
00:42:51.260
And it wouldn't have that artistic flair of like, this is a unique desk with this unique story.
00:42:57.400
And so I think the same is probably going to be true to some extent within all human interactions.
00:43:03.860
Like you already had the experience of saying, oh, oh, great.
00:43:07.980
I'm talking to some computer right now on the phone, right?
00:43:14.260
And I think that's probably going to be true with a lot of things.
00:43:18.660
And people have already started complaining about like this email looks like it's AI generated
00:43:23.000
or this news article looks like it's AI generated.
00:43:25.380
Like it's different and we can tell the difference.
00:43:30.640
I'm not saying we won't see any more AI, but I do think people are going to
00:43:35.360
value that humanness and that difference that humans put into things, because that's that
00:43:42.260
creative piece of it, that intuitive piece of it, where you connect with another person
00:43:48.960
And I think it's still going to have a greater value than something that was produced by AI
00:43:56.420
I see both sides of that as the Libra again, because I think our prompting is going to have
00:44:03.960
to get better because I feel like you could tell AI to carve that desk and to make it look
00:44:15.640
And also my, my huge gripe for the last 10 years, I've just been like on this tear that
00:44:30.380
I love Bill Pulte, amazing, but he just built something right near me where Netflix is going.
00:44:43.940
And, um, but everything that's being built is so uncreative.
00:44:48.640
And then I'm thinking, well, maybe AI can make it creative.
00:44:51.180
If nobody has the architectural wherewithal to make things pretty and interesting and
00:44:58.420
that we'd want to see long-lasting, like when you go by an old house somewhere that's got
00:45:03.400
detail, and we're like, wow, that's so beautiful.
00:45:06.980
But then we build all these like institutional looking buildings with no thought.
00:45:11.900
I'm thinking AI could do it cheaper and put creativity into it and make some interesting
00:45:18.740
things all around, including furniture, including whatever, but it's going to be about the prompt.
00:45:24.700
So I feel like we need a lot of QVC-style hosts to prompt AI, people that can talk
00:45:31.920
about, you know, this water bottle ad nauseam and describe it for an hour to make the most
00:45:41.660
You know, I wanted to, because when we talk about art and AI and people, I
00:46:08.360
think that we are underestimating the power of the fusion of humans with AI.
00:46:14.480
This weekend, what I saw was people that had never drawn anything in their lives.
00:46:24.840
And they were making amazing images on their first try.
00:46:33.480
And there were a bunch of people that were like, Oh, I don't have time to try it.
00:46:38.100
But a lot of people tried it, and when they started in, they were so happy
00:46:43.940
that finally they were able to express what they wanted to express.
00:46:48.380
And, and the same thing is going to, it's going to happen with construction too, because
00:46:52.860
it's monopolized by architects and designers and all this stuff.
00:46:56.980
So if you have an idea, they tell you to, you know, F off. They go
00:47:01.580
like, hey, I'm the expert, I want to tell you how to draw this.
00:47:04.700
If everybody can start drawing their dream house and their dream ideas and start sharing
00:47:10.780
with everybody. That's how Trump got Kim Jong-un, you know, when he showed
00:47:16.220
him a video of what it could be like in North Korea.
00:47:24.260
And that's what these dads, you know, all these
00:47:29.260
people that are creating all these memes and videos and movies, they are just
00:47:34.860
spreading this spark of creativity to everybody.
00:47:41.900
I've certainly seen it myself using AI and I, and I think it does get a lot messier in
00:47:47.900
When you say it's a combination of a person using AI, because with the right person, with
00:47:53.180
the right expertise, you can do some really powerful things with it.
00:47:55.900
And you're still putting in that human judgment and intuition, and, you know, like
00:48:00.520
Scott talks about how it can't write a Dilbert cartoon.
00:48:04.040
And, and I think that that's because of a couple of things.
00:48:07.480
One is it can't know when something's funny the way a person can.
00:48:14.040
So, you know, you could have it infinitely generate Dilbert cartoons forever, but it wouldn't know which ones are funny.
00:48:21.460
But a person like Scott will almost instantly know if some idea is funny.
00:48:28.620
And I think, you know, to me it's that there are certain things that are
00:48:35.340
kind of uniquely human: to have intuition and to have that feeling and that embodiment of,
00:48:40.640
I can just instantly know when I hear an idea whether it's funny or not.
00:48:44.880
And most people, maybe at least two thirds of us, according to Scott, have a sense of humor
00:48:51.120
But the other part of it is that whenever Scott sits down to write a Dilbert, it's got
00:48:55.020
to be some joke that no one's ever told before.
00:48:58.100
He can't just do the same Dilbert over and over again.
00:49:03.160
But if you ask it to, say, go create a joke that no one's ever told before, I would defy it to come up with one.
00:49:15.060
Another difference between AI and us is that AI doesn't have any friends to share memes with.
00:49:23.860
All the memes that I share and most of us, my Bert too, he's amazing with those memes.
00:49:29.480
It's the same thing that we all do, as soon as we see it, and that happens to all of you, I think.
00:49:34.540
As soon as you see a meme, you think of the person that you're going to send it to.
00:49:38.920
You start imagining the smile on your husband, your wife, your friend.
00:49:44.560
I cannot wait until I send this to that person.
00:49:46.740
And all the memes that I was posting on Locals, for example, it was because of all of you.
00:49:54.740
If I had a bunch of other friends, I would be sending them to them.
00:49:57.440
But you guys were my, my circle of friends, right?
00:50:15.320
So we need to tell it, we need to become its friend.
00:50:24.960
Sergio, as is typical, had a gold nugget in there.
00:50:36.100
A buddy of mine in an unnamed state lives on a lake in a subdivision, and he teaches skeet shooting.
00:50:47.300
And apparently the homeowners association complained and sent him a cease and desist letter.
00:50:52.160
And he took that letter and interacted with his favorite LLM, and lo and behold, it came back with multiple rulings.
00:51:05.660
Remember, our law is based on precedent. And he fired off a lovely email back to the homeowners association.
00:51:21.500
I'm not saying LLMs should replace your lawyer and what have you.
00:51:24.680
But my point is, there are a lot of things where, you know, intuitively or maybe you've got a little bit of information, but boy, you're not trained in that domain.
00:51:34.040
You know, this just relaxes a lot of those barriers in a huge, huge way.
00:51:41.080
I've personally used it for cooking and for my own health journey and goals and so forth.
00:51:47.900
And it's just really, really a big, big deal, I think.
00:51:59.680
Oh, wait, you guys, let's do a little assignment again today.
00:52:11.660
So why don't we try today to create something that we think is like a desirable home or building and keep adding and improving upon it with your AI.
00:52:24.400
So let it give you what you said, see how your prompts turned out, and then say, okay, I like that, but change this and do that, and post your, let's say, homes.
00:52:34.100
Let's say like it could be crazy fantasy home, like on another planet if you want.
00:52:39.060
But let's try to see what AI can come up with via our prompts to make a cool ass house that you might want to live in wherever you want.
00:52:57.640
I've seen there's a little bit of an edge of women not caring about what anybody says.
00:53:03.520
There's a lot of guys that are going like, oh, you know, I'll do that later.
00:53:09.160
So guys, get rid of that embarrassment and just do it.
00:53:13.340
So I can run through some headlines real quickly.
00:53:16.840
I know we only have a few more minutes to the hour.
00:53:23.100
So if you want to talk about Don Lemon, then go ahead.
00:53:24.820
Okay, so real quick, I have to bring it to the forefront.
00:53:36.840
And I mean, I guess Scott would say that's one of the main things to be persuasive, and he got it yesterday.
00:53:43.060
He, with the anti-ICE protesters, went into a church in St. Paul, Minnesota, while people were worshiping.
00:54:03.180
And he actually made comments and was trying to interview, I believe, the pastor there.
00:54:13.240
But the, and going back to Sergio's memes, he made a great meme of that.
00:54:18.380
But it was a heinous act, because he was saying, oh, the Constitution allows us to protest wherever we want.
00:54:28.780
As if there were no indication that you can't protest here, or that now is not the time to protest because there's worship going on.
00:54:44.580
And that's why for me as an attorney, I was like, no, that's not true.
00:54:49.060
And Megyn Kelly made a post that under the United States Constitution, through Supreme Court decisions,
00:54:55.440
There's a time, place, and manner where you can protest.
00:55:00.000
But on top of that, the U.S. Constitution only applies with regard to protesting in public places.
00:55:09.520
Yeah, and really just protesting government, right?
00:55:11.700
And they also have the separation of church and state, which really means the government needs to stay out of all the religious stuff.
00:55:17.480
So the assistant attorney general, Harmeet Dhillon, and the attorney general, Pam Bondi, are trying to do something about it under the FACE Act, which punishes intentional acts to try to stop worship.
00:55:36.160
Minnesota has become, how would I say, extremely on fire.
00:55:48.820
It's not all organic; it's the winter of love in a way, and they're just trying to keep it going.
00:56:00.640
But I think it's going to backfire because Christian people aren't going to take it very well.
00:56:07.860
Yeah, well, I think we're going to see more protests.
00:56:11.840
We're going to see more interactions between them and ICE, and I'm hoping it doesn't get violent.
00:56:16.200
But we do see stories coming up about it in terms of deploying National Guard.
00:56:22.560
Trump apparently is getting about 1,500 troops ready to go back into Minnesota.
00:56:27.360
And, you know, so we may see more conflict there.
00:56:31.500
But I think, I'm hoping it'll stay peaceful, and then it'll die down, and we'll see less of this over time.
00:56:38.740
But, you know, it certainly still seems like it's blowing up there.
00:56:43.200
And then around the world, we have Iran blowing up.
00:56:47.140
We have maybe some progress in Gaza where they're putting out this peace board and starting phase two.
00:56:54.780
And Trump just said that he wants a billion dollars from everybody that's going to be having a permanent seat on the Gaza peace board.
00:57:00.820
So he's monetizing it, which I thought was funny.
00:57:04.560
But maybe that's meant to help rebuild Gaza and, you know, make it all work.
00:57:11.840
But it does seem like he's using his creative mind to say, how can we fund this?
00:57:18.620
And I'm hoping we don't have another military conflict in Iran.
00:57:22.700
But I would have to say, based on the headlines I'm seeing, that it looks like we may.
00:57:27.300
It looks like all the signs are pointing to maybe something happening soon there.
00:57:31.760
So we've got a lot of things going on around the world.
00:57:34.960
But I think we do have some good news here at home.
00:57:37.360
There's a story about border crossings being at historic lows again.
00:57:44.340
Housing payments apparently have dropped $260 a month under Trump.
00:57:50.840
And so I think there's plenty of good news we can point to as well.
00:58:04.900
Did he break a law that we can, you know, jail him for?
00:58:07.080
I mean, they are trying to investigate him and that's what was said.
00:58:16.180
And they're going to look into charges under the FACE Act.
00:58:19.660
He would fall under that, and not just him, but the anti-ICE protesters, too.
00:58:25.580
Under the FACE Act, it covers anything that intentionally interrupts a place of worship like this,
00:58:39.300
with an intention of threat.
00:58:44.400
And what I saw there, I mean, it was sickening, the fact that there were children there.
00:58:53.320
And he kept saying, I mean, I guess this is why we're talking about him today, right?
00:59:03.840
It's just, to me, there's, there's ways, right?
00:59:11.380
Or they, they, they need this in order to wake up or something like that.
00:59:23.780
I don't know if a grand jury will find charges.
00:59:28.880
And then at the end of the day, when this is questioned, as a lawyer you're like, well, a jury would have to find him guilty.
00:59:39.800
And so would these anti-ICE protesters. But they do need to look into who is paying for this, who is behind this.
00:59:49.480
And these are the people that need to be in jail.
00:59:54.700
And I think Scott would probably inject somewhere in here that Don Lemon might enjoy going to prison, if you know what I mean.
01:00:03.480
Oh, my gosh, you guys, it's the top of the hour.
01:00:11.760
And I like to just keep it under an hour, like Scott liked to try to do.
01:00:24.700
Owen, do you have closing thoughts for everybody?
01:00:28.140
No, I just, I appreciate everyone coming here and participating in this.
01:00:31.720
And if other people have reframes they want to share or, you know, other people that have been a big part of the community for a long time, you know, reach out to Erica or me or somebody.
01:00:40.400
And we may be able to get some other voices on here as well.
01:00:46.180
You know, we're just doing this because we want to keep it going.
01:00:48.740
And I know not everyone can do it every day, but if there is a day that you can do it, maybe you can step up and be part of this too.
01:00:55.720
So I appreciate everyone's help and let's just keep it all going.
01:01:01.100
And even if you guys just want to come for the simultaneous sip and then you've got to jet, but you want to have a sip with everybody, come have a sip and go on with your day.
01:01:09.660
Like, again, we'll never be Scott, but we all have that in common.
01:01:14.900
How much we love, respect, miss, and appreciate him and his wisdom.
01:01:22.060
So if you want to commune with, you know, Scott, Scott's friends and have a sip, please join us Monday through Friday right here.
01:01:30.780
And Shelly, thank you so much for providing this stream to us every day.
01:01:38.940
And Marcella, Sergio, and Kev, thank you so much.
01:01:46.340
And we'll see you guys tomorrow and do your memes for Sergio of a dream house.