Real Coffee with Scott Adams - January 19, 2026


Episode 3075 - The Scott Adams School 01/19/26


Episode Stats

Length: 1 hour and 2 minutes
Words per Minute: 165.7
Word Count: 10,355
Sentence Count: 708
Misogynist Sentences: 17
Hate Speech Sentences: 11


Summary

In this episode of Coffee with Scott Adams, host Scott Adams talks about what it's like to miss out on his favorite morning ritual: a cup of coffee with a special guest. Today's guest is Scott Adams.


Transcript

00:00:14.860 You guys, let's just make sure Locals is live, especially before we do.
00:00:19.780 We can see Steven, Gunner.
00:00:22.740 Yeah, Locals is live.
00:00:24.440 Awesome.
00:00:25.800 Yay.
00:00:26.580 Hi guys.
00:00:27.320 Good morning.
00:00:27.900 I don't see you two.
00:00:31.000 Okay, guys, are you ready?
00:00:32.520 We're going to mute for the simultaneous sip.
00:00:35.820 Good morning.
00:00:37.540 We're doing it with Scott today, you guys.
00:00:40.200 Ready?
00:00:41.240 And then we'll introduce everyone.
00:00:42.780 Okay, let's get ready for the simultaneous sip.
00:00:51.260 Good morning, everybody.
00:00:53.520 And welcome to the highlight of the entire civilization.
00:00:57.900 It's called Coffee with Scott Adams.
00:01:00.760 And if you didn't think it could get any better, surprise.
00:01:05.160 It's Whiteboard Day.
00:01:07.160 Yes, we will have a whiteboard in which I'll connect the seemingly different fields of politics,
00:01:13.900 artificial intelligence, the simulation, and Twitter.
00:01:16.940 Yeah, we'll do all that.
00:01:19.200 And in order for you to be primed and ready for that, this mind-blowing experience that is the simultaneous sip and coffee with Scott Adams.
00:01:28.300 You're going to need to get ready.
00:01:29.540 And all you need to be ready for this amazing, amazing experience is a cup or mug or a glass, a tankard, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:01:38.100 And join me now for the unparalleled pleasure.
00:01:47.080 It's the dopamine hit of the day.
00:01:48.720 It's the thing that makes everything better.
00:01:51.860 It's called the simultaneous sip.
00:01:53.660 That, ladies and gentlemen, is amazing.
00:02:05.720 I'd like to start with a helpful tip.
00:02:09.480 Have you ever bought anything on Amazon?
00:02:12.880 Well, if you have and you've bought more than one thing, you may have run into a situation I run into often.
00:02:18.420 It's called the scale problem.
00:02:21.840 As in, I think I'm buying a big old bag of something, and it shows up like it's a free sample.
00:02:32.400 How many times has that happened to you?
00:02:34.660 You know, you buy a chair for your living room, and it shows up like it's a Barbie chair.
00:02:39.340 You're like, you know, that looked like a real chair.
00:02:42.260 In my defense, I did not check the specs.
00:02:44.580 It looked like a chair.
00:02:45.600 It said a chair.
00:02:46.680 I bought the chair.
00:02:48.020 It just happened to be two inches tall.
00:02:48.020 Well, this brings me to my recent purchase, which should have been about this tall, not this wide, the big one.
00:03:00.740 But when you look at the little picture, it looks exactly the same.
00:03:04.800 And so I suggest the following human interface improvement for Amazon.
00:03:09.420 Jeff Bezos, if you're listening, I suggest this.
00:03:12.600 In any situation in which there might be any potential ambiguity about the size and scale of an object, it is not good enough to include it only in the description, which you must click.
00:03:27.280 You must also have a human hand in the picture, preferably the same human hand, because if this had a human hand, I would know exactly how big it was every single time.
00:03:38.200 And how hard is it to put a hand in the picture?
00:03:40.200 Not very hard.
00:03:41.100 You can even digitally add it, just a nearby hand in the picture.
00:03:45.500 So, please, user interface developers at Amazon, who are, by the way, some of the best in the world.
00:03:55.340 Amazon has some of the best user interface.
00:03:57.100 But that one thing, that one thing bites me in the ass about one time in five, probably, literally.
00:04:04.800 I just guess on the weird sizes.
00:04:07.460 All right.
00:04:07.860 I miss him, you guys.
00:04:13.340 I can't believe it's another, you know, that we're into another week now.
00:04:18.320 And it's not getting easier, obviously.
00:04:24.080 How do you guys quickly, you know, feel it's not easier?
00:04:29.880 No, I think it's harder now as the days go.
00:04:34.360 Because you realize the weekend, we didn't have coffee with Scott Adams.
00:04:39.520 And how many decades is that?
00:04:42.000 Like, it was one decade without that.
00:04:44.800 So, it hit me hard yesterday.
00:04:47.580 I think I told you guys.
00:04:49.240 But, yeah, it's not easy.
00:04:52.840 It probably will get easier with time.
00:04:56.300 But it is forever.
00:04:58.560 I saw a post or a reply online saying, oh, you know, I'm going to be sad when people forget Scott.
00:05:09.520 No, people are not going to forget.
00:05:11.240 And we're not going to allow people to forget Scott Adams.
00:05:15.780 That's right.
00:05:16.500 He's unforgettable, Serge.
00:05:18.780 Oh, you guys.
00:05:19.440 I'm sorry.
00:05:19.900 I'm Erica.
00:05:21.560 Marcella just spoke.
00:05:22.840 We have Sergio below with the beanie, the Scott Adams beanie, which is back for sale, you guys, by popular demand.
00:05:30.140 Kevin Summers is joining us today.
00:05:32.520 And the voice of Owen Gregorian.
00:05:35.420 Good morning, everyone.
00:05:35.960 Sergio, you wanted to say something about that?
00:05:38.860 About Owen?
00:05:40.380 No, about how it's not easier.
00:05:43.000 And it feels almost harder now.
00:05:45.980 You know, well, I'm a pro at this, unfortunately.
00:05:48.760 You know, I'm a pro and I've been through it with the strongest ones.
00:05:53.760 And I've got to say that Scott's advice of putting yourself into service, being useful, and honoring that person's legacy and transition, too,
00:06:08.920 it's the most important thing to do to stay sane.
00:06:12.640 Because it's not going to get easier.
00:06:15.920 We all have to get stronger.
00:06:18.760 Okay.
00:06:19.940 Jocko Willink is the best at it.
00:06:22.220 He has a beautiful song on this that I played over and over again when that happened.
00:06:28.940 Marcella, do you not remember the name of the song?
00:06:31.240 Because I forget the name of the song.
00:06:33.520 Maybe you can post it in the notes later.
00:06:36.420 I'll post it later.
00:06:37.540 Yeah.
00:06:37.880 Jocko.
00:06:38.440 Yeah, I'll post it.
00:06:39.200 He's a Navy SEAL who went through Afghanistan and Iraq War.
00:06:46.340 And he has a lot of advice on grieving.
00:06:51.180 Jocko Willink is the best at the end of the year.
00:06:54.620 So I think we are, it's been helping a lot, you guys, you know, like, you know, talking to you guys.
00:07:01.060 And I'm so happy to see Kevin, by the way.
00:07:03.640 Kevin is one of the OGs in the chat.
00:07:07.280 And everybody knows Kevin.
00:07:08.380 Kevin and Kevin has been very supportive with me for a long time, too.
00:07:13.000 And I appreciate that, Kevin.
00:07:14.280 Thank you.
00:07:14.720 You bet.
00:07:15.320 He's known as Kev.
00:07:16.380 I go by Kev on Locals.
00:07:22.320 Yeah, it's been tough.
00:07:23.220 And I keep reminding myself of some of Scott's reframes on death and grieving and loss.
00:07:30.040 And it, I lost my mom about a year ago.
00:07:35.740 And it was super helpful.
00:07:37.760 And just using his own advice on him has been just sort of a saving grace, I'll say.
00:07:47.800 Owen?
00:07:48.640 Smile.
00:07:49.160 Smile that I got to know him.
00:07:51.620 Yeah.
00:07:51.820 Well, I mean, I'm sure it certainly hit me hard and I'm still going through it, but I would, I was going to say the same thing that I, I think SJV just posted one of Scott's segments on getting rid of anxiety, but it really focused on getting ready for a loss.
00:08:11.560 And like a loss, you know, is coming in his case, he was focusing on like the death of a pet because he knew that he would have to deal with Snickers' death coming up.
00:08:19.320 And, um, so he talked about how you could prepare for that.
00:08:23.500 And a lot of it was the reframe about it being an honor and a privilege to be part of it.
00:08:29.200 Yeah.
00:08:30.480 Um, but it's, it's worth seeing.
00:08:33.040 I think I reposted that from SJV on my timeline this morning, but, um, you know, that, that sort of thing is helping me.
00:08:40.240 I mean, that is really how I'm thinking about it is that it's been an honor and a privilege to be part of the process and just to be, you know, helping.
00:08:49.320 To keep Scott's legacy going.
00:08:51.480 So that's what I'm focusing on.
00:08:53.220 Yeah.
00:08:53.320 That's amazing.
00:08:54.080 You guys on all the platforms listening, if you hear us mention someone's name, like Jocko or SJV, um, if you guys would in the chat, can you at them?
00:09:05.100 So people can say, Oh, who's Jocko or who's SJV or whatever we're talking about.
00:09:09.420 That would really help the others that are watching the chat to know who these people are and perhaps follow along.
00:09:15.880 So I like that, you know, you guys shared that I was having a really bad night last night.
00:09:21.260 Some of, you know, um, but if this is like a great time.
00:09:25.540 And Kevin has decided to do a reframe for us.
00:09:28.840 So listen, we can reframe anything as Scott has taught us.
00:09:32.600 It's so important.
00:09:33.660 And all you have to do is think about reframing whatever situation you're in and create it.
00:09:39.880 If there isn't one written in his book, create it.
00:09:42.120 So I'm going to hand it over to Kevin.
00:09:44.380 He's going to tell us, um, what page it's on in the Reframe Your Brain book, and let's get a reframe.
00:09:51.340 And then we're going to go right to the news with Owen and Marcella.
00:09:55.000 Okay.
00:09:55.280 So thanks, Kevin.
00:09:56.520 Take it away.
00:09:57.340 You bet.
00:09:57.820 Thank you.
00:09:58.820 So, um, we're going to do the wanting versus deciding reframe today.
00:10:03.380 And this is like the third page of chapter two.
00:10:07.120 If you have the soft copy, it's page 15.
00:10:09.420 Um, in my own little hierarchy of reframes, this is like a big daddy reframe upon which
00:10:17.820 many, many reframes hang.
00:10:20.120 Uh, so I, I absolutely love this wanting versus deciding.
00:10:25.060 If you want something, you might be willing to work hard to get it within reason, but if
00:10:30.580 you decide to have something, you will do whatever it takes.
00:10:35.220 Usual frame.
00:10:36.440 I want to do something.
00:10:38.840 Reframe.
00:10:39.620 I've decided to do something.
00:10:42.380 If I were to make a list of all the business startups and other money-making schemes I've
00:10:46.680 worked on during my career and then divide that long list into what worked out well and
00:10:51.240 what failed, there would be a pattern.
00:10:54.160 You wouldn't notice the pattern, but I would.
00:10:57.200 The efforts that failed were all ones I wanted to succeed.
00:11:01.160 And I worked hard to make them succeed.
00:11:03.460 I wasn't doing all-nighters or risking money I couldn't afford to lose, but I put in great
00:11:09.320 effort and, yes, some cash.
00:11:12.020 None of those wants worked out.
00:11:14.200 Luckily, several business projects did work out.
00:11:19.240 Dilbert became an international sensation.
00:11:21.460 I wrote several best-selling books and I was one of the highest paid speakers in the country
00:11:26.080 for several years.
00:11:27.860 To be fair, I was only able to do all of that because the original Dilbert comic strip took
00:11:32.660 off.
00:11:33.440 And there is one thing that stands out about that initial Dilbert success.
00:11:37.460 I didn't want to be a successful cartoonist.
00:11:41.780 I decided to be one.
00:11:45.200 When I was offered a syndication contract in 1988, the ultimate big break for a cartoonist,
00:11:51.340 I made a promise to myself that no matter what happened, I would never allow myself to look
00:11:55.920 back and say I didn't work hard enough to make it a success.
00:11:59.900 I knew I was entering a field in which the odds of making it big were around 1% even after
00:12:05.880 getting the syndication contract, which was already insanely unlikely.
00:12:10.740 The syndication company sells comics to newspapers and web platforms and splits the money with
00:12:16.300 the cartoonists.
00:12:17.800 I had a contract but zero newspaper clients on day one.
00:12:21.560 By the end of the first year of selling Dilbert comics to newspapers, only a few small newspapers
00:12:26.080 were carrying the strip.
00:12:28.000 At that point, seeing no hope of a big hit, the salespeople moved on to the next comic that
00:12:33.020 was being launched.
00:12:33.820 If Dilbert was going to succeed, I would need to make it happen on my own.
00:12:39.840 And so, I worked my day job while also writing and promoting the Dilbert comic for several
00:12:44.420 years.
00:12:45.260 I later wrote books and did licensing.
00:12:48.520 For over 10 years, I had the equivalent of three full-time jobs.
00:12:52.780 I worked seven days a week, including holidays.
00:12:55.320 I did everything I could do to promote the comic, putting 100% of my mind and body into it.
00:13:03.920 For many of those years, I answered hundreds of emails per day from fans.
00:13:08.220 By the way, I sent an email to him and he did respond within a day.
00:13:13.900 It was really neat.
00:13:14.640 But 30 years ago, I traveled the country for book signings and autograph sessions that would
00:13:19.920 last hours.
00:13:21.200 I did photo shoots and interviews several times per week for a decade.
00:13:25.840 Dilbert was the first syndicated comic to be published on the internet, which also took
00:13:31.560 a lot of work.
00:13:32.420 On paper, my workload from those years looks impossible.
00:13:39.100 If I had merely wanted to succeed, I don't think I could have lasted.
00:13:43.340 But I didn't merely want to succeed.
00:13:46.040 I decided to succeed.
00:13:48.260 And once you decide, the psychology of the situation changes.
00:13:52.480 A crushing workload felt like a privilege.
00:13:55.360 I reminded myself that almost any cartoonist would want to trade places with me.
00:13:59.500 It was never easy and it was never painless, but I was unstoppable because I had succeeded.
00:14:06.120 I had decided.
00:14:08.300 If you are wondering how you can know if the thing you desire is a want or a decision, I
00:14:14.820 can help with that.
00:14:15.760 It's easy.
00:14:17.040 If you are not sure, you have not decided.
00:14:21.200 If you decide, you won't have any doubt.
00:14:25.640 That's what makes it a decision.
00:14:29.500 I just love that reframe.
00:14:32.500 That's a banger.
00:14:34.600 I think about this constantly and I think about it, I try to, generally I'm an optimist
00:14:42.200 and I try to think about it very positively, but I have had family members who have done
00:14:47.560 the opposite here.
00:14:49.360 And, you know, you can talk yourself into just any reality you like.
00:14:57.060 So for me, you know, this informs the sort of that talk track you have going on in your
00:15:03.580 head and the decisions you make, the way you prioritize, you know, you are what you think
00:15:09.260 about.
00:15:09.540 And, uh, for me again, it's, it's a big daddy reframe and, uh, I love this one.
00:15:16.880 I need to remind myself about this too, because sometimes like, Oh, I want to do that.
00:15:21.960 And honestly, maybe it's a Libra thing.
00:15:25.360 You guys, I'm bringing the reality to the engineer minds.
00:15:28.280 I'm sorry.
00:15:28.700 I'm not like, uh, I must be like a shock system for a lot of people, but anyway, um, I live
00:15:35.420 like in this Libra balancing act on top of a fence all the time.
00:15:40.040 I can see both sides of a story.
00:15:41.800 If you ask me, do I want to go to this restaurant or this one, I will vacillate for an hour about
00:15:46.580 it.
00:15:46.780 Um, so I definitely teeter between wanting and deciding a lot, and I have to push myself
00:15:54.620 more for the decision.
00:15:56.360 Um, because if I stay in the want, I often have regrets.
00:16:01.240 So I I'm glad that you brought this up today because there's a few things I need to really
00:16:06.240 make the decision to do.
00:16:08.020 And I'm going to remember this and I'm going to think about, um, Scott up until the end,
00:16:16.340 like everything you read about his career and how he did Dilbert and for the amount of
00:16:21.720 years and working a full-time job and this and that, like, yeah, he made a decision and
00:16:26.540 he also made a decision to keep going till the very end with us.
00:16:30.780 And he could have not, he could have stopped anytime he wanted and any one of us would have
00:16:35.800 fully understood.
00:16:37.440 So yeah, I think that's a great lesson for me today and maybe for you guys.
00:16:42.720 Yeah, I think, I think also when you are a decider and become known as a decider and
00:16:50.200 there's no question, Oh, he's a decider.
00:16:53.000 He's a decider.
00:16:54.020 People look at you a little differently instead of, you know, Oh, you know, I, I want this.
00:16:58.660 I want that.
00:16:59.360 I want a million bucks.
00:17:00.600 You know, what have you.
00:17:02.180 Yeah.
00:17:02.200 You, you, you develop a reputation with yourself.
00:17:04.660 You know, you start, you know, thinking, Oh, you know, this guy means business and, or
00:17:15.440 this girl, you know, and, uh, the more you do those.
00:17:18.880 And one guy that reminds me of that is Trump too.
00:17:21.340 So Trump, uh, this, uh, if you read the art of the deal, uh, you know, that is when he
00:17:27.440 goes from wanting to deciding basically when he says Greenland might be nice, that, that
00:17:33.280 means that he decided to have it.
00:17:35.340 So that's a, I don't know.
00:17:37.220 I wanted to inject that because, uh, it's happening right now.
00:17:41.180 So, Oh, Erica, I can't hear.
00:17:45.720 I just said, yeah, I, I agree.
00:17:48.160 Does, does anyone else want to expound on the, um, on the reframe?
00:17:52.560 I think there's a lot of deciders within the group, because this is a very, you know,
00:17:59.580 intelligent, successful group of people Scott has cultivated, but I think we, in some aspects
00:18:08.520 of our life, we are still sitting on the want instead of deciding.
00:18:12.320 So maybe, maybe everybody today, your homework could be like, what's something I've been wanting
00:18:16.340 to do and I haven't decided to do it.
00:18:18.240 And then either make a decision that is worth the decision or let it go and, and then like
00:18:26.440 figure out what, what else you need to do to, to get where you want to be.
00:18:29.820 I'm going to do that today.
00:18:30.940 I actually have a couple of things I'm going to take off and then a couple of things I'm
00:18:34.120 going to make the decision.
00:18:35.640 So I'm going to do the same thing.
00:18:37.340 I'm going to do the same thing.
00:18:39.140 They can post it online.
00:18:40.240 Hey, actually that assignment publicly sharing those commitments, as we know, um, sure, uh,
00:18:46.720 sure helps.
00:18:48.240 And I like, yeah, I mean, I think, I think one thing I would add to it is just, you need
00:18:53.420 to be careful what you decide on.
00:18:54.980 I mean, cause the frame we're talking about here is a real commitment.
00:18:59.920 Like you're going to do whatever it takes to make this happen.
00:19:02.560 And so you can't do that with everything, you know, you can't just be like, well, now
00:19:05.860 I'm a decider.
00:19:06.640 So everything I decide I'm going to do.
00:19:08.320 And like, that doesn't work, right?
00:19:10.500 You have to, you have to be selective about probably only one thing at a time, really, of
00:19:16.160 saying, I'm going to make this thing happen and I'm going to put aside whatever else and
00:19:20.200 sacrifice whatever else I need to, to make that happen.
00:19:22.940 And so I think that's an important part of it.
00:19:26.580 Um, and I think Scott has talked about that too, um, where you need to be willing to make
00:19:31.000 sacrifices.
00:19:31.560 You need to be willing to, you know, put off other things or prioritize other things.
00:19:35.800 And then, you know, Kevin, to your point, this leads into a lot of other reframes or
00:19:39.440 a lot of other things Scott has talked about in terms of once you've decided to do something,
00:19:43.660 then you might develop a system for it.
00:19:45.360 And, and that would be how you would get that going.
00:19:48.720 You would say, okay, what, what needs to be true or what needs to happen?
00:19:52.220 What is a system for making this happen?
00:19:54.720 And I know Scott has talked about that in the same story of his career, saying how he
00:19:59.980 developed his system, uh, for that.
00:20:02.940 And, you know, we've seen it every day where he has his routine, he does his thing every
00:20:07.460 day, same way.
00:20:09.140 Um, but that's the result of a very honed system.
00:20:12.520 He, he figured out how do I manage my energy?
00:20:15.060 How do I, you know, get out of writer's block?
00:20:18.000 How do I make all these things happen so that I can always achieve the result that I need
00:20:22.080 to achieve?
00:20:22.700 And so I think, um, it definitely ties into a lot of his other wisdom that he shared with
00:20:28.000 us.
00:20:30.380 When, when I agree with that, um, I agree with that on, but one of the things that he says
00:20:36.360 that Kevin read is not only do you have to decide, but you have to work.
00:20:42.520 For your position, like, I can't just decide I'll be a billionaire and just sit around.
00:20:48.820 Um, no, he worked, um, tirelessly.
00:20:52.080 Yeah.
00:20:52.620 And almost to his death.
00:20:55.020 Um, and that's, that's a great, um, example.
00:20:59.020 And I think all of us, uh, want to, want to be like that.
00:21:03.260 And, uh, it's, it takes work and it takes focus because that's one thing that he said
00:21:09.600 in what Kevin read is like, I, I did this nonstop interviews, nonstop work.
00:21:17.320 And when I felt tired, I realized everybody would want to be in my place.
00:21:24.040 So I needed to be unstoppable.
00:21:26.680 Um, so that goes together with deciding.
00:21:31.280 Yeah.
00:21:31.840 I think, I mean, it's certainly, it is all about taking the actions.
00:21:35.000 And I think that ties into his talk about affirmations because that focuses your mind
00:21:40.800 on a particular thing.
00:21:41.900 And he's talked about that saying, you can't do affirmations for lots of different stuff.
00:21:45.420 You have to pick one thing and just, you know, write it a hundred times, say it a hundred
00:21:49.000 times, you know, do it over and over again until it really sinks in.
00:21:52.560 And that focuses your mind on that one outcome or that one thing that you're trying to make
00:21:56.560 happen.
00:21:56.940 And, and I think it ties directly into this deciding and, um, but that's where like the
00:22:03.000 secret, if anybody remembers that book was wrong because that they had the same affirmation
00:22:08.520 sort of message, but they said what you were just saying, like, oh, you can just say, I'm
00:22:12.720 going to be a billionaire.
00:22:13.960 And then the universe is going to make it happen for you.
00:22:16.520 And it's like, they only gave you half the answer.
00:22:19.460 It's like, and it, and I would also say, it doesn't mean you can just ask for whatever
00:22:23.500 you want.
00:22:23.900 I mean, I, you know, Scott has said a lot of very amazing things have happened to him
00:22:27.360 based on this process.
00:22:28.260 So I'm not trying to limit that, but at the same time, he, he says, you know, don't be
00:22:32.860 necessarily so specific.
00:22:34.140 Like, don't say I want a billion dollars, just say, I want more money or I want to be successful
00:22:39.000 or something that's more general so that it can be a lot of different things that would
00:22:42.540 fit that description as opposed to maybe something impossible.
00:22:45.680 And he said certain things like winning the lottery, you know, it's not going to work.
00:22:49.080 But, um, but if you, if you make that decision and you make those affirmations to focus on
00:22:55.760 that outcome, you still need to take action.
00:22:58.300 You need to do the things that come into your head that says, okay, if I want this to happen,
00:23:01.960 I need to put this system in place.
00:23:03.360 I need to take these actions.
00:23:04.480 I need to call these people.
00:23:05.600 I need to do these things.
00:23:07.380 And, and that's what actually makes it happen.
00:23:10.720 It's not, it's not just the thought it's the actions you take as a result of that.
00:23:15.860 Well, and, oh, and like, this is exactly where, like you said, systems for your goals.
00:23:21.960 So you got to put in the place of systems, probably work on your talent stack.
00:23:26.120 You might fail along the way, but, you know, fall down seven, get up eight.
00:23:30.800 Right.
00:23:31.120 So you just keep going until like, if he, and if you exhaust all of it and you're like,
00:23:36.320 okay, there's no way around it.
00:23:37.480 And I really worked on it.
00:23:38.540 It's also okay to move on from it.
00:23:40.800 Um, not, I mean, Scott has shown us how many ideas in different business ventures he started
00:23:47.760 and, and had to stop, but he really put us all into it.
00:23:51.560 And, um, I think we're like better armed now than Scott was because he taught us all the
00:23:58.160 things that he did to be able to achieve the big goal.
00:24:01.240 So now we know how to do it, how to reframe and how to put in systems and, and yeah, like
00:24:07.940 just utilize all the resources around you.
00:24:10.400 Like everybody, you know, everything is a resource.
00:24:13.740 You have AI now to help you as a resource.
00:24:16.300 So my God, if I, if I owned any of the businesses I've owned in the past now with today's technology,
00:24:23.120 so much easier, but, um, so we're lucky.
00:24:27.660 We have a good jumping off point.
00:24:29.300 I think that's great, Kevin.
00:24:30.700 I'm going to move on to the news, you guys, cause we're at like the halfway point of anything,
00:24:35.400 but stick around Kev and, um, take an interstitial sip, you guys, and we're going to move on to
00:24:41.940 Owen and Marcella have picked out some news for us and, um, definitely drop your chat in
00:24:49.360 there that we're trying to read.
00:24:50.780 I'm reading locals mostly.
00:24:52.380 So you guys definitely participate.
00:24:55.360 Thanks Kev.
00:24:57.840 Thank you, Kevin.
00:24:58.900 Yeah.
00:24:59.140 Well, I would say Marcella go pick some, pick out some of the stories you were looking at from
00:25:02.920 my feed and we can talk about them.
00:25:05.220 Um, so I picked out a few things from your feed.
00:25:08.720 Um, Owen and, um, Owen has a great feed.
00:25:12.180 Go check it out on X, uh, Owen Gregorian.
00:25:15.860 Um, this is Marcella.
00:25:17.720 Um, I wanted to bring out a few stories, but I don't know which one you would like to talk about
00:25:23.220 first.
00:25:24.260 Um,
00:25:26.940 I think we typically start with something that's more in the science realm, either science or
00:25:31.480 psychology or technology, and then, and then we'll get into the politics.
00:25:35.380 I have a favorite story, Marcella, you can start with.
00:25:38.300 Which one is it?
00:25:40.240 Um, it's the one about the penis size.
00:25:43.380 Okay.
00:25:43.860 Um, so there was, um, a, a story on PsyPost by Eric Dolan, uh, new research links Trump
00:25:58.920 support and masculine insecurity to valuing large penises.
00:26:03.780 Um, they act as if large penises are a bad thing.
00:26:07.680 Um, new research published by psychology of men and masculinities find that men who feel
00:26:13.440 insecure about their masculinity are more likely to place high value on having a large penis,
00:26:20.160 viewing it as a symbol of status and dominance.
00:26:23.220 The study author, Cindy Harmon-Jones, a senior lecturer at Western Sydney, uh, Western Sydney
00:26:29.340 University, explained the motivation: men seem to have a lot more admiration for large
00:26:34.340 penises than women do.
00:26:37.200 Uh, anecdotally, I observed that the men who admire large penises seem angry and hostile.
00:26:42.840 Um, so, you know, put it in the chat.
00:26:45.860 What do you think Scott would have said about this?
00:26:48.820 I think there's, there's, there's, there's three words that he would have said.
00:26:54.520 And if you could put it in the chat.
00:27:01.660 We're waiting for it to.
00:27:12.840 Well, Andy got it, right.
00:27:17.060 Just ask Scott, right?
00:27:19.260 Right, right, right, right.
00:27:21.400 He would have said, just ask Scott.
00:27:23.720 Um, the reframe of all this is that.
00:27:28.780 Um, the, as I see it, it's a feature.
00:27:33.240 It's not something that should be a negative.
00:27:36.800 Um, Trump is the large penis metaphor for national revival, strong borders, strong
00:27:43.380 economies, strong military, no apologies for that.
00:27:47.640 So that's what he would have said.
00:27:49.660 Um, and I think that he would have seen it as a feature.
00:27:52.620 I think most men would see it as a feature.
00:27:54.460 Most women would see it as a feature.
00:27:56.520 Um, the fact that Cindy Harmon Jones, who I don't know, thinks that it's hostile and also
00:28:04.660 has to start to study it.
00:28:06.580 Um, why would you have to study something that you could just ask everyone?
00:28:11.280 Um, not just Scott, so.
00:28:14.780 Yeah, I wonder how, how that grant proposal looked and what they were thinking when they
00:28:18.740 approved that study.
00:28:19.580 Um, it definitely seems like, I mean, the, the phrase I remember that was big for a while
00:28:26.080 was big dick energy.
00:28:27.800 Um, and, and I think that's probably part of the, at least the mean part of this that
00:28:32.860 is, has gone through our culture.
00:28:35.520 Um, it seems, it does seem hilarious to me that like people who are not supporting Trump
00:28:41.220 and are, you know, somehow not valuing large penises, I kind of wonder what that says
00:28:48.060 about them.
00:28:48.940 But, um, you know, I, I also understand that like men would obviously care a lot more about
00:28:55.000 women or about that than women, because they're the ones that have the penis and they only
00:28:58.820 have one size.
00:28:59.860 And so it's natural to compare against other men and, um, you know, worry about that and
00:29:06.260 then have those insecurities.
00:29:07.420 Um, so it would be kind of obvious to me that women wouldn't care as much about that.
00:29:13.660 Um, but you know, I, I guess that's, it depends on the woman how much they care about that.
00:29:19.580 But, um, you know, it's, uh, it may be that I, I would wonder if they went as far as correlating
00:29:26.460 the actual sizes of Trump supporters versus non-Trump supporters too.
00:29:30.860 Well, like somebody just wrote in the locals chat who funded the study and then like, imagine
00:29:37.080 this woman being like, I'll do the story.
00:29:40.020 You know, she's like, I got it, but I love it.
00:29:43.760 And someone just wrote like, you know, today in backwards science, I mean, yeah, it's all
00:29:47.960 the things.
00:29:50.660 Yeah.
00:29:51.140 Yeah.
00:29:52.140 Any comments?
00:29:53.140 Sorry, George.
00:29:54.140 Kevin.
00:29:55.140 Well, uh, yeah, you had to get me into the deep conversation here.
00:30:01.140 I love it.
00:30:02.140 I think that, uh, uh, this is, uh, important, uh, because right now, um, Trump right now he's
00:30:11.020 deploying his guns everywhere, right?
00:30:13.860 He's deploying, um, flotillas everywhere, armadas and, um, and it's just, uh, you know,
00:30:21.220 taking control of the security, making sure that everything is safe.
00:30:24.460 Um, just imagine a neighborhood and, and he's the, the head guy of the neighborhood and making
00:30:30.540 sure that everything is nice all around.
00:30:33.480 So, yeah.
00:30:34.320 I would just also add, there was, there was a story that I posted might've been a couple
00:30:38.940 of years ago and it was a study about firearm ownership and penis size and it got the highest
00:30:46.440 engagement of probably almost any story I've ever posted on X.
00:30:50.340 And I think it might've been because as you might guess, firearm owners have larger penises
00:30:55.800 than non-firearm owners, which I guess is counterintuitive to what the researcher was
00:31:01.620 probably hoping to find, because they were saying, oh, this kind of debunks the idea
00:31:05.180 that people have firearms because they're, you know, worried about their small penises,
00:31:09.560 when they actually have larger penises.
00:31:11.380 So, yeah, it's, uh, it's interesting when they, when the science turns their attention
00:31:16.640 to this sort of thing.
00:31:18.180 Well, Kevin and I, we have, we have guns.
00:31:21.020 I don't know if the ladies have guns.
00:31:23.580 I must have a big penis.
00:31:25.260 Let's just say eggplant parmesan is one of our favorite dishes here in, uh, in our house.
00:31:34.720 All right.
00:31:35.720 All right.
00:31:36.880 Um, the other story that I saw, a lot of us are going to be concerned about.
00:31:43.040 Um, people that have jobs, you know, the majority of us have jobs.
00:31:49.140 Um, Jonathan Ross, the founder and CEO of an AI chip company, um, named Groq, uh, is that
00:31:58.340 how you pronounce it, G-R-O-Q?
00:32:00.340 Yeah, it is, but it's spelled differently, obviously, than the G-R-O-K.
00:32:03.740 Yeah.
00:32:04.240 Yeah.
00:32:04.740 Ah, and basically what he said in his interview is that AI won't destroy jobs.
00:32:09.580 It'll create a massive labor shortage instead.
00:32:14.140 Um, and he defined, he pointed out three things, um, huge deflation, um, people working less
00:32:23.960 and explosive creation of new jobs.
00:32:26.660 That's key.
00:32:27.900 Um, I think that people, when they argue about AI changing the economy and changing, uh, factory
00:32:37.560 work and, um, even possibly getting rid of lawyers, which I doubt, um, or any kind of other white
00:32:47.380 collar work, is that when we had the internet, uh, explosion of the dot-com era, there were
00:32:55.700 more jobs created than were lost.
00:32:59.140 They were, they were different types of jobs, different types of things, but it does go to
00:33:05.680 the point that there could be more explosive job creation.
00:33:10.300 Um, and what he does, um, highlight, and I don't know what your thoughts are on this, everybody,
00:33:17.860 but he says, we're not going to have enough people, which I guess can go to Elon Musk and
00:33:26.940 all the other points of population and depopulation.
00:33:30.180 I was reading, uh, the Wall Street Journal this morning, and one of the stories was that China
00:33:38.260 is shrinking in, um, sorry, I'm going back to the first story.
00:33:44.240 Um, China's population is lessening, and there is a huge issue because there are a lot
00:33:54.180 more older people than there are young, yet they have robots.
00:33:58.440 Um, and that's just one of the economies.
00:34:01.300 We have the same issue here.
00:34:03.400 So he might be right.
00:34:04.840 Um, I'm not sure if it's because AI will lead to more jobs, which is why we won't have
00:34:12.080 enough people, or if it's just because of the population going down.
00:34:18.980 So, thoughts? Well, I think, I mean, go ahead, sorry.
00:34:24.960 Well, I think there's, there's an argument to be made for what he's saying that I certainly
00:34:31.040 think as AI and robots make things, you know, as they gain capabilities to do work, they
00:34:38.480 will make things cheaper.
00:34:39.820 At least I think so.
00:34:41.040 I mean, there is some debate about how much it costs to do all this AI and robot stuff,
00:34:44.960 but I think generally speaking, if you have a robot working 24/7, you can probably
00:34:49.200 produce a lot more widgets for the same price as if you hired a person to do it.
00:34:53.080 And so a lot of things will get cheaper.
00:34:55.840 And so I think that will be deflationary in terms of how, how, you know, a lot of things
00:35:00.020 will cost less than they did in the past.
00:35:02.760 Um, I don't think that'll happen to everything like real estate would be the obvious exception
00:35:06.480 where you can't make more real estate.
00:35:08.460 So, you know, that may be even more scarce, but, um, but I think that even that might be
00:35:14.700 alleviated by it, you know, if we can get robots doing construction of homes, we might be able
00:35:18.640 to build a lot more homes and make those cheaper.
00:35:20.480 But, um, I do find a little bit of inconsistency when he says people will work less because that
00:35:27.800 to me would imply there's less jobs or less work to be done by people.
00:35:32.680 Um, so I, I, I get a question mark in my head when I hear that, but, um, I, I do think
00:35:38.740 there's a possibility that he's right in the sense that when you, when you make things
00:35:43.760 cheaper to build or to produce and, or to deliver a service, that means more things
00:35:50.660 are worth doing at a profit, right?
00:35:53.380 Like you can make money doing more things than you could in the past.
00:35:55.920 Whenever you look at a startup, for example, you might say, okay, does, is this thing ever
00:35:59.180 going to make a profit?
00:36:00.660 Can I, you know, are people actually going to pay the price that they would need to pay
00:36:03.840 for this to be profitable, to be sustainable as a business?
00:36:07.260 And every business really has to look at that.
00:36:09.900 Like, you know, are customers willing to pay the price for this product or service?
00:36:14.260 And so the lower you can make that price and still be profitable, the more businesses you
00:36:19.600 would say, yeah, that's a good one.
00:36:21.360 That'll make money.
00:36:22.980 And so I think there would be a natural, uh, trend: businesses that you might've
00:36:29.340 said, no, that's never going to work, about before,
00:36:31.180 you might now say, you know what, now it would work.
00:36:33.840 And, um, so more things would potentially be possible in terms of starting new businesses and
00:36:39.480 having additional products and services that wouldn't have been profitable before, but
00:36:42.980 now that we can automate a lot of it, it is profitable.
00:36:45.780 So I do think there's a lot of potential for that.
00:36:48.820 And I really hope this is how things go.
00:36:51.620 Um, but I would point out that it would probably still shift the types of jobs that would be
00:36:57.300 available because if I make all these assumptions about how things are going to go, a lot of types
00:37:02.980 of jobs like manual labor and repetitive tasks and simpler things, even writing and some
00:37:08.560 of the things AI can do, um, you're not going to need people to do those things anymore.
00:37:12.640 And so you'll, you'll need people who know how to use AI and who know how to come up with
00:37:17.840 the ideas and to guide the AI and to review the output of the AI and to just, you know,
00:37:23.220 just have, do all the things like the creative things, the humor things, all the things that
00:37:29.160 machines can't do still.
00:37:31.040 And I don't know if they ever will be able to do.
00:37:33.260 So I think it'll shift jobs in that direction.
00:37:36.220 And that means some people might be very busy because they have those skills.
00:37:39.760 Other people might have a hard time finding a job because they, all they could do is manual
00:37:43.520 labor.
00:37:43.980 So I don't know if it's going to work for everybody, but it'll probably work for some
00:37:47.520 people.
00:37:48.840 Yeah.
00:37:49.400 I mean, Greg Gutfeld just said the other day to Scott that he had AI write the summary of
00:37:55.900 his book or the first chapter of his book.
00:37:57.940 And he's like, Oh my God, it was really good.
00:38:00.880 So yeah, I mean, I, I think you're right.
00:38:03.580 It's going to, I think that's a way that all the creative minds could really have a job
00:38:09.380 too, because AI is all about prompting.
00:38:11.760 Like Sergio was teaching us.
00:38:13.200 And by the way, Sergio, you've created an army of memers and you guys, they're so good.
00:38:21.300 Everyone's doing so good.
00:38:22.320 Okay.
00:38:22.500 Sorry.
00:38:22.820 Back to your story.
00:38:23.480 All right.
00:38:24.640 Well, Kevin, I think you wanted to jump on this too.
00:38:26.440 Yeah.
00:38:26.820 I, so I, I work in this domain very closely and I, I, I think it's, it's going to be
00:38:34.360 profound, the change.
00:38:35.540 It is going to be a disruption, but in many, many ways we're going to be more productive.
00:38:40.860 It's just like the same as it ever was.
00:38:44.340 Talking Heads, anybody?
00:38:47.260 New tech, it's going to enable people to be more productive, right?
00:38:50.720 Yeah.
00:38:51.280 Yeah.
00:38:51.320 Your job is going to change.
00:38:53.480 Yeah.
00:38:54.020 What used to take three or four or 10 people to do,
00:38:58.040 you might be able to do with one person.
00:38:59.700 Um, but I think it's, I, I take the David Sacks sort of glass-half-full kind
00:39:07.120 of position.
00:39:08.140 It's, it's going to be, um, a profound, uh, positive, um, uh, advancement.
00:39:15.100 I do think just like you should question, always question numbers or what you're reading, you
00:39:22.060 know, people need to cultivate this critical assessment of, oh, this AI has given me this
00:39:28.240 answer.
00:39:28.900 Is that true or not?
00:39:30.900 So I think it's going to be important that in, in education, young folks are taught to,
00:39:37.620 to think critically about the results that they're getting.
00:39:39.920 Um, I know personally, you know, with code generation, you don't rely on a single tool,
00:39:45.780 right?
00:39:46.080 You're going to, you're actually relying on a couple of tools and, and you're weighing
00:39:50.280 how these different tools, which, which tool is yielding the best results.
00:39:54.800 And when it comes to software, you know, a newbie software developer doesn't have sort
00:40:01.320 of the experience, uh, to understand, Hey, yeah, I got the, the basic functionality to
00:40:07.600 work, but there's, you know, all of the robustness, uh, the reliability, perhaps security things
00:40:13.860 that an experienced developer would be able to frame out, you know, in a, um, in working
00:40:19.300 with AI.
00:40:19.940 So I'm, I'm very bullish.
00:40:22.020 Uh, I think it's going to be an age of immense productivity improvement.
00:40:29.060 New types of jobs.
00:40:31.420 Something that Erica said right now reminded me of something Scott would
00:40:36.480 have said, um, and it was regarding art and music and how we would deal with, with it coming
00:40:46.700 from AI, would it be the same or would these people still have jobs?
00:40:51.160 Because when you have an artist, you have a musician, you have a movie or something else.
00:40:57.920 It's not the same
00:40:59.180 if it comes from a human as if it comes from an, uh, artificial intelligence.
00:41:06.440 And so I thought that maybe those were one of the few jobs that would stay around.
00:41:11.640 Cause, because there was a, you know, Scott liked to play drums and he talked about drummers
00:41:19.120 and how he realized that the best drummers didn't follow any of the rules.
00:41:25.280 So they weren't perfect per se, but they, um, just created their own rules.
00:41:32.200 That's not something that I, maybe AI will eventually do that, but it's, uh, it's just
00:41:39.160 I think you're right, but I would even extend it beyond just artists.
00:41:43.760 Like I do think that for music and art and other things that people are still going to
00:41:48.240 have those jobs and they're still going to, people are going to value human work in those
00:41:52.620 fields more than AI or robot work.
00:41:55.540 And I think, I think a lot of other white collar jobs and other types of jobs will also
00:42:01.400 like even, even manual labor jobs will, will still have a place for the artisans.
00:42:07.560 They'll have a place for the real masters.
00:42:10.920 I mean, I, you know, I went to Europe once and I looked through a museum of some of the
00:42:15.000 furniture that was in this castle and I was just blown away by it because I saw these desks
00:42:20.460 with all this inlaid stuff and all these complicated things in it.
00:42:23.460 And I was like, I couldn't, I couldn't believe it.
00:42:26.640 Like I was like, that not only took many years to build just that one desk, but it took somebody
00:42:32.900 probably 30 years to get good enough to be able to build a desk like that.
00:42:36.920 Like that, that was someone's life work in that desk.
00:42:41.300 And, and no machine can produce that.
00:42:43.940 I'm sorry.
00:42:44.440 I mean, you, you might be able to duplicate it if you really, you know, had some laser 3d
00:42:48.640 thing to try and replicate it, but it probably still wouldn't be perfect.
00:42:51.260 And it wouldn't have that artistic flair of like, this is a unique desk with this unique
00:42:56.300 artistic thing.
00:42:57.400 And so I think the same is probably going to be true to some extent within all human interactions.
00:43:03.860 Like you already had the experience of saying, oh, oh, great.
00:43:07.980 I'm talking to some computer right now on the phone, right?
00:43:10.520 Like you hate it.
00:43:12.080 You want to say, how do I get to a human?
00:43:14.260 And I think that's probably going to be true with a lot of things.
00:43:18.660 And people have already started complaining about like this email looks like it's AI generated
00:43:23.000 or this news article looks like it's AI generated.
00:43:25.380 Like it's different and we can tell the difference.
00:43:27.720 And I think both of them are going to coexist.
00:43:30.640 You're not going to say we won't see any more AI anymore, but I do think people are going to
00:43:35.360 value that humanness and, and that difference that humans put into things because that's that
00:43:42.260 creative piece of it, that intuitive piece of it, where you connect with another person
00:43:46.660 and people do value that.
00:43:48.960 And I think it's still going to have a greater value than something that was produced by AI
00:43:53.280 or something that was produced by a machine.
00:43:56.420 I see both sides of that as the Libra again, because I think our prompting is going to have
00:44:03.960 to get better because I feel like you could tell AI to carve that desk and to make it look
00:44:12.120 just slightly imperfect.
00:44:15.640 And also my, my huge gripe for the last 10 years, I've just been like on this tear that
00:44:22.640 we're not making artistic things anymore.
00:44:25.780 Like everything's like a dying skill.
00:44:27.920 Like look at all of the, all right.
00:44:30.380 I love Bill Pulte, amazing, but he just built something right near me where Netflix is going.
00:44:36.580 Netflix is moving to my town.
00:44:38.200 You guys, like I live in a very tiny town.
00:44:41.040 Netflix is moving here.
00:44:43.940 And, um, but everything that's being built is so uncreative.
00:44:48.640 And then I'm thinking, well, maybe AI can make it creative.
00:44:51.180 If nobody has the architectural wherewithal to make things pretty and interesting and
00:44:58.420 that we'd want to see long lasting, like you go by like an old house somewhere that's got
00:45:03.400 like detail and we, you know, we're like, wow, that's so beautiful.
00:45:06.980 But then we build all these like institutional looking buildings with no thought.
00:45:11.900 I'm thinking AI could do it cheaper and put creativity into it and make some interesting
00:45:18.740 things all around, including furniture, including whatever, but it's going to be about the
00:45:23.380 prompting.
00:45:24.700 So I feel like we need a lot of like QVC-style hosts to prompt, uh, AI, people that can talk
00:45:31.920 about, you know, this water bottle ad nauseum and describe it for an hour to make the most
00:45:37.060 amazing things.
00:45:38.460 Yeah.
00:45:58.960 Go ahead.
00:45:59.360 Go ahead.
00:45:59.640 Oh yeah.
00:46:00.240 Sorry, Owen.
00:46:00.880 You know, I wanted to, because, uh, um, when we talk about art, um, and AI and people, I
00:46:08.360 think that we are underestimating the power of, uh, the fusion of humans with AI.
00:46:14.480 This weekend, what I saw, it was people that had never drawn anything in their lives.
00:46:21.560 They thought that they had no talent at all.
00:46:24.840 And, and they were making amazing images, uh, on their first try.
00:46:33.480 And there were a bunch of people that were like, Oh, I don't have time to try it.
00:46:38.100 But a lot of people tried it, and, um, when they started in, they were so happy
00:46:43.940 that finally they were able to express what they wanted to express.
00:46:47.580 Right.
00:46:48.380 And, and the same thing is going to, it's going to happen with construction too, because
00:46:52.860 it's monopolized by architects and designers and all this stuff.
00:46:56.740 Right.
00:46:56.980 So if you have an idea, they tell you to, you know, F off, you know, they go
00:47:01.580 like, hey, I'm the expert, I want to tell you how to draw this.
00:47:04.700 If everybody can start drawing their dream house and their dream ideas and start sharing
00:47:10.780 with everybody, that's how Trump got, uh, Kim Jong-un to, to, you know, when he showed
00:47:16.220 him a video of like what it could be in North Korea.
00:47:20.820 Right, Owen.
00:47:21.460 I mean, I think I'm not wrong on that.
00:47:22.900 I think he did that.
00:47:24.260 And, um, and that's what, that's what, uh, that's what these dads, you know, all these
00:47:29.260 people that are creating all these memes and, and videos and movies, they are just, uh,
00:47:34.860 spreading the, the, uh, this spark of creativity and everybody.
00:47:39.180 So yeah.
00:47:39.980 Yeah.
00:47:40.300 No, I, I think there's a lot of power in that.
00:47:41.900 I've certainly seen it myself using AI and I, and I think it does get a lot messier in
00:47:47.020 terms of how to look at it.
00:47:47.900 When you say it's a combination of a person using AI, because with the right person, with
00:47:53.180 the right expertise, you can do some really powerful things with it.
00:47:55.900 And, and, and you're still putting that human judgment and intuition and, you know, like
00:48:00.520 Scott talks about how it can't, you know, it can't write a Dilbert cartoon.
00:48:03.580 Right.
00:48:04.040 And, and I think that that's because of a couple of things.
00:48:07.480 One is it, it can't know when something's funny the way a person can.
00:48:14.040 So, you know, you could have it infinitely generate Dilbert cartoons for forever, but it
00:48:20.260 would never know which one was funny.
00:48:21.460 But a person like Scott will almost instantly know if some idea is funny.
00:48:26.160 Right.
00:48:26.440 And that's very different between that.
00:48:28.620 And I think, you know, it, to me, it's, it's that there, there are certain things that are
00:48:35.340 kind of uniquely human to have intuition and to have that feeling and that embodiment of,
00:48:40.640 I can just instantly know when I hear an idea, whether it's funny or not.
00:48:44.880 And most people, maybe at least two thirds of us, according to Scott, have a sense of humor
00:48:49.200 and we can do that.
00:48:51.120 But the other part of it is that whenever Scott sits down to write a Dilbert, it's got
00:48:55.020 to be some joke that no one's ever told before.
00:48:57.800 Right.
00:48:58.100 He can't just do the same Dilbert over and over again.
00:49:01.000 AI could copy his Dilbert real easily.
00:49:03.160 But if you ask it to say, go create a joke that no one's ever told before, I would defy
00:49:09.340 you to try and get AI to do that.
00:49:11.200 It won't, it won't be able to.
00:49:13.620 Because all it knows how to do is.
00:49:15.060 Another difference between AI and us is that AI doesn't have any friends to share memes
00:49:20.560 with.
00:49:21.080 Okay.
00:49:21.700 That's the difference.
00:49:23.860 All the memes that I share and most of us, my Bert too, he's amazing with those memes.
00:49:29.480 It's the same things that we, as soon as we see it, and that happens to all of you, think
00:49:34.020 about it, right?
00:49:34.540 As soon as you see a meme, you think of the person that you're going to send it to.
00:49:38.920 You start imagining the smile on that, on your husband, your wife, your friend.
00:49:43.840 Oh my God.
00:49:44.560 I cannot wait until I send this to that person.
00:49:46.740 And on all the memes that I was posting on Locals, for example, it was because all you
00:49:51.400 guys were my friends.
00:49:52.200 I didn't have anybody else to send memes to.
00:49:54.740 If I had a bunch of other friends, I would be sending them to them.
00:49:57.440 But you guys were my, my circle of friends, right?
00:50:01.780 My, my thread on my phone.
00:50:04.300 I see a good meme and I share it right away.
00:50:06.240 I don't even think about it.
00:50:07.580 I'm not like, oh, let me think.
00:50:08.980 I'm going to, no, I'm just going to let it go.
00:50:11.520 So AI doesn't have that, you know, like that.
00:50:15.320 So we need to tell it, we need to become his friend.
00:50:17.940 So he knows, he wants to please us, you know?
00:50:21.020 I just want to interject one thing, sir.
00:50:24.960 Sergio, as, as typical, had a gold nugget in there.
00:50:30.840 And he said, AI democratizes things for us.
00:50:34.280 I've got a great example of this.
00:50:36.100 A buddy of mine in an unnamed state lives on a lake in a subdivision, and he teaches skeet
00:50:43.560 shooting in his backyard.
00:50:46.120 He's right on the lake.
00:50:47.300 And apparently the homeowners association complained and sent him a cease and desist letter.
00:50:52.160 And he took that letter and interacted with his favorite LLM, and lo and behold, it came back with multiple rulings.
00:51:05.660 Remember, our law is based on precedent. And he fired off a lovely email back to the homeowners association.
00:51:15.820 And that solved his problem.
00:51:19.540 He didn't have to actually get a lawyer.
00:51:21.500 I'm not saying LLMs should replace your lawyer and what have you.
00:51:24.680 But my point is, there are a lot of things where, you know, intuitively or maybe you've got a little bit of information, but boy, you're not trained in that domain.
00:51:34.040 You know, this just relaxes a lot of those barriers in a huge, huge way.
00:51:41.080 I've personally used it for cooking and for my own health journey and goals and so forth.
00:51:47.900 And it's just really, really a big, big deal, I think.
00:51:53.180 Oh, yeah.
00:51:54.520 Immense.
00:51:55.080 Do you guys want to switch topics?
00:51:59.680 I love, oh, wait, you guys, let's do like a little assignment again today.
00:52:04.160 You guys practice your AI memeing.
00:52:08.500 Tag Sergio again.
00:52:10.040 He loves it and we all see it.
00:52:11.660 So why don't we try today to create something that we think is like a desirable home or building and keep adding and improving upon it with your AI.
00:52:24.400 So like let it give you what you said, see how your prompts made it and then say, okay, like I like that, but change this and do that and post your, let's say homes.
00:52:34.100 Let's say like it could be crazy fantasy home, like on another planet if you want.
00:52:39.060 But let's try to see what AI can come up with via our prompts to make a cool ass house that you might want to live in wherever you want.
00:52:48.960 Okay, that's the challenge for today.
00:52:50.720 Tag Sergio, okay?
00:52:52.180 And by the way, the ladies are winning here.
00:52:55.740 I've seen all the memes.
00:52:57.640 I've seen that there's a little bit of an edge of women not caring about what anybody says.
00:53:03.520 There's a lot of guys that are going like, oh, you know, I'll do that later.
00:53:07.260 But a lot of women are taking action.
00:53:09.160 So guys, get rid of that embarrassment and just do it.
00:53:13.240 Yeah.
00:53:13.340 So I can run through some headlines real quickly.
00:53:16.840 I know we only have a few more minutes to the hour.
00:53:18.740 Please talk about Don Lemon.
00:53:21.640 I don't think I caught that one.
00:53:23.100 So if you want to talk about Don Lemon, then go ahead.
00:53:24.820 Okay, so real quick, I have to bring it to the forefront.
00:53:31.280 Don Lemon, he just wants attention.
00:53:36.840 And I mean, I guess Scott would say that's one of the main things to be persuasive, and he got it yesterday.
00:53:43.060 He, with the anti-ICE protesters, went into a church in St. Paul, Minnesota, while people were worshiping.
00:53:56.460 They didn't care that there was worship.
00:54:00.560 They didn't care.
00:54:01.140 It's not a public place.
00:54:03.180 And he actually made comments and was trying to interview, I believe, the pastor there.
00:54:13.240 And going back to Sergio's memes, he made a great meme of that.
00:54:18.380 But it was a heinous act, because he was saying, oh, the Constitution allows us to protest wherever we want.
00:54:28.780 In his mind, there's no indication that you can't protest here, or that now isn't the time to protest because there's worship.
00:54:42.100 And he's basically wrong legally.
00:54:44.580 And that's why for me as an attorney, I was like, no, that's not true.
00:54:49.060 And Megyn Kelly made a post: under the United States Constitution, through Supreme Court decisions,
00:54:55.440 there are time, place, and manner restrictions on where you can protest.
00:55:00.000 But on top of that, the U.S. Constitution only applies with regard to protesting in public places.
00:55:09.520 Yeah, and really just protesting government, right?
00:55:11.700 And they also have the separation of church and state, which really means the government needs to stay out of all the religious stuff.
00:55:17.480 So the assistant attorney general, Harmeet Dhillon, and the attorney general, Pam Bondi, are trying to do something about it under the FACE Act, which punishes intentional acts to try to stop worship.
00:55:36.160 Minnesota has become, how would I say, extremely on fire.
00:55:48.820 It's not all organic, and it's the winter of love in a way, and they're just trying to keep it going.
00:56:00.640 But I think it's going to backfire because Christian people aren't going to take it very well.
00:56:06.920 That's just my opinion.
00:56:07.860 Yeah, well, I think we're going to see more protests.
00:56:11.840 We're going to see more interactions between them and ICE, and I'm hoping it doesn't get violent.
00:56:16.200 But we do see stories coming up about it in terms of deploying National Guard.
00:56:22.560 Trump apparently is getting about 1,500 troops ready to go back into Minnesota.
00:56:27.360 And, you know, so we may see more conflict there.
00:56:31.500 But I think, I'm hoping it'll stay peaceful, and then it'll die down, and we'll see less of this over time.
00:56:38.740 But, you know, it certainly still seems like it's blowing up there.
00:56:43.200 And then around the world, we have Iran blowing up.
00:56:45.500 We have Ukraine still blowing up.
00:56:47.140 We have maybe some progress in Gaza where they're putting out this peace board and starting phase two.
00:56:54.780 And Trump just said that he wants a billion dollars from everybody that's going to be having a permanent seat on the Gaza peace board.
00:57:00.820 So he's monetizing it, which I thought was funny.
00:57:04.560 But maybe that's meant to help rebuild Gaza and, you know, make it all work.
00:57:08.560 So I'm sure it's not for him.
00:57:10.140 It's just for the cause.
00:57:11.840 But it does seem like he's using his creative mind to say, how can we fund this?
00:57:18.620 And I'm hoping we don't have another military conflict in Iran.
00:57:22.700 But I would have to say, based on the headlines I'm seeing, that it looks like we may.
00:57:27.300 It looks like all the signs are pointing to maybe something happening soon there.
00:57:31.760 So we've got a lot of things going on around the world.
00:57:34.960 But I think we do have some good news here at home.
00:57:37.360 There's a story about border crossings being at historic lows again.
00:57:41.520 So that seems to be continuing.
00:57:44.340 Housing payments apparently have dropped $260 a month under Trump.
00:57:48.880 So housing is becoming more affordable.
00:57:50.840 And so I think there's plenty of good news we can point to as well.
00:57:56.620 Thank you for some good news.
00:57:59.100 I wanted to ask Marcela on the legal.
00:58:02.040 Did Don Lemon break any law?
00:58:04.900 Did he break a law that we can, you know, jail him?
00:58:07.080 I mean, they are trying to investigate him and that's what was said.
00:58:16.180 And they're going to look into charges under the FACE Act.
00:58:19.660 He would fall, not just him, but the anti-ICE protesters would fall under that.
00:58:25.580 Under the FACE Act, it covers anything that intentionally interrupts a place of worship like this.
00:58:36.480 What is the FACE Act?
00:58:39.300 With an intention to threaten.
00:58:44.400 And what I saw there, I mean, it was sickening, the fact that there were children there.
00:58:53.320 And he kept saying, I mean, I guess this is why we're talking about him today, right?
00:59:02.240 So he got attention.
00:59:03.840 It's just, to me, there's, there's ways, right?
00:59:07.140 And he was saying about the children leaving.
00:59:09.140 And he's like, oh, they deserve it.
00:59:11.380 Or they, they, they need this in order to wake up or something like that.
00:59:16.880 And they, they need to see this.
00:59:19.240 And it wasn't.
00:59:21.140 So, yeah, he's being investigated.
00:59:23.780 I don't know if a grand jury will find charges.
00:59:28.880 And then at the end of the day, when this is questioned, as a lawyer, you're like, well, a jury would have to find him guilty.
00:59:39.800 And so would this anti-ICE protest. But they do need to look into who is paying for this, who is behind this.
00:59:49.480 And these are the people that need to be in jail.
00:59:54.700 And I think Scott would probably inject somewhere in here that Don Lemon might enjoy going to prison, if you know what I mean.
01:00:03.480 Oh, my gosh, you guys, it's the top of the hour.
01:00:11.760 And I like to just keep it, you know, under an hour, like Scott liked to try to do.
01:00:17.480 So I'm going to respect that.
01:00:18.360 And also Shelly and all of you guys.
01:00:21.860 So we'll be back tomorrow.
01:00:24.700 Owen, do you have closing thoughts for everybody?
01:00:28.140 No, I just, I appreciate everyone coming here and participating in this.
01:00:31.720 And if you have reframes you want to share, or if there are other people who have been a big part of the community for a long time, you know, reach out to Erica or me or somebody.
01:00:40.400 And we may be able to get some other voices on here as well.
01:00:44.320 Because it's all, this is all volunteer.
01:00:46.180 You know, we're just doing this because we want to keep it going.
01:00:48.740 And I know not everyone can do it every day, but if there is a day that you can do it, maybe you can step up and be part of this too.
01:00:55.720 So I appreciate everyone's help and let's just keep it all going.
01:01:00.800 Thanks everyone.
01:01:01.100 And even if you guys just want to come for the simultaneous sip and then you've got to jet, but you want to have a sip with everybody, come have a sip and go on to your day.
01:01:09.660 Like, again, we'll never be Scott, but we all have that in common.
01:01:14.900 How much we love, respect, miss, and appreciate him and his wisdom.
01:01:19.700 But man, there's so much of it out there.
01:01:22.060 So if you want to commune with, you know, Scott, Scott's friends and have a sip, please join us Monday through Friday right here.
01:01:30.780 And Shelly, thank you so much for providing this stream to us every day.
01:01:37.500 You make it possible.
01:01:38.940 And Marcella, Sergio, and Kev, thank you so much.
01:01:43.200 Owen, love you guys so much.
01:01:46.340 And we'll see you guys tomorrow and do your memes for Sergio of a dream house.
01:01:51.060 Okay.
01:01:51.400 All right.
01:01:53.400 Thanks everyone.
01:01:54.120 Bye guys.
01:01:55.240 Have a wonderful week.
01:01:57.300 Gonna leave the stream.
01:01:58.800 Bye.