Real Coffee with Scott Adams - February 06, 2026


Episode 3088 - The Scott Adams School 02/05/26


Episode Stats

Length: 47 minutes
Words per Minute: 168.0
Word Count: 7,971
Sentence Count: 604
Misogynist Sentences: 5
Hate Speech Sentences: 5


Summary

Join us in The Scott Adams School as we sip a cup of coffee and talk about the things that make everything better: coffee, the news, and a little bit of everything in between. Today's guest: Scott Adams.


Transcript

00:00:00.160 With the RBC Avion Visa, you can book any airline, any flight, any time.
00:00:06.500 So start ticking off your travel list.
00:00:08.960 Grand Canyon? Grand.
00:00:10.960 Great Barrier Reef? Great.
00:00:13.400 Galapagos? Galapago!
00:00:15.980 Switch and get up to 55,000 Avion points that never expire.
00:00:20.540 Your idea of never missing out happens here.
00:00:23.560 Conditions apply. Visit rbc.com/avion.
00:00:30.000 Everyone needs help with something.
00:00:33.140 If investing is your something, we get it.
00:00:35.860 Cooperators Financial Representatives are here to help.
00:00:38.420 With genuine advice that puts your needs first, we got you.
00:00:42.040 For all your holistic investment and life insurance advice needs, talk to us today.
00:00:46.620 Cooperators. Investing in your future, together.
00:00:51.260 Mutual funds are offered through Cooperators Financial Investment Services, Inc. to Canadian residents, except those in Quebec and the territories.
00:00:55.840 Segregated funds are administered by Cooperators Life Insurance Company.
00:00:58.020 Life insurance is underwritten by Cooperators Life Insurance Company.
00:01:00.240 Paige, Stephen, Calmo, Jared, Bookish.
00:01:06.180 Ooh, YouTube's looking hot today, too.
00:01:10.100 Good morning.
00:01:12.180 C.R. Fletcher.
00:01:14.200 All the usuals are here.
00:01:16.160 JWJB, Jean.
00:01:18.520 How's YouTube looking?
00:01:19.640 Everyone's looking good.
00:01:25.300 Kathy, Cece Cushing, Craig, Alicia, Goose.
00:01:33.280 Bookish.
00:01:34.880 How did everybody sleep last night?
00:01:37.200 Good?
00:01:38.560 I slept great.
00:01:39.160 I hope.
00:01:40.080 Oh, go.
00:01:41.020 Peek quiet, Owen.
00:01:43.980 You slept well, Owen.
00:01:45.220 I'm so jealous.
00:01:45.940 All right, you guys.
00:01:48.140 I think we're all in here, right?
00:01:50.060 Okay, so today's simultaneous sip.
00:01:52.600 First of all, welcome to the Scott Adams School.
00:01:55.740 I just want to say this was Scott, Coffee with Scott Adams.
00:01:59.820 It was Scott Adams' channel.
00:02:01.340 All of his amazing, beautiful content.
00:02:03.300 And now, as Scott had requested, we are the Scott Adams School.
00:02:10.480 It's just a fun, casual place to commune, to sip together, to talk about ideas, have some
00:02:19.260 maybe reframe lessons and micro-lessons and talk about the news sometimes.
00:02:24.540 And, you know, we could never replace Scott.
00:02:27.240 We could never say what he might say or how he would think about it.
00:02:31.780 But this is what Scott wanted.
00:02:34.160 And because we are all Scott's Debris, we're so happy to be here and thrilled that you guys
00:02:40.080 are here with us.
00:02:41.040 And the feedback's been amazing.
00:02:42.840 And we so appreciate it.
00:02:44.440 We're all doing the best we can.
00:02:46.480 So I just spun the wheel on YouTube today.
00:02:49.560 I have no idea what the simultaneous sip is going to be.
00:02:52.320 So we'll find out together.
00:02:54.480 And it doesn't matter because it's Scott.
00:02:56.840 Okay, so are we ready?
00:02:59.100 Make sure we have the volume.
00:03:00.500 Okay.
00:03:00.820 Don't want to miss a word.
00:03:02.480 And let's go.
00:03:03.980 It's up over 4% this morning.
00:03:07.540 Good.
00:03:09.000 That's what I'd like to say.
00:03:11.400 Come on in.
00:03:12.320 Grab a seat.
00:03:13.560 It is good to see you.
00:03:16.040 We're going to feel a little bit better today.
00:03:18.860 Oh, good.
00:03:20.400 Is everybody up for that?
00:03:21.900 Feeling better today?
00:03:24.160 All right.
00:03:24.680 Let me get your comments running here.
00:03:26.280 And then we've got some stuff to do.
00:03:28.520 Good morning, everybody, and welcome to the highlight of human civilization.
00:03:49.960 It's called Coffee with Scott Adams.
00:03:51.740 You've never had a better time.
00:03:52.700 But if you'd like to take a chance on elevating your experience up to levels that no one can even understand with their tiny, shiny human brains.
00:04:02.740 Well, if you want to do that, all you need is a cup or mug or a glass, a tankard, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:04:13.920 Fill it with your favorite liquid.
00:04:15.680 I like coffee.
00:04:16.380 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better, it's called, that's right, the simultaneous sip.
00:04:26.080 Go.
00:04:26.320 When the weather cools down, Golden Nugget Online Casino turns up the heat.
00:04:31.480 This winter, make any moment golden and play thousands of games like our new slot, Wolf It Up, and all the fan-favorite Huff and Puff games.
00:04:40.440 Whether you're curled up on the couch or taking five between snow shovels, play winter's hottest collection of slots, from brand new games to the classics you know and love.
00:04:51.340 You can also pull up your favorite table games like blackjack, roulette, and craps, or go for even more excitement with our library of live dealer games.
00:05:00.800 Download the Golden Nugget Online Casino app, and you've got everything you need to layer on the fun this winter.
00:05:07.400 In partnership with Golden Nugget Online Casino.
00:05:11.380 Gambling problem?
00:05:12.340 Call Connex Ontario at 1-866-531-2600.
00:05:17.540 19 and over.
00:05:18.600 Physically present in Ontario.
00:05:20.180 Eligibility restrictions apply.
00:05:21.940 See goldennuggetcasino.com for details.
00:05:24.860 Please play responsibly.
00:05:34.840 Unparalleled pleasure.
00:05:35.820 Best kind.
00:05:37.820 Best kind.
00:05:38.340 That's true.
00:05:38.980 That's awesome.
00:05:40.780 Thank you, Erica.
00:05:42.300 You're welcome.
00:05:44.260 So, again, I'm Erica, you guys.
00:05:47.340 And we have Marcella, beautiful Marcella in pink today.
00:05:52.140 All of her fans are waiting for her.
00:05:54.360 And we have our studious, sexy Sergio.
00:05:58.040 Looking good.
00:05:58.980 Looking good, Sergio.
00:06:00.380 Beanie is beanie-ing.
00:06:01.500 And Owen Gregorian, I don't want to mention Tapper anymore when talking about you.
00:06:07.820 It's not fair to you.
00:06:09.720 But Marcella, we want you guys to know, has a crush on this version of Owen here.
00:06:16.020 And it's hilarious.
00:06:17.500 So welcome to Owen Gregorian.
00:06:19.240 It's outrageous, you guys.
00:06:20.360 I can't take my eyes off of him.
00:06:24.040 I love that so much.
00:06:26.320 So, you guys, we're going to open.
00:06:29.620 Let me just check this out here.
00:06:31.780 I don't know how quick I can get to me.
00:06:33.700 But, so, last night, there was...
00:06:37.300 Erica, somebody asked Owen to change his shirt.
00:06:39.480 You just wanted to say that.
00:06:40.520 Mary Kay said that.
00:06:42.360 Okay.
00:06:43.580 Maybe take his shirt off.
00:06:45.040 Shirt off, maybe.
00:06:46.100 Okay.
00:06:46.480 Maybe take his shirt off?
00:06:48.460 Oh, my God.
00:06:51.640 Here we go.
00:06:52.360 No, no.
00:06:53.740 Okay.
00:06:54.140 Okay.
00:06:54.460 So, at the request of a couple of people, we just have to do a quick little bit of important
00:07:02.760 housekeeping.
00:07:03.220 So, I hope you guys are ready for that.
00:07:06.160 It's going to be informative and fun.
00:07:08.500 I wrote something this morning based on some things that have been going on about whether
00:07:15.080 or not Scott wanted an AI version of himself.
00:07:19.600 First, what you need to know is that Scott was strategic.
00:07:26.420 Okay.
00:07:26.700 He knows who the four of us are on this screen right here.
00:07:30.040 We all bring a different skill set, personality, vision, and he knows that all of us together
00:07:40.840 really cover all the facets of who he was, what he wanted.
00:07:46.680 And we're massively loyal like most of you are.
00:07:51.520 Okay.
00:07:52.240 So, we're never going to say something that Scott wanted something a certain way unless
00:07:56.940 we actually know.
00:07:58.260 We don't speak for him, but we can speak somewhat on his behalf.
00:08:03.740 And we're only going to tell you things that are true because we know they're true.
00:08:08.440 We work with Scott's family.
00:08:11.640 We work with Shelly.
00:08:13.360 And Shelly is listening today.
00:08:16.520 We talk all the time.
00:08:18.460 So, everything we're saying, you can also understand, is with the family's approval and Scott's
00:08:24.720 estate.
00:08:25.360 If you don't know Shelly, Shelly is Scott's best friend, happens to be his ex-wife.
00:08:33.960 And Shelly is the person Scott trusted the most on this planet.
00:08:38.640 Okay.
00:08:39.000 Um, and also his siblings.
00:08:41.940 Okay.
00:08:42.300 So, this is, this is like where we come from at, at the Scott Adams school.
00:08:47.240 Okay.
00:08:47.760 So, that being said, people questioning whether an AI Scott is okay.
00:08:56.100 My short answer is, it's not.
00:08:58.420 And I understand a lot of people, including myself, heard for years that Scott would say,
00:09:04.480 I give my full permission and consent to people make an AI version of me.
00:09:08.160 So, let's remember something.
00:09:11.200 Scott wasn't sick then.
00:09:13.240 He didn't think he was going to die at 68 years old.
00:09:17.300 I am positive he figured when he's in his 80s that maybe the AI would be where he'd want
00:09:24.080 it to be.
00:09:25.600 That being said, as long ago as January, 2025, all the way through this past year until his
00:09:34.460 last day, Scott changed his opinion about that and did not want an AI version of him.
00:09:41.640 Um, he didn't think the technology was at all where he'd want it to be.
00:09:46.220 He knows it can assign opinions and ideas and that it could be dangerous.
00:09:50.160 And he changed his mind also because his family didn't want that either, feeling that, you
00:09:57.080 know, it could be creepy.
00:09:58.140 So, I'm just going to read you a couple of quick things.
00:10:00.320 Okay.
00:10:00.600 This is what I wrote this morning.
00:10:02.000 I'm reading it at the request.
00:10:04.020 I said, good morning, gorgeous people, which you all are.
00:10:08.460 Last night, things got a bit heated about whether or not Scott wanted to be cloned with AI.
00:10:14.800 For many years, Scott thought it would be fun.
00:10:17.700 He figured he'd live at least into his 80s and he'd see AI be good enough to digitally
00:10:22.640 clone.
00:10:24.160 By the start of 2025, Scott fully changed his opinion on this for multiple reasons.
00:10:29.900 One, AI was nowhere near accomplishing what Scott thought it could.
00:10:35.300 And two, he realized it would feel creepy for the people who know and love him and didn't
00:10:40.080 want to do that.
00:10:41.220 Those two reasons alone should be enough for all of you.
00:10:45.720 They are Scott Adams' wishes until the day he died, period.
00:10:50.340 Those of us on Locals heard this many times in our extra hours with Scott.
00:10:55.640 There is no reason to challenge this.
00:10:58.000 It's not my decision.
00:10:59.900 It's not what you heard 20 times three years ago and prior.
00:11:03.840 It's not even his family's personal decision.
00:11:06.800 It is what Scott Adams decided.
00:11:10.440 Scott was also very clear about what would be acceptable one day and who he trusted to
00:11:17.320 fulfill that.
00:11:18.900 If or until you ever see it coming directly from Scott's account, please know what you are
00:11:25.780 fighting for in defending rogue AI accounts is going against Scott and disrespecting his
00:11:32.740 memory now that you know and read this.
00:11:35.820 I'm sure none of us want to do that.
00:11:38.780 Scott gave us thousands of hours of himself.
00:11:41.540 We are so lucky we can watch the real Scott Adams anytime we want.
00:11:47.180 We love and miss him terribly.
00:11:50.140 Thank you so much for your attention and understanding.
00:11:53.100 So, that was my statement this morning.
00:11:58.940 And I just want to follow up because another fiery person like me is Scott's brother, Dave, Dave Adams.
00:12:08.600 And he wrote a statement to be read to you today.
00:12:12.160 Okay.
00:12:13.660 Dave reads, my brother and I discussed an AI version of him.
00:12:18.380 I said, it would be awful for me to see and speak with something that sounded and appeared
00:12:23.780 to be him, but had no idea who I am.
00:12:27.460 Scott agreed that would be important to him as well.
00:12:30.880 I suggested a code phrase like Houdini had with his wife.
00:12:35.160 It would open a personal private knowledge base of the special relationship we shared.
00:12:40.560 My brother never intended, never would have approved an AI version of him that wasn't
00:12:46.700 authorized by himself or his estate.
00:12:50.520 For anyone who believes otherwise, I ask this question.
00:12:54.760 Do you truly believe Scott didn't know a rogue AI version of himself could be used for persuasion
00:13:02.120 and propaganda he would find abhorrent?
00:13:05.640 Dave Adams.
00:13:07.080 I will post Dave's statement for you guys after the show.
00:13:10.220 So, um, that should really clear up the questions about whether or not this is approved, wanted
00:13:21.120 or welcomed.
00:13:22.100 And if anyone, um, sees this happening, we just ask you to alert us because we just need
00:13:29.580 to know, you know, and we're not even singling out any one person.
00:13:32.820 This is straight across the board.
00:13:34.780 Okay.
00:13:35.320 So as president Trump would say, thank you for your attention to this matter.
00:13:40.620 And, um, I think now we're going to get into some news and have some fun.
00:13:45.840 Oh, but wait, Marcella in Scott's words.
00:13:49.000 Thank you for reminding me one more clip from Scott.
00:13:52.060 So this, this clip for everybody is from Jay Plemons.
00:13:56.120 So he got it from, um, the January 5th, 2025 episode 2711, and I'm going to play it from
00:14:06.580 Scott's own words.
00:14:07.860 So you can see it's, um, way before, um, he passed away.
00:14:14.440 So let me try to play it.
00:14:16.600 Give me one second.
00:14:19.640 A little update from me.
00:14:21.700 I've been telling you for years really that I plan to build an, uh, an AI robotic clone
00:14:28.660 of myself so that I could live forever.
00:14:31.500 I have changed that plan.
00:14:33.920 And here's why I thought it was just the greatest idea in the world that after I passed away,
00:14:41.140 there'd be like a version of me that could grow and, you know, change with technology
00:14:46.300 and be upgraded and stuff.
00:14:47.900 I thought that was great.
00:14:48.900 Do you know what happens when you mention that to people, you know, they look at you
00:14:54.700 with sadness and they go, it wouldn't be you.
00:14:59.080 Right.
00:15:00.200 And that's the part that was invisible to me because in my planning, I'm not really there
00:15:04.940 anymore.
00:15:05.920 So I don't have to deal with the fact that people would find it uncomfortable because
00:15:10.120 it's not me, but it's acting like me.
00:15:11.900 And I can imagine that that would hit that creepy zone and you'd be like, why did you
00:15:18.080 even do this?
00:15:19.140 Like, why did you do this?
00:15:20.420 It's not you.
00:15:22.200 So here's how I decided to fix that.
00:15:25.500 I'm not going to create a digital clone of myself.
00:15:29.540 I'm going to create a digital son.
00:15:31.520 I'm going to reproduce because my son in AI digital robotic form is not supposed to be
00:15:40.980 just like me.
00:15:42.620 It's supposed to be influenced by me, maybe have a lot of my characteristics in it, but
00:15:49.980 then it's supposed to find its own way and it's supposed to turn into its own person.
00:15:54.500 Yes, but when it's small, it will be very influenced by me when it's brand new and it will be designed
00:16:00.680 based on my personality, which would be like my DNA, but it would be very allowed to evolve.
00:16:07.280 So by the time the robot is a senior citizen, you know, maybe its robot body got changed
00:16:14.160 a few times and software got upgraded, but by some point it won't be me, but you might be
00:16:21.300 able to find some corresponding similarities, you'd say, oh yeah, the original, you know,
00:16:28.100 your, your organic father, um, had a lot of, had a lot of the same elements going on.
00:16:34.700 So that's the current plan.
00:16:36.560 Yep.
00:16:37.380 Digital son.
00:16:38.980 And you know what?
00:16:40.080 Like Dr. Von Hardy said, we are Scott's debris.
00:16:43.740 We're his son.
00:16:44.920 Okay.
00:16:45.240 Scott didn't have a chance to make that version of him, but he made a million of us.
00:16:52.160 So fulfilled.
00:16:54.700 The, the other thing I want to add is that in January of 2026, a year later, he did, uh,
00:17:02.220 indicate so many times to us, the beloveds in the pre-show post-show.
00:17:07.680 And then even on his own show that he had, um, I think he had changed his mind because
00:17:15.480 AI is not to that level.
00:17:17.980 It hallucinates.
00:17:18.960 And on top of that, his family would feel really strange about seeing him in an AI form.
00:17:25.600 And so he went over all of that.
00:17:27.940 And many people approached him that he told me about that had created a version of him.
00:17:33.680 And he always rejected that, uh, one, because he didn't control it.
00:17:38.080 And I think some of you remember that he tried to put all of his books and all of his, uh,
00:17:46.000 works into AI to regenerate an AI off him.
00:17:51.640 And it didn't work out.
00:17:53.240 So some of you remember that many don't because they weren't, it wasn't part of YouTube.
00:17:58.900 It was part of locals.
00:18:00.300 Um, and I just want to remind you of that.
00:18:05.320 Go ahead.
00:18:05.820 I'm so sorry.
00:18:07.160 No, I, I just wanted to add, you know, I, I, like a lot of us have had a lot of conversations
00:18:13.180 with Scott, both on the stream and privately.
00:18:16.580 And, um, you know, I certainly pay a lot of attention to everything Scott says.
00:18:21.660 And I, I think I've seen every stream that he's put out and to Marcello's point, he did
00:18:27.560 say very clearly, you know, I don't want this.
00:18:30.660 I don't want to have an AI clone of myself.
00:18:32.900 I don't think the technology is there, things along those lines.
00:18:36.100 And, uh, when we did that big tribute spaces to Scott, the one that was six and a half
00:18:42.460 hours, I think it was on January 3rd of 2026, um, someone brought this up and I gave my perspective
00:18:50.040 of what I thought Scott believed, you know, what, what I thought he, his take on it was
00:18:55.640 currently.
00:18:56.080 And I represented that.
00:18:57.520 And I, you know, I'm, I'm, I'm very careful all the time to say, I don't know what Scott
00:19:01.300 thinks, or I don't know what he would say, but I can give you my perspective of what I
00:19:06.180 heard from him and what I think it is.
00:19:08.060 And I did that.
00:19:09.360 And then Scott texted me after that.
00:19:12.680 And I posted that recently on my feed.
00:19:15.000 You can find it there where Scott literally said, you are exactly right on my current AI
00:19:19.480 Scott take.
00:19:20.440 So if you want to know what his take was, you can go listen to what I said on that long
00:19:24.980 spaces.
00:19:25.380 I don't know.
00:19:26.560 I haven't had time to go through it and clip it out, but, um, you know, it's probably a
00:19:30.440 good thing to listen to anyway, just to get that experience again.
00:19:33.420 And, um, but he was very clear that he did not think the technology was there and that
00:19:38.780 he wasn't wanting anybody to create an AI clone of him.
00:19:41.820 And so that has been very explicit and very clear.
00:19:45.560 And I would just ask everyone to respect it.
00:19:47.660 And I would also just say, like, I understand that probably everybody that's doing this, or
00:19:51.540 at least most of them are probably, probably have good intentions, right?
00:19:55.200 Like they, I don't think very many people are trying to say, I'm going to go ruin Scott
00:19:58.140 Adams' reputation.
00:19:58.860 That would be possible with this, but it, you know, it, I think most people are probably
00:20:03.860 thinking, well, I miss Scott and I want to have that coffee with Scott Adams every day.
00:20:08.100 And I want him to talk about the news every day and give his take on things.
00:20:11.120 And so I get it.
00:20:12.160 I understand where you're coming from.
00:20:14.420 Um, and you know, it is something that we all miss, but I think we do need to respect
00:20:20.180 Scott's wishes.
00:20:20.900 We need to respect his intellectual property.
00:20:23.220 His estate needs to stay in control of that and his identity.
00:20:26.900 And, and that's just how he wanted things to be.
00:20:29.500 And I think, you know, if you have respect for Scott, you should look up to that and respect
00:20:34.160 his wishes and not try and put out all this other stuff that, you know, might seem cool
00:20:39.960 to you, but really is, you know, potentially creating a lot of pain and a lot of harm to
00:20:44.580 other people.
00:20:46.700 So that's all I had.
00:20:48.880 Yeah.
00:20:49.320 Just let us know if you guys see it too.
00:20:51.320 And, and if anyone's questioning it, you can send them the beginning of this whole episode.
00:20:56.360 So it's, it's got all the meat right there.
00:20:59.800 You can just tell them, watch the first, how many minutes are we in first 20 minutes and
00:21:04.500 you'll understand.
00:21:07.340 Um, Erica, just quickly for me is the creepy part, you know, that, that I told you about
00:21:13.920 that before, that I wouldn't want to see an AI of my dad or my son.
00:21:18.180 Um, that's the thing, you know, for me, uh, if somebody sends me that, I will, I will throw
00:21:22.900 that phone away as far as possible, uh, because, um, I'm not blood family of Scott, but I do
00:21:30.720 see him as a, as my internet dad and more than that.
00:21:34.360 And, and, and that's for me.
00:21:36.180 And so everybody, you know, sometimes things are going to happen, but, but, uh, all I want
00:21:42.120 is for everybody to think about that only that, how it feels for a family member to see
00:21:45.920 that.
00:21:46.240 That's it.
00:21:46.900 Yeah.
00:21:47.060 Of course.
00:21:47.620 And understand too, Sergio, that, you know, the AI version, somebody's feeding
00:21:52.880 the thoughts to it.
00:21:54.540 It's not Scott, you know what I mean?
00:21:56.700 So, you know, technically like what Dave said in his last sentence that it could be used
00:22:03.040 for propaganda.
00:22:03.860 I mean, absolutely it could be.
00:22:05.980 So, you know, maybe you start off, yeah, you start off warm, like, oh, isn't this fun
00:22:10.480 and telling jokes.
00:22:11.640 And then it starts to give opinions about this and that next thing, you know, it's talking
00:22:15.500 about politics and war and conflicts.
00:22:17.900 And now you're literally getting brainwashed by this Scott Adams AI forgetting that someone's
00:22:25.440 controlling the puppet.
00:22:26.980 It's very dangerous.
00:22:28.400 You know, it, there is.
00:22:29.380 That would be the last thing that Scott would want to see, though.
00:22:33.540 He would be so furious to see that.
00:22:35.920 So thank you.
00:22:37.060 Thank you.
00:22:37.720 And especially with such a brilliant mind, right?
00:22:40.240 Nobody can duplicate the way Scott thought and the way he could, you know, give us a story
00:22:44.980 and another perspective.
00:22:47.320 So why, you know, go make an AI of somebody else, like a non-existing person, create one
00:22:54.020 if you want to do that.
00:22:55.240 But to take one of our most brilliant, thoughtful, caring, useful minds and think like, oh, now
00:23:02.780 I'll be its puppet master is just beyond.
00:23:07.340 It's just beyond.
00:23:08.120 And I was a little emotional last night and I might've booted some people out of the coffee
00:23:13.340 with Scott Adams community.
00:23:15.800 I'm sorry, but that's where we're at.
00:23:19.620 Okay.
00:23:20.220 I mean, like I did, I did persuade some people, I think with some of the things I posted.
00:23:25.060 So I appreciate having an open mind about this and listening to Scott's words and what
00:23:30.700 he wanted.
00:23:31.200 And I think a lot of people probably just missed that and remembered what he said earlier.
00:23:35.140 So I think a lot of this was innocent, but I think at this point, you know, it's all very
00:23:40.140 clear what Scott wanted.
00:23:41.640 Thank you for saying that, Owen, too, because yes, I would like to say it is a superpower and
00:23:49.060 a lot of you displayed it last night to see new information and then say, oh, I was wrong
00:23:56.540 or I didn't realize that.
00:23:57.900 Oh, I changed my mind.
00:23:59.040 You're right.
00:23:59.700 Whatever.
00:24:00.720 You people, amazing.
00:24:03.040 I was like, yes, yes, you get it.
00:24:06.080 To the other people that just insisted and thought that they, you know, should keep the
00:24:10.040 debate going and blah, blah, blah.
00:24:12.380 I, you know, you, at some point you just have to say, okay, you know, the people that are
00:24:17.320 closest to Scott are telling me something that I disagree with.
00:24:21.420 What should I do at this point?
00:24:23.760 I mean, you, you just have to trust the people that were closest to Scott.
00:24:27.200 That's all.
00:24:27.780 It's easy.
00:24:28.460 It's, you know, it's his wishes.
00:24:30.580 All right.
00:24:30.900 I think we get it.
00:24:32.040 I guess drop an emoji in the chats, you guys, if you fully understand this, because
00:24:36.020 it was so distressing for me personally last night.
00:24:40.180 I was so upset.
00:24:42.400 But I think, you know, drop an emoji, let me know.
00:24:45.820 So, and if you have any other questions about it, let us know now.
00:24:49.160 I wonder if there's any battery news today.
00:24:53.380 Oh, look, there is.
00:24:54.460 There's a story about sulfur accelerating the ion flow for lithium batteries or solid state
00:25:01.440 batteries.
00:25:01.900 It looks like the intent there is they can replace all the flammable components to make
00:25:07.620 it safer and make it more stable and overcome some of the, you know, the speed of how quickly
00:25:16.880 lithium ions can move through solids.
00:25:18.680 So, it's improving performance, too.
00:25:21.620 The quote from the story is, our goal is to replace all those flammable components so the
00:25:25.800 battery becomes much safer by removing the liquid electrolyte and redesigning the solid
00:25:29.580 materials inside the battery.
00:25:30.840 We can reduce the risk of overheating short circuits and fires while also improving performance.
00:25:35.380 And it's using a sulfur-zirconium interaction.
00:25:41.240 And they're starting out with the little coin cell prototypes, but it does look like it has
00:25:46.080 potential to be used in EVs and grid storage and electronics.
00:25:49.420 So, it could be a significant thing.
00:25:51.460 And I think, as I recall, Scott would often say, you know, this one might not turn into anything,
00:25:55.980 but there's a new story every day and some of these are going to be big.
00:25:59.260 And I'm still holding on to Scott's prediction that he said 2026 is kind of going to be the year
00:26:05.000 of the battery and it's going to change everything.
00:26:08.980 What do you think, Erica?
00:26:10.440 Investing is all about the future.
00:26:12.360 So, what do you think is going to happen?
00:26:14.380 Bitcoin is sort of inevitable at this point.
00:26:16.560 I think it would come down to precious metals.
00:26:19.520 I hope we don't go cashless.
00:26:21.640 I would say land is a safe investment.
00:26:24.120 Technology, companies.
00:26:25.280 Solar energy.
00:26:26.260 Robotic pollinators might be a thing.
00:26:28.900 A wrestler to face a robot.
00:26:30.600 That will have to happen.
00:26:31.840 So, whatever you think is going to happen in the future, you can invest in it at Wealthsimple.
00:26:37.920 Start now at Wealthsimple.com.
00:26:41.660 I mean, the year of the battery.
00:26:44.060 I'll take it over the last year.
00:26:48.420 I don't know.
00:26:49.520 I mean, I say maybe, just like Scott would say.
00:26:52.740 And I think that if battery technology doesn't improve, we're going to hit a wall.
00:26:57.320 So, I want to be hopeful about any changes.
00:27:01.800 And also, I'm like, can we find a new way to make a battery? That would be good, too.
00:27:06.920 I know nothing about batteries, you guys, except like, you know, I put them in things to turn them on.
00:27:11.820 Oh, I can hear the jokes.
00:27:14.200 But anyway, that's all I want to say on batteries.
00:27:18.100 Thank you.
00:27:18.640 Good night.
00:27:19.800 Yeah.
00:27:20.260 Well, I do think there's a whole lot of research going into this.
00:27:23.020 And the latest stories include how we're also going after the rare earth minerals to get rid of China's monopoly on that and reduce the risk of the supply chain around that.
00:27:32.680 So, I see a lot of progress there.
00:27:33.980 And I think it's going to probably drive even more research to come up with, you know, new designs, new methods for creating it.
00:27:41.580 And so, it does seem like it's all moving in that direction.
00:27:44.420 But we'll have to watch and see how it plays out.
00:27:48.940 All right.
00:27:50.060 Well, in psychology news, there was an experiment with 1,600 volunteers linking social exclusion to higher interest in gossip.
00:28:00.620 So, apparently, if you're socially excluded, you get more interested in celebrity gossip.
00:28:06.020 And it helps people feel more connected to higher status figures.
00:28:10.500 One interesting point I saw was that the socially excluded chose gossip 39% of the time, versus 23% for those made to feel unintelligent.
00:28:20.560 So, it seemed like being socially excluded was a much bigger driver than just being a dumb person.
00:28:26.520 And the quote from the story is, you know, he kind of explains, like, think back to our reptilian brain and rewind maybe 2,000 years.
00:28:37.240 Your chances of survival are higher if you're closer to the chief, if you're higher up in the hierarchy.
00:28:41.800 So, how does our brain know if you're closer to somebody higher up?
00:28:44.020 One of the ways in which it figures this out is by asking itself, do I know much about the chief?
00:28:48.100 So, apparently, if you're feeling socially excluded, you can fill that gap with gossip.
00:28:58.040 I think that explains a lot of what we see.
00:28:59.840 Yeah.
00:29:01.500 Gossip is poison, just like alcohol.
00:29:05.480 How does gossip make them feel included?
00:29:11.560 I don't know if it does make them feel included.
00:29:14.140 I think it makes them feel more connected to those celebrities, which are kind of like authority figures.
00:29:19.040 So, it might just make them feel higher status, you know, if they can feel like they know more about these celebrities than other people, that they might feel better informed and, you know, more authoritative somehow.
00:29:32.060 Maybe they don't have that banter with friends.
00:29:34.120 So, they're like, oh, I have this banter.
00:29:36.520 You know, I don't have a friend group to banter with.
00:29:39.820 So, I can, you know, talk about celebrity gossip.
00:29:42.400 Maybe, and I'm speculating, their lives don't feel interesting, or maybe they're not interesting.
00:29:48.360 I don't know.
00:29:49.140 But the lives of other people we are consumed by.
00:29:52.620 I like that take.
00:29:55.760 Tell us.
00:29:56.480 Well, I think I like that take you had about people not having somebody to care for, you know, and it's like a cat.
00:30:04.860 You know, you get a cat, you get a celebrity, and you care for them, and, you know, that's why any celebrity, it doesn't matter how terrible we think they are, there's people out there that love them very much, right?
00:30:19.140 That they really care for them, and they watch everything they do, and it feels good.
00:30:23.760 You know, like, I watch my cat right here, and like, I want to, I look him up on my Blink, right?
00:30:29.440 So, if you don't have a cat, yeah, get a celebrity, I guess, you know, I like that.
00:30:33.640 Well, you could care for the celebrity, or you could hate that celebrity, and the gossip can go either way.
00:30:39.220 Trump.
00:30:39.940 Trump is a celebrity to punch, right?
00:30:43.180 It gets you going, it gives you energy, it gets you on the streets, you know?
00:30:47.720 All those protests are great, because everybody's out there getting some exercise, you know, getting the energy out, and, you know, expressing themselves in America.
00:30:58.540 Yeah.
00:30:59.020 As long as they don't, you know, block the cops, and respect, you know, people, it's fine, but the law, nobody's above the law, and then we're going to be fine, so.
00:31:11.740 I wonder if we'll have any, um, protester marriages come out of this, like, people who met, you know, throwing bricks at ICE and stuff, and they fall in love, and, like, you could have a new reality show.
00:31:22.760 There was, um, in Portland, you know, Nick Sorter, um, he was around these protesters, and he interviewed a couple, and they were like, we met last time we protested, and he just, like, led them on to speak, and they were just so, oh my God, they're so woke, but, you know, it's just,
00:31:51.880 I mean, if you're single and don't have a job, and you, like, hate Trump, man, it's like an amazing club of people that all hate the same person, right?
00:32:05.760 It's amazing, I mean, I wish I had somebody like that, that I could go and meet somebody like that, you know, that's why one of those, uh, Scott local meetups that Jimmy's making, because then you have something in common, like, oh yeah, we all love Scott, let's have some coffee, right?
00:32:21.280 So, um, yeah, I mean, there's been hundreds of stories about the importance of having a lot of close social connections, and how it makes you healthier to have friends, so I think it's just a basic human need, and, um, one of the other parts that I would add here is that they point out in the story that marketers could use this for isolated groups, so it's also something to watch out for.
00:32:42.680 If you're feeling isolated, you might, they may be able to take advantage of that by using gossip to attract you, and to market things to you, and sell things to you, and so it's something to watch out for from a persuasion perspective.
00:32:55.320 I think it also ties right into Cialdini's book, Influence, where he points out that authority figures are a very effective tool for persuasion, and that they don't have to be real authority figures.
00:33:08.220 Like, the primary example he uses is, I forget which one, but an actor that played a doctor, and then they would use him in commercials in his doctor's coat to sell stuff, and people would respond to it.
00:33:21.000 It was very effective, and even though everyone knew he wasn't a real doctor, they still gave him that authority just by watching him in the lab coat, and, you know, pretending, essentially, that he was a doctor, and so it is something to watch out for if you're trying to just protect yourself from being persuaded in a nefarious way, or, I guess, if you want to persuade people, you can maybe find ways to use this to your advantage, but, um, you know, it is one of the small principles.
00:33:48.880 Yeah, yeah, yeah, maybe I can put my avatar in a lab coat.
00:33:54.140 Do not put a lab coat in the cell phone.
00:33:56.660 Oh, no.
00:33:59.800 Don't do it.
00:34:01.140 That's interesting.
00:34:02.500 What, like, tell me something I need to know when I go out into the world today, and somebody brings up the news.
00:34:08.160 What do I need to know?
00:34:11.200 You want to go into the political news?
00:34:12.920 Is that what you're saying?
00:34:13.680 Because I had a good one for the ladies coming up next.
00:34:16.000 Oh, please.
00:34:17.280 It is so good.
00:34:18.880 Estrogen can make stress memories worse.
00:34:22.240 Oh, thanks.
00:34:23.440 High estrogen in the brain.
00:34:28.000 Worsens long-term memory issues from repeated stress.
00:34:31.540 It may explain higher PTSD risk in women.
00:34:35.560 Multiple stressors caused fear and memory problems lasting weeks.
00:34:38.220 A single stressor did not, but high estrogen phases amplified the effects.
00:34:43.220 And so that, I think that explains why there's so many crazy women out there, frankly.
00:34:49.480 Oh, that's, thank you.
00:34:51.200 And get too stressed out over things.
00:34:54.500 And I think some of it's right back to their hormones.
00:34:56.520 That if they're in a high estrogen phase, you know, what you're seeing in their behavior might actually just be the estrogen.
00:35:03.040 All jokes aside, I'm sure it does contribute a lot because I'll tell you what, you men, you just don't know.
00:35:11.220 But certain times you feel like you could just commit hara-kiri.
00:35:17.020 Like, you are just like, I know I'm insane right now.
00:35:20.340 I know what I'm saying.
00:35:21.400 I don't mean, but I can't stop it from coming out.
00:35:24.580 It's just like, meh.
00:35:26.280 So if you've never experienced that, you are lucky.
00:35:29.500 But also, I think there could be definitely a correlation to that, for sure.
00:35:37.520 I think we need a reframe for that before everybody goes crazy later today.
00:35:42.580 Do it, Sergio.
00:35:43.940 All that medical talk, the estrogen and going crazy, if I were one of them, it would already be in my head.
00:35:51.800 So we need to come up with a good reframe right now to escape from that.
00:35:56.400 And I don't have a good one, so I'm just going to leave us hanging.
00:36:02.960 I'll leave you hanging.
00:36:03.780 I don't know.
00:36:04.080 Maybe check out my feed later.
00:36:06.500 I don't know if it's a reframe, but maybe you could just say if you're feeling stressed out, it might just be a good idea to step away from the phone and not expose yourself to multiple stresses because it could have bad impact on you.
00:36:19.500 I don't know.
00:36:19.880 Turn off the glitter.
00:36:20.680 Scott could probably come up with a better way of phrasing that.
00:36:25.320 But, you know, if you could get away from the phone, maybe go see some nature.
00:36:29.880 Go watch Melania.
00:36:31.280 It might feel a lot better.
00:36:33.060 Yeah, go watch Melania.
00:36:34.220 Apparently, that's a big hit in the theaters.
00:36:35.780 That was one of the stories I posted a few days ago, that it's like breaking records.
00:36:39.220 I think it's the most watched documentary in something like 14 years.
00:36:43.100 Wow.
00:36:43.620 Good for her.
00:36:44.080 So, it's doing great in the theaters and doing a lot better than people expected.
00:36:49.040 I think all the leftists were predicting it was going to crash and burn, and they were 100% wrong.
00:36:54.060 I haven't been to the movies since Maverick, and the only reason I want to go back is for Melania.
00:36:59.640 That's it.
00:37:00.040 Aw.
00:37:01.180 Yeah.
00:37:02.280 All right.
00:37:02.760 Great crush.
00:37:03.580 Well, if you want to get into the political news, we have more going on with Russia and Ukraine, although it's really more of the same.
00:37:11.440 I think they're doing trilateral peace talks, so that's with Russia, Ukraine, and the United States.
00:37:17.100 At the same time, Russia has picked up their attacks again, so there's power outages everywhere.
00:37:21.520 And they, you know, had an attack on Tuesday that was the biggest since the beginning of the year, with 70 missiles and hundreds of drones attacking their energy.
00:37:32.180 So, it does seem like it's still a drone-and-energy war, which Scott was predicting for a long time would happen someday, and here we are.
00:37:40.760 But it does seem like it's, they're going after the grid, they're going after the energy.
00:37:46.320 I'm sure that's partly tied into the drones, because if you don't have electricity, you can't charge your drones.
00:37:51.900 But I think it's also impacting a lot of people.
00:37:54.200 So, there, you know, there's a lot of people that are really cold right now.
00:37:56.500 They don't have power.
00:37:57.180 So, I do feel for the people in Ukraine, especially.
00:38:02.460 Ukrainians are calling it a winter genocide.
00:38:04.580 It's like negative 20 degrees Celsius in Kiev, so that's a bad situation.
00:38:10.360 But they are around the negotiating table for another round of trilateral talks.
00:38:15.740 You know, they seem to be thinking it's constructive, but I think they still have a lot to work out.
00:38:19.940 I think most of it still goes around the territory and whether Zelensky will give up that eastern part.
00:38:29.320 Zelensky said, either Russia now believes that there are four incomplete days in a week instead of seven, or they are really betting only on war and waiting for the coldest days of the winter.
00:38:36.840 We believe this Russian strike clearly violates what the American side discussed, and there must be consequences.
00:38:43.000 So, it doesn't seem like we're at the end at this point, but they are working on it.
00:38:49.180 So, hopefully, we'll see some progress there.
00:38:51.980 Owen, is the EU more or less dependent on energy from Russia right now?
00:39:00.000 Well, I think they're probably a little less than they have been, or at least they're moving in that direction.
00:39:04.900 But I think they're in a bad situation, because they, like Germany, for example, destroyed their nuclear plants.
00:39:11.880 Like, they didn't just shut them down.
00:39:13.280 They actually blew up the cooling towers.
00:39:15.260 So, now they have a huge gap in energy capacity.
00:39:19.240 And I think the UK, you know, is a little further away from there.
00:39:22.940 But they went with their whole net zero thing, and they kind of shot themselves in the foot energy-wise.
00:39:28.180 I think they're one of the most expensive countries for electricity now.
00:39:30.780 It's something like 37 cents a kilowatt hour, which is apparently really high.
00:39:35.280 So, I think the EU and Europe are, you know, needing to fill that gap somehow.
00:39:41.320 And so, I think they are turning to the United States.
00:39:44.300 That's the pattern of the stories that I've seen, is that they're bringing over a bunch of liquefied natural gas and things,
00:39:50.440 and they're trying to get it from the United States so they don't have to get it from Russia.
00:39:53.420 But that's not necessarily a practical or sustainable solution, because, you know, Russia has this pipeline,
00:40:01.160 even though they did destroy the Nord Stream pipeline, but they have some pipelines going into Europe.
00:40:06.600 So, it's a lot cheaper to get the energy that way than it is to ship it on tankers and do all the conversions for liquefied natural gas.
00:40:12.940 So, it's a big issue.
00:40:14.900 And I'm hoping we can just get it resolved where it's not such a big issue.
00:40:19.280 But right now, I think we are still in the mode of trying to get Europe to wean themselves off of Russian energy.
00:40:26.560 Some parts of Europe are pushing back on that because they're so dependent on it.
00:40:30.440 But they're trying to cut that so that they aren't funding the war effort.
00:40:36.040 So, going to war with a country that you depend on for energy is not a good idea.
00:40:41.700 No, it's not.
00:40:42.540 Well, I mean, in their defense, they didn't choose this.
00:40:46.180 I mean, Russia initiated it.
00:40:49.300 Yeah, but I mean, in preparation for it, don't start cutting off all your energy supplies, right?
00:40:55.560 Like, you're already cutting yourself off completely.
00:40:58.540 Yeah, I mean, it's a complicated situation just for the same reason that we're kind of enemies with China.
00:41:03.040 But we do so much trade with them and we depend on them for so much that we have to be very careful about that.
00:41:08.860 And I'm sure that is part of Trump's calculus when he decides what to do about China.
00:41:13.540 And, you know, he's been saying a lot of positive things about China lately.
00:41:16.580 He had this big, long conversation with Xi recently and says there's going to be all this great stuff coming out of that.
00:41:22.740 And at the same time, you know, we just talked with Steve about how we need to get rid of all these Chinese students in our universities that are stealing all our information and, you know, all our ideas.
00:41:32.400 And, you know, in my own mind, I at least I just think, OK, Trump is considering all this stuff, right?
00:41:40.000 He can't he can't be too hard on China too fast.
00:41:44.740 Like I do think he's moving in the direction of reducing our dependency on the rare earth minerals and, you know, trying to get manufacturing back to the United States.
00:41:52.620 So I think he's moving in the right direction with things, but I think he realizes that it can't happen overnight and that we still are dependent on them for a lot of things.
00:42:00.480 And so we can't just say, OK, we're cutting you off completely because then they'll cut us off completely and that would hurt everybody and it would be horrible.
00:42:07.520 So I think I do have some understanding of what might have led Trump to say, I want to make things better in our relations with China, at least in the short term.
00:42:18.780 But I do hope that we keep reducing our dependency on them so that we don't have that as a major concern.
00:42:25.220 And I think that trend will continue.
00:42:28.820 I want to give a quick shout out to the people on YouTube.
00:42:31.640 I'm chatting with them.
00:42:33.720 We're not ignoring you. We listen.
00:42:35.560 We love you guys, honestly.
00:42:37.520 I'm not good yet at like doing both, but we are watching you.
00:42:41.980 And look, Annie, you're kicking butt over here.
00:42:44.480 I just want to give them a quick shout out, you guys.
00:42:46.360 So we love the YouTube people.
00:42:49.560 Thank you for being here.
00:42:55.240 Yeah, Sergio just said, we see you.
00:42:55.240 Shout out to YouTube.
00:42:56.420 Let's go.
00:42:58.880 OK, I'm back.
00:43:00.560 Yeah.
00:43:01.300 So moving on to immigration.
00:43:05.580 We have a story that a New York county walked back their sanctuary proposal after they got some backlash.
00:43:11.820 Rockland County tabled a Democratic Safety and Dignity for All Act that was going to limit ICE cooperation.
00:43:20.760 And looks like they got a lot of backlash and they decided to just put that aside.
00:43:24.580 So that's a good sign.
00:43:26.940 Looks like maybe at least parts of New York are going to be cooperating.
00:43:31.140 At the same time, we have Minneapolis.
00:43:33.420 There's apparently a flyer that everybody's posting or some people are posting around there that's telling everybody to sabotage ICE and telling them they can get away with it.
00:43:40.260 They're telling them to over tighten lug nuts and strip spark plugs and lose papers and give them the wrong food and give them decaf coffee, which is probably the worst.
00:43:49.480 And there's claiming they'll get away with it.
00:43:53.360 So that's probably the worst part is they're promising that you won't have any consequences if you do all these bad things.
00:43:59.780 And clearly that's not true.
00:44:02.000 Clearly, they're setting these people up to potentially have legal consequences and criminal charges.
00:44:06.260 So a very bad idea, but it seems like a big propaganda effort is underway.
00:44:13.880 So watch out for that.
00:44:16.460 They apparently attacked a local Fox journalist and destroyed their equipment.
00:44:21.140 And there were riot police standing like five feet away.
00:44:23.720 So that's pretty outrageous.
00:44:25.900 I'm not really sure why the riot police didn't step in and protect the reporter.
00:44:30.040 But they cut his cables and were, you know, just destroying all of his equipment.
00:44:36.820 So, yeah, there were a bunch of officers just standing right nearby that just apparently didn't do anything.
00:44:44.600 And then another piece of this, I would say, is like this adds some context to what's going on in Minneapolis.
00:44:52.080 Tom Homan announced that he's taking 700 agents away from Minnesota,
00:44:56.000 which by itself, if you take away all the context, sounds horrible.
00:45:01.020 Sounds like they gave up, right?
00:45:02.360 And that's how the leftists are trying to spin it.
00:45:04.260 Like, hey, we won.
00:45:05.460 They lost.
00:45:06.180 We got them out.
00:45:07.220 But Tom Homan clarified that the reason he's doing that is because they are cooperating now.
00:45:15.640 That a lot of the Minneapolis authorities essentially agreed that they're going to work with ICE
00:45:20.820 and they're going to cooperate on the ICE detainers
00:45:23.660 and they're going to have a lot more local coordination.
00:45:26.260 So they don't need as many officers because the local authorities are going to cooperate.
00:45:31.220 So I think that's one of these two movies on one screen things that depending on the side of the story you pay attention to,
00:45:37.140 you can say, oh, Trump lost or Trump won.
00:45:39.960 And certainly I have my point of view that I think, you know,
00:45:43.640 Tom Homan isn't the type that would just back down and retreat.
00:45:46.800 So I pretty much put my credibility on that side.
00:45:50.440 But I can understand how they would just cherry pick the information on the other side and say,
00:45:55.200 oh, look, we got rid of them.
00:45:58.400 So those are my stories on immigration.
00:46:01.120 What do you guys think?
00:46:01.660 Are they really cooperating?
00:46:03.840 Or are we caving?
00:46:06.260 I mean, I guess we'll have to see.
00:46:07.580 I think my guess is that the news story is probably right, that they made some agreement to cooperate
00:46:12.700 and will follow through with it.
00:46:15.340 Yeah, I don't feel optimistic, but I would love to be wrong.
00:46:19.560 You know, on those 700, the retreat, right?
00:46:23.520 You're saying that they're trying to frame as a victory for them.
00:46:29.080 That's a great thing, actually.
00:46:30.260 That's an awesome thing because that's Sun Tzu's way of giving the enemy a way out.
00:46:37.560 And in a way, they're enjoying this.
00:46:41.720 And, you know, it's okay to have a little victory.
00:46:43.840 I like that Trump is being flexible with this, you know,
00:46:49.180 always looking at all the variables and making adjustments and, you know,
00:46:53.800 applying too much force, less force.
00:46:55.720 It's interesting to see that.
00:46:57.060 And they don't have the visuals.
00:46:59.520 We don't have any visuals of those 700, you know, cops moving away, right?
00:47:04.200 We have a lot of visuals of people getting, you know,
00:47:08.760 a lot of aggression from both sides, right?
00:47:12.660 And that's been helping them a lot.
00:47:14.560 So that's been creating a lot of good photo opportunities,
00:47:18.440 photo shots for the left, right?
00:47:21.460 So it's good that they're taking that away.
00:47:23.640 And I think it might be a win-win in some cases.
00:47:25.880 I don't know.