Real Coffee with Scott Adams - April 10, 2026


The Scott Adams School - 04⧸09⧸26 BRIAN ROEMMELE Joins The Home Team


Episode Stats

Length: 39 minutes

Words per minute: 161.5

Word count: 6,304

Sentence count: 363

Hate speech: 6 sentences flagged

Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of The Sip, Marcella and Owen are joined by Brian Roemmele, a music professor at the Berklee College of Music in Boston. They talk about the importance of public service announcements and how important it is to be useful.

Transcript

Transcript generated with Whisper (turbo).
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 square knows that in hospitality efficiency is everything that's why their system lets you take
00:00:07.980 payments track sales handle inventory manage staff send invoices and keep up with finances
00:00:13.340 all in one place apply through orders with zero mistakes get the data you need and keep everything
00:00:19.200 working together so you're ready for whatever's next learn more about their customizable plans
00:00:24.240 Visit vw.ca to learn more.
00:00:54.240 VW, German-engineered for all.
00:00:57.780 I hope Lang is first.
00:01:00.580 He is.
00:01:01.560 God bless.
00:01:03.100 Okay, good.
00:01:03.620 Good morning, everybody.
00:01:05.980 Welcome, guys.
00:01:08.480 So if you're just coming in, we have our favorite guest professor today, Brian Roemmele or Romilly or Romelli or however.
00:01:20.060 He said however you want to say it.
00:01:21.520 Well, however you want to say it.
00:01:22.760 He answers to it all.
00:01:24.240 we'll just call him brian for now but so brian's a music guy and i was telling him about our time
00:01:30.260 for the sip groovy intro so we're going to play that just so brian can hear it and you guys filter
00:01:34.980 in and then we'll get moving how would you like to take it up to a level that you've never seen
00:01:46.060 All you need for that is a cup or a mug or a glass.
00:01:51.700 So take your chalice aside.
00:01:53.340 You can't get a jug of glass.
00:01:55.160 A vessel of any kind.
00:01:56.860 It's time for Scott Adams school.
00:01:59.060 Be useful is the primary rule.
00:02:02.080 Owen is here with the news.
00:02:04.600 Marcella brings the legal views.
00:02:07.540 Erica, she runs the show.
00:02:10.360 Who's the guest?
00:02:11.340 We want to know.
00:02:13.340 Got your glass, your mug, your cup
00:02:17.240 Pour your coffee
00:02:19.280 I like coffee
00:02:19.920 Visit BetMGM Casino and check out the newest exclusive
00:02:25.000 The Price is Right Fortune Pick
00:02:27.040 BetMGM and GameSense remind you to play responsibly
00:02:30.100 19 plus to wager
00:02:31.300 Ontario only
00:02:32.500 Please play responsibly
00:02:33.680 If you have questions or concerns about your gambling or someone close to you
00:02:36.920 Please contact ConnexOntario at 1-866-531-2600
00:02:41.680 To speak to an advisor
00:02:42.840 free of charge. BetMGM operates pursuant to an operating agreement with iGaming Ontario.
00:02:49.420 When the weather cools down, Golden Nugget Online Casino turns up the heat. This winter,
00:02:55.860 make any moment golden and play thousands of games like our new slot Wolf It Up and all the
00:03:02.020 fan-favorite Huff and Puff games. Whether you're curled up on the couch or taking five between
00:03:07.660 snow shovels play winner's hottest collection of slots from brand new games to the classics you
00:03:13.440 know and love you can also pull up your favorite table games like blackjack roulette and craps or
00:03:19.800 go for even more excitement with our library of live dealer games download the golden nugget
00:03:25.680 online casino app and you've got everything you need to layer on the fun this winter in partnership
00:03:32.080 with Golden Nugget Online Casino.
00:03:34.980 Gambling problem?
00:03:35.940 Call ConnexOntario at 1-866-531-2600.
00:03:41.140 19 and over.
00:03:42.200 Physically present in Ontario.
00:03:43.780 Eligibility restrictions apply.
00:03:45.540 See goldennuggetcasino.com for details.
00:03:48.440 Please play responsibly.
00:03:49.720 Fill it up.
00:03:52.360 It's time for the sip.
00:03:54.780 It's time for the sip.
00:03:57.360 It's time for the sip.
00:04:00.560 It's time for the sip.
00:04:01.660 It's time for Scott Adams School.
00:04:11.940 I think it's all working today.
00:04:15.280 Amazing.
00:04:16.500 Love it.
00:04:17.520 Isn't that the best group?
00:04:19.080 Little Philadelphia soul, little Curtis Mayfield there.
00:04:22.120 I love it.
00:04:22.520 It's amazing.
00:04:24.040 All right, you guys, let's get moving.
00:04:26.260 And if you'd like to enjoy
00:04:27.980 Simultaneously sipping
00:04:30.560 It's one of the great bonding
00:04:33.100 Experience of your life
00:04:34.460 All you need is a cup or a mug
00:04:37.400 Or a glass, a tank or chalice or stein
00:04:39.140 A canteen jug or flask
00:04:40.900 A vessel of any kind
00:04:42.720 Fill it with your favorite liquid
00:04:44.360 I like coffee
00:04:45.120 And join me now for the unparalleled pleasure
00:04:47.940 The dopamine of the day
00:04:49.340 The thing that makes everything better
00:04:51.180 With a simultaneous sip
00:04:53.220 Go
00:04:56.260 oh yeah oh yeah yeah oh yeah yeah yeah that's the way to do it welcome to the scott adams school
00:05:10.600 it is oh my gosh is it thursday already it is if it's thursday it's brian roemmele day it is april
00:05:17.980 9th, 2026. I'm here with my amazing co-hosts, Marcella and Owen. And Brian, we just want to
00:05:27.180 welcome you back to this school. How are you? I'm doing wonderful. In the background of all
00:05:33.840 the world events, I think we're doing good. Excellent. That's what we want to hear for
00:05:38.220 sure. So we were just talking, as Marcella calls it, in the green room. And just one thing I wanted
00:05:45.120 to put out there right away because this is going to be if anybody can relate to this or
00:05:52.000 know someone who may and you could tag a person or share this live stream with them because brian
00:05:59.120 has a public service announcement regarding how we are going to train feed and educate ai
00:06:09.280 and it's very important and i think we've talked about this before where i've said like you know
00:06:13.760 save the old encyclopedias or save the old newspapers because if everything goes away
00:06:19.600 where are we getting our information from so this this pertains to for those of you who are old
00:06:24.480 enough microfiche from federal institutions and i'm gonna just let brian if you don't mind like
00:06:31.520 showing them and maybe explaining because it is really important for moving forward well thank
00:06:38.000 you erica and and wonderful to be here guys i really appreciate it love you guys so much all
00:06:44.480 right um to train ai you need to have the widest breadth of data possible and you're training ai
00:06:53.440 not just simply on the content of the data like you know who what where when and why and how
00:07:01.920 but you're training it on the context and the situations of how words follow each other
00:07:09.680 large language models are next word predictors that's why some people call it smart spelling
00:07:15.520 correctors um and some truth to that but not really because so are we i mean we are next word
00:07:22.560 predictors to a certain level but we operate in a non-word environment right hemisphere
00:07:28.720 our concepts so we're really next concept predictors which is the next ai by the way
00:07:33.520 i just gave you a secret on where this is all going next content concept but don't want to
00:07:38.960 digress i'm going to try to stay really tight on this um we are running out of data from the online
00:07:47.600 world the internet and we assume unfortunately that 99% of everything of any value was digitalized
00:07:55.600 that is fatally incorrect in fact my calculations is about 12 percent if that and the unfortunate
00:08:04.480 part about it is it's in modernized vernacular and what that means is to get this microfiche
00:08:12.880 i don't know if you could see inside there but inside there is data and this one happens to be
00:08:18.800 one page this one is four pages yeah four and i don't really care so much about the
00:08:27.760 Hollerith punch holes that you might see in here i use that what i care about is what that
00:08:35.280 microfiche it's a photographic negative for anybody who remembers that what it recorded
00:08:42.080 it recorded the work product of today billions of dollars translated maybe trillions i could be off
00:08:49.840 by an order of magnitude they're being thrown away in fact most of them are gone and most of
00:08:57.440 them have never been digitalized what that means is they are gone the work product the notes the
00:09:05.280 The lab notes, the research, the parts, the layout, the architectural drawings, the schematics, gone.
00:09:15.140 And the problem is the chain of custody is so bad within government and industry because a lot of this is from the government.
00:09:24.400 None of this is classified.
00:09:25.860 None of it ever has ever been classified.
00:09:27.660 I'm not diving dumpsters necessarily.
00:09:31.440 I have at the CIA.
00:09:33.680 So nobody come after me. This has been decommissioned by the federal government and they are left in warehouses. And then decades later, after one librarian or curator is gone, somebody sees that they have 40,000 square feet in a warehouse. What's in there? IBM punch cards, boss. What? We don't even read those anymore. Throw them away. And that's what happens.
00:10:00.280 And that's the innocent way that it happens. There are other people internally within the government just don't care about our history, would prefer it to be gone. They're not nefarious as they sound. They just hate humanity. There are a lot of people, unfortunately, that exist that are like that.
00:10:17.260 They don't overtly say that, but they don't like technology, they don't like humanity, they're not the Unabomber, but maybe they're in that class of individual.
00:10:28.740 And I get it, you know, technology hasn't always worked well, you can cook with fire, you can burn people with fire, it's all according to how you utilize it.
00:10:36.940 But my point is, we are the amnesia generation more than any generation in history.
00:10:43.740 You can go back to the Library of Alexandria being burnt.
00:10:47.360 We're burning a library of Alexandria every week of data.
00:10:52.040 We are forgetting much more.
00:10:54.820 And the young folks that are building AI, most of them are about 22, 23 years old.
00:10:59.780 I show them this stuff and they say, Google did something like that.
00:11:04.040 I guess it's saved.
00:11:05.480 It doesn't, you know, we'll use Reddit.
00:11:07.800 So you can train AI on next word prediction from any human data.
00:11:13.040 You can train facts on Wikipedia facts, right?
00:11:17.560 Data that the commissars at Wikipedia have voted are the facts.
00:11:22.420 That's the problem with when you start to say there are facts or observations and you get better tools of observation.
00:11:30.340 That changes the observation.
00:11:32.600 Therefore, it changes the fact.
00:11:34.340 So are there observations that tend to be true?
00:11:37.680 Yeah, gravity works every morning.
00:11:39.100 You don't need to check it.
00:11:40.520 So we call that a fact.
00:11:41.540 But Einstein comes along and now that's a new observation. Gravity works differently. I don't want to get down that rabbit hole. But different observations, different colloquial explanations of how those facts work.
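The "next word predictor" idea Brian describes can be sketched as a toy bigram counter: count which word follows which in a corpus, then predict the most frequent follower. This is a minimal, hypothetical illustration (the sample corpus below is made up for the example); real large language models use neural networks over vastly more data, but the prediction principle is the same.

```python
# Toy next-word predictor: count word pairs, predict the most common follower.
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count how often each word follows each other word."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the most frequent word seen after `word`, or None if unseen."""
    candidates = follows.get(word.lower())
    return candidates.most_common(1)[0][0] if candidates else None

# Hypothetical corpus, purely for illustration.
corpus = "the library burned and the library was lost and the data was lost"
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # "library" follows "the" twice, "data" once
print(predict_next(model, "was"))  # "lost" follows "was" twice
```

A model like this only ever sees word order, which is why the breadth and quality of the training text matters so much, exactly the point being made about what data we feed AI.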
00:11:54.920 this data being removed leaves us with modern data and what is important is the amount of work
00:12:03.280 that went into this yeah and i don't mean that necessarily physically i mean um mentally yeah
00:12:10.240 to get something to get this far hundreds of people very hard to reach that data it wasn't
00:12:17.640 one guy in their basement who doesn't like their parents who are scrubbing along at reddit at
00:12:23.720 three in the morning with some nihilistic rant about humanity that is in our chat gpt that is
00:12:31.320 in our anthropic claude models and what they do is they try to uh align it to human ambition
00:12:38.880 it's equivalent to taking sewage and trying to find the good stuff that are in the stew
00:12:44.960 in the sewage i don't know if there's anything good in the sewage once it's in the sewage
00:12:48.880 it's no good anymore i'm not so so where so people are asking you know who do they go to
00:12:57.140 like you know so we're all like oh my god let's get this stuff so it's basically federal microfiche
00:13:02.760 governmental uh microfiche um maybe some like amazing uh like astrophysics companies that you
00:13:11.780 know were private so so we're talking about the the big thinkers of yesteryear and how they got to
00:13:18.420 work out problems and solutions and come up with amazing things, all this work's being thrown away.
00:13:24.980 So we have to think creatively, you guys. So if you're that type of a person or you have friends
00:13:30.340 that are, or you're looking for a project or a mission or a way to help or be useful, as we say,
00:13:37.080 see if you can get your hands on this stuff. And you could coordinate with Brian, who is digitizing
00:13:43.900 as much of it as he can. It's like we either get it now or we lose it forever. So, and like he
00:13:50.360 says, so much of it's already gone. Is that the gist? Yes, Erica. It's actually even worse.
00:13:56.980 First off, first principles, go to savewisdom.org and start recording your 1000 questions because
00:14:06.040 your data is much more valuable than this, period, end of story. And I don't care if you don't think
00:14:11.560 it's not important. It is. Save it. And hopefully, we at least have that audio that somebody in your
00:14:18.460 family will discover and say, wow, I didn't know grandpa really did that. It's important. We don't
00:14:23.980 spend enough time together. Do this now. And make it a challenge. Do it in a car, in a closet,
00:14:30.860 and do it alone and emote, because you're going to answer some of those questions. And you are
00:14:36.020 you are not doing it right unless you're going to cry. Because those questions aren't emotional,
00:14:41.640 the memories are. And it's supposed to be that way. It's not supposed to be in the internet.
00:14:47.060 It's not supposed to be in ChatGPT. But I pray and hope that it will be in your AI one day.
00:14:53.120 And that AI, if you choose, can be shared with the world. Decades, hundreds, thousands of years
00:15:00.000 after you're gone, your voice will still be heard. That's number one. Number two, you have old books.
00:15:04.620 don't assume that they were digitalized. They probably weren't. I guarantee you,
00:15:09.960 you will be blown away what you see in old medical texts. You will be blown away with
00:15:14.600 what you see in old encyclopedias, in old maps, Tartaria. Now, I'm not going to go down that
00:15:19.740 rabbit hole. But there's going to be a lot of things that you might think, oh, well,
00:15:25.840 that was antiquated thinking. No, it was thinking from a different place. It was thinking from a
00:15:31.960 different point in time. It may be more flowering terms, right? People use the vernacular syntax
00:15:39.200 and nomenclature from their epoch, right? So if it's 1800s, the words are going to be a little
00:15:45.760 difficult. The other things, oh, all the magazines and newspapers, only 3% of the newspapers have
00:15:51.840 been digitalized. And I have archives of newspapers in the Midwest that shut down
00:15:58.240 and they were hauling it away i literally dived in dumpsters to save microfilm at that point
00:16:04.840 of a newspaper that existed for 112 years wow and it was all going to get thrown away we would
00:16:11.380 have never heard it i don't have time even to to dive in and digitalize that stuff i have
00:16:16.400 unfortunately uh self-storages all around the country from where i dumpster dives in the 80s
00:16:22.600 and 90s i don't get to do it as much lately i'd probably get arrested now back then people tried
00:16:29.000 to but once you throw in a dumpster it's public domain period end of story uh i'm not talking
00:16:34.760 intellectual property that's a little complicated but when a company goes fully bankrupt where i
00:16:39.560 dumpster dived a lot and bankrupt companies i was looking for their libraries and you have the work
00:16:45.400 product of somebody worked for 37 years on a project and and some new guy from harvard came
00:16:51.960 in and did some funny things with the finance and all of a sudden company gets bankrupt and they get
00:16:57.560 fully liquidated nobody buys their ip it's gone um and it gets extinguished so that life that was
00:17:05.640 dedicating 37 years to build some widget i want to save because not only is the content probably
00:17:15.480 important the context is even more important what were the processes george that you used
00:17:21.720 to get here. Oh my gosh, we don't use that anymore. We need to do that again. That's why I
00:17:27.940 save that stuff. So Brian, for people like Freebird just said, my neighbor has 30 years
00:17:32.280 of newspapers piled up in her house. I mean, I have some old amazing magazines and things that
00:17:38.580 I saved. Should I do something with those? Yes. If you're going into a basement and garage,
00:17:45.700 wear a mask. Black mold is a big problem. Do not breathe that dust. I'm not being a freak. I'm not,
00:17:51.420 I am in California, but I'm telling you, well, there are people right now that are suffering
00:17:56.320 miserably from black mold and, and they're being misdiagnosed and they're being told that,
00:18:01.520 you know, Papa's got, you know, the dementia, you know, he's losing it. No, he's got black mold and
00:18:07.880 we can probably bring him back. That's another thing. How do I know that? Well, I got a medical
00:18:13.580 texts from 1890s that actually go into this. They knew that black mold would cause mental
00:18:21.860 decline and had ways to reverse it. They put leeches on your face and do some bloodletting.
00:18:27.760 No. There are protocols that they knew that we no longer know. In fact, we do follow a guy called
00:18:42.280 a Midwestern doctor. He is one of those people preserving old texts, you know, that will be a
00:18:48.780 lightning bolt. He's like the Scott Adams of the medical world. You know, the world would like to
00:18:55.200 see him go away because he constantly brings up things like, I don't know, using DMSO to reverse
00:19:00.720 cataracts and, you know, retina failure, things like that. Read him. Don't take my medical advice.
00:19:07.900 He is, in fact, a doctor. But we save a lot of these things that are vitally important for
00:19:15.060 humanity. So mask up, make sure you use gloves, take out your iPhone. If you have an iPhone,
00:19:23.740 if you don't, get the Android equivalent. Inside of Notes, the iPhone app, you can go to the
00:19:30.160 scanning app and just start scanning things. Don't complicate things. Make sure it's readable.
00:19:35.320 put them into PDF files, save them, and ultimately, if you can, put them on some physical medium,
00:19:42.760 at the very least, a good DVD, if not a CD. And, you know, catalog what they are.
00:19:49.940 And if you think it's important, I don't know if it's important.
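The scan-and-catalog workflow described above (scan to PDF, save, catalog what they are) could be supported with a small manifest script. This is only one possible sketch, not a tool mentioned in the episode; the folder and file names are placeholders, and the checksum step is an added suggestion so copies can be verified later.

```python
# Sketch: walk a folder of scanned PDFs and write a CSV manifest
# (filename, size, checksum) so the scans stay findable and verifiable.
import csv
import hashlib
from pathlib import Path

def catalog_scans(scan_dir, manifest_path):
    """Write a CSV manifest of every PDF under scan_dir; return the count."""
    rows = []
    for pdf in sorted(Path(scan_dir).rglob("*.pdf")):
        digest = hashlib.sha256(pdf.read_bytes()).hexdigest()
        rows.append([pdf.name, pdf.stat().st_size, digest])
    with open(manifest_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["filename", "bytes", "sha256"])
        writer.writerows(rows)
    return len(rows)

# Hypothetical usage: catalog_scans("scans/", "scans/manifest.csv")
```

Keeping the manifest alongside the PDFs (and on whatever physical medium you burn them to) means a later reader can tell what a disc holds without opening every file.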
00:19:53.240 And then what? And then what?
00:19:55.020 Well, what I hope is that enough people that are listening to me today,
00:20:00.160 enough in positions of power start making a drive to stop the federal government from throwing
00:20:07.660 things away. Period. End of story. I think President Trump should sign an executive order
00:20:13.400 to stop all destruction of data right now. Do you want to know why? Guess who's got more data
00:20:20.560 than the U.S. government? Who? The Chinese government. They've been spying on us for
00:20:25.420 decades and they never threw these away they scanned them they saved them and guess what
00:20:30.540 they're doing this is going to be a revelation to people who train ai they're training ai on our data
00:20:37.660 our spy data and the stuff that we leaked out like every tv uh news broadcast every tv channel
00:20:43.980 broadcast and every major market for the last i don't know 55 years they have it we don't
00:20:50.060 because we don't have the foresight to save stuff they are so guess what's going to happen and i get
00:20:56.320 a little heated here they're going to produce ai models that are extraordinarily more powerful than
00:21:02.260 ours at some point because they are not thinking every quarter they're not thinking to show great
00:21:08.660 results for a vc they're not thinking about all the seat licensing that they're going to get
00:21:13.540 they're thinking in thousand year this episode is brought to you by telus online security
00:21:19.700 Oh, tax season is the worst.
00:21:22.300 You mean hack season?
00:21:23.820 Sorry, what?
00:21:25.080 Yeah, cybercriminals love tax forms.
00:21:27.480 But I've got TELUS Online Security.
00:21:29.920 It helps protect against identity theft and financial fraud,
00:21:33.080 so I can stress less during tax season, or any season.
00:21:36.520 Plans start at just $12 a month.
00:21:38.680 Learn more at telus.com slash online security.
00:21:41.200 No one can prevent all cybercrime or identity theft.
00:21:44.100 Conditions apply.
00:21:46.300 Spring break isn't what it used to be.
00:21:48.440 It's better. This spring, stay three nights and get a $50 Best Western gift card. Life's a trip.
00:21:54.780 Make the most of it at Best Western. Visit bestwestern.com for complete terms and conditions.
00:22:02.580 Time scales. We're thinking in three or four month time scales. That's a problem. I am not
00:22:09.280 into government expansion, but I do believe that we have a national crisis in our amnesia that's
00:22:16.360 taking place. And guess what? You think it's still on the internet? Whatever it is, it's being
00:22:22.060 deleted right now. Anybody that's been on the internet long enough will know, I know it was
00:22:28.060 there, but now it's gone. It's gone because it got deleted. Everything you put on Facebook is going
00:22:33.920 to be gone one day because sooner or later, somebody's going to say, hey, that's going to
00:22:38.320 be $100 a month to store it. Sorry, it's going to go away. We trained our AI in it. Thank you. We
00:22:44.460 appreciate it. Now we're not going to hold it anymore. I was like a free user of Evernote and
00:22:51.120 all of a sudden they sent me an email and said, unless you start paying, we're deleting all your
00:22:54.380 data. Yes. I love it. Some of the Google accounts and other things that do that too. Google accounts
00:23:00.120 are being deleted. Guess what? Some of the best YouTube accounts are gone. In the great purge of
00:23:05.740 YouTube, back in the day when it was cool to eliminate people's speech, now they're a different
00:23:10.480 youtube please thank us and you know everything's okay i'm sorry youtube i won't forget why because
00:23:15.440 some of the data that they deleted were people who are dead they're gone and their data will never
00:23:21.280 ever be brought up again and we assumed naively that you know youtube would do the right thing
00:23:27.840 oh they haven't touched your account in uh you know three years yeah but they have 50 videos
00:23:34.640 of of an old gentleman talking in his lab showing you some physics that he discovered
00:23:39.920 because he worked at skunk works and now it's eliminated now i i have one or two videos maybe a
00:23:45.840 few other people have one or two videos but the rest are gone and now he is just a a ghost that
00:23:51.120 didn't exist and why is it more important now because we live in a digital age we no longer
00:23:56.960 have physical medium right this is going to last barring some kind of high heat event um
00:24:03.680 This is going to last longer than all of us here, longer than our great-grandkids.
00:24:10.620 Nothing that we're doing right now on this medium is going to last.
00:24:15.160 It is going to be gone.
00:24:17.280 And the unfortunate part is we don't respect books.
00:24:21.920 We don't respect records.
00:24:23.680 These are analog physical mediums that do not require sophistication to reduplicate its data.
00:24:32.820 Right?
00:24:32.920 What is the sophistication of a photographic? Well, some light and maybe a magnifying glass.
00:24:39.460 And I decoded this. What is the sophistication of decoding a DVD? And again, I said, store it on a
00:24:44.980 DVD. That's the best we have right now. Well, you better have laser technology. You better have all
00:24:50.000 the electronics that are going to do that. Now I'm not talking prepper end of the world. I am,
00:24:54.680 but I'm not. I mean, that's a different thing. You know, I am talking, just assume everything
00:25:00.600 goes along the way it has, right? It keeps rolling along. Those old dictionaries, words that are no
00:25:08.780 longer allowed to be used in the way that they are today, right? Old encyclopedias that literally
00:25:14.580 define things much differently than today. And this, you know, it has nothing to do with whims.
00:25:20.760 Certain things do not have the whim of political, you know, colorings to try to change its
00:25:26.100 definitions. The definitions are solid. Geographical things, history. I mean, I learned about
00:25:34.180 the Knights Templar from a book that blew my mind. I had no idea. I thought that they were doing,
00:25:42.860 you know, as a Catholic, right? You know, I'm a Christian now. It blew my mind because I thought
00:25:49.780 they were beheading people because they weren't saying the right thing to the Pope.
00:25:54.620 it turns out that they were trying to push back an invasion of a certain Middle Eastern culture
00:26:01.900 that was taking over Europe and bringing it back into the Dark Ages. We're told a certain group of
00:26:09.100 people, a certain religion burned a library of Alexandria. I saw the historical link that it was
00:26:15.900 another religion that took it out and beheaded the people and used abalone, i don't know,
00:26:23.340 shells of a shellfish to take hypatia and torture her to death for daring to save information that
00:26:32.220 offended their their deity and um the unfortunate thing is this is happening again and so when you
00:26:41.340 have the coloring of history and you can't go back and by the way the winners do change history
00:26:47.060 that's a fact but if you have enough pieces of it you can go in a dark room and start putting
00:26:52.580 the puzzle pieces together better and a lot of us are thinkers in fact i guarantee you everybody
00:26:57.960 listening to right now that are taking the time to listen to what scott adams was saying and what
00:27:03.940 you folks are saying and hopefully what i'm trying to say here um you really know that things are
00:27:10.480 changing right before our eyes. And we are the last generation and maybe the first generation
00:27:17.400 who can do something about it. If we save our data, if we start training AI on this data.
00:27:24.560 And by the way, here's the interesting part. This is as important to people like Anthropic and Open
00:27:30.460 AI as it is to XAI and Grok and me building AI models in my garage and you one day building your
00:27:38.340 AI models. You're going to laugh. Oh, I can't do that, Brian. Yes, you can. It's going to be as
00:27:43.560 easy as you opening up your, your smartphone. It's going to be that easy to train in AI. It's not
00:27:48.720 right now. It is just like maybe one day when the Apple one came out, you looked at a pile of junk
00:27:54.260 of wires and circuits, and I'm never going to do anything with that. Yeah, you aren't with that,
00:27:59.360 but one day you will. So when I talk about the future, I'm extrapolating what I know is
00:28:04.060 definitely going to happen. And I work my way backwards. And I'm saying this data, at some
00:28:09.240 point in the future, people are going to be screaming at us as old cranky folks. Why didn't
00:28:14.200 you save it? Didn't you know? Didn't you know? And I'm saying, now you know. And I'm not telling
00:28:20.800 you to be a hoarder, but hoarders are cool people. I hoard a little junk. But when you hoard,
00:28:28.980 try to do it with intention. Try to say, you know, there's something valuable here.
00:28:32.840 and try to find it because you're on a journey too. And exercise the brain muscle and say,
00:28:38.620 what is in this book? Holy cow, look what I found, Brian. Look what I found,
00:28:43.560 Scott Adams School. And talk about it because we need this. And it's not so much in the political
00:28:49.920 sense of now. Yeah, sure. Tell off somebody who's got it wrong. That's cool. But that's not going
00:28:55.480 to matter in 15 or 20 years, really. What's going to matter is that you did the work,
00:28:59.900 that you saved it. So I'm talking to leaders right now in the political realm. Talk to the
00:29:05.900 president. Sign an order today to stop throwing away everything that the government has. Number
00:29:11.920 one. Number two, start an organized campaign of all the curators and libraries within the federal
00:29:17.660 government to try to organize this data. It is massive. I know more of where this data is than
00:29:26.800 any one person in the government because i spent more than 40 years trying to study this i didn't
00:29:31.920 just guess this when ai came i knew in 1979 that we're going to need ai training data i knew that
00:29:39.840 right and it sounded crazy more back then as it does now i always kind of sound but back then i
00:29:47.840 would say no save these cars you know what the hell is wrong with you romley what is wrong with
00:29:51.680 you it's punch cards no someday people are going to want what's on it no somebody will save it no
00:29:58.880 don't assume somebody saved it oh google books google i think you know you're i think you're
00:30:04.800 also just saying it should be at every level like you know companies should be doing this
00:30:08.640 with their historical records um people should be doing it individually within their families
00:30:14.080 and communities yes um every level of government not just federal i'm sure there's a bunch of this
00:30:18.240 at the state level and state government absolutely own yes state governments i have an archive from
00:30:25.040 one state government and that i'm working with right now that is massive it is literally every
00:30:30.480 newspaper in that state that they had the foresight to digitalize zero has been touched by google so
00:30:38.400 google hasn't scanned everything google turned off their book scanning system by the way because they
00:30:43.360 they were stealing data. They never worked out the social contract. Okay. We have a problem with
00:30:50.160 social contract and AI, right? Once you scan this, I don't care. I don't care what happens to it. I
00:30:56.340 got it. It's in my open AI model. You don't have it. So we're thinking in this, you know, remember
00:31:03.500 Bugs Bunny and Daffy Duck, they're going to Pismo Beach and they arrive in a cave and there's
00:31:09.480 nothing but gold and jewelry in there. And Daffy Duck is running and grabbing all the jewels.
00:31:14.440 I'm rich. I'm rich. I'm a miser. And, you know, Bugs Bunny is saying, hey, we're going to Pismo
00:31:20.860 Beach. You know, they realize, yeah, we do need money. But at some point in the interregnum,
00:31:26.860 5,000 days from now, money is going to have a different thing. The money that the wealthy have
00:31:32.260 is going to be worthless. It's on its way. That's another show. I'd be more than happy to go into
00:31:37.240 that. But again, it's just as crazy as, say, saving this in 1979. The same thing's going to
00:31:44.520 happen with what we call cash, I'm sorry to say that, and gold. I'm sorry to say that there
00:31:50.960 are asteroids, you know, and they're very close, and it's very easy to get to them. And
00:31:57.280 once that happens, your gold is worthless. I'm sorry. Okay, wait, what? Okay, hang on, asteroids are
00:32:02.980 going to do what to what? There are asteroids that are basically full of gold, and it's a massive
00:32:09.840 amount. Full of platinum, full of oxygen, full of hydrogen, full of helium. They, of course,
00:32:18.260 abound in different chemical forms. And it's so much easier to mine asteroids. Because how do I,
00:32:24.060 how do I smelt in space? It's called a parabolic mirror, a really big mirror that focuses, and you
00:32:30.880 can smelt anything. But it's in zero G. Well, that's actually a benefit, because sometimes smelting is
00:32:36.320 easier in zero gravity or microgravity. Really? Well, but I do need gravity to separate. Okay, spin
00:32:42.320 it. Well, that's hard. No, it's spinning. Once you throw it in motion, it spins pretty much forever
00:32:47.840 in space. So all the things that we do to separate and smelt are going to be thousands of times easier
00:32:55.320 in space. And there's a guy called Elon that's planning to put up a rocket about every day
00:33:00.420 in the next five years. And it's going to be as cheap as going from here to, I don't know, Japan
00:33:07.340 on a first class ticket to get, I don't know, a hundred pounds. I'm averaging right now. I got
00:33:13.140 the data. So that's not even futuristic. I'm not even talking sci-fi. That is happening.
00:33:18.480 Wake up. That's the reality. Now, I don't want to scare people. Hold on to your gold. It's valuable.
00:33:23.100 But I'm going to tell you that everything we think is solid is not going to be solid
00:33:29.500 in the end of this 5,000-day cycle. And it's valuable to understand that. It's valuable
00:33:36.700 to have self-agency to know that you know this. It's not designed to disempower you. It's designed
00:33:43.660 to empower you. You who are listening right now know more than the people who are leading this
00:33:50.100 country. And I mean that because I talk to them. All right. Why is that important? Because we have
00:33:57.980 mechanisms now where you are liberated to know things that you're not normally
00:34:03.820 going to know. When we lived in villages, we didn't know what happened in the next village
00:34:07.780 until the clowns came in and the roaming minstrels and they started singing. Every now and then the
00:34:13.540 king nailed something to the post and told you something that you couldn't even read because
00:34:18.000 you weren't allowed to read. Well, we're allowed to read, right? And so what I'm saying is, use that
00:34:24.020 muscle called the brain and understand that this is taking place. Use your conspiratorial mind, a
00:34:30.860 conspiratorial mind of, those bastards, sorry. But don't go down that rabbit hole, because you're
00:34:38.940 wasting your brain energy. Just know that it's happening. Maybe suspect the usual suspects, and
00:34:46.640 then say, I'm going to find the things worth saving. You can't save it all. But what we can
00:34:52.400 do is people in political places can say, stop throwing away data at a state, local, federal
00:34:58.840 level. If you're in a corporation, stop throwing away your libraries. Right now, libraries are
00:35:04.720 being thrown away at major corporations. It's a bunch of junk. Nobody reads anymore. Somebody
00:35:09.640 digitized it, I'm sure. There was a guy named Patrick. Is he still with us? No, boss, he retired
00:35:15.300 five years ago. He's, you know... well, he knows where everything is. Well, you know, throw it
00:35:21.780 away. It's history. Who cares? We're a young, modern company. All the great ideas come from somebody
00:35:27.460 who quit Stanford and is now 23 years old, and now we're going to pray to them because they know
00:35:33.140 vector math. I don't know, that's temporary. You know, sooner or later... Well, they're already
00:35:40.500 being replaced, right? The jobs, the people getting replaced are the programmers, right?
00:35:46.500 And they were so short-sighted, they didn't really understand what was going to take place.
00:35:51.780 No, I'm still valuable. Yes, you are, but not in the way you thought you were. You got de-skilled.
00:35:57.160 I talked about that last time. I got de-skilled. I used to know how to use the slide rule. I kind
00:36:01.660 of do still, but that's no longer a skill. That's a value. We just spoke about that yesterday. Is it,
00:36:08.460 you know, should students be learning with calculators and all these fancy tools or should
00:36:14.460 they just be allowed to, you know, should they have to know how to do it manually, as a skill,
00:36:20.720 or should they be able to use the tool that makes it easy? And we were saying like, if it all goes
00:36:25.160 away, then you don't know how to do it. The process of thinking through it is being lost.
00:36:33.420 I think Owen had a study, right, where the process was being lost. You just get to
00:36:41.340 the answer. One aspect of this is that, at least in the current technology, and I'm sure,
00:36:46.120 Brian, you're aware there may be other ones coming that are more brain-like, but the ones that are
00:36:49.420 currently there, they're kind of mimicking thinking. They're not really thinking. And the example that
00:36:54.160 was given was some scientists put up a fake study, and they made it obviously fake. Like, they said, you
00:37:00.280 know, thanks to the Starfleet Academy. And they even had sentences in the study saying this is
00:37:04.900 all made up several times, but it got fed into AI. And then co-pilots started saying, oh yeah,
00:37:10.840 there is this new rare condition, you know, because it was a fake disease that they made up.
00:37:14.840 And I think Gemini or something else did the same thing. We're like, oh yeah, that's a real thing.
00:37:19.520 And it just shows that AI doesn't have much judgment. It doesn't have the critical thinking.
00:37:26.180 It doesn't have the ability to say this is good data and this is bad data. It just takes it all in,
00:37:31.060 and it averages it all out. You guys are on a brilliant track here. I'll give you two words:
00:37:36.660 wisdom and discernment. Wisdom is not a function of the current AI platforms that we have today,
00:37:44.580 even AGI, artificial general intelligence, even artificial super intelligence. Wisdom is a
00:37:50.820 construct of the human brain, and we can go into what that truly is. But we are drowning in a sea of
00:37:57.540 data, but we are unable to understand that we need to save wisdom. Data is the ocean. Wisdom
00:38:06.500 is the life preserver that you grab onto within all of that data that you're riding. Data is
00:38:13.060 useless unless you have wisdom. Discernment is the ability to decide what data is of value.
00:38:19.880 That is instinctual. It cannot really be fully broken down into math. I have a number of different algorithms. I use a math equation. You can look up the rebellious bee equation. In a beehive, about 12% of bees will not follow the other bees, right?
00:38:44.340 If they do this bee dance and there's a pheromones,
00:38:46.560 it's a complicated thing, and they all go,
00:38:49.320 hey, I just found some new pollen.
00:38:51.540 There's about 12% that say, no, I ain't going there.
00:38:55.580 I'm gonna go over here.
00:38:57.000 They're the rebellious bees.
00:38:58.680 Guess what happens if you eliminate
00:39:00.080 the rebellious bees from a colony?
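[Editor's sketch] The rebellious-bee idea above can be illustrated with a toy simulation. This is a hypothetical model, not the "rebellious bee equation" Brian refers to: it assumes a 12% rebel fraction, a single known food patch that depletes, and a small chance per day that each scouting rebel discovers a fresh patch. The point it demonstrates is only the qualitative one from the conversation: a colony with no rebels starves once its known source runs dry.

```python
import random

def forage(n_bees=100, rebel_fraction=0.12, days=50, seed=42):
    """Toy colony model (hypothetical parameters throughout).

    Followers always harvest the currently known patch; rebels
    scout at random instead. Known patches deplete, so a colony
    with no rebels collects nothing after its patch runs dry.
    Returns total food gathered over the simulated days.
    """
    rng = random.Random(seed)
    n_rebels = int(n_bees * rebel_fraction)
    known_patch = 100.0            # food remaining in the patch everyone knows
    total_food = 0.0
    for _ in range(days):
        # Followers harvest the known patch (half a unit per bee per day).
        harvest = min(known_patch, (n_bees - n_rebels) * 0.5)
        known_patch -= harvest
        total_food += harvest
        # Each rebel has a small chance of finding a fresh patch,
        # which replenishes the colony's known food supply.
        for _ in range(n_rebels):
            if rng.random() < 0.05:
                known_patch += 50.0
    return total_food

# A colony with rebels keeps finding new patches; one without
# rebels stalls as soon as the original patch is exhausted.
print(forage(rebel_fraction=0.12), forage(rebel_fraction=0.0))
```

With no rebels, the 100 followers strip the initial patch in two days and gather nothing afterward; with a 12% rebel fraction, occasional discoveries keep replenishing the patch, so the colony's total is higher. The specific numbers (patch size, harvest rate, discovery probability) are arbitrary assumptions chosen only to make that contrast visible.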