In this episode of The Sip, Marcella and Owen are joined by Brian Ramelli, a music professor at the Berklee College of Music in Boston. They talk about public service announcements and the importance of being useful.
00:09:33.680So nobody come after me. These have been decommissioned by the federal government and left in warehouses. And then decades later, after one librarian or curator is gone, somebody sees that they have 40,000 square feet in a warehouse. What's in there? IBM punch cards, boss. What? We don't even read those anymore. Throw them away. And that's what happens.
00:10:00.280And that's the innocent way that it happens. There are other people internally within the government who just don't care about our history, who would prefer it to be gone. They're not as nefarious as they sound. They just hate humanity. There are a lot of people, unfortunately, that exist that are like that.
00:10:17.260They don't overtly say that, but they don't like technology, they don't like humanity. They're not the Unabomber, but maybe they're in that class of individual.
00:10:28.740And I get it, you know, technology hasn't always worked well. You can cook with fire, you can burn people with fire. It's all according to how you use it.
00:10:36.940But my point is, we are the amnesia generation more than any generation in history.
00:10:43.740You can go back to the Library of Alexandria being burnt.
00:10:47.360We're burning a Library of Alexandria's worth of data every week.
00:11:41.540But Einstein comes along and now that's a new observation. Gravity works differently. I don't want to get down that rabbit hole. But different observations, different colloquial explanations of how those facts work.
00:11:54.920This data being removed leaves us with only modern data, and what is important is the amount of work
00:12:03.280that went into this. Yeah. And I don't mean that physically, necessarily. I mean mentally. Yeah.
00:12:10.240To get something this far took hundreds of people. It was very hard to reach that data. It wasn't
00:12:17.640one guy in their basement who doesn't like their parents, scrolling along on Reddit at
00:12:23.720three in the morning with some nihilistic rant about humanity. That is in our ChatGPT. That is
00:12:31.320in our Anthropic Claude models. And what they do is they try to align it to human ambition.
00:12:38.880It's equivalent to taking sewage and trying to find the good stuff that's in the stew,
00:12:44.960in the sewage. I don't know if there's anything good in the sewage. Once it's in the sewage,
00:12:48.880it's no good anymore. So where, so people are asking, you know, who do they go to?
00:12:57.140Like, you know, we're all like, oh my God, let's get this stuff. So it's basically federal microfiche,
00:13:02.760governmental microfiche, maybe some amazing astrophysics companies that, you
00:13:11.780know, were private. So we're talking about the big thinkers of yesteryear and how they got to
00:13:18.420work out problems and solutions and come up with amazing things, and all this work's being thrown away.
00:13:24.980So we have to think creatively, you guys. So if you're that type of a person or you have friends
00:13:30.340that are, or you're looking for a project or a mission or a way to help or be useful, as we say,
00:13:37.080see if you can get your hands on this stuff. And you could coordinate with Brian, who is digitizing
00:13:43.900as much of it as he can. It's like, we either get it now or we lose it forever. And like he
00:13:50.360says, so much of it's already gone. Is that the gist? Yes, Erica. It's actually even worse.
00:13:56.980First off, first principles, go to savewisdom.org and start recording your 1000 questions because
00:14:06.040your data is much more valuable than this, period, end of story. And I don't care if you think
00:14:11.560it's not important. It is. Save it. And hopefully, we at least have that audio that somebody in your
00:14:18.460family will discover and say, wow, I didn't know grandpa really did that. It's important. We don't
00:14:23.980spend enough time together. Do this now. And make it a challenge. Do it in a car, in a closet,
00:14:30.860and do it alone and emote, because you're going to answer some of those questions. And you
00:14:36.020are not doing it right unless you're going to cry. Because those questions aren't emotional,
00:14:41.640the memories are. And it's supposed to be that way. It's not supposed to be in the internet.
00:14:47.060It's not supposed to be in ChatGPT. But I pray and hope that it will be in your AI one day.
00:14:53.120And that AI, if you choose, can be shared with the world. Decades, hundreds, thousands of years
00:15:00.000after you're gone, your voice will still be heard. That's number one. Number two, you have old books.
00:15:04.620Don't assume that they were digitized. They probably weren't. I guarantee you,
00:15:09.960you will be blown away by what you see in old medical texts. You will be blown away by
00:15:14.600what you see in old encyclopedias, in old maps, Tartaria. Now, I'm not going to go down that
00:15:19.740rabbit hole. But there's going to be a lot of things that you might think, oh, well,
00:15:25.840that was antiquated thinking. No, it was thinking from a different place. It was thinking from a
00:15:31.960different point in time. It may be in more flowery terms, right? People use the vernacular syntax
00:15:39.200and nomenclature from their epoch, right? So if it's the 1800s, the words are going to be a little
00:15:45.760difficult. The other thing: oh, all the magazines and newspapers. Only 3% of the newspapers have
00:15:51.840been digitized. And I have archives of newspapers in the Midwest that shut down,
00:15:58.240and they were hauling it away. I literally dived in dumpsters at that point to save the microfilm
00:16:04.840of a newspaper that existed for 112 years. Wow. And it was all going to get thrown away. We would
00:16:11.380have never heard of it. I don't have time even to dive in and digitize that stuff. I have,
00:16:16.400unfortunately, self-storage units all around the country from where I dumpster dived in the 80s
00:16:22.600and 90s. I don't get to do it as much lately. I'd probably get arrested now. Back then people tried
00:16:29.000to, but once you throw it in a dumpster, it's public domain, period, end of story. I'm not talking
00:16:34.760intellectual property, that's a little complicated. But when a company goes fully bankrupt... where I
00:16:39.560dumpster dived a lot was bankrupt companies. I was looking for their libraries. And you have the work
00:16:45.400product of somebody who worked for 37 years on a project, and some new guy from Harvard came
00:16:51.960in and did some funny things with the finances, and all of a sudden the company goes bankrupt and gets
00:16:57.560fully liquidated. Nobody buys their IP. It's gone, and it gets extinguished. So that life that was
00:17:05.640dedicated, 37 years to build some widget, I want to save. Because not only is the content probably
00:17:15.480important, the context is even more important. What were the processes, George, that you used
00:17:21.720to get here? Oh my gosh, we don't use that anymore. We need to do that again. That's why I
00:17:27.940save that stuff. So Brian, for people like Freebird, who just said: my neighbor has 30 years
00:17:32.280of newspapers piled up in her house. I mean, I have some old amazing magazines and things that
00:17:38.580I saved. Should I do something with those? Yes. If you're going into a basement and garage,
00:17:45.700wear a mask. Black mold is a big problem. Do not breathe that dust. I'm not being a freak.
00:17:51.420I am in California, but I'm telling you, there are people right now that are suffering
00:17:56.320miserably from black mold, and they're being misdiagnosed, and they're being told,
00:18:01.520you know, Papa's got the dementia, you know, he's losing it. No, he's got black mold, and
00:18:07.880we can probably bring him back. That's another thing. How do I know that? Well, I got medical
00:18:13.580texts from the 1890s that actually go into this. They knew that black mold would cause mental
00:18:21.860decline and had ways to reverse it. They put leeches on your face and do some bloodletting.
00:18:27.760No. There are protocols that they knew that we no longer know. In fact, we do follow a guy called
00:18:42.280A Midwestern Doctor. He is one of those people preserving old texts, you know, that will be a
00:18:48.780lightning bolt. He's like the Scott Adams of the medical world. You know, the world would like to
00:18:55.200see him go away because he constantly brings up things like, I don't know, using DMSO to reverse
00:19:00.720cataracts and, you know, retina failure, things like that. Read him. Don't take my medical advice.
00:19:07.900He is, in fact, a doctor. But we save a lot of these things that are vitally important for
00:19:15.060humanity. So mask up, make sure you use gloves, take out your iPhone. If you have an iPhone,
00:19:23.740if you don't, get the Android equivalent. Inside of Notes, the iPhone app, you can go to the
00:19:30.160scanning app and just start scanning things. Don't complicate things. Make sure it's readable.
00:19:35.320Put them into PDF files, save them, and ultimately, if you can, put them on some physical medium,
00:19:42.760at the very least, a good DVD, if not a CD. And, you know, catalog what they are.
00:19:49.940And if you think it's important, I don't know if it's important.
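One possible way to follow the scan-then-catalog advice above is a small script like the sketch below. This is my own illustration, not anything Brian describes: the folder layout, filenames, and CSV columns are assumptions. It writes a manifest with a SHA-256 checksum per scanned PDF, so copies burned to DVD (or any other medium) can be verified against the originals later.

```python
# Minimal cataloging sketch for a folder of scanned PDFs.
# Assumed layout: all scans sit in one directory as *.pdf files.
import csv
import hashlib
from pathlib import Path

def catalog_scans(scan_dir: str, manifest_path: str) -> int:
    """Write a CSV manifest (filename, size, SHA-256) for every PDF in scan_dir.

    Returns the number of files cataloged. The checksum lets you verify
    any copy of a scan against the original, byte for byte.
    """
    rows = []
    for pdf in sorted(Path(scan_dir).glob("*.pdf")):
        digest = hashlib.sha256(pdf.read_bytes()).hexdigest()
        rows.append({"filename": pdf.name,
                     "bytes": pdf.stat().st_size,
                     "sha256": digest})
    with open(manifest_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["filename", "bytes", "sha256"])
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

A plain CSV was chosen deliberately: like the physical media discussed here, it can be read decades from now without any special software.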
00:24:32.920What is the sophistication of a photograph? Well, some light and maybe a magnifying glass.
00:24:39.460And I decoded this. What is the sophistication of decoding a DVD? And again, I said, store it on a
00:24:44.980DVD. That's the best we have right now. Well, you better have laser technology. You better have all
00:24:50.000the electronics that are going to do that. Now I'm not talking prepper end of the world. I am,
00:24:54.680but I'm not. I mean, that's a different thing. You know, I am talking, just assume everything
00:25:00.600goes along the way it has, right? It keeps rolling along. Those old dictionaries, words that are no
00:25:08.780longer allowed to be used in the way that they are today, right? Old encyclopedias that literally
00:25:14.580define things much differently than today. And this, you know, it has nothing to do with whims.
00:25:20.760Certain things do not have the whim of political, you know, colorings trying to change their
00:25:26.100definitions. The definitions are solid. Geographical things, history. I mean, I learned about
00:25:34.180the Knights Templar from a book that blew my mind. I had no idea. I thought that they were doing,
00:25:42.860you know, as a Catholic, right? You know, I'm a Christian now. It blew my mind because I thought
00:25:49.780they were beheading people because they weren't saying the right thing to the Pope.
00:25:54.620It turns out that they were trying to push back an invasion of a certain Middle Eastern culture
00:26:01.900that was taking over Europe and bringing it back into the Dark Ages. We're told a certain group of
00:26:09.100people, a certain religion, burned the Library of Alexandria. I saw the historical link that it was
00:26:15.900another religion that took it out and beheaded the people, and used, albano balio, so I don't know,
00:26:23.340the shells of a shellfish, to take Hypatia and torture her to death for daring to save information that
00:26:32.220offended their deity. And the unfortunate thing is, this is happening again. And so when you
00:26:41.340have the coloring of history and you can't go back... and by the way, the winners do change history,
00:26:47.060that's a fact. But if you have enough pieces of it, you can go in a dark room and start putting
00:26:52.580the puzzle pieces together better. And a lot of us are thinkers. In fact, I guarantee you, everybody
00:26:57.960listening right now, who is taking the time to listen to what Scott Adams was saying, and what
00:27:03.940you folks are saying, and hopefully what I'm trying to say here, you really know that things are
00:27:10.480changing right before our eyes. And we are the last generation, and maybe the first generation,
00:27:17.400who can do something about it. If we save our data, if we start training AI on this data.
00:27:24.560And by the way, here's the interesting part. This is as important to people like Anthropic and
00:27:30.460OpenAI as it is to xAI and Grok, and to me building AI models in my garage, and to you one day building your
00:27:38.340AI models. You're going to laugh. Oh, I can't do that, Brian. Yes, you can. It's going to be as
00:27:43.560easy as you opening up your smartphone. It's going to be that easy to train an AI. It's not
00:27:48.720right now. It is just like maybe one day when the Apple I came out: you looked at a pile of junk
00:27:54.260of wires and circuits and said, I'm never going to do anything with that. Yeah, you aren't with that,
00:27:59.360but one day you will. So when I talk about the future, I'm extrapolating what I know is
00:28:04.060definitely going to happen. And I work my way backwards. And I'm saying this data, at some
00:28:09.240point in the future, people are going to be screaming at us as old cranky folks. Why didn't
00:28:14.200you save it? Didn't you know? Didn't you know? And I'm saying, now you know. And I'm not telling
00:28:20.800you to be a hoarder, but hoarders are cool people. I hoard a little junk. But when you hoard,
00:28:28.980try to do it with intention. Try to say, you know, there's something valuable here,
00:28:32.840and try to find it, because you're on a journey too. And exercise the brain muscle and say,
00:28:38.620what is in this book? Holy cow, look what I found, Brian. Look what I found,
00:28:43.560Scott Adams School. And talk about it because we need this. And it's not so much in the political
00:28:49.920sense of now. Yeah, sure. Tell off somebody who's got it wrong. That's cool. But that's not going
00:28:55.480to matter in 15 or 20 years, really. What's going to matter is that you did the work,
00:28:59.900that you saved it. So I'm talking to leaders right now in the political realm. Talk to the
00:29:05.900president. Sign an order today to stop throwing away everything that the government has. Number
00:29:11.920one. Number two, start an organized campaign of all the curators and libraries within the federal
00:29:17.660government to try to organize this data. It is massive. I know more of where this data is than
00:29:26.800any one person in the government, because I spent more than 40 years trying to study this. I didn't
00:29:31.920just guess this when AI came. I knew in 1979 that we were going to need AI training data. I knew that,
00:29:39.840right. And it sounded crazier back then than it does now. I always kind of sound crazy. But back then I
00:29:47.840would say, no, save these cards. You know... what the hell is wrong with you, Ramelli? What is wrong with
00:29:51.680you? It's punch cards. No, someday people are going to want what's on it. No, somebody will save it. No,
00:29:58.880don't assume somebody saved it. Oh, Google Books, Google... I think, you know, I think you're
00:30:04.800also just saying it should be at every level. Like, you know, companies should be doing this
00:30:08.640with their historical records. People should be doing it individually, within their families
00:30:14.080and communities. Yes. Every level of government, not just federal. I'm sure there's a bunch of this
00:30:18.240at the state level and in state government. Absolutely. Oh yes, state governments. I have an archive from
00:30:25.040one state government that I'm working with right now that is massive. It is literally every
00:30:30.480newspaper in that state that they had the foresight to digitize. Zero has been touched by Google. So
00:30:38.400Google hasn't scanned everything. Google turned off their book scanning system, by the way, because
00:30:43.360they were stealing data. They never worked out the social contract. Okay. We have a problem with
00:30:50.160the social contract and AI, right? Once you scan this, I don't care what happens to it. I
00:30:56.340got it. It's in my OpenAI model. You don't have it. So we're thinking in this, you know... remember
00:31:03.500Bugs Bunny and Daffy Duck, they're going to Pismo Beach and they arrive in a cave and there's
00:31:09.480nothing but gold and jewelry in there. And Daffy Duck is running and grabbing all the jewels.
00:31:14.440I'm rich. I'm rich. I'm a miser. And, you know, Bugs Bunny is saying, hey, we're going to Pismo
00:31:20.860Beach. You know, they realize, yeah, we do need money. But at some point in the interregnum,
00:31:26.8605,000 days from now, money is going to mean something different. The money that the wealthy have
00:31:32.260is going to be worthless. It's on its way. That's another show. I'd be more than happy to go into
00:31:37.240that. But, and again, it's just as crazy as saying save this in 1979. The same thing's going to
00:31:44.520happen with what we call cash, I'm sorry to say that, and gold. I'm sorry to say that there
00:31:50.960are asteroids, you know, and they're very close, and they're very easy to get to, and
00:31:57.280once that happens, your gold is worthless. I'm sorry. Okay, wait, what? Okay, hang on. Asteroids are
00:32:02.980going to do what to what? There are asteroids that are basically full of gold, and it's a massive
00:32:09.840amount. Full of platinum, full of oxygen, full of hydrogen, full of helium. They, of course,
00:32:18.260abound in different chemical forms. And it's so much easier to mine asteroids. Because how do I,
00:32:24.060how do I smelt in space? It's called a parabolic mirror, a really big mirror that focuses, and you
00:32:30.880can smelt anything. Well, but it's in zero G. Well, that's actually a benefit, because smelting is
00:32:36.320sometimes easier in zero gravity or microgravity, really. Well, but I do need gravity to separate. Okay, spin
00:32:42.320it. Well, that's hard. No, it's spinning. Once you throw it in motion, it spins pretty much forever
00:32:47.840in space. So all the things that we do to separate and smelt are going to be thousands of times easier
00:32:55.320in space. And there's a guy called Elon that's planning to put up a rocket about every day
00:33:00.420in the next five years. And it's going to be as cheap as going from here to, I don't know, Japan
00:33:07.340on a first class ticket to get, I don't know, a hundred pounds. I'm averaging right now. I got
00:33:13.140the data. So that's not even futuristic. I'm not even talking sci-fi. That is happening.
00:33:18.480Wake up. That's the reality. Now, I don't want to scare people. Hold on to your gold. It's valuable.
00:32:23.100But I'm going to tell you that everything we think is solid is not going to be solid
00:32:29.500at the end of this 5,000-day cycle. And it's valuable to understand that. It's valuable
00:33:36.700to have self-agency to know that you know this. It's not designed to disempower you. It's designed
00:33:43.660to empower you. You who are listening right now know more than the people who are leading this
00:33:50.100country. And I mean that because I talk to them. All right. Why is that important? Because we have
00:33:57.980mechanisms now where you are liberated to know things that you are not normally
00:34:03.820going to know. When we lived in villages, we didn't know what happened in the next village
00:34:07.780until the clowns came in and the roaming minstrels and they started singing. Every now and then the
00:34:13.540king nailed something to the post and told you something that you couldn't even read because
00:34:18.000you weren't allowed to read. Well, we're allowed to read, right? And so what I'm saying is, use that
00:34:24.020muscle called the brain and understand that this is taking place. Use your conspiratorial mind. A
00:34:30.860conspiratorial mind of, those bastards, sorry. But don't go down that rabbit hole, because you're
00:34:38.940wasting your brain energy. Just know that it's happening. Maybe suspect the usual suspects, and
00:34:46.640then say, I'm going to find the things worth saving. You can't save it all. But what we can
00:34:52.400do is people in political places can say, stop throwing away data at a state, local, federal
00:34:58.840level. If you're in a corporation, stop throwing away your libraries. Right now, libraries are
00:35:04.720being thrown away at major corporations. It's a bunch of junk. Nobody reads anymore. Somebody
00:35:09.640digitized it, I'm sure. There was a guy named Patrick. Is he still with us? No, boss. He retired
00:35:15.300five years ago. He's, you know... Well, he knows where everything is. Well, you know, throw it
00:35:21.780away. It's history. Who cares? We're a young, modern company. All the great ideas come from somebody
00:35:27.460who quit Stanford and is now 23 years old, and now we're going to pray to them because they know
00:35:33.140vector math. And, I don't know, that's temporary. You know, sooner or later... Well, they're already
00:35:40.500being replaced, right? The jobs, the people getting replaced are the programmers, right?
00:35:46.500And they were so short-sighted, they didn't really understand what was going to take place.
00:35:51.780No, I'm still valuable. Yes, you are, but not in the way you thought you were. You got de-skilled.
00:35:57.160I talked about that last time. I got de-skilled. I used to know how to use the slide rule. I kind
00:36:01.660of do still, but that's no longer a skill that has value. We just spoke about that yesterday. Is it,
00:36:08.460you know, should students be learning with calculators and all these fancy tools or should
00:36:14.460they just be allowed to, you know... should they have to know how to do it manually, as a skill,
00:36:20.720or should they be able to use the tool that makes it easy? And we were saying like, if it all goes
00:36:25.160away, then you don't know how to do it. The process of thinking through it is being lost.
00:36:33.420I think Owen had, like, a study, right, where the process was being lost. You just get to
00:36:41.340the answer. One aspect of this is that, at least in the current technology, and, you know, I'm sure,
00:36:46.120Brian, you're aware there may be other ones coming that are more brain-like, but the ones that are
00:36:49.420currently there, they're kind of mimicking thinking. They're not really thinking. And the example that
00:36:54.160was given was, some scientists put up a fake study and they made it obviously fake. Like, they said, you
00:37:00.280know, thanks to the Starfleet Academy. And they even had sentences in the study saying this is
00:37:04.900all made up several times, but it got fed into AI. And then Copilot started saying, oh yeah,
00:37:10.840there is this new rare condition, you know, because it was a fake disease that they made up.
00:37:14.840And I think Gemini or something else did the same thing. We're like, oh yeah, that's a real thing.
00:37:19.520And it just shows that AI doesn't have much judgment. It doesn't have the critical thinking.
00:37:26.180it doesn't have the ability to say, this is good data and this is bad data. It just takes it all in,
00:37:31.060and now it averages it all out. You guys are on a brilliant track here. I'll give you two words:
00:37:36.660wisdom and discernment. Wisdom is not a function of the current AI platforms that we have today,
00:37:44.580even AGI, artificial general intelligence, even artificial super intelligence. Wisdom is a
00:37:50.820construct of the human brain, and we can go into what that truly is. But we are drowning in a sea of
00:37:57.540data, and we are unable to understand that we need to save wisdom. Data is the ocean. Wisdom
00:38:06.500is the life preserver that you grab onto within all of that data that you're riding. Data is
00:38:13.060useless unless you have wisdom. Discernment is the ability to decide what data is of value.
00:38:19.880That is instinctual. It cannot really be fully broken down into math. I have a number of different algorithms. I use a math equation. You can look up the rebellious bee equation. In a beehive, about 12% of bees will not follow the other bees, right?
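The 12% figure can be illustrated with a toy simulation. To be clear, this sketch and its parameters are my own illustration of the idea, not the actual "rebellious bee equation" Brian refers to: a fixed fraction of foragers simply ignores the waggle-dance consensus and scouts at random.

```python
# Toy model: each bee independently either follows the advertised food site
# or rebels and explores a random one. rebel_rate ~0.12 mirrors the "about
# 12% of bees will not follow" figure mentioned in the conversation.
import random

def forage(n_bees: int = 1000, rebel_rate: float = 0.12, seed: int = 42):
    """Split a hive into followers and rebels. Returns (followers, rebels)."""
    rng = random.Random(seed)  # seeded for reproducibility
    followers, rebels = 0, 0
    for _ in range(n_bees):
        if rng.random() < rebel_rate:
            rebels += 1      # ignores the dance, scouts a new site
        else:
            followers += 1   # follows the dance to the known site
    return followers, rebels
```

The point of the model is the same as the speaker's: a minority that refuses the consensus is what keeps the colony discovering sites the majority would never find.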
00:38:44.340If they do this bee dance and there are pheromones,
00:38:46.560it's a complicated thing, and they all go,