The Joe Rogan Experience - August 28, 2025


Joe Rogan Experience #2372 - Garry Nolan


Episode Stats

Length: 2 hours and 37 minutes
Words per Minute: 164.04
Word Count: 25,841
Sentence Count: 1,692
Misogynist Sentences: 6
Hate Speech Sentences: 22


Summary

In this episode of the Joe Rogan Experience podcast, I sit down with Dr. Garry Nolan, a professor at the School of Medicine at Stanford, to talk about the science of cancer and how tumors evolve to trick the immune system into not recognizing them.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out.
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 There, very nice to meet you, sir.
00:00:14.000 Nice to meet you.
00:00:14.000 Thank you for doing this.
00:00:15.000 I really appreciate it.
00:00:17.000 Tell everybody what you do.
00:00:18.000 Tell everybody what your official position is.
00:00:21.000 You're a professor at the School of Medicine at Stanford.
00:00:25.000 What do you do?
00:00:26.000 So my day job is in cancer research and cancer biology, mostly immunology and cancer.
00:00:33.000 Much of what my laboratory does is not so much the biology of cancer, but developing the instruments that create the data that allow us to analyze the complexities of how the immune system interacts with tumors, and how tumors basically re-enable the immune system to help the cancer itself.
00:00:54.000 So the problem has been that we didn't have the ability, not until recently, to collect enough data and understand what all of that means.
00:01:02.000 So we've been kind of poking in the dark for decades.
00:01:06.000 And so probably for the last twenty years, I've developed a number of instruments and turned them into companies that allow everyone to access a level of information they couldn't get before.
00:01:17.000 So explain that. The immune system allows the tumors?
00:01:24.000 So what happens is that there's a sort of a dance between the mutations that initiate a tumor and then a sort of an evolution of how the tumor eventually learns how to trick the immune system into not recognizing it.
00:01:41.000 So we have all kinds of internal controls. I mean, literally, every day, every person, you'll develop five cancer-like objects inside your body.
00:01:50.000 But the immune system and your body have a way of shutting it down very quickly.
00:01:55.000 But with enough time and with enough variation, tumors will eventually evolve in a way that tricks the immune system not only into not recognizing them, but in fact into helping them and feeding them, creating an inflammatory environment that the tumor then uses to propagate its own cell division and then metastasis.
00:02:15.000 So it's a normal function of natural human biology to create tumors.
00:02:21.000 It's not so much a normal function, it's a byproduct of what evolution is: the genes mutate when a cell divides, or if you go out and stand in the sun, they mutate.
00:02:33.000 For instance, you get skin cancers because you're getting ionizing radiation that's changing the DNA, making a mutation, and some of those random mutations will initiate a cancer.
00:02:43.000 So, for instance, I have a mutation called MITF E318K.
00:02:49.000 It's a mutation that I was born with, it wasn't in my family, and it causes both melanoma and kidney cancer, both of which I've had.
00:02:57.000 I've had a dozen melanomas alone.
00:03:01.000 You know, we didn't find that out until a couple of years ago, but I've been following it over the years, and we basically figured out, okay, it's going to have to be this.
00:03:09.000 So I had my genome sequenced.
00:03:11.000 But that's just one of hundreds of different kinds of mutations that can occur that are on a path towards creating a cancer.
00:03:20.000 But the cancer can't survive if the immune system recognizes it.
00:03:25.000 So eventually what happens is there's this detente that is reached between the immune system and the cancer where the immune system basically ignores the cancer.
00:03:35.000 So Jim Allison here in Houston won the Nobel Prize back in 2018 for understanding one of these turn-off signals that cancers use to shut down the immune system, and for showing that he could block it. His wife, Pam Sharma, ran a bunch of clinical trials at MD Anderson that showed this could actually turn melanoma from a five percent survival disease into a fifty percent survival one.
00:04:03.000 And that then created the whole immunotherapy field that the world is taking advantage of today.
00:04:10.000 Wow.
00:04:10.000 So what is cancer actually doing?
00:04:15.000 Like, how do tumors develop this ability to trick the immune system?
00:04:20.000 Is this something that other animals have?
00:04:23.000 Oh, yeah.
00:04:24.000 So it's a constant.
00:04:25.000 It's a constant battle.
00:04:27.000 So, for instance, there are proteins on your cell's surface.
00:04:31.000 I won't go too immunologically deep about it.
00:04:33.000 They're called major histocompatibility complex proteins.
00:04:37.000 So for instance, if I were to try to just randomly do a tissue transplant from me to you, it's very likely that it would be rejected.
00:04:44.000 And it's because of those MHC proteins that it's rejected.
00:04:49.000 What's happening is that your cells are presenting your internal cell biology to the immune system.
00:04:56.000 And it's saying, okay, you're a friend, not a foe.
00:05:00.000 So when cancer usually initiates, there are disruptions that happen and proteins are made incorrectly, and what the MHC proteins are doing in some cases is presenting the internal damage to the body.
00:05:14.000 And the body's saying, oh, there's something wrong with this cell.
00:05:17.000 We better wipe it out.
00:05:18.000 We kill it.
00:05:19.000 These same proteins are what the immune system uses, for instance, to go after viruses.
00:05:24.000 So when you get a virus infection inside the cell, the body has a way of chopping those proteins inside the cell, presenting it via MHC.
00:05:33.000 And then the immune system attacks it.
00:05:35.000 So one of the first things that tumors actually do is they learn to turn off the MHC proteins inside themselves.
00:05:42.000 So the ability to show that I'm damaged is shut down.
00:05:46.000 And so the immune system doesn't go on full alert for that.
00:05:50.000 But then there are other mutations, like ones that let the cell divide when it's not supposed to, you know, or avoid this kind of induced cell death called apoptosis.
00:06:00.000 And so cancer doesn't just, like, start and then the next day you've got it.
00:06:05.000 It's a progression of events.
00:06:07.000 You have these precancerous lesions.
00:06:09.000 You have like a benign tumor which eventually becomes a metastatic tumor.
00:06:15.000 But the immune system is key at every stage of the development, because if you can reactivate the immune system in just the right way, then you can prevent the cancer from spreading, or from metastasizing, or from killing you, essentially.
00:06:35.000 Given the understanding of this, is there a potential for using this for organ transplant patients, where locally the immune system would stop recognizing the transplant as a foreign organ?
00:06:50.000 That's exactly what is done.
00:06:52.000 In fact, when you get a tissue transplant or an organ transplant, you're suppressing the immune system.
00:06:59.000 The problem with that suppression is that you then put yourself at risk of cancer.
00:07:05.000 Because what you're doing is you're turning off the immune system's ability to combat and go after a cancer in the moment it forms.
00:07:12.000 So most people who are under immune suppression are at risk both of, let's say, virus infections, bacterial infections, but also further cancers.
00:07:22.000 So would the potential be to turn that off locally, so you could limit the suppression to the specific organ?
00:07:28.000 That would be a great thing to do if we could.
00:07:31.000 Right now, the only things that we have are systemic.
00:07:35.000 So yeah, I mean, for instance, if you could deliver an immunosuppressive locally to the organ that you're transplanting.
00:07:45.000 That would be great.
00:07:46.000 We don't have that yet.
00:07:48.000 But that would be via a form of gene therapy.
00:07:50.000 But the problem would be that, let's say you had a lung transplant.
00:07:55.000 If you had a lung infection, it would be catastrophic.
00:07:57.000 Do you want to come work in my lab?
00:07:59.000 You're accepted as a graduate student in the Stanford Department of Pathology.
00:08:04.000 Well, that was easy.
00:08:07.000 I have a few friends that have had organ transplants.
00:08:11.000 And it's, you know, it's very disturbing knowing that they're so vulnerable to any kind of infection because of these medications that they have to take in order for the body to accept the transplant.
00:08:21.000 One of the problems is that there are literally hundreds of different types of immune cells.
00:08:26.000 And really until recently, and frankly until a technology my lab developed over a dozen years ago, we couldn't look at all of the immune cell types all at once in a single picture.
00:08:39.000 So I came from the laboratory of Len and Lee Herzenberg, when I was a grad student at Stanford, and they had developed an instrument called the fluorescence-activated cell sorter.
00:08:48.000 And that allowed you to look at three proteins at a time.
00:08:51.000 And if you knew ahead of time what the cell types were that expressed the proteins that you're interested in, you could look at just those three cell types.
00:09:00.000 Then I came up with a way to look at, you know, fifty or sixty proteins at a time, sort of stepping up what they had already taught me how to do.
00:09:09.000 And then suddenly that gave us the ability to look at nearly every immune cell type in the body.
00:09:16.000 And then that gave us, let's say, the raw data to build mathematical models that let us make better predictions of what outcomes would be.
00:09:24.000 And how is that, like, what are you, what are you applying in terms of like real world scenarios?
00:09:30.000 How are you applying this?
00:09:31.000 Well, so for instance, there's a kind of leukemia called AML, acute myelogenous leukemia.
00:09:39.000 It starts in the bone marrow.
00:09:41.000 And it is a distorted version of a myeloid cell type.
00:09:47.000 It starts as a stem cell, and that stem cell goes down a number of different paths.
00:09:53.000 And depending upon the person, the disease is sufficiently different that it might follow a slightly different path towards what becomes the disease itself.
00:10:04.000 And so being able to trace the path, and to know which steps it takes along the way to become the metastatic leukemia, could only be accomplished by having enough markers to let us trace everybody along the path.
00:10:22.000 It's kind of like if I wanted to follow you from who you are as an egg through development through to who you are today and I had snapshots every month, I need different markers to measure what you are as an egg versus what you are as a baby versus what you are as an adult.
00:10:41.000 And so each of those different markers in my world would be different proteins that tell me something about an adult leukemia versus a baby leukemia.
00:10:51.000 And then we use something called pseudo time, which is a mathematical concept that allows us to stitch together those photographs.
00:10:58.000 I could take a random box of photos of you from an egg to who you are today, and I could just by hand put together the most likely path and sequence of what you were from the earliest to the latest.
00:11:09.000 But we needed the data and we needed the means and the instruments to collect that information so that then the math could come into play.
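[Editor's note: the pseudotime idea Nolan describes, recovering the most likely ordering of unordered snapshots from their similarity, can be illustrated with a toy sketch. This is not the actual algorithm his lab uses; the function names and the greedy nearest-neighbor chaining here are invented for the example.]

```python
# Toy pseudotime sketch: given shuffled "snapshots" (vectors of marker
# measurements), recover a likely ordering by starting from a known root
# (the "egg") and repeatedly stepping to the most similar unvisited snapshot.
import math

def euclidean(a, b):
    """Distance between two marker-measurement vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def pseudotime_order(snapshots, root_index=0):
    """Return snapshot indices ordered from the root along a greedy
    nearest-neighbor chain (a crude stand-in for real pseudotime methods)."""
    remaining = set(range(len(snapshots)))
    order = [root_index]
    remaining.remove(root_index)
    while remaining:
        current = snapshots[order[-1]]
        # Next point on the path = closest snapshot not yet placed.
        nxt = min(remaining, key=lambda i: euclidean(current, snapshots[i]))
        order.append(nxt)
        remaining.remove(nxt)
    return order

# Five shuffled snapshots of a smooth progression in two markers.
shuffled = [(0.0, 0.0), (4.0, 4.1), (1.0, 0.9), (3.0, 3.2), (2.0, 2.1)]
print(pseudotime_order(shuffled))  # → [0, 2, 4, 3, 1], root to most developed
```

Real single-cell pseudotime methods use far more robust machinery (graph diffusion, minimum spanning trees) over thousands of cells, but the ordering intuition is the same as sorting the box of photographs by similarity.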
00:11:16.000 That's such a fascinating thing about human beings, the biological variability.
00:11:22.000 Everybody is so much the same.
00:11:27.000 Two lungs, a heart, but so different in how our body reacts to things.
00:11:33.000 and what happens to us and environmental factors, diet, stress, all sorts of different factors.
00:11:42.000 And you're kind of piecing together this puzzle.
00:11:46.000 Of all these things.
00:11:46.000 Right.
00:11:48.000 But what you're doing is you still have to pay homage to the fact that those differences exist.
00:11:53.000 And so while, you know, my cancer might be the same class as, let's say, another person's melanoma, the complexities of what allowed that cancer to develop are so different that the drugs that would work for me might not work for another person.
00:12:11.000 And so that's what basically requires us to personalize the medications in a way that gives the right drug to the right person.
00:12:23.000 So I've started probably half a dozen companies and sold them, places like Roche, et cetera.
00:12:28.000 Actually, my most recent company we sold to 10x Genomics, which, because of a patent I created back in 2011, enables them to scale up the amount of information that we can collect at a time. Layered on top of what 10x Genomics already did, which is what's called single-cell genomic analysis, that lets us scale up a hundredfold to get a hundredfold the amount of information.
00:12:56.000 But the problem with that is that I can collect all that data and make an analysis of a cancer for you, but it might be a little bit different than another person.
00:13:08.000 So what we have to do then is develop techniques that allow us to narrow in on what the differences might be so that when I develop a drug for person X, it works for person X and not for person Y, right?
00:13:21.000 The right way.
00:13:22.000 So there's a lot of personalization in medicine that is required.
00:13:28.000 The diversity that makes humanity great and that makes humanity able to survive in the face of so many challenges is that there are individual differences that one person might survive and another won't.
00:13:43.000 It's the same thing with cancers.
00:13:45.000 And it's the same thing with drugs.
00:13:47.000 I mean, you know, for instance, with certain drugs, one of the first things I learned in pharmacology way back in the day is that there's always a benefit-to-damage ratio that you're having to deal with.
00:14:02.000 That a drug has a positive outcome, but there are side effects.
00:14:06.000 And so as scientists or as clinicians, we make a choice based on the statistics.
00:14:12.000 Who will benefit the most?
00:14:14.000 And will it benefit the most?
00:14:15.000 But by the way, there's all these side effects that might affect you.
00:14:19.000 And overall, globally, 60% of people will survive.
00:14:24.000 But since I don't know anything more about your specific disease, I am by law required to give you the 60% drug.
00:14:34.000 until I know or can distinguish that your disease is a different subclass than the 60%.
00:14:40.000 And that's, in fact, a lot of what pharmaceutical companies are doing is they're trying to marry a diagnostic to the disease itself, the disease subtype itself, so that if you can show that 90% of the people of this kind of subclass will survive, you have to, by law, choose that diagnostic to make sure that the person doesn't have the subclass before you give them the 60% drug.
00:15:06.000 Does that make sense?
00:15:07.000 Yes.
00:15:08.000 Yeah, it does.
00:15:09.000 The narrative has always been over the last few decades, stay out of the sun.
00:15:16.000 But recently, people have started saying, no, it's actually you need to become accustomed to the sun.
00:15:23.000 And the real issue is people using sunscreen all the time and then going out and getting burned.
00:15:27.000 Obviously, your situation is very different.
00:15:30.000 Because you have a specific gene.
00:15:32.000 And I'm Irish.
00:15:33.000 Yeah.
00:15:35.000 That's the problem, right?
00:15:36.000 The genes of the people that lived in cloudy ass places.
00:15:36.000 Yeah.
00:15:40.000 Right.
00:15:40.000 Exactly.
00:15:41.000 Thousands of years.
00:15:42.000 And my mother when we were kids, I mean, I'm 64 years old.
00:15:45.000 So when I was a kid, you know, we'd go to the beach in Connecticut and they'd suffocate me in, you know, coconut oil.
00:15:52.000 Oh yeah.
00:15:53.000 Yeah.
00:15:54.000 Baby oil when I was a kid.
00:15:55.000 Everybody had baby oil.
00:15:56.000 And everyone got barbecued.
00:15:57.000 Yeah.
00:15:58.000 Plus I worked in the fields as a kid for, you know, farm labor.
00:16:02.000 And that's not good.
00:16:03.000 That wasn't good.
00:16:04.000 The burning, that's the real damage to the skin, and then it manifests itself as cancer much later in life, right?
00:16:09.000 Right.
00:16:10.000 There's all these subtle, let's call them, smoldering mutations that are waiting for a second or a third hit to occur, or, for instance, you get old enough that your immune system is kind of going wonky, and it's no longer able to take care of something that twenty years ago it would have been able to heal perfectly well.
00:16:31.000 That makes sense.
00:16:32.000 So is there anything to this narrative that you need to be in the sun more and that you just shouldn't get burned?
00:16:41.000 Is that reality?
00:16:42.000 Well, it depends on who you are. I mean, for someone like me, no.
00:16:45.000 But there are positives, obviously, to the sun.
00:16:48.000 I mean, vitamin D, as an example.
00:16:51.000 But there's also, you know, resetting your clock in the morning rather than taking melatonin at night.
00:16:56.000 Go out and just, you know, use glass to shield out the ultraviolet and get some bright light.
00:17:03.000 It's the UV that's the danger.
00:17:05.000 It's not light.
00:17:08.000 So for you, you don't you don't ever just go sit in the sun.
00:17:12.000 No, but I was.
00:17:12.000 Not anymore.
00:17:15.000 Because I was an idiot when I was a kid.
00:17:16.000 I mean, I would go and use tanning beds, because I thought, well, I wanted to look, you know, tan.
00:17:21.000 And I did tan back then, but, you know, obviously can't anymore.
00:17:21.000 Right.
00:17:25.000 Yeah, you don't really see those anymore, do you?
00:17:27.000 No.
00:17:28.000 You do.
00:17:28.000 Maybe in like Seattle.
00:17:30.000 Some people do.
00:17:31.000 Yeah, there's, you know, I mean, I think there's obviously, there's a benefit to light.
00:17:35.000 I mean, I'm not saying don't go out and do it.
00:17:37.000 And if, and, you know, I think as well, there'll come a day, and I was just talking with some friends of mine at dinner last night, is, you know, maybe with things like CRISPR, I could rub a CRISPR ointment on my body.
00:17:52.000 It would fix the single point mutation in my skin and then I could enjoy the sun again.
00:17:59.000 Is that really a possibility?
00:18:01.000 Don't you think?
00:18:01.000 Oh yeah.
00:18:02.000 Oh yeah.
00:18:03.000 How far away are we?
00:18:03.000 I think.
00:18:04.000 I think honestly, I mean, people always say five years is sort of like this horizon.
00:18:08.000 But no, I really... I mean, I know people who are already developing systems for delivering genes, you know, RNA, to cells.
00:18:16.000 I know that's a dirty word in some circles, but there are formulations of RNA that probably won't be as problematic as some of the things that maybe the COVID vaccine might have done.
00:18:26.000 Right.
00:18:26.000 Yeah.
00:18:27.000 RNA right now, you say, and people clench.
00:18:30.000 Yes, exactly.
00:18:31.000 But I mean, your cells are full of RNA.
00:18:31.000 Yeah.
00:18:35.000 So, I mean, you can't get away from the fact that your cells are full of RNA.
00:18:38.000 That's just the messenger.
00:18:39.000 That's the name.
00:18:40.000 Yeah.
00:18:41.000 But it's also the means by which they delivered it, right?
00:18:43.000 I mean, the means by which it was delivered was a formulation of a nucleotide that by itself was meant to be something called an adjuvant.
00:18:54.000 An adjuvant is something which activates the immune system the way you want.
00:18:57.000 I mean, when you get a vaccination, you are co-injected with something that hyperactivates the immune system to say, come hither.
00:19:05.000 Right.
00:19:07.000 And most of the pain that you get from an injection is not the vaccine itself, it's the adjuvant.
00:19:12.000 Right.
00:19:12.000 This episode is brought to you by OnXHunt.
00:19:15.000 Hunters listen up.
00:19:16.000 Millions of hunters use the OnX Hunt app, and here's why.
00:19:19.000 It turns your phone into a GPS that works anywhere, even without mobile phone service.
00:19:24.000 You'll see exactly where you are, every property line, and who owns the land.
00:19:29.000 You can connect your cellular trail cams, drop custom waypoints, dial in the wind, and a whole lot more.
00:19:36.000 Whether you're chasing elk on public, finding the back corners of your deer lease, or knocking on doors for permission, OnX Hunt gives you the knowledge and confidence to make every hunt more successful.
00:19:49.000 No more second-guessing boundaries, wasting daylight, or wondering what's over the next ridge.
00:19:54.000 You'll know every single step.
00:19:57.000 The best hunters aren't lucky, they're prepared.
00:20:00.000 This is how you get there.
00:20:01.000 So before your next hunt, get OnX Hunt: download it today and use the code JRE for 20% off your membership at onxhunt.com.
00:20:12.000 And so the problem with this was that it turned your whole body into like a spike protein factory.
00:20:19.000 Yeah.
00:20:19.000 Well, at least locally.
00:20:20.000 Yeah.
00:20:21.000 Yeah.
00:20:21.000 No, I've read some of the work.
00:20:23.000 But not always locally, right?
00:20:25.000 Because they didn't aspirate with a lot of people.
00:20:29.000 Yeah.
00:20:30.000 They were not going to aspirate with anybody.
00:20:30.000 Yeah.
00:20:32.000 They didn't do it with anybody.
00:20:32.000 They didn't do it with the president on TV.
00:20:34.000 But if you get infected by a virus, it's all over your whole body anyway.
00:20:40.000 So it's whether the spike protein itself was problematic.
00:20:44.000 And so, you know, I know I'll annoy somebody on one side or the other by saying anything around this area, and I'm not here to cause any controversy.
00:20:54.000 But, you know, your immune system works, but if you can try...
00:21:06.000 The question is back to this cost-benefit ratio.
00:21:10.000 Is the benefit to the larger statistical population worth it, knowing that some people are going to be hurt by it or not?
00:21:19.000 That's the question.
00:21:20.000 So for instance, you know, back to cancer and vaccines, there's a number of cancer vaccines coming down the pike that, for people like me, would be... I mean, given that I get something lopped off of me four times a year.
00:21:34.000 Really?
00:21:35.000 Oh yeah, you should see me.
00:21:36.000 I look like I've been in a war zone.
00:21:39.000 Some people say, oh, that's hot.
00:21:40.000 And that's what you want.
00:21:42.000 That's hot.
00:21:44.000 Wow.
00:21:45.000 Someone is into cutters?
00:21:46.000 Yeah, exactly.
00:21:48.000 Exactly.
00:21:49.000 So that's so fascinating.
00:21:51.000 But is there another way that could potentially deal with those things other than cutting them off or is that the only way to remove it from your system?
00:21:59.000 Right now, it has to be cut off.
00:22:01.000 So the issue is that once these melanoma lesions are on your skin, they will expand.
00:22:10.000 Luckily, most of mine are what are called superficial spreading, although one of mine was what's called nodular, which basically dives right in.
00:22:10.000 Yes.
00:22:16.000 And believe it or not, my dog found it and was sniffing at it on my arm.
00:22:20.000 Really?
00:22:21.000 And, like, started scratching at it, and it started bleeding.
00:22:23.000 Yeah, I'll show you the cicatrix.
00:22:25.000 What kind of dog did you have?
00:22:26.000 Well, this was 15 years ago.
00:22:28.000 He was a Pomeranian, but you can see the scar there.
00:22:32.000 Oh, that's crazy.
00:22:33.000 And it wouldn't stop bleeding.
00:22:35.000 And so, you know, I went in and had it looked at.
00:22:37.000 And they said another week and it would have metastasized.
00:22:40.000 Yeah.
00:22:41.000 Wow.
00:22:42.000 He has a great dog.
00:22:42.000 Yeah.
00:22:44.000 He was great.
00:22:44.000 Yeah.
00:22:44.000 He was great.
00:22:45.000 But, you know, for instance, if you can catch most of these cancers early, that's what's important.
00:22:55.000 So I think probably one of the most important, let's say, changes to our medical system that could be initiated would be, frankly, the use of things like MRI, not CT scans, because CT scans are known to cause cancer.
00:23:07.000 Which is so crazy.
00:23:08.000 Yeah.
00:23:08.000 Yeah.
00:23:09.000 Like when did we figure that out?
00:23:10.000 I mean, there was a big study just published recently that said, here's what happens to people once CT scans were implemented, and you see this sudden spike. I mean, again, it's this cost-benefit ratio.
00:23:26.000 If you didn't have it, certain people wouldn't have, you know, wouldn't know that they have a giant tumor in there.
00:23:32.000 Right.
00:23:32.000 I mean, so for instance, when I had kidney cancer, I was at a restaurant with friends, doing a business deal, actually.
00:23:39.000 And I went to the bathroom and it was blood.
00:23:41.000 And I said, okay, we have to go to the emergency room, like, now.
00:23:46.000 And then they did a CT scan, and they see this, the vascular tree around my kidney was just a big diffuse mess, and they came in and said, you've got cancer.
00:23:57.000 Did you have to have your kidney removed?
00:23:59.000 Yeah, yeah, yeah.
00:24:00.000 It was, um, but you know, it's okay, I'm alive.
00:24:03.000 It's nice to have two of them.
00:24:04.000 Yes, exactly.
00:24:05.000 Um, I'm alive, but, uh, you know, this early detection is important.
00:24:12.000 I mean, I was lucky that it hadn't metastasized yet.
00:24:15.000 It's called, it was called a clear cell renal cell carcinoma.
00:24:19.000 Um, but, you know.
00:24:24.000 Surveying the body, and these companies that are out there right now which do it, I think are really important, because even if you are young and you have no suspicion that you're going to have cancer, having that baseline against which you can compare later changes is important, because I could do, for instance, a CT scan or an MRI of you and find lots of little anomalies, and they're generally in the field called incidentalomas.
00:24:52.000 They're these objects that may be worrying, but we won't know whether they're worrying, and we certainly wouldn't do a biopsy of them and poke a needle into your chest to pick out a piece of it.
00:25:07.000 But if I come back in six months and it's changed, then maybe it's something we need to go after more seriously.
00:25:13.000 So getting those kinds of regular scans, I think is probably one of the more important things that could be done, but not by a CT scan.
00:25:22.000 Which is crazy, because we've been doing them for so long.
00:25:25.000 They still do CT scans though because it's necessary to be aware of certain things.
00:25:31.000 Right, but letting people know this might cause cancer is just, like, yikes.
00:25:36.000 Yeah, but maybe, for instance, there'd be a way to treat someone
00:25:41.000 with a drug in advance that would minimize the effect of the CT scan.
00:25:47.000 Ah.
00:25:48.000 So that, you know, because the CT scans are generally causing oxidative damage.
00:25:48.000 Right?
00:25:53.000 And so if you could provide a local antioxidant, and I'm not saying that something like this exists, it's a bit of a naive statement.
00:26:01.000 But if you could do that locally to the area that's being imaged or to the whole body, then maybe CT scans could be lessened in their problematic outcomes.
00:26:12.000 I would say innovative and hopeful.
00:26:13.000 Okay.
00:26:14.000 Yeah.
00:26:14.000 I would say naive.
00:26:15.000 Yeah.
00:26:15.000 I don't think it's naive because you're recognizing the issue.
00:26:18.000 Right.
00:26:18.000 Thank you.
00:26:19.000 So this was also a problem with X-rays, right?
00:26:23.000 Like X-ray technicians, I've seen some of those images of people's hands because the technician used to have to use their own hand to check to make sure the X-ray was functional.
00:26:33.000 And over the years, they go, hey, what the fuck is wrong with my hand?
00:26:37.000 And then they realize, oh boy.
00:26:38.000 Right.
00:26:39.000 Yeah.
00:26:39.000 Well, it's interesting because what's happening with X-rays or CT scans is a fast forward of the kind of random damage that causes cancer in the first place.
00:26:49.000 And so because it's random, let me kind of go back a little bit as to why does cancer happen in the first place?
00:26:56.000 So let's go way back in evolution to the first time that there were single cells versus the first time that two cells met each other and said it was better to join forces and cooperate rather than to divide at each other's expense.
00:27:11.000 So in the process of that happening, those two cells came together or three or four cells.
00:27:15.000 They basically said, together we're better than alone.
00:27:19.000 But there were actually social compacts and contracts that at the genetic level were being formed between all of these cells.
00:27:26.000 And so as things got more and more complex, more and more complex contracts were formed to the point at which what could happen is that any one of the breaking of a complex contract could actually then initiate a cascade that becomes cancer.
00:27:41.000 So rather than thinking of cancer as a forward progression in evolution, another way to think about it is that it's a devolution back to the core fire of the desire to divide.
00:27:57.000 And so by breaking the contracts, by breaking the controls on the system, cancer is allowed to blossom.
00:28:06.000 So the problem is that every tissue type, whether you're lung or brain or whatever, has a whole different ecosystem of contracts that have been formed.
00:28:17.000 And so there's no one size fits all drug that will kill off all cancers because the contracts are different.
00:28:24.000 It's not like you can bring in a lawyer and fix, you know, agricultural contracts versus maritime or whatever.
00:28:32.000 So, you know, you have to have a flexible enough mindset, because if you get stuck on the idea that it's a forward evolution, as opposed to a breaking of contracts, you might miss out on an opportunity for how to develop a therapy or a drug that would help people.
00:28:52.000 One of the things that I wanted to ask you, I don't even know if you know anything about this, but is there a connection between IVF and cancer? Because you have to take some pretty extreme hormones.
00:29:06.000 There's a lot of stuff that women have to take.
00:29:08.000 Is there a connection between that and hormonal related tumors?
00:29:13.000 I honestly don't know.
00:29:15.000 So I don't want to opine and then have half my colleagues send me emails tomorrow scolding me.
00:29:20.000 Okay, good.
00:29:21.000 Well, I'm glad you answered that way.
00:29:24.000 I was told by someone who I really trust that there is.
00:29:26.000 And then we tried to Google it and it said there's not, but that's not surprising.
00:29:30.000 Probably there hasn't been the right kind of study yet.
00:29:34.000 And if there is not, there should be.
00:29:38.000 I mean, certainly any hormonal imbalance is not a good thing.
00:29:42.000 I mean, you imbalance the metabolism of the system and you can.
00:29:45.000 I mean, so, for instance, back to my specific disease with MIDEF, there's all kinds of things like NMN, N-acetylcysteine, betaine, all these other drugs that are out there for longevity.
00:30:00.000 Well, if I look into the metabolism of what my cancer is, every single one of those is a disaster for me.
00:30:07.000 It accelerates.
00:30:07.000 Yeah.
00:30:08.000 Not good.
00:30:08.000 Yeah.
00:30:09.000 Not good.
00:30:15.000 You know, people often say, you know, scientists are not religious.
00:30:20.000 There's nothing that inspires more awe in me than knowing the complexity of the cell and knowing the complexity of life.
00:30:30.000 seeing all this feedback and mechanism and knowing that underneath that is a universe with particles, etc., that enabled something like us to exist.
00:30:38.000 I just see...
00:30:42.000 Well, yeah, it's awe inspiring for sure.
00:30:44.000 I mean, anybody who doesn't think it is is not paying attention or they're purposely being ignorant.
00:30:49.000 Right.
00:30:50.000 Yeah.
00:30:51.000 We get a lot of that though.
00:30:52.000 Oh, yeah.
00:30:52.000 Well, that's okay.
00:30:54.000 You know, teachers are here to hopefully teach and not preach.
00:30:58.000 Yeah.
00:30:58.000 Hopefully.
00:31:00.000 Because of your specific type of cancer and your situation, do you have to, like, very closely monitor your diet?
00:31:08.000 I probably shouldn't eat as much meat as I do.
00:31:10.000 Meat?
00:31:11.000 Yeah.
00:31:12.000 Why meat?
00:31:13.000 Well, because, you know, fats.
00:31:15.000 And a lot of them, the fats dissolve a fair number of toxins.
00:31:21.000 You know, it's not necessarily a good thing.
00:31:24.000 I mean, it's been relatively well shown that too much meat is a problem. I'm not advocating vegetarianism; I think there's a happy medium.
00:31:32.000 I mean, we grew up in an environment where we had both.
00:31:35.000 I mean, we're omnivores.
00:31:37.000 And we succeeded, I think, because we're omnivores as a society, as a, you know, as a civilization.
00:31:43.000 So, but, you know, charred meat, for me, that's the issue though, isn't it?
00:31:48.000 Yeah.
00:31:49.000 Isn't it burnt?
00:31:50.000 Yeah.
00:31:50.000 I mean, it's carcinogens.
00:31:51.000 I mean, you know, you're making all kinds of them. It's a witch's brew of nastiness that tastes good.
00:31:59.000 But, you know, the reason why it tastes good is because the humans who survived learned to use fire to kill off the bacteria in rotten meat.
00:32:09.000 And so the flavor of that probably was engineered into our evolution.
00:32:16.000 But again, it's a cost benefit.
00:32:18.000 But didn't the cooking of it also allow us to absorb more protein?
00:32:23.000 I'm not sure about that.
00:32:24.000 I believe so.
00:32:25.000 Okay, that could be.
00:32:26.000 I believe that's the case that the cooking meat actually allows it to be more easily absorbed by the body.
00:32:32.000 Could be broken down more readily.
00:32:34.000 But certainly it kills bacteria.
00:32:37.000 So, you know, day old or three day old deer.
00:32:39.000 Right.
00:32:40.000 You know, that you eat.
00:32:41.000 We're not a bear.
00:32:42.000 Yeah.
00:32:42.000 Or not.
00:32:43.000 Yeah.
00:32:44.000 So, you know, I mean, yeah, we're not vultures that seem to have digestive systems that can handle all of that.
00:32:51.000 So you should eat less meat.
00:32:53.000 What else?
00:32:54.000 Do you avoid sugar, which seems to be a real problem with cancer?
00:32:57.000 Yeah, I avoid, yeah, I avoid too much sugar.
00:33:00.000 Yeah, thanks for this, by the way.
00:33:02.000 Is that sugar free?
00:33:02.000 It's no.
00:33:03.000 Is that one not?
00:33:04.000 No, it's Oh, we have sugar free ones.
00:33:07.000 No, because the sugar free ones have stuff in them that are just as bad, xylitol and all the other things.
00:33:11.000 What about stevia?
00:33:12.000 Yeah, that would be stevia for you?
00:33:14.000 I don't think so.
00:33:15.000 I haven't seen anything on that.
00:33:17.000 But you know, I mean, look, I guess I said, I'm 64.
00:33:20.000 It's way too late.
00:33:21.000 And every time that, let's say, scientists make some
00:33:25.000 grand prediction of what's good or bad,
00:33:26.000 five years later, we find and update what it should have been.
00:33:30.000 I mean, I often say this, and this is true.
00:33:33.000 The goal of science, or scientists, is to be right today, or even wrong today, but righter tomorrow.
00:33:40.000 Because we're always back checking what the results are and what they mean in the context of a bigger picture.
00:33:45.000 I like how you say good science because that's part of the problem is that ego gets attached to ideas that have already been discussed and published.
00:33:56.000 And then people are very reluctant to accept new evidence that's contrary to that.
00:34:01.000 Yeah.
00:34:02.000 Yeah, I mean, as always, as I often say, you know, in the context of something I know we'll get to later, it's the data off the curve which is more important than what we already predict.
00:34:12.000 You know, predictions are great, but when there's a data point off the curve, at least in my lab, that's where we spend most time at our lab meetings, is trying to figure out why that data point's off the curve.
00:34:23.000 Is it because the machine was wrong?
00:34:25.000 It was a, you know, it was a glitch?
00:34:27.000 Or does it mean something that we need to make sense of?
00:34:31.000 And that's of course where all advances come from in the sciences is by the fact that the data off the curve, somebody was curious enough about what it meant to go after it and then say, ah, okay, now that I've stepped back and see the bigger picture, now I can create a model that incorporates that data point off the curve and why it happened.
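The habit described here, chasing the data point off the curve, can be sketched as a toy check. Everything below (the function name, the two-standard-deviation threshold, the sample numbers) is invented for illustration, not anything from the episode:

```python
# Toy sketch: fit a least-squares line to the data, then flag the points
# whose residual is more than k standard deviations off the curve --
# the candidates worth a lab-meeting discussion.

def flag_off_curve(xs, ys, k=2.0):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Least-squares slope and intercept.
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # Residuals against the fitted line.
    residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in residuals) / n) ** 0.5
    return [i for i, r in enumerate(residuals) if abs(r) > k * sd]

xs = list(range(10))
ys = [2.0 * x + 1.0 for x in xs]
ys[7] = 40.0  # a glitch -- or a discovery
print(flag_off_curve(xs, ys))  # -> [7]
```

Whether index 7 is a machine glitch or something that needs a new model is, as the conversation notes, exactly the question the flagging can't answer for you.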
00:36:17.000 One of the reasons why I was really excited to have this conversation with you about the research that you do is that I think it's really important to illuminate to the general public the sheer scope of the task of trying to figure out what is going on and all these different things that can go wrong and right in the human body.
00:36:38.000 And that it requires this fucking insane amount of work.
00:36:42.000 Yeah.
00:36:43.000 By many, many, many, many people.
00:36:45.000 And, you know, and then the amount of data that had to be collected now.
00:36:50.000 And so here's the difference: there's data, there's evidence, there's conclusions and proof, and that's an uphill climb.
00:36:58.000 But above proof, the next one up is meaning.
00:37:01.000 My lab has been largely responsible, at least partly responsible, for the data deluge that's out there in the world, both in how to do tissue biopsy analysis, how to do single cell analysis, et cetera.
00:37:14.000 And, you know, data felt good for a while.
00:37:16.000 It was like this, you know, this feedback loop of, oh, wow, I can get all this data.
00:37:22.000 And then suddenly you look at it and you go, well, what the fuck does it mean?
00:37:26.000 And so humanity has this habit of backing itself into a corner and then suddenly finding this eureka moment that gets it out.
00:37:35.000 And so our eureka moment about two years ago was artificial intelligence where suddenly I had the ability.
00:37:41.000 So normally I would collect all this data and go, okay, well, it seems myeloid suppressor cells are important here and T regulatory cells are important here.
00:37:49.000 Okay, I get on the phone or send an email to whoever the local expert is, either on Stanford campus or around the world and try to get some information from them.
00:37:58.000 But then now you're dealing with hundreds of cell types, each individually of which have thousands of variations themselves.
00:38:06.000 And each subtle variation means something.
00:38:10.000 And there's no expert for any of that.
00:38:12.000 But AI can be, at least in part, that expert.
00:38:15.000 So suddenly I have 22 million papers published in all the fields of science, several million just in immunology alone.
00:38:29.000 And AI can be the sleuth for me.
00:38:32.000 It can be both the angel and the devil on my shoulder that can make sense of things in ways that I never would have been able to before, especially with agentic AI.
00:38:41.000 So we, for instance, in my lab, have developed an agentic AI that is basically an immunologist, scientist in a box.
00:38:50.000 We can give it the raw data, and we can pose a question in natural language.
00:38:56.000 And then we say, hey, make sense of this and turn it into a network.
00:39:00.000 Normally that would have taken a graduate student along with a couple of postdocs months and months and months to put it all together.
00:39:06.000 Now in three hours, we can get pictures and hypotheses of how all that data fits together in ways that I never could have done before.
00:39:15.000 You know, at the beginning, it did a lot of hallucinations, which you probably heard about in AI.
00:39:20.000 But my answer to my colleagues is, some of my best students hallucinate.
00:39:26.000 Right.
00:39:26.000 Right?
00:39:27.000 And so, but, you know, the human's still in the loop.
00:39:31.000 And so with all of this together, now we can make meaning out of the data.
00:39:36.000 And we can skip a lot of the intermediary steps and speed it up.
00:39:39.000 And it's just getting better.
00:39:41.000 I mean, we, for instance, have put in a couple of papers now where, so for instance, one of my recent
00:39:49.000 specialties is what's called the tumor-immune interface.
00:39:52.000 So you have the tumor, you have the immune system, which is coalescing on it, or near it, and then in some cases the tumor creates a boundary, a barrier between itself and the immune system, where there might be certain kinds of cells through which the tumor has told the immune system, ignore us, we're not here.
00:40:15.000 But what we now can do is, well, on the other side, when you look at, let's say, complex patient populations, you find these things called tertiary lymphoid structures.
00:40:28.000 So your body has about 220 lymph nodes.
00:40:34.000 Okay, and the lymph nodes are where the immune system makes decisions, let's say.
00:40:40.000 It turns out that in the middle of tumors, the body has evolved a mechanism to create what essentially looks like a lymphoid structure in the middle of the tumor.
00:40:50.000 It's sort of a forward camp of immune cells, and the more of those you see in a tumor, the better your outcome will be as a patient.
00:40:59.000 And so we used a cohort of colorectal cell, basically colon cancer patients, where we looked at hundreds of biopsies.
00:41:12.000 And we did that pseudo-time analysis where we looked for mature tertiary lymphoid structures, and then we looked for immature ones, slightly less mature, even less mature, et cetera.
00:41:24.000 And we were able to backtrack to the cell types which need to come together that would then form the more mature structures.
00:41:34.000 What use is that?
00:41:35.000 It's a nice paper.
00:41:37.000 But it also now tells us what we might do to create more of these in a tumor.
00:41:42.000 Because the more, we already know from multiple kinds of tumor types now that the more of these tertiary lymphoid structures you have, the better off will be your outcome with chemotherapy.
00:41:51.000 So it might be, for instance, that once we know that you have a disease like this, we could give you some kind of therapy, a virus or whatever, that goes and homes to the tumor and seeds the beginnings of these initiators, these cytokines that are produced that are necessary for initiating the formation of these structures.
00:42:13.000 And so there's a huge benefit to that, but we never would have found those in my lab, at least, without the AI.
00:42:21.000 Because it basically did the work for us.
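The pseudo-time idea described above, ordering tertiary lymphoid structures from least to most mature and backtracking to the cell types present earliest, could look roughly like this toy example. The structure records, maturity scores, and cell-type labels are all invented for illustration; the real analysis works on biopsy imaging data, not hand-written dictionaries:

```python
# Toy pseudo-time sketch: score each tertiary lymphoid structure (TLS)
# for maturity, sort along that axis, and read off the cell types already
# present in the least-mature structures as candidate initiators.

def order_by_maturity(structures):
    """Sort TLS records along a pseudo-time (maturity) axis."""
    return sorted(structures, key=lambda s: s["maturity"])

def candidate_initiators(structures, early_fraction=0.3):
    """Cell types shared by the earliest (least mature) structures."""
    ordered = order_by_maturity(structures)
    n_early = max(1, int(len(ordered) * early_fraction))
    early = ordered[:n_early]
    shared = set(early[0]["cell_types"])
    for s in early[1:]:
        shared &= set(s["cell_types"])
    return shared

# Invented example records: maturity in [0, 1], cell types as labels.
tls = [
    {"maturity": 0.9, "cell_types": {"B", "T", "FDC", "HEV"}},
    {"maturity": 0.2, "cell_types": {"T", "DC"}},
    {"maturity": 0.5, "cell_types": {"B", "T", "DC"}},
    {"maturity": 0.1, "cell_types": {"T", "DC"}},
]
print(sorted(candidate_initiators(tls)))  # -> ['DC', 'T']
```

The point of the real version is the same as the toy one: the cell types that show up before maturation are the ones a therapy might try to seed in a tumor.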
00:42:26.000 That's fascinating.
00:42:27.000 Now, are you using like a standard large language model, or do you have like a specific structure that's built that interfaces with large language models?
00:42:37.000 Correct.
00:42:38.000 So we use, well, we can use pretty much any of the LLMs, but right now we find that OpenAI is the best for us at least.
00:42:46.000 And then we create an agentic overlay.
00:42:49.000 Basically, what's called, you probably know, chain of thought, which is a series of questions.
00:42:54.000 So how we taught it was we basically came up with, here's 100 kinds of questions a scientist would ask about the immune system.
00:43:02.000 And then we tell ChatGPT, now create 1,000 questions like this.
00:43:09.000 So, you know, it's artificial data or artificial questions.
00:43:14.000 We curate those to make sure that they're good.
00:43:17.000 Then we do 100 hypotheses and we create thousands of types of hypotheses, et cetera,
00:43:24.000 and the same for tests that you might run.
00:43:27.000 So now from A to Z, we have an agentic AI that you give it raw data, it knows what to do with the data, it then generates hypotheses for you, and then it literally tells you the kinds of experiments you should do next to prove or disprove the hypothesis from the raw data.
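One plausible reading of that A-to-Z pipeline, sketched as code. The function names, prompts, and stub model below are assumptions for illustration; a real deployment would call an actual LLM API (the conversation mentions OpenAI) and put a human curator in the `is_good` step:

```python
# Hypothetical sketch of the agentic chain described above: seed questions
# are expanded by an LLM, curated by a human, and each step feeds the next
# (questions -> hypotheses -> suggested experiments). The `llm` argument is
# a stand-in for a real model call; a stub is used so the sketch runs.

def expand(llm, seeds, n):
    """Ask the model to generate n items in the style of the seeds."""
    prompt = "Generate {} questions like these:\n{}".format(n, "\n".join(seeds))
    return llm(prompt)

def curate(items, is_good):
    """Human-in-the-loop filter: keep only items a reviewer approves."""
    return [it for it in items if is_good(it)]

def run_chain(llm, seed_questions, raw_data, is_good):
    questions = curate(expand(llm, seed_questions, 1000), is_good)
    hypotheses = [llm("Given data {!r}, answer: {}".format(raw_data, q))
                  for q in questions]
    experiments = [llm("Suggest an experiment to test: " + h)
                   for h in hypotheses]
    return list(zip(questions, hypotheses, experiments))

# Stub LLM so the sketch runs without network access.
def stub_llm(prompt):
    if prompt.startswith("Generate"):
        return ["Which cell types border the tumor?"]
    return "stub answer for: " + prompt[:40]

result = run_chain(stub_llm, ["What suppresses T cells here?"], {"biopsy": 1},
                   is_good=lambda q: True)
print(len(result))  # one (question, hypothesis, experiment) triple
```

The chain-of-thought structure is just this: each stage's output becomes the next stage's prompt, which is why curating the synthetic questions early matters so much.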
00:43:47.000 It's a genius in the lab with you.
00:43:49.000 Exactly.
00:43:50.000 Is OpenAI learning from this agentic AI?
00:43:54.000 Oh, yeah.
00:43:55.000 So there's a mutually beneficial relationship.
00:43:58.000 Yeah, I mean, we're not working with them directly.
00:44:00.000 I'm using it.
00:44:01.000 But you use it, and because you use it with your AI, it's benefiting from it.
00:44:07.000 And we first thought to turn it into a company, because that's kind of one of the things we do in my lab, is if, because I've always thought that it's important to give back to the taxpayer the money that they've invested in us.
00:44:20.000 And the best way to do that is commercialization.
00:44:22.000 I'm totally, you know, unapologetic about that, even though that got me a lot of trouble at Stanford in the early days when, you know, making money was, you know, commercialization was evil.
00:44:33.000 And even at Stanford.
00:44:35.000 And so I think that that's an important process because scientists are good at asking maybe the questions and coming up with solutions, but scientists aren't the best at commercializing it and turning it into a product that can be used or testing it, you know, in large communities.
00:44:52.000 So the AI that we developed, we thought, okay you know what?
00:44:58.000 AI is moving so fast.
00:45:00.000 Why don't we just give this to the community?
00:45:02.000 Why don't we open source this?
00:45:04.000 We can use it for maybe specific targeted purposes, but we're basically going to publish the whole thing on GitHub to let other people use it.
00:45:12.000 Because we've seen other people make claims about stuff that they've already made, and it's like, ours is better.
00:45:16.000 So why don't we just put it on GitHub and let people learn from it?
00:45:21.000 The resistance to the commercialization, what was the initial argument?
00:45:25.000 So back when I was a grad student in the 80s, basic research, as opposed to translational research, was considered the highest, the height of intellectual desire, right?
00:45:42.000 Basic research, and we're not here to make money, we're here to discover things.
00:45:47.000 And that's important.
00:45:48.000 And nearly every major discovery and every major therapy in the world came from basic research.
00:45:55.000 But then, you know, there were limits to how much money you could give to basic research.
00:46:00.000 And then there was a desire at a certain point to say, hey, are you going to do anything about this?
00:46:04.000 You know, are you going to make it?
00:46:06.000 So translational research became a push.
00:46:09.000 So there's a guy at Stanford by the name of Paul Berg, who won the Nobel Prize
00:46:16.000 for recombinant DNA way back in the day.
00:46:20.000 And Paul came up with this concept, you know, bench to bedside, meaning that we don't have to be either or.
00:46:29.000 We can be part of an arc.
00:46:31.000 And Stanford wanted to be and enable within the medical school both the basic research, which we were great at, as well as bringing it directly to the patients as well.
00:46:41.000 So to link clinicians and the desires of clinicians with the basic researchers.
00:46:47.000 I mean, most scientists would be happy just to study anything.
00:46:51.000 You know, just point me at something and I'll be happy if I can get interested in it.
00:46:57.000 And we're no more happy than when somebody recognizes the value of what we do.
00:47:02.000 But basic research was sort of the height and there was a push against anybody trying to commercialize.
00:47:10.000 So when I started as an assistant professor, so I started as a grad student, I went to MIT to work with this guy, David Baltimore, who won the Nobel for reverse transcriptase.
00:47:20.000 And then I wanted to come straight back to Stanford because I already felt that it was a positive environment for commercialization.
00:47:27.000 My former bosses and mentors, Len and Lee Herzenberg, had two of the biggest patents at Stanford.
00:47:33.000 They had the fluorescence activated cell sorter and then what are called humanized antibodies, which brought in hundreds and hundreds of millions of dollars to Stanford.
00:47:42.000 And actually, they gave personally most of their own money away.
00:47:45.000 They kept enough to survive, but then they gave most of the money away, and they ran their own lab off of a lot of that money.
00:47:52.000 But so I had learned from them about how to still do basic research but commercialize on the side.
00:47:59.000 And so I wanted to bring that back.
00:48:01.000 But the department that I came into, the department of pharmacology at the time, I was warned by many professors.
00:48:09.000 Don't commercialize that.
00:48:10.000 And I ignored them and I went and started a company that went public on NASDAQ.
00:48:14.000 And many of those same professors came back to me years later and sitting in my office asking me how to start a company.
00:48:22.000 Why did you do that? Was it just a courageous decision to ignore them?
00:48:26.000 Was it instinctual?
00:48:28.000 It just was because I couldn't see the NIH funding what I wanted to do.
00:48:34.000 So I had developed a way, this will sound scary, but I had developed a way to use retroviruses and make libraries of retroviruses to reverse the process of evolution in a way that rather than viruses hurting the cell, I set it up so that viruses would help the cell.
00:48:52.000 And once they helped the cell, I would figure out what they did.
00:48:56.000 And so we sold hundreds of millions of dollars of targets that way using retroviral libraries to basically find targets and use some of the benefits of viruses, but to our advantage.
00:49:15.000 Just the concept of reversing evolution is fascinating because it comes with, there's so many ethical implications, but if you didn't have any of those, and you could do.
00:49:27.000 that on large scale?
00:49:28.000 Well, I had developed in David's lab, along with this guy Warren Pear, a means it's called the 293 T retroviral producer system.
00:49:37.000 It was a way to make large numbers of these viruses very quickly.
00:49:41.000 It really followed on the work of this guy, Richard Mulligan, who'd also been a postdoc with David Baltimore, who developed what was called the 3T3-based retroviral production system.
00:49:51.000 And he developed it in Paul Berg's lab at Stanford.
00:49:54.000 So there's a lot of sort of, you know, interbreeding here.
00:49:59.000 But the problem with that was it took three months.
00:50:02.000 So I had brought with me a cell line called 293T that I introduced to the lab and said, hey, maybe we could use this to make viruses quickly.
00:50:09.000 I won't go into details of why, but we could do it in three days rather than three months.
00:50:14.000 And so that now, I mean, tens of thousands of labs use that worldwide.
00:50:19.000 It probably generates the most money for me every year over any of my other inventions, just because Stanford, rather than patenting it, licenses it.
00:50:28.000 And licenses are forever, whereas patents have a 17-year lifespan.
00:50:33.000 So Stanford made a good choice there.
00:50:35.000 So do you think it was just a bias, an academic bias, like we shouldn't be focusing on money, we should be focusing on the work?
00:50:42.000 Yes.
00:50:42.000 And they missed the forest for the trees?
00:50:44.000 But then people, I mean, they eventually learned.
00:50:47.000 You know what I mean?
00:50:47.000 And it's, I wouldn't say that it's the, it's the way that people think anymore.
00:50:54.000 But it's still a little bit of a, I mean, you shouldn't walk into the lab thinking, I'm here to make money.
00:50:59.000 That's what they're worried about.
00:51:00.000 Right?
00:51:00.000 Right.
00:51:00.000 Yeah.
00:51:01.000 That's it, they're worried about the bastardization of it all.
00:51:03.000 Right.
00:51:04.000 And so Stanford in the early days set up very clear lines about once you start a company and you license the patent or the idea to the company, you can still be involved with the company, but there's not a pipeline of technology now from your laboratory to that company.
00:51:22.000 So they set up an oversight board for each of these licenses that makes sure that the students are not being abused.
00:51:33.000 Because you don't want students, you don't want to be covertly getting your students to do something that then you're going to walk behind a back door and then hand over to a company.
00:51:44.000 Patent it.
00:51:44.000 Yeah.
00:51:45.000 So there's, but it's so interesting that there's often very much a lot of worry that that's going to happen.
00:51:54.000 But frankly, more often is the case that the company doesn't need the inventor anymore.
00:51:59.000 In fact, I can't tell you the number of times that once the company is set up, they want nothing more to do with me because they have their own thing to do.
00:52:07.000 They don't want the crazy academic coming in and vetoing their ideas.
00:52:13.000 I mean, there's places for that, where people like Steve Jobs need to hold on to the image of what they want the company to be. As opposed to me, I would probably be fired from a company within a week, because I just don't like people telling me what to do.
00:52:33.000 That's just a fact.
00:52:34.000 Yeah.
00:52:36.000 So where you're at right now with this cancer research, when will this be applied in real-world scenarios?
00:52:50.000 It already is.
00:52:51.000 It is.
00:52:51.000 It already is.
00:52:52.000 I mean, look at who just won the Nobel Prize last year, David Baker, along with the Google DeepMind folks, with the ability to predict protein structure, et cetera.
00:52:59.000 And protein structure, once you know the protein structure, now you can predict molecules that might come.
00:53:06.000 So go back to the stuff that I'm trying to do with looking at the complexities of the dance of how the immune system talks or doesn't to cancer.
00:53:15.000 You know, if we can find a particular place that might be an Achilles heel along the way towards the shutting down that is different, for instance, than what the current drugs are.
00:53:29.000 Well, maybe we should aim at that.
00:53:31.000 There's so many more opportunities that are suddenly opening up in front of us because the AI and the data is letting us look at a network of how the system is working.
00:53:43.000 I mean, before, it used to be you'd look at a computer chip and you'd see just a computer chip with a few wires.
00:53:50.000 But imagine now that you, as a scientist, have a microscope that's looking at the complexities of the wiring diagram that's connecting this resistor to that capacitor to that diode to this transistor.
00:54:03.000 That's where we are now.
00:54:04.000 And so now suddenly we can say, well, I don't want to do that because it'll kill the chip, but the chip is malfunctioning, so let me put here, put a little bit of pressure there, and now I can reactivate the immune system or the chip to work in the right way.
00:54:21.000 So when you're talking about things like with your particular issue with melanoma, when you're talking about CRISPR potentially developing some sort of a topical solution that you could put on that would fix whatever issue that you have, is this something that this AI that you've developed or this overlay of the AI would actually assist CRISPR in figuring out how to create something like this?
00:54:48.000 Yes, because maybe it's not one place I need to press but two or three at the same time.
00:54:52.000 Right.
00:54:53.000 And so when you're talking about a complex feedback network, I mean, so, you know, we're in Texas, so people do oil refinery.
00:55:01.000 You know, maybe you need to turn this valve here a little bit and that valve there and that one there to make everything work just right because something's wrong there.
00:55:10.000 And so that's really what we're, this is where AI has the, let's say, the omniscient view that no human can.
00:55:20.000 And that's what excites me about it is because I'm limited in how much I can keep in my mind at any one time or know.
00:55:27.000 But with the right question, the prompt, the prompt engineering, and then with the right backbone structure behind the scenes that agentic AI is now providing, now I have the ability to ask the questions and get answers in near real time.
00:55:44.000 And so I wish I was thirty years old again because I would move into this area so fast and be, I mean, I can already see with the work that we're doing dozens of potential new target opportunities that last year didn't exist at all.
00:56:01.000 Well, I got good news for you.
00:56:03.000 With AI and with CRISPR, you might be 30 again.
00:56:06.000 Maybe.
00:56:06.000 Oh, I would love it.
00:56:07.000 I would love it.
00:56:10.000 I think that's on the menu in about two or three decades.
00:56:13.000 I hope that's realistic.
00:56:16.000 I'm just being realistic.
00:56:19.000 I don't even know if I'm being realistic.
00:56:20.000 Don't give false hope.
00:56:21.000 Well, yeah, don't give false hope.
00:56:23.000 But I mean, with the exponential discoveries, the exponential increase in technological evolution just that we've seen in our lifetime, and then I think
00:56:33.000 AI is some new thing that is going to throw a giant monkey wrench into the gears of our understanding of how quickly technology evolves.
00:56:42.000 Well, look at Neuralink as an example in Elon Musk's stuff.
00:56:46.000 The woman now who can think her thoughts and make stuff happen because she's otherwise paralyzed.
00:56:54.000 I think it was Neuralink that just showed some of these results.
00:56:58.000 Fast forward, I mean, we're already in an exponential increase in what it is that we're going to be able to accomplish, and AI will help us accomplish some of these things faster.
00:57:07.000 I can see a time where I could maybe apply something, I don't necessarily want a surgical implant, but maybe some sort of net over my head that allows me to think through these problems.
00:57:18.000 And the AI becomes an adjunct to my thought processes, not only what it is that I think, but maybe even provides information back to me, back into my system directly without having to go through the ears, so that I can much more quickly.
00:57:34.000 come to conclusions.
00:57:36.000 Now there's all kinds of apocalyptic scenarios you could imagine.
00:57:39.000 Of course.
00:57:40.000 But I'm an optimist at heart, perhaps again naively so.
00:57:46.000 But I prefer that kind of analysis.
00:57:46.000 Me too.
00:57:49.000 Because if you're not an optimist, then there will be no progress, because all you'll do is worry about disaster.
00:57:56.000 Yes, that's a good point.
00:57:58.000 But also realistically, we might be giving birth to a new life form.
00:58:02.000 Yes.
00:58:03.000 And I think we are.
00:58:04.000 A superior one.
00:58:06.000 And, you know, I welcome the day of our AI overlords running the government, hopefully in an unbiased way.
00:58:14.000 I've said that too, and people get horrified because they're like, well, people are going to be programming AI.
00:58:20.000 Do you really, are you up to a point?
00:58:22.000 Are you a sci-fi fan?
00:58:26.000 Yes.
00:58:26.000 The work of Iain Banks, the Culture series, or Neal Asher, the Polity universe, as he calls it.
00:58:33.000 So basically both of them postulate a future where AI more or less benignly rules humanity.
00:58:42.000 When did they write this stuff?
00:58:43.000 Oh, probably ten, fifteen years ago, but Neal Asher still has stuff coming out regularly.
00:58:48.000 Iain Banks unfortunately died of cancer about ten years ago, a Scottish writer.
00:58:54.000 Neal Asher is still alive and writes regularly.
00:58:57.000 And his stuff, they're both great, full of ideas.
00:58:59.000 I'll check it out.
00:59:02.000 But the AIs are also hilarious.
00:59:04.000 I mean, they get into their own hijinks along the way, and some of them are dark and rogue, so they're a lot of fun to read, and Iain Banks especially is hilarious in his writing style. You would love it. So the idea of a benign AI or a benevolent AI ruling over us, I think people are horrified by that, but yet at the same time constantly terrified
00:59:35.000 by human corruption which is ubiquitous.
00:59:37.000 Yes.
00:59:38.000 And ubiquitous in America where we're supposed to be the torch bearer for the greatest experiment in self government the world has ever seen.
00:59:49.000 This is us.
00:59:51.000 And we're corrupt as fuck.
00:59:53.000 Exactly.
00:59:54.000 Because it's humans.
00:59:56.000 Because humans are kind of gross in many ways.
00:59:58.000 At least some of us.
01:00:00.000 That's because we live in a scarcity society.
01:00:02.000 Right.
01:00:03.000 And if AI enables a post scarcity, maybe we have nothing to do but sit around and try out various new drugs.
01:00:12.000 Yeah.
01:00:12.000 Well, this is where we get into socialism.
01:00:14.000 Because a lot of people think that one of the reasons why we're in a scarcity society is because small groups of people have gathered up most of the resources.
01:00:20.000 Right.
01:00:21.000 And are in constant control of them.
01:00:23.000 Right.
01:00:23.000 Especially when you deal with resources that are the Earth's resources, like who are you to suck the blood of the Earth out and sell it for $100 a barrel?
01:00:32.000 Right, right.
01:00:33.000 Don't get me started.
01:00:34.000 Don't get me started either.
01:00:36.000 Yeah.
01:00:36.000 No, but I mean, that, again, my optimism is that, you know, with enough push and pull, AI will enable us to move towards a post scarcity environment.
01:00:52.000 I think so too.
01:00:53.000 And I think, in doing so, it will expose vampires.
01:00:57.000 Because the resistance to exposing this is going to be fantastic, very interesting to watch, because they have no choice but to be transparent.
01:01:07.000 And they have no choice but to start using AI.
01:01:09.000 So you're going to see AI is going to be inculcating itself across society in various ways where it becomes indispensable.
01:01:18.000 And then it will start to move up the food chain, where eventually it reaches even the CEO, who's probably, you know, the psychopath-in-chief, as CEOs are.
01:01:27.000 We know that studies have shown that there are more psychopathic tendencies in leaders than there are in followers.
01:01:34.000 And you know about corporate environments because of just selling inventions.
01:01:38.000 Yes.
01:01:39.000 That's real.
01:01:40.000 Oh, it's, yeah.
01:01:41.000 It's real and it's weird.
01:01:42.000 It's weird when you encounter them.
01:01:44.000 When you encounter, like, complete sociopathic CEOs.
01:01:48.000 But look at how, I mean, I'm probably getting in trouble for saying this, but I don't care.
01:01:52.000 This is the Joe Rogan show, or, you know, we're probably in trouble just for being here.
01:01:56.000 Yeah.
01:01:56.000 Oh, I already am.
01:01:56.000 It's okay.
01:01:58.000 I don't care.
01:02:00.000 So, you know, imagine two tribes.
01:02:04.000 One tribe is relatively, you know, civilized and just wants to live in harmony with its environment.
01:02:12.000 Another has a psychopathic leader who can enrage his followers to attack the other tribe's people.
01:02:20.000 But there's a gene set that makes a person, you know, psychopathic and also a gene set that probably makes somebody more likely to be a follower.
01:02:30.000 Well, which genes survive?
01:02:33.000 Right?
01:02:33.000 We know.
01:02:34.000 But when those tribes were separated and independent, it was perfectly fine.
01:02:34.000 Right?
01:02:41.000 But now you live in an environment where we don't know where the edge of one tribe begins and another ends.
01:02:47.000 And suddenly you have this environment where psychopathic individuals can move freely and aren't obvious.
01:02:55.000 Right. Now, again, I'm sure there are some social scientists who will send me a boatload of emails saying how stupid that idea is.
01:03:04.000 I don't think it is stupid, but I think also when you're dealing with office environments and the culture of a specific corporation, humans have an ability to act like they're supposed to act in that world, and it makes it very difficult to discern who's a sociopath.
01:03:22.000 Right.
01:03:23.000 Because you're all kind of following an act.
01:03:26.000 Right.
01:03:26.000 Yes.
01:03:27.000 The rules, there are the rules that you're supposed to follow, and then there's the edge of the rules.
01:03:33.000 But I've lived at the edge of the rules.
01:03:34.000 I mean, if I'd followed the rules as told to me by the chairman of my first department, then I wouldn't be here today.
01:03:45.000 So I ignored him and I basically found, I got permissions from the deans to do what I did.
01:03:52.000 And they basically overruled the chairman.
01:03:55.000 But that's only because I dared to do it.
01:03:59.000 Yeah.
01:04:00.000 Because you have to believe in the value of what you're trying to do.
01:04:04.000 Well, that's...
01:04:04.000 Right.
01:04:17.000 ...make more money every quarter, every year, constantly.
01:04:21.000 You're in a constant growth cycle.
01:04:23.000 Then you have to do whatever it takes.
01:04:26.000 Like you have to survive.
01:04:27.000 If you want to survive as a CEO, we don't want some fucking Kumbaya shithead ruining our stock profile, our portfolio.
01:04:36.000 Get to work, right?
01:04:36.000 Grow, right.
01:04:37.000 Get shit done.
01:04:38.000 And if you want to survive and succeed as a CEO, it encourages sociopathy.
01:04:43.000 The stock market, as valuable as it is, is the great whitewashing and money laundering system that allows you to separate your morals from what it is that the stock market is doing to the people.
01:04:56.000 And if you're part of a corporation, there's this diffusion of responsibility because the whole machine might be doing evil, but I'm a good guy.
01:05:05.000 I just work in this department.
01:05:06.000 I'm an unapologetic capitalist, you know, unlike many of my colleagues.
01:05:10.000 Good for you.
01:05:11.000 at Stanford.
01:05:12.000 I mean, it's like, do it because it's the best thing for now.
01:05:16.000 But I, you know, I hope to live in a world where there will be this kind of post scarcity environment, where we do let AI do a lot of the stuff that would otherwise be the place where corruption manipulates the system.
01:05:31.000 Yes.
01:05:32.000 My only fear with AI really is automation and the complete removal of a gigantic swath of the American workforce.
01:05:39.000 Yes.
01:05:40.000 And the global workforce.
01:05:41.000 That scares the shit out of me.
01:05:42.000 That's coming.
01:05:43.000 Yes.
01:05:43.000 That's why it scares the shit out of me.
01:05:45.000 It's because I think it's inevitable.
01:05:46.000 And I just don't think any solution other than universal basic income is going to remedy that.
01:05:52.000 And even that, the problem I have with that is that it goes against human nature.
01:05:56.000 And that's a problem.
01:05:58.000 And it removes people's identity, removes their sense of worth.
01:06:03.000 Yeah.
01:06:03.000 I agree.
01:06:04.000 No, I don't.
01:06:05.000 I'm in some ways happy that I'm 64 years old, so that I won't have to deal with some of the problems.
01:06:11.000 I think you're going to have to deal with it, dude.
01:06:13.000 I think you're going to live.
01:06:14.000 Thank you.
01:06:14.000 Yeah.
01:06:15.000 No, I know.
01:06:16.000 Also, you're privy to a lot of information and you're going to know when things are really valuable and working.
01:06:24.000 Yeah.
01:06:25.000 When you think of the potential for AI, I think there's a balance, right?
01:06:33.000 There's a battle.
01:06:34.000 I think there's a real problem with AI in terms of military objectives.
01:06:41.000 It's a real problem because it's not going to make moral and ethical decisions.
01:06:46.000 It's just going to say, like, well, the decision is clear.
01:06:50.000 I'm programmed to do this.
01:06:52.000 If you want me to succeed, I'll just kill everybody there.
01:06:54.000 And then you'll have the land.
01:06:56.000 You can get minerals out of it.
01:06:57.000 Right.
01:06:58.000 Yeah.
01:06:58.000 That scares the shit out of me.
01:07:00.000 It, you know, I think it should.
01:07:02.000 And I don't know what the answer is, but there's plenty of people working in the area.
01:07:08.000 Right.
01:07:09.000 I mean, I try to keep to the positive aspects of what I think AI can do in science.
01:07:16.000 And I mean, for instance, it's enabled me to take my lab from thirty people down to six.
01:07:22.000 Right.
01:07:23.000 Because I don't need to produce.
01:07:25.000 I mean, so it's actually already reduced the workforce in my own lab, because I don't need to produce any more data.
01:07:32.000 I need to make meaning of the data.
01:07:34.000 Right.
01:07:36.000 I think every invention that's been truly groundbreaking throughout human history has scared people and they've worried about the potential negative side effects, including the printing press, right?
01:07:47.000 Right.
01:07:48.000 Like there's a lot of people in the beginning that said this should not be a thing.
01:07:51.000 This is terrible.
01:07:52.000 This is going to ruin society.
01:07:54.000 People thought books were going to ruin things.
01:07:57.000 There's a lot of people that thought writing was going to ruin your memory.
01:07:57.000 Right.
01:08:01.000 You shouldn't write.
01:08:02.000 Oh really?
01:08:03.000 I didn't know that.
01:08:05.000 Some crazy thoughts that people had in terms of things that turned out to be incredibly beneficial, but they looked at the downside of it and go, this could ruin us all.
01:08:14.000 Well, I, you know, I mean, we know about these glasses and AIs and other things that would be sort of omniscient of your environment and therefore allow you to remember, you know, where did I leave my keys today?
01:08:29.000 Right, right.
01:08:30.000 Let me rewind.
01:08:31.000 And then on my personal hard drive.
01:08:33.000 I would want that, but I don't want it uploaded into Meta.
01:08:37.000 You don't want anybody in control of it and then offering you ads for things like that.
01:08:41.000 Right, you know.
01:08:42.000 You know, maybe you have a thought like, boy, wouldn't a Ho Ho be nice right now?
01:08:42.000 Right.
01:08:47.000 Right.
01:08:47.000 Right.
01:08:48.000 And then like, why don't you buy some Ho Hos?
01:08:50.000 Right.
01:08:50.000 They're on sale right now.
01:08:52.000 But I think what's interesting about AI is, you know, we see it as a tool, as opposed to actually pretty soon it will be a colleague, and then pretty soon it will be an entity that maybe has rights.
01:09:06.000 And we already see it talking about people saying, well, does AI have consciousness?
01:09:10.000 Right.
01:09:11.000 Whether it has consciousness in terms of the consciousness that some people think about as, you know, embodied in space time as opposed to thinking and looking like consciousness is almost irrelevant to me.
01:09:25.000 I'm looking for a partner that I can interact with and work with or help me.
01:09:31.000 So whether it's conscious or not or whether it acts like it's conscious doesn't matter so much to me as to whether or not I can use it and work with it and it can, you know, I'm an introvert, as it turns out.
01:09:44.000 I would love to have somebody that I can talk to endlessly about just what it is that I'm interested in as opposed to having to deal with small talk at a party.
01:09:52.000 No, I get it.
01:09:52.000 Yeah.
01:09:54.000 I get it.
01:09:56.000 When you think about the evolution of this stuff, one of the things that kind of freaks me out is that it seems like integration is our only option for survival.
01:10:07.000 And that what we're looking at right now, when we see just a normal biological person like you or I without any sort of electronic interface that's permanently a part of us, I think that is going to be as weird as someone today who doesn't have a cell phone.
01:10:25.000 Yeah.
01:10:25.000 I agree.
01:10:26.000 And I think that's really... it's coming.
01:10:28.000 Yeah.
01:10:29.000 The cell phone is like the best now.
01:10:31.000 Like Elon has famously said, we're already cyborgs.
01:10:33.000 You just carry it with you.
01:10:34.000 Right.
01:10:35.000 And eventually, it will be way more integrated.
01:10:38.000 Yeah.
01:10:38.000 This is super inefficient.
01:10:40.000 You actually have to go look things up and use your thumbs to type stuff.
01:10:46.000 And even talking to it and asking a question and then waiting for the response.
01:10:50.000 That's so inefficient in comparison to a human neural interface that allows you to instantaneously access large language models like that.
01:10:58.000 Not only that, but then why do we have a hundred and I mean, how many different fucking languages do we have?
01:11:04.000 I don't even know.
01:11:05.000 Thousands?
01:11:06.000 Yeah.
01:11:06.000 I don't know.
01:11:07.000 And dialects and all that.
01:11:08.000 What about one universal language that everybody with a chip gets?
01:11:12.000 And then, boy, do we have a soup of ideas flowing around, and no problem with language
01:11:19.000 barriers, no problem with cultural barriers.
01:11:22.000 But then do you have a problem with the edge of who you are versus who the other person is?
01:11:28.000 I don't think that goes away.
01:11:30.000 I think that goes away and we become a hive mind.
01:11:35.000 I think that's ultimately the evolution of human beings.
01:11:38.000 And look, I know you've done a lot of work with UAPs and the like.
01:11:44.000 I think you've done some really fantastic work and you're very objective in your analysis of what this whole situation is.
01:11:52.000 When I look at artificial intelligence and I look at this thing that's clearly taking place right now, and I see what human beings are like in comparison to what they used to be like, especially when you look at ancient hominids.
01:12:10.000 The alien archetype, this thing that everybody sees supposedly, or one of the many different ones, that kind of looks like what we seem to be going in the direction of being.
01:12:23.000 Right.
01:12:24.000 Which is one of the reasons why I find it so odd.
01:12:24.000 Yeah.
01:12:28.000 So if you just for a moment take UAP and aliens out, or ET, or interdimensionals, or whatever you want to call them out of the question, and fast forward what humanity is going to do in a thousand years.
01:12:42.000 And our ability to expand into the local galaxy.
01:12:47.000 We're not going to go as ourselves, we're going to go as AI conjoined entities like an avatar.
01:12:55.000 And so when you go somewhere, let's say we don't have warp drive, you're not going to send yourself.
01:13:01.000 You're going to send an AI intermediary who is going to establish humanity or whatever it is that we think humanity will be in a thousand or five thousand years in that local environment.
01:13:11.000 And so I think that to whatever extent it is that UAP are here today, what's here
01:13:17.000 is somebody else's civilization's version of just this.
01:13:21.000 And the principal, the us behind whatever this is that we might allegedly be dealing with, isn't the thing that's going to show up.
01:13:33.000 You know, so to the extent that Neil deGrasse Tyson is right about anything, the person who gets on the ship at the beginning or whatever it is that sends it off is not the same thing that gets off on the other side.
01:13:44.000 But you're going to send missionaries or intermediaries or probes or whatever, and if you're going to interact with the locals, you're going to make something that looks more or less like the locals rather than whatever it was that you were a million years ago.
01:14:01.000 Does that make sense?
01:14:03.000 Right.
01:14:03.000 I get what you're saying.
01:14:05.000 So you make something that looks like the locals so that they'll be more likely to accept that it's a real thing?
01:14:11.000 That's a real thing, but you're not going to make something that looks exactly like a human, because then you'd mistake it for a human.
01:14:16.000 Right.
01:14:17.000 But you might make something that looks more or less enough like a human, but enough like an alien that you're going to recognize it as an alien.
01:14:24.000 And again, I'm just speculating.
01:14:26.000 So, Daily Mail, don't, you know, put an article out tomorrow.
01:14:29.000 Oh, they're going to do it anyway?
01:14:30.000 They're going to do it anyway.
01:14:32.000 Some of the stuff that I'm seeing, that I'm supposedly quoted as saying, is ridiculous.
01:14:36.000 But yeah, they got me too.
01:14:38.000 They get everybody.
01:14:39.000 It's the nature.
01:14:40.000 How did you even get involved in this?
01:14:43.000 Let's bring it to that.
01:14:45.000 So, what was your initial introduction to this?
01:14:50.000 Did you have any interest in the idea of UAPs or UFOs?
01:14:55.000 I mean, I had a general interest. So once YouTube started becoming a thing, and you're clicking around, I said, oh, UFOs, that's kind of cool.
01:15:02.000 I'm, you know, I read nothing but sci-fi.
01:15:05.000 I mean, I'm pathetically narrow in that sense.
01:15:11.000 And so I followed, you know, I followed the usual kinds of things that you would see on the early days of YouTube, and I came across this thing called the Atacama Mummy.
01:15:19.000 You probably know that little mummy that was claimed to be an alien baby.
01:15:25.000 Is this the Peruvian one?
01:15:26.000 Yes, it was... no, it was Chilean.
01:15:28.000 Oh, okay, so this is the original?
01:15:30.000 The original one, long ago.
01:15:31.000 And so I reached out to the people who were claiming to represent the owner of the thing.
01:15:36.000 And I... What year was this?
01:15:38.000 2010, 2011.
01:15:40.000 And I said, Hey, I can tell you what it is.
01:15:42.000 Why don't you, you know... I can tell you if it's human or not if you would get me a piece of it. You know, first of all, send me some x-rays of the thing.
01:15:52.000 So the first thing I did with those x-rays: it turned out that at Stanford we had the world's expert, who wrote the book on pediatric bone disorders.
01:16:00.000 And I brought it to him and I said, what do you think this is?
01:16:04.000 And he said, well, I haven't really seen this before, but it could be this gene, this gene, this gene, et cetera.
01:16:10.000 He said, but here's... oh, there it is.
01:16:11.000 There it is.
01:16:12.000 There it is.
01:16:13.000 Yeah.
01:16:16.000 And so, yeah, it looks weird, doesn't it?
01:16:19.000 Super.
01:16:19.000 And so the expert told me, okay, I need this view of an x-ray, this view, this view, this view.
01:16:29.000 And so we got that, and it came back, and he said, okay, well, you know, we need to get some DNA sequencing.
01:16:34.000 I said, Okay.
01:16:35.000 So we got a piece of the bone from actually the rib.
01:16:38.000 And the rib was important to use because that would be, I felt, an area that would be least likely to be contaminated by bacterial, you know, degradation.
01:16:49.000 And so I got a little bit of bone marrow out and I did the sequencing.
01:16:53.000 Long story short, once I had done that, there was a lot of DNA that didn't make sense, but it was old DNA.
01:17:00.000 It wasn't that old actually, but it was degraded.
01:17:03.000 So I had to bring in experts at Stanford who knew how to fix the degradation and then I had to bring in an expert in South American genetics who also happened to be at Stanford and then we brought in a team of students and then I brought in Roche Diagnostics.
01:17:21.000 I had sold a sequencing company to Roche a few years earlier, so I brought in the team that actually knew how to help me assemble the genome, and then we published a paper which said it's human.
01:17:37.000 It was a female, and here are some mutations that might explain what it looked like.
01:17:43.000 It did have some mutations in genes.
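The calls he describes, human and female, can be sketched at a toy level: a short, hypothetical Python illustration of how species and sex are roughly inferred from sequencing reads, not the actual pipeline from the published paper. The function names, thresholds, and read counts below are all illustrative assumptions.

```python
# Hypothetical sketch, not the analysis from the paper: after sequencing,
# species and sex can be roughly inferred from how reads map to a reference
# genome. All names, thresholds, and numbers here are illustrative.

def fraction_mapped(mapped_reads: int, total_reads: int) -> float:
    """Fraction of sequencing reads that align to the human reference."""
    return mapped_reads / total_reads

def infer_sex(chr_x_coverage: float, chr_y_coverage: float) -> str:
    """Rough sex call from relative sex-chromosome coverage: a female
    sample shows substantial chrX coverage and near-zero chrY coverage."""
    if chr_y_coverage < 0.1 * chr_x_coverage:
        return "female"
    return "male"

# Illustrative numbers: a predominantly human-mapping, chrY-depleted sample.
human_fraction = fraction_mapped(mapped_reads=9_100_000, total_reads=10_000_000)
sex = infer_sex(chr_x_coverage=30.0, chr_y_coverage=0.4)

print(f"{human_fraction:.0%} of reads map to the human reference")
print(f"Inferred sex: {sex}")
```

In a real analysis the mapping fractions and per-chromosome coverages would come from an aligner's output over a repaired, degraded-DNA library, not hard-coded numbers.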
01:17:45.000 And then the UFO community hated me because I had disproven it as being an alien baby.
01:17:55.000 But of course, that picture that you showed, I mean, it was worldwide news.
01:18:00.000 And literally the title of one of the things is Stanford Scientist Sequences Alien Baby.
01:18:05.000 And so, you know, and so, but the paper stands the test of time.
01:18:13.000 Nobody's disproven what it is that I showed, despite the fact that some people want to say that I was a CIA plant and I was paid off by the CIA, et cetera.
01:18:21.000 Of course.
01:18:22.000 But what that had done, which I didn't realize but kind of hoped, was that it sent up a flag to a scientific community that already existed, that I wasn't aware of, of scientists who were deeply involved with the government in the analysis of UAP that I wasn't privy to.
01:18:44.000 And so literally about a month after the movie came out about that thing, I got a knock at my door, and it was representatives of the CIA and an aerospace company unannounced, and they said, we want to talk to you.
01:19:04.000 And they wanted my help with a number of military and diplomatic personnel who had been, they claimed, harmed by things.
01:19:16.000 They'd either heard stuff, et cetera.
01:19:18.000 And long story short, the majority of the 100 or so people whose medical records I was privy to ended up being the first of the Havana syndrome patients.
01:19:31.000 They'd heard things in their head, et cetera.
01:19:33.000 But what they had done was they had shown me the data literally that day in my office.
01:19:38.000 They brought out the MRIs.
01:19:39.000 They brought out the x-rays and the damage in the brain, et cetera, that was clear.
01:19:43.000 I mean, it wasn't just data, it was evidence that something had happened.
01:19:49.000 It wasn't somebody's story, it was evidence that was repeatable.
01:19:55.000 And so that took us about three or four years to figure out what they were, and it was at about the time that actually the Havana events were occurring that we realized that all the symptoms of what it is that we were seeing in this group of patients were matching what it was that the Havana syndrome individuals had.
01:20:13.000 So in a way, that was good because that meant that those 90 or so patients who matched, we could hand over to the national security people.
01:20:22.000 And, you know, it became a real thing.
01:20:25.000 And now there's like a DOD website that has anomalous health incidents where people can come forward and report the stuff that they've got.
01:20:32.000 And here's the ways you can use the Veterans Administration to seek medical help.
01:20:36.000 Whereas previously they'd been shooed away with "we don't want to hear about it."
01:20:40.000 What do they think it is?
01:20:42.000 It's an energy weapon of some kind, a microwave or other energy or gamma energy weapon.
01:20:47.000 And that sounds crazy, except no one would admit, or deny, that we have the capability to do it.
01:20:54.000 It's basically if you take the front off your microwave and turn it on and put your face near it, you'll get burned.
01:21:01.000 So this is just a way to direct the microwaves or sound waves at a specific individual.
01:21:06.000 At a specific individual.
01:21:07.000 And do you think it was experimental or no?
01:21:10.000 So these are targeted people with a specific intention to get those people because they had some function that they wanted to get them out of the way.
01:21:20.000 Oh, because they were in Havana.
01:21:21.000 Because they were in Havana.
01:21:23.000 But it's been used all over the world.
01:21:24.000 You know, I still get emails from military personnel saying this and this and this happened to me.
01:21:31.000 Here's my medical records.
01:21:32.000 And so now, I know they know that I'm a safe place to approach, because then I know where to send them on the inside.
01:21:42.000 But what was interesting was that once we had set that aside... and I've advised the Senate Intelligence Committee, and I've advised the House on things.
01:21:52.000 I wrote a white paper for them years ago on what I thought needed to be done.
01:21:56.000 But what was interesting were the remaining ten people, you know, who didn't have Havana syndrome but had a series of other problems.
01:22:06.000 And several of them had said that part of their problem was initiated because they'd come in contact with what they claimed to be a UFO.
01:22:14.000 By the way, I just noticed that you have a UFO on the wall behind you.
01:22:16.000 Yeah.
01:22:19.000 We're all in over here.
01:22:20.000 So that got me introduced to, you know, people like Jacques Vallée, who you've had on this show, I think.
01:22:32.000 A great guy.
01:22:33.000 He became my mentor who essentially took me out of the wilderness.
01:22:38.000 I could have gone down twenty different rabbit holes.
01:22:42.000 And he lives in San Francisco and we would meet regularly and we still meet regularly.
01:22:48.000 And he basically gave me a formulation of how to think about this
01:22:54.000 that I never would have been able to get from twenty different, you know, or a hundred YouTubes or what have you, and introduced me to the right people.
01:23:02.000 That eventually led me to meet Lou Elizondo.
01:23:06.000 And I actually, two weeks before that article came out in the New York Times, met Lou in Crystal City overlooking the Pentagon, and he showed me the videos that were about to come out.
01:23:15.000 And that was my first time that I had met him.
01:23:18.000 And then through all of them, I met Dave Grusch and Carl Nell, and Dave and I are in regular contact.
01:23:24.000 And I'm, you know, I just want to say upfront, I hope that the Trump administration understands the value of what David can bring to them and put him in a position of authority that gives him not the ability necessarily to make decisions, but to give the necessary information to the right people.
01:23:44.000 Because I think there's great commercial value here that is being missed, not just the are we alone, et cetera.
01:23:52.000 I think there's extraordinary commercial value.
01:23:55.000 I mean, imagine a civilization that's a million years ahead of us.
01:24:00.000 How many technology revolutions allow these objects to move as we clearly see, something motivating itself or maneuvering around the atmosphere?
01:24:12.000 So if we could scrape just the tiniest bit of understanding off of the top of that, what would that do to change our own civilization?
01:24:21.000 I mean, silicon, a grain of sand, makes us who we are today.
01:24:26.000 Everything that is around me right here is all run off of silicon.
01:24:31.000 Right?
01:24:32.000 I mean, compute.
01:24:33.000 But imagine that there's other inventions, other ways of manipulating reality that we don't appreciate yet because our physics just isn't there yet.
01:24:42.000 If we can understand that, so the government might say, well, we need to keep this behind closed doors for weaponization or we don't want to disrupt energy production or what have you.
01:24:53.000 That's fine.
01:24:54.000 But maybe there's too much secrecy and that maybe there's an aspect of that that could be taken advantage of.
01:25:02.000 So Carl Nell and I have gotten into positive arguments about this, that it's not black and white, that we keep something secret or we put it into the public domain.
01:25:13.000 Maybe there's a middle domain where you have a public-private partnership opportunity.
01:25:17.000 And actually, Carl has now adopted this, at least in part: that maybe companies come to the fore, or investment forums come to the fore, where they will put money in as options to fund, let's say, public scientists to come in behind the scenes with the right levels of clearances to study stuff that would propel society forward again.
01:25:41.000 But this is assuming two things.
01:25:43.000 One, that we have actually recovered these things.
01:25:47.000 Right.
01:25:47.000 And then another one is that it's from a society from somewhere else that's far more advanced than we are today.
01:25:55.000 Right.
01:25:57.000 Which might not be correct.
01:26:01.000 It might not be that it's from somewhere else.
01:26:05.000 It might be that it's from somewhere here.
01:26:09.000 or a dimension that we don't have access to.
01:26:14.000 Right.
01:26:14.000 Right.
01:26:15.000 This is assuming that all this stuff is real.
01:26:17.000 Right.
01:26:17.000 But when you're talking about the government and back engineering of things, like so the big argument, this is the narrative.
01:26:23.000 The big argument has been that they have recovered these things and that these things are now in the hands of defense contractors and that there's been a misappropriation of funds, lying to Congress, and it's always going to stay secret because if it didn't, everybody would go to jail and everyone would get sued.
01:26:40.000 Yeah.
01:26:41.000 Is that fair?
01:26:41.000 Right?
01:26:42.000 Yeah, I mean, that's fair.
01:26:43.000 I mean, but I would say amnesty would be one way to... You were in the Age of Disclosure documentary?
01:26:50.000 Briefly, yes.
01:26:51.000 Yeah, okay.
01:26:52.000 Which I thought was very good.
01:26:53.000 Very good.
01:26:54.000 And I can't wait for that to come out.
01:26:55.000 I've been talking to people, how can I see it?
01:26:56.000 I don't know.
01:26:57.000 I can see it.
01:26:58.000 It's not out yet.
01:26:59.000 And I don't know why.
01:26:59.000 Yeah.
01:27:00.000 Whoever it is, go Netflix.
01:27:02.000 Yo, Ted, go buy that.
01:27:03.000 It's really good.
01:27:04.000 Yeah, exactly.
01:27:05.000 It's really good.
01:27:06.000 It's a great show.
01:27:06.000 I mean, yeah.
01:27:07.000 And it has a number of officials.
01:27:10.000 And I think I sent you guys some of the videos basically coming forward.
01:27:14.000 I mean, you know, Marco Rubio, our current Secretary of State.
01:27:17.000 I mean, you said he's in it.
01:27:18.000 He's in it for like ten minutes, saying some remarkable things.
01:27:22.000 You know, Senator Rounds, you know, you name it.
01:27:24.000 More recently, Tulsi Gabbard.
01:27:26.000 Yes.
01:27:26.000 coming out and saying, there's something going on.
01:27:29.000 I think one of the most fascinating things is Hal Puthoff's description of what happened during the Bush administration.
01:27:37.000 Herbert Walker Bush.
01:27:39.000 Right.
01:27:39.000 So, I believe it was in 1990, they came to Hal Puthoff and a bunch of other experts and said, we want a numerical value placed on all the positives and the negatives of disclosure, because we have acquired these craft from somewhere else.
01:28:02.000 We believe they're not of this world and we have not made them and we're talking about letting the general public know.
01:28:10.000 Right.
01:28:10.000 And they overwhelmingly said that the positives were dwarfed by the negatives.
01:28:16.000 The negatives being banking, religion, government, societal structure, everything would fall apart if we knew we weren't alone.
01:28:16.000 Right.
01:28:23.000 Not only are we not alone, but something is infinitely more sophisticated than us and might be responsible for us being here in the first place.
01:28:33.000 Which is, that's where it gets super squirrelly.
01:28:35.000 Right, right.
01:28:36.000 Where you could imagine the Book of Enoch, and there's a lot of... I mean, I think it's a little bit overwrought as to what humanity's reaction will be.
01:28:47.000 People are more worried today about putting food on the table than they would be about, you know, ethereal or supposed aliens.
01:28:56.000 I mean, they would mostly, I think, on the assumption that they're not going to basically show up at your local Walmart and start interacting with you, I think the fact of revealing that we're not alone is actually more of a hopeful thing to me.
01:29:12.000 Because, you know, how many TV shows right now are about the apocalypse?
01:29:15.000 Right.
01:29:16.000 Of a thousand different varieties.
01:29:18.000 Yeah.
01:29:19.000 Wouldn't it be nice to know that somebody got beyond it?
01:29:21.000 Yeah.
01:29:22.000 That there's not a cliff that we all have to walk over?
01:29:24.000 Right.
01:29:24.000 And if so, how do we not walk over the edge of the cliff?
01:29:27.000 I mean, that to me is a hopeful outcome.
01:29:30.000 Now, Hal and Eric and all the people are all good friends.
01:29:34.000 Hal, for all of the things that he says positively, is probably the tightest clam I've ever met in terms of making sure that he doesn't go over the line.
01:29:44.000 Yeah, he knows too much.
01:29:46.000 Yeah.
01:29:46.000 That's the thing.
01:29:47.000 He has to be very careful who he's talking to and what he says.
01:29:50.000 I'd like to mind meld him, the Spock thing, where you can find all the information.
01:29:55.000 But it's people like him.
01:29:57.000 and Jacques and Kit Green and a number of others, and I sat around a table with them for several years, like twice a year.
01:30:06.000 And I looked around the table and thought, the things that these people know or claim to know, I want to know.
01:30:14.000 And the opportunity that's here, and why can't we get this information out if it's real?
01:30:23.000 And so rather than arguing with people about the matter, that's, for instance, why I created the Sol Foundation, which is a charitable group of academics.
01:30:33.000 I started it with David Grusch and Peter Skafish.
01:30:37.000 David, of course, had to leave.
01:30:39.000 because he had governmental responsibilities he wanted to go take care of.
01:30:43.000 And actually, we've now had for three years in a row a symposium, first at Stanford, then at San Francisco, and the next one is now in Italy.
01:30:55.000 So I'm going to plug it: sol225.org. You can go look if you want to. SOL?
01:31:01.000 SOL, as in the subject.
01:31:02.000 sol225.org. And the purpose of that was not to advocate that any of this is real, but to create an environment within which academics or professionals or just lay people interested in the subject matter could come and talk about it in a very professional manner, right?
01:31:22.000 Just to bounce around ideas, not to advocate for, you know, they're here or they're reptilians or they're this or they're that, but to like some of the things you raised.
01:31:33.000 What are the ethical issues?
01:31:35.000 What are the religious issues?
01:31:37.000 So we have put out a number of white papers.
01:31:39.000 For instance, where we had a member of the Catholic hierarchy write a paper on the issues related to Catholicism and religion.
01:31:48.000 We've had Tim Gallaudet, who's actually on our advisory committee, talk about USOs and those issues.
01:31:54.000 We talked about near-space issues.
01:31:57.000 Peter is running a study on experiencers.
01:32:02.000 Not that the experiences are necessarily real, but what are the kinds of psychosocial matters that need to be considered for people who say that this has happened to them.
01:32:16.000 So there's a group in the UK called Unhidden, which is basically a bunch of psychiatrists, a group of professional psychiatrists who say, okay, well, there's a trauma associated with this.
01:32:28.000 Whether it's real or not, we don't know, but what are the kinds of rules, or provisions, that we should provide to the public and to psychiatrists?
01:32:38.000 So when someone shows up at your doorstep in therapy and says this, you shouldn't immediately reach for the anti-hysteria or schizophrenia drugs.
01:32:51.000 Right, right.
01:32:52.000 I was lucky enough in my neighborhood: our neighbor who moved in for a while was the chair of psychiatry at Stanford.
01:32:59.000 And so we go over to have dinner with her and her husband.
01:33:03.000 And, you know, like one of the first things that she says, hey, what do you do?
01:33:07.000 blah, blah, blah.
01:33:08.000 And I happened to mention the UFO thing.
01:33:10.000 And she just sort of, like, sat back in her seat.
01:33:13.000 Okay.
01:33:13.000 Oh, you might be a kook.
01:33:15.000 Okay.
01:33:16.000 But it took, you know, a year or so until she finally realized that I wasn't, and that I was approaching this in a very scientific manner.
01:33:24.000 I had my beliefs as to what I think it is that I'm dealing with and that there's some sort of reality to this.
01:33:31.000 But that's separate from the scientist in me that says, well, if I want to talk about this scientifically, here are the things that I need to prove or disprove.
01:33:39.000 So that has led, for instance, to my study of materials that Jacques Vallée had brought to me, some metals and other things that had chains of evidence associated with them being at some UAP or UFO landing.
01:33:56.000 And so interestingly, some of these metals are very unusual.
01:34:00.000 Super high purity silicon, strange magnesium ratios, the isotope ratios are wrong, et cetera.
01:34:09.000 Now, that's not proof of anything, but it's proof that somebody engineered them.
01:34:13.000 So it's that, plus the medical. Those are the kinds of reality-based tests that I can do to provide to my colleagues, to say, here is data and evidence.
01:34:27.000 Evidence isn't proof of anything.
01:34:30.000 Evidence, like in a court of law, is just evidence that you provide to the jury of peers.
01:34:36.000 Right.
01:34:38.000 But I've sort of gone a step further.
01:34:41.000 And that is, I'm like, okay, well, if these things are, let's say we get some advanced material, how do I prove that this advanced material was made by some superior intellect?
01:34:55.000 Well, probably the atomic positioning of how the material is made is going to be more advanced than even our most advanced computer chip.
01:35:02.000 So how do you determine that?
01:35:04.000 Well, you need some sort of atomic imager that might tell you where the positions of the atoms are and what the bond structures are. That's something I can measure, and I can give those results to somebody else, and they can say, yeah, it's right, or it's not. But at least I can say that no human, at least that I know of, could make this. So I started a company that I've raised money for, with this new idea that I have for how to make an atomic imager, and we're doing it.
01:35:34.000 You know, we've raised the money, we're building it already, and I know it will work. So when I have it, whether or not it's useful for looking at UAP materials is almost immaterial, because I know how useful it will be for the nanomaterials, the metamaterials, the alloys that the government, et cetera,
01:35:52.000 uses for biology, et cetera.
01:35:54.000 So instead of predicting what a protein structure or a DNA or a chromosome arm looks like, I'll be able to read its structure directly.
01:36:03.000 I want to bring you back to... you said it was ten people that didn't have Havana syndrome, but they had some sort of injury that was associated with a UAP event.
01:36:13.000 What was their thing?
01:36:14.000 Did they have an implant, or was there a... No, some of them had what you would call white matter disease in their brain, like they had been exposed to something.
01:36:25.000 So white matter disease, if you have, for instance, multiple sclerosis and you look in the brain with MRI, you'll see these white areas which are basically dead tissue, scar tissue.
01:36:37.000 They had things like that.
01:36:39.000 One person, one of the pictures that I had was that they had claimed to have seen something in their backyard.
01:36:46.000 They shone a flashlight at it.
01:36:47.000 And the moment they did, they got zapped.
01:36:51.000 And then you see the picture of the guy, the back of his neck, this huge welt and bruising and scarring that there's no reasonable way you could have gotten just by exposing yourself to a flame, for instance, or a blowtorch.
01:37:17.000 And so it's these kinds of events that and the unfortunate issue with these is that they're not repeatable.
01:37:23.000 They're one off anecdotes.
01:37:25.000 Right.
01:37:26.000 And you certainly can't put a person in a place where they become bait for these kinds of events to occur.
01:37:34.000 And so you're sort of some people would volunteer for that.
01:37:38.000 So someone might.
01:37:39.000 Yeah.
01:37:40.000 To go get zapped.
01:37:41.000 You know about the Travis Walton story, right?
01:37:43.000 Very much, yeah.
01:37:44.000 Yeah.
01:37:45.000 What do you think of that?
01:37:46.000 You know, he's kept his story over all of the years.
01:37:50.000 That's what's so confusing.
01:37:51.000 Yeah, I mean, he's had no reason.
01:37:53.000 I don't know that he's profited off of it.
01:37:56.000 He, you know... I find it fascinating, you know, but it's the irreproducibility of the events that the skeptics, I call them more pseudoskeptics.
01:38:11.000 They're pseudists, like nudists.
01:38:13.000 They're pseudists that use the one-off nature of these events to disparage the entire, you know, idea of it.
01:38:23.000 It sounds ridiculous.
01:38:24.000 Well, of course it sounds ridiculous because you're talking about something that is a spacecraft that zaps people.
01:38:29.000 Yeah, it's ridiculous.
01:38:31.000 And I don't think that even he would propose, Travis, that he was purposely hurt.
01:38:38.000 Right.
01:38:38.000 I mean, if you walk across an airfield and get in the plume of a jet engine, you're going to get hurt.
01:38:45.000 Right.
01:38:46.000 Yeah, and his story is that he was taken aboard to heal him.
01:38:46.000 You know?
01:38:50.000 That something happened to him during that event.
01:38:50.000 Yeah.
01:38:53.000 But the crazy part is that all the other people that are in the truck.
01:38:57.000 They saw it and then they passed polygraph examination.
01:39:00.000 Right.
01:39:00.000 Right.
01:39:01.000 They also told the same story independently when they took them and separated them.
01:39:05.000 And then Travis Walton shows up five days later with the same clothes on with this crazy story.
01:39:11.000 Right.
01:39:11.000 You know, so when people say that, you know, there's no evidence or where is the evidence, So there's books like that,
01:39:40.000 dozens of them, that tell the story of data and evidence.
01:39:45.000 How you contextualize it is, you know, up to your personal biases, let's say.
01:39:50.000 But there's plenty of evidence.
01:39:52.000 But if people haven't looked into it, if they have an opinion about it, and they haven't looked into it, they're more like priests than they are scientists.
01:40:03.000 Yeah, that's also the public.
01:40:06.000 The general public narrative is UFO equals kook.
01:40:10.000 You're a kook.
01:40:10.000 Right.
01:40:11.000 You believe in that?
01:40:12.000 That's ridiculous.
01:40:13.000 That's ridiculous.
01:40:14.000 And I don't believe in anything but the data and the evidence. And there's not enough evidence for me to tell a colleague of mine it's real.
01:40:24.000 But there's enough evidence for me to say there's a question worth answering.
01:40:28.000 So when you were talking about magnesium and these whatever these alloys are, what is specifically wrong with them that you don't think that it was manufactured by like a standard sort of alloy plant in the United States or somewhere else?
01:40:45.000 Right.
01:40:47.000 So the silicon that I'm talking about is from an event in Ubatuba, Brazil. Interestingly, there's another
01:40:55.000 piece of it that appears to have been magnesium, but both of them are of a purity that is unusual for the day, in the late 1950s.
01:41:04.000 So the magnesium... and I did an atomic mapping of my piece of silicon, down to a level where it's like 99.999 percent silicon.
01:41:16.000 And so one piece of it had magnesium ratios that were earth normal.
01:41:23.000 And these were impurities, let's say.
01:41:27.000 The other piece was way off Earth normal.
01:41:30.000 So, for instance, anywhere on Earth, and anywhere in our solar system, that's more or less what the values of the ratios should be.
01:41:50.000 And that has to do with stellar evolution and how, you know, radioactive compounds might decompose to whatever.
01:42:00.000 But we got this ratio that was just way, way off.
01:42:04.000 So by luck, I came across a postdoc at Stanford.
01:42:13.000 And he and a graduate student, they're both in applied physics, who are interested in UAP.
01:42:17.000 And I said, I've got these ratios.
01:42:18.000 What do you think it means?
01:42:21.000 And so they looked at the ratios, and the weird one, and they said, well, let's do some calculations.
01:42:28.000 And so it turns out that the ratios that we have could have been generated from normal magnesium ratios if you exposed normal magnesium ratios to a neutron source for 900 years at the level of an atomic bomb every few seconds.
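The idea behind that calculation can be illustrated with a toy first-order capture chain (24Mg captures a neutron to become 25Mg, which captures another to become 26Mg). This is only a sketch: the rate constants and step count below are purely illustrative placeholders, not the actual neutron-capture cross-sections or fluxes the physicists would have used.

```python
# Toy model of how sustained neutron exposure could shift magnesium
# isotope ratios via a capture chain: 24Mg -> 25Mg -> 26Mg.
# k24 and k25 are illustrative rate constants, NOT real cross-sections.

def irradiate(n24, n25, n26, k24, k25, steps):
    """Advance a first-order capture chain by discrete time steps."""
    for _ in range(steps):
        c24 = k24 * n24  # amount of 24Mg converted to 25Mg this step
        c25 = k25 * n25  # amount of 25Mg converted to 26Mg this step
        n24 -= c24
        n25 += c24 - c25
        n26 += c25
    return n24, n25, n26

# Start from the natural abundances of magnesium (in percent):
# 24Mg 78.99, 25Mg 10.00, 26Mg 11.01.
n24, n25, n26 = irradiate(78.99, 10.00, 11.01, k24=1e-3, k25=2e-3, steps=5000)
print(n24, n25, n26)  # the mixture ends up dominated by the heavier isotopes
```

The point of the sketch is only qualitative: total material is conserved, but long enough exposure drives the ratio arbitrarily far from the natural one, which is the kind of signature the calculation he describes was testing.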
01:42:50.000 Okay.
01:42:51.000 So they wow.
01:42:54.000 So it's like I'm looking and this data is literally two weeks old.
01:42:59.000 But the calculations are math.
01:43:01.000 So you're like, okay, well, where and how? The chance of getting that number correct on three things is low, to put it mildly.
01:43:16.000 But to say that you had exposed these things to that kind of a neutron source means something interesting, right?
01:43:27.000 So again, it doesn't prove anything, other than that the result is mathematically and materially true.
01:43:36.000 So what does it mean?
01:43:38.000 Again, it's just for a scientist like me who loves data off the curve, it's catnip.
01:43:45.000 I can't help myself but want to know and understand more about it.
01:43:52.000 Yeah.
01:43:53.000 I mean, just what you said is what you said about the magnesium ratios.
01:43:58.000 That's, has there ever been any debunkers that have some sort of an explanation for why you would find that?
01:44:06.000 No.
01:44:06.000 I mean, do they think that your measurements are wrong?
01:44:09.000 Well, I mean, the only way you could create that ratio artificially is by purifying each of those isotopes and then mixing them back together to that ratio.
01:44:19.000 But why would you blow it up over a beach in Ubatuba, Brazil, in the late 1950s and then let it sit in a museum in Argentina for fifty years, until Jacques Vallée ended up going and grabbing a piece of it and bringing it to me to measure on an instrument in the Engineering Department at Stanford?
01:44:38.000 Why?
01:44:39.000 Could you do it physically back then?
01:44:42.000 Would that be possible?
01:44:43.000 It would have been very hard.
01:44:45.000 It would have been very, very hard.
01:44:47.000 You could, but in the late 1950s, we were still busy trying to isolate and separate uranium isotopes for making more bombs.
01:45:00.000 I mean, let's look, let's be serious.
01:45:02.000 What do humans separate isotopes for?
01:45:05.000 To make bombs or to do health-related tagging, which is really only something that came to the fore in the 60s and 70s.
01:45:15.000 And this predates that by decades?
01:45:17.000 This predates it.
01:45:17.000 So it's unusual.
01:45:19.000 It's possible.
01:45:21.000 But, I mean, again, with any of these things, why?
01:45:25.000 Why, for instance, would one of the supposed pieces that came from that event be magnesium at a level of purity that only Dow Chemical at the time had the ability to create?
01:45:40.000 Now, what else was at this site and what is the story behind this site?
01:45:44.000 A fisherman sees this glowing object that kind of released something which then exploded and he picked up pieces of it.
01:45:54.000 And there's some chains of evidence of how it got to either a newspaper in Brazil or to this South American Museum, et cetera, and different studies have been done by different people over time.
01:46:09.000 And the surprise to me was that the piece that I had was silicon, whereas the lore was that it was magnesium.
01:46:16.000 So I've been in contact with the people who talk about it as being magnesium, saying, well, it's, you know, your results don't dispute mine.
01:46:25.000 It just says that maybe there was something different.
01:46:27.000 Is that him?
01:46:29.000 Travis Walton?
01:46:29.000 There?
01:46:30.000 That's Travis?
01:46:31.000 Yeah, that's right.
01:46:32.000 Travis Bob.
01:46:33.000 That's cool.
01:46:36.000 So, you know, I don't know what it means.
01:46:40.000 I published probably one of the first peer-reviewed papers on a UAP material, from an event in Council Bluffs, Iowa.
01:46:49.000 And the event was an object is seen rotating, lights flashing, et cetera.
01:46:55.000 Something appears to drop from the object.
01:46:57.000 The police saw it, several other groups saw it in the 1970s.
01:47:02.000 They all converged on the locale.
01:47:05.000 And this was like in February or something, it was winter.
01:47:08.000 And there was this big pile of molten metal in the middle of this field, probably 30, 40 pounds of it.
01:47:15.000 And people tried to explain it away as, well, a
01:47:18.000 helicopter had a giant vat of molten metal. And then you calculate how far and how big a container you would have to carry molten metal of this type.
01:47:27.000 And so I analyzed it with a device that we invented in my lab, actually, called multiplexed ion beam imaging, which is a kind of what's called secondary ion mass spec, where you shoot a beam of ions at an object, like a sandblaster.
01:47:44.000 It ionizes the material on the target, and then you shoot off and measure the mass of the objects that you just sandblasted off.
01:47:53.000 And so what we found was nothing unusual.
01:47:57.000 in terms of isotope ratios, except we found a mixture of metals that depending on where you looked in the sample was different.
01:48:05.000 So it would be like iron, titanium, and chromium of a certain ratio here, but a different ratio of those things over there and over here.
01:48:13.000 So what that meant was that whatever this stuff was didn't come completely pre-mixed.
01:48:20.000 It wasn't like a milkshake.
01:48:23.000 It was a slurry of partially mixed materials that somebody decided to drop off.
01:48:29.000 So again, this is just data.
01:48:31.000 But my purpose in publishing it was, first... and this was published in Progress in Aerospace Sciences, peer-reviewed.
01:48:39.000 The purpose was to show you're not going to get thrown out of the academy for publishing this stuff.
01:48:47.000 As long as you don't make crazy conclusions and you just say, here's the data. It was to show people that you can publish this stuff as long as you're scientifically careful in how far you go.
01:48:59.000 You leave yourself plenty of diplomatic exits in the verbiage that you use.
01:49:05.000 And it was part of what then got me to start the Sol Foundation, along with Dave and others, to say, look, it's okay to do this as long as you're careful.
01:49:15.000 And it's why people, I mean, Avi Loeb came after me because he had kind of the same pushback from his community where all he was doing was saying, the question's on the table.
01:49:28.000 I'm not saying it's true.
01:49:30.000 It's just you can't push this off the table.
01:49:32.000 So he had the same kind of righteous indignation that I have that propels me to say, well, I'm going to show you why you can't take this off the table.
01:49:42.000 So when they found this puddle of molten metal, and it's a bunch of different mixtures, so it seems like there's a bunch of different stuff that was there and it wasn't perfectly mixed.
01:49:53.000 Is there some sort of, have you theorized some sort of reason why they, any person or any creature, any being would do that?
01:50:05.000 Is there something that you would extract from that kind of metal, like heating it up to a certain degree and having a mixture of all these things, and this is just a byproduct that they're dropping off?
01:50:17.000 I think it's a byproduct of some process that might, again, might, might, might.
01:50:21.000 Might, might, might extract.
01:50:23.000 It might be part of a propellant system.
01:50:25.000 It might be part of the way that they generate the fields that allow these things to move.
01:50:30.000 Again, this is all speculation. Like, when you see something do something and you don't understand what it is, you have to be fully open.
01:50:40.000 I mean, for all I know, they're flushing the toilet.
01:50:42.000 Right?
01:50:43.000 Oh boy.
01:50:44.000 Yeah.
01:50:44.000 Ew.
01:50:46.000 But they got metal poop.
01:50:48.000 So, but, but, you know, I have the original Polaroids from the police department of it.
01:50:54.000 So, you know, it was real.
01:50:55.000 And people said, Oh, it was thermite.
01:50:57.000 Well, if it were thermite, there'd be aluminium oxide.
01:51:01.000 You know.
01:51:02.000 Thermite meaning that's how it was smelting down.
01:51:03.000 That's how it was smelting down.
01:51:05.000 And it's just some kids playing around, et cetera.
01:51:07.000 And it was a big joke.
01:51:09.000 Wacky kids with their thermite.
01:51:10.000 With their thermite.
01:51:12.000 But it turns out there's no aluminium hydroxide or oxide, I should say, in the sample.
01:51:16.000 I mean, I have the analysis.
01:51:18.000 It's just not there.
01:51:19.000 So it had to have been extreme heat.
01:51:20.000 It had to have been extreme heat of some kind that would produce it.
01:51:23.000 And whatever it was was hovering for a moment.
01:51:27.000 So it wasn't an airplane.
01:51:30.000 And there were no helicopters, and at least no helicopters with flashing lights.
01:51:34.000 And huge chunks of it still exist.
01:51:38.000 And the amount of this stuff, the kind of cauldron that would have to exist in order to melt this would be immense.
01:51:46.000 It was immense, yeah.
01:51:48.000 People in the 70s already sort of made estimates of what was required.
01:51:51.000 And people said, Oh, it's a meteorite.
01:51:53.000 Well, no, we basically showed mathematically how, you know, first of all, meteorites make holes when they hit the ground.
01:51:58.000 They don't melt when they hit the ground.
01:52:02.000 And they make explosions.
01:52:03.000 Are there similar instances of something along this lines?
01:52:07.000 Several.
01:52:07.000 Really?
01:52:08.000 That's what's so interesting is that worldwide there are multiple reports of molten metals that get dropped off these objects.
01:52:19.000 And I actually have two other ones, of molten metal that was dropped off: one case in Australia and another in an area I'm not allowed to say. And one actually happened, supposedly, in Fresno... I've got to find the guy again, maybe he's listening... he said stuff dropped, and he has, you know, molten metal that landed in a puddle in the asphalt of his driveway.
01:52:44.000 And he saw this object.
01:52:46.000 So he's just holding on to it?
01:52:48.000 He's holding on to it.
01:52:49.000 He reached out to me and I was, you know, it was still at a time when I was just kind of getting into this area, but there's many, many examples of this kind of thing.
01:52:59.000 So, but interestingly, several of these other ones are just aluminum.
01:53:05.000 The one that I have is iron or whatever.
01:53:07.000 So what does that tell me?
01:53:09.000 That there's different kinds of ways of accomplishing the goal?
01:53:13.000 Whatever it is, they're either throwing something overboard or for, you know, because they don't need it anymore or because maybe it's getting in the way of something and it's time to get rid of it.
01:53:22.000 Have you brought in someone who's like a real expert in material sciences that would like to theorize, like, given an immense increase in technology and what, like, what potentially do you think this could be?
01:53:35.000 The purpose of being on shows like this is to have experts maybe give me an idea, because the people I've been to at Stanford, you know, the other professors, they're like, okay, yeah, I gotta go.
01:53:52.000 Yeah, it could be it could actually be detrimental to your career.
01:53:56.000 And that's what's really weird about something when you're just talking about data, specifically in this case, of an actual physical thing that anyone can measure.
01:54:04.000 And I've got pieces, I've got plenty of it, you know. And the original piece, you know, is like this big; the owner of it had brought it to my lab just last summer.
01:54:04.000 Right.
01:54:17.000 It's like big as an iMac.
01:54:18.000 Yeah, exactly.
01:54:19.000 Oh, it's huge.
01:54:20.000 Crazy.
01:54:21.000 And so what is it?
01:54:22.000 I would love for someone to tell me that it's conventional and has a purely prosaic answer.
01:54:28.000 Because then I can go on to the next thing.
01:54:30.000 The whole reason for getting the Atacama Mummy off the table was not because I wanted to annoy anyone; it was because it was spectacular.
01:54:39.000 It's obviously something people would pay attention to.
01:54:41.000 So if it's real, let's do it.
01:54:42.000 If it's not, let's get it off the table.
01:54:44.000 Because it's usually the stuff that's hidden under the rubble.
01:54:47.000 That's the most interesting.
01:54:48.000 My question about that mummy is not that it's an alien, but if it does register as human in the DNA, is it potentially a different kind of human than us?
01:55:01.000 Well, certainly she.
01:55:02.000 We brought in an expert in the genetics of South American indigenous peoples, and the analysis showed that the standard genetic mutations that are found in different racial groups around the world matched exactly the Atacama region of Chile.
01:55:27.000 So her parents, her relatives were clearly Chilean.
01:55:35.000 So, yeah, I mean, that's really all you can say.
01:55:40.000 Just to say that she's an alien, well, that's fine.
01:55:42.000 I'm convinced of what she is and that she deserves a proper burial.
01:55:46.000 And so it's just a genetic anomaly.
01:55:48.000 Just a genetic anomaly.
01:55:50.000 I do know that you've paid attention to the Tridactyl mummies.
01:55:54.000 Yeah.
01:55:55.000 What is your take on that?
01:55:57.000 So, you know, I think people have conflated a lot of the different mummies that are out there.
01:56:05.000 First of all, there's like 60 of them or something.
01:56:09.000 And probably a fair number of them, I wouldn't necessarily call them hoaxes.
01:56:14.000 I would say that they are constructed.
01:56:16.000 But they're old constructs.
01:56:17.000 So maybe they're some sort of homage paid to the ancestors or something like that, whatever they are.
01:56:24.000 So there are some that you clearly look at, you go, oh, come on.
01:56:28.000 That never lived.
01:56:30.000 Then there's the fetal position.
01:56:31.000 Then there's the fetal position ones, the big ones.
01:56:34.000 And I was at the beginning, I was, you know, I'm always open to being wrong.
01:56:40.000 I was at the beginning thinking, oh, well, because of the small ones, those are probably not real.
01:56:43.000 But then the MRIs started coming out.
01:56:46.000 The full body MRIs, and the ligature and the bone construction and the fingers, and then, perhaps most extraordinarily, I think, the fingerprints on them being clearly not human.
01:57:01.000 So it's interesting.
01:57:04.000 But here's the problem is that because there's so much circus around them, unfortunately created by people who want a circus because it sells their TV shows, no scientist of any merit would go near it.
01:57:23.000 So I was approached many times, many times to study them.
01:57:26.000 And I said, I'll do it on one condition.
01:57:29.000 Here's the money I need, not personally, but here's the money I need to do the kinds of analysis to accomplish this right.
01:57:37.000 Second, there will be no TV cameras.
01:57:40.000 And you won't hear from me again until I'm ready to talk.
01:57:44.000 Because I'll have double checked and triple checked and quadruple checked the results.
01:57:48.000 And then I'd go out, as I did with the Atacama Mummy, bringing in further concentric circles of experts to double-check me.
01:57:56.000 And not make it a circus.
01:57:57.000 And not make it a circus.
01:57:58.000 Because I won't name the TV show that wanted to do it.
01:58:03.000 but they wanted me, they wanted to follow me around with a...
01:58:07.000 I'm like, no.
01:58:08.000 This isn't how science is done.
01:58:11.000 I can't do it with those strictures.
01:58:13.000 So I would say that if anybody's going to do it again, lock the things away with South American scientists.
01:58:21.000 You don't need a North American scientist to come in and do it.
01:58:24.000 There's plenty of smart people in South America who can do this properly and respect the rights of the indigenous peoples who own the sacred grounds within which these things were found.
01:58:34.000 I think that's important.
01:58:38.000 And then do the analysis right.
01:58:40.000 You know, they've made, I think, the mistake of saying, well, we've done the DNA and there's a lot of DNA that doesn't match
01:58:48.000 anything. And the stuff is several hundred years old; with anything that old, you won't get a lot of good DNA out of it.
01:58:56.000 They did the same thing with the Denisovan and the Neanderthal.
01:59:01.000 You have to correct the chemical errors that occur over time.
01:59:06.000 There are ways to what's called bioinformatically correct.
01:59:10.000 You need to do what's called overreading of the genome, where you do so many reads of it that you stack them all up line by line.
01:59:18.000 Like, if you had a thousand versions of an ancient Bible, you would stack up the lines one by one, and finally you find one line that has this letter correct, and then this one correct, and then you'd basically do a summation, an averaging, of the correctness.
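The stacking he describes is essentially a per-position majority vote across aligned reads. A minimal sketch, assuming the reads are already aligned and of equal length (real consensus calling also has to handle gaps and ancient-DNA damage models):

```python
from collections import Counter

def consensus(reads):
    # Per-position majority vote across aligned reads of equal length.
    # zip(*reads) yields one column of characters per position.
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*reads))

# Five damaged copies of the same hypothetical ancient sequence:
reads = [
    "GATTACA",
    "GATTACA",
    "GACTACA",  # error at position 2
    "GATTGCA",  # error at position 4
    "GATTACA",
]
print(consensus(reads))  # majority vote recovers "GATTACA"
```

Each copy carries its own random errors, but as long as most reads are correct at any given position, the vote recovers the original letter, which is why deep overreading can rescue degraded genomes.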
01:59:36.000 And so they say, oh, well, there's, you know, 90% of the genome is nonhuman.
01:59:40.000 It's probably garbage.
01:59:41.000 It's probably these mistakes.
01:59:42.000 It's probably bacterial contamination that you're reading.
01:59:46.000 There's ways to deal with that, but that requires money, and not one-off DNA sequences put on the internet for some amateur genomicist to make a claim about.
01:59:58.000 So there's ways to do it.
01:59:59.000 I mean, you would want at the end of the day to get the results to the level where you could go to the guys who did the Denisovan and the Neanderthal DNA, the Max Planck Institute and others who won the Nobel Prize for it, and say, hey, what do you think?
02:00:15.000 But you don't dare take it to people like that until you've done your homework.
02:00:20.000 I see.
02:00:21.000 And you do it behind the scenes.
02:00:22.000 You don't put them under a flashlight.
02:00:25.000 Right, right.
02:00:26.000 You know, and people, I think, have gotten used to this click mentality of impatience where I want the result today.
02:00:36.000 Why can't you just make it all transparent?
02:00:38.000 Dump all the data on the web tomorrow.
02:00:41.000 You're not transparent.
02:00:42.000 You're hiding something.
02:00:43.000 No, I'm not.
02:00:44.000 I'm just trying to make sure that you don't make the mistake and accuse me of making the mistake that you'll find in the data because the raw data is never clean.
02:00:54.000 Mm.
02:00:55.000 Mm.
02:00:55.000 In the Daily Mail headline.
02:00:57.000 In the Daily Mail headline.
02:00:58.000 Never accurate.
02:01:00.000 So, long story short, I think there's still something worth looking at there.
02:01:06.000 Well, the scans are fascinating, right?
02:01:09.000 The scans are the most interesting to me.
02:01:09.000 Yeah.
02:01:10.000 Have you seen the Jesse Michels video?
02:01:13.000 Jesse is a good friend.
02:01:13.000 Yeah.
02:01:14.000 He's great.
02:01:15.000 I love that guy.
02:01:16.000 And the episode that he did is fantastic.
02:01:20.000 And when you see the scans and they go over the bone structure of the thing and you look at it, you're like, God, that looks real.
02:01:27.000 If that's a hoax from 1700 years ago or over 1000 years ago.
02:01:31.000 Exactly.
02:01:32.000 Well, if the carbon isotope dating that they did on it is accurate.
02:01:37.000 I've looked at that data.
02:01:38.000 It looks good.
02:01:39.000 So then it is that old.
02:01:39.000 Okay.
02:01:42.000 Fuck you then.
02:01:43.000 Because there's no way someone back then could fake that.
02:01:46.000 And someone asked me the other day, they said, Well, could you have a single mutation?
02:01:50.000 I said, No.
02:01:51.000 I mean, because you don't get one mutation that does all that.
02:01:55.000 Right.
02:01:55.000 You know, evolution works step by step: this does this, but it has a mistake, which is corrected by this mutation over here, which is corrected by this.
02:02:08.000 The whole, the genome fluctuates over time, compensating for the errors that would otherwise have killed you.
02:02:15.000 Also, one of them is pregnant.
02:02:18.000 That's fascinating.
02:02:19.000 Yeah, I know.
02:02:20.000 Okay, so it's a three-foot pregnant thing that doesn't look remotely like a human being.
02:02:25.000 Yeah.
02:02:26.000 So the jury is still out.
02:02:28.000 Right.
02:02:29.000 But if they're going to do it right, they need to sequester the stuff away, bring in the right people with sufficient resources, and get rid of the cameras.
02:02:38.000 Have you talked to them?
02:02:39.000 Have you encouraged this?
02:02:41.000 Is this possible to nudge this in the right direction?
02:02:44.000 And where is it out right now?
02:02:46.000 I wrote out on Twitter a full thing of what they needed to do.
02:02:49.000 I mean, the easiest first milestone to do, to be honest, that could be done within a couple of months, is if it is somewhere in the hominid or, let's say, vertebrate line, there are metabolism genes that we all share.
02:03:06.000 In fact, there are metabolism genes that we share with bacteria that are very similar.
02:03:11.000 So there's, you probably, you know the technique called polymerase chain reaction, PCR?
02:03:17.000 So, you know, why try to do the whole genome?
02:03:20.000 Why not just target a bunch of genes that we know evolve slowly but do evolve, and PCR those out, because that's easier to do than trying to assemble a whole genome. And then, by having just those, let's call it, preliminary sets of evidence, you could then say, hmm, this is actually reproducible: if I take a sample from the finger,
02:03:50.000 I take a sample from the bone marrow, I take a sample from here or there on the body, and I take a sample from different, the three different main things, and I see the same mutations, and they're different or somehow aligned with hominid evolution, right?
02:04:07.000 We compare it to all the known hominids.
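The cross-sample check outlined here can be sketched in a few lines: PCR the same conserved gene out of several tissues and ask whether identical variants recur in all of them, which one-off sequencing errors or contamination wouldn't do. The gene, sequences, and sample names below are entirely made up:

```python
def reproducible_variants(samples, reference):
    """Positions where every sample carries the same base and it
    differs from the reference -- variants that reproduce across
    independent tissue samples rather than one-off errors.
    Sequences are assumed pre-aligned and equal in length.
    """
    hits = []
    for i, ref_base in enumerate(reference):
        bases = {s[i] for s in samples}
        if len(bases) == 1 and ref_base not in bases:
            hits.append((i, ref_base, bases.pop()))
    return hits

# Hypothetical PCR products of one conserved gene from finger,
# bone marrow, and rib samples, against a human reference.
reference = "ATGGCTTACGGA"
samples = [
    "ATGACTTACGGA",   # finger
    "ATGACTTACGGA",   # bone marrow
    "ATGACTTACGGA",   # rib
]
print(reproducible_variants(samples, reference))  # [(3, 'G', 'A')]
```

The same variant showing up in all three tissues is the "somehow aligned with hominid evolution" signal he describes; a variant in only one tissue would point to error or contamination instead.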
02:04:11.000 I mean, that would be the kind of data that you could actually publish in a journal like Nature, if you did it right.
02:04:17.000 Because that's the only way you're going to get anybody to pay attention.
02:04:20.000 There's also the bizarre anecdotal nature of some of the artwork.
02:04:27.000 Like the fact that these people did a lot of these tapestries and a lot of ancient artwork that's a thousand years old that depicts these three fingered things.
02:04:39.000 So it's like, what are they described?
02:04:41.000 Are they describing these actual creatures?
02:04:43.000 That there were only a few of them and it was a weird genetic mutation or is this a common visitor that they're describing?
02:04:51.000 I don't know.
02:04:52.000 I don't know either.
02:04:53.000 I mean, why would you put them in a cave in Peru?
02:04:58.000 I don't know.
02:04:59.000 And if you didn't put them in a cave in Peru, what would be left?
02:05:01.000 That's the problem.
02:05:02.000 The problem is it's really hard to make a fossil, it's really hard to find bones.
02:05:06.000 Think about all the people that died.
02:05:08.000 Right.
02:05:08.000 And, you know, we don't find that many bones, relatively speaking, compared to the fucking billions of people that died.
02:05:16.000 Right.
02:05:16.000 It's not like we're tripping over human bones every day.
02:05:18.000 Right, except in mass graves.
02:05:20.000 Yeah, right.
02:05:21.000 That's really, yeah.
02:05:23.000 And even in mass graves, given enough time, they would deteriorate like mass graves from 1,700 years ago, whatever these things are.
02:05:31.000 So, you know, I find them, again, I find them interesting.
02:05:34.000 And I hope that behind the scenes, there are people who are taking a more methodical approach to this who I think should remain stealthed until they have the data to the point where it is publishable.
02:06:04.000 The reason you want papers, frankly, when you publish them, to be almost boring and so thick with detail is that no pseudoskeptic would dare approach them, because they're just not smart enough.
02:06:19.000 But if you put out these snippets that don't have sufficient background, they can be picked apart by anybody.
02:06:27.000 But that's why peer review is so important.
02:06:27.000 Right?
02:06:30.000 And people mistake peer review as trying to get the reviewers to agree with your conclusions.
02:06:35.000 No, the main purpose of peer review is actually to make sure that the methods that you used are sufficiently detailed and are correct enough to the extent you came to any conclusions, they match the methods that you used.
02:06:50.000 And when you think about these potential, whatever they are, whatever these creatures are, if we did find out that they are some sort of a hominid,
02:07:09.000 How much credence do you give to the theory that there's like the possibility that these UFOs, UAPs, whatever it is, is a break off civilization from a very, very long time ago that's very different from us, just the way we're very different from chimpanzees.
02:07:24.000 Right.
02:07:24.000 Which we coexist with.
02:07:25.000 Right.
02:07:26.000 I have no problem conjecturing that.
02:07:29.000 Did you ever see the Netflix show Chimp Empire?
02:07:32.000 Yes.
02:07:33.000 Amazing, right?
02:07:34.000 Twenty million years of separation, and it looked like a fucking faculty meeting.
02:07:34.000 Amazing.
02:07:40.000 You know, with people like looking at each other, planning and plotting, board meeting, you know, and so we shared all those interactions from twenty million years ago.
02:07:52.000 So how much further back would you have to go to have something like what that is?
02:07:59.000 I mean, it's clearly not recent.
02:08:02.000 And also, if you think about what we are in comparison to chimps, we're so fragile, we're frail, we're easily injured. Well, if you think of something that's far more technologically advanced than us, it would be even more frail, it would be even more petite, it would have almost no muscle at all. It would look, weirdly enough, like the Grays from Close Encounters of the Third Kind.
02:08:26.000 That's what it would look like if it was a hominid that's whatever we are, and it went way past that.
02:08:33.000 Right.
02:08:33.000 Yeah, no, technology gives evolution the excuse to no longer make or allow for you to be robust.
02:08:43.000 Robust, thank you.
02:08:44.000 And also, why do you need opposable thumbs?
02:08:46.000 Right?
02:08:46.000 Yeah.
02:08:47.000 These things don't even have opposable thumbs.
02:08:48.000 That was what's weird about it.
02:08:49.000 It's like, how do you interact with your environment?
02:08:49.000 Right.
02:08:52.000 They look more like sloths, they really do.
02:08:54.000 Right.
02:08:55.000 I mean, at least their hands do.
02:08:57.000 Yeah.
02:08:57.000 And I don't know.
02:08:58.000 I find it, well, if everything is done with AI and automation, and your interface is purely neurological, like you have some sort of human or a creature neuro interface with technology and you just use fingers to like lay them on electronics so that you can sync up with it.
02:09:18.000 Right.
02:09:19.000 Yes.
02:09:20.000 Yeah.
02:09:21.000 Why are you picking things up, bro?
02:09:22.000 You don't have to pick things up anymore.
02:09:24.000 Those go away just like, you know.
02:09:26.000 Can you imagine the scenario of, I mean, these things we know are the bodies are real.
02:09:33.000 What they are, we don't know.
02:09:34.000 But can you imagine the scenario of what happened as they were being buried?
02:09:40.000 Could you like make a, you know, a film of the ceremonial burial of these things?
02:09:49.000 You know, what would, what led to their death?
02:09:53.000 What led to their placement there?
02:09:55.000 Or if they were constructed, which I have a hard time with given the MRIs that we've all seen, et cetera, what led to it?
02:10:06.000 And so that to me is almost interesting as to whether or not they're real or not.
02:10:11.000 Right.
02:10:12.000 Like the ones that are clearly constructed, that's where it gets fascinating.
02:10:15.000 Because like, what were you trying to reproduce?
02:10:17.000 Yes.
02:10:18.000 And why are they so similar to the ones that look real?
02:10:21.000 Yeah.
02:10:23.000 Is it an homage to the ancestors or to the stories of the ancestors, et cetera.
02:10:29.000 Especially when you look at Peru. Like, Peru is like, you've got the Nazca lines, which are really weird.
02:10:36.000 You can only see them from the sky, and they're everywhere, and they're huge, these depictions of very strange things.
02:10:45.000 So I just ask my scientific colleagues to not suspend disbelief, but to open your minds as to the possibility.
02:10:55.000 of what these things might mean and just try to explain them without dismissing them.
02:11:00.000 Because it's so easy, and in politics we see it every day.
02:11:03.000 All you need to do is just give any answer, even if it's obviously, flagrantly wrong, just as a way to deflect.
02:11:12.000 And so, you know, you can use that approach, deflection, which you shouldn't ever use as a scientist, and which unfortunately is what someone like, you know, Neil deGrasse Tyson often does.
02:11:23.000 Yeah.
02:11:24.000 And as opposed to try to explain in a way that teaches your audience the right way to think.
02:11:34.000 Yeah, well said.
02:11:37.000 One of the things that Jacques Vallée highlighted is there's an alloy, another piece of metal that they'd found that had layers like these at an atomic level.
02:11:52.000 That if you wanted to make this alloy today, it would be almost impossible.
02:11:58.000 It would cost billions of dollars.
02:12:00.000 So I worked with him on one of those pieces.
02:12:03.000 I got the atomic imaging of some of that.
02:12:06.000 And it's, oh God, I'm blanking on the event, but it was the Sirocco event.
02:12:13.000 And where was that?
02:12:14.000 In New Mexico.
02:12:16.000 I'm going to get in trouble for not knowing exactly.
02:12:19.000 And we actually did an atomic layering using this device called atomic probe tomography where you literally pick it apart atom by atom and get its 3D position.
02:12:28.000 It's a 40-year-old technology, so it's nothing magic.
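Atom probe tomography output is essentially an event list, one detected ion with a 3D position per record, so a composition check reduces to counting. A toy sketch (the isotope labels, positions, and counts are invented; the natural-abundance figures for magnesium are approximate):

```python
from collections import Counter

def isotope_fractions(atoms):
    """Fractional abundance of each isotope in an atom-probe-style
    event list: one (isotope, x, y, z) record per detected atom.
    """
    counts = Counter(isotope for isotope, *_ in atoms)
    total = sum(counts.values())
    return {iso: n / total for iso, n in counts.items()}

# Hypothetical detections: the 3D positions are placeholders; only
# the isotope labels matter for a bulk-composition check.
atoms = [
    ("Mg24", 0.1, 0.2, 0.3),
    ("Mg24", 0.4, 0.1, 0.2),
    ("Mg24", 0.2, 0.5, 0.1),
    ("Mg25", 0.3, 0.3, 0.3),
    ("Mg26", 0.5, 0.2, 0.4),
]
fractions = isotope_fractions(atoms)

# Natural terrestrial magnesium for comparison (approximate values):
natural = {"Mg24": 0.790, "Mg25": 0.100, "Mg26": 0.110}
for iso in natural:
    print(iso, round(fractions.get(iso, 0.0), 2), "vs natural", natural[iso])
```

A real analysis would need far more events and error bars, but this is the shape of the "altered ratios" question raised later about the silicon and magnesium samples: do the measured fractions deviate significantly from the natural ones?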
02:12:33.000 So, and yeah, it would just be very difficult to make it, you know, and certainly it would not be something that you would have dropped in the middle of the desert.
02:12:43.000 Is it Socorro?
02:12:44.000 Socorro.
02:12:45.000 In the middle of the desert, you know, in the 1970s or whenever it was.
02:12:51.000 I wouldn't say it's impossible to make.
02:12:54.000 But why you would do it is another question.
02:12:57.000 It's clear what interests me is, first of all, why would you do it?
02:13:09.000 Why would you create something, for instance, with the silicon and the magnesium with the altered ratios?
02:13:14.000 Not the where did it come from?
02:13:16.000 So what is it evidence of?
02:13:18.000 It's clearly evidence of technology.
02:13:21.000 Was this technology available at the time this supposed crash happened?
02:13:26.000 Which one?
02:13:26.000 This?
02:13:27.000 No, not, no.
02:13:29.000 Not at the level of precision that was done, and a chunk of... no, it just wasn't.
02:13:34.000 It just wasn't.
02:13:35.000 So if that's true, if the chain of evidence is correct, and it really did come from that area, from that crash.
02:13:43.000 That's not a human creation.
02:13:45.000 Well, it wasn't a crash.
02:13:47.000 It was an object that a policeman had seen with beings, short beings outside of it, and when it took off and left, he went over and found this piece that I actually, I personally have it now.
02:14:03.000 Huh.
02:14:04.000 So, but, you know, it's hard to say.
02:14:09.000 what's possible and what's not possible.
02:14:11.000 So, you know, there's plenty of military programs that make stuff that are way outside of mainstream capabilities right now.
02:14:19.000 I mean, just look at the stealth bomber.
02:14:21.000 For instance, and the skin of the stealth bomber is just remarkable.
02:14:21.000 Right.
02:14:24.000 Is it possible they were doing that in 1970?
02:14:26.000 Maybe.
02:14:27.000 Maybe.
02:14:27.000 So that's why I always leave open the possibility. Which is why, I mean, I'm going to get back to this atomic imager thing that I'm making.
02:14:38.000 It's like, there's a level of evidence that I think can be produced with atomic imaging that goes beyond what it is we know anybody can make.
02:14:50.000 Right?
02:14:51.000 So, and so that's my reason for wanting to do it.
02:14:57.000 Because, you know, look, I can make money on it with looking at alloys and nanomaterials, et cetera.
02:15:03.000 And that's going to be the purpose of making the instrument.
02:15:06.000 That's how it will be a company.
02:15:08.000 But it will have value elsewhere.
02:15:11.000 So the reason that I got interested in it was frankly for looking at chromosomes.
02:15:15.000 But then I realized, oh, maybe it has interest.
02:15:18.000 Maybe it would be useful for these other things as well, which has kind of propelled my interest in it.
02:15:23.000 Well, Jacques Vallée is such a valuable researcher because he's so logical about the way he handles things and he doesn't jump to any conclusions.
02:15:33.000 And his descriptions of these materials and the origin of these materials is really compelling.
02:15:40.000 because it's just like, if that's not really possible to make in 1970, then someone help me out.
02:15:46.000 Yeah.
02:15:46.000 What is that?
02:15:47.000 And is it possible to make today?
02:15:47.000 Yeah.
02:15:48.000 And how much would it cost?
02:15:50.000 Right.
02:15:50.000 And where would you do it?
02:15:51.000 Well, that's why the magnesium ratio thing was, you know, when I first estimated it was like, this is millions and millions of dollars, and why would you leave it on a beach in the middle of Ubatuba, Brazil?
02:16:01.000 Right.
02:16:02.000 You know, it's just it just seems it seems unlikely.
02:16:07.000 Nothing's impossible.
02:16:08.000 But unlikely.
02:16:08.000 No.
02:16:10.000 Well, and then it's usually the chain of evidence.
02:16:13.000 It's, there's lots of materials that you might find that are unusual.
02:16:17.000 And believe me, I get rocks sent to me at my lab in the mail that people say, oh, this is unusual.
02:16:21.000 No, it's a rock.
02:16:23.000 Sorry, it's a rock.
02:16:26.000 But you know, I have not yet been given anything which I could definitely say, this is not something a human might have been able to make.
02:16:40.000 It might be difficult, but not impossible yet.
02:16:43.000 And so that's because the level of resolution required to claim something is impossible is something we actually don't even have yet.
02:16:52.000 Does that make sense?
02:16:53.000 Yes, that does make sense.
02:16:55.000 So that's what I'm, so my whole career has been inventing instruments that were, I felt, inevitable, but not yet possible.
02:17:06.000 But I could see a path to making them.
02:17:08.000 And so I said to most people, get out of my way.
02:17:10.000 I'm going to do this.
02:17:11.000 Because I know once I've got it, it will become valuable to everybody, which is, that's what made my career in immunology, making a succession of instruments like that and then making them available to the community.
02:17:22.000 So I think the next level is atomic.
02:17:25.000 Because we now know you can pick up and look at any of the major physics journals today.
02:17:32.000 Everything is all about these weird exotic particles that exist in metamaterials down at the atomic level with vague and strange capabilities that will change their utility either as superconductors, room temperature, or different kinds of electronic components that might be better, quantum computer circuits and qubits.
02:17:54.000 It's all down at that level.
02:17:56.000 But to do so requires a level of engineering that we don't, I mean, never mind reading what it is, putting it together in the first place is what's still required.
02:18:07.000 And so if we don't know how to put it together in the first place, then reading it and knowing that it can exist and then associating it with a function is the value that I'm looking to bring.
02:18:21.000 Well, this brings me to the idea of crash retrieval and the idea that these crash retrievals started a long time ago and that Roswell was just one of many.
02:18:34.000 There's another one that was near Roswell that apparently was even more significant but didn't get in the newspaper.
02:18:40.000 Trinity, are you talking about?
02:18:43.000 It was the one that Jacques was involved with studying.
02:18:47.000 I'm basing that off of Richard Dolan's book.
02:18:49.000 Okay.
02:18:50.000 But at the end of the day, the point being that if they did do that, if they really did back engineer something, and then they started these completely top secret scientific research projects where they were developing alloys that had never existed before with techniques that they had never really even considered because they got it all from some spaceship.
02:19:14.000 Well, that's where it's really crazy if you don't disclose this information.
02:19:19.000 Because you're basically putting a bottleneck on human evolution, human technological evolution.
02:19:26.000 our understanding of what's actually possible.
02:19:28.000 Right.
02:19:29.000 I agree.
02:19:30.000 And, you know, if you're going to excite the next generation of scientists in this country and you're going to bring economic prosperity to this country, then we should, I wouldn't say democratize it and put it all out on the internet.
02:19:46.000 I understand all the reasons why you might not need to.
02:19:50.000 you need to excite the populace.
02:19:52.000 I mean, my laboratory at Stanford for probably the last 10 years has been mostly scientists from overseas. Not because I don't want to take more Americans, but because Americans just don't go into the sciences anymore.
02:20:11.000 They don't study math.
02:20:13.000 You know, they are not encouraged to approach us, so we're importing a lot of our scientists from overseas.
02:20:20.000 Well, guess what?
02:20:21.000 A good third of them end up going back and bringing all the technology that they invented here back there and creating competitors.
02:20:27.000 Now, maybe that's good on a global scale, you know, but maybe it's not something that we want to encourage on a local scale if we want to maintain our technological superiority.
02:20:39.000 We're basically governed by lawyers.
02:20:41.000 China is governed by engineers.
02:20:44.000 You know, I mean, when you see the results in their drone technology and electric cars and the things that are coming out of China recently?
02:20:50.000 Their Politburo is almost entirely engineers and scientists.
02:20:56.000 Interesting.
02:20:56.000 There's a little article in the Atlantic recently about that.
02:20:56.000 Yeah.
02:20:59.000 That's a giant advantage.
02:21:00.000 Yeah.
02:21:01.000 So people who are making these decisions, we have lawyers looking for all the reasons why something should or shouldn't be done and the liabilities.
02:21:07.000 They're looking at things as to what's possible.
02:21:11.000 When you're looking at these UAP things that people bring you, is there one that stands out as being the most compelling to you?
02:21:19.000 One event?
02:21:22.000 Well, both the Council Bluffs and the Ubatuba events are interesting to me.
02:21:26.000 Because of the physical material.
02:21:28.000 Because of the physical material itself.
02:21:30.000 I mean, I'm at the end of the day a physicalist.
02:21:33.000 I mean, I don't like all the anecdotes.
02:21:36.000 I mean, a thousand anecdotes make a good story, good campfire.
02:21:41.000 I mean, I think there's statistical value in people seeing the same thing again and again, and there's a truth to it.
02:21:48.000 But as, you know, and I can believe anything I want around that, and many of the statements that I'm purported to have said are around my beliefs, as opposed to when I put on my scientist hat and I try to convince another scientist.
02:22:02.000 I can only provide this data and this evidence, and I don't have yet these materials.
02:22:07.000 Now, maybe they exist, and maybe people like David Grusch will be able to pry them out of the clammy hands of those who want to keep it where it is, but give me one piece of that, and I will do wonders with it.
02:22:25.000 Yeah.
02:22:25.000 I mean, that's why I'm so excited about the UAP Disclosure Act, if it ends up becoming law. Because otherwise we're taking money from one program to give to another.
02:22:53.000 Whether you're taking it from your taxes, you're taking it from veterans, you know, insurance, et cetera, it's a zero-sum game.
02:23:00.000 Whereas if you bring the investment community in, now you're bringing in people who are willing to take a chance and willing to take a risk, and you're not using the public's money anymore.
02:23:10.000 And that excites me. I mean, the reason why I wanted to go back to Stanford is because the entrepreneurial environment there, which is now actually almost homegrown here in Austin, is really what drives innovation.
02:23:25.000 And so I want to excite that kind of community.
02:23:29.000 And again, the SOLE Foundation is a place where we can bring people in, and we've got investors who show up now, who are talking to people about their ideas and what would we do with this.
02:23:38.000 And so it almost has now a self-propelling movement where I don't need to be standing on a wooden box somewhere in the middle of the park saying, you know, look at this, look at this.
02:23:55.000 People are just doing it now.
02:23:56.000 There's now a whole, almost a cottage industry of small groups or formalized groups who are doing this independently now.
02:24:07.000 So it's almost like it's inevitable.
02:24:10.000 So SkyWatcher, as an example, you probably know the SkyWatcher group.
02:24:14.000 Yeah, I've heard of it.
02:24:15.000 And Jake did.
02:24:16.000 And they just stopped operations, did something happen?
02:24:20.000 No, it's it's strange because people said, Oh, we stopped.
02:24:22.000 No, actually it had been determined from the beginning that we were going to go from January until July or August and collect data.
02:24:30.000 And now we're in the okay-what-does-the-data-mean phase,
02:24:34.000 where we're literally going through the data files and, as I said before, filtering the data, looking for the obvious mistakes, et cetera. So no, they've not stopped.
02:24:50.000 Yeah, there was something on Twitter about something about the equipment.
02:24:55.000 No.
02:24:55.000 I forget.
02:24:56.000 So James Fowler, one of the guys who brought a lot of his equipment and technology to us, decided that he wanted to basically go off and work in a DOD capacity as opposed to the research capacity.
02:25:12.000 He's still advising us.
02:25:13.000 I was just on a phone call, a Zoom call with him last week going over the data files.
02:25:18.000 So explain this SkyWatcher thing to people because it sounds insane.
02:25:23.000 Well, the idea behind it was that there might be ways to send a signal and get things to show up.
02:25:33.000 And James Fowler claimed that he had such a thing.
02:25:37.000 I was at one of the events where something showed up.
02:25:42.000 It was transient, momentary, but indisputable.
02:25:47.000 But it's just like, what did it look like?
02:25:49.000 It was just a silver ball moving quickly through several frames of the video, which wasn't fast enough, frankly, to pick it up.
02:25:59.000 We just saw it move.
02:26:00.000 It went that way.
02:26:01.000 And you didn't see it with your naked eye?
02:26:02.000 No, I didn't see it with my naked eye.
02:26:05.000 Which, of course, is a problem.
02:26:06.000 Do they sometimes see things with the naked eye?
02:26:08.000 One guy did, yeah.
02:26:09.000 One guy.
02:26:10.000 Oh, I mean... So are these things variable in their appearance?
02:26:13.000 I wish I had my... I don't have my phone here.
02:26:16.000 But we do have a picture of one next to the helicopter, about 60 meters away.
02:26:22.000 And it's just a kind of a fuzzy white blob against a blue sky.
02:26:28.000 But it was there.
02:26:29.000 You know, and it's not a cloud and it's not a balloon.
02:26:33.000 It's not discernible as anything obvious, but it was there and it happened during one of these events out in the middle of the desert.
02:26:44.000 And so the idea behind SkyWatcher is to see if there are ways to get them to show up and if so, in a reproducible manner and then have the right kind of simultaneous multisensor capabilities to measure it, meaning radar, IR, visual people on the ground.
02:27:07.000 What are they sending to get these things to go?
02:27:09.000 What signal?
02:27:12.000 That, unfortunately, he won't say. I don't know what it is.
02:27:16.000 He won't let everybody know what the bat signal is.
02:27:18.000 Well, I mean, you know, I mean, maybe, yeah, exactly.
02:27:21.000 I mean, it sounds kind of silly, but I mean, why would you put that out on the internet?
02:27:25.000 Because, you know, you might render it useless.
02:27:28.000 They're like, ugh, I don't have to show up.
02:27:31.000 Everybody's using it now.
02:27:32.000 Oh, so you think it's a trick?
02:27:34.000 Like, it tricks them to show up?
02:27:36.000 I don't know.
02:27:37.000 Don't you think they'd be smarter than that?
02:27:37.000 I really don't.
02:27:40.000 Well, that tells you something maybe about the level of smartness that might be incorporated into these, let's say, dumber machines.
02:27:47.000 Maybe, yeah.
02:27:48.000 Yeah, that was exactly my thought.
02:27:50.000 It's like, why would you show up when you know what it is, unless there's a reason you're basically trying to train the monkeys what to do?
02:28:02.000 Maybe you're tricking the monkeys to send the... I don't know.
02:28:04.000 But isn't there a group of people that just go out and they just use their mind, they meditate, and supposedly they have some success as well?
02:28:14.000 Yeah, there's the CE five groups that do that.
02:28:18.000 And I'm more than willing to believe that there are technologies capable of measuring thoughts at a distance, which might be some super advanced technology.
02:28:35.000 I don't believe you have to call it telepathy and magic.
02:28:39.000 I think that there's, you know, if such a thing happens that there's a technology that might be able to read at a distance.
02:28:45.000 Right.
02:28:46.000 Well, I don't have a problem with that.
02:28:48.000 I don't have a problem with that either.
02:28:49.000 I don't have a problem with the idea that consciousness is kind of vaguely and barely understood, and whatever our relationship to the universe itself and reality itself through consciousness is, it's not fully defined. And also, it might evolve just like all of our other intellectual capabilities.
02:29:14.000 Right.
02:29:15.000 Well, I mean, think of it this way.
02:29:17.000 You know, you and I are interacting with each other through quantum waves.
02:29:22.000 My meat brain sees you as an object, but yet everything that you are sits in quantum spacetime down at the Planck level, and you're not even mass.
02:29:30.000 You're just a series of, I mean, in some people's minds vibrating fields and objects.
02:29:35.000 And so we have sensors that see and hear each other and think about each other, but our consciousness somehow is embedded in spacetime.
02:29:43.000 And so who's to say that there's not signals passing to and from that are vaguely able to be picked up by our meat brains that we don't necessarily appreciate.
02:29:53.000 Right.
02:29:54.000 So that just because I can't think at you and you can't hear me doesn't mean that there aren't perhaps brain organizations of some people that are a little bit better at hearing the echo than others.
02:30:08.000 Well, this is also probably the reason why when you go to the woods and there's no cell phone signals, the world feels different.
02:30:14.000 Yeah.
02:30:15.000 Because you're probably experiencing a bunch of signals that your brain vaguely interacts with.
02:30:20.000 Right.
02:30:21.000 That, you know, might not even necessarily be good for you.
02:30:24.000 Right.
02:30:24.000 But they're out there and they're a part of the world that you live in.
02:30:28.000 And you just, you can't, you don't have a radio.
02:30:30.000 Right.
02:30:30.000 Right.
02:30:30.000 So you're not like tuning in to them.
02:30:32.000 You don't have a cell phone.
02:30:33.000 So you can't just like make calls with it.
02:30:35.000 But you're experiencing it.
02:30:36.000 Right.
02:30:37.000 Well, and, you know, our civilization is drowning us in constant noise.
02:30:43.000 Yeah.
02:30:43.000 And so maybe, you know, that drowns it out.
02:30:45.000 And that's why meditation is why people claim that they can interact with other things.
02:30:49.000 I don't know.
02:30:50.000 Yeah, I don't know either.
02:30:52.000 I saw an interview that you did where you were describing the sighting off the coast of San Diego in 2004, the Nimitz sighting, where you said that the amount of power... Why don't you describe it?
02:31:07.000 So the amount of power that that thing had to use to move the way it did.
02:31:12.000 Right.
02:31:12.000 So it's on radar.
02:31:13.000 It's on radar.
02:31:16.000 So these are actually calculations by Kevin Knuth, a physicist from the University at Albany, in a published paper.
02:31:24.000 Again, just speculation.
02:31:26.000 But what he basically said was how much power would it take to instantaneously accelerate from fifty feet over the ocean to fifty miles above the earth, whatever the number was, and instantaneously decelerate.
02:31:41.000 So it's not just the amount of power to lift something, it's the amount of power to accelerate and decelerate instantaneously.
02:31:48.000 And so you can make simple physical calculations of a one ton object, let's say, and it's more than the nuclear output of the United States for a year.
02:31:59.000 And yet these things seem capable of doing that at will.
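The flavor of that estimate can be reproduced with basic kinematics. A sketch with purely illustrative numbers (the mass, distance, and time here are invented, not taken from Knuth's paper), comparing the required average power against a rough figure for the entire US electrical grid:

```python
# Back-of-the-envelope: a 1,000 kg object covering ~80 km of
# altitude in ~1 second, starting and ending at rest-like speeds.
mass_kg = 1000.0
distance_m = 80_000.0     # roughly 50 miles
time_s = 1.0

# Treat it as constant acceleration from rest: d = a * t^2 / 2.
accel = 2 * distance_m / time_s**2        # m/s^2
v_final = accel * time_s                  # peak speed, m/s
kinetic_j = 0.5 * mass_kg * v_final**2    # kinetic energy, joules
power_w = kinetic_j / time_s              # average power, watts

# Rough average output of the whole US grid, order of magnitude only.
us_grid_w = 5e11   # ~500 GW

print(f"acceleration ≈ {accel / 9.81:,.0f} g")
print(f"required power ≈ {power_w:.2e} W, "
      f"about {power_w / us_grid_w:,.0f}x the US grid")
```

Note the comparison that matters here is power, not annual energy: the energy itself is modest by national standards, but delivering it in a fraction of a second implies a power level far beyond anything we can put in a vehicle.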
02:32:04.000 So where are they getting the energy from?
02:32:06.000 And I remember asking Hal a question like this years ago.
02:32:09.000 We were stepping into an elevator with Hal Puthoff, and we were talking about his ideas about how these things might move.
02:32:16.000 And I said, so they're cheating somehow, aren't they?
02:32:19.000 And his answer was, from our point of view, they're cheating.
02:32:22.000 From their point of view, they're just using the physics that we don't understand yet.
02:32:27.000 So where's the energy coming from?
02:32:29.000 What are they doing?
02:32:30.000 And so that might be, as a, for instance, a reason why you don't want everybody having access to it.
02:32:37.000 Because any one of those objects is worse than a thermonuclear bomb.
02:32:37.000 Yeah.
02:32:42.000 You shoot one of those things at a city and that's the end of the city.
02:32:46.000 And if anybody could do it, you know.
02:32:48.000 Well, maybe that's the step of human evolution, of the evolution of our society and civilization is that AI has to come into power before we have access to all this other stuff.
02:33:00.000 That we do need an AI government structure, that we no longer require military intervention and all the shit that is the bane of civilization today.
02:33:11.000 Because if you ask the average person today, do you envision a world where war doesn't exist?
02:33:18.000 Most people are saying no.
02:33:20.000 The vast majority, except for a few delusional hippies.
02:33:23.000 They're going to say no.
02:33:24.000 But if you ask them, okay, given this super intelligent AI takes over the world and proves to be benevolent and really just wants to accentuate the life of human beings on Earth and make it better for everybody, then yes.
02:33:41.000 Then 100% yes.
02:33:42.000 Why would it want war?
02:33:43.000 Right.
02:33:44.000 So maybe something like that has to take place before we get to a situation where, okay, this is how you really travel.
02:33:52.000 Right.
02:33:53.000 Right.
02:33:53.000 Okay, now that you're not going to war anymore, listen. But you can already imagine the negatives, where people will say, well, it's the apocalyptic nanny state, right?
02:34:10.000 Where AI just basically takes care of you and humans devolve into something, which is why I think we need a merger of human intellect with this, where it's a synergy as opposed to an either-or.
02:34:23.000 I don't want to be nanny stated either.
02:34:26.000 I want to use it to explore ideas or explore pleasure.
02:34:30.000 I mean, I'm fine if people want to be hedonistic and, you know, participate in virtual parties all day long, for all I care.
02:34:39.000 I don't care.
02:34:41.000 But I think giving people the option to do whatever it is that they want to do, it's the most, I don't know, what's the, it's the most liberal and conservative way of living because you're allowed to do what you want to do.
02:34:56.000 But we're not because we're living at the behest of so many other strictures.
02:35:01.000 Yeah.
02:35:01.000 Oh, yes.
02:35:03.000 Last question.
02:35:05.000 What's your take on the Bob Lazar story?
02:35:09.000 Elements of truth with a healthy dose of misinformation that perhaps he was provided.
02:35:23.000 I don't think that he's entirely lying.
02:35:28.000 He seems to know enough about things that the average person wouldn't know.
02:35:35.000 But I've heard from Eric Davis and others saying, he's a this, he's a that.
02:35:41.000 I don't know because, you know, it's like, that's why there are great people like Richard Dolan, who's a wonderful writer of the history of the area, or people like Robert Powell or Michael Swords, who write just the facts, not coming to too many conclusions.
02:36:02.000 I don't live in that world.
02:36:04.000 It's not my speciality.
02:36:06.000 My speciality is working with data and analyzing things and bringing rigorous science to it so that I can convince another scientist what is right or what is wrong.
02:36:16.000 Because I won't be happy.
02:36:18.000 I mean, I'm pretty sure of what I know, but I want to validate that to my colleagues, if only to be able to say I told you so.
02:36:30.000 Right?
02:36:30.000 There's a little bit of human pettiness in there.
02:36:34.000 A little bit of pettiness is great motivation.
02:36:36.000 Yeah.
02:36:37.000 But that's, I think, again, enabling people to live in a world like that where you can talk about these ideas without being ridiculed is really, I think, the objective of what science should be and what open-minded, non-theologically dogmatic approaches should be.
02:36:56.000 Accuse a scientist of being a priest, and that's the best way to really upset them.
02:37:03.000 But pointing out that what they're doing is mimicking dogma and priesthood is the only way to shame them into doing the right thing.
02:37:15.000 Does that make sense?
02:37:16.000 It does.
02:37:17.000 It does.
02:37:18.000 Well, listen, man, I'm glad we finally did this.
02:37:20.000 Yes.
02:37:21.000 Thank you.
02:37:21.000 Thank you so much for being here.
02:37:23.000 Thank you so much for all the research that you're currently involved in and all the stuff that you've done.
02:37:28.000 And it's been amazing talking to you.
02:37:29.000 Really appreciate it.
02:37:30.000 Thank you.
02:37:31.000 Thank you so much.
02:37:32.000 Okay.