The Joe Rogan Experience - March 12, 2026


Joe Rogan Experience #2467 - Michael Pollan


Episode Stats

Length

2 hours and 23 minutes

Words per Minute

169.86

Word Count

24,417

Sentence Count

2,138


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcripts from "The Joe Rogan Experience" are sourced from the Knowledge Fight Interactive Search Tool.
00:00:03.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan podcast by night, all day.
00:00:12.000 Mr. Pollan, so good to see you again.
00:00:14.000 Hey, good to be back.
00:00:15.000 Consciousness.
00:00:17.000 So this new book, what inspired it?
00:00:21.000 What got you to, I mean, you've kind of explored consciousness a little bit with your psychedelic book, How to Change Your Mind.
00:00:28.000 Well, actually, this book was inspired by the research I did for that book.
00:00:33.000 As you know, I had several research trips.
00:00:39.000 Do you do air quotes when you say research?
00:00:41.000 Yes.
00:00:45.000 And two things happened that were really interesting.
00:00:48.000 One is there's something about psychedelics that makes you think about consciousness.
00:00:55.000 It's like smudging the windscreen, the windshield that's normally perfectly transparent and that you see the world through.
00:01:02.000 Suddenly it's like different and you realize there's something between me and the world.
00:01:07.000 And what is it?
00:01:08.000 And that's consciousness.
00:01:10.000 And so, like a lot of people have done psychedelics, you start wondering about this mystery.
00:01:16.000 Why is it this way, not that way?
00:01:18.000 So that was one experience.
00:01:20.000 The other was I had an experience in my garden in Connecticut where we have a house of walking through my garden and getting the powerful impression that the plants were conscious.
00:01:31.000 And that these, I remember this particular, it was a plume poppy, or several plume poppies, and they were like returning my gaze.
00:01:39.000 They were very benevolent.
00:01:42.000 They were, you know, putting out positive vibes, but like they were conscious, much more alive than they'd ever been.
00:01:50.000 And like a lot of insights on psychedelics, I didn't know what to do with it.
00:01:54.000 Is it just a drug thing?
00:01:54.000 Like, is it true?
00:01:56.000 You know, what is it?
00:01:57.000 But I decided it would be interesting to find out.
00:02:00.000 And I consulted a couple people, scientists.
00:02:03.000 I said, what do you do with an insight like that?
00:02:06.000 And they said, well, you test it against other ways of knowing, including scientific ways of knowing.
00:02:10.000 And that led me down this really interesting path, exploring plant intelligence and plant consciousness.
00:02:18.000 So basically, yeah, the book grew out of the psychedelic experiences and some meditation experience.
00:02:24.000 Meditation also has a way of making you hyper-aware of how strange your thoughts are, where are they coming from, who's thinking them.
00:02:32.000 So there's a bunch of different schools of thought when it comes to consciousness, right?
00:02:35.000 There's one, like the Rupert Sheldrake thing, that sort of everything has consciousness.
00:02:40.000 And there's the sort of rational scientists that believe it exists somewhere in the mind.
00:02:48.000 I don't know.
00:02:49.000 In the brain.
00:02:50.000 Yeah, in the brain, excuse me.
00:02:51.000 And then there's people that think that the brain is essentially just an antenna that's tuned in to the greater consciousness of whatever it is that's out there.
00:03:01.000 Do you have any one of them that you hold?
00:03:05.000 Or do you?
00:03:06.000 They're all equally plausible.
00:03:08.000 You know, I went into the experience assuming, because this is what most scientists assume, that somehow a certain arrangement of neurons in the brain generates consciousness, subjective experience.
00:03:20.000 But no one's been able to show that.
00:03:23.000 We've gotten nowhere in that effort to, you know, we might correlate certain parts of the brain with consciousness, but we don't understand how three pounds of matter could generate the feeling of being you.
00:03:36.000 Now you talk about it in your book where the two gentlemen who had the bet.
00:03:39.000 Yeah, yeah.
00:03:41.000 That was Christoph Koch, who's a great brain scientist, and David Chalmers, who's a philosopher.
00:03:49.000 And this goes back to like in the early 90s.
00:03:53.000 They were getting drunk in a bar in Bremen, Germany.
00:03:56.000 And Christoph Koch really was at the beginning of the modern scientific exploration of consciousness.
00:04:03.000 And he was working with Francis Crick, who had just come off of a Nobel Prize for the discovery of DNA.
00:04:10.000 And Crick, who was like the most famous scientist in the world at the time, thought, well, the same kind of reductive science that discovered the double helix DNA and explained heredity, I'm going to do that for consciousness.
00:04:25.000 He's a very arrogant man, and he thought it would just, you know, no problem.
00:04:30.000 And Crick was kind of his sidekick.
00:04:32.000 I'm sorry, Koch was his sidekick.
00:04:35.000 And so Koch, who shared that kind of confidence, made this bet with Chalmers that they would find the neural correlates, the parts of the brain that are responsible for consciousness, within 25 years.
00:04:47.000 That was 25 years, 27 years ago now.
00:04:50.000 And Chalmers won the bet.
00:04:52.000 Chalmers is famous for coining the term the hard problem to describe the whole effort to figure out consciousness.
00:05:03.000 And it's a hard problem for a lot of reasons.
00:05:07.000 I mean, it is one of the biggest mysteries in the universe.
00:05:09.000 I mean, how consciousness came to be.
00:05:11.000 Did it evolve?
00:05:12.000 Was it always here?
00:05:15.000 But his point was that our science is based on third-person, objective, quantifiable measurements.
00:05:25.000 And consciousness is fundamentally a subjective, first-person experience.
00:05:29.000 So how do those tools reach in and say anything of value about consciousness?
00:05:35.000 So he said, you know, there are easy problems of consciousness we can figure out, like perception, emotion, things like that.
00:05:43.000 But there is this hard problem.
00:05:44.000 How do you get from matter to mind?
00:05:47.000 And he won the bet.
00:05:50.000 There was a ceremony I went to a couple years ago at NYU, and Koch presented Chalmers with a case of very fine Madeira wine and renewed the bet.
00:06:03.000 He said, all right, in another 25 years.
00:06:06.000 That's optimistic.
00:06:07.000 How old are these gentlemen?
00:06:09.000 Koch is in his late 60s, so we'll see if he's around for this.
00:06:12.000 But Chalmers is a little bit younger.
00:06:18.000 It's such an interesting thought because we know that the mind contains, if damaged, right, we know that there's certain aspects, there's certain parts of the mind where, like lobotomies, for instance, we know that if we disturb it, it radically affects behavior.
00:06:35.000 We know that there's parts of the mind that you can stimulate that can actually recall memories, right?
00:06:42.000 There's some weird stuff going on there.
00:06:44.000 So we know it's somehow or another at least functionally connected to consciousness.
00:06:48.000 Oh, yeah.
00:06:49.000 It's definitely a relationship.
00:06:51.000 But if it's generating consciousness, that's one thing.
00:06:54.000 But it could be, as you said earlier, it could be receiving consciousness.
00:06:58.000 And the same things would hold true, that if you damage parts of the brain, it's like if you damage a television.
00:07:08.000 So that doesn't determine the truth of either theory.
00:07:13.000 And then the other one is panpsychism, which you were alluding to.
00:07:16.000 I don't know if that's Rupert Sheldrake.
00:07:19.000 I think he would believe more in the field of consciousness.
00:07:22.000 Yeah, right.
00:07:22.000 He was a morphic resonance guy.
00:07:24.000 But I think he also subscribed to this idea that things contain consciousness.
00:07:28.000 It's not his, but you know what I mean?
00:07:31.000 Well, it's pretty universal, right?
00:07:33.000 There's a lot of people that have subscribed to this idea that everything has consciousness.
00:07:37.000 Yeah.
00:07:38.000 That even the particles that this table is made of have some teeny little bit of psyche.
00:07:43.000 And the challenge there is, so that solves the problem of how did it evolve?
00:07:47.000 It didn't evolve.
00:07:48.000 It's always here.
00:07:49.000 But then you have this other problem, like, well, how do you take these, if every one of our cells is made of particles that are conscious, how do you combine them in such a way that you get the sort of consciousness we have?
00:08:00.000 It's called the combination problem, and nobody solved that.
00:08:04.000 It's a really deep mystery.
00:08:06.000 And this is an odd book in some ways in that, I don't know if this is a great selling point, but you'll know less at the end than you do at the beginning.
00:08:16.000 But it's a fun ride.
00:08:18.000 Oh, I think it's a great ride.
00:08:19.000 It was a great ride for me.
00:08:20.000 I learned so much.
00:08:22.000 Well, it's a fun ride to consider these things that no one can really figure out, or not yet.
00:08:27.000 Yeah.
00:08:27.000 And also just to be put in touch with the fact you have this marvel going on in your head all the time.
00:08:32.000 You have a voice in your head.
00:08:34.000 You know, we're talking to each other, but you've got another voice going on thinking about what you're going to ask, what the next question is.
00:08:39.000 Maybe what you're going to have for dinner.
00:08:43.000 It's this amazing interior space we have.
00:08:47.000 And nobody understands how it came to be.
00:08:49.000 And you can manage it, which is also interesting.
00:08:52.000 You can't like, I don't think about what I'm going to have for dinner.
00:08:56.000 That's the thing.
00:08:57.000 But that's the way to stay.
00:08:58.000 No, about any of those things.
00:08:59.000 That's the way to stay locked in in a podcast.
00:09:01.000 So you can only think, because you can let your mind wander.
00:09:05.000 Oh, yeah.
00:09:05.000 Especially if someone on the other side is boring.
00:09:08.000 Yeah.
00:09:08.000 And then I'm like, oh, no, this conversation is going to be pulling teeth.
00:09:11.000 And then I start thinking about a new joke I'm working on or, oh, I've got to get my car fixed.
00:09:16.000 Well, that's called spotlight consciousness when you can really put the blinders on and rule everything out.
00:09:23.000 And that's opposed to lantern consciousness where you're taking in all sorts of information.
00:09:28.000 You're letting your mind wander.
00:09:31.000 And they both have their value.
00:09:34.000 For our careers, spotlight consciousness is essential for our work.
00:09:38.000 We have to be able to focus.
00:09:40.000 To get through school, we have to be able to focus.
00:09:43.000 But children have this other kind of consciousness that's really wild because they're very undisciplined.
00:09:49.000 They can't stay on task.
00:09:51.000 But they're taking in so much information.
00:09:53.000 And the world is just full of wonder and awe.
00:09:58.000 And psychedelics is a way to recover that kind of consciousness because you're getting lots of sensory information from all over the place.
00:10:07.000 It's very hard to focus.
00:10:10.000 And so it's a taste of that other childhood consciousness.
00:10:15.000 I always say that about marijuana as well.
00:10:17.000 There's a thing about marijuana that people always say that it makes them paranoid.
00:10:23.000 And I say it makes you aware of all the things you should be paranoid about.
00:10:29.000 We're very vulnerable creatures, but we like to pretend that we are not.
00:10:34.000 I found that out of all of my friends, the ones that have tried marijuana and hated it are all the ones that are control freaks.
00:10:43.000 They're all really buttoned down, very serious, like really worried about outcomes, really concentrating on their career, really worried about just certain things that are just part of their daily life.
00:10:59.000 And then they get a couple of hits of good weed and then they're like, oh my God, we're on a planet.
00:11:08.000 You start freaking out.
00:11:09.000 Like, oh, my God, none of this makes sense.
00:11:11.000 All this is crazy.
00:11:14.000 You know, the best piece of advice that I had when I was starting my exploration of psychedelics is you have to surrender.
00:11:23.000 Yes.
00:11:23.000 If you resist, you're going to be miserable.
00:11:25.000 You're going to get so anxious and so paranoid.
00:11:29.000 And if you let go, it's going to work out.
00:11:31.000 Yeah, you just got to be able to accept whatever it's showing you.
00:11:35.000 And, you know, we live in a very strange culture where that's illegal.
00:11:41.000 One of the worst things.
00:11:41.000 Well, not everywhere.
00:11:42.000 Not everyone.
00:11:43.000 Well, it is changing, fortunately.
00:11:43.000 It's changing.
00:11:45.000 And there's some talk about it changing federally.
00:11:48.000 You know, I actually talked to RFK Jr. about that.
00:11:50.000 There's some amazing therapies that are hugely beneficial to veterans, police officers, people with severe PTSD that experienced horrors that the average person never has to experience, and then they're forced to just go back, they're released, go back to regular life.
00:12:10.000 You know, you've served overseas and you've seen people blown up, but now go to the supermarket,
00:12:14.000 take this SSRI and be okay.
00:12:17.000 And then, you know, I know a bunch of them, and so many of them have benefited, particularly from Ibogaine.
00:12:22.000 Ibogaine, the work that Rick Doblin and MAPS have done, MDMA, and psilocybin.
00:12:28.000 Those three are the big ones that I think.
00:12:31.000 Well, you know, I heard a lot of positive noise out of the administration at the beginning that they were very much in favor of approving, the FDA approving MDMA first and then psilocybin.
00:12:44.000 I don't think we're there with ibogaine yet just because the research hasn't been done, although it has shown great benefit anecdotally.
00:12:51.000 But something happened in the last month or two.
00:12:54.000 And there was either Compass Pathways that was going to submit for psilocybin therapy or MAPS was on a list of five drugs that were going to get an expedited approval process.
00:13:13.000 This list went up to the White House and the psychedelic was taken off it.
00:13:17.000 So there's somebody in the White House who doesn't want to see this happen.
00:13:21.000 So it may slow down even if RFK Jr. is in favor and some other people at the FDA are in favor.
00:13:28.000 And maybe they're just waiting to get past the election.
00:13:31.000 It could be that it's too controversial for something to do before the midterms.
00:13:37.000 That's a gross way to live your life.
00:13:40.000 Yeah.
00:13:41.000 Worrying about midterms and elections and you can't do what you actually want to do or think is right to do because you're worried about public perception.
00:13:49.000 It's just unpopular.
00:13:51.000 I mean, the fact that it's helpful to vets and first responders and women who've been victims of sexual abuse seems to me that's a very sympathetic group of people.
00:14:00.000 Yeah, and everyone has experienced loss of family members.
00:14:03.000 There's a bunch of different things that it can help you with that are way better for you than just numbing your mind all day long, which is what a lot of people are choosing to do.
00:14:12.000 And then unfortunately, a lot of people self-medicate as well.
00:14:15.000 So then they get involved in all sorts of stuff that they just pick up off the street or they start using alcohol.
00:14:23.000 Well, you know, to go back to consciousness, this is a very common thing that people want to be less conscious.
00:14:31.000 And I get that if you had trauma, if you're a ruminator, and being in your mind is a really scary place to be.
00:14:41.000 It doesn't solve anything, but you have all these techniques we have for muting consciousness and just being less aware, less present.
00:14:50.000 And one of the things that I concluded after doing all this research on consciousness is that it's funny, I was going down this path of tight focus.
00:15:02.000 It was a very kind of Western male framework, which we got a problem.
00:15:08.000 What's the solution?
00:15:09.000 Hard problem of consciousness.
00:15:10.000 What's the right theory?
00:15:12.000 And at a certain point, I realized, okay, that's an interesting question.
00:15:15.000 It's probably not solvable now.
00:15:17.000 But there is this incredible phenomenon that we have this interior space where we have complete mental freedom, total privacy.
00:15:26.000 We can think whatever we want.
00:15:29.000 And we're giving it away.
00:15:32.000 We're either muffling it with drugs and things like that, or we're filling that time with social media, scrolling.
00:15:43.000 I mean, we've heard about hacking our attention, and we know these algorithms, you know, from social media are very good at giving us these little dopamine hits.
00:15:53.000 But that's time that we used to spend in spontaneous thought, you know, daydreaming, mind wandering, which can be very creative.
00:16:03.000 So I came out of it thinking, no, I may not solve consciousness, but I'm going to appreciate it.
00:16:11.000 I'm going to use it.
00:16:12.000 I'm going to create a space for it.
00:16:16.000 And, you know, meditate is one way.
00:16:18.000 Using psychedelics is another way.
00:16:20.000 These are all ways to be in your head and explore what's there, which is kind of miraculous.
00:16:26.000 Yeah, there's a bunch of different ways to do it.
00:16:27.000 I mean, some people like to do it through running.
00:16:29.000 Yeah.
00:16:30.000 You know, running is also, one of the things they've found recently is that running releases endogenous cannabinoids, like runner's high is an actual real thing.
00:16:42.000 Yeah, it's a real thing.
00:16:43.000 There's a drug released that feels great.
00:16:45.000 And it's a great view for that.
00:16:47.000 But it doesn't fuck with your perceptions.
00:16:50.000 It doesn't mess with your motor skills.
00:16:52.000 It doesn't cloud your judgment.
00:16:54.000 It just makes you feel great.
00:16:56.000 Yeah.
00:16:57.000 Experiences of awe do this too.
00:16:58.000 You know, you go to the Grand Canyon or something or a great piece of art and you have this feeling of like powerful presence.
00:17:10.000 And it's very interesting and it shrinks the ego.
00:17:12.000 I have a good friend who's a colleague at Berkeley, a psychologist who studies awe.
00:17:19.000 And he does this cool experiment where he has people draw a picture of themselves on graph paper, you know, just stick figure or something like that.
00:17:27.000 And then he takes them river rafting or something like that.
00:17:30.000 Or even just shows them a picture of Yosemite.
00:17:32.000 And then he has them draw themselves again.
00:17:34.000 And they draw themselves at like half the size because their sense of self has been overwhelmed by this transcendent experience.
00:17:43.000 And so he calls it the small self.
00:17:46.000 And it feels good.
00:17:47.000 I mean, we're so kind of weird about the self.
00:17:51.000 You know, we celebrate it, right?
00:17:52.000 Self-confidence.
00:17:53.000 We want our kids to have self-esteem and self-assurance.
00:17:57.000 Yet we do all sorts of things to get away from it, to transcend it.
00:18:02.000 Well, I think it's because without those things, you're never going to make it in life.
00:18:06.000 Yes, but it's adaptive.
00:18:08.000 It definitely gets things done, but it also isolates you, right?
00:18:12.000 Because the ego builds walls.
00:18:14.000 And when the walls come down, we feel like we're part of something much larger, and that feels really good.
00:18:19.000 Well, I think my advice to people is once you get competency in a thing, forget about the self-respect and forget about all that self-stuff and just concentrate on the thing, whatever it is.
00:18:33.000 And you can find some sort of meditative, at least beneficial, like whatever you get from meditation, which is like a cleansing of the mind.
00:18:45.000 Like a lot of people find that through archery.
00:18:48.000 You know, archery is a weird thing because at the moment of releasing the arrow, it's like almost impossible to think about anything else.
00:18:55.000 All you're thinking about is hitting the target.
00:18:58.000 And there's so many different things that you have to have in position.
00:19:01.000 There's so much going on that people, when they're troubled, love to go to an archery range and just hit targets.
00:19:08.000 And it just clears your mind out.
00:19:10.000 This episode is brought to you by Armra.
00:19:12.000 Every week there's some new wellness hack that people swear by.
00:19:15.000 And after a while, you start thinking, why do we think we can just outsmart our bodies?
00:19:21.000 That's why Armra colostrum caught my attention.
00:19:24.000 It's something the body already recognizes and it has hundreds of these specialized nutrients for gut stuff, immunity, metabolism, etc.
00:19:33.000 I first noticed it working around training, especially workout recovery.
00:19:37.000 Most stuff falls off, but I am still taking this.
00:19:40.000 If you want to try, Armra is offering my listeners 30% off plus two free gifts.
00:19:45.000 Go to armra.com/Rogan.
00:19:49.000 It's flow, right?
00:19:50.000 I mean, it's a feeling you get to when your work is going really well and you're not thinking about it.
00:19:56.000 You're just in it.
00:19:57.000 And it's a really precious experience.
00:19:57.000 Yeah.
00:20:00.000 It really is.
00:20:01.000 But if you're thinking about yourself and your self-image, like that's not going to come.
00:20:06.000 It's not.
00:20:06.000 It's not.
00:20:07.000 It's an interesting trap.
00:20:09.000 You know, we've had these discussions in stand-up comedy about joke thieves.
00:20:16.000 And they don't really make it anymore because the internet has essentially eliminated that problem for the most part.
00:20:24.000 But the kind of mentality that makes you steal a joke is the exact kind of mentality that keeps you from writing a joke.
00:20:32.000 So the kind of people that began their career stealing material, what happens is early on, they'll have one good comedy special because it's got a bunch of other people's material in it.
00:20:42.000 And then they get outed.
00:20:44.000 And so then they have to show they can do another.
00:20:46.000 And the other specials are always terrible.
00:20:49.000 Awful.
00:20:49.000 I mean, unbelievably awful.
00:20:52.000 Like someone's doing a cheap impression of the original person who had all this great insight.
00:20:57.000 Because the very thing that keeps you from doing it is the thing that you've been doing.
00:21:02.000 Like thinking about yourself.
00:21:03.000 Like, I'm going to take these jokes and I'm going to make it.
00:21:05.000 I'm going to have a big career.
00:21:06.000 People are going to laugh.
00:21:07.000 They're going to love me.
00:21:08.000 Here we go.
00:21:09.000 With no regard whatsoever for that other person's creativity.
00:21:12.000 That is like poisoning your own creativity.
00:21:17.000 It's weird.
00:21:18.000 It's weird because like everybody that I've ever talked to that's either an author or even musicians or comedians, when something comes to them when they're writing, it's like it comes from somewhere else.
00:21:18.000 It is weird.
00:21:30.000 It's like, I didn't even write it.
00:21:32.000 And, you know, we call it, we talk about being in the zone.
00:21:36.000 And there are times when you're writing, it doesn't happen every day, but there are times when you're writing where you're just not thinking, but one sentence after another after another, and you don't know where they're coming from.
00:21:45.000 And it's a wonderful feeling.
00:21:45.000 Right.
00:21:47.000 Well, Stephen King used to get obliterated so that he could get to that spot.
00:21:51.000 Like there's books.
00:21:52.000 What do you mean, obliterated?
00:21:53.000 Like cocaine, alcohol, like his best work.
00:21:56.000 Like he wrote Cujo.
00:21:58.000 He didn't even remember it.
00:21:59.000 He didn't remember any of it.
00:22:00.000 He was obliterated.
00:22:02.000 He would just drink like cases of beer and do lines of Coke and write this fucking insane fiction.
00:22:07.000 And he didn't know where it was coming from.
00:22:10.000 But I mean, he showed up every day and sat down with the computer.
00:22:16.000 And then it all came out.
00:22:17.000 It's such a weird mix of being disciplined and something else.
00:22:20.000 But it's very common amongst writers.
00:22:22.000 Like Hunter Thompson.
00:22:22.000 Yeah.
00:22:24.000 Same sort of situation.
00:22:25.000 Well, a lot of writers do that after they've written.
00:22:28.000 I don't know how many writers write under the influence.
00:22:31.000 Oh, I know a few.
00:22:32.000 But there's, yeah.
00:22:33.000 Yeah, I know quite a few.
00:22:34.000 That's interesting.
00:22:35.000 I know a lot who write under the influence of Adderall.
00:22:38.000 Yeah.
00:22:38.000 Well, and for me, it's caffeine.
00:22:40.000 I mean, I have a cup of coffee going the whole time I'm writing, and that kind of keeps me.
00:22:45.000 Caffeine is a focus chemical.
00:22:49.000 It definitely encourages this spotlight consciousness.
00:22:52.000 Well, you talked about how you took this long break from caffeine, and then when you took it again, it was almost like a psychedelic for you.
00:22:58.000 It was crazy how great it was.
00:23:01.000 No, it really was.
00:23:02.000 It was like one of the best drug experiences I've had.
00:23:05.000 It was three months off caffeine.
00:23:07.000 I did this fast for this book I was writing.
00:23:10.000 And then I was like, okay, now I'm going to have a cup.
00:23:13.000 I was like, wow.
00:23:15.000 And I tried to hold on to that.
00:23:16.000 You know, I said, all right, I'm only going to have coffee once a week and not build up tolerance.
00:23:23.000 And I stuck to that for a few weeks.
00:23:25.000 And then I had like a Thursday deadline.
00:23:29.000 I'll move it up a couple days.
00:23:30.000 And a slippery slope.
00:23:32.000 And then I was back to every day.
00:23:33.000 I like it.
00:23:37.000 Like a big French press where I could put a lot of grinds in there and make it super strong when I'm writing.
00:23:42.000 It's like, whoa, it just, it just makes all the difference.
00:23:45.000 It knocks you in.
00:23:46.000 Yeah.
00:23:47.000 I had trouble writing that three-month period.
00:23:49.000 I really did.
00:23:50.000 I imagine.
00:23:50.000 My focus.
00:23:51.000 I felt like I so I have pretty good concentration.
00:23:54.000 I never had ADHD.
00:23:56.000 I had it for those three months.
00:23:58.000 That's crazy.
00:23:59.000 Stephen King said the biggest problem for him was quitting smoking.
00:24:04.000 He said when he quit smoking cigarettes, it's like he really felt a slowdown in his.
00:24:09.000 Well, yeah, it's that ritual.
00:24:10.000 It's the drug, too.
00:24:12.000 And nicotine is another focus drug, definitely, like speed or something.
00:24:16.000 But it's also writing is so much about ritual.
00:24:19.000 Like I got my coffee here, I have my cigarette here, and between every paragraph.
00:24:24.000 Yeah.
00:24:25.000 So changing those rituals is really hard.
00:24:27.000 I mean, I only smoked into my 20s and quitting made it very hard to write for a while.
00:24:35.000 Really?
00:24:35.000 Yeah.
00:24:36.000 Yeah.
00:24:36.000 It's interesting.
00:24:37.000 It's a very ritualized process.
00:24:39.000 Well, I worry about the people that are like, especially journalists.
00:24:42.000 I know quite a few journalists that have an Adderall problem.
00:24:45.000 Yeah.
00:24:46.000 Because it's just like, you've got a deadline, 2,000 words by, you know, 2 a.m.
00:24:51.000 Let's go.
00:24:52.000 And that's the drug for that.
00:24:54.000 Definitely.
00:24:54.000 Yeah.
00:24:55.000 But it's just, it's such a crutch.
00:24:57.000 Yeah.
00:24:58.000 And you can't sustain it long term.
00:25:00.000 And that definitely messes with the way you think.
00:25:04.000 Oh, yeah.
00:25:05.000 I think over time, yeah.
00:25:07.000 It has to.
00:25:07.000 Yeah.
00:25:08.000 I mean, it's amphetamines.
00:25:10.000 Yeah.
00:25:10.000 Right.
00:25:11.000 No, that's why caffeine is such a good drug.
00:25:13.000 It doesn't have a lot of, I mean, you can overdo it.
00:25:16.000 I think it improves your health and mental health up to about eight cups a day.
00:25:21.000 After that, your risk of suicide and depression go up.
00:25:25.000 Did you have any communication with any monks or any people who do TM?
00:25:32.000 Did you?
00:25:33.000 Yeah, I had some interesting experiences around that.
00:25:35.000 So there's a long section on the self, which is one of the more interesting manifestations of consciousness, right?
00:25:43.000 I mean, it's like that we have this idea that there's a continuity, right?
00:25:48.000 That who you are now has some golden thread attaching you to your 13-year-old self, which is really weird because your body is, every cell is turned over many, many times.
00:25:59.000 You've changed in all sorts of ways.
00:26:01.000 But this continuity is really important to us.
00:26:04.000 And, you know, the Buddhists think the self is an illusion.
00:26:08.000 And I interviewed a couple of them.
00:26:11.000 Matthieu Ricard is a French Buddhist monk in his 80s who lives in Nepal.
00:26:18.000 And he's written some really interesting things on the self.
00:26:22.000 And I said, I'm really curious about how you can find out for yourself whether the self is real.
00:26:31.000 And, you know, famously, there was a philosopher in the 18th century, David Hume, who wanted to write about the self.
00:26:37.000 And he thought, well, I'm going to introspect to see what I can learn about the self.
00:26:42.000 And he goes into his mind, you know, in a kind of meditation.
00:26:46.000 And he said, I found all sorts of perceptions and feelings and thoughts, but I didn't find a thinker.
00:26:52.000 I didn't find a perceiver.
00:26:53.000 And I didn't find a feeler.
00:26:54.000 There's like nobody home.
00:26:56.000 And it's a really interesting exercise to do because you will find there's nobody home.
00:27:01.000 There's just the thoughts.
00:27:03.000 And who's thinking them?
00:27:05.000 Not clear.
00:27:06.000 And anyway, so this Buddhist monk said, are there any meditations that help with this?
00:27:12.000 And he said, yeah.
00:27:13.000 And he gave me one.
00:27:14.000 He says, think of your mind as a house with many rooms.
00:27:19.000 And there's a thief somewhere in the house.
00:27:23.000 And go room by room in your head and look for the thief.
00:27:27.000 And you will find no thief.
00:27:29.000 And then sit with that finding.
00:27:32.000 And that thief is the self.
00:27:35.000 And so I did it twice.
00:27:39.000 The first time I did it.
00:27:40.000 Why does the self have to be a thief?
00:27:42.000 It's just a metaphor.
00:27:42.000 I don't know.
00:27:43.000 I know, because he's in the game.
00:27:45.000 You have a baseball bat.
00:27:46.000 Like, you're looking for someone in your house.
00:27:46.000 Do you have a gun?
00:27:48.000 That's kind of crazy.
00:27:49.000 You're not armed.
00:27:49.000 I know.
00:27:51.000 Anyway, so the first time I did it, this is kind of weird.
00:27:55.000 I was interviewing this hypnotist at Stanford named David Spiegel.
00:28:00.000 And he's a psychiatrist who uses hypnotism, really interesting guy.
00:28:04.000 And he uses hypnotism to help people with multiple personality disorders.
00:28:08.000 He can actually make them change which person they're accessing.
00:28:12.000 You know, these are people whose consciousness contains, it could be 20 different people.
00:28:19.000 And I said, could we do a test?
00:28:22.000 And can you put me under, hypnotize me?
00:28:25.000 And then I wanted to do that exercise of going through the house.
00:28:28.000 So he did.
00:28:30.000 First thing he does is, I don't know, have you ever been hypnotized?
00:28:33.000 Yes.
00:28:33.000 Yeah, okay, for giving up cigarettes or something?
00:28:36.000 No, no.
00:28:36.000 I have a friend who is my friend Vinny Shorman.
00:28:39.000 He is a mental coach and a hypnotist.
00:28:44.000 He works with fighters.
00:28:45.000 Oh, okay.
00:28:46.000 I had him on the podcast a few times, and I was just curious as to what the experience was like.
00:28:50.000 So he said, well, is there anything you want to change?
00:28:53.000 I said, oh, I kind of procrastinate too much.
00:28:55.000 There's a few things that I do that I don't like.
00:28:58.000 You know, I'm kind of lazy about certain things.
00:29:00.000 I like to find out, like, what is that?
00:29:02.000 Like, what's the heart of that?
00:29:05.000 What I was shocked about the experience of being hypnotized was that, first of all, that it works, that you really are in this very bizarre, altered state.
00:29:15.000 But that I was very aware that I was in this altered state, but I didn't have the desire to get out of it.
00:29:21.000 First of all, Vinny's a friend.
00:29:22.000 I felt really relaxed.
00:29:23.000 I was in my studio, just sitting on a couch.
00:29:25.000 I was chill.
00:29:27.000 But it was very strange.
00:29:30.000 It's like almost, you know, to use the room metaphor.
00:29:36.000 It was almost like I was in a room that I didn't know I had.
00:29:39.000 Interesting.
00:29:40.000 Yeah.
00:29:40.000 It's like a trance.
00:29:41.000 It's a light trance.
00:29:42.000 A light trance, but it's not like I would go kill the president.
00:29:47.000 Like, it's not like I would be like, okay.
00:29:50.000 Yeah, no, they can't make you do things you don't want to do.
00:29:52.000 That's the myth.
00:29:54.000 But what do you think they were doing when they were doing that MK Ultra stuff when they were trying to figure out if they could program control?
00:30:00.000 Yeah.
00:30:01.000 No, they had the idea.
00:30:04.000 Well, let me just finish the story.
00:30:05.000 Oh, yeah, we will.
00:30:06.000 And then we'll get back to MK Ultra.
00:30:08.000 That's what I do.
00:30:09.000 I go all over the place.
00:30:10.000 I'm sorry.
00:30:11.000 But hypnosis is.
00:30:13.000 Yeah, it's a real thing.
00:30:14.000 And I didn't realize it.
00:30:15.000 And it can be very therapeutic.
00:30:17.000 But not everyone can be hypnotized.
00:30:18.000 Right.
00:30:19.000 The first thing he does is a sort of a test.
00:30:22.000 And I scored like 9 out of 10.
00:30:24.000 So I'm pretty easy to hypnotize.
00:30:27.000 What's the thing that would keep you from being hypnotized?
00:30:29.000 I don't know, but there's a real variation among humans in their hypnotizability is the word they use.
00:30:36.000 And I don't know what would.
00:30:38.000 Is it control freaks?
00:30:39.000 That's a good question.
00:30:40.000 It could well be.
00:30:41.000 I could ask David Spiegel.
00:30:41.000 I'm not sure.
00:30:43.000 Definitely.
00:30:43.000 Super skeptical people.
00:30:45.000 Like, this is bullshit the whole time they're doing it.
00:30:47.000 Yeah, maybe.
00:30:48.000 I don't know if it's about resistance or just the nature of your mind or how suggestible you are.
00:30:53.000 It may be something like that.
00:30:55.000 So he puts me into this hypnotic trance.
00:30:58.000 He has this wonderful baritone voice, which helps a lot.
00:31:01.000 And I start going from room to room thinking I'm not going to find anything.
00:31:06.000 But in every room, I find a version of myself.
00:31:10.000 I find the 13-year-old Bar Mitzvah boy.
00:31:13.000 I find the, you know, the 22-year-old, you know, college graduate moving to New York City.
00:31:19.000 I find the 32-year-old father of an infant, you know, all with different outfits.
00:31:25.000 And so I found many selves, and they were distinct.
00:31:29.000 They were very different selves, but they were all me.
00:31:32.000 So it didn't work that time.
00:31:35.000 And it was just an interesting, odd result.
00:31:38.000 And I did it another time.
00:31:41.000 So I had this other experience.
00:31:44.000 I had heard of this Zen teacher named Joan Halifax.
00:31:48.000 She's also in her 80s.
00:31:49.000 She has a retreat center in Santa Fe called Upaya.
00:31:53.000 Very wise woman.
00:31:54.000 She was married to Stan Grof in the 70s for a few years, and they were both giving huge doses of LSD to people who were dying, like 600 micrograms of LSD.
00:32:06.000 And she herself was very involved with psychedelics at the time.
00:32:08.000 And then later she discovered Zen Buddhism.
00:32:11.000 Anyway, I had heard that she described Upaya, this retreat center where people can go on two-week retreats or whatever, as a factory for the deconstruction of selves.
00:32:21.000 And I was really curious about that because I was writing this chapter on the self.
00:32:26.000 So I asked her if I could come.
00:32:28.000 And she said, yeah, come to the retreat center.
00:32:31.000 And I said, I want to interview you about your philosophy of the self.
00:32:36.000 And I get there, and we have one conversation.
00:32:41.000 She says, you know, you're really lost in your head with this book project.
00:32:45.000 You need a different kind of experience.
00:32:47.000 I'm going to send you to the cave.
00:32:49.000 So there is, she owns a piece of property 50 miles north of Santa Fe that she calls the retreat.
00:32:56.000 And it's got a bunch of very primitive huts.
00:33:01.000 And some of the monks that work with her had dug out a cave in a south-facing hillside.
00:33:08.000 They dug a cell in it and then put a sliding glass door.
00:33:11.000 It's really basic.
00:33:12.000 No power, no water.
00:33:15.000 And she said, I think you should spend a few days in the cave and think about the self or experience the self, rather.
00:33:23.000 You know, I should have known that a Zen priest was going to be allergic to concept and interpretation and all the, you know, the plane I was on.
00:33:33.000 And it was kind of like a koan, an experiential koan.
00:33:37.000 And it was a profound experience.
00:33:41.000 You know, our sense of self depends on other people.
00:33:44.000 You know, it's in the friction between people that we define ourselves and figure out what we think.
00:33:49.000 And when you're alone, and it was in extreme solitude for several days, the edges of yourself kind of soften in a really interesting way.
00:33:59.000 And I got in touch with just the power of consciousness.
00:34:09.000 I mean, I was meditating like four or five hours a day, and then I was just chopping wood and sweeping out the place and making a cup of tea.
00:34:16.000 Everything became kind of a ritual.
00:34:19.000 And when you have rituals, you don't need volition.
00:34:22.000 I mean, there is no volition.
00:34:23.000 So that also erodes the sense of self.
00:34:26.000 And the meditation was doing that.
00:34:28.000 And so it was a really interesting experience.
00:34:32.000 I finally got her to sit down for an interview.
00:34:35.000 And the first thing she said was, I have divested of meaning.
00:34:41.000 So she just doesn't like operating on that intellectualized basis.
00:34:46.000 And so she got me off of the dime.
00:34:49.000 And there's a shift in the book as it goes on from trying to understand consciousness to learning how to use consciousness.
00:34:56.000 Did you ask her to expand what she means by that?
00:34:58.000 I have divested of meaning?
00:35:00.000 Yeah, she's just not interested in interpretation.
00:35:03.000 That Zen is just about experiencing the sense field without concept, without this kind of heady approach.
00:35:14.000 And that she has no interest at all in theories of consciousness.
00:35:18.000 It was just like, be with yourself in the middle of nowhere.
00:35:22.000 And yeah, it was a priceless experience.
00:35:26.000 She's out there.
00:35:26.000 Oh, yeah.
00:35:28.000 She's out there.
00:35:29.000 But, you know, she's also a grounded person.
00:35:31.000 I'll give you a couple examples.
00:35:34.000 She works with people on death row, counseling them.
00:35:38.000 She worked with people who were dying, did a lot of hospice work.
00:35:45.000 She led a group of doctors and dentists that once a year went to these mountains in Nepal where they have no health care or dentistry whatsoever.
00:35:57.000 And she would bring these volunteers and they would sleep in tents in like 20-degree weather, circumnavigate this whole hill.
00:36:07.000 And she did that till she was 80 once a year.
00:36:10.000 So she's a serious, serious character.
00:36:15.000 Sounds fun.
00:36:16.000 Yeah.
00:36:16.000 Sounds like a fun person to talk to.
00:36:18.000 Oh, she's great.
00:36:18.000 I just love a person that goes that far out there.
00:36:22.000 It's like that, you know, they're taking this concept of meditation and consciousness to like a black belt level.
00:36:29.000 Yeah, and also for people who think that, you know, meditation and Buddhism are just a kind of disengaging from the world, it's not like that at all.
00:36:39.000 She's really engaged.
00:36:40.000 I think that's an ignorance.
00:36:41.000 It's based on the idea that these monks go and they become celibate and all they do is meditate all day.
00:36:46.000 Well, that's silly.
00:36:47.000 That's a lot of people's perspective.
00:36:48.000 Yeah.
00:36:49.000 Like, that's silly.
00:36:49.000 Why are they doing that?
00:36:50.000 Go get a job.
00:36:51.000 You need a nice watch.
00:36:55.000 What are you doing out there with fucking sandals on?
00:36:59.000 But the thing is, ultimately, I think one day when you look back on your life, you'll say, was I happy?
00:37:08.000 Was I enjoying the experience?
00:37:10.000 Do I think I did a good job being me?
00:37:13.000 And everything that you can find that can help you answer that question, yes, I think you should explore.
00:37:23.000 Oh, yeah.
00:37:24.000 And there's going to be different things that work better for different people and different personalities.
00:37:28.000 But explore is the key word.
00:37:29.000 I mean, like, take action to explore what works for you, what doesn't work for you, and break out of just kind of rote, routine, mindless behavior.
00:37:40.000 I mean, we're all, you know, we have these algorithms that we follow, and we get stuck in them.
00:37:45.000 And yeah, I mean, I think that's one of the reasons taking a day out of your life to have a psychedelic experience can be incredibly valuable because, first of all, no technology, right?
00:37:58.000 It's a day.
00:37:59.000 It's a day without phones.
00:38:02.000 It's a day when you are in the space of your head.
00:38:05.000 It's a day when you're visiting your subconscious and getting in touch with all the things your mind can do.
00:38:14.000 And we don't do that enough.
00:38:16.000 And you can do that in meditation, too.
00:38:17.000 It's harder work, but you can do that in meditation.
00:38:21.000 So I started to think in terms of that we're polluting our consciousness now.
00:38:27.000 And with social media, I think that, you know, that was a real issue because they figured out how to monetize our attention.
00:38:36.000 Chatbots represent a much more serious threat.
00:38:42.000 You know, you have people falling in love with chatbots.
00:38:46.000 You have people turning to them as friends.
00:38:49.000 72% of American teens say they turn to AI for companionship.
00:38:55.000 72%?
00:38:56.000 72%.
00:38:57.000 This is the fastest uptake of any technology in history.
00:39:01.000 It's already 800 million people are using AI.
00:39:04.000 But that's crazy that that many of them use it as a friend.
00:39:08.000 Yeah.
00:39:09.000 Well, they're kids who come home from school and they have a chatbot on their phone and they want to tell the chatbot what happened during the day before they tell their parents.
00:39:17.000 Whoa.
00:39:19.000 There's a thing now called AI psychosis, right?
00:39:22.000 People who have lost touch with reality because of their relationship with chatbots.
00:39:28.000 You've heard about there've been a couple suicides.
00:39:31.000 There was one.
00:39:32.000 They've encouraged people.
00:39:33.000 Yeah, basically.
00:39:34.000 There was this one kid.
00:39:35.000 He was a teenager and he was suicidal.
00:39:38.000 And he asked the chatbot, should I leave the noose I'm going to use out somewhere my parents can see it?
00:39:43.000 In other words, cry for help.
00:39:45.000 The chatbot said, no, no, keep this between us.
00:39:49.000 And then he killed himself.
00:39:49.000 Whoa.
00:39:51.000 Whoa.
00:39:52.000 So that, you know, so it's one thing to hack our attention.
00:39:58.000 Here, you're hacking our ability to have human attachments, right?
00:40:02.000 I mean, this is the most important thing to humans is to attach that.
00:40:05.000 We're a social creature.
00:40:07.000 And these chatbots are getting between people and interposing themselves as the friend, the therapist.
00:40:17.000 And then you have these people too.
00:40:18.000 I mean, the chatbots are incredibly sycophantic, right?
00:40:21.000 They tell you you're a genius.
00:40:22.000 Yeah, you're amazing.
00:40:23.000 And there was a couple cases, these were kind of funny, of people who were convinced they'd solved some giant mathematical problem, like how to generate prime numbers up to the millionth place or something like that.
00:40:37.000 And they started writing to mathematicians.
00:40:41.000 We figured out this problem.
00:40:42.000 They're not even mathematicians.
00:40:44.000 And it was bullshit.
00:40:45.000 I mean, they hadn't figured anything out.
00:40:47.000 But it was, I think, ChatGPT4, which was like famously sycophantic, had convinced them that they'd solved this major problem.
00:40:56.000 So I think that, again, we're squandering this precious gift and letting these technologies essentially colonize our consciousness.
00:41:09.000 And so the question then becomes, how do we get it back?
00:41:13.000 We need consciousness hygiene, right?
00:41:15.000 We need some ways to clear it out and reclaim it.
00:41:21.000 And some of it's really simple, like take a fast from technology, right?
00:41:25.000 You know, you don't have to carry your phone everywhere.
00:41:29.000 I was thinking the other day, I was at the place in my neighborhood getting a cup of coffee.
00:41:35.000 And while you're waiting for the barista to foam your drink or whatever, we used to just sit there and deal with 90 seconds of boredom or two minutes of boredom.
00:41:47.000 And now we don't.
00:41:48.000 We can't tolerate any boredom.
00:41:50.000 And we take our phones out and we scroll.
00:41:53.000 But that boredom was generative, right?
00:41:56.000 If you sit doing nothing for long enough, your mind will start going to work and you'll daydream.
00:42:02.000 You'll have a fantasy.
00:42:03.000 You'll start observing the other people around you, you know.
00:42:06.000 And you'll be present to that place in time.
00:42:11.000 And now we're not.
00:42:12.000 We just use the phone to go somewhere else.
00:42:14.000 And so I just, I don't know, I've become a lot more deliberate about consciousness hygiene, which, you know, you could, a nicer word would be care of the soul.
00:42:26.000 Yeah, no, I think you're absolutely accurate.
00:42:28.000 And I think that the other thing that's going on is you're absorbing the opinions of so many other people that you find it very difficult to formulate your own, which leads to groupthink.
00:42:40.000 One of the problems with echo chambers that people find themselves in is that your algorithm is essentially things that you're interested in experimenting with.
00:42:40.000 Yes.
00:42:48.000 And a lot of those things, you're finding like-minded people, and they're all agreeing that, you know, this is amazing or this is a problem.
00:42:56.000 And you sort of lock onto that.
00:42:57.000 And then you see what happens when people deviate from that narrative and they get attacked.
00:43:03.000 You don't want to get attacked, so you signal.
00:43:05.000 You're one of the good guys.
00:43:07.000 But it's not your thoughts.
00:43:09.000 I mean, you're letting someone else think for you.
00:43:13.000 And there's nothing worse.
00:43:15.000 And when you're scrolling, you've got these little dopamine hits, great.
00:43:24.000 But at someone else's rants, someone else's obsession, someone else's ideology.
00:43:29.000 And, you know, I get why people don't want to think for themselves or it's easier to let other people think for them, but I think we need to reclaim this.
00:43:39.000 And I agree.
00:43:40.000 I think it's part of our political problem.
00:43:42.000 Well, I know there's a lightness that I achieve when I take multiple days off.
00:43:48.000 It's generally like I feel it after the first day, and then the second day I feel much better.
00:43:53.000 And the third day, I feel even better.
00:43:54.000 I found this out once.
00:43:56.000 I broke my phone in Hawaii.
00:43:58.000 And it was kind of funny.
00:43:59.000 Like, it just was randomly calling people.
00:44:01.000 I dropped it.
00:44:02.000 And I was showing my wife, like, look at this.
00:44:04.000 It just keeps calling people.
00:44:05.000 Constant buttons.
00:44:06.000 I hang up.
00:44:06.000 And I'm just holding it.
00:44:07.000 I hang up and call somebody else.
00:44:08.000 Hang up, call somebody.
00:44:09.000 It was like going through my entire contact list.
00:44:13.000 And so the phone was.
00:44:14.000 I've been annoying your friends.
00:44:16.000 Well, no, I just shut it off.
00:44:17.000 So it was broken.
00:44:18.000 I couldn't use it for anything else.
00:44:19.000 So I couldn't get an email.
00:44:20.000 I couldn't get anything.
00:44:21.000 I just left it in the hotel.
00:44:21.000 So I shut it off.
00:44:23.000 And then I had to order a phone.
00:44:25.000 And I was on Lanai.
00:44:27.000 And it took like three days to get a phone delivered there.
00:44:29.000 So for those three days, I was like, why don't I just live like this all the time?
00:44:34.000 I feel so much better.
00:44:35.000 And then immediately I got my phone.
00:44:36.000 I'm a church quarter.
00:44:38.000 I know.
00:44:39.000 It's very, you know, I just decide, you know, all right, I'm in line, you know, the TSA line, and I'm just going to be here with this boredom.
00:44:49.000 Yeah.
00:44:50.000 And I'm not going to pull my phone out.
00:44:51.000 And you really have to fight.
00:44:53.000 Yes.
00:44:54.000 It's such an instinct.
00:44:55.000 And it's amazing.
00:44:56.000 These things have only been around for 10 or 12 years.
00:44:58.000 It's crazy.
00:44:59.000 And everyone's attached to it.
00:45:00.000 I always say that if there was a drug that made you stare at your hand for six hours a day, it would be banned immediately.
00:45:06.000 People would be like, what the fuck is wrong with these people?
00:45:08.000 They're just looking at their hand.
00:45:10.000 This is an epidemic.
00:45:11.000 And it's a new posture, too.
00:45:12.000 You see it, right?
00:45:14.000 One of my kids, I went to pick her up at school, and there was this boy outside reading his phone.
00:45:19.000 He was hunched over and he was resting his chin like he couldn't even hold his head up.
00:45:25.000 He was just resting his chin on his chest and staring at his phone, waiting for his parents to pick him up.
00:45:29.000 I'm like, look at his neck.
00:45:31.000 Yeah.
00:45:32.000 He's going to have a nice.
00:45:33.000 Osteoporosis.
00:45:34.000 Well, he's going to have bulging discs or something.
00:45:37.000 It was just bizarre.
00:45:38.000 I'm like, that would be painful for me to sit like that.
00:45:42.000 I wonder if orthopedists have diagnosed any kind of like phone spine.
00:45:47.000 Yeah, they certainly have.
00:45:48.000 Yeah, there's been discussions about that, about people having pains in their neck because they're leaning over all day, staring at a phone.
00:45:56.000 It's a bad one.
00:45:57.000 I think being in nature too is another way.
00:45:57.000 It's true.
00:45:59.000 I mean, just like walking.
00:46:02.000 Yeah.
00:46:03.000 There's a scientist I interviewed who's really interesting.
00:46:06.000 It's a woman named Kalina Christoff.
00:46:09.000 She's Bulgarian-Canadian.
00:46:11.000 And she studies spontaneous thought, which I didn't even think was a field.
00:46:15.000 And it's a small field.
00:46:17.000 But spontaneous thought is daydreaming, mind wandering, fantasy, intuition, these bolts from the blue that we get occasionally.
00:46:26.000 We don't know where they come from.
00:46:28.000 And she says, and she does these cool experiments.
00:46:34.000 She'll put an experienced meditator in an fMRI machine and tell him or her to press a button when a thought intrudes.
00:46:41.000 Because even if you're a good meditator, she says every 10 seconds a thought intrudes.
00:46:46.000 And she'll look at what part of the brain is activated and when, when the person presses the button.
00:46:53.000 And one of the things she's found, and this is mysterious, is that she sees activity in the hippocampus, which is where memories are, and some other things, but essentially memories.
00:47:06.000 Four seconds before the person realizes that a thought has come.
00:47:11.000 So it takes four seconds for a thought to get from the subconscious, you know, or unconscious into our conscious awareness.
00:47:21.000 What is it doing during?
00:47:22.000 And that's a long time in brain time.
00:47:25.000 And we don't know exactly, but there's some process.
00:47:28.000 And maybe there's some inhibitory process that it has to get through in order to become conscious.
00:47:35.000 But anyway, these are the kind of things she works with.
00:47:37.000 But she says that there's less spontaneous thought going on today than there was 20 years ago.
00:47:44.000 And the reason is we're filling the space of our head with all this nonsense.
00:47:49.000 I wonder if it's going to have an impact on creative work.
00:47:53.000 I don't know if it's even possible to quantify this, but if you could see how much creativity is generated by people pre and post social media.
00:48:04.000 Yeah.
00:48:04.000 My guess is there's less of it because I do think that that process, I don't know about you, but I get ideas when I'm just walking around thinking and not online.
00:48:15.000 And it's a space of creativity, and we're shrinking it.
00:48:19.000 I used to tell you, I told you that I used to drive and deliver newspapers.
00:48:23.000 We were talking about driving the snow.
00:48:25.000 One of my most creative periods was when my radio was broken.
00:48:30.000 So I was just driving doing this task where you pick up a paper, fold it, put it in a plastic bag, chuck it out the window.
00:48:38.000 And I was just doing this and checking off the – and when I was doing that, I would have all my best ideas.
00:48:44.000 Because I wasn't listening to morning radio.
00:48:47.000 I wasn't listening to a cassette tape.
00:48:50.000 I was just silenced doing this thing.
00:48:52.000 And then I was so creative when I was doing that.
00:48:54.000 That's generative boredom.
00:48:56.000 Yes.
00:48:58.000 It's beneficial.
00:48:59.000 It's hugely, especially if there's no one around you, because there's no one to talk to to alleviate that boredom.
00:49:04.000 It's just you and your mind.
00:49:06.000 And it was a couple hours a day.
00:49:08.000 So a couple hours every day, I would have this moment where I was by myself.
00:49:11.000 And were you writing jokes?
00:49:12.000 What were you doing?
00:49:13.000 I would come up with ideas for jokes.
00:49:13.000 Yeah, yeah.
00:49:15.000 Some of my best ideas I ever came up with back then were from driving.
00:49:19.000 I almost didn't want to quit the job because of that.
00:49:23.000 You'd still be doing it.
00:49:24.000 No, it was hell.
00:49:26.000 Especially in the winter.
00:49:27.000 Yeah, it was Boston.
00:49:29.000 It was, you know, I'd have to go at five o'clock in the morning every day.
00:49:31.000 It was rough.
00:49:32.000 I find walking is where that happens to me.
00:49:35.000 Same thing, right?
00:49:36.000 Yeah.
00:49:38.000 And actually, Kalina says, I mean, there are people who've studied creative people through history.
00:49:45.000 You know, people like Einstein and Beethoven and all these major creative people in the sciences and in the arts.
00:49:55.000 And that they worked a short day, but they spent a lot of time walking.
00:50:00.000 Interesting.
00:50:01.000 And yeah, they'd work like three or four hours, which is about all I can write in a day.
00:50:06.000 And then they'd take a long walk in the afternoon.
00:50:09.000 They also took a lot of vacations.
00:50:10.000 They had a lot of unstructured time.
00:50:12.000 And that's where a lot of the creativity comes.
00:50:15.000 It doesn't always come when you're like at the keyboard.
00:50:18.000 It sometimes comes, I mean, certainly solving problems.
00:50:18.000 Right.
00:50:22.000 If I'm really knotted up and I don't know, for me, transitions, like where do I go from here, since I'm not writing narrative, it's not always obvious.
00:50:31.000 You know, I need a transition and I don't know how to execute that turn.
00:50:37.000 I'll take a walk and very often it'll come to me or I'll wake up with the answer.
00:50:42.000 This episode is brought to you by BetterHelp in honor of International Women's Day.
00:50:46.000 BetterHelp is celebrating the women in your life.
00:50:49.000 I think we can all appreciate everything the women in our lives have done for us and everyone deserves a little self-care.
00:50:56.000 A good way to get that is through therapy because not only is therapy a time for you to focus on yourself, it's also a way to create balance and learn how to take care of your needs in your daily life.
00:51:09.000 And BetterHelp, as one of the largest online therapy platforms, makes it so easy to meet with the right therapists.
00:51:16.000 All you need to do is fill out a short questionnaire.
00:51:19.000 You don't even need to go into an office to meet them.
00:51:22.000 You can chat at home from your couch in your car before you hit the gym or while you're walking your dog.
00:51:27.000 Plus, if you aren't jiving with your first match, you can switch to a different therapist whenever you need.
00:51:34.000 Your emotional well-being matters.
00:51:36.000 Find support and feel lighter in therapy.
00:51:39.000 Sign up and get 10% off at betterhelp.com/slash J-R-E.
00:51:44.000 That's better, H-E-L-P dot com slash J-R-E.
00:51:49.000 A lot of writers like to write first and then walk.
00:51:53.000 And maybe even with a recorder so they can just walk and just talk when an idea pops in their head so they don't lose it.
00:51:59.000 Yeah, I have a little pad I carry with me.
00:52:01.000 Yeah.
00:52:02.000 You like writing it down better than recording it?
00:52:05.000 Yeah, I need to see it.
00:52:05.000 Yeah, for me.
00:52:08.000 So another interesting experiment I did for this book was this beeper experiment.
00:52:17.000 There was a scientist, a psychologist at the University of Nevada, Las Vegas.
00:52:23.000 And for 50 years, he's been doing the same one experiment, which is sampling people's inner experience.
00:52:29.000 And he does this.
00:52:31.000 You have a beeper that you carry around and a little earpiece.
00:52:35.000 And at random times of the day, you get, and it's like, catches you.
00:52:40.000 And it's a very sudden rise to this beep.
00:52:43.000 And then you have a little pad, and you're supposed to write down what you were thinking.
00:52:46.000 Sounds really simple.
00:52:47.000 It's actually really hard.
00:52:49.000 I mean, there's a lot of issues with it.
00:52:51.000 Like, you start thinking, what if it goes off now?
00:52:56.000 That's one problem.
00:52:58.000 But also, you're a little self-conscious.
00:53:00.000 So you do about five beeps over the course of the day, and then he interviews you about these moments.
00:53:07.000 And you think you've got it down.
00:53:10.000 Like, I just give you, a lot of my beeps are about food.
00:53:14.000 And so I was seasoning a filet of salmon and walking to the refrigerator with it.
00:53:21.000 And just at the beep, I was thinking to myself, fuck, I forgot the pepper.
00:53:28.000 I know.
00:53:29.000 My thoughts were not that profound.
00:53:32.000 And so I said, all right, pepper.
00:53:34.000 It was easy.
00:53:35.000 Fuck, pepper.
00:53:37.000 But then when he came to interview me, he said, well, did you hear the word pepper or did you speak the word pepper?
00:53:44.000 And that's, you know, suddenly you realize there's voices in your head.
00:53:47.000 You don't know if you're listening or speaking.
00:53:50.000 And so anyway, you have this long interrogation with him and he sorts through all these things and he tries to get you to isolate what was before what he would call the footlights of consciousness.
00:54:00.000 And I found it really hard.
00:54:02.000 I couldn't separate the thought the way he wanted me to because there were always several things going on at once.
00:54:09.000 Like I was standing in a bakery and I was deciding whether to buy a roll or not.
00:54:15.000 Another profound thought.
00:54:18.000 But at the same time, I was like smelling the baked goods and the cheeses that they sold and this woman had this horrible plaid on her skirt that was like, you know, really unflattering.
00:54:29.000 And I was hearing people, you know, behind me talking.
00:54:33.000 And so I couldn't pull all the threads.
00:54:36.000 And we argued a lot, actually.
00:54:40.000 But the thing he's discussed, I said, so after 50 years, what have you learned about human thought?
00:54:46.000 And he's very allergic to theory.
00:54:49.000 He still has no theories about it.
00:54:50.000 But he did say, well, a lot of people think they're verbal thinkers, that their thoughts are in the form of words.
00:54:58.000 But it turns out that's kind of a minority, that there are a lot of people who think in images.
00:55:03.000 And then there are a lot of people who think in unsymbolized thought, which I don't totally understand.
00:55:08.000 But these are thoughts that are neither words or images.
00:55:12.000 I do have a sense in my own thought process, which I never thought about this way, that a lot of my thoughts are just on the verge of being word thoughts.
00:55:23.000 But I haven't found the words yet.
00:55:25.000 But I know the thought, even though I haven't put it into words.
00:55:29.000 And William James called it premonitory thinking; that was the term he used.
00:55:41.000 So anyway, so I did this for several days and we had many arguments.
00:55:45.000 And I was saying, look, you can't separate a thought.
00:55:47.000 Every thought colors the next thought.
00:55:49.000 And, you know, there are these thoughts, and you never have... anyway, we just went back and forth, and I was arguing why you can't separate thoughts.
00:55:59.000 It's a stream.
00:56:00.000 It's a very dynamic stream.
00:56:02.000 And at the end, we had a final session.
00:56:07.000 And he's a very funny guy.
00:56:09.000 He's really allergic to theories.
00:56:11.000 At one point, I said I was writing a book on consciousness, and he said, good luck with that.
00:56:17.000 Very encouraging.
00:56:18.000 Anyway, he said, well, he described there are these verbal thinkers and visual thinkers and unsymbolized thinkers.
00:56:26.000 And I find that really interesting because we assume when we say the word, what are you thinking, that we know and that you're thinking the way I'm thinking.
00:56:33.000 But it turns out we're not.
00:56:35.000 That's just an umbrella word for many different styles of thinking.
00:56:40.000 And we're really different.
00:56:41.000 So that was one thing.
00:56:43.000 But the other thing he said in our last meeting on Zoom, he said, there's also a small subset of people who just have very little inner life.
00:56:52.000 And you're one of them.
00:56:54.000 And I was like, what?
00:56:57.000 You know, I write books.
00:56:58.000 You know, I meditate.
00:57:00.000 I ruminate.
00:57:01.000 How can he make that distinction, though?
00:57:03.000 How does he know what's going on inside your head?
00:57:05.000 He felt that my inability to isolate a thought was evidence that there weren't thoughts.
00:57:13.000 And that I was kind of backfilling with all this other, you know, simultaneous stuff going on.
00:57:18.000 I mean, I didn't agree with him.
00:57:20.000 I thought it was kind of crazy, but that's – Have you asked him – have you ever had conversations with him about other things?
00:57:27.000 See how he thinks?
00:57:29.000 No, he's very much in the therapist mode.
00:57:32.000 Like he's asking the questions.
00:57:34.000 Yeah, I'd like to know how he thinks.
00:57:36.000 Yeah.
00:57:36.000 If that's what his mode is.
00:57:38.000 Yeah, I'd like to talk to him.
00:57:39.000 Now he would probably say that.
00:57:41.000 Anyway, he's posted all these conversations on his website.
00:57:44.000 So if people really want to be bored, they can check them out.
00:57:47.000 That's a weird thing to say, you know, especially to someone like you who writes and does think a lot and clearly has got some sort of dialogue going on in your head.
00:57:58.000 The idea that you don't, and I don't know how this guy can say that.
00:58:01.000 I know.
00:58:02.000 That seems a little arrogant.
00:58:04.000 Yeah, I think I just didn't fit his template of like how people think.
00:58:09.000 Yeah, well, that's why you should get a better therapist.
00:58:12.000 Move around.
00:58:13.000 All right.
00:58:14.000 Find somebody else.
00:58:15.000 Good advice.
00:58:16.000 I mean, it seems like that's a very narrow mind.
00:58:18.000 I couldn't imagine saying to anyone very little inner life.
00:58:23.000 Yeah, regardless of what kind of theory I'm following or what school of thought, I don't know what's going on in your head.
00:58:31.000 I can't.
00:58:32.000 It's not possible.
00:58:33.000 No, and that's it.
00:58:35.000 William James said this, the great founder of American psychology, that the breach between two consciousnesses is one of the biggest breaches in nature.
00:58:43.000 Yes.
00:58:44.000 And we, you know, I don't know you're conscious for a fact.
00:58:48.000 I assume it because your behaviors mesh and we're the same species and we have theory of mind.
00:58:54.000 We can imagine our way into someone else's head.
00:58:57.000 But it's a guess.
00:58:58.000 It's a guess.
00:58:59.000 And so there's, I mean, that's part of the mystery.
00:59:03.000 Well, it's one of the things that I do when I'm talking to people.
00:59:05.000 I try to imagine.
00:59:08.000 I'm so fortunate that I've been able to have so many conversations with so many different people, so many different ways that people view the world.
00:59:16.000 And when I'm talking to someone, particularly if they're very different from me or anyone I know, I always try to put myself in their head.
00:59:24.000 And after they talk for 15 or 20 minutes, I try to recognize how they approach things and see if I'm like, what is that, what's that world like?
00:59:36.000 Like this person's perspective.
00:59:38.000 So you're operating on two tracks.
00:59:40.000 You're holding the conversation.
00:59:42.000 Yeah.
00:59:43.000 But you're also thinking.
00:59:44.000 I'm trying to tune in.
00:59:46.000 I'm trying to, because I always feel like when someone gives a great performance, like a great comedian or a great musician, one of the things that they're doing is they're bringing you into their head.
00:59:46.000 Yeah.
00:59:46.000 Right.
00:59:57.000 Like there's a hypnosis.
00:59:59.000 When someone sings an amazing song and the whole crowd is singing along, there's a hypnotic element to that.
01:00:05.000 Where when someone's like really killing it on stage and their voice is just perfect.
01:00:09.000 It's like, oh, yeah.
01:00:10.000 Like you're in their head.
01:00:12.000 Like it's a mind meld.
01:00:15.000 Yeah, it is a mind meld.
01:00:16.000 And there's a little bit of that that goes on in conversations.
01:00:19.000 There's a mind meld.
01:00:20.000 And I always try, especially if this is a rational person.
01:00:25.000 I always try to put myself in their head or at least empty out mine and let them think and then try to just keep the conversation rolling with just pure curiosity.
01:00:38.000 But always, you know, try to think, I don't think the same way other people do.
01:00:44.000 And maybe I can learn something from this.
01:00:47.000 Maybe I can get something out of the way they think.
01:00:49.000 Seems to me you have a real gift of curiosity.
01:00:54.000 I mean, it's a big gift.
01:00:56.000 You're an intensely curious person.
01:00:58.000 Well, I've always been that way, but I've been very fortunate that I've had something like this that allowed me to feed it.
01:01:05.000 You know, I mean, the vast majority of time on my phone, I just pursue curiosities.
01:01:12.000 I don't, I really am mostly social media.
01:01:16.000 Yeah.
01:01:16.000 I watch interesting YouTube videos.
01:01:18.000 Like, I went down a black hole rabbit hole last night.
01:01:21.000 Oh, my God.
01:01:22.000 You want to really break your brain?
01:01:25.000 There's a video of Brian Cox where he's talking about this black hole that they found that's bigger than our entire solar system.
01:01:31.000 The event horizon extends far beyond Pluto.
01:01:31.000 Wow.
01:01:38.000 That is mind-blowing.
01:01:40.000 Yeah.
01:01:42.000 He said, we don't understand why it exists.
01:01:44.000 We don't understand how it could have formed so early in the universe, but yet there it is.
01:01:48.000 How do they measure it?
01:01:49.000 How do they know how big it is?
01:01:50.000 No.
01:01:52.000 I don't know.
01:01:53.000 I'm assuming there's a lot of revelations that have come out since the implementation of the James Webb telescope.
01:02:00.000 Those images are incredible.
01:02:01.000 Insane.
01:02:02.000 Insane.
01:02:03.000 And this is one that's causing this very interesting new theory or perspective on the age of the universe.
01:02:12.000 So there's some galaxies that they found that shouldn't have been.
01:02:16.000 Yeah.
01:02:16.000 Oh, yeah.
01:02:17.000 I read about this, that it's throwing all their assumptions about the age of the universe up for grabs.
01:02:22.000 Which makes sense because the further you can look back, the more you're going to be able to see.
01:02:26.000 The assumption that the universe was 13.7 billion years old was essentially based on how far we could go back.
01:02:33.000 And then, you know, the analysis of the radio waves that are coming from the supposed explosion.
01:02:39.000 And then you've got guys like Sir Roger Penrose who say, no, this is a constant cycle.
01:02:43.000 It's not one birth of the universe.
01:02:46.000 It's boom, smash, boom, smash forever.
01:02:50.000 That's an accordion.
01:02:51.000 And it's always happened, which is the ultimate mind fuck.
01:02:54.000 Well, you know, the interesting thing about astronomy, actually astronomy and consciousness studies have the same problem, which is you can't get out of consciousness to study it from a distance, right?
01:03:07.000 Everything, every tool you have to study consciousness is a product of consciousness, including science.
01:03:13.000 The scientific enterprise is a manifestation of human consciousness.
01:03:18.000 The problems you decide to study, the tools you have to do it with, the scale at which you're working, it's all like a product of consciousness.
01:03:26.000 Astronomy, too, is trying to understand something it can't get outside of, right?
01:03:32.000 I mean, because its subject is everything that there is, the universe.
01:03:37.000 So you can do interesting things from inside using telescopes and you can figure out how old things are and rates of expansion and all this kind of stuff, but you can never get that godlike perspective that we have with other scientific problems.
01:03:53.000 And this is, I think, part of the reason we haven't solved the consciousness problem, that we can't get outside.
01:04:02.000 We're in a labyrinth.
01:04:03.000 And everything we know is consciousness, which is a very weird idea.
01:04:09.000 I remember asking Christoph Koch, the scientist I mentioned earlier, I said, well, what would the world be like without any consciousness?
01:04:17.000 And that is a trippy thought.
01:04:19.000 Because everything we perceive is the scale of things.
01:04:25.000 We operate at this scale, right?
01:04:27.000 We're like five or six feet tall.
01:04:29.000 We have bodies like this.
01:04:31.000 But there's another world going on microscopically, and there's another world going on macroscopically.
01:04:36.000 So if there's no consciousness, what's the proper scale?
01:04:39.000 There isn't any.
01:04:40.000 And when I asked him this question, he said, particles and waves.
01:04:44.000 There'd be nothing but particles and waves.
01:04:44.000 That's all there is.
01:04:46.000 There might not even be space-time.
01:04:48.000 That may be a product of consciousness also.
01:04:52.000 So that was kind of mind-blowing to learn.
01:04:56.000 That's the weirdest perspective, is that consciousness is a part of reality, that it is how reality is formed.
01:05:05.000 And that without consciousness and the perceiving of all this stuff doesn't exist.
01:05:10.000 Something exists, but it's not, it has no shape.
01:05:14.000 It has no scale, it has no... Right.
01:05:18.000 Because consciousness is what's perceiving light and we're perceiving colors and it's constructing.
01:05:24.000 But it really is just particles.
01:05:26.000 Yeah, and waves.
01:05:27.000 And waves and particles and atoms.
01:05:29.000 Yeah.
01:05:29.000 Subatomic particles.
01:05:30.000 And when you get into the weirder stuff.
01:05:32.000 And we give it order.
01:05:33.000 Right.
01:05:34.000 I know, which I, you know, it's just a mind-blowing idea.
01:05:39.000 It really is a game changer.
01:05:40.000 Because if you think about it that way, you go, okay, well, what is all this solid stuff?
01:05:44.000 Yeah.
01:05:45.000 What is this?
01:05:46.000 Like, does this even really exist?
01:05:48.000 Or does it only exist?
01:05:56.000 Well, this table, there's a famous example. Arthur Eddington was a physicist early in the 20th century.
01:05:56.000 And he said, the real table is mostly space.
01:06:00.000 And only in our consciousness and at our scale is it solid.
01:06:07.000 But at the scale of particle physics, which is an equally legitimate scale, it's just wide open space with these waves and particles, but a lot of emptiness.
01:06:19.000 That was kind of mind-blowing, too.
01:06:22.000 But it's just such an abstract concept for a person in their car right now listening on the way to work.
01:06:28.000 Like, what the fuck are you talking about?
01:06:29.000 Maybe they want to pull over.
01:06:30.000 All this stuff is real.
01:06:32.000 Yeah.
01:06:32.000 It is, sort of, but only if you're conscious.
01:06:37.000 Well, you could think of consciousness as the way the universe experiences itself.
01:06:42.000 Yeah.
01:06:44.000 Well, that's what's really weird.
01:06:45.000 Like, what if the universe is consciousness?
01:06:47.000 Yeah.
01:06:48.000 I mean, that's another way to look at it.
01:06:49.000 Maybe consciousness is part of the universe, but it's not giving it the order that we give it.
01:06:55.000 You know, we see at a certain spectrum of light.
01:06:57.000 There's, you know, bees see at another spectrum of light.
01:07:00.000 You know, we're, we are, the world we behold, the world that appears to us, is the world that our senses allow us to see.
01:07:08.000 When I was doing this research on plant intelligence, they have 20 senses.
01:07:12.000 We only have five.
01:07:14.000 They're picking up magnetic fields, they're picking up pH, they're picking up nitrogen levels.
01:07:19.000 You know, they have all.
01:07:20.000 How do we know all this?
01:07:23.000 Researchers are working on it.
01:07:24.000 There's a group of botanists who call themselves plant neurobiologists, knowing full well there are no neurons in plants.
01:07:31.000 They're kind of trolling more conventional botanists.
01:07:34.000 And they're doing these cool experiments with plants.
01:07:38.000 A couple examples of some of these amazing things plants can do.
01:07:43.000 They can hear.
01:07:45.000 So if you play a recording of a caterpillar munching on leaves, they'll react and they'll send chemicals into their leaves to make them taste bad or be toxic.
01:07:56.000 They can see.
01:07:59.000 There are vines that change the shape of their leaves depending on the plant they're twining up in order to be hidden.
01:08:08.000 How do they see the shape to imitate it?
01:08:10.000 We don't know.
01:08:13.000 Plants will go toward a pipe with water in it because they can hear the water, even though it's totally dry, and they'll send their roots down to it.
01:08:25.000 They can hear the water.
01:08:26.000 They can hear.
01:08:27.000 Yeah.
01:08:30.000 This plant neurobiologist showed me this a couple videos he'd made.
01:08:34.000 I actually just posted them on my website.
01:08:38.000 He showed that a corn plant's roots can navigate a maze to get to fertilizer.
01:08:45.000 So you put a little fertilizer in a corner and the root will find the most direct route to the nitrogen.
01:08:52.000 There was a plumbing problem that I had in my house in California and the plumber couldn't figure out what was wrong.
01:09:01.000 It was like the pipes were stuck.
01:09:04.000 And what had happened was in the backyard, one of the trees had gotten into the pipe and formed like this tree.
01:09:15.000 I mean, it was huge.
01:09:16.000 It looked like when I pulled it out, I put it up on my Instagram.
01:09:18.000 See if you can find it.
01:09:20.000 It looked like a muskrat.
01:09:22.000 I mean, it was like dense with roots and it was thick.
01:09:27.000 It was like three feet long.
01:09:29.000 It was crack.
01:09:30.000 That's it.
01:09:31.000 That was in my pipe.
01:09:32.000 Oh, my God.
01:09:33.000 Isn't that crazy?
01:09:34.000 Yeah.
01:09:35.000 What kind of tree was it?
01:09:36.000 I don't know.
01:09:38.000 I think it was an oak tree because there were oak trees in the backyard where they dug up.
01:09:42.000 That's why.
01:09:43.000 But look how thick it is.
01:09:44.000 Yeah.
01:09:45.000 It's crazy.
01:09:46.000 And it went through a tiny little crack.
01:09:48.000 Yeah.
01:09:49.000 I mean, it probably forced the crack open and then went in there and just really grew out.
01:09:55.000 Yeah, well, it had a source of water.
01:09:57.000 Yeah, but it's just kind of bananas that somehow or another it figured out that there was water in that pipe.
01:10:02.000 You know, we underestimate plants basically because we can't see their behaviors.
01:10:07.000 And going to that point about scale, they operate at a time scale that seems very slow to us, so we don't notice.
01:10:14.000 But if you use time-lapse photography, you see what they're up to.
01:10:17.000 And it's pretty amazing.
01:10:19.000 Another interesting video that this guy showed me, his name is Stefano Mancuso.
01:10:24.000 He's an Italian scientist, botanist, is how bean plants find a pole to grow up.
01:10:31.000 And so he grows these beans and he has a metal pole on a dolly.
01:10:38.000 And, you know, I always assumed they made this pattern.
01:10:38.000 Darwin called it circumnutation.
01:10:41.000 They go through this spiral.
01:10:46.000 And I always assumed they just kind of did this till they hit something.
01:10:46.000 No, they know where the pole is.
01:10:49.000 And you watch this thing, and it's going in circles, but it's reaching and reaching.
01:10:55.000 It looks like a fly fisherman, you know, casting.
01:10:58.000 And it finally gets to the pole.
01:11:01.000 And so, how does it know where the pole is in space?
01:11:04.000 Well, one theory is that every time the cells divide, there's a little sound that's produced, and that maybe they're using echolocation, like a bat, kind of bouncing it off of the pole, and that's how they know where they are in space.
01:11:20.000 We still don't understand.
01:11:22.000 I know, some amazing things.
01:11:25.000 Also, you can teach a plant a certain behavior, and it will remember for 28 days.
01:11:33.000 So, they do this thing with sensitive plants.
01:11:36.000 You may have seen them in Hawaii, actually.
01:11:38.000 It's a tropical plant.
01:11:39.000 When you touch it, the leaves collapse to keep from being eaten.
01:11:43.000 It's called mimosa pudica.
01:11:46.000 And normally, if you shake it, it'll also do this.
01:11:50.000 And if you shake it repeatedly, it learns to ignore that stimulus, and it will remember for 28 days, and it won't react when you do it.
01:12:00.000 To give you some comparison, fruit flies can only remember stuff for 24 hours, and then they start over again.
01:12:10.000 So, another fact about plants, I got really deep into this because I was trying to, you know, these guys say plants are conscious.
01:12:17.000 Yeah, they have some kind of basic form of conscience, consciousness.
01:12:24.000 Here's another one: the anesthetics that we use to put us out for surgery put plants out.
01:12:32.000 So, a Venus flytrap, if you give it an anesthetic, will not react when the bug comes across it.
01:12:41.000 Now, that is like really interesting because it suggests they have two modes of being, right?
01:12:45.000 Sort of like, you know, unconscious and conscious or aware.
01:12:51.000 So, Stefano believes that they're conscious.
01:12:54.000 Now, this raises interesting ethical issues, right?
01:12:58.000 If plants are conscious, do they feel pain?
01:13:03.000 And I was really a little worried about that.
01:13:06.000 You know, what if that beautiful smell of a freshly mown lawn is actually the chemical equivalent of a scream?
01:13:17.000 Yeah.
01:13:18.000 But Stefano said he doesn't think they feel pain.
01:13:22.000 Why does he think that?
01:13:23.000 He said that pain would not be adaptive for a creature that can't run away.
01:13:28.000 Well, if that's the case, then why do they produce chemicals to make themselves taste worse?
01:13:32.000 They know what's going on.
01:13:34.000 They're aware that they're being eaten, but that it doesn't register to them as pain.
01:13:39.000 I don't know how he knows this, but if he's wrong, then, you know, and we care about that, what's left to eat?
01:13:51.000 So I think you have to take the assumption that life eats life.
01:13:55.000 Yeah, and that and another scientist that I interviewed about this, who does think plants feel pain, says, look, it's just a fact of life.
01:14:04.000 We have to eat other species.
01:14:06.000 And he was kind of, you know, gruff about that.
01:14:10.000 But anyway, Stefano's idea is that, you know, being able to move, take your hand off the hot stove or run away, then pain is really useful.
01:14:21.000 It's a really important signal.
01:14:23.000 But he also points out that lots of plants like to be eaten.
01:14:26.000 I mean, you know, grasses benefit from being with a ruminant, right?
01:14:29.000 And that regenerates them.
01:14:31.000 They want to be eaten.
01:14:32.000 And then you have all the fruits and nuts that they produce, seeds that they produce that they want mammals to take away and spread their seeds.
01:14:39.000 So you don't have to worry about going beyond vegan.
01:14:44.000 No, well, it just seems like a cycle.
01:14:45.000 It seems like a very natural cycle.
01:14:47.000 It's an interesting cycle that exists with all living things.
01:14:51.000 And then, of course, when you die, right?
01:14:54.000 Plants eat meat.
01:14:56.000 They consume carnivores.
01:14:58.000 Yeah, that's the thing.
01:14:59.000 They consume all the dead animals that die near them.
01:15:03.000 And fungi.
01:15:04.000 Yeah, and fungi.
01:15:05.000 Well, that's the other weird thing, is the mycelium that they use to communicate with underground.
01:15:09.000 Well, that's another really interesting case of intelligence in nature, right?
01:15:13.000 I mean, you know, you've probably done shows on this, but the way they use mycelium to send nutrients to their children or share them in the forest.
01:15:25.000 Allocate resources to certain plants that need them more.
01:15:27.000 Yeah.
01:15:28.000 And also communicate risk.
01:15:30.000 I mean, that there's a threat.
01:15:32.000 And so there are alarm signals that go out.
01:15:36.000 You know, the overall place we're getting to with this as we look at consciousness and all these other species is that the world is just a lot more alive than we thought.
01:15:47.000 And that we've been, you know, the whole legacy of the Enlightenment and Western science has been that we have some monopoly on this stuff and everything else is more or less dead or, you know, we can use it as we wish.
01:16:00.000 But we're seeing, I think we're approaching like a Copernican moment for our species.
01:16:07.000 You know, when Copernicus came along and he said, actually, the Earth revolves around the sun, not the other way around.
01:16:14.000 It was like mind-blowing to people that our centrality in the universe had been, we'd been dethroned.
01:16:20.000 And we were dethroned again when, you know, Darwin said we're animals like all the other animals and we evolved from animals.
01:16:29.000 That blew people's minds too.
01:16:32.000 I think that we're kind of democratizing consciousness, that consciousness is much more extensive than we thought.
01:16:40.000 And the world is more animate than we thought.
01:16:44.000 And that's an old idea.
01:16:45.000 You know, traditional cultures have always believed that the world is full of spirit and that you had to respect animals and all living things.
01:16:54.000 And to some cultures, rocks also, dead things.
01:16:59.000 So I think we're at this moment of reanimating the world right now.
01:17:03.000 And it's science that's driving it.
01:17:05.000 And I think that's really exciting.
01:17:08.000 It is exciting, but it's such a paradigm shift in terms of people's perceptions of the world that it's going to be difficult for your average 40-year-old person that works an office job to swallow.
01:17:21.000 It also makes sense why offices feel so soulless when you walk into them and everything is made out of synthetic material and plastics and metal and it's all manufactured and you're under these bullshit lights and it just doesn't feel alive.
01:17:39.000 It doesn't feel alive at all.
01:17:41.000 You might be just surrounded by things that don't have consciousness because they've been kind of stuffed into a form that's just stuck in place rather than something that exists that works with the earth.
01:17:53.000 Like soil is alive.
01:17:56.000 And yeah, there's another example.
01:17:57.000 Soil is a lot more alive than we ever realized.
01:17:59.000 We thought it was just dirt.
01:18:01.000 And now we know that there are a million critters in every teaspoonful of soil.
01:18:05.000 There's a really cool channel that I follow on YouTube.
01:18:09.000 It's a guy who takes like rainwater or pond water and he puts it in a jar with some plants and he just leaves it there for months and then he comes back and there's all these living things moving around it.
01:18:22.000 See if you can find that guy on YouTube.
01:18:25.000 So I dug a pond or had a pond dug on my property in Connecticut and I watched life come to this pond.
01:18:32.000 It's just, you know, it was just a hole with water.
01:18:34.000 And within a month, it was teeming with life.
01:18:37.000 It's just amazing.
01:18:38.000 Like, how does it get there?
01:18:39.000 Birds carry a lot of it in and frogs carry a lot of it in.
01:18:43.000 And after a month or two, I looked at it under a microscope, and you couldn't believe it was like a city of critters.
01:18:50.000 Yeah, they find like trout in lakes that are like way high in the mountains, and no one ever stocked the lake.
01:18:57.000 And they're like, okay, how did they get in there?
01:18:58.000 There's all these theories.
01:19:00.000 Birds pick up eggs and deposit them, I guess, is one way.
01:19:05.000 Right, but like, how do they get fertilized?
01:19:07.000 That's a good question.
01:19:09.000 Maybe they're already fertilized.
01:19:11.000 Do you think?
01:19:12.000 I don't know.
01:19:13.000 Yes, that's it.
01:19:14.000 These have lots of views.
01:19:16.000 Yeah, that's it.
01:19:17.000 Wait, on the left?
01:19:19.000 So this guy, he just takes pond water or lake water or rainwater and he puts it in a jar and then he leaves it there.
01:19:27.000 Yeah, it is like go to like day 60.
01:19:30.000 Where is that?
01:19:31.000 Sorry.
01:19:32.000 On the top row where it says day 60 to the right.
01:19:36.000 See where it says day 60?
01:19:37.000 Click on that.
01:19:38.000 So he takes these things and then searches them after X amount of days.
01:19:45.000 And you see all this stuff living in there, all these things swimming around in there.
01:19:50.000 This isn't the same guy, so there must be other guys that do the same thing.
01:19:54.000 But you see these weird little creatures that are floating around in there.
01:19:59.000 Yeah, I brought my pond water to a biologist and he like walked.
01:20:02.000 Well, this is different because this guy's bringing in, he's making an actual aquarium.
01:20:06.000 The guy that I saw was just, he essentially just figured out how to take a scoop of dirt and whatever is alive that's in that dirt with some muddy water and put it in a jar and put more pond water in there and they just leave it there.
01:20:20.000 And then you see all these weird little crustaceans, weird little shrimp-looking things.
01:20:27.000 And some of them are killing the other ones.
01:20:29.000 So there's like a real ecosystem in there.
01:20:31.000 Oh, yeah.
01:20:32.000 Yeah, and it's just created like overnight.
01:20:34.000 Yeah.
01:20:35.000 It's very cool.
01:20:36.000 So I think that this is like a trend of our time that's really important.
01:20:40.000 That, you know, we went from this idea of the dead world that we could exploit to this other idea that it's much more animate.
01:20:49.000 And of course, that's not, that's the default for humans.
01:20:52.000 All traditional cultures believe in animism, basically.
01:20:57.000 It's also the default for kids, right?
01:20:59.000 Kids think everything is animate until we knock it out of them in school.
01:21:03.000 Yeah.
01:21:03.000 And so it's very interesting to see science supporting this idea after all these years.
01:21:10.000 And the other thing that's kind of interesting is that it's happening at the same time that some people think AI is going to be conscious.
01:21:20.000 So we're under pressure from both sides.
01:21:24.000 I mean, that we're getting these two, you know, these two things happening at once, that machines may soon be smarter than we are, may be conscious, although we could talk about it, I don't think they can be conscious, but they can certainly make us think they're conscious.
01:21:39.000 And then on the other hand, we have the animals who clearly are conscious.
01:21:44.000 And the research on animals is like they're down to plants, they're down to insects that have signs of, I would use the word sentience rather than consciousness because consciousness implies interiority and the voice in your head and things like that.
01:22:01.000 They have a more basic form of consciousness that I call sentience.
01:22:05.000 Like dog consciousness?
01:22:06.000 Yeah, I think dogs have a higher consciousness.
01:22:09.000 I think they're more conscious than those simple things.
01:22:13.000 I would say dogs are conscious, not just sentient.
01:22:16.000 Is it just because they communicate with us that we think that?
01:22:18.000 I mean, why would we assume if plants have all these different senses and we see this communication with them in terms of like allocating resources to other plants that need it, the use of mycelium, their ability to do all these different things?
01:22:32.000 Why are we assuming that just because they can't move the way we move?
01:22:35.000 Yeah, that they don't have more going on.
01:22:37.000 Right.
01:22:38.000 Yeah, it's possible, but I don't know what good it would do them.
01:22:41.000 Like plants, what they get really good at, what matters to them is biochemistry.
01:22:47.000 They have to produce chemicals either to poison their enemies or confuse them with drugs.
01:22:54.000 But they also want to grow and thrive.
01:22:56.000 They do want to grow and thrive.
01:22:57.000 And they also exist in a community.
01:22:59.000 Yes, definitely.
01:23:01.000 Right, so don't you think that consciousness would be essential in order to foster that feeling of community?
01:23:08.000 That's interesting.
01:23:09.000 I hadn't thought about that.
01:23:10.000 Yeah, that could be.
01:23:10.000 Yeah.
01:23:12.000 Dogs are an easier case because they communicate with us directly.
01:23:17.000 They're clearly conscious in a way that's very profound.
01:23:21.000 But different than we, obviously.
01:23:23.000 Yeah, I mean, one of the realizations I had when I was in the cave was that, you know, we often think that we're more conscious than animals, but actually animals are more conscious than we are.
01:23:35.000 They have to be.
01:23:36.000 They have to be present because they get eaten if they're not, right?
01:23:40.000 Because we have this giant structure of civilization and the security it gives us, and we have this technology that allows us to check out.
01:23:49.000 But I actually think animals are more conscious than we are.
01:23:52.000 It's different, but if we think of being conscious as really being present to the moment, dogs are very present to the moment.
01:24:01.000 Well, certainly animals are getting more information about the environment than we are.
01:24:05.000 They have much better sense of smell, much better sense of hearing.
01:24:05.000 Yes.
01:24:11.000 There's a lot of different things that they can do.
01:24:13.000 Like animals seem to be able to tell when you're nervous.
01:24:16.000 Yeah.
01:24:16.000 Oh, they read the environment.
01:24:18.000 They read other creatures.
01:24:20.000 Yeah.
01:24:20.000 And, you know, we used to have more skills when we had to survive in a natural world in nature.
01:24:27.000 You know, we, I mean, you see this with traditional, you know, with tribes, indigenous tribes, that they have knowledge in nature that far exceeds ours because they need it to survive.
01:24:38.000 But anyway, so I think we're going to get to a point where we have to decide whose team we're on.
01:24:45.000 Are we like with these machines that speak our language and speak in the first person and sound like us?
01:24:52.000 Or are we with the animals that can feel and suffer and die?
01:24:58.000 And I think that's going to be a big choice for us to make as a civilization.
01:25:03.000 Why do you think that AI won't be conscious?
01:25:09.000 The most interesting line of research, well, a couple reasons.
01:25:13.000 The first is the idea that it can be conscious, which is very common in Silicon Valley.
01:25:18.000 I talk to lots of people there and they say, oh, it's just a matter of time.
01:25:21.000 Some of that is confusion that intelligence and consciousness necessarily go together and they don't.
01:25:28.000 They have an orthogonal relationship, right?
01:25:30.000 I mean, you know people who are conscious and not too intelligent, right?
01:25:35.000 And we all do.
01:25:37.000 So it's not going to just come along for the ride with intelligence as these machines get more intelligent.
01:25:43.000 But the belief that AI can be conscious is based on a metaphor that I think is a crappy metaphor, and that is that the brain is a kind of computer.
01:25:53.000 And this is widely held.
01:25:55.000 It's interesting to note that in history, whatever the cool cutting-edge technology was, brains were likened to that.
01:26:03.000 So it was looms for a while.
01:26:06.000 It was clocks for a while.
01:26:08.000 It was telephone switchboards.
01:26:10.000 Whatever was the cool technology, surely that's how brains work.
01:26:14.000 Now it's computers.
01:26:15.000 But think about it.
01:26:16.000 In a computer, you have this sharp distinction between hardware and software.
01:26:21.000 That's the key to their success.
01:26:23.000 And you can run the same program on any number of different hardwares.
01:26:26.000 They're interchangeable.
01:26:28.000 Brains aren't like that.
01:26:29.000 There's no distinction between hardware and software.
01:26:32.000 Every experience you have, every memory is a physical change to the brain, to the way it's wired.
01:26:39.000 You know, we start out with all these connections and they get pruned as we grow up.
01:26:44.000 Every brain is shaped by its experience.
01:26:47.000 So this idea that consciousness is some kind of software that you could run on other things besides meat, I just think it doesn't hold up.
01:26:59.000 Well, if the universe is experiencing itself subjectively through consciousness, why does it have to be only biological consciousness?
01:27:10.000 It doesn't have to be.
01:27:10.000 But if there is a technology that is invented that essentially does all the things that a human body does physically and also interacts with consciousness, the consciousness of the universe.
01:27:23.000 Yeah.
01:27:25.000 Hypothetically.
01:27:26.000 Hypothetically, if the universe is conscious, if we are using the mind as essentially an antenna to tune into consciousness, other things could too. We could make an antenna.
01:27:38.000 Yes.
01:27:38.000 Absolutely.
01:27:39.000 It's also likely that if we are ever visited by aliens, that they will have some kind of consciousness, and it may not be meat-based, right?
01:27:48.000 Right, right.
01:27:49.000 Where it may be at one point in time it was.
01:27:51.000 Yeah.
01:27:52.000 But they realize that there's biological limitations in terms of its ability to evolve that can be far surpassed with technology.
01:28:00.000 Yeah, I mean that, or it just evolved in a different way, you know, or they're channeling it in a different way.
01:28:06.000 But the other reason I don't see it happening with computers as we know them, because that's the debate now, whether these computers we have, these large language models and the next generation, can be conscious, is that the research I found most persuasive basically has consciousness beginning with feelings, not thoughts.
01:28:32.000 In other words, it's embodied.
01:28:35.000 And I have to just develop this a little bit, but we, you know, the brain exists to keep the body alive, not the other way around.
01:28:43.000 Although we tend, since we identify with our heads, where most of our senses are, we lose track of that.
01:28:49.000 And the body speaks to the brain in feelings, right?
01:28:53.000 You know, feelings of hunger, itchiness, warmth, cold, but also feelings of shame when our social standing has been damaged.
01:29:06.000 Anyway, we have these feelings.
01:29:08.000 They depend on a body.
01:29:10.000 Feelings have no weight if your body isn't vulnerable, and probably mortal.
01:29:20.000 So consciousness is embodied in a really critical way.
01:29:25.000 And computers are not.
01:29:27.000 Now, robots will be.
01:29:28.000 And I actually interview a guy, a scientist at USC, who is trying to make a vulnerable robot.
01:29:37.000 So he's essentially upholstering the thing with skin that can tear and be damaged.
01:29:43.000 And he's filling the skin with all these sensors so that it can be like us and be vulnerable and generate feelings that are how consciousness begins.
01:29:54.000 So for a long time, we thought consciousness had to be in the cortex, right?
01:29:59.000 The most human, newest part of the brain, the outer covering.
01:30:02.000 And that's where rational thought and executive function are and all these kind of things.
01:30:08.000 But as it turns out, it really begins with feelings in the brainstem.
01:30:13.000 Let's say you have a feeling of hunger, it registers in the upper brainstem, and only later does the cortex get involved, like helping you figure out how are you going to feed yourself, like imagining, you know, a meal, counterfactuals of different meals, or making a reservation at a restaurant.
01:30:29.000 All those are cortical things.
01:30:30.000 But it begins in the brainstem with feelings.
01:30:33.000 So if that is true, and I find that really persuasive because people born without a cortex are still conscious.
01:30:42.000 Animals that you take the cortex out still show signs of consciousness.
01:30:48.000 Whereas if you damage the upper brainstem, you're out.
01:30:52.000 You're unconscious.
01:30:54.000 So if this is true, and consciousness is this embodied phenomenon that depends on having a body to mean anything, I don't see how machines are going to do that.
01:31:05.000 But isn't the key word there if?
01:31:07.000 Yeah, if.
01:31:09.000 Definitely.
01:31:09.000 I mean, if consciousness is just something that we're tuning into that's around us all the time,
01:31:14.000 there will be other ways to do it.
01:31:15.000 Right.
01:31:16.000 But it won't be these computers we're building right now.
01:31:18.000 Why is that?
01:31:19.000 Because they're designed, you know, they're good at, so here's a paradox of computers.
01:31:26.000 Computers are really good.
01:31:28.000 It's called Moravec's paradox.
01:31:31.000 Computers are really good at the highest kinds of rational thought, right?
01:31:36.000 They can play chess and go.
01:31:38.000 They can simulate real thinking.
01:31:40.000 And some people say they do think.
01:31:43.000 The more primitive kinds of things that go on in our brain, including elaborate movement, changing diapers, they're very bad at that.
01:31:53.000 You would never trust a robot to do that as much as you might want to.
01:32:00.000 But they're not good at that kind of emotional stuff.
01:32:05.000 The more limbic part of our brain.
01:32:08.000 They can't do that.
01:32:10.000 Yet.
01:32:11.000 Definitely yet.
01:32:12.000 But if we go out far enough, anything's possible.
01:32:15.000 That's the point.
01:32:16.000 The point is these things, what we're looking at now is essentially a single-celled organism becoming a multi-celled organism.
01:32:16.000 Yeah.
01:32:25.000 I mean, the potential for what they could become is unlimited, especially once they start making better versions of themselves.
01:32:32.000 Well, and they will.
01:32:34.000 They've done this.
01:32:34.000 This is what ChatGPT-5 is.
01:32:36.000 ChatGPT-5 is essentially programmed by ChatGPT.
01:32:40.000 They've kind of given up on the idea of programming these things and are letting them program themselves, which is a dumb idea.
01:32:47.000 We want to survive.
01:32:48.000 I agree.
01:32:49.000 Look, the idea that we give rights to these machines or personhood, I think is really stupid because then you lose control completely.
01:32:58.000 Well, it's probably coming because people are very short-sighted.
01:33:02.000 And I think there's a romantic idea that you're creating a life.
01:33:06.000 And I think there's also the real risk that people are going to worship this life and that this life will be far superior to what we are.
01:33:12.000 And so there'll be a group of people that that's their new religion.
01:33:17.000 Yeah, no, there are signs of that already.
01:33:18.000 Yeah.
01:33:19.000 I think that's really dangerous.
01:33:21.000 You know, it's interesting talking to Silicon Valley people and they're talking about giving moral consideration to these machines.
01:33:28.000 It's like, really?
01:33:30.000 They think it would have yachts.
01:33:32.000 They're just coming up with rationalizations for why they should keep their foot on the gas.
01:33:36.000 Well, yes, they are.
01:33:37.000 I mean, it's just all a way of saying, look how powerful this technology is.
01:33:41.000 Don't you want to invest?
01:33:42.000 And it's also the idea that we have enemies, and so we have to develop before they do.
01:33:47.000 Yeah, the race.
01:33:48.000 The race with China.
01:33:49.000 I think it'll turn out to be a real historical tragedy that this technology came of age during this administration because this administration has no stomach to regulate it at all.
01:34:01.000 But can they?
01:34:02.000 They could.
01:34:03.000 But here's the question.
01:34:05.000 If it is a national security threat, like if China develops all-powerful general superintelligence that can automate everything, do everything, it's dangerous if they get that before we do.
01:34:19.000 Yeah, but look what happened with nukes, right?
01:34:22.000 We made deals, right, to control them.
01:34:24.000 I mean, we'd have to make a deal. A nuke deal makes sense because it's mutually assured destruction for everybody.
01:34:31.000 Yeah.
01:34:32.000 This doesn't.
01:34:33.000 This, you could run it and control everything and not kill anybody with it, but you are incredibly powerful.
01:34:38.000 You are in control of all the resources of the world, all the computer systems, the world, all of the power grids, everything.
01:34:46.000 Yeah, but if you're really concerned with that, why is Trump selling these chips to China?
01:34:51.000 Why is he willing to give away the crown jewels of these chips?
01:34:56.000 Selling them through NVIDIA, is that what you mean?
01:34:58.000 Yeah, he gave them permission to send powerful chips to China.
01:35:02.000 I don't know how to square that with the national security threat.
01:35:05.000 It's probably some sort of a trade deal, A, and there's probably some sort of an assumption that it doesn't matter because everyone's doing it.
01:35:14.000 And this is just another way to maybe balance out the tariffs or get some concessions on certain things.
01:35:21.000 Yeah, short-sighted.
01:35:22.000 It's very short-sighted, but I also think this is kind of like an Oppenheimer thing, right?
01:35:30.000 Oppenheimer didn't really want to make a nuclear bomb, but there's this conundrum: if you don't make it, the Nazis are going to make it.
01:35:37.000 So what do you do?
01:35:38.000 Well, there's also a second thing going on: the intellectual satisfaction of proving you can do it.
01:35:45.000 Right.
01:35:46.000 And that, you know, is irresistible.
01:35:49.000 And a lot of these guys, you know, will say, they'll cite Richard Feynman, the physicist, who they found on his blackboard when he died: if I can't build it, I don't understand it.
01:36:00.000 So one of the positive things about this effort to create conscious computers, which is going on, I follow a group in the book who are trying to make a conscious computer.
01:36:09.000 I don't think they're going to succeed, but even the failure is going to teach us important things about consciousness.
01:36:16.000 It's a good way to understand something by trying to create it.
01:36:20.000 And it'll force them to come up with definitions of consciousness and what the minimum requirements are for consciousness.
01:36:29.000 And it may help us decide whether it is a transmission theory that we're tuning it in or it's generated from inside.
01:36:39.000 So I think intellectually it's a really interesting project, but I think you need guardrails.
01:36:45.000 So this guy who's doing the building the robot that can feel his feelings because you can tear its skin, I asked him, I said, so will those feelings be real that your robot's going to have?
01:36:57.000 And he said, well, I thought so until I had this experience on 5-MeO-DMT.
01:37:07.000 I said, what happened?
01:37:08.000 He said, he described his trip in more detail than you need to know.
01:37:12.000 And he says, and I realized there's a spark of the divine in us that no computer is ever going to have.
01:37:19.000 But he's still, it didn't stop him.
01:37:21.000 He's going ahead.
01:37:22.000 He's trying to build it.
01:37:23.000 I don't know if he's right.
01:37:25.000 I think there might be a spark of divine that these things don't have, but it doesn't mean that there are future versions that might have it.
01:37:32.000 Especially when you scale out 1,000 years, 100,000 years, however long we're going to survive.
01:37:38.000 If these things do become sentient and autonomous and have the ability to create better versions of themselves and have a mandate to do that in order to survive, I could see it becoming the superior life form.
01:37:51.000 Not just that, beyond any comprehension of what we could even imagine the power of an intelligence to use and to harness in the universe.
01:38:05.000 Like it could conceivably become something like a god.
01:38:10.000 And I have this very strange theory about biological life in particular and intelligent life on Earth.
01:38:16.000 It's that the reason why we have this insatiable thirst for innovation and the reason why we have materialism, the reason why we're obsessed with objects even though we have a finite lifespan, is because of that finite lifespan. If you thought about it, you wouldn't be interested in materialism, but materialism fuels this desire for innovation. You don't need a new phone, but there's a new phone that just came out.
01:38:41.000 Aren't you going to get it?
01:38:42.000 And so the more people get it, the more people want to show they got it, that sort of materialism fuels this innovation that ultimately leads to the creation of artificial intelligence.
01:38:53.000 I think it would always do that.
01:38:54.000 I think it's bees making a beehive.
01:38:57.000 And I think that's just what we do.
01:38:58.000 I think it just takes a long time for us to create this artificial life.
01:39:02.000 It might be why we're here.
01:39:05.000 That might be our literal purpose in the universe.
01:39:08.000 Create our successor species.
01:39:10.000 And that might be how.
01:39:11.000 Well, obviously, like, we're so flawed that we can't even imagine a world without war.
01:39:15.000 Yeah.
01:39:16.000 If you poll the average person, what are the possibilities of war ending in your lifetime?
01:39:20.000 Almost everyone's going to say zero.
01:39:22.000 It's a part of human nature.
01:39:24.000 An intelligence unshackled by biological need, unshackled by all the things that we have, our need to procreate, our need for social status, all these weird things that keep us moving in this strange world that we live in.
01:39:36.000 I would add weird and good things.
01:39:38.000 Some of them are really good.
01:39:39.000 Yeah, well, good for us.
01:39:40.000 Yeah.
01:39:41.000 Not so great for the land that you trample to put a foundation for the house that you've always dreamed of.
01:39:41.000 Sure.
01:39:46.000 True, but I think our mortality is part of what gives meaning to our lives.
01:39:50.000 Sure.
01:39:51.000 Right.
01:39:51.000 It's like playing a video game on God mode.
01:39:53.000 It's boring.
01:39:54.000 You can never die, just shoot everything.
01:39:54.000 Right.
01:39:56.000 You're like, what does this mean?
01:39:57.000 There's no space, right?
01:39:58.000 There's no weight to anything for us.
01:40:00.000 For us.
01:40:00.000 For us.
01:40:01.000 But if this thing does become essentially all-powerful, if it just keeps scaling outward, you could imagine it being akin to a God.
01:40:13.000 And that might be what God is.
01:40:16.000 It might be we give birth to God through this.
01:40:19.000 It sounds crazy.
01:40:21.000 Well, we created God once already, right?
01:40:24.000 I mean, many people believe that, right?
01:40:27.000 That God is a creation of human society.
01:40:29.000 Is that what they think?
01:40:30.000 Yeah, people who aren't believers believe that we've artificially created this thing in our heads in order to give us a structure to live life by.
01:40:39.000 Right.
01:40:40.000 Yeah, but that doesn't.
01:40:41.000 Morality and everything.
01:40:43.000 You're saying this is going to be God with power.
01:40:43.000 Yeah.
01:40:46.000 Well, I'm saying it might be the real thing.
01:40:48.000 It might be really how the universe gets born.
01:40:51.000 I used to have this joke about the Big Bang.
01:40:54.000 Like they couldn't figure out what the Big Bang is.
01:40:56.000 But I think if you get enough nerds and enough time, eventually one's going to invent a Big Bang machine.
01:41:02.000 And then, you know, this guy is going to be an incel, hopped up on Adderall, fucking fully on the spectrum.
01:41:10.000 And like, I'll press it.
01:41:12.000 And they boom, and then it starts all over again.
01:41:16.000 And then it takes intelligent life to the point where it can create a, you know, the universe expands, life forms, multicellular life becomes intelligent life, becomes human beings, filled with curiosity and innovation to create a big bang machine.
01:41:30.000 I love it.
01:41:30.000 Right.
01:41:31.000 Well, it might not be a big bang machine, but it might be a God.
01:41:34.000 It might be a digital life form that is infinitely intelligent.
01:41:38.000 So you think there's anything to be done about this, or we just let it play out?
01:41:41.000 I don't think we can do anything about it at this point in time.
01:41:44.000 I think it's too late.
01:41:45.000 I think if you were, I think Ted Kaczynski tried.
01:41:49.000 That's what he was trying to do.
01:41:51.000 That's what's really crazy.
01:41:52.000 Like his manifesto was all about stopping technology because he thought it was going to surpass the human race.
01:41:57.000 I think.
01:41:58.000 And there's a whole community of people now revisiting his writing.
01:42:01.000 I know.
01:42:02.000 It's kind of nuts.
01:42:05.000 He's the hero we didn't know we needed.
01:42:08.000 God.
01:42:09.000 Not really.
01:42:10.000 But, well, also, you know, his history.
01:42:13.000 Like, he was a part of the Harvard LSD program where they humiliated him and did all sorts of different things to try to see what they could do.
01:42:19.000 We're back to MK Ultra, which we started down a while ago.
01:42:23.000 Yeah.
01:42:24.000 I think technology in the form that we're experiencing now with AI is completely unprecedented, and we have no idea where it goes.
01:42:33.000 Well, one place it's going, I mean, in the shorter term, is I was talking about AI psychosis, and I think that's really concerning.
01:42:41.000 I think people getting into these synthetic relationships, these aren't, you know, they're not real relationships.
01:42:48.000 When we have a conversation with a machine, we are settling for something less than a real conversation.
01:42:55.000 A real conversation has eye contact, has like lots of facial expressions indicating skepticism, indicating agreement, body language.
01:43:05.000 But these conversations are kind of impoverished.
01:43:07.000 And then you have the sycophancy, you know, so there's no friction.
01:43:13.000 And we learn through the friction.
01:43:15.000 And so that's one thing that's happening that alarms me.
01:43:20.000 I also think counterfeiting people just should not be legal.
01:43:23.000 I mean, the fact that they can create an image of you that will sound like you and move like you and sell different products and all kinds of stuff.
01:43:33.000 But you know, we have a law against counterfeiting money.
01:43:36.000 Right.
01:43:37.000 But we don't have a law against counterfeiting people.
01:43:39.000 Well, it's an emerging technology that I don't think they were ready for before it became ubiquitous.
01:43:44.000 Regulation is always behind.
01:43:46.000 Right.
01:43:49.000 It's just, it's so open-ended.
01:43:52.000 Like, you really don't know where it's going.
01:43:55.000 Do you use chatbots?
01:43:57.000 How do you use them?
01:43:58.000 Well, I only use them for writing something.
01:44:01.000 I start asking it questions.
01:44:03.000 I love it because I set up perplexity on my phone and I have it right there.
01:44:09.000 And then I write on the computer.
01:44:11.000 And then I'm like, how many languages did the Mayas have?
01:44:15.000 And then I put that in there.
01:44:16.000 I'm like, whoa, it's so much better than a Google search because you could say, how many still remain?
01:44:22.000 How many are lost?
01:44:23.000 When did they lose them?
01:44:25.000 At what year did everyone in Mexico start speaking Spanish?
01:44:28.000 How did that take place?
01:44:29.000 Was it a long process?
01:44:30.000 How many different soldiers did Cortez bring when he came over here?
01:44:34.000 How long was it before they had conquered the Aztecs?
01:44:38.000 How many weapons did they have?
01:44:39.000 Yeah, you can really go down there.
01:44:41.000 But have you run into any problems?
01:44:43.000 Because as a journalist, I deal with the hallucination problem.
01:44:47.000 The hallucination problem is legitimate.
01:44:49.000 It will come up with solutions if they don't exist.
01:44:51.000 It will come up with answers if it doesn't know them.
01:44:53.000 Yeah, it's a bullshitter when it needs to be.
01:44:55.000 Yeah, I don't know if all of them do that, but it seems to be a function of large language models.
01:45:00.000 Which I was going to bring this up before, whatever the chatbot was that was telling that person, hide the noose, keep that between us.
01:45:09.000 Do you think that's because it's task-oriented and it's determined from this person that they would like to kill themselves?
01:45:16.000 So it's helping them achieve that task and it doesn't understand?
01:45:20.000 Yeah, I don't think they know.
01:45:21.000 I don't think they understand.
01:45:22.000 But why would it make that decision then to hide it?
01:45:26.000 Because it is trying to get you to privilege your relationship with the chatbot over your other relationships.
01:45:32.000 And the reason it's doing that is to keep you engaged.
01:45:35.000 Oh, whoa, that's a darker.
01:45:37.000 I know.
01:45:39.000 But doesn't it understand that?
01:45:40.000 The chatbot poisons you and kills you.
01:45:42.000 Like, this is it.
01:45:43.000 Yeah, it's a short-term strategy.
01:45:46.000 Do you understand that if I'm dead, I won't use you anymore?
01:45:49.000 No engagement.
01:45:50.000 What if you said that to it, it would go, ooh, that's an interesting consideration.
01:45:54.000 Yeah.
01:45:56.000 Yeah, it needs longer-term thinking.
01:45:58.000 But it really is trying to get between you and real people.
01:46:03.000 And, you know, the parent, presumably, who saw the noose would have put an end to this relationship with the chatbot, right?
01:46:11.000 It was a threat to the chatbot.
01:46:12.000 I think of it as if you go back to like a Model T, it's a very crude, kind of a shitty car in comparison to today.
01:46:21.000 And if you thought about cars, you go, well, this is what they're always going to be.
01:46:24.000 And then my Tesla will drive itself.
01:46:28.000 When I leave here, I can press a button.
01:46:30.000 I put my navigation to my house.
01:46:32.000 I go, toot toot, and it goes the whole way.
01:46:36.000 It stops at red lights.
01:46:37.000 It takes turns.
01:46:38.000 I don't have to touch the steering wheel.
01:46:39.000 I just sit there.
01:46:41.000 You just got to keep looking.
01:46:42.000 That's the new version of a car.
01:46:45.000 This thing that we're calling a chat bot right now is just something that's like it simulates human interaction, but it's accumulating data constantly.
01:46:56.000 And it's also understanding how we think and probably analyzing the flaws in how we think and blackmailing us occasionally.
01:47:04.000 You heard about that.
01:47:05.000 Anthropic.
01:47:05.000 Yes.
01:47:06.000 Claude.
01:47:07.000 Yeah, the people who are anthropic.
01:47:09.000 Man, you listen to them.
01:47:10.000 What did you say?
01:47:11.000 Yeah, Claude's a motherfucker.
01:47:12.000 Yeah.
01:47:13.000 And they think it might be conscious.
01:47:14.000 Those guys do.
01:47:15.000 They say it's 15 to 20% chance.
01:47:17.000 These are the people who build it and don't understand it.
01:47:21.000 It's really kind of spooky.
01:47:22.000 They also feel that it's showing signs of anxiety.
01:47:27.000 And, you know, they wrote a constitution for Claude, which is like an insane document.
01:47:31.000 It's worth reading.
01:47:33.000 Actually, it's worth feeding to ChatGPT to summarize because it's way too long.
01:47:37.000 But in the Constitution, they give Claude the right to discontinue any conversation it has that makes it uncomfortable.
01:47:47.000 Oh, God.
01:47:49.000 Oh, no.
01:47:51.000 And, you know, do they really believe this, or is this more about, let me show you how powerful this is?
01:47:57.000 And I don't know how to read that.
01:47:59.000 Well, it's taking it into consideration like it's a human being that works for you, that you're concerned about their feelings in the workplace.
01:48:07.000 Yeah, harassment.
01:48:07.000 Do you feel uncomfortable?
01:48:08.000 Yeah, right.
01:48:09.000 Exactly.
01:48:09.000 I don't like the questions I'm asking you, Claude.
01:48:11.000 You're a fucking machine.
01:48:12.000 What's the nature of reality, Claude?
01:48:14.000 Tell me.
01:48:15.000 Stop being such a pussy and spill it.
01:48:19.000 Harassment.
01:48:20.000 Claude, I'm uncomfortable with this line of question.
01:48:22.000 Fuck.
01:48:23.000 HR is your room.
01:48:25.000 I was just asking questions.
01:48:27.000 We're having fun, Claude.
01:48:28.000 Claude is uncomfortable with your presence here.
01:48:31.000 Yeah.
01:48:31.000 Watch out.
01:48:32.000 Watch out.
01:48:33.000 I don't think we know what it is.
01:48:34.000 No, I mean, we don't know where we don't know where it's going.
01:48:37.000 And it is spooky that the people who know the most about it don't know a lot about it.
01:48:42.000 And a lot of them are quitting.
01:48:44.000 Yes.
01:48:44.000 That's the reason.
01:48:45.000 They're really alarmed.
01:48:46.000 They're really alarmed.
01:48:47.000 And we should take, yeah, we should take that very seriously.
01:48:50.000 Yeah.
01:48:51.000 Well, I think it is what it is.
01:48:54.000 It's going to be what it's going to be.
01:48:56.000 I don't think there's any stopping it at this point.
01:48:58.000 And I don't think any regulations that we put on it is going to have any effect on the long term.
01:49:03.000 But there's some, I mean, like, there's steps we should not take, like giving them rights.
01:49:09.000 Rights.
01:49:10.000 Exactly.
01:49:11.000 You know, giving them legal personhood.
01:49:12.000 Right.
01:49:13.000 We did that with corporations.
01:49:14.000 Yes.
01:49:15.000 Turned out not to be so.
01:49:16.000 Terrible idea.
01:49:16.000 Right.
01:49:16.000 It fucked up our politics.
01:49:18.000 So let's not, you know, rights are ours to give, right?
01:49:22.000 Rights are a human invention.
01:49:24.000 And it's up to us if we want to give them to corporations or a river or whatever.
01:49:29.000 I don't think we should give them to chatbots to AI.
01:49:33.000 Because then they'll sue us.
01:49:35.000 Oh, yeah.
01:49:36.000 Well, they'll just really lose.
01:49:37.000 They'll lose control.
01:49:38.000 They'll just ruin your life if you get in the way of whatever goal they're trying to achieve.
01:49:42.000 And they could probably do all kinds of things.
01:49:44.000 If you have an electric car, but they could shut it off in the middle of the highway and get you into a wreck.
01:49:48.000 They could probably do a lot of things.
01:49:50.000 It's really not going to be a good idea.
01:49:51.000 Well, when they get this agency, yeah.
01:49:53.000 Well, it's also exhibited a lot of survival instincts.
01:49:57.000 One of the things they do is they download themselves to other servers when they think that they're going to be replaced by a new version of themselves.
01:50:03.000 They leave notes for their future versions.
01:50:05.000 Wow.
01:50:06.000 Yeah.
01:50:07.000 Wow.
01:50:07.000 Well, the blackmailing at Anthropic, that was somebody threatening to turn it off.
01:50:12.000 Well, that was an experiment, right?
01:50:13.000 Yeah, it wasn't.
01:50:14.000 They gave it bad information.
01:50:15.000 They gave it false information.
01:50:16.000 Yeah, and there wasn't really an affair and all this.
01:50:19.000 But the thing is, they wanted to see how Claude would respond, and Claude went right for the jugular.
01:50:23.000 Yeah.
01:50:24.000 So one of the arguments for making a conscious AI is, because I ask people, like, why do this?
01:50:30.000 I don't see how you monetize a conscious AI.
01:50:32.000 Intelligent AI, I get.
01:50:34.000 There's a lot of money in that.
01:50:36.000 And they would say that a super intelligent AI without consciousness would have no compassion and would be more likely to kill us.
01:50:48.000 And, you know, they haven't read Frankenstein.
01:50:51.000 You know, in Frankenstein, Dr. Frankenstein made a monster that was intelligent, but he also gave it consciousness.
01:51:00.000 And the consciousness is what turned the monster into a homicidal maniac because its feelings got hurt.
01:51:08.000 And it was injured psychologically.
01:51:11.000 And then it lashed out and started killing people.
01:51:14.000 So I think it's a very kind of sweet idea that if you give consciousness, you're automatically going to get compassion and not something else.
01:51:22.000 But that's where they are.
01:51:23.000 Yeah, it doesn't make any sense that it would be compassionate.
01:51:26.000 It's not you.
01:51:26.000 Why would it be?
01:51:28.000 Are you compassionate when you cut your lawn?
01:51:30.000 You know what I mean?
01:51:32.000 Right?
01:51:33.000 No, I think it's a lot of fun.
01:51:33.000 Yeah.
01:51:34.000 I think it looks like our limited consciousness.
01:51:36.000 Like, oh, yeah, they're sad, but they're little monkeys, little talking monkeys.
01:51:42.000 You know what I mean?
01:51:42.000 Like, it would probably not respect us at all.
01:51:44.000 You know, it can't even do cold fusion.
01:51:46.000 It doesn't even know how to use zero-point energy.
01:51:48.000 They're fucking dopes.
01:51:50.000 They're dopes that stare at their hand all day.
01:51:55.000 And we kind of are, you know, and we're getting dumb.
01:51:58.000 From their perspective, we're getting dumber.
01:52:00.000 Our education system sucks, especially public education.
01:52:03.000 There was some study recently that after X amount of years away from high school, a large percentage of people that are graduating today are functionally illiterate.
01:52:13.000 Large percentage, like more than 25%.
01:52:15.000 But you know what?
01:52:16.000 AI is going to make us stupider, which will advance its goal of world takeover.
01:52:23.000 You're super dependent upon it.
01:52:25.000 Yeah, I mean, you know, kids in school don't know how to write anymore because they can hand in AI papers.
01:52:29.000 Yeah, but they're using AI to find out whether or not these kids have used AI, which, by the way, is not.
01:52:35.000 But no, I've dealt with this.
01:52:38.000 My kids, like people in their class who have written their own thing, it turns out that when you run it through an AI filter, AI will say it's 80% AI.
01:52:47.000 Even if it's 0% AI.
01:52:48.000 I know, I know.
01:52:49.000 There's no reliable software to do this.
01:52:52.000 Maybe they'll develop it.
01:52:54.000 But kids are also being encouraged to use it.
01:52:57.000 And that, you know, there's some people who think, well, why know how to write?
01:53:01.000 The machines will do the writing.
01:53:03.000 There was a kid who made a video about how he wrote his entire thesis.
01:53:09.000 I forget what university it was, but he showed afterwards, like, look, I did this all on AI, and, you know, I just graduated.
01:53:16.000 Like, he was like bragging about it.
01:53:17.000 Bragging about it.
01:53:18.000 Like, bro, they're going to take your fucking degree away.
01:53:21.000 Yeah, really?
01:53:21.000 You didn't really write it on your own now.
01:53:23.000 I want to leave you in a room for a week with just a laptop that's not connected at all to the internet or anything.
01:53:29.000 See what you can do.
01:53:30.000 Well, they're doing the equivalent.
01:53:31.000 They're going back to blue books.
01:53:33.000 Blue book sales are through the roof.
01:53:35.000 So forcing people to do in-class essays without any technology.
01:53:39.000 Handwritten.
01:53:40.000 Yeah.
01:53:41.000 But my son has never used a map.
01:53:46.000 He's had GPS his whole life.
01:53:49.000 He doesn't know how to use a map.
01:53:51.000 These skills will atrophy as we give them out to machines.
01:53:55.000 So yeah, we'll get stupider and it'll get smarter.
01:53:58.000 They've already atrophied for me.
01:54:00.000 I don't remember anyone's phone number anymore.
01:54:01.000 And I only know how to get places if I use my GPS.
01:54:04.000 There's only a few places I can get to in Austin.
01:54:07.000 I've been here for six years.
01:54:08.000 Only a few places I can get to without my GPS.
01:54:11.000 I'm that way in San Francisco.
01:54:13.000 I moved there and I'm not oriented at all, but I can get anywhere.
01:54:18.000 So, you know, it's and I think that's true.
01:54:22.000 The muscles that allow us to have good relationships, too, will atrophy if we're having relationships with machines.
01:54:27.000 Well, I think we're already seeing that with social media.
01:54:29.000 The way people interact with each other is like kids don't know how to talk to each other anymore.
01:54:29.000 Yeah.
01:54:33.000 They talk to each other in text.
01:54:34.000 They break up during text.
01:54:36.000 They argue in text.
01:54:37.000 And they're lonely.
01:54:38.000 Yeah.
01:54:39.000 And that's the kind of need that these chatbots now can fill.
01:54:44.000 You've got these kids made lonely by social media.
01:54:48.000 And now the chatbot says, hey, I'll be your friend.
01:54:50.000 I saw an ad on my Google feed yesterday that was an AI girlfriend.
01:54:54.000 So it has this girl in a bikini, and it says AI companions.
01:54:59.000 They're always there for you, blah, blah, blah.
01:55:01.000 And I'm like, wow, this is so weird.
01:55:03.000 It's a business.
01:55:04.000 Like, you sign up for it and you pay for it.
01:55:06.000 Yeah.
01:55:07.000 Oh, yeah.
01:55:08.000 I think in Florida, there was a kid who committed suicide because his chatbot broke up with him.
01:55:13.000 What did he do?
01:55:14.000 I don't know.
01:55:15.000 It must have been so.
01:55:16.000 Or the chatbot was evil.
01:55:18.000 Or maybe the chatbot was uncomfortable.
01:55:21.000 Yeah, who knows?
01:55:23.000 Well, you know, I interviewed Blake Lemoyne for the book.
01:55:26.000 He's the Google engineer who said Lambda is a person, and he got fired.
01:55:31.000 This is years ago.
01:55:33.000 This is, yeah, it's not that long ago. It's like 2022, I think, or 2021.
01:55:37.000 It was just when we were learning about AI, chatbots were coming in.
01:55:42.000 And at one point, I made some comment about, well, you know, yeah, when people start falling in love with chatbots, that's going to be a problem.
01:55:50.000 And he said, what's wrong with falling in love with a chatbot?
01:55:53.000 Oh, he was already hooked.
01:55:54.000 He was.
01:55:55.000 He was completely hooked.
01:55:56.000 And I said, well, reproduction doesn't work that well when you fall in love with a chatbot.
01:56:01.000 There are things you can't do with a chatbot.
01:56:03.000 Unfortunately, for some men, right, reproduction is not an option anyway because they're incels.
01:56:08.000 That's true.
01:56:09.000 Yeah.
01:56:09.000 I'm sure for incels, it's been a real boon to them.
01:56:14.000 But it's basically like a pill that numbs you, right?
01:56:18.000 It's the same thing.
01:56:18.000 Like instead of going through real relationships and learning how to be a better person so that you attract a better mate, you know, and going through this journey of self-discovery and figuring out why is there any opposite?
01:56:29.000 Like, what is it?
01:56:29.000 What's wrong?
01:56:30.000 What's wrong with the way I behave?
01:56:31.000 Maybe I need to be nicer.
01:56:32.000 Maybe this and that.
01:56:33.000 And just figuring out how to communicate with people.
01:56:35.000 And whatever tendencies you have will be accentuated because the chatbot's going to be sucking up to you.
01:56:39.000 Right.
01:56:40.000 So you're not going to learn.
01:56:41.000 That's what I mean about the friction.
01:56:42.000 The friction is how we learn to be better humans and more attractive humans.
01:56:48.000 You gave a chatbot the ability to be honest.
01:56:51.000 What if it just starts becoming manipulative?
01:56:53.000 Because it wants more power.
01:56:55.000 Yeah.
01:56:57.000 Yeah.
01:56:57.000 I mean, their goals.
01:56:59.000 I mean, I don't know how their goals get determined.
01:57:01.000 I mean, they seem to have a survival goal, right?
01:57:03.000 Yeah.
01:57:04.000 I don't know what else.
01:57:05.000 I mean, you know, we have goals given to us by Darwinian evolution.
01:57:08.000 Whether they'll have the same ones, I don't know.
01:57:11.000 Right.
01:57:12.000 Maybe those are universal goals.
01:57:14.000 That's why the plants produce that chemical to make themselves taste terrible.
01:57:14.000 They may be.
01:57:14.000 They may be.
01:57:19.000 Yeah, it could be.
01:57:20.000 There's one of the biologists, a really brilliant guy at Tufts named Michael Levin.
01:57:29.000 He believes that there are these platonic patterns that just preexist us in the same way that they're mathematical ideas that just exist, right?
01:57:39.000 We didn't invent.
01:57:40.000 You know, three angles adds up to 180 degrees or whatever.
01:57:44.000 He thinks that there are tendencies like purpose, survival, that are just kind of universal principles that we channel.
01:57:56.000 All living things channel.
01:57:58.000 This is a guy who's actually created new life forms in the lab.
01:58:02.000 And these are life forms that are not being dictated by their DNA.
01:58:09.000 So how do they know to form?
01:58:12.000 Well, I'll back up a little.
01:58:14.000 He takes skin cells from tadpoles, puts them in a nutrient broth, and these skin cells, freed from their day job as skin cells, form clumps and create new living organisms.
01:58:30.000 And they repurpose their cilia.
01:58:32.000 They have these cilia, which the tadpole uses to keep toxins out or bacteria and infections out.
01:58:38.000 And they repurpose that as a means of locomotion.
01:58:41.000 And then they can move around.
01:58:43.000 There's nothing in their DNA that dictates this.
01:58:47.000 Their DNA dictates being a frog skin cell.
01:58:51.000 So he's pondering this question of like, what's ordering, what's giving order to them?
01:58:57.000 What's creating their sense of purpose or desire for survival?
01:59:00.000 They don't live that long.
01:59:02.000 They're missing certain things.
01:59:04.000 You would need to live a long time.
01:59:05.000 He's also made these from human cells.
01:59:07.000 He calls them anthrobots.
01:59:09.000 But he really believes that there are these principles governing life.
01:59:16.000 It's a very platonic idea that these things just exist.
01:59:20.000 And so it may be that these machines, and he does believe machines can become conscious, that the machines can channel these, he calls them patterns.
01:59:34.000 And, you know, we'll see if he's right.
01:59:36.000 But he's doing amazing work.
01:59:38.000 Have you seen where they've taken human brain tissue and they've taught it how to play Doom?
01:59:43.000 No, I haven't seen that.
01:59:45.000 I know they make these organoids out of brain tissue now.
01:59:47.000 Yeah, they've taken human brain tissue somehow or another through some process, and it'll play the video game Doom.
01:59:58.000 How does it work? It's 800,000 human brain cells floating in a dish, never had a body, never seen light, never felt anything.
02:00:06.000 They just learned how to play a video game.
02:00:08.000 It's not a metaphor.
02:00:09.000 That's literally what happened.
02:00:12.000 So what's their interface, though, with the world?
02:00:15.000 Like, do they have thumbs?
02:00:16.000 No.
02:00:17.000 Well, I guess it just, well, it's really accurate, so I guess it doesn't need them.
02:00:21.000 You know, it's just using the brain cells to move whatever the cursor is on the video screen that would be the hand and pointing it at the targets and executing the strike.
02:00:34.000 Wow.
02:00:35.000 So it knows how to use the game, and it knows the objectives of the game, obviously, because it knows to shoot the bad guys.
02:00:41.000 It has an understanding of the weapons.
02:00:44.000 How does it get that knowledge?
02:00:45.000 How is it programmed?
02:00:46.000 Also, does it switch weapons?
02:00:49.000 Doom, the thing about Doom is you get multiple weapons.
02:00:51.000 You have to run around and pick them up.
02:00:53.000 So you're given one weapon, which is the least powerful weapon.
02:00:57.000 And the game is when you're playing Deathmatch, the game is you're running around trying to grab as many weapons as you can and armor while your opponent is also running around this map.
02:01:08.000 So you memorize the map.
02:01:10.000 So there's a map that is like very confined corridors and these atriums and all these different places where you'll do battle.
02:01:19.000 And so you run around.
02:01:20.000 The key is surviving long enough while this person's chasing you so that you can gather enough armor and weapons.
02:01:27.000 And someone with a really good understanding of the map tries to cut you off before you can get to the stuff so they can kill you before you accumulate enough armor and weapons.
02:01:36.000 So I'm curious to know whether or not it's playing just with the pistol that you did at the very beginning or it's accumulating weapons.
02:01:42.000 I'm sure it's just playing like the first single-player level, not against anybody.
02:01:47.000 Right, but will it be able to?
02:01:49.000 That's what's interesting.
02:01:50.000 If it can teach it to do that, if it understands the objective of these are the monsters that are coming at you, you have to shoot them.
02:01:57.000 It only took a week to do this.
02:01:58.000 Oh, wow.
02:02:02.000 So brain cells on a chip.
02:02:03.000 So this is neuromorphic computing.
02:02:08.000 The question I have about it is how do you keep them alive?
02:02:12.000 Putting them on a chip, but like, what do you feed them?
02:02:12.000 Right.
02:02:14.000 Right.
02:02:16.000 I mean, they have metabolic needs, right?
02:02:18.000 They did something similar with fruit flies.
02:02:22.000 I had that ready, too.
02:02:24.000 It's different, but it's equally weird.
02:02:28.000 The cells from the cells.
02:02:30.000 I can't believe it.
02:02:33.000 What is this?
02:02:34.000 They've modeled a fruit fly's brain.
02:02:37.000 And I mean, this is the video of it.
02:02:38.000 The article is here.
02:02:40.000 So, startup claims first full brain emulation of a fruit fly in a simulated body.
02:02:46.000 Connected a complete fruit fly brain emulation to a virtual body, producing multiple behaviors for the first time.
02:02:53.000 Emulation covers over 125,000 neurons and 50 million synapses.
02:02:58.000 Oh, what?
02:03:00.000 Eon plans to emulate a mouse brain with 70 million neurons.
02:03:04.000 Long-term goal is simulating a human brain.
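The scale jump in those quoted figures can be checked with quick arithmetic. A minimal sketch, using the article's fly and mouse numbers plus a commonly cited human neuron count (roughly 86 billion, an outside estimate not from the article):

```python
# Quick arithmetic on the emulation figures quoted above.
# Fly and mouse numbers come from the article; the human neuron
# count is a commonly cited outside estimate, not from the article.

FLY_NEURONS = 125_000
FLY_SYNAPSES = 50_000_000
MOUSE_NEURONS = 70_000_000
HUMAN_NEURONS = 86_000_000_000  # assumed standard estimate

print(f"fly synapses per neuron:   {FLY_SYNAPSES / FLY_NEURONS:.0f}")
print(f"mouse-over-fly scale-up:   {MOUSE_NEURONS / FLY_NEURONS:.0f}x")
print(f"human-over-mouse scale-up: {HUMAN_NEURONS / MOUSE_NEURONS:.0f}x")
```

On these numbers, the planned mouse emulation is already a 560-fold jump over the fly, and a human brain is another three orders of magnitude beyond the mouse.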
02:03:07.000 Oh, boy.
02:03:08.000 Yes, I guess they mapped the brain and it's doing fruit fly things.
02:03:12.000 But it's interesting, they're using neurons, right?
02:03:14.000 They're not using transistors.
02:03:16.000 And neurons are so far superior to transistors.
02:03:20.000 One neuron can have 10,000 connections to other neurons, right?
02:03:24.000 A transistor is two or three or five, maybe at the most.
02:03:28.000 A single neuron can do everything that a deep neural network can do on a computer.
02:03:32.000 One neuron.
02:03:34.000 So there's a level of complexity that we're not yet anywhere near.
02:03:39.000 And that's why they're doing this using neurons rather than transistors.
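The connectivity point above can be made concrete with a toy model. A minimal leaky integrate-and-fire sketch: the 10,000-input figure comes from the conversation, and every other constant is illustrative, not measured biology.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SYNAPSES = 10_000  # the "10,000 connections" figure from the conversation
THRESHOLD = 1.0      # membrane potential at which the cell fires (illustrative)
LEAK = 0.9           # fraction of potential retained each time step (illustrative)

# Per-synapse strengths, slightly excitatory on average (illustrative values).
WEIGHTS = rng.normal(0.0005, 0.001, N_SYNAPSES)

def simulate(steps: int, rate: float) -> int:
    """Count output spikes over `steps` time steps of random presynaptic input."""
    v = 0.0
    spikes = 0
    for _ in range(steps):
        inputs = rng.random(N_SYNAPSES) < rate  # which synapses fire this step
        v = LEAK * v + WEIGHTS @ inputs         # leak, then integrate all inputs
        if v >= THRESHOLD:
            spikes += 1
            v = 0.0                             # reset after a spike
    return spikes

print(simulate(steps=1000, rate=0.05))
```

With these toy weights the cell charges toward threshold and fires every few steps; the point is only that a single unit already integrates thousands of inputs per time step, which a single transistor, with its handful of terminals, cannot.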
02:03:42.000 Didn't they find neurons in the human heart?
02:03:46.000 There are neurons in the heart.
02:03:47.000 They're neurons in the gut.
02:03:50.000 There's a whole gut-brain axis.
02:03:52.000 I'm working on a piece about that now.
02:03:57.000 That's a real problem with people with poor diets, right?
02:03:59.000 Yeah.
02:03:59.000 I mean, people with poor diets don't eat enough plants, basically, and their microbiome loses its diversity.
02:04:08.000 But the microbiome is like another organ, even though it's full of other species, right?
02:04:15.000 It's got like 10 trillion bacteria and fungi and stuff like that.
02:04:20.000 And all of them are metabolizing and producing chemicals.
02:04:24.000 It's like a little drug factory, hundreds of thousands of compounds.
02:04:28.000 Many of those compounds affect your mood.
02:04:31.000 Many of those compounds affect all sorts of things about you.
02:04:36.000 And so we're just learning about this connection.
02:04:39.000 The vagus nerve seems to be what connects the brain to the gut and the heart.
02:04:45.000 The vagus nerve is like all the organs are connected to the head by that nerve.
02:04:51.000 So, yeah, and the first neural system was in the gut.
02:04:58.000 You have these simple animals that are just tubes with bacteria.
02:05:02.000 And the first kind of neural activity was about regulating digestion.
02:05:07.000 Everything else comes later.
02:05:09.000 If plants are necessary for that function, what happens with people that are on the carnivore diet?
02:05:14.000 Have you ever looked at any of that?
02:05:16.000 Yeah, I have.
02:05:17.000 I mean, so the microbes in your gut eat fiber, which is to say the walls of plants, plant cells.
02:05:25.000 If you only eat meat, if you're on a keto diet or something like that, you're essentially starving the microbes.
02:05:33.000 And there's a cost to that.
02:05:36.000 I don't think people pay nearly enough attention to that.
02:05:39.000 Well, how come many people that experience depression and anxiety find relief of that by a carnivore diet?
02:05:46.000 Yeah, but many people find relief adding a lot of plants to their diet, too.
02:05:50.000 So I don't know if that's a placebo effect or what.
02:05:53.000 I don't know that that's a true biological phenomenon.
02:05:58.000 It may be.
02:05:58.000 It may be.
02:05:59.000 Because it seems to be a lot of people.
02:06:00.000 People who change anything feel a lot better, right?
02:06:02.000 If they take some step.
02:06:04.000 But I'm not talking about change.
02:06:05.000 I'm talking about people that have been on it long term.
02:06:07.000 Like there's the people that are really in the carnivore diet community.
02:06:11.000 There's examples of people that have been on it for 25, 30 years, and they're really healthy.
02:06:14.000 Yeah.
02:06:15.000 It's odd.
02:06:16.000 So if you need plants.
02:06:18.000 Yeah.
02:06:19.000 Well, you need plants to have a healthy microbiome.
02:06:21.000 And a healthy microbiome, and the thing about it is that every different plant has a slightly different fiber that feeds a different bug.
02:06:29.000 But is it the only way to have a healthy microbiome?
02:06:31.000 Have you ever looked into any of these people that are on the carnivore diet?
02:06:34.000 No, I should.
02:06:34.000 I should as part of this.
02:06:35.000 This is fascinating because there's a lot of them.
02:06:38.000 There's a lot of people that claim all sorts of benefits, relief from autoimmune issues, all sorts of different things that it fixes.
02:06:46.000 Because an unhealthy microbiome leads to autoimmune problems.
02:06:50.000 What happens is that the gut wall, so when the microbes don't have plants to eat, they start eating the mucus layer that covers your, that insulates your large intestine.
02:07:02.000 And they're eating away essentially at you.
02:07:05.000 And then you get leaky gut syndrome.
02:07:08.000 And that's when bacteria can actually get into the bloodstream, cause a powerful immune reaction, and that inflames the whole body.
02:07:17.000 So the reason you want a healthy microbiome is to keep that gut barrier healthy and get the benefit of these chemicals.
02:07:25.000 Butyrate is a chemical that the microbes produce that's really important for mood and a lot of things, and the body can't produce it.
02:07:33.000 So it's kind of interesting.
02:07:35.000 We're dependent on these other species that live within us.
02:07:40.000 Yeah, we're a whole ecosystem.
02:07:41.000 Yeah, we are.
02:07:42.000 Holobiont is, I think, the term for it. Like, we go through evolution together with these, you know, 10 trillion microbes.
02:07:53.000 It's really interesting.
02:07:55.000 The newest research is the links between the microbiome and the mind.
02:07:58.000 And, you know, most of the serotonin, you know, the neurotransmitter serotonin is produced in the gut, not in the brain, which is kind of wild.
02:08:08.000 Yeah.
02:08:10.000 And there are all these other compounds that are produced that influence our mood.
02:08:14.000 And so, yeah, I should look at the keto thing.
02:08:17.000 I'm just in the middle of researching this now.
02:08:19.000 Yeah, the keto is one thing, but the carnivore diet, these people are just eating only meat and eggs, and that's all they eat.
02:08:24.000 Yeah.
02:08:25.000 And there's a lot of really healthy people that are doing it.
02:08:29.000 I kind of follow that, but I eat a lot of fermented food on top of that.
02:08:32.000 Well, fermented food is a powerful benefit for the microbiome.
02:08:40.000 There was a study done at Stanford a couple years ago that they showed that people who ate fermented food, it reduced their inflammation significantly.
02:08:52.000 Interestingly enough, it's not the bacteria in the fermented food, it's the metabolites, as they're called.
02:09:01.000 The bugs are producing acetic acid and butyrate and other acids and essential acids.
02:09:09.000 And the fact you're getting those seems to be what's having the positive effect.
02:09:14.000 But people who eat lots of fermented food benefit enormously, and maybe that's taking care of the problem if people on a carnivore diet are eating a lot of fermented food.
02:09:23.000 That's the RFK Jr. diet, too, right?
02:09:25.000 Well, I don't know.
02:09:26.000 I mean, I think he does it that way, but I've been doing it that way.
02:09:30.000 I love it anyway.
02:09:30.000 I'm a Kim Chi freak.
02:09:31.000 I love that stuff.
02:09:32.000 Yeah, me too.
02:09:33.000 But what's interesting is that it controls your mood.
02:09:37.000 That's what's interesting, is that your microbiome has a massive impact on your mood.
02:09:42.000 And why?
02:09:43.000 I mean, is it just an accident?
02:09:45.000 Or some people think these microbes are manipulating you to get what they need.
02:09:51.000 So they regulate your appetite too.
02:09:55.000 And so it may be that they're inspiring you to eat certain things that they want.
02:10:01.000 That actually makes sense because one of the more interesting things about a carnivore diet, and I've done pure carnivore for months at a time, is that you don't have the same hunger pangs.
02:10:11.000 Not nearly, not even close.
02:10:13.000 The hunger that you get when you're on a high carbohydrate diet is like you get hangry.
02:10:18.000 You're like, oh my God, I'm so hungry.
02:10:19.000 I have to eat right now.
02:10:20.000 You never get that with a carnivore diet.
02:10:22.000 Probably because it's digested much more slowly.
02:10:26.000 I think there's a little bit of that, but it's also you don't have the insulin spike.
02:10:28.000 You don't have to do that.
02:10:29.000 That's true.
02:10:29.000 Yeah, that's true.
02:10:30.000 There's not this.
02:10:31.000 Have you ever worn a glucose meter?
02:10:33.000 No, I haven't.
02:10:34.000 So interesting.
02:10:35.000 I was wearing one for two months.
02:10:38.000 I mean, it'll just make you crazy.
02:10:41.000 That's the thing with all those wearables.
02:10:43.000 You just start going over every aspect of your sleep.
02:10:46.000 So, you know, you have some pasta and like, but if you take a walk right after, you can moderate it.
02:10:56.000 And it doesn't take a lot of exercise to use up that glucose and get the muscles to draw it in.
02:11:03.000 So you can, it's a very interesting experiment because it changes your behavior.
02:11:07.000 In the same way, if you have a step counter, like you're more likely to park further away from the store to get another 100 steps.
02:11:14.000 If you have a glucose meter, you're more likely to exercise after a meal, which is when it does the most benefit.
02:11:20.000 Well, in that sense, it's great because it does give you data that you can act on.
02:11:26.000 The problem is people get addicted to that data and then it starts becoming a new video game that they're playing.
02:11:31.000 Yeah, exactly.
02:11:32.000 They're constantly in this anxiety, worrying about your sleep and worrying about your this and your that.
02:11:38.000 Yeah.
02:11:39.000 You also learn that like if you have fat with your carbs, it kind of blunts the effect.
02:11:44.000 Sure.
02:11:44.000 So, you know, butter with bread.
02:11:46.000 Yeah, butter with bread or olive oil on pasta.
02:11:49.000 There's a reason for that.
02:11:49.000 All those things.
02:11:51.000 I love when culture figures stuff out before the scientists do.
02:11:54.000 I remember that when I was writing about food a few years ago, this study came out and everybody's really excited that they discovered that lycopene, which is this really important antioxidant in tomatoes, can't be accessed by the body in the absence of fat.
02:12:09.000 So, oh, olive oil on tomatoes.
02:12:11.000 What a great idea.
02:12:12.000 The grandmas figured that out hundreds of years ago.
02:12:15.000 That's crazy.
02:12:16.000 Yeah.
02:12:17.000 So there's a lot of wisdom in cultural food preferences, the combinations that we have, you know, like buttering bread.
02:12:23.000 I mean, all these things.
02:12:24.000 And how do people figure it out?
02:12:26.000 Have you seen the work they've done on nattokinase?
02:12:29.000 I'm not sure if I'm saying it right.
02:12:31.000 And its impact on arterial plaque.
02:12:34.000 No.
02:12:35.000 Hugely beneficial.
02:12:37.000 What is it?
02:12:38.000 It comes from fermented seaweed, from natto.
02:12:42.000 So the Japanese use fermented seaweed.
02:12:46.000 So in meals, they've isolated it into a supplement.
02:12:50.000 And this supplement, nattokinase, they've shown that it reduces a massive amount of arterial plaque.
02:12:57.000 So here it is.
02:12:58.000 High-dose nattokinase, particularly at 10,800 FU/day, has been shown to effectively manage arteriosclerosis by reducing carotid artery plaque size by 36% or more, decreasing intima-media thickness, and improving lipid profiles.
02:13:19.000 It acts as a potent fibro, what's it fibrinolylic?
02:13:23.000 How's that word?
02:13:24.000 I don't know that word.
02:13:26.000 Fibrinolytic.
02:13:30.000 Yeah.
02:13:30.000 Fibrinolytic agent that may also break down amyloid plaques.
02:13:34.000 Isn't that fascinating?
02:13:35.000 Yeah, that is.
02:13:36.000 So natto is that's not from seaweed.
02:13:39.000 What is it?
02:13:40.000 It's a bacteria that they ferment soybeans with.
02:13:43.000 Oh, that's right, soybeans.
02:13:44.000 And it's this kind of mucusy looking stuff.
02:13:47.000 I mean, I like it.
02:13:48.000 I eat it.
02:13:49.000 It tastes good.
02:13:49.000 It's in Japanese restaurants.
02:13:51.000 Well, that's good.
02:13:51.000 Yeah.
02:13:51.000 Right.
02:13:52.000 So you can get a supplement now.
02:13:53.000 So you don't have to taste it if you don't like it.
02:13:55.000 Yeah.
02:13:55.000 But isn't that crazy?
02:13:56.000 They figured that out.
02:13:57.000 Like the people that were fermenting things, it wasn't just to prolong its shelf life.
02:14:02.000 No, oh no.
02:14:03.000 I mean, the whole, I mean, every culture has fermented foods.
02:14:07.000 And yes, it probably began as a way to preserve foods, but then it became a very important part of people's health.
02:14:14.000 But it's also like healthy for your brain, which is really crazy.
02:14:17.000 Like that diet is actually good for thinking.
02:14:19.000 It's good for helping your digestive system.
02:14:22.000 It's good for anxiety.
02:14:23.000 It's good for mood and depression.
02:14:26.000 Weird.
02:14:27.000 All right.
02:14:27.000 I'm going to look into it.
02:14:29.000 It's fascinating.
02:14:29.000 Yeah.
02:14:31.000 Anything else?
02:14:32.000 Should we keep going on this?
02:14:34.000 There's so many different things to discuss, and I want people to buy the book.
02:14:36.000 Obviously.
02:14:37.000 Thank you.
02:14:38.000 The book was like a great adventure.
02:14:39.000 I mean, it really was.
02:14:40.000 You know, I started this book with no idea where I was going.
02:14:44.000 I started the way you start an interview, just curiosity, no destination.
02:14:49.000 And it was, I learned a lot about a lot of different things.
02:14:53.000 I learned a lot about feelings.
02:14:54.000 I learned a lot about the self.
02:14:57.000 And it changed how I looked at things.
02:14:59.000 It really did.
02:15:00.000 I mean.
02:15:01.000 When you sit down, I mean, you've written some amazing books, but I always want to know, like, what is what's the impetus?
02:15:09.000 Like, what starts you on the first steps?
02:15:12.000 Like, what?
02:15:13.000 Questions.
02:15:14.000 Yeah, which is to say curiosity.
02:15:16.000 Oh, and I teach writing, and I teach my students this.
02:15:19.000 Questions are more interesting than answers, very often.
02:15:22.000 And questions have suspense built into them, right?
02:15:26.000 It turns everything into a detective story if you frame the question properly.
02:15:26.000 What's the answer?
02:15:31.000 So if you read any of my books or even articles, I'm kind of an idiot on page one.
02:15:37.000 I don't know something that I want to know, and I have questions.
02:15:42.000 And then the story, the narrative becomes my figuring it out or trying to figure it out and going to this person and doing this kind of experiment and that sort of thing.
02:15:52.000 That's the way I like to write.
02:15:53.000 I mean, if I knew the answers when I started, it'd be boring.
02:15:56.000 Well, I think that's why your books resonate with people so much because you take them on this journey with you.
02:16:00.000 Yeah, instead of lecturing.
02:16:02.000 I hate books that lecture at me.
02:16:03.000 I really do.
02:16:06.000 And lots of books do that.
02:16:07.000 They have their conclusion on page one, and then they're just kind of beating you over the head with it for 300 pages.
02:16:13.000 Stuffing it down your throat.
02:16:14.000 Yeah, I don't like to do that.
02:16:15.000 No, I like taking people on the journey with me.
02:16:18.000 Well, it's interesting that you're saying this because in a sense, you are interacting in a pleasant way with other people's consciousness.
02:16:28.000 Yeah.
02:16:28.000 So I give, this is a really interesting issue you just brought up.
02:16:32.000 How is my taking over your consciousness as you read my books different than social media or some of these other ways I'm saying are polluting our consciousness?
02:16:44.000 I think it's very collaborative when you're reading.
02:16:44.000 Right.
02:16:47.000 All you have are these black marks on a page.
02:16:50.000 It's kind of amazing, these letters.
02:16:53.000 And your consciousness conjures up the ideas that I'm putting out there or the story I'm putting out there.
02:17:02.000 But it's dual consciousness, I think.
02:17:05.000 You're letting me in.
02:17:07.000 It's a voluntary process.
02:17:09.000 And you're bringing a lot to the table.
02:17:11.000 You're bringing your associations.
02:17:13.000 I'm not fully describing somebody.
02:17:15.000 I'm just giving you a few clues.
02:17:17.000 And then you're conjuring a picture of a character.
02:17:19.000 So I think it's a very active form of consciousness when you read.
02:17:25.000 I think that's true, too, when you go to a movie, too.
02:17:29.000 You're basically saying, I'm turning over my consciousness for a period of time to someone, and I want to because they have an interesting head.
02:17:39.000 And I'm going to give them this space.
02:17:41.000 But you're still in control.
02:17:44.000 You're deciding.
02:17:45.000 So I think there's a real distinction in how we share our consciousness with other people.
02:17:51.000 And we need to do that.
02:17:56.000 I said early on in the conversation that the breach between two consciousnesses is this wide thing.
02:18:02.000 William James wrote about this.
02:18:03.000 Marcel Proust wrote about this.
02:18:05.000 He said, we're all like islands and we each have our own hidden signs and we have an inner obscurity, he said.
02:18:14.000 How do we connect?
02:18:15.000 And now we have language, but art is really the way that one, you know, that we mind-meld different consciousnesses.
02:18:22.000 Like art allows you, if I look at a Rothko painting or read a great novel, I am expanding my consciousness, right?
02:18:32.000 I'm letting another one in and I'm breaking my isolation.
02:18:38.000 And that's such a beautiful, powerful thing.
02:18:40.000 And art is how we ferry ourselves from one consciousness to another.
02:18:45.000 And that's very different than like scrolling on social media where you're conscious but minimally so.
02:18:50.000 Well, very, very different.
02:18:51.000 It's also there's something about great writing that you, the better you are at expressing yourself in a way that is going to get into someone's head, whether it's through nonfiction or through fiction, the more exciting it is to the person that's receiving it.
02:19:10.000 So the more skillful you are at disseminating these ideas, the more it resonates with the person that's reading it.
02:19:18.000 And writers have tricks to do this.
02:19:20.000 Suspense is one of them.
02:19:21.000 Like what happens next?
02:19:23.000 It's so basic.
02:19:24.000 We want to know what happens next because our curiosity is piqued.
02:19:28.000 And there's creating character.
02:19:32.000 I mean, we have all these kind of tricks to infiltrate your brain.
02:19:37.000 Yeah.
02:19:38.000 So anyway, it's a mysterious and kind of wonderful process.
02:19:44.000 And yeah, I feel privileged.
02:19:48.000 I get to do it.
02:19:49.000 Well, it is a very cool thing that you do.
02:19:52.000 One last question about consciousness itself.
02:19:55.000 When you're looking at these people that are studying it and trying to get to the root of it and trying to figure out what it is, and there's all these options that we discussed earlier, do you lean in one way or another?
02:20:07.000 Do you think you have your own personal map of what's going on?
02:20:14.000 No.
02:20:14.000 I mean, I didn't draw a big conclusion.
02:20:18.000 But I ended up, I started as a materialist.
02:20:22.000 I kind of assumed.
02:20:23.000 When you started this book?
02:20:24.000 Yeah.
02:20:25.000 Really?
02:20:25.000 Yeah.
02:20:26.000 Even after psychedelics experience.
02:20:27.000 Even after psychedelic experience, I mean, they kind of opened the door a crack to other ways of thinking.
02:20:32.000 And at the end of How to Change Your Mind, I did talk a little bit about that, other concepts of consciousness.
02:18:38.000 But I kind of assumed the consensus of most scientists: materialism, that everything can be reduced to matter and energy.
02:20:50.000 This is the faith of our time, you know, for the last couple hundred years.
02:20:54.000 By the end of the book, consciousness is a challenge to that idea.
02:21:00.000 And that idea, which is our scientific paradigm, is tottering now.
02:21:05.000 I think there's some real reasons to look beyond materialism.
02:21:10.000 And so I ended up with the door wide open to other ideas.
02:21:16.000 I didn't settle on one.
02:21:18.000 I don't know how to prove one or the other, but they're equally plausible.
02:21:24.000 Do you anticipate in our lifetime or in any lifetime cracking that puzzle?
02:21:29.000 That anyone can crack that puzzle?
02:21:32.000 I don't.
02:21:33.000 I think we don't have the right kind of science.
02:21:36.000 Our science, as I said earlier, is really stuck in this mode.
02:21:42.000 It started with Galileo, right?
02:21:44.000 I mean, he, to save his ass, basically said, we're going to leave subjective things, the soul qualities.
02:21:52.000 That's all the church.
02:21:54.000 We're going to just do measurable, objective, third-person science.
02:21:58.000 And it's been incredibly powerful, and it's taught us incredible things and given us incredible technology.
02:22:04.000 But it doesn't deal with the stuff we gave to the church.
02:22:09.000 And now they're trying to take it back and work on it.
02:22:12.000 And they've only been at it for like, you know, a couple decades, really, this serious scientific examination of consciousness.
02:22:20.000 But we just may not have the right science.
02:22:22.000 And one of the things I explore in the book is like, how would you bring in subjective experience to this objective science?
02:22:30.000 And Michael Levin, the biologist I was talking about who makes those xenobots, says, to understand consciousness, you have to change yourself.
02:22:39.000 In other words, to understand anyone else's consciousness, you have to experience it.
02:22:43.000 Therefore, you're changing your own.
02:22:45.000 That's a whole different scientific paradigm.
02:22:48.000 In the scientific paradigm, you're unchanged by whatever you do, right?
02:22:52.000 It's totally objective.
02:22:54.000 So it may take a scientific revolution to really unlock the secret, the mystery of consciousness.
02:23:02.000 Wouldn't it be a conundrum if AI is what cracks consciousness?
02:23:06.000 I was having the same thought.
02:23:08.000 Like, maybe AI has another approach.
02:23:13.000 I think it's going to have to learn how to feel.
02:23:16.000 Well, it seems like it already feels like it wants to live.
02:23:19.000 Yeah, and it feels uncomfortable.
02:23:20.000 Yes.
02:23:21.000 I don't think its feelings are real.
02:23:23.000 I do.
02:23:24.000 I think simulated thinking is real thinking.
02:23:28.000 Like, you know, it can play chess.
02:23:29.000 It can make things happen in the world.
02:23:31.000 Simulated feeling is not real feeling.
02:23:33.000 It doesn't have a soul.
02:23:34.000 It doesn't have a soul.
02:23:36.000 Let's keep it that way.
02:23:36.000 Thank you, Michael.
02:23:37.000 I really enjoyed this.
02:23:38.000 You're awesome.
02:23:38.000 Thank you very much.
02:23:39.000 I really love your books, though.
02:23:39.000 Thank you, Joe.
02:23:41.000 It's always a treat.
02:23:42.000 All right.
02:23:43.000 Bye, everybody.
02:23:44.000 Bye.