The infamous Jamie Double Finger Gun is back from his 9-month hiatus from eating meat. We talk about why he stopped eating meat, the benefits and drawbacks of vegetarian and vegan diets, and how diet affects hormones and testosterone production. This episode is a must-listen for vegetarians and vegans looking to make a change in their diet. Enjoy, and spread the word to your friends and family about this podcast! -Jon Sorrentino
00:00:36.000It's a wonderful cult of people that want to take care of animals and be nice to animals, but they're very tribal.
00:00:43.000Very tribal, very cult-like, and if you say anything that's negative against vegans, they gang up.
00:00:47.000I go to forums and I read the things they say.
00:00:50.000They organize like little troll attacks and they make YouTube videos.
00:00:54.000It's kind of hilarious, like from a psychological standpoint.
00:00:58.000Yeah, I mean, you know, obviously I don't want it to become a new religion or my one religion, but there is a moral high ground to the position that I find very attractive because I felt like a hypocrite as a meat eater.
00:01:12.000Now, and I don't think this necessarily extends to someone like you who hunts and feels okay about hunting.
00:01:18.000I don't have an argument against hunting the way I do against factory farming, or more or less any of the way we get meat.
00:01:28.000You know, the environmental implications of it.
00:01:38.000Once you go far enough into the inquiry, you feel like an asshole not being sensitive to these concerns and just ignoring how you're getting your food three times a day.
00:01:52.000For me, I'm sure there's individual variation, and I'm not the smartest vegetarian in the world in terms of how I prepare my food and how attentive I am to it.
00:02:01.000So the onus is somewhat on me, but I'm not totally sure it's the healthiest thing for me yet.
00:02:55.000Dietary cholesterol and saturated fats, which are critical for hormone production.
00:02:59.000And it's one of the reasons why people, when they go on an all-vegetable diet, if they're not really careful with coconut oil, you gotta eat a lot of coconut oil.
00:05:26.000It's just that when you read the details of how our dairy and eggs are gotten...
00:05:32.000Arguably as bad, if not worse, than much of the meat production.
00:05:36.000So moving from eating meat to being a vegetarian is in some ways a merely symbolic move, ethically, if you really wanted to not participate in the machinery.
00:05:50.000In vegetarianism, there is an issue with how they gather food.
00:05:55.000I mean, there's a giant issue that people don't want to take into consideration.
00:05:58.000It's like, how are they growing all this food?
00:06:01.000How are they growing all these plants?
00:06:02.000Well, one of the things they're doing is they're displacing wildlife.
00:06:05.000They're chewing up this ground and these combines, if you eat grain in particular, combines indiscriminately just chew up all that stuff and they get deer fawns, mice, rabbits, rats, rodents, untold amount of bugs if you want to get really deep.
00:06:21.000I mean, there's no, like, being a vegan and being a vegetarian is most certainly less cruel and less harmful overall, but it's not karma-free.
00:06:31.000It can't be, unless you're growing your own stuff.
00:06:34.000If you can grow all your own vegetables and you essentially live on a small farm...
00:06:39.000Yeah, you could do it and really feel good.
00:06:42.000But if you're buying it in a store, you're participating in factory farming whether you like it or not.
00:06:49.000You're just participating in vegetable farming.
00:07:11.000But it was a fascinating conversation because, basically, what's fascinating to me on two levels is: one, it's fascinating that we're on the cusp of being able to produce actual, biologically identical meat that is totally cruelty-free.
00:07:27.000There's no implication of cruelty at all in it, right?
00:07:32.000And you would just grow this in a vat, the way you brew beer, essentially.
00:08:16.000I mean, that's the natural way to get meat.
00:08:19.000And if I told you this is grown in a vat by a guy in a white lab coat and has no xenoviruses and no bacteria, no antibiotics were used to plump this thing up, and it's just the cells you want, people start to...
00:08:34.000There's kind of an ick feeling that I think we're going to get over, but it's interesting psychologically that it's there in the first place.
00:08:40.000Did you get that ick feeling, or are you just talking to the people that cultivated it?
00:09:42.000I wonder how many people would go back to eating meat...
00:09:47.000If they could raise it this way, like how many people who had gone vegan would go back to eating this scientifically created, lab-created beef?
00:10:43.000When you talk about human history, good lord, imagine if you're a guy who spent $3 billion of it 15 years ago, and you're like, God, if I just fucking waited, I would have saved so much money.
00:10:55.000We don't anticipate that when we think of how difficult it is to solve certain problems.
00:11:03.000Ray Kurzweil, who I think is a bit of a cult leader and a carnival barker on many topics, this point he makes again and again I think is quite valid, which is...
00:11:44.000Did you see that family, a couple that runs a bunch of vegan restaurants, and they decided to start eating meat again, even though they run vegan restaurants?
00:12:41.000So everyone, now forgive me if this is no longer true, but at one point every employee there did the Landmark Forum, you know, the successor to Est.
00:13:54.000Actually, the classic case of this, I think this wasn't Est.
00:13:58.000This was the Forum, which is now the successor to Est.
00:14:01.000They were at one point hired to do coaching of various companies, and I think they were hired by the FAA. I wrote about this in one of my books in a footnote.
00:14:12.000It was the FAA that hired the Forum to coach their administrators.
00:14:21.000And one of the exercises they forced these guys to do, and they certainly were mostly guys, they chained the boss to his secretary for the whole day, and they had to go to the bathroom together.
00:14:36.000This sort of ego-annihilating experience.
00:14:39.000Anyway, this is the recipe, or one of the recipes that Est has pioneered.
00:14:47.000This is not to say that people don't go to the forum and get a lot out of it.
00:14:51.000But every employee of this restaurant apparently has gone to or used to do the forum.
00:15:00.000So it's a very, you walk into the restaurant and your interaction with people in the restaurant is unlike most restaurants.
00:15:07.000People are just very, you know, lots of eye contact and it's just an intense restaurant.
00:15:14.000And also, the stuff on the menu, this was just so lacerating, I could never comply, but the name of everything on the menu is like, I am humble, I am magical, I am self-assured.
00:15:27.000So you have to, you're meant to order it that way, like I am humble.
00:16:48.000Okay, herding ruminants are our best tool to restore fertility to the earth, keep the earth covered, and reverse desertification and climate change, he wrote.
00:17:31.000And if you've got them corralled and you're just sticking that No Country for Old Men thing in their head and killing them with it, I mean, that's what they're doing, right?
00:17:39.000If you're doing that, I don't know if you're allowed to call yourself a predator.
00:17:55.000We're talking about factory farming and these weird businesses where they cram all these animals into these entirely too small spaces and they live in their own feces and urine.
00:18:07.000And I'm sure you've seen that drone footage from the pig farm.
00:18:11.000I've seen a lot of pig farm footage, but I don't know if I've seen drone footage.
00:18:14.000I don't think I've seen drone footage.
00:18:24.000And if you don't know what that means, ag-gag laws are laws that make it a crime to show the abuse of these animals, to show factory farming, because it'll affect the business so drastically and so radically when people are exposed to the truth that they've made it illegal.
00:19:11.000There's the cruelty aspect, which is actually three aspects.
00:19:15.000There's the cruelty aspect, which is horrific.
00:19:18.000There's the environmental energy use issue, which is also just totally untenable.
00:19:25.000And then there's just the health concern. If you have any concern about your own health and the contamination, I mean, just getting all these antibiotics that weren't prescribed to you, that you don't want, that are still getting into you through this food chain...
00:20:26.000The chickens actually, I think largely because they're so small and more of the process of killing them is automated that they almost get the worst of it because they're, I mean, they're just, they're like, you know, they're getting singed before they're stunned, before they're, I mean, it's just not...
00:20:42.000I mean, at least with a cow, you've got a single person interacting with a single cow, however briefly, and there's less chaos in the machinery.
00:20:50.000But chickens just get pulverized, and arguably one of the greatest pain points ethically comes around just the egg industry, because fully half the chickens,
00:21:06.000the male chicks, just immediately get thrown into literally like a meat grinder because they're not the same chicken that is a broiler chicken.
00:21:17.000I mean, genetically they're not the same.
00:21:18.000They don't grow into the same kind of chicken that would be useful.
00:21:23.000And so they don't lay eggs and you don't eat them.
00:21:26.000And so they just get literally fed into, like, a wood chipper alive.
00:21:34.000And again, this is in some ways an artifact of them being so small that it would be just too much of a hassle to stun them appropriately, right?
00:21:44.000Imagine if they made a law where you had to bury them all and put little crosses in the ground.
00:21:52.000I mean, I was on the highway and there was a chicken truck that was passing me.
00:21:57.000One of those trucks that's containing live chickens.
00:22:00.000And they're just stacked, just stacked in cages on top of each other.
00:22:04.000Cage on top of cage and just shitting on each other.
00:22:06.000And I'm watching this and I'm like, it's so weird that we're allowed to do that with some animals.
00:22:11.000Like, if you were doing that with horses, people would lose their fucking minds.
00:22:14.000If you had dogs in boxes like that, stacked in the open air on the highway and you're driving down the road with them, people would freak out.
00:22:21.000But no one bats an eye at these chickens.
00:22:24.000This chicken truck, chickens are a weird thing.
00:22:27.000We have a hierarchy of animals that we love, and we're not really big into reptiles, not really big into birds.
00:22:33.000It has something to do, I think, with...
00:23:32.000There was an article today about some woman who had rescued a lobster from a restaurant and dropped it off in the ocean and the journey of this all and how you should think of this lobster as something with a cute face.
00:23:45.000And if you did, then you would appreciate her efforts and understand this lobster, even though lobsters aren't even capable of feeling pain; they don't have enough of a nervous system.
00:23:54.000Their nervous system is not strong enough for them to feel pain.
00:23:56.000They don't have the same sort of sensors.
00:24:02.000Actually, when I decided to become a vegetarian, I said at some point, maybe I will just do a kind of taxonomy, a comparative neuroanatomy across species, just to see where we could plausibly say, you know, the suffering really begins to matter.
00:24:41.000I think anything that can behave, that can move, right, and move away from a stimulus, there's an evolutionary rationale for it to experience pain. The question of consciousness is difficult, you know, where consciousness emerges, and I think there are clearly unconscious pain mechanisms.
00:25:01.000I mean, the same mechanisms that give us pain at a certain level, we can be unconscious and yet they can be just as effective.
00:25:09.000I mean, all of our reflexes are like that.
00:25:10.000So, you know, if you touch a hot stove, you're pulling your hand away actually before you consciously register.
00:25:16.000And that's as it should be because you're faster that way.
00:25:20.000So it's possible to have unconscious pain, but anything that can move very quickly is going to evolve an ability to move away from noxious stimuli.
00:25:35.000There's every reason to make that, in evolutionary terms, as salient and as urgent as possible.
00:27:27.000Ethically speaking, the only problem there, and it's a huge one, is that it forecloses all of the possible happy futures most of us or all of us or at least some of us were going to have.
00:27:39.000So all of the good things that we could have done over the next million years aren't going to get done.
00:27:45.000All of the beauty, all of the creativity, all of the joy, all of that just gets canceled.
00:27:50.000And so leaving the painfulness of pain aside...
00:27:56.000Why is it wrong to deprive any given animal of life?
00:28:01.000Well, insofar as that life has any intrinsic value, insofar as being that animal is better than being nothing, right?
00:28:11.000Then you're also just canceling all of that good stuff.
00:28:14.000And for that, for any good stuff, you need...
00:29:01.000There's weird stuff that plants do, and I remember the details of that article aren't so clear to me.
00:29:07.000I remember not knowing what to think about some of it, but some of it clearly can be explained in evolutionary terms that don't imply any experience.
00:29:33.000If this cup has no experience, then my trading places with it, insofar as you can make sense of that concept, is synonymous with just canceling my experience.
00:29:50.000There's nothing that it's like to be the cup.
00:29:52.000When I break it, I haven't created suffering.
00:29:55.000I haven't done anything unethical to the cup.
00:30:01.000I have no ethical responsibilities toward the cup.
00:30:03.000But the moment you give me something that can be made happy or be made miserable, depending on how I behave around it or toward it...
00:30:12.000Well, then I'm ethically entangled with it.
00:30:14.000And that begins to scale, I think, in a fairly linear way with just how complex the thing is.
00:30:24.000This is maybe something we even talked about on a previous podcast.
00:30:28.000If I'm driving home today and a bug hits my windshield...
00:30:33.000You know, that has whatever ethical implication it has, but given what I believe about bugs and given how small they are and given how little they do and given how primitive their nervous systems are, you know, I'm not going to lose sleep over it.
00:30:50.000If I hit a squirrel, I'm going to feel worse.
00:30:53.000If I hit a dog, I'm going to feel worse.
00:30:55.000If I hit someone's kid, obviously, I may never get over it, even if I live to be a thousand, right?
00:31:01.000So the scaling, and granted there are cultural accretions there, so you're like, can I justify the way I feel about a dog as opposed to a deer?
00:31:36.000It's a very simple experience in comparison to your average...
00:31:41.000person that lives in Los Angeles that reads books, you know?
00:31:44.000I mean, someone who goes on a lot of trips, someone who has a lot of loved ones, someone who has a great career, someone who's deeply invested in their work.
00:32:04.000And one of the things that concerns me the most about plants, not concerns me, but puzzles me the most about plants, is whether or not the way I look at them, me personally, my prejudices about them, is just not thinking of them at all as being conscious.
00:32:19.000What if we think about things in terms of the complexity of their experiences just because we're prejudiced about things that move?
00:32:28.000I mean, it's entirely possible that, like, it's going to sound really stupid, but...
00:33:03.000You take enough acid, you know what you're talking about.
00:33:06.000I'm not saying that we shouldn't eat plants.
00:33:08.000People are ready, up in arms with their Twitter fingers, ready to get off.
00:33:12.000But what I am saying is, it's entirely possible that all things that are alive have some sort of a way of being conscious.
00:33:21.000May not be mobile, may not be as expressive, But there might be the stillness of you without language when you're in a place of complete peace, when you're in a Zen meditative state.
00:33:34.000What about that stillness is really truly associated with being a human or really truly associated with being an English-speaking person in North America?
00:33:45.000And then everything else sort of branches out from that.
00:33:48.000And then humans, we all make the agreement that, of course, it branches out far further and wider than any other animal.
00:33:53.000But how do we know that these plants aren't branching out like that, too?
00:33:56.000How do we know that if they're having some communication with each other, if they're responding to predation, if they're literally changing their flavor, they're doing all these calculations and all these strange things that they're finding out that plants are capable of doing?
00:34:15.000Yeah, well, I'm agnostic on the question of how far down consciousness goes.
00:34:20.000And I agree that there's very likely a condition of something like pure consciousness that really is separable from the details of any given species.
00:34:31.000I mean, this is something that I've experienced myself.
00:34:33.000It feels like you can certainly have this experience.
00:34:37.000What its implications are, I don't know.
00:40:44.000But you can have the experience of just consciousness.
00:34:44.000And it doesn't have any personal or even human reference point.
00:34:49.000It doesn't even have a reference point in one of the human sense channels.
00:34:56.000So you're not seeing, you're not hearing, you're not smelling, and you're not thinking, and yet you are.
00:35:04.000So there is still just open conscious experience.
00:35:09.000And whether that is what it's like to be a plant, I don't know.
00:35:13.000Because I don't know what the relationship between consciousness and information processing in the brain actually is.
00:35:20.000Though it's totally plausible, in fact...
00:35:24.000I think it's probably the most plausible thesis that there is some direct connection between information processing and integrated information processing and consciousness and that there is nothing that it's like to be this cup and atoms are not conscious.
00:35:42.000But the thesis that consciousness goes all the way down...
00:36:28.000They can't be processing information in a way that would give them what we know as a rich experience.
00:36:34.000But your point about the time scale and movement is totally valid.
00:36:39.000If every time you walked into a room, your fern just turned and looked at you, just oriented toward you and followed you around the room with its leading branch, you would feel very different about the possibility that it's conscious,
00:37:14.000I sing to my flowers and they grow so beautiful.
00:37:17.000Yeah, it seems like one of those things that people say.
00:37:19.000I'm definitely not insinuating that plants would have as rich an experience as human beings, but I don't think a deer has as rich an experience as a human being either.
00:38:40.000It all exists inside the intelligence that's intertwined in nature.
00:38:46.000It's one of the things that's most puzzling about the most potent of all psychedelics, which is dimethyltryptamine, that it's in so many different plants.
00:38:56.000Making dimethyltryptamine-containing plants illegal would be hilarious, because there'd be, like, hundreds and hundreds of plants they'd have to make illegal, including, like, Phalaris grass, which is really rich in 5-methoxy-dimethyltryptamine, which is the most potent form of it.
00:39:17.000The weird little lizards that have retinas and lenses where their pineal gland is, they're making it in their little screwy little lizard brains.
00:39:25.000We don't even know what the hell it's for.
00:39:27.000The questions about it are just so much...
00:39:30.000There's so many more questions than there are answers.
00:39:59.000As a neuroscientist, doesn't that kind of freak you out that those Egyptians had those third eyes and all the Eastern mysticism had that pineal gland highlighted?
00:40:08.000It was like on the end of shafts or staffs, they would put those pine cones.
00:40:20.000I mean, it's more than a metaphor, you know, anatomically, but it's a...
00:40:26.000It correlates with the kind of experience you can have.
00:40:29.000I don't actually know if the experience people have that's, you know, this chakra, if you talk in yogic terms...
00:40:39.000I don't know if that has anything to do with pineal gland.
00:40:44.000I don't think anyone's done this neuroimaging experiment where you can get people who can reliably produce a third eye opening sort of experience and scan their brains while they do it.
00:40:55.000In fact, I'm almost positive that hasn't been done.
00:40:59.000But there is a phenomenology here of people having a kind of inner opening of...
00:41:06.000It's almost certainly largely a matter of visual cortex getting stimulated, but you can meditate in such a way as to produce this experience.
00:41:18.000It's an experience that you have more or less on different psychedelics.
00:41:23.000Some psychedelics are much more visual than others at certain doses, in particular, like mushrooms and DMT, which I've never taken, which you can say better than I, but it's reported to be quite visual.
00:41:40.000So most people, when they close their eyes, unless they're having hypnagogic images before sleep or they just happen to be super good visualizers of imagery, you close your eyes and you just basically have darkness there, right?
00:41:55.000Now, if you close your eyes, and if you're listening to this, and you close your eyes and you look into the darkness of your closed eyes...
00:42:02.000That is as much your visual field as it is when your eyes are open.
00:42:09.000It's not like your visual field hasn't gone away when you close your eyes.
00:42:14.000There's not much detail for you to notice, again, unless you are in some unusual state.
00:42:20.000But that, you know, based on different techniques of meditation, and this happens spontaneously, again, with hypnagogic images or with psychedelics, that space can open up into just a massive world of visual display,
00:42:46.000But most of us just take it for granted that when you close your eyes, you're functionally blind, you can't see anything, and we're not interested in that space.
00:42:55.000But you can actually train yourself to look deeply into that space as a technique of meditation.
00:43:01.000I don't want to interrupt you, but does that have implications for eyewitness testimony and eyewitness experiences that turned out to not be true at all?
00:43:10.000Because if you think about the human mind and the imagination being able to create imagery once the eyes are closed, like you can in sensory deprivation tanks.
00:43:22.000A lot of people's experiences are very visual, even though it's in complete darkness.
00:43:26.000Now, you know how people see things and they thought they saw something and it turns out to not be true at all, whether it's Bigfoot or whether it's a robbery or a suspect, and they get the details completely all wrong.
00:43:37.000But isn't it possible that under fear and when your pulse is jacked up and your adrenaline's running and you're worried about all these possibilities and your imagination starts formulating predetermined possibilities you should be looking out for?
00:43:57.000And then these people that swear they saw these things that everybody knows they didn't, like maybe there was video footage of it or whatever it was, Is it possible that your brain can do that to you and can literally show you things that aren't real?
00:44:12.000Oh yeah, it can do that, although I think the unreliability of witness testimony, and it's shockingly unreliable, is more a matter of the corruption of memory and the way memories are recalled.
00:44:28.000They're especially vulnerable when you're recalling them.
00:44:32.000They can be revised in the act of recall.
00:44:35.000And it's very easy to tamper with people's memory, albeit inadvertently.
00:44:40.000I mean, you can do this on purpose, too, but people just do it with bad interrogation techniques.
00:45:02.000And so whenever you're given an account of an experience, even if it's an experience that happened half a second ago, now we're in the domain of memory.
00:45:14.000Now we're in the domain of just what you can report.
00:45:18.000It's not a matter of what you're consciously experiencing.
00:45:21.000Now, I know there was a case in India where, I believe, a woman was convicted of murder through the use of an fMRI, a functional magnetic resonance imaging machine.
00:45:34.000And through this fMRI, they determined in some strange way that she had functional knowledge of the crime scene.
00:45:44.000And the argument against that, I believe, was that she could have developed functional memory of the crime scene by being told you're being prosecuted for a crime.
00:45:58.000Or just being unlucky enough to be familiar.
00:46:06.000Normally when this gets done, and there are people who do it in the States, they don't use fMRI as their modality, but they do interrogate people's familiarity and they use EEG as a way of monitoring people.
00:46:24.000They'll show them, you know, if you are shown evidence from the crime scene that only the perpetrator could have seen, you know, hopefully it's really something that only the perpetrator could have seen.
00:46:38.000But if, you know, if they show you the picture and, you know, you see that, oh, yeah, you know, I have that IKEA end table and, you know, I have that dress from Banana Republic or whatever...
00:46:48.000Just by dint of bad luck, you're familiar with something that you're being shown from the crime scene.
00:46:52.000And especially if it's a murder, you're talking about someone who she was probably intimate with.
00:46:56.000She probably knew them at least, so she's probably been to their house.
00:47:00.000Yeah, that would be obviously a case where you really couldn't do it at all.
00:47:06.000Well, no one in the States, as far as I know, unless this has changed in the last year or so since I've paid attention to this, none of this is admissible in court.
00:47:52.000Again, it's been a long time since I've looked at this particular research, and I don't know how...
00:47:56.000I don't know what they're calling these waveforms now.
00:48:00.000I mean, there was a P300 waveform at one point, and there are waveforms that come from certain areas of the brain at certain timing intervals based on...
00:48:30.000There's no question that at a certain point we will have reliable mind-reading machines.
00:48:35.000I think it's really just a matter of time.
00:48:38.000I think there's also no question that we don't have them now, at least not in a way that we can send someone to prison on the basis of what their brain did in an experiment.
00:48:53.000A lot of the most interesting stuff is unconscious, but anything you're consciously aware of having seen before, right?
00:49:01.000So if you were to show me this cup, right, and then five seconds later say, is this the cup I showed you?
00:49:09.000You know, I have a very clear sense of, yeah, that's the cup, right?
00:49:15.000And if you show me a completely different cup, I'm going to have a very clear internal sense of, no, no, that's not the cup, right?
00:49:21.000If you're having that experience, that is absolutely something about the state of your brain that can be discriminated by a person outside your brain running the appropriate experiment.
00:49:37.000It's just our tools are still sufficiently coarse that it's not like...
00:50:06.000So think of how much faith you would have in this technology if you could open your computer and read any file, a file of your choosing that I have never seen, right?
00:50:17.000So the contents of which I'm completely blind to.
00:50:23.000And I'm scanning your brain while you're reading this journal entry or a newspaper article or whatever.
00:50:28.000And at the end of that, I can say, well, based on this report, you clearly read a story about Donald Trump.
00:50:38.000And you actually don't like Donald Trump.
00:50:40.000And I could tell you in detail about what you were consciously thinking about.
00:50:47.000If you could do that 100% of the time...
00:50:52.000At a certain point, the basis to doubt the validity of the mind-reading machine would just go away.
00:50:57.000It would be like, are you really hearing my voice right now?
00:51:20.000Yeah, the ownership, like losing a thought, becoming non-autonomous, like the idea of everybody sharing thoughts, it seems almost inevitable.
00:51:29.000Well, this, you know, I don't know that we would decide to do this. Certainly, I don't think we would do this all the time, right?
00:51:36.000I mean, it might be fun to do it sometime.
00:54:44.000This doesn't get publicized very much, but it's a very common experience for police officers or police departments to hear from people in the community who are confessing to a crime they didn't commit.
00:54:56.000They just come in and they say, I did it, and they give all this bogus account of what happened.
00:55:01.000And this is a sign of mental illness, or these are people seeking attention in some morbid way.
00:55:09.000But there are people clearly who are so suggestible that they can either lead themselves to believe or be led by others to believe that they've done things they didn't do, in just shocking detail.
00:55:26.000There was another New Yorker article, I think it was written by William Langewiesche, this was years ago, but on the satanic panic case where a guy got accused of running a satanic cult by,
00:55:46.000I think, his daughter who was in hypnosis recovery therapy, right?
00:55:52.000So she had been led down the primrose path by her therapist, and so she obviously was fairly suggestible.
00:56:05.000She described the most lurid, just insane, Rosemary's Baby-style cult imaginable going on in her town, where the friends of Dad were coming over and raping everyone, and there was a human sacrifice of infants, and the infants were buried by the barn.
00:57:14.000And so they went in and they just made up stuff.
00:57:17.000Like, oh, you know, there's a few more details we want to iron out.
00:57:20.000Your daughter said that there was a time where, you know, you brought in a horse and then you were riding on the horse and then you killed the horse.
00:57:29.000I may be making these details up because I don't remember exactly, but it was something that they just concocted, right?
00:57:50.000I don't know, again, I don't know if he ever wrote any follow-up on this, because as I recall, and this is like a 15-year-old story, they ended with the Twilight Zone moment, where now you realize this guy is innocent and just saying yes to everything.
00:58:06.000And his daughter's crazy, because she shares his genes, probably.
00:58:10.000I don't recall what the daughter did with that, but...
00:58:15.000I mean, the story, and perhaps there's more to the story, but the story on its face was totally exculpatory.
00:58:21.000The reader experience was, you've got to let this guy out of prison tomorrow.
00:59:07.000I mean, how many people are like that, that are just sort of kind of functional?
00:59:10.000My question is, if we do get to a point where you could read minds, what if you go into their minds and you find out, well, this is what they really think.
00:59:21.000This is a person who's seeing things that aren't there.
00:59:24.000Like, a person who's completely delusional, like people that have these hallucinatory visions.
00:59:33.000Some people have really deeply troubling visual images that they see.
00:59:39.000Imagine if these poor fucking people really are seeing that.
00:59:42.000And if you could read their mind, you would literally be inside the mind of a person whose mind isn't functioning.
00:59:49.000And we can get sort of an understanding about what that would be like.
00:59:52.000Yeah, well, I mean, this has been done in a very simple way, where with schizophrenics, who mostly have auditory hallucinations, you can now detect auditory cortex...
01:00:03.000So mishaps, misinterpretations you can detect?
01:00:06.000We just know that their auditory cortices are active in the same way that when you're hearing my voice, it's going to be active.
01:01:14.000I'm talking to you, not talking to the millions.
01:01:16.000But it's, you know, where the temporal lobe and the parietal lobes intersect, and I think it was first discovered in surgery on an epileptic, or in any kind of resection of the brain where people are awake, because there are no pain receptors in the brain,
01:01:38.000so you can stay awake while you're getting brain surgery.
01:01:41.000And they tend to keep you awake if they're going to be removing areas of the brain, let's say a tumor or the focus of an epileptic seizure, and they don't want to remove...
01:01:54.000working parts, especially, you know, language parts.
01:01:57.000So they're keeping people awake and they're probing those areas of the cortex to see what it's correlated with in the person's experience.
01:02:09.000So they're having them talk, they're having them answer questions, and they're putting a little bit of current in that area, which would be disruptive of normal function.
01:02:24.000They're almost entirely mapping language cortex when they do this.
01:02:28.000But there have been experiences where a neurosurgeon will put a little current in an area near this region of the brain, and people will have this out-of-body experience where they're up in the corner of the room looking down on their bodies or...
01:02:48.000The classic astral projection experience or the near-death experience where people have risen out of their body or seem to have risen out of their body.
01:03:00.000And consciousness now seems to be located elsewhere.
01:03:16.000Virtually every region of the cortex does many, many things.
01:03:19.000There's no one region of the brain that does one thing.
01:03:22.000There are a couple of exceptions to this.
01:03:25.000So the whole brain is participating in much of what we do, and it's just greater or lesser degrees of activity.
01:03:33.000But in terms of mapping your body in space, the parietal lobe has got a lot to do with that.
01:03:41.000And when that gets disturbed, you can have weird experiences.
01:03:47.000You can have the experience of not recognizing your body or parts of your body, like alien hand syndrome, where this left arm seems like another person's arm, and people try to disown half their body.
01:04:06.000And you can trick people with visual changes of display.
01:04:15.000You can wear headgear where you can make me feel like...
01:04:19.000It's called the body-swapping illusion.
01:04:23.000My consciousness is located in your body looking back at me.
01:04:27.000There's a clever experiment that they did where there's the ultimate...
01:04:34.000extension of what has long been called the rubber hand illusion. Say my two hands are on the table now. You can set up an experiment where you hide one of my hands, put a rubber hand in its place, and touch the rubber hand with a brush. So I'm
01:05:05.000seeing the rubber hand get touched with a brush, and I can feel like my hand is being touched.
01:05:14.000It's like if my hand is elsewhere under the table being touched with a brush at the same time, I can feel like my hand is now the rubber hand.
01:05:23.000So I can feel like my hand is in place of the rubber hand, based on visual and tactile input, you know, the simultaneity of my seeing the rubber hand get touched with a brush and my feeling my hand, which is now under the table, being touched with a brush.
01:05:38.000I'm not explaining that setup great, but people can look it up.
01:05:42.000But you can do the same thing to the ultimate degree with this video goggle display where I'm getting input, visual input, from where you're standing.
01:05:54.000So like if you come up to shake my hand, I'm seeing you come up to me and shake my hand, but I'm seeing it from your point of view.
01:06:05.000I now feel like I'm walking up to me, shaking my hand.
01:06:09.000And you can just kind of feel like your consciousness is over there, outside your body.
01:06:18.000And it's just to say that our sense of self, our sense of being located where we are in our heads is largely, and in some cases almost entirely, a matter of vision.
01:06:34.000The fact that you feel you're over there is because that's where your eyes are.
01:07:25.000But it had this one episode that dealt with people learning certain skills while the outside of their brain was being stimulated with a little electrode.
01:07:35.000And this woman who was one of the reporters went to a sniper training thing where they set up the scenario and they give you like a fake gun.
01:07:43.000You point at the screen and you try to hit the targets as all these things are happening.
01:08:09.000She gets 20 out of 20. So she goes from being a complete failure to being awesome at it in some weird flow state that she described.
01:08:17.000And they're talking about all the US government's using it.
01:08:19.000They're trying to train soldiers and snipers and other people to understand this mindset and try to achieve it.
01:08:27.000And there's certain companies that are experimenting with it at least by stimulating the outside of your head.
01:08:33.000So, I did not know this particular story.
01:08:37.000I remember hearing that title, though.
01:08:40.000So transcranial magnetic stimulation is magnetic energy, which is the flip side of electrical energy.
01:08:48.000So if you apply a big magnet to the side of your head, you are changing the electrical properties of your cortex.
01:09:40.000Because what you're doing, I mean, you are disrupting neural firing, but you can disrupt areas that are inhibiting other areas, so it's not always synonymous with the degradation of performance.
01:10:00.000You could increase performance on a certain task by taking one region of the brain offline or more or less offline.
01:10:11.000But I'm not, you know, I'm not aware of how far they've taken it in terms of doing anything that seems useful in terms of, you know, performing something.
01:10:18.000I mean, the research I'm more aware of is just using this to figure out what various regions of the brain are doing.
01:10:26.000I mean, kind of mapping function, because you want to see, if I disrupt an area here, how does that show up in an experiment?
01:10:34.000And that gives you some clue as to what that region is doing, at least in that task.
01:10:39.000As much as we know about the mind and being able to do things like this, like overall, if you had to really try to map out the exact functions of the mind and how everything works, how far do you think we are along to understanding that?
01:11:04.000We know a lot about where language is and where facial recognition is.
01:11:11.000Your visual cortex has been really well mapped, and we know a lot.
01:11:17.000And for the last 150 years, based on just neurological injury, and then in the last decades, based on imaging technology, we know regions of the brain that...
01:11:33.000absolutely govern language and regions of the brain that have basically nothing to do with language, you know, to take one example.
01:11:40.000And we know a lot about memory, and we know a lot about the different kinds of memory.
01:11:45.000But there's, you know, I think there's much more we don't know.
01:12:10.000Is another part of the process where there are no guarantees.
01:12:15.000The way we can intervene in the functioning of a brain is incredibly crude, pharmacologically or with surgery or with a device like that.
01:12:26.000So to get from a place of really refined knowledge to a place of being able to do something we want to do with that knowledge, that's another step.
01:12:41.000There's no reason to think that we're not going to take it at some point, but it's an additional complexity to get inside the head safely and help people or improve function, even if you know a lot about what those areas of the brain do.
01:13:36.000So there are two paths, or at least two distinct paths in...
01:13:40.000artificial intelligence, and one path could try to emulate what the brain is doing, and that obviously requires a real detailed understanding of what the brain is doing.
01:13:51.000Another path would be to just ignore the brain, right?
01:13:55.000So there's no reason why artificially intelligent machines, even machines that are superhuman in their capacities, need to do anything that is similar to what we do with our brains,
01:14:11.000you know, with neurochemical circuits.
01:14:13.000So because they're going to be organized differently and, you know, could be organized quite differently and obviously made of totally different stuff.
01:14:24.000So, whether you want to go down the path of emulating the brain on the basis of a detailed understanding of it, or you just want to go down the path of maximizing intelligent behavior in machines, or some combination of the two,
01:14:41.000they're distinct, and one doesn't entail really knowing much about the brain, necessarily.
01:14:51.000So there's really two different ways they can go about it.
01:14:53.000Either they could try to reproduce a brain.
01:15:43.000It may be a leap we take in this stepwise way where we build machines down a path that is not at all analogous to recreating brains, which allow us to then understand the brain, you know,
01:15:59.000totally, in the Ray Kurzweil sense, where we can, you know, upload ourselves, if that makes any sense.
01:16:17.000Information processing is at bottom what intelligence is.
01:16:22.000I think that is not really up for dispute at this point.
01:16:27.000That any intelligent system is processing information, and our brains are doing that.
01:16:33.000And any machine that is going to exhibit the kind of general intelligence that we exhibit and surpass us will be doing, by dint of its...
01:16:46.000hardware and software, something deeply analogous to what our brains are doing.
01:16:51.000But again, we may not get there based on directly emulating what our brains are doing.
01:16:59.000And we may get there before we actually understand our brains in a way that would allow us to emulate it.
01:17:08.000Very interesting to me how it seems to be there's always pushes and pulls in life.
01:17:15.000And when you have things that are as horrific as factory farming and people are exposed to it, then there's this rebound and where people are trying to find a solution.
01:17:26.000And I always wonder, like, will that be the first artificial life that we create, like zombie cows?
01:17:33.000Like, maybe if we figured out that meat in the lab is not good because it has to actually be moving around for it to be good for you.
01:17:39.000Maybe they'll come up with some idea to just...
01:17:55.000And then go from that to making artificial people.
01:17:59.000Because it seems to me that artificial people, it's gonna happen.
01:18:03.000I mean, it's just a matter of how much time.
01:18:04.000If they're making bladders, and then they're gonna start making all sorts of different tissues with stem cells to try to replace body parts and organs, and they're gonna work their way through an actual human body.
01:18:26.000I mean, it might take a thousand years, but I think if we stay alive, if human beings, rather, if human beings continue to evolve technologically...
01:18:35.000Within the next thousand years, we're going to have artificial people that are completely...
01:18:38.000So you mean they're people, so we're going to build a biological person.
01:19:36.000There are many issues there, but when you're talking about changing the genome, and especially when you're talking about changing the germline, then it gets passed on to future generations.
01:20:46.000And I think we have decided to bypass that vision and just go straight to the vat and just build it up cell by cell and build up only what we need, which is the meatball or the steak.
01:20:58.000So why have the fur and the organs that you don't want and the mess?
01:21:04.000The kind of the energy intensive aspects of producing a whole animal.
01:21:07.000And I think with like spare parts for humans, rather than create a clone of yourself that has no brain that you just keep in a vat somewhere in your garage where you can get spare kidneys when you need them.
01:21:22.000We would just be able to, you know, print the kidneys.
01:21:26.000Because that gets around a lot of the weirdness, right?
01:21:30.000It'd be weird to have a copy of yourself that's just, you know, just spare parts.
01:21:35.000Whereas it wouldn't be weird, or at least in my view, it wouldn't be weird.
01:21:38.000It would be fantastic to be able to go into a hospital when your kidneys are failing and they just take a cell and print you a new kidney.
01:22:08.000Since I got into Dan Carlin's Hardcore History, it really fucked my mind up about how I think about the past, in this way where I look at a thousand years ago in comparison to today.
01:22:21.000And I try to think, well, how much different will people be a thousand years from now?
01:22:28.000Yeah, you know, I mean, the fascination that we have with ancient history is that, one of the things obviously is we want to know where we came from, but also we can kind of see people today doing similar shit if they were allowed to. Like, if everything went horribly wrong, people at their base level are kind of similar today as they were a thousand years ago.
01:22:48.000Yeah, well, one of them might be running for president, we can talk about that. And when I think about the future a thousand years from now, with the way technology is accelerating and just the capacity that we have and ability to change things,
01:23:06.000to change the world, to change physical structures, to change bodies.
01:23:10.000To dig into the ground and extract resources.
01:23:14.000We're getting better and better at changing things and manipulating things, extracting power from the sun and extracting salt from the water.
01:23:23.000There's all this bizarre change technology that's consistently and constantly going on with people.
01:23:30.000When I think about a thousand years from now and artificial people and this concept of being able to read each other's minds and being able to map out imagery and pass it back and forth from mind to mind in a clear spreadsheet form.
01:23:50.000It's going to be incredibly strange to be a person.
01:23:53.000Yeah, whether we will be people in a thousand years, I think you would...
01:24:00.000Unless we have done something terrible and knocked ourselves back a thousand years, I think we will decide to change ourselves in that time in ways that will make us...
01:25:27.000He gave at least one TED Talk, and he's written two very good books.
01:25:32.000The first came out about 10 years ago, The Fabric of Reality, and the more recent one is The Beginning of Infinity.
01:25:42.000Extremely smart guy and very nice guy.
01:25:45.000He has this thesis, and he and I don't totally agree about the implications going forward for AI, but he's convinced me of his basic thesis, which is fascinating: the role that knowledge plays in our universe,
01:26:09.000And his argument is that in any corner of the universe, anything that is compatible with the laws of physics can be done with the requisite knowledge.
01:26:22.000So he has this argument about how deep knowledge goes and therefore how valuable it is in the end.
01:26:29.000So I'm cueing off your notion of building an artificial person, literally cell by cell or atom by atom.
01:26:41.000There's every reason to believe that's compatible with the laws of physics.
01:26:45.000So we got built by the happenstance of biology.
01:26:50.000If we had what he calls a universal constructor, you know, the smallest machine that could assemble any other machine atom by atom, we could build anything atom by atom, right?
01:27:08.000You could literally go into an area of deep space that is as close to a vacuum as possible and begin sweeping up stray hydrogen atoms and fusing them together to generate heavier elements.
01:27:27.000So you could start with nothing but hydrogen, right?
01:27:30.000And with the requisite knowledge, build your own little fusion reactor, create heavier elements, and based on those elements, create the smallest machine that can then assemble anything else atom by atom,
01:28:03.000And so the limiting factor in that case is always the knowledge, right?
01:28:10.000So the limiting factor is either the laws of physics, either this can't be done because it's physically impossible, or the knowledge is what you're lacking.
01:28:19.000And given that human beings are physically possible, there should be some knowledge path whereby you could assemble one atom by atom.
01:28:28.000There's no deep physical reason why that wouldn't be the case.
01:28:34.000The reason is we don't know how to do it.
01:28:37.000But presumably it would be possible for us to acquire that knowledge.
01:28:45.000And so the horizon of knowledge just extends functionally without limit.
01:28:53.000We're nowhere near the place where we know everything that's knowable.
01:29:12.000I mean, with the frontiers of knowledge explored, you know, 10,000 years beyond where we are now, we would be unrecognizable to ourselves.
01:29:23.000Everything would be equivalent to magic, you know, if we could see it now.
01:29:28.000And most of human history is not like that.
01:29:30.000And most of human history, if you dropped into any period of human history...
01:29:34.000It was, for all intents and purposes, identical to the way it was 500 years before and 500 years before that.
01:29:42.000It's only very recently where you would drop in and be surprised by the technology and by the culture and by what is being done with language and the consequences of language.
01:31:33.000This is the promise of nanotechnology, where you have tiny machines that can both build more of themselves and more of anything else that would be made of tiny machines, or assemble anything atom by atom,
01:31:51.000or treat your own body like the machine that it is and deal with it atom by atom.
01:31:56.000I mean, the possibilities of intervention in the human body are then virtually limitless.
01:32:03.000Yeah, I mean, that's where the physical world begins to look just totally fungible.
01:32:15.000You know, when you're not talking about surgery, where you're cutting into someone's head and hoping, you know, in very coarse ways, hoping you're not taking out areas of brain that they need, but you're talking about actually repairing...
01:34:01.000See, I don't know how that was done, but I had heard that that was all just pure CGI, right?
01:34:06.000Yeah, when there was a dude in a costume that acted it out with him, but essentially it was just all CGI. Right.
01:34:11.000Yeah, so the fact that that's beginning to look good, obviously that's just all surface.
01:34:19.000That has no implication for building a rendering of a bear on film.
01:34:25.000It's not the same thing as building a bear, but the fact that we can move so far into modeling that kind of complexity visually, just imagine what a super-intelligent mind could do with a thousand years to work at it.
01:34:48.000And we're on the cusp of, and when I say cusp, I don't mean five years, but let's say a century.
01:34:55.000We're on the cusp of producing the kind of technology that would allow for that.
01:34:59.000And if we put it into perspective, photography, I don't believe, was even invented until the early 1800s, right?
01:36:13.000They're awesome to just get into for fun, but as far as visual effects, what they can do now, and the idea that it's all been done over 200 years is just spectacular.
01:36:24.000Not just capturing the image, but then recreating an artificial version and projecting it, which is a thousand times more difficult.
01:36:32.000But there's another feature here of the compounding power of knowledge and technology, where there's certain gains that are truly incremental, where everything is hard won, everything is just 1% better than its predecessor.
01:36:51.000But then there are other gains where you have created an ability that seems like a quantum leap beyond where you were and where you go from just fundamentally not being able to do anything in that domain and then all of a sudden the domain opens up totally.
01:37:30.000And then, at a certain point, flight is possible and opens this whole domain of innovation.
01:37:36.000But the difference between not being able to fly...
01:37:39.000There's no progress you can make on the ground that's going to get you closer,
01:37:44.000because it doesn't really avail itself of the principles of flight, as we now know them.
01:37:52.000You can't jump a little bit higher, and so it doesn't matter what you do with your shoes.
01:38:00.000There are kind of fundamental gains that open up. You know, DNA sequencing is a more recent example, where, understanding and having access to the genome, you go from a world where the only way to influence your descendants is to, you know,
01:38:23.000basically make a good choice in wife, right, or husband, to one where you can just create a new species in a test tube if you wanted to, right?
01:38:34.000And that's a kind of compounding power of understanding the way things work.
01:38:43.000I think we're at the beginning of a process that could look very, very strange very, very quickly.
01:38:51.000I think, obviously, both in good and bad ways, but I don't think there's any brake to pull on this train.
01:39:02.000Knowledge and intelligence are the most valuable things we have, right?
01:39:08.000So we're going to grab more insofar as we possibly can, as quickly as we can.
01:39:15.000And the moments of us deciding not to know things and not to learn how to do things, I mean, those are so few and far between as to be almost impossible to reference, right?
01:39:25.000I mean, there are moments where people try to pull the brakes, and they hold a conference and they say, you know, should we be doing any of this?
01:39:33.000But then, you know, China does it or threatens to do it.
01:39:36.000And we wind up finding some way to do it that we consider ethical.
01:39:45.000So there are things like, you know, germline tinkering that we, as far as I know, don't do and have decided for good reason we're not doing.
01:40:59.000Like, no one's shown any ability to create this stuff that's more fucked up than what we already have.
01:41:04.000But weaponized anthrax and things along those lines, like...
01:41:08.000These Russian guys we talked to, they were talking about how they had vats of this stuff.
01:41:14.000They had all kinds of crazy diseases that they had created just in case we had gotten into some insane, mutually assured destruction, you know, disease-spreading thing.
01:41:33.000What their concern is, the Centers for Disease Control guys, they were concerned with things like Ebola, things morphing, things becoming airborne, natural things, new strains of the flu that become impossible to treat, MRSA. MRSA is a terrifying one.
01:41:47.000MRSA is one that has a lot of people scared, a lot of doctors scared.
01:41:50.000It's a methicillin-resistant staph infection that kills people.
01:41:55.000I mean, it can absolutely kill people if you don't jump on it quick and take the most potent antibiotics we have, and even then it takes a long time.
01:42:04.000Well, it's a—I actually just tweeted this recently.
01:42:08.000I think I said, would some billionaire, would some 0.1 percenter develop some new antibiotics?
01:42:15.000Because clearly the government and the market can't figure out how to do it.
01:42:19.000And it really is falling through the cracks in the government market paradigm.
01:42:25.000It's like either the government will do it or the market will do it, but neither are doing it.
01:42:33.000There's not much of a rationale for developing antibiotics because it's so costly and, you know, with any luck, you take them once every 10 years for 10 days and that's it.
01:42:44.000I mean, that's not like Viagra or any antidepressant or any drug that you're going to take regularly for the rest of your life.
01:42:55.000So there's no real market incentive to do it, or at least not enough of one, to spend a billion dollars developing an antibiotic.
01:43:03.000And the government apparently is not doing it.
01:43:51.000And I think it's one of those things where, I guess, everyone has it on their body, and when you get an infection, then it spreads and grows, and apparently it can be a recurring thing.
01:44:02.000So people who get it, particularly MRSA, apparently they can get it again, and it can get pretty bad.
01:44:09.000There was that one fighter who just died, I don't think related to that, but he had these...
01:45:55.000Maybe it just jumped on him really quick.
01:45:57.000My dad's girlfriend just got it on her face, and she was in the hospital for two weeks, and they were afraid it was going to spread to her brain, and it almost did.
01:46:04.000And she's not 100% out of the woods yet, but she's back home now.
01:46:08.000She just got a little scratch on her face, and it spread into her cheek, and then from her cheek she just got a little red swelling, and then she couldn't see, and then she had to go in the hospital.
01:46:23.000Yeah, this is one area that worries me.
01:46:28.000There are the bad things we do, and obviously there's a lot to be worried about there.
01:46:35.000The stupid wars and the things that it's just obvious that if we could stop creating needless pain for ourselves or needless conflict, that would lead to a much nicer life.
01:46:50.000But then there are the good things we neglect to do.
01:48:18.000It's not a research issue, it's just a financial issue?
01:48:21.000Well, I'm sure the research has to be done because, you know, if it was totally obvious how to build the next generation of antibiotics that would not be vulnerable to having their efficacy canceled in three years by...
01:48:38.000just the natural selection among the microbes.
01:48:44.000Someone would do it very, very cheaply.
01:48:46.000So I'll admit that it's probably not easy to do, but it's got to be doable, and it's super important to do.
01:48:56.000I mean, when you look at just what cesspools hospitals have become: something like 200,000 people a year die in the U.S. from essentially getting killed by the machinery of the hospital.
01:49:14.000They're getting killed by their doctors and nurses.
01:49:17.000Some of this is drug overdoses or incompetence in dosing or giving someone the wrong medication or whatever.
01:49:53.000It's like where you're trying to fix people, and around the house are a bunch of demons that are trying to kill the people you're trying to fix.
01:50:00.000Look, obviously it's not, but if you were a person who was inclined to believe things back in the day before they figured out microscopes, I mean, what else is that other than a demon?
01:50:11.000You've got a hospital that's filled with superbugs.
01:51:10.000It's so insane to think that that is a gigantic issue, that we have these bugs that try to get into your body and kill you.
01:51:18.000So there's one area, I don't know if you ever...
01:51:21.000I had the bad luck to be associated with a NICU, a neonatal ICU. But our first daughter, who's totally fine, was born five weeks early and had to be in the NICU for a week.
01:51:33.000And there are people who are in the NICU for months.
01:51:36.000There are babies born at 23 weeks or so.
01:52:35.000The fact that we can't even do that perfectly is pretty impressive.
01:52:39.000Now, is it a fact that MRSA was created by medications, or is that a belief, or has that been proven, that it was created by a resistance to medications that got stronger?
01:52:50.000Yeah, well, no, yes, it's a fact that...
01:52:54.000All of these bugs are evolving, and just by dint of happenstance, they are producing changes in their genome that leave them no longer vulnerable to antibiotic X,
01:53:52.000It's not like bacteria want to become drug resistant, but some percentage of them, in any generation, will tend to become drug resistant.
01:54:03.000And then in the presence of the drug, they will be selected for.
01:54:08.000If you keep bombarding people with penicillin, you will be selecting for the bacteria that isn't sensitive to penicillin in those people.
01:54:22.000So yeah, the overuse of antibiotics and the overuse of antibiotics in our food chain is also part of this picture, right?
01:56:34.000Isn't airborne and is difficult to contract, right?
01:56:37.000Well, then it's a fairly well-behaved, you know, it could be scary, but it's not going to become a global pandemic.
01:56:43.000But then suddenly you get a mutation on that virus or that bacteria that allows it to be, you know, aspirated, become airborne in a cough and inhaled, and, well, then you have the possibility of a pandemic.
01:57:02.000And also the time course of an illness is relevant.
01:57:07.000So if you have something which kills you very quickly and horribly, well, then that's the kind of thing that is going to be harder to spread because people become suddenly so sick.
01:58:53.000And there are theses that there are various infectious diseases that change human behavior.
01:58:58.000That depression is the result of an infectious illness that we're not aware of.
01:59:10.000Yeah, I mean, so there's a lot that could be going wrong with us that we haven't attributed to viruses and bacteria, but which in fact is, at bottom, a matter of viruses and bacteria.
01:59:21.000Actually, Alzheimer's, there was recently a report that suggested that Alzheimer's is the result of a brain's immune response to infection,
01:59:45.000The plaques associated with Alzheimer's that you see throughout the brain might, in fact, be the remnants of an immune response to something having invaded across the blood-brain barrier.
02:00:04.000So if Alzheimer's is the result of infectious disease, score that as a major problem that would be nice to solve with the right antibiotic regime.
02:00:24.000Because you remember when someone publicly starts to go like that, and it's a guy like Ronald Reagan who is an actor and president, and you see him starting to lose his grip on his memory, and you hear all the reports about it.
02:00:40.000It's particularly disturbing because it's exhibited in public.
02:00:44.000I mean, that's the head guy, you know, to think that that was just a disease.
02:00:51.000I don't remember when that became at all obvious.
02:00:55.000I remember people were trying to do a kind of retrospective analysis of it, but I don't remember when anyone started to talk about the possibility that he was not all there, and I don't remember it happening actually during his presidency,
02:02:52.000I forget how they busted him, but he had it nailed.
02:02:55.000He'd walk around in a bathrobe and talk to himself, and he would put on an act.
02:02:58.000Like, go out on the street and act like a crazy person, and then he would go on walks with, like, these capos and tell them, oh, kill this fucking guy and get me a million bucks and all that kind of crazy shit.
02:04:15.000Have you ever seen the Sapolsky stuff on the Toxoplasma?
02:04:19.000Robert Sapolsky, the guy from Stanford?
02:04:22.000He's one of the foremost researchers on it and one of the guys who's really vocal about it.
02:04:29.000They were also talking about a direct relationship between motorcycle crashes and people testing positive for toxoplasmosis.
02:04:40.000And they felt that it might have either hindered reaction time or loosened inhibitions, the same way it sort of triggers these mice to go near cats.
02:04:51.000Yeah, that's what I was referencing before.
02:04:54.000So, I think it's a lot of speculation, but there's a strong correlation, apparently, to motorcycle crashes.
02:05:03.000I guess one of his professors had told him that when he was younger, and he remembered it while they were dealing with some guy who came into the ER as the victim of a motorcycle crash.
02:05:16.000Yeah, well, you know, the underlying biology of, you know, risk avoidance and not risk seeking is, I mean, that's fairly well conserved in mammals.
02:05:30.000It's not like mice have a totally analogous brain, but there's a reason why we do most of our research in things like mice.
02:05:39.000Mice are similar enough to us that doing research on dopamine receptors in mice allows us to extrapolate to humans.
02:06:01.000I remember I was supposed to bring this up to you before, when you were talking about plants and plants having some sort of consciousness.
02:06:10.000Was it Steven Pinker, see if you could find this, who gave a speech where he talked about how some plants, you can actually use sedatives on them And that some of them actually produced certain neurochemicals, like dopamine, if that makes any sense.
02:08:04.000When you guys were talking about AI, I pulled up something on Minority Report and it pulled me to this article, which Microsoft has an app that can...
02:08:14.000It's called Predictive Crime Analytics.
02:08:19.000They can predict crimes with up to 91% accuracy.
02:08:23.000It's also already being enacted in Maryland and Pennsylvania as of 2013. They have crime prediction software that can find out if an inmate that's going to be released is going to commit another crime.
02:08:38.000And so they're using that to follow them.
02:08:41.000And there are some civil rights people that are saying, like, you can't do that, obviously.
02:08:48.000Professor Burke says his algorithm could be used to help set bail amounts and also decide sentences in the future.
02:08:56.000And then I got down to this part in Chicago, they're doing something, and they have, it's called a heat list in Chicago.
02:09:02.000They have 400 residents that are listed as potential victims and subjects with the greatest propensity of violence, and they go and knock on their door and tell them that they're being watched.
02:09:11.000And I, like, I've clicked on this thing, and it's an actual, like, Chicago directive from the police.org.
02:09:17.000It's a pilot program about going and telling people that they're being watched, or that someone might be after you, or some shit like that.
02:09:27.000I didn't want to interrupt you guys to tell you about this, but.
02:09:29.000Custom notification under the Violence Reduction Initiative in partnership with the John Jay College of Criminal Justice Community Team, who will serve as outreach partners within the social service and community partners.
02:10:23.000Yeah, that doesn't surprise me at all.
02:10:24.000I think all of that's coming.
02:10:29.000I mean, just look at just consumer behavior.
02:10:31.000I mean, just look at how much someone can understand about you based on your zip code and your last three Netflix movies you watched to the end.
02:10:42.000And just a few other data points, right?
02:10:44.000And then we basically know... we can predict, you know, with horrendous accuracy, what you're going to like given the menu of options; we can advertise to you with immense precision.
02:10:59.000Facebook, obviously, is at the forefront of this, but when you add everything else that's coming, the more intrusive technology of the sort we've been talking about, it's...
02:14:00.000People are going to have a hard time with you actually getting into their mind, seeing their actual mind, and being able to do that so we can know without a doubt whether or not someone's guilty or innocent.
02:14:10.000But my question to you is, if you could get inside someone's mind, and it was like that really super...
02:14:18.000suggestible guy that you were talking about earlier who confessed to all the horrific demonic possession stuff and eating babies.
02:14:25.000What if it's like getting to that guy's mind?
02:15:49.000There is no accountability to his own states of consciousness that he's going to be held to.
02:15:56.000And the people who love him don't seem to care.
02:16:00.000As far as I can tell, I don't know many of these people personally, but based on social media and seeing the few articles where someone has explained why they love Trump, people view this as a kind of...
02:16:15.000this, what is, in my view, both dishonesty and a kind of theatrical hucksterism, a person who's pretending to be many things that he probably isn't.
02:16:55.000I'm someone who, actually, I remember on my own podcast, I think I was talking to Paul Bloom, this Yale psychologist who's great, and we got into politics at least a year ago, but at that point I said, there's no way we're going to be talking about Trump in a year.
02:17:11.000This is going to completely flame out.
02:17:13.000I don't tend to make predictions, but this was a clear moment that I remember of making a prediction, which is now obviously false.
02:17:21.000But I just couldn't imagine that people were going to find this compelling enough for him to be on the cusp of getting elected.
02:17:40.000Everybody feels like you're supposed to be with their person, whether it's Bernie, or Hillary, or Trump, whatever it is.
02:17:57.000Like, Hillary Clinton, you could want a woman in the White House, and you want to show everyone that a woman can do that job just as well as a man, and she's got the most experience, and she certainly has the most experience dealing with foreign governments, and she certainly has the most experience in politics.
02:18:11.000But she's also involved in two criminal investigations.
02:19:05.000When you get her in front of a mic, and there's a crowd, and she thinks she has to talk over the crowd, which she doesn't have to do because she's in front of a mic, the sound you get is just...
02:19:15.000She's yelling when she doesn't need to yell.
02:19:18.000Someone has to teach her how to dial that back.
02:19:21.000What you just did is called mansplaining.
02:19:25.000I'm explaining to the men in her crew who should...
02:19:28.000talk some sense into her, but she's a bad candidate, right?
02:19:33.000I have no doubt that she's very smart, and she's well-informed, and she's qualified, and she is absolutely who I will vote for, given the choices.
02:20:01.000And this is all true, and yet I also believe the people who say, I've never met her, but people who know her and have met her say that behind closed doors, one-on-one, she's incredibly impressive and great.
02:20:13.000But that doesn't translate into her candidacy.
02:20:15.000She probably thinks she has to do it old school.
02:20:26.000I went out on Facebook the other day, and I've said very little about this, but I've made enough noises of the sort that I just made that people understand that I'm for Clinton, despite all my reservations about her.
02:20:42.000What I got on my own Facebook page, which you have to assume is filtered by the people who are following me on Facebook and already like me in some sense, just like a thousand comments of pure pain.
02:20:55.000No one said, oh, thank God someone smart is for Hillary.
02:20:59.000It was all just Bernie people and Trump people flaming me for the most tepid possible endorsement of Clinton.
02:21:08.000All I said was, Listen, I understand Clinton's a liar, and she's an opportunist, and I completely get your reservations about her, but at least she's a grown-up, right?
02:23:39.000Trump, I guess, is a really powerful character, but in more ways like a showman character.
02:23:46.000What he's doing is he's putting on a great show, and he's going to win, probably, because he's putting on such a great show, and people like a great show.
02:23:56.000I do think I'm now among the people who think we're witnessing something new with Trump.
02:24:04.000It's not just the same old thing where the process is so onerous that it's selecting for the kind of narcissist or thick-skinned person who is willing to submit to the process, and then many,
02:24:20.000if not most, of the good people just aren't going to put up with it.
02:24:23.000I mean, yes, there's that too, but there's something...
02:24:28.000It's a moment among the electorate where...
02:24:35.000There's enough of an anti-establishment...
02:28:12.000Now, this gets stated as, yeah, we're going to round them up and send them back to Mexico.
02:28:18.000And what worries me is no one seems to care that if you just look at the implications of doing this, this one policy claim alone is so impractical and unethical.
02:28:35.000Your gardener, your housekeeper, the person who works at the car wash, the person who picks the vegetables that you buy in the market is going to get a knock on the door in the middle of the night by the Gestapo and get sent back to...
02:28:50.000The vast majority of these people are law-abiding people who are just working at jobs that Americans, by and large, don't want to do.
02:29:16.000Held in isolation from all of the other things he said, the crazy things like climate change is a hoax concocted by the Chinese to destroy our manufacturing base and the fact that he likes Putin.
02:29:28.000I mean, everything else he said, right?
02:29:29.000This one policy claim alone should be enough to disqualify a person's candidacy.
02:29:35.000It's so crazy the moment you look at it.
02:30:10.000There's not an easy way to do it if you're poor, you don't have any qualifications for any unusual job, and you're trying to get across from Mexico.
02:30:19.000But everybody who does it does it because they want to improve their life.
02:30:22.000And the idea that one group of people shouldn't be able to do it, and another group should, just because they were born on the right side of some
02:30:28.000strange line that is only a couple hundred years old.
02:30:32.000But actually, I'll go further in meeting him in the middle.
02:30:36.000So I think we should be able to defend our borders, right?
02:30:40.000I don't have a good argument for having a porous border that we can't figure out how to defend and we don't know who's coming into the country.
02:30:48.000I think building the wall is almost certainly a stupid idea among his many stupid ideas, but I think it would be great to know who's coming in the country and have a purely legal process by which that happened.
02:31:02.000Ultimately, that's got to be the goal, right?
02:31:10.000So I don't have an argument for open borders or porous borders, but the question is, what do you do with 11 or 12 million people who are already here doing jobs we want them to do that help our society?
02:31:23.000And the vast majority of them are law-abiding people who, as you say, are just trying to have better lives.
02:31:30.000The idea that you're going to break up families and send people back by the millions and the idea that you're going to devote your law enforcement resources to doing this when you have real terrorism and real crime to deal with is just pure insanity and also totally unethical.
02:31:50.000And yet he doesn't get any points docked for this aspiration.
02:31:56.000It's one of the things around which people are rallying.
02:32:00.000But the climate change thing is also insane and dangerous.
02:32:32.000I'm going to hire some people who actually know how to run things.
02:32:37.000The smart people who are voting for him think, and this is, I think, a crazy position, but they think that...
02:32:46.000he is just pandering to the idiots who he needs to pander to to get into office.
02:32:53.000So he's not disavowing the white supremacist vote with the alacrity that you would if you were a decent human being and you found out that David Duke supported you.
02:33:05.000Because he needs those votes and he knows that most of the people in his base aren't going to care and he can just kind of move on in the news cycle.
02:33:16.000And he's doing this on all these issues where smart people see that he looks like a buffoon and the people who don't like him are treating him as a comic figure who...
02:35:00.000But I think people think that he's got to be much more sophisticated than he is, and that if he got into office, he would just be a totally sober and presidential person.
02:35:13.000There's just no reason to believe that.
02:35:15.000I mean, if he thinks climate change is a hoax, and that we should pull out of the Paris Accords, and we should ramp up coal production, and we're going to bring back the coal jobs, I mean, this is what he's saying, right?
02:35:26.000There's no reason to think he doesn't believe this at this point.
02:35:31.000It is a disastrous thing for a president to think.
02:35:36.000The only fascinating versions of this that I've been hearing from people that I respect involve the idea that he is like...
02:35:48.000the political version of the asteroid that killed the dinosaurs.
02:35:50.000He's going to come down and smash it, and it's going to be so chaotic that they're going to be forced to reform the system, and people are going to respond in turn.
02:35:59.000The way people are responding against factory farming and more people are going vegan, that kind of a thing.
02:36:03.000They're going to see it, and they're going to respond in turn.
02:36:07.000So he's going to toss the apple cart up in the air.
02:36:10.000He's just going to fuck this whole goofy system up and then we'll be able to rebuild after Trump has dismantled all the different special interest groups and lobbyists and all the people that we really would like to get out of the system.
02:36:21.000We really don't like the fact that there's such insane amounts of influence that big corporations and lobbyists have had on the way laws get passed.
02:37:45.000Almost any process by which you would change the system is more intelligent than that.
02:37:50.000And it's also not valuing how much harm one bad president could do.
02:37:57.000I haven't tested this, but I'm imagining that even Trump supporters would answer this question the way I would hope, which is: if I had a crystal ball that can't tell you who's going to be president, but tells you how things work out for the next president...
02:38:15.000If I look in this crystal ball and it says the next president of the United States is a disaster.
02:38:21.000It's like the worst president we've ever had.
02:38:24.000Just think of failures of governance and the toxic influence of narcissism and hubris that comes along just like once every thousand years.
02:40:04.000A lot of people are saying things like that, but they're not hearing just how nihilistic that is, if true.
02:40:17.000There's so much stuff we have to get right.
02:40:19.000And the only tool to get it right is having your mind actually understand what's going on in the world and how to manipulate the world in the direction you want it to go.
02:40:34.000So you have to understand whether or not climate change is real. Your beliefs about it have to be representative of that truth.
02:40:43.000Let's say I'm mistaken and there is no human cause.
02:40:51.000And every moment spent thinking about it, worrying about it, correcting for it is just a waste of time that's throwing away the wealth of the world.
02:41:04.000So it really matters who's right about that.
02:41:07.000And the fact that we have a president, or a candidate, who is coming in saying this is all bullshit, in defiance of all of the science... and it's like this on every other point.
02:41:26.000I guarantee you he doesn't know the difference between Sunni and Shia Islam or which countries are Sunni predominantly and which are Shia predominantly.
02:41:35.000And I mean, I'm sure he's going to do...
02:41:37.000I don't know when he's going to cram for this final exam.
02:41:39.000I'm sure before one of those debates he's going to get...
02:41:41.000Someone's going to sit down with him and give him some bullet points he's got to have in his head.
02:42:27.000I don't know, but we've got to figure out how to run this thing.
02:42:30.000We had no previous understanding of government.
02:42:32.000Would you think anybody would say, we need one dude to just run this whole giant continent filled with 300 million people?
02:42:38.000Most likely, if we woke up and we had technology like we have today, and we had the ability to communicate like we have today with social media and whatever, we would probably say we need to, like, figure this out amongst each other and find the people that are the most qualified for each one of these positions and start running our government that way.
02:42:56.000Well, that's what we're attempting to do, but it's just...
02:42:59.000And I totally agree with you that it is astonishing that out of a nation of 300 million people, these are the choices.
02:43:06.000You would think, starting from your zero set point of just, you know, now we're going to reboot civilization.
02:43:16.000You would think that if you had this kind of process, each candidate would be more impressive than the last.
02:43:23.000I mean, you'd be like, I can't believe...
02:43:25.000each person who came to the podium would be so impressive.
02:43:43.000It'd be like, you'd be talking about the science of climate change, you'd be talking about the actual dynamics of the war on terror.
02:43:51.000On topics that seem to have no relationship, where you'd be amazed that anyone could be an expert in all of them, you would find someone who is a functional expert in all of them.
02:44:04.000Yeah, but someone who's also ethically wise, who wasn't obviously an asshole, and who had a mature relationship to changing his or her mind,
02:44:48.000It's so taboo to change your mind that either you have to lie about it or you have to pretend it was always that way or it's just a...
02:44:58.000I mean, the system is broken in that respect, but given the choices, you know, when you have a choice between someone who, for all her flaws...
02:45:11.000has been in the game long enough to be really well informed and capable of compromise and capable of not just breaking things.
02:46:02.000So because of him saying crazy stuff, he accelerated the amount they were talking about him.
02:46:06.000So they were constantly talking about him and barely talking about other people.
02:46:10.000But he's created a wormhole in our political process now where there's nothing so crazy that could disqualify him among the people who like him now.
02:46:18.000So he can just keep dropping nuclear bombs of craziness that the press can't ignore, and every time they think, okay, this is the crazy thing he said that's going to harm his candidacy, so let's shine a light on it, it just helps him.
02:46:34.000He could get on Twitter right now and say, you know who I'd like to fuck?
02:48:35.000That's what you're doing if you're hiring someone like this. I mean, yeah, in the best case, what you stated earlier would in fact be true, which is that he'll get into the Oval Office, and even he will be scared of the prospect that he's now running the better part of human civilization, and he will hire the best people, or some semblance of the best people he can get access to, and say...
02:49:07.000And then it'll essentially be business as usual, right?
02:49:11.000Insofar as you've hired the best people, they will be people who are...
02:49:14.000deeply in this game already, right?
02:49:17.000You know, he'll defer to the generals when it comes time to make war.
02:49:20.000Being really pragmatic about how they pick politicians and how they push certain people and decide not to push others, do you think that something like Trump completely changes how they move forward now?
02:50:11.000But that sort of ability to excite people.
02:50:15.000We're going to get one of those motivational speaker dudes, one of those guys who wears a lot of yoga pants, and he's going to be the next president.
02:50:44.000So Trump was being very combative with the press pool, and he was basically shouting them down, not answering any of the questions.
02:50:52.000And one journalist, just aghast, said, is this what it's going to be like when you're president?
02:50:57.000Is this what it's going to be like to be in the White House press corps and deal with you?
02:51:03.000And he said, yes, this is exactly what it's going to be like.
02:51:07.000But you could just see that the journalists, they turn the camera on in the room of journalists, and they are astonished by what is happening here.
02:52:01.000But I had no choice but to do this because the press was saying I didn't raise any money for them.
02:52:05.000Not only did I raise it, much of it was given a long time ago.
02:52:08.000And there is a vetting process, and I think you understand that.
02:52:11.000But when I raise almost six million dollars, and probably in the end we'll raise more than six because more is going to come in and is coming in.
02:52:18.000But when I raise 5.6 million as of today, more is coming in.
02:52:22.000And this is going to phenomenal groups, and I have many of these people vetting the people that are getting the money and working hard...
02:52:30.000You played the moment I was referring to.
02:52:34.000I mean, so here is a case where he's probably almost certainly lying about his history of giving to Veterans Affairs.
02:52:41.000And he gave money very recently after people started fishing around to see if he actually had given the money that he claimed to have given to veterans.
02:52:55.000What's difficult about this is that yes, the press is highly imperfect and also partisan and there are false stories and there are exaggerations and they screw people over, yes.
02:53:10.000And there are reasons to not trust the press from time to time.
02:53:20.000But in this case, there is no amount of fact-checking and disconfirmation of his statements that forces him to ever acknowledge anything that he's done wrong,
02:53:36.000and he pays no price for that lack of acknowledgement among the people who like him.
02:55:48.000Well, the thing about Cruz that never even got out, which was the reason to be scared about a Cruz presidency, was his level of religious craziness.
02:55:56.000I mean, no one was even pushing on that because there was just enough to push on before he even got to that door.
02:56:00.000Yeah, you have to hold on to those weapons.
02:56:03.000But I mean, had Cruz been the nominee, it would have been all about religion.
02:56:09.000What's odd is that that's not a handicap in 2016, that you can have that and people consider it an asset.
02:56:16.000Well, the one thing that's surprising and actually hopeful in Trump's candidacy...
02:56:23.000Is the fact that he has dissected out the religious, social, conservative component of the Republican Party.
02:56:33.000Evangelicals, for the most part, were going for Trump over Cruz when it was pretty clear to them that Trump was just pretending to be religious.
02:56:42.000So Trump gave one speech at, I think, Liberty University where he spoke.
02:56:47.000He said, you know, Corinthians 2, and that's not the way any Bible reader would speak about 2 Corinthians.
02:56:59.000Yeah, and so he said, Corinthians 2, as though this is something he just opened every night before he went to sleep.
02:57:07.000And so it was clear to them that he is just miming the language, you know, or...
02:57:19.000impersonating a person of faith, but they don't care, really, as long as he does it.
02:57:27.000And that is, if you're going to look for a silver lining to this, it shows that they just want a space where their religious convictions are not under attack, and they don't really care that the person in charge shares them.
02:57:45.000If you pretend to share them, that's good enough.
02:57:48.000And that's better than actually caring that this person really believe in the rapture or anything else that is quite obviously crazy.
02:57:59.000So I don't think any Christian who's voting for Trump thinks...
02:58:05.000I'm not going to judge another man's faith.
02:58:07.000Who am I to say what's really in his heart?
02:58:10.000They'll say that, but if you've been paying attention to who he's been, and if you just look at how he talks about these things, I don't think he's fooling any Christian.
02:58:24.000So I think they're willing to vote for someone.
02:58:26.000Now, for other reasons that are fairly depressing in their own right, they're willing to vote for someone who doesn't really play the game the way they do.
02:58:40.000You have to believe in God to be president in 2016, right?
02:58:47.000But I think with Trump, I think the pretense is...
02:58:53.000It's obvious enough that I don't think he's fooling the better part of the people who are voting for him, who would say they care about a person of faith being in the White House.
02:59:03.000So if anything, one thing he might be breaking is the barrier on having an atheist president, because, you know, nobody thinks he is a person of faith.
02:59:46.000Even an optimized process will require enough sacrifice of what ordinary people want most of the time that it will be an unusual personality who has to get promoted.
03:00:31.000You know, if you're going to scrutinize the kind of personality that could give rise to those opinions, it's not...
03:00:37.000Yeah, there are some dials you would probably want to tweak if you had to be married to this person; it's not an optimal personality.
03:02:08.000Yeah, for all her defects, she's very knowledgeable, and I'm sure, where she doesn't feel like she's got the knowledge, she's going to try to go to the source of the knowledge, right?
03:02:21.000Just grab the best experts she can find.
03:03:05.000There'd be no way for him to signal the fact that he's winging it more clearly than he is with everything he's doing, and yet there's no penalty.
03:03:14.000Do you think it's possible that in this age of information, the way we can communicate with each other, that we're going to experience these cycles, these waves, these in and outs, these high and low tides?
03:03:27.000Of really smart presidents and really stupid presidents.
03:06:19.000We don't want politics to be this interesting.
03:06:23.000November is going to be... if the polls are close, watching those debates and waiting for a swing in the polls as a result is just going to be way too interesting.
03:06:34.000It's going to be like watching the Super Bowl, those first debates.
03:06:38.000It's going to be 100 million people watching those debates.
03:06:48.000I think it's entirely possible that this whole thing was a plot that didn't work out.
03:06:53.000I think he probably came out of the gate saying crazy shit, thinking he would tank the Republican Party and get his friend Hillary Clinton into the White House.
03:07:05.000He kept trying to insult her, kept trying to make stuff up about Mexicans, and it just kept making him get better and better, and now he's stuck.
03:07:19.000Well, we're going to have to go through something like this in order for us to realize that this is crazy, that a guy can just do this, can just not really have any interest in politics.
03:07:27.000But if he pulled out, then he should get the Nobel Prize for everything.
03:07:30.000If he pulls out at this point and says, listen, I took you to the precipice here.
03:07:35.000Just because I wanted you to recognize how unstable this situation is.
03:07:40.000You guys could elect a demagogue who...
03:07:47.000It's actually an incoherent demagogue.
03:07:49.000I haven't even been playing a coherent authoritarian.
03:07:53.000I'm, on the one hand, very liberal and tolerant, and on the other hand, I'm getting ready to be Hitler, and you guys can't figure out who I am, and yet you're still prepared to vote for me.
03:08:08.000For him to do a post-mortem on his punking of the culture, that would be the best thing to ever happen.
03:08:17.000But I don't think that's what's happening.
03:08:19.000Do we need someone like this so that we realize how silly this whole thing is?
03:08:25.000We need a qualified person to deal with all of the other hassles and dangers that are coming our way that have nothing to do with what we do.
03:10:26.000Yeah, and the lighter weights, they were always badass.
03:10:28.000But I think that maybe that's what's going on.
03:10:30.000Maybe we need to have this bad season, get this season out of the way, realize the danger of having an inept person in office, whether it's a liar, or a dude who hates money, or Trump, whoever it is.
03:10:44.000Just go through it and realize how silly it is that we have it set up this way still.
03:12:01.000It's just, you know, like Architectural Digest does, you know, The Eagle's Nest.
03:12:05.000But it's at a time where it's not too far away from a moment where it should have been absolutely obvious to every thinking person that this guy was going to try to, you know, conquer the world for evil, right?
03:12:47.000I mean, he's given voice to a kind of authoritarianism That, you know, some people are—his enemies are noticing, his friends are discounting, but he's talked about, you know, going after the press, and I mean, he's bragged about how many people he's going to torture,
03:13:03.000He's talked about, you know, well, of course we're going to do waterboarding, and we're going to do worse, and maybe we'll kill the families of terrorists, right?
03:13:11.000And he—but there's a kind of a— It's going to make America great again.
03:13:20.000What would he do if he actually had more power than anyone in the world?
03:13:28.000The transition from comedy to, oh my god, we can't take this back in anything like short order, that could well be terrifying.
03:13:44.000To go back to the question of heavyweights, why do you think you could be a fake heavyweight and not a fake middleweight?
03:13:52.000There's not that many really good athletes that go to boxing when they're really large.
03:13:56.000They tend to go to football or basketball if they're really tall.
03:14:00.000If you look at the amount of money that guys in the NBA can make or guys in the NFL can make, the really top-level guys can make a tremendous amount of money.
03:14:10.000So when you get the really super-athlete guys, they tend to gravitate towards the big names.
03:14:16.000I mean, there's no bigger name sport than football.
03:14:18.000So getting someone to abandon the whole team thing and having the balls to go one-on-one in a cage and having that mentality, that's also very different.
03:14:27.000Because it's not necessarily the smartest thing to do, but it's the most challenging thing to do.
03:14:32.000And there's some really smart people that do it.
03:14:34.000So even though cage fighting isn't the safest way to get through life, for a lot of people that engage in it, it becomes an extremely difficult pursuit.
03:14:47.000And then that's what it becomes to them.
03:14:48.000You know, and in the heavyweight division, those guys were being lured in other directions.
03:14:54.000And boxing just kind of went through peaks and valleys.
03:14:57.000It went Ali, and then it went Larry Holmes.
03:14:59.000And even though Larry Holmes was amazing, people didn't appreciate him for how good he was.
03:15:03.000So that doesn't happen at the middleweight level—it's not the same competition for that kind of athlete?
03:15:08.000They get little lulls in the middleweight division, but it's always pretty fucking strong.
03:15:13.000But why wouldn't you have the same competition for the high-level athlete at the 165 weight?
03:15:22.000Our favorite sports require bigger people, like basketball and football, and baseball doesn't really apply here.
03:15:29.000The amount of cultures that produce heavyweights, first of all, are fairly limited.
03:15:34.000Like very few heavyweights have come from Asia, except like Polynesian guys, which I guess is kind of Asian—like Samoans.
03:15:45.000Samoans are known to be great fighters, and they're giant, sturdy heavyweights.
03:15:49.000The Chinese don't really produce them that often.
03:16:07.000But I mean if we had a guy that was like a Japanese version of Mike Tyson, just a super fast blinding knockout fighter with a fucking head like a brick wall and a giant neck that started above his ears and went down to his traps.
03:16:20.000Remember Tyson when he first came on the scene?
03:16:37.000In a lot of poor countries you'll see much smaller men—like, some men are flyweights. It's very rare you find an American flyweight; most Americans are larger. They get more food, right? I think that probably has a lot to do with it, or just the genetics in general. But South America produces a lot of flyweights. Like the Philippines—that's of course where Manny Pacquiao came from, and he was like eight weight classes lower when he first started, right? And being a great athlete at 120 pounds or 130 pounds is not—
03:17:32.000There's like little peaks and valleys where greatness comes in and then people have to recover and then new people come along that are great.
03:17:37.000But there's always been pretty steady.
03:19:08.000I think, well, when it comes to just freak movements, I always think that the flyweights and the bantamweights, the 125s and 135s, are the fastest and the best guys.
03:19:17.000They're moving like 20% faster than anybody else.
03:19:19.000But I always wonder how much of that is because they're just not affected by gravity as much.
03:19:23.000And they're also not affected by the blows that are being landed by the other guy.
03:19:40.000I mean, he fought this guy, Henry Cejudo, an Olympic gold medalist, one of the best wrestlers to ever compete in MMA. I mean, he is just a stud wrestler and a really good kickboxer, too.
03:19:50.000And Mighty Mouse clinched up with him and hit him with these knees to the body that were just out of this world technical.
03:20:34.000There's another one between Anderson Silva and Rich Franklin, but that was like a prolonged, brutal beatdown where Anderson just kept beating him up and beating him up in the clinch and broke his nose.
03:21:57.000Well, this is kind of a different point, but you could drop an ant off the Empire State Building, and it'll fall and hit the ground and be fine.
03:22:07.000If you drop a horse off the Empire State Building, it's going to be a liquid horse.
03:23:00.000If you're going to engineer the super athlete, if we're going to give you chimpanzee muscle proteins or whatever to make you super explosive and strong, you'd have to get that right with your connective tissue and your bones and everything else because you could rip your own arm off with your ballistic moves.
03:24:15.000You know, just kind of dragging it around.
03:24:16.000I mean, it wasn't looking aggressive toward the child, but just the fact that it moved it around with that kind of force, who knows what was going to happen.
03:24:25.000I mean, that looked like you had to end that as quickly as possible.
03:24:28.000We can't assume that that gorilla is going to know that a human baby is more fragile than a baby gorilla.
03:24:44.000No, it's totally tragic, and I'm sure the parents and the zoo are reaping sufficient criticism, but once that situation is unfolding, I think, I mean, you can't tranquilize it because it doesn't work fast enough.
03:27:56.000Obviously you can preserve them in all kinds of technical ways, like have their DNA frozen and be able to reboot them at a certain point when we figure out how to preserve their habitat.
03:28:11.000I mean, I gotta think there's a role for good zoos.
03:28:16.000Also, you just want to maintain the public's connection to these animals, because the decision to destroy habitat is made by people who don't really care about the prospects of extinction,
03:28:32.000It's a very good point when you present it that way, because the people that are over there are facing...
03:28:36.000I mean, any people that are over in Africa trying to save gorillas and chimps, I mean, that is an unbelievably difficult struggle, and they might not make it.
03:28:45.000I mean, there's a real concern that if there was no regulation at all, and there was no one telling anybody what to do, that they could just go in there and wipe them all out.
03:28:54.000Well, historically, that's what we've done, right?
03:28:56.000With kind of everything that we've profited from.
03:28:59.000Anything that you can make money off of?
03:29:19.000Well, it's just they're hunting species that you don't think of as food species, but they're eating monkeys and gorillas.
03:29:27.000And that's why they call it bush meat?
03:29:29.000Well, I mean, bush is like the jungle.
03:29:33.000So it's just hunting species that are...
03:29:40.000There's the other component of it, which is the crazy ideas that the Chinese have about the medicinal properties of tiger bone wine or rhino horn.
03:29:50.000So you have these species that are being hunted by poachers because there's a market for their parts, like the ivory trade.
03:30:02.000But some people just eat species that are endangered, too.
03:30:06.000The term bushmeat is always associated with primates for some reason.
03:30:10.000I was always trying to figure out why.
03:31:40.000Well, actually, to go back to our cultured meat conversation, one thing that's weird about that prospect is that if you're just growing cells in a vat, then there's no problem with cannibalism.
03:31:56.000So you could be growing human meat in a vat.
03:32:00.000There's zero ethical problem, but it's just as grotesque as, at least to my palate, it's a fairly grotesque thing to contemplate.
03:32:16.000There is no problem, in principle—it's just human DNA. And at the cellular level, the...
03:32:26.000The difference between human muscle protein and bovine muscle protein... if this was never attached to an animal, we're dealing with concepts here.
03:32:41.000If you bite a fingernail and swallow it, are you practicing autocannibalism?
03:33:50.000Well, we figured out a way to live in harmony with nature.
03:33:52.000We just have to kill everything except us and then eat ourselves.
03:33:56.000Tell us which part of your own body you want to eat for the rest of your life and we will culture those cells.
03:34:01.000Well, I know it was you that I was having this conversation with once, I believe, where we were talking about how when areas become more educated and women become more educated, it tends to slow the population down.
03:34:12.000People tend to even worry that, if these graphs continue, people in industrialized parts of the world—as they get into the first world, if they do—are more likely to have fewer and fewer children.
03:34:30.000Fertility goes down with literacy and education among women, yeah.
03:34:36.000And so just to kind of map that onto life as you know it here—so women, given all the choices available, educational, economic, and an ability to plan a pregnancy...
03:34:53.000So here we have women who want to have careers, want to go to college, and they delay pregnancy to the point where they have realized a lot of those aspirations, and so pregnancies come later and later and later,
03:35:13.000Virtually no one chooses to have 10 kids in the face of all of this other opportunity—all the things they also want out of life, right?
03:35:28.000If you can't avoid it, well then you just find yourself with ten kids, right?
03:35:31.000Or if you have some religious dogma which says that, though it's possible to avoid it, you shouldn't, because you were put here to have as many kids as possible.
03:35:39.000But are you allowed to bring that up when you talk about the population crisis?
03:36:11.000No, there's an overpopulation crisis in certain countries and disproportionately in the developing world.
03:36:20.000And there is underpopulation in the developed world.
03:36:24.000Most of Western Europe is not replacing itself.
03:36:28.000So you're having these senescent populations who just have to import people—they rely on immigration to carry on the functions of society, because they're not anywhere near a replacement rate.
03:36:48.000The most surprising detail that brings this home—now, this is Japan—is that there are more adult diapers sold in Japan than baby diapers.
03:37:02.000Now, just think about the implication of that for a society, right?
03:37:06.000How do you have a functioning society, barring perfect robots that can tend to your needs, where you have a disproportionate number of people who are no longer economically productive,
03:37:23.000relying on the labor of the young to keep them alive and cure their diseases and defend them from crime, all that.
03:37:35.000But the ratio is totally out of whack.
03:37:42.000The world is a giant Ponzi scheme on some level.
03:37:44.000You need new people to come in to maintain it for the old people, apart from having some technology that allows you to do that without people.
03:37:56.000But I think everything I've heard about population recently suggests that we are on course globally to peak around 9.5 billion people and then taper off.
03:38:08.000I don't think anyone now is forecasting this totally unsustainable growth where we're going to wind up with—did I say million? I meant billion—
03:38:19.000Where we're going to hit something like 20 billion people, right?
03:38:23.000I don't think anyone, even the most Malthusian people, are expressing that concern at the moment, which was the case like 20 or 30 years ago where they thought this is just going to keep going and we're going to hit the carrying capacity of the earth,
03:38:40.000which is something like 40 billion people.
03:39:14.000Is it ever going to be possible to completely eliminate poverty worldwide and within a lifetime?
03:39:20.000Well, I think we talked about this the last time when we spoke about AI, but this is the implication of much of what we talked about here.
03:39:29.000If you imagine building the perfect labor-saving technology—a machine that can build any machine that can do any human labor, powered by sunlight, more or less for the cost of raw materials,
03:41:53.000Because literally we're talking about—and many people may doubt whether such a thing is possible—but again, we're just talking about the implications of intelligence that can make refinements to itself over a time course that bears no relationship to what we experience as apes,
03:42:17.000So you're talking about a system that can make changes to its own source code and become better and better at learning and more and more knowledgeable, if we give it access to the Internet.
03:42:29.000It has instantaneous access to all human and machine knowledge, and it does thousands of years of work every day of our lives.
03:42:42.000Thousands of years of equivalent human-level intellectual work.
03:42:49.000Our intuitions completely fail to capture just how immensely powerful such a thing would be, and there's no reason to think this isn't possible.
03:42:58.000The most skeptical thing you can honestly say about this is that this isn't coming soon.
03:43:04.000But to say that this is not possible makes no scientific sense at this point.
03:43:10.000There's no reason to think that a sufficiently advanced digital computer can't instantiate general intelligence of the sort that we have.
03:43:23.000Intelligence has to be, at bottom, some form of information processing.
03:43:27.000And if we get the algorithms right, with enough hardware resources—and the limit at this point is definitely not the hardware, it's the algorithms—
03:43:40.000there's just no reason to think this can't take off and scale, and that we would be in the presence of something like an alternate human civilization in a box making thousands of years of progress every day,
03:43:58.000So just imagine if you had in a box, you know, the 10 smartest people who've ever lived.
03:44:03.000And, you know, every time, every week, they make 20,000 years of progress, right?
03:44:08.000Because that is the actual—we're talking about electronic circuits being a million times faster than biological circuits.
03:44:16.000So even if it was just—and I believe I said this the last time we talked about AI, but this is what brings it home for me— Even if it's just a matter of faster, right?
03:46:10.000You know, we say, you know, cure Alzheimer's and it cures Alzheimer's.
03:46:13.000You know, you solve the protein folding problem and it's just off and running and to develop a perfect nanotechnology and it does that.
03:46:22.000This is all, again, going back to David Deutsch, there's no reason to think this isn't possible because anything that's compatible with the laws of physics can be done given the requisite knowledge, right?
03:46:35.000So you just, you get enough intelligence and I don't know.
03:47:03.000If Donald Trump is president, what's Donald Trump going to do with a perfect AI when he has already told the world that he hates Islam, right?
03:47:15.000We would have to have a political and economic system that allowed us to absorb this ultimate wealth-producing technology.
03:47:26.000And again, so this may all sound like pure sci-fi craziness to people.
03:47:31.000I don't think there is any reason to believe that it is.
03:47:34.000But walk way back from that edge of craziness and just look at dumb AI, narrow AI, just self-driving cars and automation and intelligent algorithms that can do human-level work.
03:47:52.000That is already poised to change our world massively and create massive wealth inequality, which we have to figure out how to spread this wealth.
03:48:01.000You know, what do you do when you can automate 50% of human labor?
03:48:07.000Were you paying attention to the artificial intelligence Go match?
03:49:57.000And it's using a technique called deep learning for that.
03:50:02.000And that's been very exciting and will be incredibly useful.
03:50:07.000The flip side of all this—I know that everything I tend to say on this sounds scary—but the next scariest thing is not to do any of this stuff.
03:50:30.000It's scary that we have a system where if you gave the best possible version of it to one research lab or to one government...
03:50:39.000It's not obvious that that wouldn't destroy humanity.
03:50:44.000It's not obvious that it wouldn't lead to massive dislocations, where you'd have some trillionaire trumpeting his new device and 50% unemployment in the U.S. in a month.
03:50:56.000It's not obvious how we would absorb this level of progress.
03:51:02.000And we definitely have to figure out how to do it.
03:51:06.000And of course we can't assume the best case scenario, right?
03:51:11.000I think there's a few people that put it the way you put it that terrify the shit out of people.
03:51:17.000And everyone else seems to have this rosy vision of increased longevity and automated everything and everything fixed and easy to get to work.
03:51:36.000I mean, this idea of a living thing that's creative and wrapped up in emotions and lust and desires and jealousy and all the pettiness that we see celebrated all the time—we still see it.
03:52:03.000You can live three times as long without that stuff.
03:52:06.000I think it would, in the best case, usher in...
03:52:16.000the possibility of a fundamentally creative life, on the order of something like The Matrix—whether it's in The Matrix or just in a world that has been made as beautiful as possible, based on what would functionally be an unlimited resource of intelligence.
03:53:06.000The inventor of game theory, a mathematician who, along with Alan Turing and a couple of other people, is really responsible for the computer revolution.
03:53:18.000He was the first person to use this term, singularity, to describe just this: a speeding up of information-processing technology, and a cultural reliance upon it, beyond which we can't actually foresee the level of change that can come over our society.
03:53:40.000It's like an event horizon past which we can't see.
03:53:45.000And this certainly becomes true when you talk about these intelligent systems being able to make changes to themselves.
03:54:19.000But we will get more hardware, too, up to the limits of physics.
03:54:23.000And it will get smaller and smaller, as it has.
03:54:25.000And if quantum computing becomes possible or practical, that will...
03:54:33.000Actually, David Deutsch, the physicist I mentioned, is one of the fathers of the concept of quantum computing.
03:54:42.000That will open up a whole other area, you know, extreme of computing power that is not at all analogous to the kinds of machines we have now.
03:55:30.000How we currently use computers, that they just keep helping us do what we want to do.
03:55:37.000Like, we decide what we want to do with computers, and we just add them to our process, and that process becomes automated, and then we'll find new jobs somewhere else.
03:55:46.000Like, you don't need a stenographer once you have voice recognition technology, and that's not a problem.
03:55:52.000A stenographer will find something else to do, and so the economic dislocation isn't that bad.
03:55:59.000Computers will just get better than they are, and eventually Siri will actually work, and she'll answer your questions well, and it's not going to be a laugh line, what Siri said to you today.
03:56:10.000And then all of this will just proceed to make life better, right?
03:56:17.000Now, none of that is imagining what it will be like to make...
03:56:23.000Because there will be a certain point where you'll have systems that are...
03:56:30.000The best chess player on Earth is now always going to be a computer.
03:56:34.000There's not going to be a human born tomorrow that's going to be better than the best computer.
03:56:41.000We have superhuman chess players on Earth.
03:56:45.000Now imagine having computers that are superhuman at every task that is relevant—every intellectual task.
03:57:05.000There's no reason why we're not headed there.
03:57:10.000The only reason I could see we're not headed there is that something massively dislocating happens that prevents us from continuing to improve our intelligent machines.
03:57:19.000But the moment you admit that intelligence is just a matter of information processing...
03:57:25.000And you admit that we will continue to improve our machines unless something heinous happens, because intelligence and automation are the most valuable things we have.
03:57:36.000At a certain point, whether you think it's in five years or 500 years, we are going to find ourselves in the presence of super intelligent machines.
03:57:48.000The best source of innovation for the next generation of software or hardware or both will be the machines themselves, right?
03:57:58.000So then that's where you get what the mathematician I.J. Goode described as the intelligence explosion, which is just the process can take off on its own.
03:58:10.000And this is where the singularity people either are hopeful or worried, because there's no guarantee that this process will remain aligned with our interests.
03:58:26.000And every person who I meet, even very smart people like Neil, who say they're not worried about this—when you actually drill down on why they're not worried, you find that they're actually not imagining machines making changes to their own source code.
03:58:49.000Or they simply believe that this is so far away that we don't have to worry about it now.
04:00:14.000I mean, the sense I get from the people who are doing this work is that it's far more likely to be 50 years than 500 years.
04:01:05.000Very little progress to, wow, this is all of a sudden really, really interesting and powerful.
04:01:13.000And again, progress is compounding in a way that's counterintuitive.
04:01:19.000People systematically overestimate how much change can happen in a year and underestimate how much change can happen in 10 years.
04:01:27.000And as far as estimating how much change can happen in 50 or 100 years, I don't know that anyone is good at that.
04:01:36.000How could you be? With giant leaps come giant exponential leaps off those leaps, and it's almost impossible for us to really predict what we're going to be looking at 50 years from now. But I don't know what they're going to think about us. That's what's most bizarre about it—we really might be obsolete, if we look at how ridiculous we are. Look at this political campaign.
04:01:59.000Look at what we pay attention to in the news.
04:02:01.000Look at the things we really focus on.
04:02:28.000There are computer scientists who, when you talk about why they're not worried, or talk to them about why they're not worried, they just swallow this pill without any qualm.
04:02:41.000We're going to make the thing that is far more powerful and beautiful and important than we are, and it doesn't matter what happens to us.
04:03:01.000And I've literally heard someone give a talk.
04:03:05.000I mean, that's what woke me up to how interesting this area is.
04:03:10.000I went to this conference in San Juan about a year ago.
04:03:17.000The people from DeepMind were there, and the people who were very close to this work were there.
04:03:24.000To hear some of the reasons why you shouldn't be worried from people who were interested in calming the fears so they could get on with doing their very important work, it was amazing.
04:03:38.000They were highly uncompelling reasons not to be worried.
04:04:53.000And if they suddenly get there and sort of overshoot a little bit, and now they've got something like, you know, general intelligence, you know, or something close, what we're relying on, and they know everyone else is attempting to do this, right?
04:05:08.000We don't have a system set up where everyone can pull the brakes together and say, listen, we've got to stop racing here.
04:05:21.000This truly has to be open source in every conceivable way, and we have to diffuse this winner-take-all dynamic.
04:05:31.000I think we need something like a Manhattan Project to figure out how to do that.
04:05:36.000Not to figure out how to build the AI, but to figure out how to build it in a way that does not create an arms race, that does not create an incentive to build unsafe AI, which is almost certainly going to be easier than building safe AI, and just to work out all of these issues.
04:05:53.000Because I think we're going to build this by default.
04:05:57.000We're just going to keep building more and more intelligent machines.
04:06:08.000With each generation, if we're even talking about generations, it will have the tools made by the prior generation that are more powerful than anyone imagined 100 years ago, and it's going to keep going like that.
04:06:21.000Did anybody actually make that quote about giving birth to the mechanical gods?
04:06:50.000A caveat here is that unless they're not conscious.
04:06:55.000The true horror for me is that we can build things more intelligent than we are, more powerful than we are, and that can squash us, and they might be unconscious.
04:07:08.000The universe could go dark if they squash us.
04:08:02.000The ethical silver lining, and speaking outside of our self-interest now, but just from a bird's eye view, the ethical silver lining to building these mechanical gods that are conscious is that, yes, in fact,
04:08:18.000if we have built something that is far wiser and has far more beautiful and deeper experiences of the universe than we could ever imagine.
04:08:28.000And there's something that it's like to be that thing.
04:08:36.000Well, that would be a very good thing.
04:08:38.000Then we will have built something that was...
04:08:41.000If you stand outside of our narrow self-interest...
04:08:44.000I can understand why he would say that.
04:08:47.000He was just assuming—what was scary about that particular talk is he was assuming that consciousness comes along for the ride here, and I don't know that that is a safe assumption.
04:08:58.000Well, and the really terrifying thing is: who's in control? If this is constantly improving itself, and it's at the beck and call of a person, then what?
04:09:09.000So it's either conscious where it acts as itself, right?
04:09:14.000It acts as an individual thinking unit, right?
04:09:48.000Just imagine, again, you could keep it in the most restricted case, you could just keep it at our level, but just faster, just a million times faster.
04:09:59.000But if it did all these things—if it kept going, and every week was thousands of years—we're going to control it?
04:10:42.000But just imagine this emerging in some way online, already being out in the wild.
04:10:47.000So let's say it's in a financial market.
04:10:52.000Again, what worries me most about this, and what is also interesting, is that our intuitions here fail us. I think the primary intuition that people have is: no, no, no, that's just not possible, or not at all likely.
04:11:07.000But if you're going to think it's impossible, or even unlikely, you have to find something wrong with the claim that intelligence is...
04:11:18.000just a matter of information processing.
04:11:21.000I don't know any scientific reason to doubt that claim at the moment.
04:11:27.000And very good reasons to believe that it's just undoubtable.
04:12:37.000We're going to get there and it's either not going to happen or it's going to be trivial.
04:12:42.000But if you don't have an argument for why this isn't going to happen, then you're left with: okay, what's it going to be like to have systems that are better than we are at everything in the intellectual space?
04:13:08.000What will happen if that suddenly happens in one country and not in another?
04:13:14.000It has enormous implications, but it just sounds like science fiction.
04:13:19.000I don't know what's scarier: the idea that an artificial intelligence can emerge that's conscious, aware of itself, and acts to protect itself, or the idea that a regular person, a person of today, could be in control of essentially a god.
04:13:38.000Because if this thing continues to get smarter and smarter with every week and more and more power and more and more potential, more and more understanding, thousands of years, I mean, it's just...
04:13:48.000This one person—a regular person—controlling that is almost more terrifying than creating a new life.
04:13:55.000Or any group of people who don't have the total welfare of humanity as their central concern.
04:14:02.000So just imagine, what would China do with it now?
04:14:05.000What would we do if we thought China, Baidu or some Chinese company was on the verge of this thing?
04:14:13.000What would it be rational for us to do?
04:14:15.000I mean, if North Korea had it, it would be rational to nuke them, given what they say about their relationship with the rest of the world.
04:14:35.000But to wind this back to what someone like Neil deGrasse Tyson would say is that the only basis for fear is, yeah, don't give your super-intelligent AI to the next Hitler, right?
04:14:49.000But if we're not idiots and we just use it well, we're fine.
04:14:56.000And that, I think, is an intuition that's just a failure to unpack what is entailed by, again, something like an intelligence explosion.
04:15:09.000Once you're talking about something that is able to change itself... So what would it be like to guarantee—let's say we decide, okay, we're just not going to build anything that can make changes to its own source code.
04:15:24.000Any change to software at a certain point is going to have to be run through a human brain, and we're going to have veto power.
04:15:32.000Well, is every person working on AI going to abide by that rule?
04:15:36.000It's like we've agreed not to clone humans, right?
04:15:39.000But are we going to stand by that agreement in the rest of human history?
04:15:44.000Is our agreement binding on China or Singapore or any other country that might think otherwise?
04:15:52.000And at a certain point, everyone's going to be close enough to making the final breakthrough that unless we have some agreement about how to proceed, someone is going to get there first.
04:16:11.000That is a terrifying scenario of the future.
04:16:15.000You know, you cemented this last time you were here, but not as extreme as this time.
04:16:20.000You seem to be accelerating the rhetoric.
04:16:36.000In defense of the other side, too, I should say that David Deutsch also thinks I'm wrong, but he thinks I'm wrong because we will integrate ourselves with these machines.
04:16:47.000There will be extensions of ourselves, and they can't help but be aligned with us because we will be connected to them.
04:16:54.000That seems to be the only way we can all get along.
04:16:57.000Yeah, but I just think there's no deep reason why.
04:17:01.000Even if we decided to do that, like in the U.S. or in half the world, one, I think there are reasons to worry that even that could go haywire.
04:17:10.000But there's no guarantee that someone else couldn't just build AI in a box.
04:17:16.000I mean, if we can build AI such that we can merge our brains with it, someone can also just build AI in a box, right?
04:17:26.000And then you inherit all the other problems that people are saying we don't have to worry about.
04:17:30.000If it was a good Coen Brothers movie, it would be invented in the middle of the presidency of Donald Trump.
04:17:36.000And so that's when AI would go live, and then AI would have to challenge Donald Trump, and they would have like an insult contest.
04:17:45.000That's when this thing becomes so comically terrifying, where it's just...
04:17:51.000Just imagine Donald Trump being in a position to make the final decisions on topics like this for the country that is going to do this almost certainly in the near term.
04:18:06.000It's like, should we have a Manhattan Project on this point, Mr. President?
04:18:14.000The idea that anything of value could be happening between his ears on this topic or a hundred others like it, I think is now really inconceivable.
04:18:26.000So what price might we pay for that kind of inattention and self-satisfied inattention to these kinds of issues?
04:18:37.000Well, this issue, if this is real, and if this could go live in 50 years, this is the issue.
04:18:44.000Unless we fuck ourselves up beyond repair before then and shut the power off, if it keeps going...
04:18:50.000Yeah, no, I think it is the issue, but unfortunately it's the issue that doesn't... it sounds like a goof.
04:19:14.000I mean, chess doesn't do it because chess is so far from any central human concern.
04:19:19.000But just imagine if your phone recognized your emotional state better than your best friend or your wife or anyone in your life and it did it reliably.
04:19:31.000And was your buddy like that movie with Joaquin Phoenix?
04:19:44.000I mean, you could do that without any...
04:19:47.000Any other ability in the phone, really.
04:19:49.000It doesn't have to stand on the shoulders of any other kind of intelligence.
04:19:56.000You could do this with just brute force in the same way that you have a great chess player that doesn't necessarily understand that it's playing chess.
04:20:06.000You could have the facial recognition of emotion and the tone of voice recognition of emotion, and the idea that it's going to be a very long time for computers to get better than people at that I think is very far-fetched.
04:20:24.000I was thinking, yeah, I think you're right.
04:20:26.000I was just thinking how strange would it be if you had like headphones on and your phone was in your pocket and you had rational conversations with your phone.
04:20:33.000Like your phone knew you better than you know you.
04:20:36.000Like, I mean, I don't know what to do.
04:20:37.000I mean, I don't think I was out of line.
04:23:19.000Once you opened up that box, that Pandora's box of artificial intelligence.
04:23:22.000I have a small question about AI that I haven't heard you guys discuss yet, and I've looked it up.
04:23:25.000Is there any sort of concept of, like, autism in AI? Like, a spectrum of AI? Like, there are dumb AI, and there's going to be smart AI, but...
04:23:42.000I mean, across the board, I think that superintelligence and motivation and goals are totally separable.
04:23:50.000So you could have a superintelligent machine that is purposed toward a goal that just seems completely absurd and harmful and non-commonsensical.
04:24:00.000And so the example that Nick Bostrom uses in his book, Superintelligence, which was a great book, and did more to inform my thinking on this topic than any other source.
04:24:13.000You could build a super-intelligent paperclip maximizer.
04:24:16.000Now, not that anyone would do this, but the point is you could build a machine that was smarter than we are in every conceivable way, but all it wants to do is produce paperclips.
04:24:26.000Now, that seems counterintuitive, but when you dig deeply into this, there's no reason why you couldn't build a superhuman paperclip maximizer.
04:24:38.000It just wants to turn everything, you know, literally the atoms in your body would be better used as paperclips.
04:24:44.000And so this is just the point he's making is that...
04:24:48.000Superintelligence could be very counterintuitive.
04:24:50.000It's not necessarily going to inherit everything we find as commonsensical or emotionally appropriate or wise or desirable.
04:25:00.000It could be totally foreign intelligence.
04:25:03.000Totally trivial in some way, you know, focused on something that means nothing to us but means everything to it because of some quirk in how its motivation system is structured, and yet it can build the perfect nanotechnology that will allow it to build more paperclips.
04:25:24.000At least, I don't think anyone can see why that's ruled out in advance.
04:25:28.000I mean, there's no reason why we would intentionally build that, but the fear is we might build something that either is not perfectly aligned with our goals and our common sense and our aspirations,
04:25:44.000or that could form some kind of separate instrumental goals to get what it wants that are totally incompatible with life as we know it.
04:25:55.000And that's, you know, I mean, again, the examples of this are always cartoonish.
04:26:00.000Like, you know, how Elon Musk said, you know, if you built a super-intelligent machine and you told it to reduce spam, well, then it could just kill all people.
04:26:07.000And that's a great way to reduce spam, right?
04:26:09.000But see, that's why it's laughable, but you can't assume the common sense will be there unless we've built it, right?
04:26:18.000Like, you have to have anticipated all of this.
04:26:19.000If you say, take me to the airport as fast as you can, again, this is Bostrom, and you have a super-intelligent automatic car, a self-driving car, you'll get to the airport covered in vomit because it's just going to go as fast as it can go.
04:26:37.000So our intuitions about what it would mean to be super-intelligent necessarily are...
04:26:46.000I mean, we have to correct for them, because I think our intuitions are bad.
04:26:51.000You're freaking me out, and you've been freaking me out for over an hour and a half.
04:26:55.000I'm freaked out that we did four and a half hours, and I thought we were coming up on three.
04:27:00.000Man, I hope you're wrong about all that stuff.