Dan Harris shares the story of a panic attack he had on air while filling in as a news reader on a morning news show, how he recovered, and why his doctor traced it in part to his recreational use of cocaine and ecstasy. Dan is a journalist and network news anchor; the conversation, which also features Sam Harris, ranges from drug use and antidepressants to brain-computer interfaces and the risks of artificial intelligence. If you're struggling with anxiety, depression, or panic attacks, please talk to a doctor if you can.
00:02:05.000So you start to worry, but then your fight-or-flight instincts kick in.
00:02:09.000So your lungs seize up, your palms start sweating, your mouth dries up, your heart is racing, your mind is racing.
00:02:18.000I couldn't breathe and therefore couldn't speak.
00:02:21.000So a couple seconds into reading what was supposed to be six stories right off of the teleprompter, I lost the capacity to speak, and I had to kind of squeak out something about, you know, back to you, back to the main anchors.
00:03:38.000And I wasn't getting high on the air or anything like that, but, you know, I was partying in my spare time because it made me feel better.
00:03:47.000So after I had the panic attack, I went to a doctor who's an expert in panic, and he started asking me a bunch of questions, trying to figure out what had caused the panic attack, and one of the questions was, do you do drugs?
00:04:08.000And he just pointed out that, you know, you raise the level of adrenaline in your brain artificially.
00:04:14.000You make it much more likely to have a panic attack.
00:04:16.000And I, at baseline, I'm a jittery little dude.
00:04:21.000So it doesn't take much to put me in that zone.
00:04:24.000I mean, you just offered me coffee and I said no, because even that will freak me out.
00:04:28.000Well, it's weird that ecstasy and cocaine were the combination, because ecstasy is something that they actually give to a lot of soldiers that have PTSD, and there have been quite a few trials on that.
00:04:37.000Yeah, I don't actually think ecstasy was the problem.
00:07:29.000It's been a while since I watched it, but absolutely.
00:07:32.000I want to just be clear that the experience of a journalist is so different from, so much more mild than, the experience of an enlisted man or woman.
00:08:44.000A very good friend of mine, the guy who actually ultimately set me up with my wife, is a guy named Bob Woodruff, who was the anchor of World News Tonight on ABC News.
00:08:53.000He had only been in the chair for about a month when he was on a trip to Iraq and he literally got his head nearly blown off.
00:10:02.000When you're a journalist and you're over in Iraq or in Afghanistan, you're in war, and what you're experiencing is so far removed from the day-to-day life that most people experience, what is it like trying to relay that to people?
00:11:18.000Yeah, and that's why you see a lot of risk-taking behavior among vets, because you're looking for another way to get that hit of adrenaline, for sure.
00:11:34.000There's a book, and I'm blanking on the name, it's a great book written by a much more experienced war correspondent than me.
00:11:44.000And that, to me, sums it up, at least in my experience.
00:11:47.000I got hooked on the experience of being in these really elevated situations, heightened situations, cinematic, dramatic situations, and I would come home and I didn't know what to do to replace it, and so this synthetic squirt of adrenaline that you can get from cocaine... So...
00:13:10.000And I wouldn't, I wouldn't call it that, I mean, there are people who have had drug addictions that are vastly more severe than mine.
00:13:19.000But it sucks to stop a habit that is giving you pleasure in, you know, pretty prominent areas of your brain.
00:13:29.000Well, your situation, what you're talking about is a very, very, very extreme situation.
00:13:33.000Like being a journalist, a war correspondent, going over there, experiencing that intense sort of adrenaline rush and then having your issues with it.
00:13:43.000But it seems like there's a tremendous amount of people today that are stimulating themselves.
00:13:51.000I mean, it's just, I've found out recently about, like, four or five people that I didn't know were on Adderall.
00:13:56.000It seems like you just sort of start asking questions, and you find out how many people... I mean, my kid goes to school with a bunch of other kids, and you get to meet the parents, and, like, fucking half of them are on Adderall.
00:14:40.000I think, just speaking generally, and we have a neuroscientist in the room, so I'll let him say more about this, and also a guy who's a more experienced practitioner of Buddhism than I am, but, you know, it does speak to the nature of the human mind, that we're always on the hunt for the next little hit of dopamine, and now there are lots of ways to get it.
00:14:59.000Well, it's also very bizarre that you can just do that.
00:15:01.000I mean, I don't think there's ever been a time in history where you could just take a pill and you'll be elevated for five or six hours.
00:15:09.000I mean, and that your doctor will give you this pill and they'll encourage you to take it.
00:15:13.000And then you'll find out that 50% of the people in your community are taking it.
00:15:16.000I've done stories about parents who steal it from the kids.
00:15:21.000I mean, I'm reaching for the coffee here, having slept poorly last night.
00:15:27.000But that's just as much of a drug, it's just not as potent a drug as taking methamphetamine or Adderall or anything else that's a drug drug.
00:15:38.000But, I mean, this had civilizational consequences when humanity more or less switched from alcohol in the morning to caffeine in the morning.
00:15:48.000That's just—things got a lot different.
00:15:51.000I mean, for hundreds of years, people were just drinking ale and wine in the morning before coffee and tea became huge in Europe. And colonialism, the engine of colonialism, to a significant degree,
00:16:07.000was coffee, tea, sugar, and, you know, our behavior changed.
00:16:14.000With the people that were drinking ale and wine, wasn't a big part of it, the reason why they drank it with food, because water would get stagnant?
00:16:21.000Yeah, well, there's the issue with clean water, too.
00:16:44.000I think the point about the underlying neuroscience is that all of these drugs, anything you're putting into your body that's modifying the behavior of your brain, is only modifying the existing, available neurochemistry of your brain.
00:17:02.000They get your brain to secrete more of an existing neurotransmitter, or they mimic an existing neurotransmitter by binding to the same receptor site, or they keep something in play longer than it would otherwise have been.
00:17:14.000They block the reuptake of neurotransmitters or neuromodulators.
00:17:19.000So, a drug is never getting your brain to do something your brain is incapable of doing.
00:17:26.000And that's true of even the most extreme thing, like DMT or LSD. The brain is still doing all of that.
00:17:33.000And so it stands to reason that there are...
00:17:37.000There are potentially other ways of getting the brain to do that, whether it's meditation or whether it's computer interface, ultimately, to the brain.
00:17:44.000I know that people are interested in brain-computer interface that not only allows a quadriplegic to move a robotic arm or gets a Parkinson's patient to be able to move,
00:18:00.000but, to the ultimate degree, actually augmenting human function or opening... I mean, all of that is...
00:18:17.000In principle, possible because, again, we're just talking about electrochemical phenomena happening in our heads, which are just there to be modulated.
00:18:28.000Now, when you were talking about being depressed, and Sam, you're talking about reuptake inhibitors, I want to know, what are your thoughts on the massive amount of people that are on SSRIs now?
00:18:39.000I mean, that's another thing: I know how many people I know that either are on or have been on some sort of antidepressants.
00:18:47.000And it seems, I mean, to me, to someone who's never taken them or doesn't have personal experience with it, massively overprescribed.
00:18:57.000Yeah, well, anything I say is with a caveat that this is, I mean, I'm not a neurologist.
00:19:57.000The difficulty is finding the neuromodulator where only the symptoms you want to relieve are affected, because these chemicals do a lot of things in a lot of places, even in your gut.
00:20:10.000Hence the side effects you get with almost any medication.
00:20:17.000In many respects, it's a matter of luck.
00:20:21.000To find a pharmacological target that actually does just what you want it to do, which is to say that those receptors are not elsewhere that are going to produce side effects for you.
00:20:34.000That's why a different kind of intervention, something like, ultimately, some electrical or magnetic or machine-based intervention, could be more targeted, because then you're not just putting something in the bloodstream that spreads everywhere.
00:20:54.000By machine-based, you mean something like electrodes that they put in the brain or on the surface of the head to stimulate areas of the brain?
00:21:42.000But he announced investment in a company called Neuralink, which is looking at some advanced brain-computer interface.
00:21:54.000Based on the idea that, with these new microelectrodes, you can get an injectable mesh, like a wire mesh that just integrates with the brain, or very likely just the cortex.
00:22:11.000And I believe this work has already been done in mice, and the mice are, you know, have survived and are living with this mesh in their brains.
00:22:22.000And again, this is not research I'm close to at all.
00:22:24.000He just announced this a couple of weeks ago.
00:22:26.000But in principle, you're talking about having, whether it's a mesh or whether it's magneto-electric particles, something that is...
00:22:38.000On-site around individual neurons or assemblages of neurons, which can both read out signal from those neurons and input signal to them, wirelessly.
00:22:50.000So just by both putting your thoughts into the world by influencing effectors, robotic arms or cursors on screens or whatever it is, and also influencing your mind based on whatever inputs you want to put in there from the world.
00:24:10.000What does Snapchat do, like 15 seconds?
00:24:13.00010 seconds at a time, and that little counter is just flashing, like, the last three seconds, and you can hit it again and get another 20 seconds, or you can hold it and do, like, 30 seconds or something like that.
00:24:40.000The Google Glass thing made people very uncomfortable.
00:24:42.000I mean, I tried a very early prototype.
00:24:45.000I have a good friend of mine who was an executive at Google at the time, and she got a hold of one of the really early ones that actually had to be tethered by a cord.
00:24:54.000And, you know, you talk to it and swipe it.
00:24:56.000And I played with it a couple of times and we used it once at a UFC weigh-in where I put it on and I broadcast from the weigh-in.
00:25:15.000How long do we have while we're still people?
00:25:19.000Well, long before you asked that question, our sense of privacy...
00:25:27.000I remember what it was like to be neurotic about the sound of your voice on a voicemail or an answering machine, right?
00:25:35.000Like re-recording the outgoing message and just being worried about your voice showing up in someone else's tape.
00:25:42.000And now we're living in this panopticon surveillance society where you just assume you're on camera virtually every moment you're in public.
00:25:51.000Although I guess people don't think about it all that much.
00:25:55.000I think the norms around privacy shift just because we get so much value from having the data, ultimately.
00:27:11.000I mean, and Josh Zeps freaked me out this morning.
00:27:14.000He sent me some articles about these self-driving trucks that are already going in Australia that are as big as a 767, and they're driving down the road by themselves with cargo, probably nuclear waste or something, you know, just tooling down the road.
00:27:29.000But people are so bad at driving that the robots just have to get reliably better than people, and then you'll just feel nothing but relief.
00:27:40.000I mean, I think they probably are, as far as I know from what I hear from Tesla. Yeah, the man-hours they have of people using the Autopilot, and the Autopilot's not at all perfect, obviously.
00:27:55.000Two people have died already from using it badly.
00:27:59.000But still, they have something like some millions of man-hours of autopilot-assisted driving, and I think that has been safer than just pure ape.
00:28:10.000Did you see the video of the guy who fell asleep in traffic in San Francisco?
00:28:14.000He's literally out cold and his car's driving him on the highway?
00:28:20.000I mean, he's just some guy on his way to work, just passed out completely, mouth open, and people are filming him while his car is driving down the road.
00:28:28.000Yeah, and actually, probably the people filming him are doing the more dangerous thing.
00:30:02.000Maybe some people want to drive recreationally for some reason, but it will just be a new space where you won't believe that 40,000 people every year were dying because we couldn't figure out how to drive safely.
00:30:56.000I mean, if we build artificial intelligence that is...
00:30:59.000If it's independent of us and seems conscious and is more powerful than us, well, then, in the limit, we have essentially built a god that we now have to be in relationship to.
00:31:13.000And hopefully that works out well for us.
00:31:16.000And it's very easy to see how it might not.
00:31:18.000I think there are even scarier cases than that, though.
00:31:21.000We could build something that has godlike power.
00:31:23.000But there's no reason to think it's conscious.
00:31:27.000It's no more conscious than our current computers, which is to say that intelligence and consciousness may be separable phenomena.
00:31:36.000Intelligence can scale, but consciousness need not come along for the ride.
00:31:40.000And that, for me, is the worst-case scenario, because then we inherit all of the danger of the power of this system being misaligned with our interests.
00:31:51.000We could build something that's godlike in its power, and yet we could essentially be canceling the prospects of the evolution of consciousness.
00:32:01.000Because if this thing wipes us out, I think it's Nick Bostrom, the philosopher, who wrote a great book on this entitled Superintelligence.
00:32:10.000I think he calls this the Disneyland without children.
00:32:16.000It's an incredibly powerful, intelligent landscape that continues to refine itself and its own powers in who knows what ways, ways perhaps that are unimaginable to us, and yet the lights aren't on.
00:32:33.000There's nothing that it's like to be this machine or system of machines in a way that it's probably nothing that it's like to be the Internet right now.
00:32:41.000You think of all that's going on on the Internet...
00:32:44.000I don't think the Internet is conscious of any of it right now.
00:32:47.000The question is, could the Internet become conscious of what it's thinking?
00:32:51.000And I think there's no reason to think it couldn't.
00:32:55.000It's just we don't understand the physical basis of consciousness yet.
00:32:59.000The real question is, why would it do anything?
00:33:02.000I mean, if it doesn't have any of the biological motivations that people have to breed and to stay alive and to, you know, fight or flight and to be nervous and this desire to carry on our genes, I mean, if you really did build the ultimate supercomputer artificial intelligence that was beyond our capacity for reason and understanding...
00:33:45.000And so everything we build that's automated has goals explicitly programmed into it.
00:33:53.000And when you're talking about a truly intelligent machine, it will discover goals that you have never programmed into it that are intermediate to the goal that you have programmed.
00:34:05.000So if the goal of this machine is to, you know, pick up all the trash in this room, and you physically try to stop it, well, then it's going to try to get around you to pick up the rest of the trash in the room, right?
00:34:20.000So it's, you know, this is probably already true of a Roomba, right?
00:34:23.000I actually don't have a Roomba, but if you put something in the way of the Roomba, it's going to get around the thing you have put in its way so that it can get to the rest of the room.
00:34:33.000So that's an intermediate goal, and some of these goals need never have been explicitly thought about or represented, which is to say programmed into it, and yet they're formed by the fact that the thing has a long-term goal.
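The Roomba point, that detours emerge from a goal rather than from explicit programming, can be made concrete with a toy planner. In this sketch (the grid, the obstacle, and the goal are all invented for illustration), the agent is given only a terminal goal, yet its path routes around an obstacle it was never told about:

```python
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search toward `goal`, the only objective supplied.
    Detours around obstacle cells ('#') are never programmed explicitly;
    they emerge from pursuing the terminal goal."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

# A wall sits on the direct diagonal route; the agent detours around it.
grid = ["...",
        ".#.",
        "..."]
path = plan(grid, (0, 0), (2, 2))
```

The detour around the `#` cell is the intermediate goal: nothing in the code mentions avoiding or circumventing that particular obstacle.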
00:34:49.000And one of the concerns is that we could build something...
00:34:53.000That has a long-term goal, build something that's super powerful, that has a long-term goal, which in principle is benign, right?
00:35:01.000This is something we want, and yet it could discover instrumental goals that are deeply hostile to what we want.
00:35:09.000I mean, the thing doesn't have common sense, right?
00:35:12.000We haven't figured out how to build common sense into the machine.
00:35:15.000So, I mean, there's just cartoon examples of this kind of thing, but like one example that...
00:35:21.000Elon used when he first was expressing fears about this.
00:35:24.000If you built a machine, the only goal of which was to cancel spam, right?
00:35:33.000Well, an easy way to get rid of spam is just kill all the people, right?
00:35:36.000Now, that's a crazy thing to think, but unless you've closed the door to that intermediate goal... My question would be,
00:35:51.000if this super powerful machine has the ability to create new super powerful machines, would it use the same mandate?
00:35:57.000Would it still try to follow the original programming, or would it realize that our original programming is only instrumental to the success of the human race? It might think the human race is ridiculous and preposterous, so why not just program something that it thinks is the ultimate intelligence, something beyond our capacity for reason and understanding? And that thing, I would wonder, in the absence of any sort of biological motivations...
00:36:27.000I mean, do you think of all the things that we do?
00:36:29.000I mean, you break it down to what motivates people to get out of bed, what motivates people to do good, our sense of community,
00:36:37.000the desire to breed, the social status, all these different things that motivate people to do things. Remove all of those, and what actions would it take, and why?
00:36:46.000Well, I think, I think you want to build it in a way that is focused on our well-being.
00:36:53.000For instance, I had Stuart Russell on my podcast.
00:36:57.000He's a computer scientist at Berkeley who, unlike many computer scientists, takes this problem really seriously and has thought a lot about it.
00:37:08.000In his lab, I believe they're working on a way of thinking about safety that is open-ended and flexible, without pretending we have any of the right answers in the near term, or that we're likely to have them.
00:37:26.000So you want to build a system that wants to know what you want at each point, that's tracking what humanity wants in terms of its goals.
00:38:04.000It doesn't think it knows what we want, and it continually wants to keep approximating better and better what we want.
00:38:13.000From my point of view, the most crucial thing is you always want the door to remain open to the statement, wait, wait, wait, that's not what I wanted.
00:38:25.000You want to be in the presence of this godlike superpower that will always take direction.
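Stuart Russell's proposal, an agent that stays uncertain about the human objective and keeps updating as the human corrects it, can be sketched as a toy Bayesian update (the candidate goals and likelihood numbers here are invented for illustration):

```python
def update(prior, likelihoods):
    """One Bayesian step: P(goal | feedback) is proportional to
    P(feedback | goal) * P(goal), renormalized to sum to 1."""
    posterior = {g: prior[g] * likelihoods[g] for g in prior}
    total = sum(posterior.values())
    return {g: p / total for g, p in posterior.items()}

# The agent starts maximally uncertain about which goal the human holds.
beliefs = {"tidy the room": 0.5, "leave things alone": 0.5}

# The human steps in to stop the robot; that correction is far more likely
# if the human actually wants things left alone (made-up likelihoods).
beliefs = update(beliefs, {"tidy the room": 0.1, "leave things alone": 0.9})
```

The key property is that the agent never assigns probability 1 to its current guess, so the "wait, that's not what I wanted" signal always has leverage.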
00:38:51.000There's dangerous potential in AI. I mean, I was listening to when you talked to Will MacAskill a few weeks ago, who was saying, you know, it was just a few years from starting to think about how to split the atom to actually having a bomb.
00:39:07.000Are they not coming around to the idea that technology can progress much faster than we think?
00:39:13.000Well, it's a whole spectrum of the people who think that this is never going to happen or it's so far away that thinking about it now is completely irrational to people who are super worried and think that huge changes are imminent.
00:39:30.000Yeah, as am I. Well, my concern is not even with the initial construction, like the initial AI. My concern is with what the AI creates.
00:39:39.000If we give the AI the ability to improve upon itself and look at our irrational thoughts and how we've programmed it to support the human race, then it might go, well, why the fuck would I do that?
00:40:07.000The analogy I always use is that we're some sort of an electronic caterpillar giving birth to some spectacular butterfly that we're not even aware of while we're building our cocoon.
00:40:19.000I mean is the caterpillar fully conscious of what it's doing when it makes that cocoon?
00:40:23.000Probably not, but it just does, and there's plenty of examples of that in nature, of something that's going through some metamorphosis that's completely unconscious.
00:40:32.000My worry would be, I guess it's not even really a worry.
00:40:36.000It's more like looking at the possibility of the AI improving upon itself and making a far better version than we could create, like almost instantaneously, right?
00:40:47.000I mean, isn't that, if you give it the ability to be autonomous, and you give it the ability to innovate and to try to figure out what's a better way around things and what's a better way to program things, and then make its own version of what it is? It's gonna be spectacular.
00:41:01.000I mean, it really will be, I mean, I'm obviously just talking shit, but it really would be a god. I mean, you're talking about something that, if we give it the ability to create, we give it the ability to think, reason, rationalize, and then build.
00:41:23.000Anything that can be built with intelligence, it can build.
00:41:27.000So you could be talking about armies of robots and nanotechnology and everything else that is the staple of the sci-fi scare scenario.
00:41:36.000But more likely, and certainly faster, and ultimately more powerful, you're talking about something that can rewrite its own code, improve...
00:41:46.000I mean, it's the code that is dictating the intelligence.
00:41:50.000And so you're talking about something that could be, for the longest time, invisible and just happening on the Internet.
00:41:58.000You're talking about code that could be put into financial markets, which could be built to be self-modifying.
00:46:47.000But it's actually, when you read the history of that effort, the Manhattan Project and the Trinity test, it is super sobering, because they moved forward in a context of real uncertainty about what was going to happen.
00:47:08.000In terms of the yield of the first bomb, there was a range, I think, of a hundredfold difference of opinion of what they were going to get once this thing went off.
00:47:38.000I mean, they did something like due diligence, where many of them were confident it wouldn't, but that was not beyond the realm of possibility for some of the people working on it. And so we have shown a propensity for taking possibly existential risks to develop new technology, because there's a reason to develop it. And in this case, I...
00:49:33.000I just feel like it's because of the race, because of the idea that there's a race to get to it, it seems like it's inevitable that someone actually does create it.
00:49:42.000And much like the atomic bomb, it'll probably be launched without a true understanding of what its potential is.
00:50:48.000Or just imagine some malicious code just destroying the record of money, just getting into the banking system, right?
00:51:00.000So it's like you then have to go look for the paperwork you may or may not have in your desk to argue that you have a certain amount of money because all those bits got scrambled, right?
00:51:11.000And we need some—all of this is just—there's so many aspects to this, but the fact that you can now credibly fake audio, right?
00:51:21.000So someone can listen to a sample, you know, five minutes of this podcast, and then produce a conversation we've never had in voices exactly like our own. And those edits will no longer be discernible.
00:51:33.000I mean, we're basically there now, and we're almost there with video, right, where you could just have our mouths moving in the correct way.
00:51:41.000Well, again, I go to Snapchat, these crazy Snapchat filters.
00:51:44.000I don't know if you know about these, but my daughter, pull up the one of my daughter being Abraham Lincoln.
00:51:51.000I mean, it's really rudimentary right now, but my six-year-old loves it.
00:51:55.000She thinks it's hilarious, and she constantly uses it all the time.
00:51:58.000Like, she's just like, can I play with your phone?
00:52:00.000And she grabs my phone, and then she starts doing these little videos.
00:52:03.000Like, somehow or another, the little brains, like, sync up immediately with the technology, where if I gave it to my mom, she'd be like, I don't even know what this is.
00:53:17.000And being able to produce content where, you know, you are saying the thing that completely destroys your reputation, but it's pure—it's just fake news.
00:53:29.000And so we need—I mean, I don't know what the fix for this is.
00:53:33.000You know, I've just, and this is, again, something I know very little about, but clearly, something like the blockchain has to be a way of anchoring each piece of digital content.
00:53:45.000So there's like a chain of custody where you see exactly where this came from in a way that's not fake-able.
00:53:51.000And so, to take your podcast as an example, if someone is producing a clip that purports to be from your podcast where you're saying something insane, there just has to be a clear digital fingerprint,
00:54:07.000which shows whether that came from you and Jamie or whether this came from some Macedonian hacker who just decided to screw with you.
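A minimal sketch of that digital-fingerprint idea, using only a shared-secret message authentication code from the Python standard library (a real system would use public-key signatures and a public ledger; the key and clip bytes below are placeholders):

```python
import hashlib
import hmac

def fingerprint(content: bytes, key: bytes) -> str:
    """Tag a clip so that any later edit to the bytes is detectable."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify(content: bytes, key: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(fingerprint(content, key), tag)

key = b"publisher-secret"            # placeholder key held by the podcast
clip = b"original podcast audio"     # stand-in for the real audio bytes
tag = fingerprint(clip, key)

verify(clip, key, tag)               # authentic clip verifies
verify(b"doctored audio", key, tag)  # an edited clip fails
```

Anyone holding the key can tell an authentic clip from a doctored one, which is the "chain of custody" property described above in its simplest form.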
00:55:33.000It's just people aren't really on the ball.
00:55:37.000There was a bit from my last comedy special about the guy that broke into the White House: there was a woman guarding the front door by herself, and they had shut the alarm off because it kept going off, so they were like, fuck it, just shut it off.
00:55:49.000And then there was a guy who was on the lawn, who was supposed to be, he had canines, he was supposed to be guarding the lawn.
00:55:54.000He took his earpiece out to talk to his girlfriend on the cell phone.
00:55:57.000He had a backup walkie-talkie, because they have a backup one, but he left that in his locker.
00:56:02.000So it's like all these steps, and this guy just hit the perfect sweet spot where he hopped the fence, ran the whatever hundred yards plus to get to the White House, got to the door, it was unlocked, got through it, there was a girl by herself, threw her to the ground, and just ran through the White House,
00:56:19.000and was in there for five, ten minutes.
00:56:23.000He was basically trying to, it was basically a suicide-by-cop situation, because they had caught him before. This is what's really hilarious, when people think that the government is watching out for you: they weren't even watching this guy.
00:57:42.000I mean, this is brutal, but he took a photo of, I forget what actress, but clearly he was, I mean, he then deleted the photo, but people recovered it.
00:57:58.000I don't know, Charlize Theron or somebody who was walking backstage and put it on his Twitter feed.
00:58:03.000And then the very next moment was the moment when he had to hand over the right envelope.
00:58:10.000And he just handed over the envelope for the best actress.
00:58:17.000But their only job is to get this straight, right?
00:58:20.000And they're just safeguarding the envelopes.
00:59:20.000And getting captivated by some social glitch, right?
00:59:24.000So if you have a bomb-screening robot, that's all it will do, and it will be better at picking up whatever the visual signature of a bomb is, when you're talking about looking for one in a bag.
00:59:37.000Once computers get better at that than people, they will be reliable in a way that people can never be.
01:00:02.000But the scary stuff is when it can change itself.
01:00:10.000That's a principle that would allow it to escape from our control.
01:00:15.000Or we build something where we haven't anticipated the consequences.
01:00:19.000And the incentives are wrong. The incentives are not aligned to make us prudent in the development of that.
01:00:28.000An arms race is the worst case for that, because the incentives are to get to the end zone as quickly as you can, because the guy next to you is doing the same, and you don't know whether he's ahead of you or behind you.
01:00:43.000And it's a winner-take-all scenario, because the amount of wealth that will go to the winner of this race, if things work, if this group doesn't destroy the world, is unimaginable.
01:00:57.000We're talking about a level of windfall profits that we just haven't seen in any other domain.
01:01:10.000Well, a perfect example is the creation of the atomic bomb.
01:01:12.000I mean, look at 70 years from the creation of the atomic bomb to today, which is literally a blip in human history, just a tiny little blink of an eye.
01:01:20.000And then the United States emerges as the greatest superpower the world's ever known.
01:01:25.000And that's going to be directly attributed to those bombs that were dropped on Hiroshima and Nagasaki.
01:01:29.000I mean, that from there on, that's where everything takes off.
01:01:33.000When you look at human history, if you look at us from a thousand years from now, it's very likely that they look at that moment and they go, this is the emergence of the American empire.
01:01:41.000But this is so much bigger than that because the bombs, the power in the bomb was just...
01:02:10.000Once you're talking about general human intelligence and beyond, you're talking about suddenly having access to the smartest people who have ever lived, who never sleep, who never get tired,
01:02:26.000who never get distracted, because now they're machines, and the smartest people who have never lived, right?
01:02:32.000People who are a thousand times smarter than those people and getting smarter every hour, right?
01:02:37.000And what are they going to do for you when they're your slaves?
01:02:45.000How powerful would you suddenly be if you had 10,000 of the smartest people in the world working for you full-time, they don't have to be fed, they never get disgruntled, they don't need contracts, and they just want to do whatever Joe Rogan wants to do and get done,
01:03:02.000and you flip a switch and that's your company, right?
01:03:10.000Some version of that's going to happen.
01:03:12.000The question is, is it going to happen in a way where we get this massive dislocation in wealth inequality, where all of a sudden someone's got a company and products and a business model which...
01:03:25.000It obviates 30% of the American workforce over the span of three months.
01:03:33.000Or you have some political and economic and ethical context in which we start sharing the wealth with everyone.
01:03:44.000And this is, again, this is the best case scenario.
01:03:46.000This is when we don't destroy ourselves inadvertently by producing something that's hostile to our interests.
01:04:14.000It's actually awesome because when she's there, she's like just tooling on him and he giggles and stuff like that.
01:04:20.000It's like a completely different Sam than the guy who's being an undertaker here and telling you about how we're all going to die from AI. Yeah.
01:05:56.000Or the meditation that has helped both of us.
01:06:02.000That definitely will help you while human beings are a real thing.
01:06:08.000Yes, although I wonder if, you know, getting enough people meditating might improve the quality of whatever gets created in the AI community.
01:06:18.000In other words, if you have people who are a little bit more sane as a consequence of meditation than maybe the people designing these products, rather these, I don't know if products is the right word, this stuff, then somehow this stuff is better.
01:06:32.000Well, I think it's especially relevant for the other side of that, which is when you...
01:06:38.000In the near term, we clearly have an employment problem.
01:06:43.000There are jobs that will go away that are not coming back, and...
01:07:05.000We'll get to a place where there's actually much less that has to be done because we have machines that do it and we have seen the wisdom of mitigating wealth inequality to some acceptable level where all boats begin to rise with this tide to some degree.
01:07:25.000So whether it's universal basic income or some...
01:07:27.000So we have some mechanism to spread the wealth around.
01:07:30.000But then there's the real question of what do you do with your life?
01:07:48.000If you told me that I never had to work again and I would just have to find things to do all day and all my food and everything would be taken care of, that would be the easiest choice I'd have ever made in my life.
01:08:08.000That seems to me that could be solved really easy.
01:08:11.000I feel the exact same way, but I think that I'm not convinced.
01:08:15.000I feel I could fill endless swaths of free time with a number of things that I'm interested in, including, but not limited to, meditation or just hanging out with my two-year-old.
01:08:26.000However, I'm not sure that the vast...
01:09:17.000And they've got huge buckets of soda and turkey legs kind of hooked over the back of their motorized vehicle, and a monitor at the front, which is entertaining them.
01:09:29.000They're just obese and entertained and immobile.
01:09:33.000Or mobile, but not actually ambulatory.
01:09:47.000Like, you'd be sitting in the waiting room of a doctor, right?
01:09:51.000And they have crappy magazines, and then you're just sitting there.
01:09:54.000And if you didn't know how to meditate, you had to confront this sense of, I'm bored, right?
01:10:01.000Now, one thing you discover when you learn how to meditate is boredom is just...
01:10:06.000An inability to pay attention to anything. Once you learn to pay attention to anything, even something seemingly boring like your breath, it suddenly becomes incredibly interesting. So focused attention is intrinsically pleasurable, but boredom is a state of kind of scattered attention, looking for something that's worth paying attention to. And yet now, with technology, you're never going to be bored again.
01:10:32.000I have at least 10 years worth of reading on my phone.
01:10:36.000So it's like if I'm standing in line, I'm constantly pulling this thing out.
01:10:42.000Well, it's potentially a bad thing because you...
01:10:47.000I mean, just to take this example of one interesting insight you get when you learn to meditate...
01:10:54.000It's incredibly powerful to cut through the illusion of boredom.
01:10:59.000I mean, to realize that boredom is not something.
01:11:03.000You can become interested in the feeling of boredom, and the moment you do, it...
01:11:15.000I think the bad thing about the hyper-stimulation that we get through our phone and all of technology is that we have lost the ability to just sit back and, for lack of a less cliched term,
01:11:34.000And that means we have trouble paying attention when we're holding our kid in our lap and reading him or her a book, or we find ourselves without our technology for a moment.
01:11:44.000There was a recent study that asked people, would you rather be alone with your thoughts or get electric shocks?
01:11:51.000And a lot of people took the electric shocks.
01:11:54.000And I actually think that is a fundamental problem in terms of not being able to get in touch with the raw and kind of powerful, although obvious, fact that you're alive and that you exist.
01:12:06.000Right, but don't you think those people are idiots?
01:12:09.000I mean, you don't want an electric shock.
01:13:19.000But I've had both kinds of experience.
01:13:23.000I've had people send me articles that I would have never found otherwise.
01:13:26.000They're fascinating, super useful, and it's just like, this is the perfect use of this technology.
01:13:31.000And then again, then I get this river of green frogs and weirdness.
01:13:35.000But to the previous point, I... I think?
01:14:06.000I mean, it doesn't equip me to do anything better in my life.
01:14:11.000It doesn't make me feel any better about myself or other people.
01:14:14.000I mean, if it has any net effect, it basically grabs a dozen dials that I can just sort of dimly conceive of in my mind and turn them all a little bit toward the negative.
01:14:25.000You know, I feel a little bit worse about myself, a little bit worse about my career, a little bit worse about people, a little bit worse about the future, a little bit worse about the fact that I just was doing this when I could have been playing with my kid or writing or thinking productive thoughts or meditating or doing anything that I know is good.
01:17:48.000So I went back and looked at what he said here.
01:17:50.000Half of it, frankly, didn't make any sense.
01:17:53.000But his attacks on me on Twitter are the most juvenile.
01:17:59.000It's like the idea that he thinks this is a way he's going to establish a conversation with me by sending me two tweets and then sending me 400 which say, you're scared to debate me, right?
01:19:09.000Nine times out of ten, looking just makes me think...
01:19:13.000I mean, it's an illusion, because if you met most of these people, if they came up to you at a conference or at a book signing or after a gig of yours...
01:19:22.000And you had more information about them.
01:19:30.000You would say, there's no reason to pay attention to what this guy is saying.
01:19:36.000Whereas on Twitter, everything has the same stature.
01:19:39.000So whether it's a Washington Post columnist who's tweeting at me, or some guy like Hunter, who I have no idea who he is, but he's telling me I got something wrong...
01:19:50.000Everything has the same stature, and there's no signal-to-noise sense of what...
01:19:55.000Well, first of all, you fucked up, because you talked about them.
01:20:26.000And when I was trying to hold his feet to the fire and get some sort of a logical definition of what you do wrong, he really didn't have anything.
01:20:36.000Yeah, but he thinks he does, and he...
01:20:38.000But did he when I talked to him about it?
01:20:54.000And there's a level of arrogance and incivility and just kind of a lack of charity in interacting with other people's views, which is now kind of getting rebranded on the Internet as just...
01:21:20.000He's like the cartoon version of, you know, a person who doesn't know anything relevant to the enterprise, who doesn't show any aptitude for civilly engaging with business.
01:21:37.000And this thing gets, you know, amplified to the place of greatest prominence now in human history.
01:21:44.000Everyone's on social media, or many people on social media are playing the same game.
01:21:49.000And, you know, Mike Cernovich is another, you know, just malignant example of this, where you have someone who's got a fairly large following.
01:21:57.000I mean, it's not as big as yours, but it's, you know, it's a very engaged following.
01:22:02.000I mean, this whole Trump thing has shown me that a small percentage of one's audience can have like a hundred times the energy of the rest of your audience.
01:22:14.000Like whenever I went against Trump on my podcast, or this is still the case, the level of pain it causes in the feedback space is completely out of proportion to the numbers of people who are...
01:23:31.000I think this phenomenon that you're describing is both, has really serious negative consequences, but also has some beauty on the same, by the same token.
01:23:40.000You've got parents all over the world who've got children with rare disease, but they can connect on the internet and bond over that and share tips and doctors and all that stuff.
01:23:51.000So it is, it's actually, they're both outgrowths of the same kind of phenomenon, but it can be, we see The really difficult consequences of this in our politics right now, etc., etc.
01:24:04.000I think we're talking about two different things, though.
01:24:07.000I'm talking about a bunch of people that get together and say, yeah, obviously, I'm woke, and the Earth is flat, and pay attention, there's an Arctic, there's an ice wall.
01:24:18.000The groups of people that will find, you know, communities where maybe your child has autism, and there's some sort of an issue that can be mitigated with diet, and parents have had, you know, some success with that, and they could, you know, give you some enlightening information, and you can communicate with each other, and that's nice.
01:24:35.000And some of the hashtags that people use that they find searched through, it's great.
01:24:41.000But these little communities that you bond, there's no confirmation bias in those ideas.
01:24:49.000But there's confirmation bias in the idea that Trump is the man.
01:24:52.000There's confirmation bias in the idea that the earth is flat.
01:24:55.000And if you just huddle in those little communities and just bark the same noises that everybody else barks, there's some sort of sense of community in that too.
01:25:05.000They love being a part of a fucking team, even if it's a stupid team of green frogs.
01:25:09.000And you don't have to listen to other people's views, and you get deeply, deeply entrenched.
01:25:15.000I think we're seeing this all through our politics and media right now.
01:25:20.000Well, you also see when you go to those people's pages, which I do often, I don't engage with people in a negative way online very often, very, very rarely.
01:25:28.000Now, do you not look at your @ mentions?
01:25:29.000I do a little bit, but you know what, man?
01:25:31.000I just like to just go on about my day.
01:25:34.000I've found that the negative consequences that you're discussing...
01:25:40.000It's rare that I go down the rabbit hole, and I go for days without looking at @ mentions.
01:25:44.000It's more that if I publish something, if I ask for feedback, I want to see the feedback.
01:25:49.000It's not necessarily that there's anything wrong.
01:25:52.000I mean, just as your friend, I feel like there's nothing wrong with it.
01:27:02.000So if you have three million fucking idiots in all your @ mentions, if you look at your @ mentions and three million comments are saying you're a fucking moron, there's just too many people.
01:27:13.000The numbers are, they're not manageable.
01:27:16.000The numbers of human beings you interact with online are not manageable.
01:27:19.000So anytime anything gets negative or insulting and...
01:30:15.000Which is to say, just go fuck yourself, just redounds to your credibility among your tribe, right?
01:30:23.000It's like, this guy is so powerful, he so fully doesn't give a shit what people think, that he can catch him in a lie and just watch how he gets out of it.
01:30:34.000So when I was bringing up these guys like Cernovich, I mean, this is...
01:30:39.000Actually, I decided to troll Cernovich one day, and I thought it was hilarious.
01:30:49.000But the thing that bothers me is that this has real political consequences.
01:30:55.000Do you think this is a time period where we're in this sort of adolescent stage of communication online where you can get away with saying things that are dishonest and that there might be some sort of a way to mitigate that in the future?
01:31:09.000I don't think we should act like dishonesty and bluff and bluster, to use the phrase you used before, is somehow new to the human repertoire.
01:31:16.000I get that. The acceptance of it seems to be, though.
01:31:19.000I think we're in a period where that is true, and I think it is aided and abetted by technology and the social networks.
01:31:28.000I agree with your diagnosis on many levels, but I was having an interesting conversation with a guy that you introduced me to, Joseph Goldstein, who's an eminent meditation teacher, has become my meditation teacher, old friend of Sam's, and we were talking about the current political situation.
01:31:45.000He used a phrase that I liked when I asked him what he thought about it.
01:31:48.000He said, I'm kind of slotting into geological time.
01:31:51.000And I think that actually makes some sense.
01:32:31.000I mean, it just doesn't seem to me that this is sustainable.
01:32:34.000It feels like this is just going to be some spectacular moment in history where people were rah-rah Nixon, and now they look back and go, my God, Nixon was a fucking liar and a buffoon.
01:32:43.000But take this, this is a point I've made before, and I don't think it's original with me, I think other people have made it, but my claim is that if Trump were one-tenth as bad, he would appear much worse.
01:32:57.000Because everything he does now is appearing against a background of so many lies and so much craziness that you can barely even weigh its value.
01:33:06.000This is one of the things I find useful about Twitter, because I follow some very interesting people.
01:33:13.000Anne Applebaum, the Washington Post columnist.
01:33:39.000And yet no one can talk about it because no one believes him.
01:33:43.000We have a president whose speech has now become so denuded of truth value, perceived truth value, that he can say, if China doesn't handle North Korea,
01:35:32.000Do you think that they're connected to what we were talking about before where you said that people would rather be electrocuted than to be alone with their thoughts?
01:35:39.000That we have gotten to this weird place with our society, with our civilization, where we've made things so easy?
01:35:46.000We've made people so soft, so dependent upon technology.
01:35:50.000We've slotted out these paths, these predetermined patterns of behavior for people to follow, where they can just sort of plug into a philosophy, whether it's a right-wing one or a left-wing one with very little room for personal thought at all, very little room for objective reasoning.
01:36:24.000You know, I think you have nothing against the New York Times or Breitbart, but I think you need to read many things and follow many different sorts of people on Twitter, not just because you want to troll them, but because you actually want to listen to what they have to say and take it seriously.
01:36:35.000Well, the New York Times really fucked up.
01:36:37.000Where they really fucked up is where they said that, after the election, they're going to rededicate themselves to reporting the truth.
01:37:01.000The thing is, the enemy was so grotesque in this case that to not have been biased seemed an abdication of responsibility.
01:38:01.000I just feel like at this stage of our society, there's real consequences to the infantilization, if that's actually a word, of human beings in our culture.
01:38:12.000We've made it very easy to just go to work and just get home and watch television and just not pay attention to anything and not read anything and not really think and then be manipulated.
01:38:25.000I mean, I think it's incredibly easy to manipulate people, especially people that are aware that they don't have a varied media diet.
01:38:33.000People that are aware that they don't have a real sense of the world.
01:38:36.000And it seems daunting to try to take into consideration, like, what is involved in foreign policy?
01:38:41.000What is involved in dealing with Russia?
01:38:48.000Put it in the hands of the strong man.
01:38:49.000I think this is true on both sides of the spectrum, though, because I think you've got folks who slot into just a media diet where they're just hearing things on the left and they're not curious about or, I guess,
01:39:35.000And I think the reason that so many people on the right, so many Trump supporters, feel like they're right is because it has been proven that the media was biased and that they did get it all wrong.
01:39:48.000And they were absolutely wrong when it came to who was going to win.
01:39:51.000I mean, Huffington Post had some ridiculous thing where it was the night of the election.
01:39:55.000They said that Hillary had like a 98% chance of winning or something crazy like that.
01:40:00.000Well, I think, yeah, there were some polls that were bad.
01:40:04.000Because I remember this because I sent out a tweet which said, like, you know, bye-bye, Donald, or something like that, you know, the day of.
01:40:12.000But when I did that, I mean, that wasn't a prediction.
01:40:15.000I mean, the polls that I was going by, that most people were going by at that point, it was like 80-20, you know, or at best 75-25 that she was going to win.
01:41:25.000You know, I think it actually goes back to what Sam was saying before, that people think when you see numbers like 70% odds that Clinton's going to win, 80% odds that Clinton's going to win, that she's definitely going to win.
01:41:39.000But there's room there for Trump to win.
01:41:59.000That's really good odds that you're going to get shot.
01:42:01.000I don't think it's so much about blaming the polls as it was blaming the overall tenor of the coverage, which made it seem like Clinton was inevitable.
01:42:13.000That's a hit I think that we can and should take.
01:42:17.000We definitely, you know, I think we weren't giving the 20 or 30% chance a serious enough look.
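The arithmetic behind this point is easy to lose in the back-and-forth: a 75–25 or 80–20 forecast is not a prediction of certainty, and the "upset" outcome should happen routinely at those odds. A minimal simulation (all numbers here are illustrative, not from any actual poll data discussed in the conversation) makes the point concrete:

```python
import random

def simulate_upsets(win_prob: float, trials: int = 100_000, seed: int = 42) -> float:
    """Estimate how often the underdog side of a forecast actually occurs,
    given the favorite's stated probability of winning."""
    rng = random.Random(seed)
    # Count the trials where the favorite loses (random draw lands in the
    # underdog's share of the probability mass).
    upsets = sum(1 for _ in range(trials) if rng.random() >= win_prob)
    return upsets / trials

# An 80-20 forecast leaves the underdog winning roughly 1 time in 5;
# a 75-25 forecast, roughly 1 time in 4.
print(round(simulate_upsets(0.80), 2))  # ~0.2
print(round(simulate_upsets(0.75), 2))  # ~0.25
```

In other words, an outcome forecast at 20–30% is about as likely as flipping two heads in a row, which is exactly the "serious enough look" the speakers say the coverage failed to give it.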
01:42:25.000What is your thought, as being someone who covers these things, what is your thought about the Electoral College?
01:42:30.000Do you think that that's an antiquated idea?
01:42:33.000I mean, it was kind of established back when you really needed a representative, because otherwise you would have to get on a fucking horse and ride into Washington, and it would take six months.
01:42:41.000I can see that you can make very powerful arguments that it's a deeply problematic institution.
01:42:47.000I can see the power of those arguments, for sure.
01:42:52.000There are people who argue, make similar arguments about the United States Senate.
01:42:59.000There was a piece that ran in the New York Times in their Sunday Week in Review, not long after the election, making the case that the angriest people in America actually should be those who live on the coast, because it's taxation without representation, that the people who live on the coast are paying more in taxes,
01:43:17.000but they have less representation actually in Washington.
01:43:22.000Again, that's not research that I've done, but it's an interesting idea.
01:43:26.000Have you ever seen anybody present any sort of a logical argument that there really shouldn't be a president anymore?
01:43:32.000That the idea of having one alpha chimp run this whole thing seems pretty outdated.
01:43:38.000I'm just not sure that he runs the whole thing.
01:43:51.000The health care bill didn't even make it, and in some ways wasn't even close to what he promised on the campaign trail.
01:44:01.000In other words, he couldn't get the bill that he wanted, and then he couldn't get that passed.
01:44:06.000And now he's looking at having to watch his party employ the nuclear option in order to get his Supreme Court nominee seated.
01:44:18.000I think that the founders designed in many ways a really ingenious system.
01:44:23.000And we put a lot of attention on the president because it's one person who's on our TV screens or our phones all the time.
01:44:31.000But I'm not sure how much power is vested in that person.
01:44:34.000Now, when it comes to foreign policy, it's a different kettle of fish.
01:44:38.000Well, it's enough that the EPA has been sort of hobbled.
01:44:42.000I mean, what they've done with the Environmental Protection Agency standards, especially in regards to emissions, he's rolled back emission standards.
01:44:50.000I mean, if there's anything that we should be concerned about, it's the air that we breathe.
01:45:01.000Not only that, but what I've heard is that that's not even going to be effective.
01:45:06.000Because most places have moved away from coal to the point where restarting coal production is not even going to recharge the economy in the way that would make it a viable option in the first place.
01:45:20.000The issue of climate change is just...
01:45:24.000I'll say, as a member of the media, an area where I feel, and I'm just speaking for myself here, really one of our biggest failures.
01:45:32.000And I don't think history is going to judge us kindly.
01:45:35.000And again, I'll put the blame on myself.
01:45:38.000It's a hard story to get people just that interested in.
01:45:43.000And especially for television, because it's a lot of sort of graphs and science, and there's only so many pictures of polar bears you can show.
01:45:54.000And so I anguish about that because I do think there isn't a debate.
01:46:01.000Climate change is real, almost certainly caused by humans.
01:46:07.000And for too long we fell for that in the media where we presented it as a debate when it wasn't.
01:46:13.000Now I think we're past that, but I still don't think we're covering it enough and as robustly as we should.
01:46:18.000Well, climate change, I think, is a sort of almost abstract to people.
01:46:23.000It's very difficult for them to wrap their head around, especially when they look at the ice core samples.
01:46:27.000There's plenty of stuff online where you could sort of convince yourself that there's always been this rise and fall of the temperature on the earth, and in many ways that is true.
01:46:40.000I think, actually, I just walked in, so I might have missed what you said there, but I think that's a crucial shift of emphasis because there is no...
01:47:08.000I mean, just imagine if we had no pollution coming from the exhausts of all the cars out there and there were no coal-fired power plants.
01:47:17.000We just had solar and wind and safe nuclear technology powering the grid.
01:47:24.000It would be fantastic from just a pure...
01:48:27.000And I think that even though they're making some pretty radical noises at the EPA, I'm not sure how far Pruitt can take some of this stuff, given the existing body of law, case law, that has formed around Obama's decisions.
01:48:43.000So it actually gets more complicated the closer you look at it, from what I can tell.
01:48:50.000And I won't claim to have studied it too, too closely.
01:48:52.000But it seems to get more complicated the more you look at it. The headlines may be scarier, I guess, is my point.
01:48:59.000Well, what's pretty clear, though, is emission standards.
01:49:01.000Rolling back emission standards sends a very clear message that it's okay to pollute the air.
01:49:05.000I mean, we were moving in a direction of going towards electric cars, going towards cars that pollute the environment less.
01:49:13.000Even cars that use gas: Porsche has a 911 Turbo, and the emissions standards are so strong that what they've developed is a car that, when you drive it through LA, the exhaust that comes out is cleaner than the air it sucks in.
01:49:52.000What they've done is managed to make something so efficient that it actually does emit clean air coming out of it, or cleaner than the polluted air that it's sucking in.
01:50:02.000Now, if these Environmental Protection Agency standards keep getting rolled back, I mean, we're going to go back to...
01:50:09.000I mean, I don't know how far they're rolling it back, but...
01:50:20.000The idea that business should take precedent over the actual environment that we need to sustain ourselves.
01:50:27.000So let's not forget there are real human beings in coal country who have spent generations working in this industry, take great pride in it, and we've got to think about what we do.
01:50:41.000But the numbers here are surprising and also little reported.
01:50:47.000It's only 75,000 coal jobs we're talking about in the country.
01:50:52.000And there's something like 500,000 clean tech jobs just in California alone.
01:50:57.000I mean, the numbers are completely out of whack.
01:50:59.000No, I think the clean tech industry offers an enormous amount of promise, but 75,000 families is not nothing.
01:51:27.000I mean, that's, you know, why can't they figure out that they just want to learn new languages and spend more time with their kids and play Frisbee and have fun?
01:51:40.000And politics that decouples a person's claim on existence from doing profitable work that someone will pay you for.
01:51:49.000Because a lot of that work is going away.
01:51:51.000I mean, we could view it as an opportunity, and it is actually something that it does dovetail with this hobby horse that you and I have been on for a while about the power of meditation and what it can do to a human mind and the way you view the world and your role in it,
01:52:08.000Well, what are your thoughts on universal basic income?
01:52:10.000Because bring it back to that, with this rise of the machines, if we do have things automated, I mean, some ridiculous number of people make their living driving cars and driving trucks.
01:52:20.000Now, when those jobs are gone, I think it's millions of people, right?
01:52:23.000Yeah, and I think in the States, it's the most...
01:53:08.000Viewed as an opportunity, this is the greatest opportunity in human history.
01:53:14.000We're talking about canceling the need for dangerous, boring, repetitive work and freeing up humanity to do interesting, creative, fun things.
01:53:30.000Well, give us a little time, and we'll show you how we can make it bad.
01:53:34.000And it'll be bad if it leads to just extraordinary wealth inequality that we don't have the political or ethical will to fix.
01:53:46.000Because if we have a culture of people who think, I don't want any handouts, and I certainly don't want my neighbor to get any handouts, and I don't want to pay any taxes so that he can be a lazy bum, if we have this, you know, hangover from Calvinism,
01:54:01.000you know, that makes it impossible to talk creatively and reasonably about what has changed.
01:54:10.000Yeah, it could be a very painful bottleneck we have to pass through until we get to something that is much better or a hell of a lot worse, depending on where the technology goes.
01:54:22.000And I think at a certain point the wealth inequality will be obviously unsustainable.
01:54:28.000I mean, you can't have multiple trillionaires walking around living in compounds with razor wire and just moving everywhere by private jet and then massive levels of unemployment in a society like ours.
01:54:46.000I mean, at a certain point, the richest people will realize that enough is enough.
01:54:54.000We have to spread this wealth because otherwise people are just going to show up at our compounds with their AR-15s or their pitchforks and the society will not sustain it.
01:55:07.000There has to be some level of wealth inequality that is unsustainable, that people will not tolerate.
01:55:14.000And you begin to look more and more like a banana republic until you become a banana republic.
01:55:20.000But now we're talking about the U.S. or the developed world where all the wealth is.
01:55:45.000I mean, whatever the solution is for coal mining, we should not be hostage...
01:55:50.000For the coal miners, we should not be hostage to...
01:55:55.000The idea that they need jobs so that whatever job they were doing and are still qualified to do, that job has to continue to exist no matter what.
01:56:05.000No matter what the environmental consequences, no matter what the health consequences, no matter how it closes the door to good things that we want.
01:56:25.000At a certain point, we move on and we make progress and we don't let that progress get rolled back.
01:56:30.000And when you're talking about developing technology that produces energy that doesn't have any of these negative effects, whether it's global climate change or just pollution, Of course we have to move in that direction.
01:56:47.000And the other thing that's crazy is that we're not talking honestly about how dirty tech is subsidized.
01:56:55.000I mean, you have the oil people say, well, solar is all subsidized, right?
01:57:00.000This is a government handout that's giving us the solar industry.
01:57:07.000You have to produce an argument as to why that's a bad thing.
01:57:09.000This is something we should want the government to do.
01:57:11.000The government needs to incentivize new industries that the market can't incentivize now if they are industries that are just intrinsically good and are going to lead to the betterment of humanity.
01:57:59.000I mean, subsidizing the corn industry, when you find out that corn and corn syrup is responsible for just a huge epidemic of obesity in this country, the amount of corn syrup that's in foods.
01:58:12.000If you as a polluter had to pay the consequences of your pollution all the way down the line, you had to compensate everyone who got emphysema or lung cancer because of what you were putting into the air.
01:58:30.000Your industry would be less profitable, right?
01:58:33.000And it might not be profitable at all.
01:58:35.000And we haven't priced all of that in to any of these things, whether you're talking about the chemical industry or the cigarette industry.
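The argument here is the standard one about unpriced externalities: an industry can look profitable only because the health and environmental costs fall on everyone else. A toy sketch with entirely made-up numbers (none of these figures come from the conversation) shows how the sign of the bottom line can flip once those costs are priced in:

```python
def profit(revenue: float, private_cost: float, external_cost: float = 0.0) -> float:
    """Profit when a firm bears only its private costs, versus when the
    external (health, environmental) costs are charged to it as well."""
    return revenue - private_cost - external_cost

# Hypothetical figures, purely to illustrate the argument:
# privately the operation nets 30, but once a 45-unit external cost
# (emphysema, lung cancer, cleanup) is priced in, it's a net loss.
print(profit(100.0, 70.0))                        # 30.0
print(profit(100.0, 70.0, external_cost=45.0))    # -15.0
```

That flip from +30 to -15 is the speakers' claim in miniature: "your industry would be less profitable, and it might not be profitable at all."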
01:58:45.000I mean, we're addicted to the use of these fossil fuels the same way some people are addicted to the use of cigarettes.
01:58:52.000I mean, the health consequences of those things, they're almost parallel in a lot of ways.
01:59:19.000I interviewed a guy named Fred Singer, who was one of these people, and he's a climate change denier, and there's just some...
01:59:30.000It's been a while, but there were some key moments where I was, like, listing for him all the major scientific organizations that say that climate change is real and that humans are major contributors to it, and he basically just refused to accept it.
01:59:46.000Well, those shows where you have the three heads, you and then the two experts, and they yell over each other, and then we'll be right back, and then you go to commercial.
01:59:54.000Those fucking things don't solve anything.
01:59:56.000Those weird moments where people are yelling at each other, and you can't figure out who's right or who's wrong.
02:00:02.000I'm not a big fan of that, personally, and I think that especially it was a failure to do that with climate change, because it created the doubt.
02:00:12.000It created the doubt, and I think that was a very successful strategy.
02:00:19.000But that's in tension with what we just said about the New York Times, because if you take a position as a journalist, if you say, okay, actually one side of this conversation is full of shit, I think it's different to take a position.
02:00:37.000It's not controversial for me to sit here and say, if you smoke cigarettes, the science strongly suggests you have higher odds of lung cancer.
02:00:45.000Same thing with climate change, but it became politicized.
02:00:50.000You'll notice I've stepped out of some of the discussions that have been taking place because it's not my role as a journalist to come down on one side or another.
02:00:59.000But with climate change, I feel absolutely comfortable saying the vast majority of scientists believe this is real and a big, big problem.
02:02:36.000But no, at a certain point, there's some level of dishonesty and misinformation that's so egregious that if you're a journalist at all committed to being...
02:03:09.000But I don't think you have to create mock unnecessary debates around climate change.
02:03:15.000What I do think is that with the Trump administration, that it is imperative that journalists call out when things are said that aren't true.
02:03:27.000But I don't think it's constructive for mainstream journalistic organizations to have an openly hostile anti-Trump attitude or pro-somebody-else attitude, because then that just leads to further polarization, which is exactly what we don't need.
02:03:43.000But you call him out on all the times that he's said things that are just absolutely not true.
02:03:51.000And I think that's an issue we're dealing with.
02:03:53.000But I think this is a time of real soul-searching in my industry, because I firmly believe there is a very powerful place in a functioning democracy for a press, for media, that people generally view as fair.
02:04:10.000Well, it's also what really highlights the responsibility of getting accurate and unbiased information to people, because there's not a lot of sources of that left.
02:04:19.000I mean, when you look at Fox News, and sometimes you look at CNN, and sometimes you look at MSNBC, and you're like, boy, how much of this is editorialized?
02:04:30.000You need unbiased facts, and you almost need it not delivered by people.
02:04:37.000The problem is when people are delivering the news, like when you talk to someone, you know they're educated in an Ivy League university and they speak a certain way, they act a certain way, you almost can assume that these intellectual people,
02:04:53.000these well-read people are going to sort of lean towards one way or another.
02:05:00.000I would argue, and again, I know this is self-serving, but I still believe it, that the three broadcast networks have actually fared quite well in what is an incredibly difficult environment right now.
02:05:13.000So you don't count Fox as a broadcast network?
02:05:16.000Fox News is a cable network, but Fox Broadcast does not have a news division the way ABC, CBS, and NBC do.
02:05:25.000So just broadcast, meaning traditional old-school signal in the air that nobody uses anymore.
02:05:31.000Nobody uses them, but we refer to them within the industry as the three broadcast networks.
02:07:11.000The majority is people over 54 or under 25, but I don't think that's the case.
02:07:15.000We're looking at 25 to 54: ABC has 1.6 million in that demo, as opposed to 8 million total viewers; NBC, 1.7 million in 25 to 54 out of 7.8 million total viewers,
02:07:31.000and then CBS, 1.3 million in 25 to 54 out of 6.4 million total. It's funny, like, after 54, fuck you, but before 25, fuck you.
02:07:49.000The ads you run, it's like, you know, for catheters and anti...
02:07:56.000It gets a little different on the morning, because in the morning shows where it's closer to 4 or 5 million for ABC and NBC and a little less for CBS, the percentage of that audience that falls within the demo, as we call it, 25 to 54,
02:08:12.000is higher, and so the ads are kind of different.
02:08:13.000But I guess my point is that you still have a really significant number of people, if you take the mornings and evenings on these broadcast networks, that are getting their news from these places.
02:08:24.000Which just gets back to the polarized media atmosphere you were talking about before.
02:08:29.000I think in this atmosphere, having the networks be seen to a certain extent, to the extent possible, as above the fray, actually is important for democracy.
02:09:11.000Yeah, it was pretty big back then, but not what it is now.
02:09:13.000In 1999, Alex Jones and I put on George Bush Sr. and Jr. masks, and we smoked pot out of a bong and then danced around the Capitol building in Texas for a stand-up video that I did.
02:09:30.000Yeah, I've been friends with that guy since 1999. Well, yeah, I listened to your recent podcast with him, which was just...
02:09:38.000Interdimensional child molesters that are infiltrating our airwaves.
02:11:31.000But what they do is they take something like the WTO, which was a big embarrassment that people were protesting, and they hire people to turn this peaceful protest into a violent protest.
02:11:43.000So these people come in, they wear ski masks, they break windows, and they light things on fire, do whatever they do that makes it violent.
02:11:49.000And then they have the cops come in and break it up, because now it's no longer a peaceful protest.
02:11:54.000And so it got to the point where people were trying to show up for work.
02:12:25.000And it's a real tactic that, again, what agency, what faction of the military, what faction of the government hires these people to do that, I don't know.
02:12:44.000When you stop and think about all the different things that he's informed people of that turned out were real, like Operation Northwoods, when the Freedom of Information Act came out with the Operation Northwoods document where the Joint Chiefs of Staff had signed this.
02:12:58.000And this was like something that they were really trying to implement.
02:13:00.000They were going to arm Cuban friendlies and have them attack Guantanamo Bay.
02:13:04.000They were going to blow up a drone jetliner.
02:13:06.000They're going to blame it on the Cubans.
02:13:07.000And they were trying to use this as impetus to get us to go to war. Yeah, but how do you feel about the things that he's talking about that are...
02:13:13.000Well, okay, but let's talk about that first.
02:13:16.000I mean, there are things that are true.
02:13:20.000What gets really squirrely is when you find out that there have been things, like the Gulf of Tonkin incident.
02:13:27.000There's many things that have happened where there have been false flags, where the government has conspired to lie to the American people, and people have died because of it.
02:13:35.000I think, well, I don't know a lot about Gulf of Tonkin, although I know more about it than those other examples.
02:13:43.000I mean, there are definitely cases where it's an additional interpretation to say that the government lied.
02:13:51.000I mean, so to take even something that's closer to hand, like weapons of mass destruction as a pretext of going to Iraq, right?
02:13:57.000Now, it's one thing to say that people knowingly lied about it, that Bush and Cheney knowingly lied to the American people about that.
02:14:07.000Or they were misled by people who were knowingly lying.
02:14:14.000It was totally plausible to everyone who was informed that he had a WMD program and they misinterpreted whatever evidence they thought they had and they were just wrong.
02:14:28.000I'm not claiming to know which one of those is true.
02:14:30.000I think probably the last is much closer to the truth.
02:14:35.000And that explains many of these instances, but what's so corrosive about pure examples of lying is that—and we may have one case now that's just emerging in the news.
02:14:48.000I don't know if maybe the story has been clarified while we've been talking, but it now seems that Susan Rice— At one point she said she knew nothing about the unmasking of Trump associates in this recent surveillance case,
02:15:06.000and now it's claimed that she actually asked to have certain names unmasked.
02:15:13.000This is being seized upon, again, just in the last few hours, as an example of a lie which seems very sinister.
02:15:26.000But as though it equalizes the two sides here, right?
02:15:29.000So let's say, worst case scenario, Susan Rice lied about having some knowledge of this investigation.
02:15:40.000I mean, she has to deal with the consequences of that lie, but it doesn't exonerate all of the lying that Trump has done about everything under the sun, right?
02:15:47.000So what's so destabilizing here is that the moment...
02:15:55.000The moment that a news organization like yours or the New York Times commits an honest error, that gets pointed to by those who want to treat the mainstream news media as just fake news: see, everything's the same.
02:16:11.000You're no better than somebody who's just manufacturing fake news on a laptop in his basement.
02:16:20.000And the flip side of that is when Alex Jones gets something right, it seems to make him look like a dignified journalistic enterprise analogous to the New York Times or to ABC News.
02:17:15.000Here's Susan Rice's one lie that she told in the last 10 years, maybe, and got caught for.
02:17:20.000And we have a president who lies every time he picks up his, you know, approaches a mic or picks up his Twitter.
02:17:26.000She had other problems because she had gone on, I believe, on the Sunday morning talk shows after Benghazi with some outdated talking points.
02:18:49.000It really is very blurry because we've never had a situation like this where we have a president that people just don't trust, to be honest.
02:18:56.000I mean, if someone lied about anything in the past, I mean, if Donald Trump got caught having his dick sucked in the White House on film, you know, he'd be like, look, I made a mistake.
02:19:11.000But don't you think, just again, just slot, to use Joseph Goldstein's phrase before about slotting into geological time and just looking at the broader scope of history.
02:19:19.000You know, we've had really, we've had periods of time where we had the muckrakers, you know, where the media outlets didn't even pretend at times to not have an agenda.
02:19:36.000In the early parts of our republic, where we had...
02:19:41.000Was it Teddy Roosevelt who called the journalists muckrakers? Basically he was saying that their job was to rake muck, to be working in filth.
02:20:16.000I'm maybe the Pollyanna here, but I think that the Republic will survive and that it doesn't inexorably lead to a time where truth doesn't matter.
02:20:27.000Well, it seems to open up the door for a viable alternative.
02:20:31.000It seems to open up the door for someone who comes along who is, in many ways, bulletproof.
02:21:20.000They would hire— No, I mean, I think they're—you know, if you're going to use a conventional political calculus, well then, yeah, then— Being an atheist, having a history of psychedelic drug use, having edgy positions that alienate massive constituencies.
02:21:38.000All of that's a deal-breaker, but you would never have predicted that someone this scandalous and inept and dishonest, and provably so, he literally can't get through an hour of the day without...
02:21:54.000Something that would have been a scandal in some previous age of the earth coming out of his mouth.
02:24:41.000But you don't think he pays a price for his— No.
02:24:45.000I mean, his popularity rating is not high.
02:24:48.000Well, he's paying some price, but the question is, is it going to be enough?
02:24:52.000And what happens with the next terrorist attack?
02:24:56.000So the real fear is that if we have a Reichstag fire kind of moment, engineered by him or not, I mean, I'm not so paranoid as to think he's going to...
02:25:08.000But I just think it's inevitable something is going to happen.
02:25:12.000I mean, we've had 80 days, 100 days, whatever it's been of his presidency, where basically nothing has happened, and it's been pure chaos, right?
02:25:21.000And the work of government's not getting done.
02:27:53.000You know, Nixon was giving us the Clean Air Act, right?
02:27:56.000I mean, Nixon had some point of contact with terrestrial reality, which wasn't just about figuring out how to burnish his imaginary grandeur, you know.
02:28:09.000So, yeah, I mean, I... I think we're one huge news story away from finding out how bad a president he could be.
02:28:22.000And I think he could surprise even the people who are very pessimistic.
02:28:27.000Well, he could also rise to the occasion.
02:28:43.000If he starts doing something good, like, let's say, a massive infrastructure project, right, that includes building out good things, not coal jobs, but, you know, if...
02:29:02.000And it's possible he could start doing that for his own—his motives wouldn't even matter, ultimately, as long as he was doing the right things.
02:29:10.000But his motives are so reliably— Self-involved.
02:29:22.000Someone needs to be able to play him, this kind of narcissistic glockenspiel, well enough to get him to do the things that would be good for the world.
02:30:03.000Yeah, but you know, it's an interesting sort of Rorschach test because there are millions of Americans who watched that press conference and found it delightful.
02:30:13.000They thought that he was pounding on the liberal media.
02:30:20.000When you read a transcript of what actually comes out of his mouth, it is amazing.
02:30:27.000The poverty of the passages in terms of information.
02:30:34.000The fact that we have a president who speaks like this, I could never have foreseen that in our lifetime this was going to happen.
02:32:12.000I guess the other piece you have to put in play there is that there are some percentage of people working in the sex trade who are not doing it voluntarily, who are coerced to one or another degree.
02:32:47.000I just think that someone is going to have to be dynamic, they're going to have to engage people in a way that... I mean, they're going to have to deal with his attacks.
02:33:01.000Or maybe what people will be thirsting for after four years is actually bland.
02:34:53.000No, you have to be wired differently, but he clearly is...
02:35:01.000He's wired that way, and it's worked for him.
02:35:03.000But you need someone who's willing to submit to the punishment of running, and that's a rare person.
02:35:10.000And the problem is that kind of selects for things that you don't actually want in a president, or at least I wouldn't think you would want.
02:35:18.000I mean, it selects for a kind of narcissism and a sense that it really has to be you, right?
02:35:25.000It doesn't select for the... In a normal intellectual space, you're constantly aware of the ways in which you are not the best guy or gal to be doing the thing, right?
02:35:37.000Like, you want to defer to experts, and, you know, Trump is, only I can fix it, right?
02:35:47.000There's something of that that creeps into the...
02:35:52.000The headspace of most politicians, it seems.
02:35:56.000So scientific humility and just a sense of the limits of any one person's expertise is not necessarily the right piece of software to have running when it comes time to run for president.
02:36:12.000Now, I want to switch gears a little bit.
02:36:14.000It's not totally related, but it is in some ways.
02:36:20.000Talking about mindsets and talking about, like, we've brought this up but really haven't delved into it much at all, about meditation and about how much it's affected you and how it got you back on track.
02:36:31.000And I know that you're a big proponent of it, and I am as well, although I think I'd probably do it differently than you guys do.
02:37:03.000You're essentially forced to deal with what these poses require of you.
02:37:08.000And I think that in doing so and having a singular focus of trying to maintain your balance and stretch and extend and do all these different things while you're doing it, and concentrating almost entirely on your breath, which is a big factor in yoga, it has remarkable brain-scrubbing attributes.
02:37:36.000Well, you know, I use it in a bunch of different ways.
02:37:40.000I don't use it as much as I should, honestly.
02:37:42.000But I... sometimes I go in there with an idea, like I'll concentrate on material that I'm working on, or maybe jujitsu techniques that I'm having problems with, or some other things that I'm dealing with, you know, any sort of issues that I have.
02:38:27.000And it also, it gives you this environment that's not available anywhere else on the planet.
02:38:31.000This weightless, floating, disconnected from your body environment where you don't hear anything, you don't see anything, you don't feel anything, you feel like you're weightless.
02:38:40.000You have this sensation of flying because you're totally weightless in the dark.
02:38:46.000You open your eyes, you don't see anything.
02:38:48.000You close your eyes, it's exactly the same.
02:38:50.000The water's the same temperature as your skin, so you don't feel the water and you're floating.
02:39:45.000A systematic collision with the asshole in your head has a real value.
02:39:49.000Because when the asshole offers you up a shitty suggestion in the rest of your life, which is basically its job, like, oh yeah, you should eat the 17th cookie or say the thing that's going to ruin the next 48 hours of your marriage or whatever, you're better able to resist it.
02:41:18.000In other words, that's how we're classified as a species.
02:41:21.000The one who thinks and knows he or she thinks.
02:41:25.000And knowing that you have this voice in your head... as Sam likes to joke, when he thinks about the voice in his head, he feels like he's been hijacked by the most boring person alive.
02:41:36.000Just says the same shit over and over.
02:41:39.000A joke that I steal from him all the time.
02:41:52.000I do a form of meditation in the tank.
02:41:55.000Sometimes when I go in there without an idea, like if I'm not working on material or anything else, where I just concentrate on my breath in through the nose, out through the mouth, and I just literally concentrate on the breath, and the same thing happens.
02:44:02.000This is true. Yeah, I don't know, I think there might be easier ways to get to the same wisdom. Maybe, but I think there's also creativity that gets inspired by the edible pot. Something called 11-hydroxy metabolite that your body produces.
02:44:19.000It's so different that most people, when they eat pot, think they've been dosed.
02:44:24.000Like, anybody who's smoked pot before, and then you give them a brownie, they think, oh my god, there's something in that, and they're convinced, because reality itself just seems like it dissolves. And especially inside the tank, there's something about the tank environment where, in the absence of any external stimuli, your brain becomes sort of supercharged, because what you're trying to do when you're just sitting down and concentrating and relaxing is focus on your thoughts,
02:45:15.000Sometimes you bump into the wall and you have to center yourself and you have to relax again and make sure you're not moving so you don't touch things, which can kind of dissolve the experience.
02:45:26.000There are experiences in meditation where you have that same experience where you lose your sense of the body, but that usually comes with more concentration.
02:45:35.000You have to be very concentrated on them.
02:45:37.000I feel like you would have that experience and it would be even more intense if you did the exact same thing that you do outside the tank in the tank.
02:45:44.000I don't think you need any psychedelics in the tank.
02:45:46.000It's one thing I tell people when they ask me, should I get high before I do it?
02:45:53.000If you decide after a while, if you've done it three or four times, you're like, I wonder what it's like if I just take a little bit of a hit of pot and see where it takes me.
02:46:01.000You know, if you're a type of person who enjoys marijuana or whatever.
02:46:05.000But the tank alone by itself, just the absence of sensory input, your brain goes to a very, very different place.
02:46:12.000And as long as you can relax, as long as you don't think too much about the fact that you're in the tank, just concentrate entirely on your thoughts, entirely on your breath. And again, let all those crazy thoughts go, like, where do hamsters live?
02:47:36.000So nothing, in principle, is a distraction.
02:47:38.000I mean, you could be meditating right next to a construction site, and the sound of the hammers is just as good an object of meditation as the breath or anything else.
02:47:52.000The superpower you're after, which you actually can acquire through this practice, is to realize that virtually all of your psychological suffering, and actually, arguably,
virtually all of your physical suffering, or the difference between physical pain and suffering (those two are not quite the same thing), is a matter of being lost in thought.
02:48:21.000It's a matter of thinking without knowing that you're thinking.
02:48:23.000And what mindfulness does, and really any technique of meditation ultimately should do, is teach you to break the spell of being lost in thought and to notice a thought as a thought.
02:48:35.000The huge difference is, until you learn how to meditate or do something like meditation, you're just helplessly thinking every moment of your life.
02:48:44.000You're having a conversation with yourself.
02:48:46.000You're having content, whether it's imagistic or linguistic, pour forth into consciousness every moment and so incessantly that you don't even notice.
02:49:26.000It produces all of your intentions and your goals and your actions.
02:49:29.000And he said this about me, and now I'm going to say this.
02:49:31.000So it's like everything coming out of you is born of this same process.
02:49:36.000And meditation is a way of recognizing that consciousness, and what you are subjectively, is this prior condition of just...
02:49:45.000The awareness in which everything is showing up, sounds, sensations, and thoughts.
02:49:50.000And thoughts can become just other objects of consciousness.
02:49:55.000And so, I mean, to take even a very basic example of the difference between pain and suffering, you can feel...
02:50:06.000Very strong physical pain, unpleasant pain, and just be aware of it.
02:50:14.000The sense that it's unbearable is virtually always untrue because in that moment you've already borne it.
02:50:20.000The feeling that something's unbearable is really the fear of having to experience it in the next moment in the future.
02:50:28.000Because you're always like, if someone drives a nail into your knee, well, that sounds like it's unbearable, but every moment you're feeling it, you're bearing it.
02:50:40.000What you're thinking about is the last moment and the next moment, and you're thinking about when am I going to get some relief, and what's the cure, and how badly is my knee injured?
02:50:51.000You're worried about the future, continuously, and you're not noticing the automaticity of thought that is amplifying the negativity of the experience in that moment.
02:51:05.000You can have super intense sensation which is either pleasant or unpleasant depending on the conceptual frame you've put around it.
02:51:17.000So for instance, if you had this massive sense of soreness in your shoulder, you would experience it very differently if it was, A, the result of you deadlifting more than you ever had in your life and you were proud of it,
02:51:34.000B, probably cancer and you're waiting for the biopsy results and you're worried about this is the thing that's going to kill you.
02:51:45.000Or, C, you're getting Rolfed, you know, like some deep tissue massage, and it hurts like hell, but you actually totally understand the source of the pain, and you know it's going to be gone the moment the guy pulls his elbow back, right?
02:51:56.000So it could be the exact same sensation in each one of those, but the conceptual frame you have around it totally dictates the level of psychological suffering, or it can dictate the total absence of psychological suffering.
02:52:09.000Now, we were talking before the podcast started about your apps, and we were talking about the amount of different meditation exercises on the apps.
02:52:18.000Like, what kind of different meditation exercises are there if you're talking about just concentrating on mindfulness and breathing?
02:52:26.000As it turns out, you can iterate off of that basic exercise to infinity, essentially.
02:52:39.000I don't want to get too ahead of myself.
02:52:42.000But basically, the basic instructions are the ones we listed before: you're feeling your breath coming in, and then when you get lost, you start again.
02:53:46.000Sam is going to be doing all the teaching on his app, and on my app, since I'm not a teacher, we have experts coming in, like Joseph Goldstein, who's, again, a friend of both Sam and me. And each teacher has their own emphasis,
02:54:04.000and you then start talking about applied meditation.
02:54:07.000So how do I use it in my everyday life?
02:54:11.000How do I use it if really what I want to do right now is control my eating?
02:54:16.000So meditation, for example, we have a course on the app that talks about using it to not overeat.
02:54:23.000But you can use your mindfulness, your ability to know what's happening in your head in any given moment without getting carried away by it, to not overeat.
02:54:31.000Notice, oh, I'm having this urge right now to eat, as I did last night, an entire bag of malted chocolate in my hotel room, but I can ride that urge and not do the thing that I know is stupid.
02:54:45.000So anyway, that's just a little taste of how you can take meditation and bring it in kind of numerous directions.
02:55:14.000No, I actually think I'm of the view, you know, now that I've been in this meditation app business for a little while, I don't think it's...
02:55:22.000I don't think the business model is that there's just one huge app that everybody uses and maybe there's some distant second.
02:55:30.000I actually think it's a little bit more like fast food.
02:55:32.000I think there's going to be a bunch of big players and you may switch back and forth.
02:56:19.000Hearing what I'm saying right now or you're thinking about something else and you don't know it, right?
02:56:23.000Or you're either reading the book you're intending to read or your mind is wandering and you're going to have to read that paragraph again.
02:56:30.000So this failure to concentrate, this failure to be able to focus on what you're intending to focus on is just this universal problem of human consciousness.
02:56:41.000And so meditation trains that and other benefits follow, but the...
02:56:49.000Having a voice in your head reminding you that you're even attempting to be meditating is very powerful even if it's your own voice.
02:57:01.000Listening to a meditation that I recorded, just my own voice reminding me that I'm supposed to be meditating, it works like any other voice.
02:57:14.000So it's a feedback system that you can't really provide for yourself.
02:57:19.000Although, obviously, you can meditate without an app, and most people do.
02:57:26.000I've spent very little time meditating with apps.
02:57:30.000But you, you know, both of us started meditating.
02:57:32.000You started meditating well before I did, but we both started before apps were around.
02:57:39.000So you can read a book, read a good book, and learn how to meditate out of the book.
02:57:45.000Just basically remember the basic instructions and do it.
02:57:49.000But it really is useful to have an app, especially for some people, because one of the biggest problems in meditation is this persistent fear that you're not doing it right.
02:57:58.000And so to have a voice you trust in your ear, just reminding you of the basic instructions, which are so simple but very easy to forget, it can be very useful.
02:58:10.000I like the idea of it being like bicep curls for your mind.
02:58:31.000Trying to focus on one thing at a time, and then when you get distracted, knowing you're distracted and returning to your breath is changing your brain when you do that.
02:58:41.000You're boosting the muscles, and obviously I'm using "muscles" loosely.
02:58:52.000And in many cases, there was a study in 2010, I think it was done at Harvard, that took people who had never meditated before, and they scanned their brains.
02:59:00.000And then they had them do eight weeks of, I think, a half-hour day of meditation.
02:59:03.000At the end of the eight weeks, they scanned their brains again.
02:59:05.000What they found was, in the area of the brain associated with self-awareness, the gray matter grew.
02:59:10.000And in the area of the brain associated with stress, the gray matter shrank.