The Joe Rogan Experience - April 04, 2017


Joe Rogan Experience #940 - Sam Harris & Dan Harris


Episode Stats

Length

2 hours and 59 minutes

Words per Minute

158.5368
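(As a check on the stats above: 28,460 words over the listed runtime works out to 28,460 ÷ ~179.5 minutes ≈ 158.5 words per minute, which matches if the 2:59 runtime is rounded from roughly 2 hours, 59.5 minutes.)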

Word Count

28,460

Sentence Count

2,146

Misogynist Sentences

14


Summary

Dan Harris tells the story of the on-air panic attack he had in 2004 while filling in as the newsreader on ABC's Good Morning America: a few seconds into reading the headlines, his fight-or-flight response kicked in and he lost the ability to speak. He explains what caused it, self-medicating with cocaine and ecstasy after coming home depressed from years of war reporting, and how he recovered with the help of a doctor who connected the drug use to the panic. From there, he, Sam Harris, and Joe Rogan get into Adderall and antidepressants, caffeine, the neurochemistry of drugs, brain-computer interfaces like Elon Musk's Neuralink, self-driving cars, the risks and seductions of artificial intelligence, fake audio and video, and some spectacular lapses in White House security.


Transcript

00:00:03.000 Three, two, one.
00:00:06.000 And we're live.
00:00:07.000 What's up?
00:00:07.000 All right.
00:00:08.000 How are you, man?
00:00:08.000 What's going on?
00:00:09.000 Doing great.
00:00:09.000 Sam Harris, ladies and gentlemen.
00:00:10.000 You got Harris and Harris.
00:00:11.000 Dan and Sam.
00:00:12.000 No relation, obviously.
00:00:13.000 No relation.
00:00:14.000 Brother from another mother kind of thing.
00:00:15.000 Aw, sweet.
00:00:17.000 Actually, no, we work this out.
00:00:19.000 We are deeply unrelated because your Harris is the Jewish side of your family, right?
00:00:24.000 No, no, no, no.
00:00:25.000 My Harris is the...
00:00:26.000 Yes, actually, no, you're right.
00:00:28.000 My Harris is the Jewish side of the family.
00:00:29.000 It was changed at Ellis Island, allegedly from Addis, which doesn't sound Jewish either, but yes.
00:00:37.000 My Harris is the Goyim side of the family.
00:00:40.000 Isn't that funny how many people's names were changed at Ellis Island?
00:00:44.000 Yeah.
00:00:44.000 Where they're like, nah, not American enough.
00:00:46.000 Yeah.
00:00:48.000 Meanwhile, Schwarzenegger made it.
00:00:49.000 Right.
00:00:50.000 Proudly.
00:00:51.000 Yeah.
00:00:51.000 Odd.
00:00:52.000 So, anyway, thanks for coming, you guys.
00:00:55.000 Thanks for having us.
00:00:56.000 This is a weird time.
00:00:57.000 You know, I've been extra weirded out over the last couple months, and I just got back from Mexico.
00:01:01.000 I was on vacation, and I didn't do shit for a week.
00:01:03.000 And in not doing anything for a week, I really got a chance to sit down and think about stuff.
00:01:09.000 And I'm more weirded out by life today than I think I ever have been before.
00:01:13.000 So I'm excited to have you on, because I want to hear your story.
00:01:16.000 Because Sam has been telling me about it, and I looked into it, and it's...
00:01:22.000 Please explain what happened to you, where you were, and what happened to you.
00:01:27.000 Okay, so you're talking about the panic attack, isn't it?
00:01:30.000 Yeah, yeah, yeah.
00:01:30.000 So it was 2004. I was on a little show that we do at ABC News called Good Morning America.
00:01:36.000 That's a big show.
00:01:37.000 That's a big show.
00:01:38.000 That's not a little show.
00:01:39.000 No, it's not a little show.
00:01:41.000 And the job that I was doing, I was filling in as the newsreader.
00:01:45.000 That's the person who comes on at the top of each hour and reads the headlines.
00:01:48.000 Right.
00:01:49.000 And I just freaked out.
00:01:52.000 I just lost it.
00:01:53.000 Holy cow!
00:01:55.000 So I was a couple seconds into it, and I started to get really scared.
00:02:00.000 Have you ever had a panic attack?
00:02:02.000 No.
00:02:02.000 So it's like anxiety on steroids.
00:02:05.000 So you start to worry, but then your fight-or-flight instincts kick in.
00:02:09.000 So your lungs seize up, your palms start sweating, your mouth dries up, your heart is racing, your mind is racing.
00:02:18.000 I couldn't breathe and therefore couldn't speak.
00:02:21.000 So a couple seconds into reading what was supposed to be six stories right off of the teleprompter, I lost the capacity to speak, and I had to kind of squeak out something about, you know, back to you, back to the main anchors.
00:02:35.000 Wow.
00:02:36.000 Yeah, it sucked uncontrollably.
00:02:38.000 Now, what caused it?
00:02:40.000 Do you know?
00:02:41.000 I do, yeah.
00:02:42.000 I definitely know.
00:02:45.000 Some dumb behavior in my personal life is what caused it.
00:02:48.000 I had spent a lot of time as a war reporter at ABC News.
00:02:53.000 I was in Afghanistan, Pakistan, Israel, the West Bank, Gaza, six trips to Iraq.
00:02:59.000 And I had come home from a long run.
00:03:02.000 I covered kind of the pre-invasion, invasion, and then insurgency in one kind of six-month run.
00:03:10.000 And I came home after that, and I got depressed.
00:03:13.000 And I didn't actually know I was depressed, but I was having some obvious symptoms in hindsight.
00:03:18.000 I was having trouble getting out of bed, felt like I had a low-grade fever all the time.
00:03:23.000 And then I did something really smart, which is I started to self-medicate with cocaine and ecstasy.
00:03:29.000 And even though I wasn't doing it all the time, I like to say it wasn't like that, you know, you ever see the Wolf of Wall Street?
00:03:36.000 Yes.
00:03:36.000 Where they're popping 'ludes.
00:03:37.000 That wasn't me.
00:03:38.000 And I wasn't getting high on the air or anything like that, but, you know, I was partying in my spare time because it made me feel better.
00:03:47.000 So after I had the panic attack, I went to a doctor who's an expert in panic, and he started asking me a bunch of questions, trying to figure out what had caused the panic attack, and one of the questions was, do you do drugs?
00:03:57.000 And I was like, yeah, I do drugs.
00:03:59.000 And he leaned back in his chair and gave me a look that communicated the following sentiment.
00:04:06.000 Okay, asshole, mystery solved.
00:04:08.000 And he just pointed out that, you know, you raise the level of adrenaline in your brain artificially.
00:04:14.000 You make it much more likely to have a panic attack.
00:04:16.000 And I, at baseline, I'm a jittery little dude.
00:04:21.000 So it doesn't take much to put me in that zone.
00:04:24.000 I mean, you just offered me coffee and I said no, because even that will freak me out.
00:04:28.000 Well, it's weird that ecstasy and cocaine was the combination, because ecstasy is something that they actually give to a lot of soldiers that have PTSD, and there's been quite a few tests on that.
00:04:37.000 Yeah, I don't actually think ecstasy was the problem.
00:04:40.000 The coke?
00:04:41.000 I think it was the coke.
00:04:42.000 Yeah, that makes sense.
00:04:44.000 But those are the two drugs I was mostly doing.
00:04:46.000 How often were you doing it?
00:04:47.000 I would say, you know, there would be months where I wasn't doing it at all because I was off.
00:04:52.000 I covered the 2004 presidential campaign and I didn't have a lot of time to be, you know, snorting coke.
00:04:58.000 But when I was home and around my friends, you know, on a busy week, you know, two, three times a week.
00:05:06.000 Well, that's a lot.
00:05:08.000 Yeah.
00:05:08.000 That'll do you.
00:05:09.000 Mm-hmm.
00:05:10.000 There's a comeback right now that cocaine is experiencing.
00:05:15.000 Really?
00:05:15.000 Did it ever go away?
00:05:16.000 I don't know.
00:05:17.000 I've never done it.
00:05:17.000 Was there some sort of cocaine recession that I had to bounce back from?
00:05:20.000 I believe there was.
00:05:21.000 Really?
00:05:22.000 I'm talking totally ignorantly.
00:05:23.000 I've been out of the game for a long time, so...
00:05:26.000 The cocaine?
00:05:26.000 Yeah, yeah.
00:05:27.000 I'm kind of boring now.
00:05:31.000 But it feels to me like it's kind of a perennial favorite.
00:05:34.000 Yeah, I don't know.
00:05:35.000 I mean, I feel like it went through a recession.
00:05:37.000 Maybe it's just my perception.
00:05:38.000 I had a buddy of mine when I was in high school, and his cousin was hooked on coke.
00:05:43.000 And I watched while we were in high school.
00:05:45.000 He started selling it, and he withered away, lost like 30 pounds or something like that.
00:05:49.000 And just him and his girlfriend just hide out in the attic.
00:05:52.000 They had an attic apartment.
00:05:53.000 They would just hide out there and watch TV and do coke and sell coke to people.
00:05:57.000 And I was like, well...
00:05:58.000 Fuck that drug.
00:05:59.000 Whatever that drug's doing, these people...
00:06:01.000 It was almost like knowing someone who had gotten bitten by a vampire and become something different.
00:06:06.000 It was very strange.
00:06:08.000 So my experience is seeing people do that led me to never do it.
00:06:12.000 Yeah, I mean, it certainly was not like that for me, but I could see it over the horizon.
00:06:18.000 It's an incredibly addictive drug.
00:06:20.000 So I think you made the right call.
00:06:22.000 Yeah, it's got a little too much gravity attached to it.
00:06:26.000 Yeah, I mean, you can get hooked and it will bring you down.
00:06:31.000 There are other drugs you can do.
00:06:32.000 I'm not recommending drugs, but the way my friend Sam, my half-brother Sam over there does.
00:06:37.000 Let me get to it.
00:06:41.000 There are other drugs you can do that have vastly lower addictive character.
00:06:49.000 What's the word I'm looking for here, Sam?
00:06:50.000 Characteristics?
00:06:51.000 Yes, thank you.
00:06:53.000 So how did you recover?
00:06:56.000 So I wasn't actually doing it that long.
00:06:58.000 I actually had never done hard drugs until my early 30s when I came home from the war zones.
00:07:03.000 And that's what started it?
00:07:05.000 Yeah.
00:07:06.000 You just were freaked out by seeing too much?
00:07:08.000 No, you know, it's actually, it wasn't PTSD. It was, I was addicted to the adrenaline.
00:07:14.000 Oh, wow.
00:07:14.000 It was not that I was traumatized, it was that I was enjoying it too much.
00:07:18.000 And I would come home and the world would seem gray and boring.
00:07:22.000 Wow.
00:07:23.000 Yes, that was the problem.
00:07:25.000 Did you watch Hurt Locker?
00:07:26.000 I'm sure you did, right?
00:07:27.000 Yeah.
00:07:27.000 Did that resonate with you?
00:07:29.000 Absolutely.
00:07:29.000 It's been a while since I watched it, but absolutely.
00:07:32.000 I want to just be clear that the experience of a journalist is so different from the experience, so much more mild than the experience of an enlisted man or woman.
00:07:43.000 Sure.
00:07:44.000 I don't want to compare my experience to the Hurt Locker.
00:07:47.000 I'm an observer on the side, and I don't even want to compare my experience to more experienced war correspondents out there.
00:07:55.000 I'm thinking of guys like Richard Engel on NBC. Sebastian Junger.
00:07:58.000 Absolutely.
00:07:58.000 I just actually sat down with him the other day.
00:08:00.000 He's got a new documentary coming out.
00:08:03.000 My experiences are much more mild than that, but certainly enough to really get a sense of how thrilling it is.
00:08:10.000 There's an expression, there's nothing more thrilling than the bullet that misses you.
00:08:13.000 And in my case, luckily, they all miss.
00:08:15.000 That was not true for some of my friends.
00:08:19.000 So I had a real sense of the stakes, but it is exciting.
00:08:24.000 It's also thrilling on an idealistic level.
00:08:28.000 I mean, I believe...
00:08:29.000 And the importance of bearing witness to the tip of the spear, to what we're doing, to what our military is doing in our name.
00:08:37.000 So all of that is a heady mix.
00:08:39.000 So you knew people over there, journalists that got killed?
00:08:42.000 Oh yeah, absolutely.
00:08:44.000 A very good friend of mine, the guy who actually ultimately set me up with my wife, is a guy named Bob Woodruff, who was the anchor of World News Tonight on ABC News.
00:08:53.000 He had only been in the chair for about a month when he was on a trip to Iraq and he literally got his head nearly blown off.
00:09:12.000 Traumatic brain injury, was brought back to life, is to this day a walking miracle that he's alive.
00:09:19.000 And after he recovered, he then introduced me to the woman I married.
00:09:24.000 So he's a close friend, and I saw cases like that, lost friends, both Iraqi friends.
00:10:01.000 Whew.
00:10:02.000 When you're a journalist and you're over in Iraq or in Afghanistan, you're in war, and what you're experiencing is so far removed from the day-to-day life that most people experience, what is it like trying to relay that to people?
00:10:20.000 How difficult is it?
00:10:21.000 Because I think so many people have this...
00:10:24.000 Almost dramatic television movie-slash-view of war, where they don't ever experience it.
00:10:32.000 I would imagine probably 99% of the people that are in this country will never experience it.
00:10:39.000 No, and that's good.
00:10:40.000 I don't know that you can describe it...
00:10:55.000 Absolutely.
00:11:18.000 Yeah, and that's why you see a lot of risk-taking behavior among vets, because you're looking for another way to get that hit of adrenaline, for sure.
00:11:34.000 There's a book, and I'm blanking on the name, it's a great book written by a much more experienced war correspondent than me.
00:11:41.000 He used the phrase, war is a drug.
00:11:44.000 And that, to me, sums it up, at least in my experience.
00:11:47.000 I got hooked on the experience of being in these really elevated situations, heightened situations, cinematic, dramatic situations, and I would come home and I didn't know what to do to replace it, and so there was this synthetic squirt of adrenaline that you can get from cocaine. So...
00:12:19.000 How did you bounce back?
00:12:21.000 Oh, that was actually the question you asked me before that I somehow neglected to answer.
00:12:25.000 So the doctor who pointed out that I was an idiot and doing drugs and it had caused the panic attack, I agreed.
00:12:34.000 He didn't think I needed to go to rehab because it was pretty short-lived.
00:12:38.000 I was in my early 30s when I started and I'm still in my early 30s when I had the panic attack.
00:12:42.000 So it was only a couple years.
00:12:44.000 He said, I want you to come see me once or twice a week.
00:12:49.000 Forever.
00:12:49.000 Forever?
00:12:50.000 How convenient.
00:12:51.000 He said basically indefinitely.
00:12:53.000 So I still see him, but not...
00:12:55.000 It's a good business model.
00:12:55.000 Yeah, it's a good business model.
00:12:57.000 But not that...
00:12:58.000 I mean, it's been well north of...
00:13:00.000 It's been about 13 years, so I don't see him that often now.
00:13:03.000 But for a long time, I saw him intensively.
00:13:07.000 So that, you know, it wasn't easy.
00:13:09.000 It's not easy.
00:13:10.000 And I wouldn't call what I had... I mean, there are people who have had drug addictions that are vastly more severe than mine.
00:13:19.000 But it sucks to stop a habit that is giving you pleasure in, you know, pretty prominent areas of your brain.
00:13:29.000 Well, your situation, what you're talking about is a very, very, very extreme situation.
00:13:33.000 Like being a journalist, a war correspondent, going over there, experiencing that intense sort of adrenaline rush and then having your issues with it.
00:13:43.000 But it seems like there's a tremendous amount of people today that are stimulating themselves.
00:13:49.000 Adderall is a big one.
00:13:51.000 I mean, it's just, I know, I've found out recently like four or five people that I didn't know that were on Adderall.
00:13:56.000 It seems like you just sort of start asking questions, and you find out how many people... I mean, my kid goes to school with a bunch of other kids, and you get to meet the parents, and like fucking half of them are on Adderall.
00:14:09.000 It's very weird.
00:14:11.000 And Adderall is a form of amphetamine.
00:14:14.000 And it seems like...
00:14:17.000 It's mind-boggling how many people are doing this stuff.
00:14:20.000 We're dosing ourselves with all sorts of things.
00:14:23.000 So it can be stimulants, but it also can be benzos, cousins of Valium.
00:14:31.000 It can be shopping, gambling.
00:14:33.000 It can be whatever, except Twitter.
00:14:40.000 I think it's just speaking, we have a neuroscientist in the room, so I'll let him say more about this, and also a guy who's a more experienced practitioner of Buddhism than I am, but, you know, it does speak to the nature of the human mind, that we're always on the hunt for the next little hit of dopamine, and now there are lots of ways to get it.
00:14:59.000 Well, it's also very bizarre that you can just do that.
00:15:01.000 I mean, I don't think there's ever been a time in history where you could just take a pill and you'll be elevated for five or six hours.
00:15:09.000 I mean, and that your doctor will give you this pill and they'll encourage you to take it.
00:15:13.000 And then you'll find out that 50% of the people in your community are taking it.
00:15:16.000 I've done stories about parents who steal it from the kids.
00:15:20.000 Also just caffeine.
00:15:21.000 I mean, I'm reaching for the coffee here, having slept poorly last night.
00:15:27.000 But that is, and that's just as much of a drug, it's just not as potent a drug as taking methamphetamine or Adderall or anything else that's a drug drug.
00:15:38.000 But, I mean, this had civilizational consequences when humanity more or less switched from alcohol in the morning to caffeine in the morning.
00:15:48.000 That's just—things got a lot different.
00:15:51.000 I mean, for hundreds of years, people were just drinking ale and wine in the morning before coffee and tea became huge in Europe. And colonialism, the engine of colonialism, to a significant degree,
00:16:07.000 was coffee, tea, sugar, and, you know, our behavior changed.
00:16:14.000 For people that were drinking ale and wine, a big part of it, the reason why they drank it with food, is because water would get stagnant.
00:16:21.000 Yeah, well, there's the issue with clean water, too.
00:16:24.000 Yeah.
00:16:26.000 Just imagine the consequences of you and everyone you know getting up in the morning and just starting with beer or wine.
00:16:33.000 I know people like that.
00:16:34.000 That's a long day or a very short one.
00:16:40.000 But the fundamental...
00:16:44.000 I think the point of the underlying neuroscience is that all of these drugs, anything you're putting into your body that's modifying the behavior of your brain is only modifying the existing available neurochemistry of your brain.
00:17:02.000 They get your brain to secrete more of an existing neurotransmitter, or they mimic an existing neurotransmitter by binding to the same receptor site, or they keep something in play longer than it would otherwise have been.
00:17:14.000 They block the reuptake of neurotransmitters or neuromodulators.
00:17:19.000 So, a drug is never getting your brain to do something your brain is incapable of doing.
00:17:26.000 And that's true even of the most extreme things, like DMT or LSD. The brain is still doing all of that.
00:17:33.000 And so it stands to reason that there are...
00:17:37.000 There are potentially other ways of getting the brain to do that, whether it's meditation or whether it's computer interface, ultimately, to the brain.
00:17:44.000 I know that people are interested in brain-computer interface that not only allows a quadriplegic to move a robotic arm or gets a Parkinson's patient to be able to move,
00:18:00.000 but, to the ultimate degree, actually augmenting human function or opening... I mean, all of that is...
00:18:17.000 In principle, possible because, again, we're just talking about electrochemical phenomenon happening in our heads, which is just there to be modulated.
00:18:28.000 Now, when you were talking about being depressed, and Sam, you're talking about reuptake inhibitors, I want to know, what are your thoughts on the massive amount of people that are on SSRIs now?
00:18:39.000 I mean, there's another thing: how many people I know that either are on or have been on some sort of antidepressants.
00:18:47.000 And it seems, I mean, to me, to someone who's never taken them or doesn't have personal experience with it, massively overprescribed.
00:18:57.000 Yeah, well, anything I say is with a caveat that this is, I mean, I'm not a neurologist.
00:19:03.000 I have zero clinical experience.
00:19:06.000 And this is certainly not my area.
00:19:08.000 I'm not up on the recent literature on the efficacy of antidepressants.
00:19:15.000 But there's...
00:19:17.000 I mean, clearly...
00:19:19.000 It's like anything.
00:19:21.000 There's a spectrum.
00:19:21.000 There are people who have been unambiguously helped by antidepressants.
00:19:25.000 And there are people who are on them who shouldn't be on them.
00:19:28.000 And there are people who are on them who want to get off them and find it surprisingly difficult to get off them.
00:19:33.000 And so it's just...
00:19:34.000 These are blunt instruments by definition because...
00:19:41.000 Anything that's modulating serotonin, in this case, is effective everywhere serotonin is effective.
00:19:51.000 There's no magical property of...
00:19:57.000 finding the neuromodulator where only the symptoms you want to relieve are affected, because these chemicals do a lot of things in a lot of places, even in your gut.
00:20:10.000 Hence the side effects you get with almost any medication.
00:20:17.000 In many respects, it's a matter of luck.
00:20:21.000 to find a pharmacological target that actually does just what you want it to do, which is to say that those receptors are not elsewhere that are going to produce side effects for you.
00:20:34.000 That's why a different kind of intervention, something like, ultimately, some electrical or magnetic or machine-based intervention, could be more targeted, because then you're not just putting something in the bloodstream that spreads everywhere.
00:20:54.000 By machine-based, you mean something like electrodes that they put on the mind or the surface of the head to stimulate areas of the brain?
00:21:02.000 Yeah, yeah.
00:21:02.000 And again, what we have now is also still pretty primitive.
00:21:06.000 And anything that you would have that would be...
00:21:10.000 super futuristic would seem to require that you put something actually inside your head, right?
00:21:17.000 So whether that's neurosurgery or putting something into the bloodstream that somehow gets inside your head, you know, an injectable...
00:21:27.000 So for instance, Elon Musk just mentioned something which he called neural lace, which is...
00:21:37.000 I believe a term that came from a sci-fi novel.
00:21:39.000 I don't think it originates with him.
00:21:41.000 I'm not a big science fiction reader.
00:21:42.000 But he announced investment in a company called Neuralink, which is looking at some advanced brain-computer interface.
00:21:54.000 Based on the idea that you could get a, with these new microelectrodes, you can get an injectable mesh, like a wire mesh that just integrates with the brain, or very likely just the cortex.
00:22:11.000 And I believe this work has already been done in mice, and the mice are, you know, have survived and are living with this mesh in their brains.
00:22:22.000 And again, this is not research I'm close to at all.
00:22:24.000 He just announced this a couple of weeks ago.
00:22:26.000 But in principle, you're talking about having, whether it's a mesh or whether it's magneto-electric particles, something that is...
00:22:38.000 on-site around individual neurons or assemblages of neurons, which can both read out and input signal wirelessly from those neurons.
00:22:50.000 So just by both putting your thoughts into the world by influencing effectors, robotic arms or cursors on screens or whatever it is, and also influencing your mind based on whatever inputs you want to put in there from the world.
00:23:09.000 Until the Russians hack you.
00:23:11.000 Yeah, exactly.
00:23:12.000 It opens all of those concerns.
00:23:14.000 Always the Russians.
00:23:15.000 It's always the Russians.
00:23:16.000 The Chinese are gambling.
00:23:19.000 I think the Chinese people are responsible for a lot of the propaganda that makes us think about the Russians.
00:23:23.000 It's like, yeah, push it off on them.
00:23:24.000 Push it off on them.
00:23:26.000 I'm prepared to blame the Russians for a lot at this point.
00:23:28.000 Blame them all.
00:23:29.000 What a slippery slope, though, for humans becoming cyborgs.
00:23:32.000 I mean, we're already like some weird form of symbiote right now carrying our cell phones like it's a baby.
00:23:38.000 You know, like you leave your cell phone and you're like, oh my god, I forgot the baby.
00:23:41.000 I mean, it's a very strange thing that we already have, and there's not a whole lot of steps between that and Snapchat glasses.
00:23:49.000 Jamie's got the Snapchat glasses.
00:23:50.000 Have you ever seen those things?
00:23:51.000 Yeah, I didn't know they existed.
00:23:53.000 Oh, they're very strange.
00:23:54.000 They have little cameras on them, and...
00:23:57.000 Yeah, you got it.
00:23:57.000 Go ahead, throw them on.
00:23:59.000 He knows how to use them.
00:24:00.000 Google Glass just was totally stillborn, right?
00:24:04.000 Yeah, it didn't work.
00:24:05.000 So he's transmitting, or he's making a video right now with that left side.
00:24:08.000 See how the left side is spinning?
00:24:10.000 What does Snapchat do, like 15 seconds?
00:24:13.000 10 seconds at a time. That little counter just flashing was like the last three seconds, and you can hit it again and get another 20 seconds, or you can hold it and do like 30 seconds or something like that.
00:24:21.000 Yeah.
00:24:21.000 Step one.
00:24:22.000 Now is that spinning just to alert the people you're looking at that you are recording?
00:24:26.000 I see it too.
00:24:27.000 So it's to alert me, too, to let me know that my recording is done.
00:24:30.000 Probably a little bit to let you know too, but it's probably the only notification, you know, that I'm recording.
00:24:34.000 And it just posts?
00:24:35.000 It doesn't post automatically, and now I have to link it to my phone and then post it from there.
00:24:39.000 Okay.
00:24:40.000 The Google Glass thing made people very uncomfortable.
00:24:42.000 I mean, I tried a very early prototype.
00:24:45.000 I have a good friend of mine who was an executive at Google at the time, and she got a hold of one of the really early ones that actually had to be tethered by a cord.
00:24:54.000 And, you know, you talk to it and swipe it.
00:24:56.000 And I played with it a couple of times and we used it once at a UFC weigh-in where I put it on and I broadcast from the weigh-in.
00:25:02.000 It's very, very odd.
00:25:03.000 But it made people very uncomfortable.
00:25:05.000 Like you could see the difference when they saw you with that thing on.
00:25:08.000 All of a sudden there was all this apprehension.
00:25:10.000 They're being, you know, recorded or transmitted.
00:25:14.000 But how long?
00:25:15.000 How long do we have while we're still people?
00:25:19.000 Well, long before you asked that question, our sense of privacy...
00:25:27.000 I remember what it was like to be neurotic about the sound of your voice on a voicemail or an answering machine, right?
00:25:35.000 Like re-recording the outgoing message and just being worried about your voice showing up in someone else's tape.
00:25:42.000 And now we're living in this panopticon surveillance society where you just assume you're on camera virtually every moment you're in public.
00:25:51.000 Although I guess people don't think about it all that much.
00:25:55.000 I think the norms around privacy shift just because we get so much value from having the data, ultimately.
00:26:06.000 I guess.
00:26:06.000 But it just seems also like there's just this inexorable pull.
00:26:12.000 Yeah.
00:26:33.000 To be able to pick up your phone and find out what's true has also never been better.
00:26:39.000 So it's like we're both vulnerable in a way that we've never been and we're empowered in a way.
00:26:44.000 It's always been true about technological progress.
00:26:47.000 Yeah.
00:26:47.000 It just seems like there's a certain amount of time we have left.
00:26:52.000 Before we give birth to some new thing.
00:26:55.000 We're talking about just integrating ourselves biologically with our machines.
00:26:59.000 And also something that's independent of us.
00:27:02.000 Some artificial intelligence is independent of us.
00:27:05.000 You want to look for some fear around that?
00:27:07.000 You've got the right guy right here.
00:27:08.000 I've expressed those fears here, yeah.
00:27:11.000 Yeah.
00:27:11.000 I mean, and Josh Zeps freaked me out this morning.
00:27:14.000 He sent me some articles about these self-driving trucks that are already going in Australia that are as big as a 767, and they're driving down the road by themselves with cargo, probably nuclear waste or something, you know, just tooling down the road.
00:27:29.000 But people are so bad at driving that the robots just have to get reliably better than people, and then you'll just feel nothing but relief.
00:27:38.000 They probably already are.
00:27:39.000 Well, yeah.
00:27:40.000 I mean, I think they probably are, as far as I know from what I hear from Tesla, that, yeah, the man hours they have of people using the autopilot, and the autopilot's not at all perfect, obviously.
00:27:55.000 Yeah.
00:27:55.000 Two people have died already from using it badly.
00:27:59.000 But still, they have something like some millions of man-hours of autopilot-assisted driving, and I think that has been safer than just pure ape.
00:28:10.000 Did you see the video of the guy who fell asleep in traffic in San Francisco?
00:28:14.000 He's literally out cold and his car's driving him on the highway?
00:28:17.000 No, no, no.
00:28:18.000 It's kind of fantastic.
00:28:20.000 I mean, he's just some guy on his way to work, just passed out completely, mouth open, and people are filming him while his car is driving down the road.
00:28:28.000 Yeah, and actually, probably the people filming him are doing the more dangerous thing.
00:28:32.000 You're right, yeah.
00:28:33.000 Well, texting and driving drives me crazy.
00:28:36.000 To be in an Uber now, you can look around, you can see how many people are texting.
00:28:41.000 Yeah, including your driver sometimes.
00:28:42.000 Yeah, that drives me crazy.
00:28:44.000 Here's a guy.
00:28:45.000 Oh, yeah.
00:28:46.000 That dude's out cold.
00:28:49.000 His car is creeping along and stop and go traffic and he is completely out cold.
00:28:54.000 Wow!
00:28:58.000 Well, it works.
00:28:58.000 The autopilot works.
00:29:00.000 Yeah, no, it does work.
00:29:02.000 Yeah, texting and driving scares the shit out of me.
00:29:04.000 That Pokemon Go thing, thank God that died off.
00:29:07.000 It stopped, yeah.
00:29:08.000 I was driving on the highway, and there was a woman to the left of us, and I noticed that her face was illuminated by her cell phone.
00:29:14.000 So I look over, and she's playing Pokemon as she's driving.
00:29:18.000 So I guess, I don't know how Pokemon works, but I guess you pick up things, and as you're driving, you can get stuff.
00:29:24.000 And so she was playing the game while she was barely paying attention to the road, looking at her phone.
00:29:32.000 This is the argument for robot drivers.
00:29:34.000 Yeah.
00:29:35.000 And also it will open up when you are ultimately being driven safely by a car that you trust more than you trust yourself.
00:29:45.000 Then just imagine the information consumption entertainment options that open up there.
00:29:51.000 You'll watch movies.
00:29:52.000 You'll listen to podcasts.
00:29:54.000 You'll get work done.
00:29:55.000 And it'll become a space of...
00:29:57.000 We're not going to miss...
00:30:00.000 having to pay attention to the road.
00:30:02.000 Maybe some people want to drive recreationally for some reason, but it will just be a new space where you won't believe that 40,000 people every year were dying because we couldn't figure out how to drive safely.
00:30:17.000 But isn't that a slippery slope?
00:30:20.000 That's a great thing, that 40,000 people are not going to die.
00:30:23.000 But the idea that you're going to stop people from driving a car...
00:30:26.000 You're going to have to live with those 40,000 people dying.
00:30:29.000 That's not what I was going to say.
00:30:30.000 But I was going to say, I mean, pretty much all the things that people do.
00:30:34.000 And then it's going to get down to why have people?
00:30:37.000 I mean, ultimately, that's the...
00:30:38.000 When I look at the event horizon of artificial intelligence, it's why?
00:30:42.000 Why would we...
00:30:43.000 We're so flawed.
00:30:44.000 We're not going to get our shit together by the time artificial intelligence is given birth to.
00:30:50.000 Well, that all goes to what we build.
00:30:56.000 I mean, if we build artificial intelligence that is...
00:30:59.000 If it's independent of us and seems conscious and is more powerful than us, well, then we are, we have built a, in the limit, we have essentially built a god that we now have to be in relationship to.
00:31:13.000 And hopefully that works out well for us.
00:31:16.000 And it's very easy to see how it might not.
00:31:18.000 I think there are even scarier cases than that, though.
00:31:21.000 We could build something that has godlike power.
00:31:23.000 But there's no reason to think it's conscious.
00:31:27.000 It's no more conscious than our current computers, which is to say that intelligence and consciousness may be separable phenomenon.
00:31:36.000 Intelligence can scale, but consciousness need not come along for the ride.
00:31:40.000 And that, for me, is the worst case scenario, because we inherit all of the danger of the power of this system being misaligned with our interests.
00:31:51.000 We could build something that's godlike in its power, and yet we could essentially be canceling the prospects of the evolution of consciousness.
00:32:01.000 Because if this thing wipes us out, I think it's Nick Bostrom, the philosopher, who wrote a great book on this entitled Superintelligence.
00:32:10.000 I think he calls this the Disneyland without children.
00:32:13.000 Basically, we could build this...
00:32:16.000 this incredibly powerful, intelligent landscape that continues to refine itself and its own powers in who knows what ways, ways perhaps that are unimaginable to us, and yet the lights aren't on.
00:32:33.000 There's nothing that it's like to be this machine or system of machines in a way that it's probably nothing that it's like to be the Internet right now.
00:32:41.000 You think of all that's going on on the Internet...
00:32:44.000 I don't think the Internet is conscious of any of it right now.
00:32:47.000 The question is, could the Internet become conscious of what it's thinking?
00:32:51.000 And I think there's no reason to think it couldn't.
00:32:55.000 It's just we don't understand the physical basis of consciousness yet.
00:32:59.000 The real question is, why would it do anything?
00:33:02.000 I mean, if it doesn't have any of the biological motivations that people have to breed and to stay alive and to, you know, fight or flight and to be nervous and this desire to carry on our genes, I mean, if you really did build the ultimate supercomputer artificial intelligence that was beyond our capacity for reason and understanding,
00:33:20.000 wouldn't it just do nothing?
00:33:21.000 Because everything is pointless.
00:33:23.000 Well, no, because what we would do...
00:33:25.000 We would program it.
00:33:26.000 It would do whatever we...
00:33:27.000 Right.
00:33:45.000 And so everything we build that's automated has goals explicitly programmed into it.
00:33:53.000 And when you're talking about a truly intelligent machine, it will discover goals that you have never programmed into it that are intermediate to the goal that you have programmed.
00:34:05.000 So if the goal of this machine is to, you know, pick up all the trash in this room, and you physically try to stop it, well, then it's going to try to get around you to pick up the rest of the trash in the room, right?
00:34:20.000 So it's, you know, this is probably already true of a Roomba, right?
00:34:23.000 I actually don't have a Roomba, but if you put something in the way of the Roomba, it's going to get around the thing you have put in its way so that it can get to the rest of the room.
00:34:33.000 So that's an intermediate goal, and some of these goals need never have been explicitly thought about or represented, which is to say programmed into it, and yet they're formed by the fact that the thing has a long-term goal.
00:34:49.000 And one of the concerns is that we could build something...
00:34:53.000 That has a long-term goal, build something that's super powerful, that has a long-term goal, which in principle is benign, right?
00:35:01.000 This is something we want, and yet it could discover instrumental goals that are deeply hostile to what we want.
00:35:09.000 I mean, the thing doesn't have common sense, right?
00:35:12.000 We haven't figured out how to build common sense into the machine.
00:35:15.000 So, I mean, there's just cartoon examples of this kind of thing, but like one example that...
00:35:21.000 Elon used when he first was expressing fears about this.
00:35:24.000 If you built a machine, the only goal of which was to cancel spam, right?
00:35:29.000 We want no more spam.
00:35:32.000 Get rid of the spam.
00:35:33.000 Well, an easy way to get rid of spam is just kill all the people, right?
00:35:36.000 Now, that's a crazy thing to think, but unless you've closed the door to that intermediate goal... My question would be,
00:35:51.000 if this super powerful machine has the ability to create new super powerful machines, would it use the same mandate?
00:35:57.000 Would it still try to follow the original programming, or would it realize that our original programming is only instrumental to the success of the human race? And it might think the human race is ridiculous and preposterous, and why not just program something that it thinks is the ultimate intelligence, something beyond our capacity for reason and understanding now? And that thing, I would wonder, in the absence of any sort of biological motivations, in the absence... I
00:36:27.000 mean, do you think of all the things that we do?
00:36:29.000 I mean, you break it down to: what motivates people to get out of bed, what motivates people to do good, our sense of community,
00:36:37.000 the desire to breed, social status, all these different things that motivate people to do things. Remove all of those, and what actions would it take, and why?
00:36:46.000 Well, I think, I think you want to build it in a way that is focused on our well-being.
00:36:53.000 For instance, I had Stuart Russell on my podcast.
00:36:57.000 He's a computer scientist at Berkeley who, unlike many computer scientists, takes this problem really seriously and has thought a lot about it.
00:37:08.000 In his lab, I believe they're working on a way of thinking about safety that is open-ended and flexible, without pretending we have any of the right answers in the near term, or that we're likely to have them.
00:37:26.000 So you want to build a system that wants to know what you want at each point, that's tracking what humanity wants in terms of its goals.
00:38:04.000 It doesn't think it knows what we want, and it continually wants to keep approximating better and better what we want.
00:38:13.000 From my point of view, the most crucial thing is you always want the door to remain open to the statement, wait, wait, wait, that's not what I wanted.
00:38:25.000 You want to be in the presence of this godlike superpower that will always take direction.
00:38:51.000 There's dangerous potential in AI. I mean, I was listening to when you talked to Will MacAskill a few weeks ago, who was saying, you know, it was just a few years from starting to think about how to split the atom to actually having a bomb.
00:39:07.000 Are they not coming around to the idea that technology can progress much faster than we think?
00:39:13.000 Yeah.
00:39:13.000 Well, it's a whole spectrum, from the people who think that this is never going to happen, or that it's so far away that thinking about it now is completely irrational, to people who are super worried and think that huge changes are imminent.
00:39:27.000 And so it's just a spectrum.
00:39:29.000 I'm with the latter.
00:39:30.000 Yeah, as am I. Well, my concern is not even with the initial construction, like the initial AI. My concern is with what the AI creates.
00:39:39.000 If we give the AI the ability to improve upon itself and look at our irrational thoughts and how we've programmed it to support the human race, then it might go, well, why the fuck would I do that?
00:39:50.000 Like, you guys are ridiculous.
00:39:52.000 Like, this is a new life form.
00:39:53.000 This is a new, we've given birth to some...
00:39:56.000 incredibly potent new thing that we think of as artificial, but, I mean, is it really?
00:40:03.000 It's just a form of life.
00:40:04.000 It's a form of life that we've created.
00:40:06.000 Human beings have sort of...
00:40:07.000 The analogy I always use is that we're some sort of an electronic caterpillar giving birth to some spectacular butterfly that we're not even aware of while we're building our cocoon.
00:40:18.000 We're just doing it.
00:40:19.000 I mean is the caterpillar fully conscious of what it's doing when it makes that cocoon?
00:40:23.000 Probably not, but it just does, and there's plenty of examples of that in nature, of something that's going through some metamorphosis that's completely unconscious.
00:40:32.000 My worry would be, I guess it's not even really a worry.
00:40:36.000 It's more like looking at the possibility of the AI improving upon itself and making a far better version than we could create, like almost instantaneously, right?
00:40:47.000 I mean, isn't that... if you give it the ability to be autonomous, and you give it the ability to innovate and to try to figure out what's a better way around things and what's a better way to program things, and then make its own version of what it is, it's gonna be spectacular.
00:41:01.000 I mean, it really will be... I mean, I'm obviously just talking shit, but it really would be a god. I mean, you're talking about something that, if we give it the ability to create, we give it the ability to think, reason, rationalize, and then
00:41:14.000 Build something better.
00:41:16.000 And by build, you should be thinking more software than hardware.
00:41:21.000 Obviously, anything is possible.
00:41:23.000 Anything that can be built with intelligence can be built with this intelligence.
00:41:27.000 So you could be talking about armies of robots and nanotechnology and everything else that is the staple of the sci-fi scare scenario.
00:41:36.000 But more likely, and certainly faster, and ultimately more powerful, you're talking about something that can rewrite its own code, improve...
00:41:46.000 I mean, it's the code that is dictating the intelligence.
00:41:50.000 And so you're talking about something that could be, for the longest time, invisible and just happening on the Internet.
00:41:58.000 You're talking about code that could be put into financial markets, which could be built to be self-modifying.
00:42:10.000 Right?
00:42:11.000 And then it's already out in the wild.
00:42:12.000 It's not sequestered in some air-gapped computer in a lab.
00:42:17.000 It's out there, and it's changing itself.
00:42:20.000 Now, that would be a totally irresponsible thing to do from a software designer's point of view, I think, at this point.
00:42:30.000 There's just no question that we're going to get to a place where...
00:42:33.000 I mean, it either will be the province of one lab that gets there first, or it will be open source.
00:42:43.000 But you're talking about software, right?
00:42:45.000 Us figuring out how to write better software, which becomes the basis of general intelligence.
00:42:55.000 And then where that gets put and what gets done with that, that's...
00:43:00.000 That's the question.
00:43:01.000 Isn't it a real question of also the race to see who can come up with one first?
00:43:06.000 I mean, once the idea gets put out there, like the idea of the nuclear bomb.
00:43:10.000 I mean, obviously, no one in their right mind thinks it's a good idea to make a nuclear bomb.
00:43:15.000 That story is especially sobering, although I may forget the details.
00:43:22.000 I think it was...
00:43:24.000 I could have this backwards.
00:43:25.000 I think it was Rutherford...
00:43:28.000 There were two famous physicists involved.
00:43:30.000 It was either Rutherford who said we're never going to unlock the...
00:43:35.000 I think it was Rutherford who gave a talk saying we're never going to unlock the energy that we now know to be in the atom.
00:43:44.000 And Leo Szilard, the next day...
00:43:49.000 produced the equations that unlocked it.
00:43:52.000 The next day?
00:43:54.000 Unbelievable.
00:43:54.000 And in direct response to this announcement, like, okay, that's bullshit.
00:43:59.000 And the next morning, woke up and produced the math that gave us the atomic bomb.
00:44:06.000 So it can happen really fast.
00:44:08.000 And if you want to get those details exactly right, listen to what Stuart Russell said on my podcast.
00:44:13.000 Well, Oppenheimer, in a really ironic twist, wasn't he a Buddhist?
00:44:20.000 Was he?
00:44:21.000 No, he was a fan of Hinduism, technically.
00:44:25.000 He taught himself Sanskrit, apparently in three months.
00:44:29.000 One never knows how much this is exaggerated.
00:44:32.000 To what end?
00:44:34.000 His publicist will tell you that he taught himself Sanskrit in three months to read the Bhagavad Gita.
00:44:40.000 Which is one of the texts of Hinduism.
00:44:44.000 The quote that he gave out as the first bomb was tested.
00:44:48.000 I have become death, destroyer of worlds.
00:44:50.000 I am become death, destroyer of worlds.
00:44:52.000 And when you hear him say it, it's even more creepy because you can see the sort of remorse.
00:44:57.000 Have you seen that?
00:44:58.000 No.
00:45:00.000 See if we can find that.
00:45:01.000 Every photo of Oppenheimer, he just looks haunted.
00:45:05.000 Well, he was hanging out.
00:45:07.000 It's weird to see him with this general.
00:45:09.000 I was watching some documentary on the creation of the atom bomb.
00:45:12.000 He was hanging out with these generals.
00:45:15.000 Probably Curtis LeMay.
00:45:17.000 Yeah.
00:45:17.000 And you see the two of them together and you're like, what a bizarre pairing.
00:45:22.000 Like this one monkey needs this other genius to make this bomb so he can drop it on these people.
00:45:28.000 And the guy realizes that if he doesn't make it, someone's going to make it and it could be dropped on them.
00:45:32.000 Well, and that's the thing.
00:45:35.000 We were genuinely in a race condition there, and we didn't know how close the Nazis were.
00:45:41.000 It turns out that they weren't as close as we feared.
00:45:43.000 But, yeah, just imagine Hitler having gotten there first, right, with the help of Heisenberg and others.
00:45:48.000 Play it, Jamie.
00:45:50.000 We knew the world would not be the same.
00:45:55.000 Few people laughed.
00:45:59.000 Few people cried.
00:46:01.000 Most people were silent.
00:46:06.000 I remembered the line from the Hindu scripture, the Bhagavad Gita.
00:46:15.000 Vishnu is trying to persuade the prince that he should do his duty and, to impress him, takes on his multi-armed form and says,
00:46:33.000 now I am become death, the destroyer of worlds.
00:46:39.000 I suppose we all thought that one way or another.
00:46:42.000 He does not look like a happy dude.
00:46:45.000 What a burden.
00:46:46.000 Yeah.
00:46:47.000 But it's actually, when you read the history of that effort, the Manhattan Project and the Trinity Test, it is super sobering because they moved forward in a context of real uncertainty about what was going to happen.
00:47:08.000 In terms of the yield of the first bomb, there was a range, I think, of a hundredfold difference of opinion of what they were going to get once this thing went off.
00:47:31.000 Yeah.
00:47:38.000 I mean, they did something like due diligence, where many of them were confident it wouldn't, but that was not beyond the realm of possibility for some of the people working on it. And so we have shown a propensity for taking possibly existential risks to develop new technology, because there's a reason to develop it. And in this case, I think...
00:48:23.000 And these bombs, you know, now getting in the hands of the wrong people.
00:48:26.000 With AI, it's so seductive because, if looked at in one light, it's just all upside.
00:48:35.000 I mean, there's nothing better than intelligence.
00:48:37.000 There's nothing more intrinsically desirable than intelligence.
00:48:41.000 And so to get more of it seems an intrinsic good.
00:48:45.000 And so it takes an extra step to say, well, wait a minute.
00:48:48.000 This could, in fact, be the most dangerous thing we've ever done.
00:48:52.000 And you have to spend a lot of time fighting that ideological battle with people who just think, no, this is just all upside.
00:49:03.000 What could go wrong?
00:49:06.000 You're just scaremongering.
00:49:08.000 You've seen too many Terminator movies.
00:49:12.000 Is that even possible?
00:49:14.000 To see too many?
00:49:15.000 No.
00:49:16.000 Not with the first.
00:49:17.000 The first was good.
00:49:18.000 The first was very good.
00:49:19.000 The second was good.
00:49:20.000 There's one in there that I don't think I've seen all of.
00:49:23.000 There were three or there were four?
00:49:25.000 I don't know.
00:49:25.000 I think it was...
00:49:27.000 I lost touch after the second.
00:49:28.000 I've seen two.
00:49:29.000 I haven't seen any more than two.
00:49:31.000 I don't know.
00:49:33.000 I just feel like it's because of the race, because of the idea that there's a race to get to it, it seems like it's inevitable that someone actually does create it.
00:49:42.000 And much like the atomic bomb, it'll probably be launched without a true understanding of what its potential is.
00:49:52.000 That's my fear.
00:49:54.000 I'm terrified of it.
00:49:55.000 I think about it all the time.
00:49:56.000 I step back sometimes and I look at the city of Los Angeles, look at the skyline, all the lights, and I go, this is all new.
00:50:03.000 This has only been here for a few hundred years.
00:50:05.000 There was nothing here.
00:50:07.000 1700, there was nothing.
00:50:09.000 This was nothing.
00:50:09.000 Now look at it.
00:50:10.000 It's all lit up and there's a gigantic grid you see from the sky.
00:50:13.000 What are we looking at 300 years from now?
00:50:16.000 What are we looking at with all these things that we're feverishly attempting to build and create?
00:50:21.000 Yeah, well, our dependence on the net is sobering.
00:50:27.000 I mean, just forget about all of these highfalutin fears of rogue AI. Just, we don't have a backup for the Internet.
00:50:36.000 I mean, if the Internet goes down, what happens in the real world?
00:50:39.000 A lot that is very difficult to recover from.
00:50:43.000 You know, just what happens to your money, right?
00:50:45.000 What is money when there is no...
00:50:47.000 Right.
00:50:48.000 Or just imagine some malicious code just destroying the record of money, just getting into the banking system, right?
00:51:00.000 So it's like you then have to go look for the paperwork you may or may not have in your desk to argue that you have a certain amount of money because all those bits got scrambled, right?
00:51:11.000 And we need some—all of this is just—there's so many aspects to this, but the fact that you can now credibly fake audio, right?
00:51:21.000 So someone can listen to a sample, you know, five minutes of this podcast, and then produce a conversation we've never had in voices exactly like our own. And those edits will no longer be discernible.
00:51:33.000 I mean, we're basically there now, and we're almost there with video, right, where you could just have our mouths moving in the correct way.
00:51:41.000 Well, again, I go to Snapchat, these crazy Snapchat filters.
00:51:44.000 I don't know if you know about these, but my daughter, pull up the one of my daughter being Abraham Lincoln.
00:51:49.000 I mean, it's fucking crazy.
00:51:51.000 I mean, it's really rudimentary right now, but my six-year-old loves it.
00:51:55.000 She thinks it's hilarious, and she constantly uses it all the time.
00:51:58.000 Like, she's just like, can I play with your phone?
00:52:00.000 And she grabs my phone, and then she starts doing these little videos.
00:52:03.000 Like, somehow or another, the little brains, like, sync up immediately with the technology, where if I gave it to my mom, she'd be like, I don't even know what this is.
00:52:11.000 What do I do?
00:52:12.000 That six-year-old can figure it out like that.
00:52:15.000 Check this out.
00:52:26.000 If I showed this to my daughters, you would just never hear from them again.
00:52:32.000 It's like the most captivating thing.
00:52:35.000 I mean, this is obviously black and white, and she's being silly, and it's really obvious, and it's fake, but man, how far...
00:52:42.000 I mean, this is a six-year-old girl who looks like Abraham Lincoln.
00:52:45.000 It's mimicking the voice.
00:52:47.000 You can't...
00:52:48.000 Like, you see her mouth.
00:52:49.000 You can't discern where her mouth ends, and Abraham Lincoln's face begins.
00:52:54.000 Right, but here...
00:53:07.000 Right.
00:53:09.000 Right.
00:53:10.000 Right.
00:53:17.000 And being able to produce content where, you know, you are saying the thing that completely destroys your reputation, but it's pure—it's just fake news.
00:53:28.000 It's just fiction.
00:53:29.000 And so we need—I mean, I don't know what the fix for this is.
00:53:33.000 You know, I've just—and this is, again, something I know very little about, but I've— Clearly, something like the blockchain has to be a way of anchoring each piece of digital content.
00:53:45.000 So there's like a chain of custody where you see exactly where this came from in a way that's not fake-able.
00:53:51.000 And so, to take your podcast as an example, If someone is producing a clip that purports to be from your podcast where you're saying something insane, there just has to be a clear fingerprint digitally,
00:54:07.000 which shows whether that came from you and Jamie or whether this came from some Macedonian hacker who just decided to screw with you.
00:54:16.000 Yeah, but there's not.
00:54:18.000 I know, but clearly we need that tomorrow, because the technology is here to produce totally fake content, which is...
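(What's being gestured at here is a standard pattern in content provenance: hash each piece of media, chain each record to the previous one, and authenticate the record. A minimal sketch, assuming only the Python standard library; the HMAC below is just a stand-in for the public-key signature a real publisher would use, and every name and value in it is illustrative, not anything from the episode.)

```python
import hashlib
import hmac
import json
import time

SECRET_KEY = b"publisher-private-key"  # stand-in for a real signing key (illustrative)

def fingerprint(data: bytes) -> str:
    """Content fingerprint: SHA-256 of the raw media bytes."""
    return hashlib.sha256(data).hexdigest()

def make_record(data: bytes, prev_record_hash: str) -> dict:
    """Chain-of-custody record: fingerprint plus a link to the previous record."""
    record = {
        "content_hash": fingerprint(data),
        "prev_record_hash": prev_record_hash,
        "timestamp": time.time(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    # HMAC stands in for the digital signature a publisher would attach.
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record

def record_hash(record: dict) -> str:
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def verify(data: bytes, record: dict) -> bool:
    """A clip matches the chain only if its bytes hash to the recorded
    fingerprint and the record's signature checks out."""
    payload = {k: v for k, v in record.items() if k != "signature"}
    expected = hmac.new(SECRET_KEY, json.dumps(payload, sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return (record["content_hash"] == fingerprint(data)
            and hmac.compare_digest(expected, record["signature"]))

# Publish two episodes, each record chained to the last.
ep1 = b"...episode 940 audio bytes..."
r1 = make_record(ep1, prev_record_hash="0" * 64)
r2 = make_record(b"...episode 941 audio bytes...", prev_record_hash=record_hash(r1))

print(verify(ep1, r1))                      # True: genuine clip
print(verify(b"fake Macedonian clip", r1))  # False: fingerprint mismatch
```

(The `prev_record_hash` field is the "chain of custody" part: because each record commits to the one before it, a forged clip can't be slotted into the history without invalidating every later record.)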
00:54:26.000 I am worried about that.
00:54:27.000 I'm also worried about the idea that someone anywhere is completely in charge and knows what's going on.
00:54:33.000 You know, I always point to, you remember that guy that went on stage with Obama and pretended to be a sign language...
00:54:39.000 Translator?
00:54:40.000 That was really...
00:54:42.000 That was South Africa or something like that?
00:54:44.000 I believe it was, wasn't it?
00:54:46.000 That sounds right.
00:54:47.000 I forget where it was.
00:54:49.000 I mean, that was amazing because there's just so few people read sign language that, you know...
00:54:56.000 Wasn't it around a Mandela thing?
00:54:59.000 I don't remember the details, but I remember thinking, this fucking guy is three feet away from the president.
00:55:06.000 He was three feet away from the president, and you would think that they had vetted everyone out.
00:55:10.000 So here's this guy talking, and that guy is just completely making things up.
00:55:15.000 He has no idea...
00:55:18.000 Yes.
00:55:18.000 He has no idea how to do sign language.
00:55:21.000 And that can happen.
00:55:24.000 There was another instance where a guy had gotten in an elevator with Obama.
00:55:28.000 Yeah, a security guard, yeah, with a gun.
00:55:29.000 With a gun.
00:55:30.000 Load a gun in an elevator with Obama.
00:55:32.000 Nobody screened him.
00:55:33.000 It's just people aren't really on the ball.
00:55:37.000 There was a bit in my last comedy special about the guy that broke into the White House: there was a woman guarding the front door by herself, and they had shut the alarm off because it kept going off, so they were like, fuck it, just shut it off.
00:55:49.000 And then there was a guy who was on the lawn who was supposed to be, he had canines, was supposed to be guarding the lawn.
00:55:54.000 He took his earpiece out to talk to his girlfriend on the cell phone.
00:55:57.000 He had a backup walkie-talkie, because they have a backup one, but he left that in his locker.
00:56:02.000 So it's like all these steps, and this guy just hit the perfect sweet spot where he hopped the fence, ran the whatever hundred yards plus to get to the White House, got to the door, it was unlocked, got through it, there was a girl by herself, threw her to the ground, and just ran through the White House,
00:56:19.000 and was in there for five, ten minutes.
00:56:21.000 He had a knife, right?
00:56:23.000 Yeah, he had a knife.
00:56:23.000 He was basically trying to, it was basically a suicide-by-cop situation, because they had caught him before. And this is what's really hilarious, when people think that the government is watching out for you: they weren't even watching this guy.
00:56:33.000 And listen to what this guy did.
00:56:35.000 He got arrested.
00:56:35.000 They pulled him over, I believe it was less than three months before that, with 800 rounds of ammunition.
00:56:43.000 He had two rifles, a shotgun, an axe and a machete, and a map of Washington with a fucking X where the White House is.
00:56:54.000 And they let him go.
00:56:57.000 There has to be a crime there.
00:57:00.000 But they weren't even watching that guy.
00:57:02.000 Like, the idea that they're watching you...
00:57:04.000 But we're not built as species for perfect vigilance.
00:57:09.000 That's so not perfect.
00:57:11.000 I mean, that's so ridiculously imperfect.
00:57:13.000 I'm thinking about even the TSA. Right.
00:57:15.000 You know, it's just...
00:57:16.000 We're just...
00:57:19.000 We're just not built for that.
00:57:21.000 It's boring.
00:57:22.000 Twitter can confound even the perfect vigilance.
00:57:25.000 I mean, just look what happened at the Oscars.
00:57:28.000 Their only job is to get those envelopes right.
00:57:32.000 You've got two people with the locked briefcases, and one guy starts tweeting and produces the wrong envelope.
00:57:38.000 Is that what happened?
00:57:40.000 He was tweeting?
00:57:40.000 Yeah.
00:57:41.000 Yeah, he was tweeting.
00:57:42.000 I mean, this is brutal, but he took a photo of, I forget what actress, and clearly he was... I mean, he then deleted the photo, but people recovered it.
00:57:58.000 I don't know, Charlize Theron or somebody who was walking backstage and put it on his Twitter feed.
00:58:03.000 And then the very next moment was the moment when he had to hand over the right envelope.
00:58:10.000 And he just handed over the envelope for the best actress.
00:58:17.000 But their only job is to get this straight, right?
00:58:20.000 And they're just safeguarding the envelopes.
00:58:23.000 It's all adorable.
00:58:26.000 There's something I really enjoy about the folly of people.
00:58:30.000 That's one of the main things that I worry about with AI. The things that we find cute.
00:58:36.000 About us being ridiculous like that or like the sign language guy or any of these things.
00:58:42.000 But that's the thing that computers are really good at.
00:58:44.000 I mean, once computers get good at this sort of thing...
00:58:47.000 Yeah, but we're programming them.
00:58:49.000 No, but we can program them to fill in the gaps for...
00:58:56.000 Our own stupidity.
00:58:58.000 What we want to do is automate...
00:59:20.000 And getting captivated by some social glitch, right?
00:59:24.000 So if you have a bomb-screening robot, that's all it will do, and it will be better, whatever the visual signature of a bomb is, when you're talking about looking for one in a bag.
00:59:37.000 Once computers get better at that than people, they will be reliable in a way that people can never be.
00:59:45.000 We know that about ourselves.
00:59:47.000 We want to outsource all of that stuff.
00:59:50.000 Driving is the perfect example.
00:59:53.000 You don't have self-driving cars that are falling asleep or reading billboards while they're driving at 80 miles an hour.
01:00:00.000 And so we want that.
01:00:02.000 But the scary stuff is when it can change itself.
01:00:10.000 That's a principle that would allow it to escape from our control.
01:00:15.000 Or we build something where we haven't anticipated the consequences.
01:00:19.000 And the incentives are wrong. The incentives are not aligned to make us prudent in the development of that.
01:00:28.000 An arms race is the worst case for that, because the incentives are to get to the end zone as quickly as you can, because the guy next to you is doing the same, and you don't know whether he's ahead of you or behind you.
01:00:43.000 And it's a winner-take-all scenario, because the amount of wealth that will go to the winner of this race, if things work, if this group doesn't destroy the world, is unimaginable.
01:00:57.000 We're talking about a level of windfall profits that we just haven't seen in any other domain.
01:01:10.000 Well, a perfect example is the creation of the atomic bomb.
01:01:12.000 I mean, look at 70 years from the creation of the atomic bomb to today, which is literally a blip in human history, just a tiny little blink of an eye.
01:01:20.000 And then the United States emerges as the greatest superpower the world's ever known.
01:01:25.000 And that's going to be directly attributed to those bombs that were dropped on Hiroshima and Nagasaki.
01:01:29.000 I mean, that from there on, that's where everything takes off.
01:01:33.000 When you look at human history, if you look at us from a thousand years from now, it's very likely that they look at that moment and they go, this is the emergence of the American empire.
01:01:41.000 But this is so much bigger than that because the bombs, the power in the bomb was just...
01:01:49.000 Right.
01:02:03.000 Virtual reality entertainment.
01:02:05.000 We're talking about everything.
01:02:10.000 Once you're talking about general human intelligence and beyond, you're talking about suddenly having access to the smartest people who have ever lived, who never sleep, who never get tired,
01:02:26.000 who never get distracted, because now they're machines, and the smartest people who have never lived, right?
01:02:32.000 People who are a thousand times smarter than those people and getting smarter every hour, right?
01:02:37.000 And what are they going to do for you when they're your slaves?
01:02:41.000 Because now you're talking about...
01:02:42.000 You don't have to feed these people.
01:02:44.000 They're just working for you.
01:02:45.000 How powerful would you suddenly be if you had 10,000 of the smartest people in the world working for you full-time, they don't have to be fed, they never get disgruntled, they don't need contracts, and they just want to do whatever Joe Rogan wants to do and get done,
01:03:02.000 and you flip a switch and that's your company, right?
01:03:09.000 That's...
01:03:10.000 Some version of that's going to happen.
01:03:12.000 The question is, is it going to happen in a way where we get this massive dislocation in wealth inequality, where all of a sudden someone's got a company and products and a business model which obviates 30% of the American workforce over the span of three months?
01:03:33.000 Or you have some political and economic and ethical context in which we start sharing the wealth with everyone.
01:03:44.000 And this is, again, this is the best case scenario.
01:03:46.000 This is when we don't destroy ourselves inadvertently by producing something that's hostile to our interests.
01:03:52.000 Dan, you look terrified.
01:03:54.000 Yeah, no, this is what hanging out with Sam does.
01:03:56.000 I know.
01:03:57.000 Just listen, we get to a high spot, we get a pitcher of margaritas, and we just watch this whole thing.
01:04:03.000 We're not going to make it anyway.
01:04:04.000 No one in this room is going to live forever.
01:04:06.000 When I hang out with Sam, usually his wife's there, so it's like the conversation is a little less apocalyptic.
01:04:11.000 Yeah.
01:04:12.000 She keeps me grounded.
01:04:13.000 Yeah.
01:04:14.000 It's actually awesome because when she's there, she's like just tooling on him and he giggles and stuff like that.
01:04:20.000 It's like a completely different Sam than the guy who's being an undertaker here and telling you about how we're all going to die from AI. Yeah.
01:04:29.000 She keeps me honest.
01:04:31.000 But, I mean, the truth is it's not...
01:04:36.000 I mean, I'm cautiously...
01:04:38.000 I mean, there's no way to stop.
01:04:40.000 I mean, there's no brake to pull, right?
01:04:42.000 So it's like this is...
01:04:43.000 I mean, it's the inevitability you were describing before.
01:04:47.000 We're moving toward this thing because intelligence is the best thing in the world.
01:04:53.000 I mean, intelligence is the thing that allows us to solve every problem we have or don't know we have.
01:04:58.000 We're continually discovering new problems.
01:05:01.000 And the question is, what are we going to do?
01:05:04.000 Some global pandemic arrives and the challenge is, do you find a vaccine for this thing or not, right?
01:05:11.000 Only intelligence solves that problem for us.
01:05:14.000 Either we have the data and we can't interpret it, or we have to design the experiments to get the data.
01:05:24.000 Right.
01:05:41.000 We're not going to let China and North Korea and Singapore and Iran and Israel and all these other countries do it for us.
01:05:51.000 Find a high spot right above Denver.
01:05:54.000 Big bucket of margaritas.
01:05:56.000 Or the meditation that has helped both of us.
01:06:02.000 That definitely will help you while human beings are a real thing.
01:06:08.000 Yes, although I wonder if, you know, getting enough people meditating might improve the quality of whatever gets created in the AI community.
01:06:18.000 In other words, if you have people who are a little bit more sane as a consequence of meditation than maybe the people designing these products, rather these, I don't know if products is the right word, this stuff, then somehow this stuff is better.
01:06:32.000 Well, I think it's especially relevant for the other side of that, which is when you...
01:06:38.000 In the near term, we clearly have an employment problem.
01:06:43.000 There are jobs that will go away that are not coming back, and...
01:07:05.000 We'll get to a place where there's actually much less that has to be done because we have machines that do it and we have seen the wisdom of mitigating wealth inequality to some acceptable level where all boats begin to rise with this tide to some degree.
01:07:25.000 So whether it's universal basic income or some...
01:07:27.000 So we have some mechanism to spread the wealth around.
01:07:30.000 But then there's the real question of what do you do with your life?
01:07:33.000 Yeah.
01:07:33.000 So what do you think people will do?
01:07:35.000 That's the easiest one.
01:07:37.000 But it's not so easy because people...
01:07:39.000 Well, people are confused.
01:07:40.000 Well, those people need to get their shit together.
01:07:42.000 I mean, that's giving someone free time, but find a hobby, man.
01:07:46.000 I mean, there's a lot of stuff to do.
01:07:48.000 If you told me that I never had to work again and I would just have to find things to do all day and all my food and everything would be taken care of, that would be the easiest choice I'd have ever made in my life.
01:07:58.000 I would just pursue things.
01:07:59.000 I would just learn how to speak a language.
01:08:01.000 I'd learn how to play an instrument.
01:08:02.000 I'd practice archery more.
01:08:04.000 I'd do jujitsu more.
01:08:07.000 That seems to me so ridiculous.
01:08:08.000 That seems to me that could be solved really easy.
01:08:11.000 I feel the exact same way, but I think that I'm not convinced.
01:08:15.000 I feel I could fill endless swaths of free time with a number of things that I'm interested in, including, but not limited to, meditation or just hanging out with my two-year-old.
01:08:26.000 However, I'm not sure that the vast majority of humans are like that.
01:08:31.000 They just need guidance.
01:08:33.000 That seems to be the easiest thing to solve.
01:08:35.000 Just let them know.
01:08:36.000 Go find something fun to do.
01:08:38.000 Go start running up hills, man.
01:08:40.000 Go take up Frisbee golf.
01:08:43.000 There's shit to do.
01:08:45.000 There's a lot of shit to do.
01:08:46.000 My concern is that more and more it will take the form of being merely entertained.
01:08:52.000 So it will plug into some VR. The last scene of WALL-E. I never saw WALL-E. You see the movie WALL-E? No, I never watched it.
01:08:59.000 Okay, guys, you should maybe pull this up.
01:09:02.000 But in WALL-E, it's a dystopian future where everybody's riding around in those little Jazzy mobile wheelchairs that people sit in now.
01:09:15.000 Disneyland?
01:09:16.000 Yes.
01:09:17.000 And they've got huge buckets of soda and turkey legs kind of hooked over the back of their motorized vehicle, and a monitor at the front, which is entertaining them.
01:09:29.000 They're just obese and entertained and immobile.
01:09:33.000 Or mobile, but not actually ambulatory.
01:09:35.000 It's Disneyland.
01:09:36.000 Well, what's happened in our lifetime, the smartphone has made it virtually impossible to be bored.
01:09:45.000 Like, boredom used to be a thing.
01:09:47.000 Like, you'd be sitting in the waiting room of a doctor, right?
01:09:51.000 And they have crappy magazines, and then you're just sitting there.
01:09:54.000 And if you didn't know how to meditate, you had to confront this sense of, I'm bored, right?
01:10:01.000 Now, one thing you discover when you learn how to meditate is boredom is just...
01:10:06.000 an inability to pay attention to anything. And then, once you learn to pay attention to anything, even something as seemingly boring as your breath, it suddenly becomes incredibly interesting. So focused attention is intrinsically pleasurable, but boredom is a state of kind of scattered attention, looking for something that's worth paying attention to. And yet now, with technology, you're never going to be bored again.
01:10:32.000 I have at least 10 years worth of reading on my phone.
01:10:36.000 So it's like if I'm standing in line, I'm constantly pulling this thing out.
01:10:41.000 Is that a bad thing?
01:10:42.000 Well, it's potentially a bad thing because you...
01:10:47.000 I mean, just to take this example of one interesting insight you get when you learn to meditate...
01:10:54.000 It's incredibly powerful to cut through the illusion of boredom.
01:10:59.000 I mean, to realize that boredom is not something.
01:11:03.000 You can become interested in the feeling of boredom, and the moment you do, it...
01:11:15.000 I think the bad thing about the hyper-stimulation that we get through our phones and all of technology is that we have lost the ability to just sit back and, for lack of a less clichéd term, be.
01:11:31.000 And we're just constantly stimulated.
01:11:34.000 And that means we have trouble paying attention when we're holding our kid in our lap and reading him or her a book, or we find ourselves without our technology for a moment.
01:11:44.000 There was a recent study that asked people, would you rather be alone with your thoughts or get electric shocks?
01:11:51.000 And a lot of people took the electric shocks.
01:11:54.000 And I actually think that is a fundamental problem in terms of not being able to get in touch with the raw and kind of powerful, although obvious, fact that you're alive and that you exist.
01:12:06.000 Right, but don't you think those people are idiots?
01:12:09.000 I mean, you don't want an electric shock.
01:12:10.000 I won't take an electric shock.
01:12:12.000 You're not going to take an electric shock.
01:12:14.000 We're talking about children.
01:12:15.000 This is the dude with the Snapchat glasses.
01:12:18.000 Jamie's on the ball.
01:12:19.000 I don't want those glasses either.
01:12:20.000 But you know what I'm saying?
01:12:21.000 He doesn't use the glasses.
01:12:22.000 He just explores it because it's fascinating.
01:12:24.000 But you know what I'm talking about?
01:12:26.000 I mean, we're not talking about rational people.
01:12:28.000 We're talking about lowest common denominator people that would take an electric shock.
01:12:31.000 You would take an electric shock?
01:12:32.000 No, no, no.
01:12:33.000 I'm just thinking this is a study of...
01:12:35.000 They're not just picking idiots.
01:12:37.000 Right, but who are they asking?
01:12:38.000 And how are they phrasing the question?
01:12:39.000 But it also doesn't have to be conscious.
01:12:41.000 So, for instance, we willingly grant our attention to things which, retrospectively, we can judge produce more or less nothing but pain for us.
01:12:53.000 Like what?
01:12:55.000 I'm continually thinking about this and rethinking about this with respect to social media.
01:13:07.000 What's the effect on my mind of looking at my Twitter @mentions?
01:13:07.000 Well, I wondered that when you were asking people what kind of questions we should ask on this podcast.
01:13:12.000 Sometimes it's incredibly useful.
01:13:14.000 But I was thinking, what are you doing?
01:13:16.000 Yeah.
01:13:16.000 You're opening yourself up to the green frogs.
01:13:18.000 Yeah, yeah.
01:13:19.000 But I've had both kinds of experience.
01:13:23.000 I've had people send me articles that I would have never found otherwise.
01:13:26.000 They're fascinating, super useful, and it's just like, this is the perfect use of this technology.
01:13:31.000 And then again, then I get this river of green frogs and weirdness.
01:13:35.000 But to the previous point, I...
01:14:06.000 I mean, it doesn't equip me to do anything better in my life.
01:14:11.000 It doesn't make me feel any better about myself or other people.
01:14:14.000 I mean, if it has any net effect, it basically grabs a dozen dials that I can just sort of dimly conceive of in my mind and turn them all a little bit toward the negative.
01:14:25.000 You know, I feel a little bit worse about myself, a little bit worse about my career, a little bit worse about people, a little bit worse about the future, a little bit worse about the fact that I just was doing this when I could have been playing with my kid or writing or thinking productive thoughts or meditating or doing anything that I know is good.
01:14:43.000 So you still do this?
01:14:43.000 I don't understand your Twitter compulsion.
01:14:47.000 Yeah, you argue with people.
01:14:49.000 I found that fascinating.
01:14:51.000 Periodically?
01:14:53.000 I go for weeks without arguing with people.
01:14:56.000 Well, but also you argue with people on your podcast.
01:14:59.000 Well, yeah, but that's different.
01:15:17.000 And savage me on the podcast and then troll me endlessly on Twitter and...
01:15:21.000 You mean like Abby?
01:15:22.000 Abby Martin?
01:15:23.000 Well, she was...
01:15:23.000 No, she doesn't troll.
01:15:24.000 I mean, she...
01:15:24.000 She doesn't troll you.
01:15:25.000 She hates me, but she...
01:15:26.000 She doesn't know you.
01:15:27.000 I haven't noticed her trolling me, but no, you...
01:15:29.000 If you got in a room with her, everybody would calm down.
01:15:32.000 She just has...
01:15:33.000 There's people that have radical misconceptions of who you are.
01:15:35.000 Yeah.
01:15:36.000 I'm sure you heard the Josh Zeps, Patton Oswalt thing.
01:15:39.000 No.
01:15:39.000 Or Patton Oswalt.
01:15:40.000 Don't listen to it.
01:15:41.000 Okay.
01:15:42.000 Okay.
01:15:43.000 He went off the rails.
01:15:44.000 Patton went off the rails like Hannibal style.
01:15:48.000 Are you thinking of Andy Kindler?
01:15:50.000 Oh yes, I am.
01:15:50.000 I'm sorry.
01:15:51.000 Did I say Patton Oswalt?
01:15:52.000 I'm so sorry, Patton.
01:15:54.000 So sorry.
01:15:54.000 How did I connect Patton Oswalt?
01:15:56.000 Yeah, Patton's great.
01:15:57.000 Well, Patton's very smart and very reasonable.
01:16:04.000 As is Andy most of the time, but Andy's kind of crazier.
01:16:04.000 I discovered this very late.
01:16:05.000 So I kept seeing this guy.
01:16:07.000 I didn't know who he was.
01:16:08.000 I kept seeing him in my Twitter feed.
01:16:09.000 Andy Kindler?
01:16:10.000 Andy Kindler.
01:16:11.000 And then I realized, wait a minute.
01:16:12.000 He's an established comic.
01:16:15.000 He seems to be friends with people who I really respect, who I don't know, like Sarah Silverman, who I don't know personally.
01:16:24.000 We've communicated a little bit on Twitter, but I'm just nothing but a pure fan of Sarah.
01:16:31.000 Jim Gaffigan, other big comics who I totally respect.
01:16:34.000 I don't know how close he is with these people, but he basically has endless energy for vilifying me as a racist and as a bigot.
01:16:42.000 He's a madman.
01:16:44.000 So I did a search of his Twitter feed, and he's got hundreds and hundreds of tweets where he's...
01:16:59.000 Yes.
01:17:11.000 Talking to Duncan, saying, don't help him, Duncan.
01:17:13.000 He's a bigot.
01:17:15.000 Andy Kindler's just got endless energy for this.
01:17:18.000 And then I looked at what he was doing, and he had been doing it for years.
01:17:22.000 And I wasn't aware of it.
01:17:24.000 Just to you?
01:17:24.000 Just to me and Bill Maher.
01:17:26.000 He hates Bill Maher.
01:17:27.000 But then you've got another guy on this podcast...
01:17:32.000 Hunter Matz?
01:17:33.000 I had him on once.
01:17:34.000 Okay, so he was on here, and I actually had to go back and watch what he said here because I had been getting so much of this on Twitter.
01:17:40.000 And I mean, it's incredibly common.
01:17:42.000 People are tweeting at me saying, why won't you debate Hunter?
01:17:45.000 It's Matz, right?
01:17:46.000 Yeah.
01:17:47.000 Why won't you debate Hunter Matz?
01:17:48.000 So I went back and looked at what he said here.
01:17:50.000 Half of it, frankly, didn't make any sense.
01:17:53.000 But his attacks on me on Twitter are the most juvenile.
01:17:59.000 It's like the idea that he thinks this is a way he's going to establish a conversation with me by sending me two tweets and then sending me 400 which say, you're scared to debate me, right?
01:18:11.000 It's crazy behavior.
01:18:12.000 I think Hunter's on the spectrum.
01:18:14.000 Very, very, very, very smart guy, but taking a terrible, socially retarded approach to establishing a debate.
01:18:22.000 And I've contacted Brian Callen about this, and Brian Callen contacted Hunter about this, and was like, what the fuck are you doing?
01:18:27.000 And he continues to do it.
01:18:28.000 It's crazy behavior.
01:18:29.000 He said he wouldn't do it anymore, and then he did more.
01:18:32.000 Every time I look, I see him somewhere in there.
01:18:35.000 But then there's this guy, Mike Cernovich, who actually has...
01:18:55.000 I'm just talking about your psyche.
01:19:02.000 Oh yeah, well...
01:19:04.000 For sure, right?
01:19:05.000 Yeah, so they could come back.
01:19:07.000 It's not...
01:19:09.000 Nine times out of ten, looking just makes me think...
01:19:13.000 I mean, it's an illusion, because if you met most of these people, if they came up to you at a conference or at a book signing or after a gig of yours...
01:19:22.000 And you had more information about them.
01:19:25.000 You saw all of the crazy coming out.
01:19:30.000 You would say, there's no reason to pay attention to what this guy is saying.
01:19:36.000 Whereas on Twitter, everything has the same stature.
01:19:39.000 So whether it's a Washington Post columnist who's tweeting at me, or some guy like Hunter, who I have no idea who he is, but he's telling me I got something wrong...
01:19:50.000 Everything has the same stature, and there's no signal-to-noise sense of what...
01:19:55.000 Well, first of all, you fucked up, because you talked about them.
01:20:00.000 Both of them.
01:20:00.000 You said Candyman five times, and now you've got a problem.
01:20:03.000 You talked about him right now on this podcast.
01:20:05.000 This is the first time I've ever mentioned the guy.
01:20:07.000 Should have talked to you before this.
01:20:09.000 I'm bringing it to you, but you created it.
01:20:12.000 I didn't do it.
01:20:13.000 It was his podcast with you that kicked this whole thing off.
01:20:16.000 I certainly didn't think that Hunter was going to do that, and I know for a fact that Hunter's actually a fan of yours.
01:20:21.000 He's got a strange way of showing it.
01:20:23.000 I don't think it's a smart way.
01:20:24.000 I think what he's trying to do...
01:20:26.000 And when I was trying to hold his feet to the fire and get some sort of a logical definition of what you do wrong, he really didn't have anything.
01:20:36.000 Yeah, but he thinks he does, and he...
01:20:38.000 But did he when I talked to him about it?
01:20:40.000 No.
01:20:41.000 And half of what he said about Dawkins and me was totally wrong.
01:20:45.000 Half of what he said about the relevant biology was wrong.
01:20:48.000 I mean, he's just...
01:20:48.000 He's not...
01:20:50.000 He doesn't have his shit together.
01:20:52.000 But he thinks he does.
01:20:54.000 And there's a level of arrogance and incivility and just kind of a lack of charity in interacting with other people's views, which is now kind of getting rebranded on the Internet as just...
01:21:14.000 Just American can-do chutzpah, right?
01:21:17.000 And it's like it's given us Trump, right?
01:21:19.000 Trump is the ultimate example.
01:21:20.000 He's like the cartoon version of, you know, a person who doesn't know anything relevant to the enterprise, who doesn't show any aptitude for civilly engaging with differences of opinion.
01:21:37.000 And this thing gets, you know, amplified to the place of greatest prominence now in human history.
01:21:44.000 Everyone's on social media, or many people on social media are playing the same game.
01:21:49.000 And, you know, Mike Cernovich is another, you know, just malignant example of this, where you have someone who's got a fairly large following.
01:21:57.000 I mean, it's not as big as yours, but it's, you know, it's a very engaged following.
01:22:02.000 I mean, this whole Trump thing has shown me that a small percentage of one's audience can have like a hundred times the energy of the rest of your audience.
01:22:14.000 Like whenever I went against Trump on my podcast, or this is still the case, the level of pain it causes in the feedback space is completely out of proportion to the numbers of people who are...
01:22:54.000 Well, there's a big issue, I think, online where people find like-minded people and then they develop these communities.
01:23:05.000 Where they just support each other, and they just have these gigantic groups of people.
01:23:11.000 I mean, not even necessarily gigantic, but groups of people that find any subject.
01:23:17.000 Like, for me, you know what it is?
01:23:18.000 Flat Earth.
01:23:20.000 Right.
01:23:20.000 I get trolled by people all day that claim I'm a sellout because I don't believe the Earth is flat.
01:23:26.000 This is real.
01:23:28.000 But don't you think there's a...
01:23:31.000 I think this phenomenon that you're describing both has really serious negative consequences and also has some beauty, by the same token.
01:23:40.000 You've got parents all over the world who've got children with rare disease, but they can connect on the internet and bond over that and share tips and doctors and all that stuff.
01:23:51.000 So it is, it's actually, they're both outgrowths of the same kind of phenomenon, but it can be, we see The really difficult consequences of this in our politics right now, etc., etc.
01:24:04.000 I think we're talking about two different things, though.
01:24:06.000 You think so?
01:24:06.000 I'm talking about confirmation bias.
01:24:07.000 I'm talking about a bunch of people that get together and say, yeah, obviously, I'm woke, and the Earth is flat, and pay attention, there's an Arctic, there's an ice wall.
01:24:16.000 Yes, but they're different things.
01:24:18.000 The groups of people that will find, you know, communities where maybe your child has autism, and there's some sort of an issue that can be mitigated with diet, and parents have had, you know, some success with that, and they could, you know, give you some enlightening information, and you can communicate with each other, and that's nice.
01:24:34.000 It's beautiful.
01:24:35.000 And some of the hashtags that people use, that they find each other through, it's great.
01:24:41.000 But these little communities that you bond in, there's no confirmation bias in those ideas.
01:24:49.000 But there's confirmation bias in the idea that Trump is the man.
01:24:52.000 There's confirmation bias in the idea that the earth is flat.
01:24:55.000 And if you just huddle in those little communities and just bark the same noises that everybody else barks, there's some sort of sense of community in that too.
01:25:03.000 Yes, absolutely.
01:25:04.000 And people love that.
01:25:05.000 They love being a part of a fucking team, even if it's a stupid team of green frogs.
01:25:09.000 And you don't have to listen to other people's views, and you get deeply, deeply entrenched.
01:25:15.000 I think we're seeing this all through our politics and media right now.
01:25:20.000 Well, you also see when you go to those people's pages, which I do often, I don't engage with people in a negative way online very often, very, very rarely.
01:25:28.000 Now, do you not look at your @mentions?
01:25:29.000 I do a little bit, but you know what, man?
01:25:31.000 I just like to just go on about my day.
01:25:34.000 I've found that the negative consequences that you're discussing...
01:25:40.000 It's rare that I go down the rabbit hole, and I go for days without looking at @mentions.
01:25:44.000 It's more that if I publish something, if I ask for feedback, I want to see the feedback.
01:25:49.000 It's not necessarily that there's anything wrong.
01:25:52.000 I mean, just as your friend, I feel like there's nothing wrong with it.
01:25:54.000 I look at my @mentions, but...
01:25:57.000 But you function in a very different space.
01:25:58.000 I mean, you're not pushing controversial stuff out there.
01:26:01.000 No, definitely not.
01:26:02.000 Definitely not.
01:26:02.000 So I agree.
01:26:03.000 I stay offline.
01:26:04.000 I'm sorry, go ahead.
01:26:05.000 I was just more worried about, again, back to your psyche.
01:26:09.000 It feels like you need a little bit of a middle ground.
01:26:13.000 I stay offline after UFCs, for the most part.
01:26:17.000 That's when I get the most crazy people.
01:26:19.000 Especially if there's any sort of a controversial decision.
01:26:22.000 But are they criticizing you for something you said?
01:26:24.000 They get fucking mad at my commentary.
01:26:25.000 They'll just disagree with who I thought won.
01:26:28.000 And then they're so fucking vicious about it.
01:26:30.000 I'm just like, just yell.
01:26:31.000 Just yell in your own space.
01:26:33.000 So you don't want to see any of that.
01:26:34.000 There's too many people.
01:26:35.000 You're dealing with millions and millions of people.
01:26:37.000 And who knows how many of them are rational.
01:26:39.000 You know, there's a bit that I had in one of my specials where I was talking about the number of people that are stupid in the world.
01:26:45.000 Like, if you get a room full of a hundred people, the idea that at least one of them is a complete fucking idiot... Of course.
01:26:53.000 You're being very charitable.
01:26:54.000 Right, I'm being charitable.
01:26:55.000 More than one, yeah.
01:26:55.000 But if you do that math on the 300-million-plus people in the United States, that's 3 million fucking idiots.
01:27:01.000 Yeah.
01:27:02.000 So if you have three million fucking idiots in your @mentions, if you look at your @mentions and three million comments are saying you're a fucking moron, there's just too many people.
01:27:13.000 The numbers are, they're not manageable.
01:27:16.000 The numbers of human beings you interact with online are not manageable.
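(The arithmetic behind the bit is just one percent of the rough US population figure used above. A trivial check, purely illustrative:)

```python
population = 300_000_000   # the rough US figure used in the bit
idiot_rate = 1 / 100       # the "charitable" one-idiot-per-hundred estimate
print(f"{int(population * idiot_rate):,}")  # 3,000,000
```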
01:27:19.000 So anytime anything gets negative or insulting and...
01:27:23.000 I just check out.
01:27:24.000 I just, next, next, next.
01:27:26.000 I don't pay attention.
01:27:27.000 Because you can't.
01:27:28.000 But if I fucked up, and I know I fucked up, I think one of the most important things that I do is I admit that I fucked up.
01:27:35.000 And I talk about it, and I apologize, and I say, look...
01:27:38.000 I'm flawed.
01:27:39.000 I'm human.
01:27:39.000 I made a mistake.
01:27:40.000 Sorry.
01:27:41.000 And then just step away.
01:27:42.000 Let the fucking chaos ensue in the comments.
01:27:45.000 Let all the people call you a shill and a whatever.
01:27:47.000 Let all that happen, but don't let it in.
01:27:50.000 You're letting it in, and then you stew on it, and you have to bring up Hunter and all these other people.
01:27:58.000 You're letting them way too deep.
01:27:59.000 No, no, no.
01:28:01.000 Perhaps I gave a false impression of how under my skin this has gotten.
01:28:05.000 Just the fact that you talked about it at all, though, is too much.
01:28:08.000 Well, no, but I think it's a huge consequence.
01:28:10.000 I think it's given us Trump.
01:28:12.000 Yes.
01:28:16.000 This style of communication is attractive to so many people.
01:28:20.000 The fact that you can—the bluff and bluster and empty boasts and lies—being caught in lies without consequence.
01:28:28.000 Right.
01:28:29.000 The fact that people never admit mistakes.
01:28:32.000 I mean, what you just described, you do and which I do.
01:28:36.000 You make a mistake.
01:28:37.000 You want to hear about it.
01:28:38.000 You want to correct it as quickly as possible, right?
01:28:41.000 Right.
01:28:41.000 That is the antithesis of what we're seeing now in this space.
01:28:44.000 I mean, when has Trump admitted a mistake?
01:28:47.000 But isn't it the difference being that neither you nor I have any desire to control anything?
01:28:52.000 Like, I don't want to be the leader of anything.
01:28:55.000 Well, no, but you want effective, you want good ideas to win.
01:28:58.000 Sure.
01:28:58.000 And you want the truth to win.
01:28:59.000 You want the truth to propagate.
01:29:01.000 You want facts to propagate.
01:29:02.000 You want to be able to correct errors, whether they're your own or others, especially when they're consequential.
01:29:07.000 I think that we need a common ethic where lying has real consequences.
01:29:14.000 So many people are trying to figure out what's the antidote to fake news.
01:29:18.000 Well, one antidote is that being caught lying has to be devastating for your career, right?
01:29:27.000 Like politically, journalistically, academically, as a public intellectual.
01:29:33.000 I mean, to be caught in a whopping lie will require, at minimum, some serious atonement.
01:29:42.000 And historically, it has been.
01:29:43.000 Absolutely.
01:29:44.000 It still is in many spheres.
01:29:46.000 We have slipped those rails just to a degree that I never thought possible.
01:29:50.000 We're going light speed right into the woods.
01:29:52.000 Yeah.
01:29:53.000 I mean, we're talking about people have figured out how to just...
01:30:15.000 Which is to say, just go fuck yourself, just redounds to your credibility among your tribe, right?
01:30:23.000 It's like, this guy is so powerful, he so fully doesn't give a shit what people think, that he can catch him in a lie and just watch how he gets out of it.
01:30:34.000 So when I was bringing up these guys like Cernovich, I mean, this is...
01:30:39.000 Actually, I decided to troll Cernovich one day, and I thought it was hilarious.
01:30:44.000 It was just nothing but fun.
01:30:45.000 So there was nothing toxic about that.
01:30:48.000 He's obsessed!
01:30:49.000 But the thing that bothers me is that this has real political consequences.
01:30:55.000 Do you think this is a time period where we're in this sort of adolescent stage of communication online where you can get away with saying things that are dishonest and that there might be some sort of a way to mitigate that in the future?
01:31:09.000 I don't think we should act like dishonesty and bluff and bluster, to use the phrase you used before, is somehow new to the human repertoire.
01:31:16.000 I get that. The acceptance of it seems to be, though.
01:31:19.000 I think we're in a period where that is true, and I think it is aided and abetted by technology and the social networks.
01:31:28.000 I agree with your diagnosis on many levels, but I was having an interesting conversation with a guy that you introduced me to, Joseph Goldstein, who's an eminent meditation teacher, who has become my meditation teacher, an old friend of Sam's, and we were talking about the current political situation.
01:31:45.000 He used a phrase that I liked when I asked him what he thought about it.
01:31:48.000 He said, I'm kind of slotting into geological time.
01:31:51.000 And I think that actually makes some sense.
01:31:54.000 What does that mean?
01:31:55.000 Meaning that I'm just viewing it from a...
01:32:00.000 I'm widening the lens to look at the broad scope of human history to see that, you know, over time, we've got these ups and downs.
01:32:11.000 He's getting to the top of the mountain near Denver with a big bucket of margaritas.
01:32:14.000 That's exactly right.
01:32:15.000 That's exactly right.
01:32:17.000 They're slotting in, they're slotting in.
01:32:19.000 Just look down with binoculars, watch the bombs go off, learn how to get water out of the ground.
01:32:26.000 Get a solar power generator.
01:32:29.000 And make good margaritas.
01:32:30.000 Yeah.
01:32:31.000 I mean, it just doesn't seem to me that this is sustainable.
01:32:34.000 It feels like this is just going to be some spectacular moment in history where people were rah-rah Nixon, and now they look back and go, my God, Nixon was a fucking liar and a buffoon.
01:32:43.000 But take this, this is a point I've made before, and I don't think it's original with me, I think other people have made it, but my claim is that if Trump were one-tenth as bad, he would appear much worse.
01:32:57.000 Because everything he does now is appearing against a background of so many lies and so much craziness that you can barely even weight its value.
01:33:06.000 This is one of the things I find useful about Twitter, because I follow some very interesting people.
01:33:13.000 Anne Applebaum, the Washington Post columnist.
01:33:16.000 Who's just awesome on Twitter.
01:33:18.000 Everyone should follow her.
01:33:19.000 She just keeps hammering Trump with her own points and other stuff that she finds.
01:33:24.000 And she just pointed out that...
01:33:28.000 Did anyone notice that Trump threatened a war with North Korea two days ago?
01:33:36.000 It was in the Financial Times.
01:33:39.000 And yet no one can talk about it because no one believes him.
01:33:43.000 We have a president whose speech has now become so denuded of truth value, perceived truth value, that he can say, if China doesn't handle North Korea,
01:33:59.000 we're going to.
01:34:00.000 And no one even feels like they have to ask a follow-up question on that topic because everyone assumes it's an empty bluff.
01:34:10.000 I mean, just imagine, like, just step back into the previous presidency.
01:34:14.000 If Obama had said, if China doesn't handle North Korea, we will, right?
01:34:20.000 That would be top of the fold.
01:34:23.000 This is all we're talking about today, right?
01:34:26.000 Yeah.
01:34:26.000 It just comes out of a blizzard of inanity and craziness.
01:34:32.000 He's going after Meryl Streep.
01:34:34.000 He's lying about Obama wiretapping him.
01:34:37.000 Now he's threatening war with North Korea.
01:34:39.000 And nobody knows what to talk about.
01:34:42.000 So it's like the consequence of this is we have a president who...
01:34:48.000 Not only can he not be trusted to tell the truth, he can be trusted to lie whenever he thinks it suits his purpose.
01:34:56.000 And now, so the state of mind that everyone's in, including the press, in listening to him, is just to take potentially the most serious things in the world not seriously.
01:35:09.000 And the least serious things in the world, like Meryl Streep or what he thinks of her acting, that dominates a whole news cycle.
01:35:18.000 It's very upside down.
01:35:20.000 It seems to me to be quite new.
01:35:22.000 Lies are perennial, but I feel like we're in a very different space now with the consequences of this.
01:35:31.000 We certainly are.
01:35:32.000 Do you think that they're connected to what we were talking about before where you said that people would rather be electrocuted than to be alone with their thoughts?
01:35:39.000 That we have gotten to this weird place with our society, with our civilization, where we've made things so easy?
01:35:46.000 We've made people so soft, so dependent upon technology.
01:35:50.000 We've slotted out these paths, these predetermined patterns of behavior for people to follow, where they can just sort of plug into a philosophy, whether it's a right-wing one or a left-wing one with very little room for personal thought at all, very little room for objective reasoning.
01:36:07.000 We sort of made it easy.
01:36:09.000 We babied them.
01:36:09.000 I do think that it's imperative if you want to be a good citizen to have a varied media diet.
01:36:16.000 You're not going to have a clear view of the world if all you're reading is Breitbart.
01:36:23.000 Or the New York Times.
01:36:24.000 Right.
01:36:24.000 You know, I have nothing against the New York Times or Breitbart, but I think you need to read many things and follow many different sorts of people on Twitter, not just because you want to troll them, but because you actually want to listen to what they have to say and take it seriously.
01:36:35.000 Well, the New York Times really fucked up.
01:36:37.000 Where they really fucked up is where they said that they're going to, after the election, they're going to rededicate themselves to reporting the truth.
01:36:45.000 And, like, what?
01:36:46.000 Why did you say that?
01:36:48.000 Like, I wish I was there.
01:36:49.000 I wish I was in the office.
01:36:50.000 Why?
01:36:51.000 That just sends the wrong message?
01:36:53.000 Yes!
01:36:54.000 Well, it says they're biased.
01:36:56.000 Yeah.
01:36:56.000 They fucked up.
01:36:58.000 They had an idea.
01:36:59.000 The truth is they were biased.
01:37:00.000 Yes, you're right.
01:37:01.000 The thing is, the enemy was so grotesque in this case that not to have been biased seemed an abdication of responsibility.
01:37:14.000 I feel it myself.
01:37:15.000 Everything I say against Trump from a Trump person sounds like mere partisan bias.
01:37:22.000 I've got zero connection to the Democratic Party.
01:37:28.000 There's no partisan bias.
01:37:30.000 100% of what I want to say about Trump does not apply to some other Republican who just stands for policies I might not like.
01:37:42.000 It's a completely unique circumstance.
01:37:48.000 Yeah, it's true that when you went to read the New York Times, for the longest time it was reading like the entire thing had become the opinion page on the Huffington Post or something.
01:38:00.000 Yeah.
01:38:01.000 I just feel like at this stage of our society, there's real consequences to the infantilization, if that's actually a word, of human beings in our culture.
01:38:12.000 We've made it very easy to just go to work and just get home and watch television and just not pay attention to anything and not read anything and not really think and then be manipulated.
01:38:25.000 I mean, I think it's incredibly easy to manipulate people, especially people that are aware that they don't have a varied media diet.
01:38:33.000 People that are aware that they don't have a real sense of the world.
01:38:36.000 And it seems daunting to try to take into consideration, like, what is involved in foreign policy?
01:38:41.000 What is involved in dealing with Russia?
01:38:43.000 What is involved?
01:38:44.000 How do you negotiate with North Korea?
01:38:47.000 Fuck, it's too much.
01:38:48.000 Put it in the hands of the strong man.
01:38:49.000 I think this is true on both sides of the spectrum, though, because I think you've got folks who slot into just a media diet where they're just hearing things on the left and they're not curious about or, I guess,
01:39:07.000 just not curious enough to hear...
01:39:11.000 Right.
01:39:32.000 Yeah, I mean, I think the consequences have never been greater for that.
01:39:34.000 Absolutely.
01:39:35.000 And I think the reason that so many people on the right, so many Trump supporters, feel like they're right is because it has been proven that the media was biased and that they did get it all wrong.
01:39:48.000 And they were absolutely wrong when it came to who was going to win.
01:39:51.000 I mean, Huffington Post had some ridiculous thing where it was the night of the election.
01:39:55.000 They said that Hillary had like a 98% chance of winning or something crazy like that.
01:40:00.000 Well, I think, yeah, there were some polls that were bad.
01:40:02.000 But the poll, like the...
01:40:04.000 Because I remember this because I sent out a tweet which said, like, you know, bye-bye, Donald, or something like that, you know, the day of.
01:40:12.000 But when I did that, I mean, that wasn't a prediction.
01:40:15.000 I mean, the polls that I was going by, that most people were going by at that point, it was like 80-20, you know, or at best 75-25 that she was going to win.
01:40:25.000 Now, that's not...
01:40:25.000 I mean, you roll dice for a few minutes, you realize a 20% chance...
01:40:30.000 It comes up a lot, right?
01:40:31.000 So I guess that's not infinitesimal odds.
01:40:34.000 Plus Florida.
01:40:35.000 Yeah, but so, well, you should tell the story about what it was like to anchor the broadcast.
01:40:40.000 Yeah, so I was anchoring the ABC News digital coverage that night, and they give you the exit polling.
01:40:48.000 You're not supposed to report it publicly, but the exit polling that we were seeing before we went on the air late in the day really made it seem like it was going to be a Clinton landslide.
01:40:57.000 You have all these folks who say, look at the crestfallen faces of the journalists because they're so upset that Trump won.
01:41:04.000 That was not the case for the folks on my set.
01:41:07.000 It was that we didn't see it coming.
01:41:10.000 We weren't prepared for it.
01:41:11.000 Everything that we're seeing in terms of the math made it look like this was a Clinton victory, a shoo-in.
01:41:16.000 It was just about just tying a ribbon around it.
01:41:18.000 So when the night became long, there was just confusion about what was going on.
01:41:24.000 How did they get it so wrong?
01:41:25.000 You know, I think it actually goes back to what Sam was saying before, that people think when you see numbers like 70% odds that Clinton's going to win, 80% odds that Clinton's going to win, that she's definitely going to win.
01:41:39.000 But there's room there for Trump to win.
01:41:44.000 A lot of room.
01:41:44.000 Yeah, a lot of room there for Trump to win.
01:41:46.000 20% comes up all the time.
01:41:47.000 It's Russian roulette.
01:41:48.000 Yeah, those are bad odds.
01:41:50.000 If it's Russian roulette, those are bad odds, right?
01:41:52.000 You're not going to take that, you know, not going to put a single bullet in a five-chamber gun.
01:41:56.000 One bullet, six rounds, yeah, spin it.
01:41:58.000 That's a spear hunter.
01:41:59.000 That's really good odds that you're going to get shot.
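(The odds they're batting around are easy to make concrete. A quick simulation, purely illustrative and not from the episode: an 80-20 favorite still loses about one time in five, which is a bigger chance than the one-in-six Russian-roulette pull just mentioned.)

```python
import random

random.seed(0)
trials = 100_000

# How often does the 20% side of an 80-20 race come up?
upsets = sum(random.random() < 0.20 for _ in range(trials))
print(f"20% underdog wins {upsets / trials:.1%} of the time")    # ~20.0%

# One bullet in a six-chamber revolver, for comparison.
print(f"one bullet, six chambers: {1 / 6:.1%} chance per pull")  # ~16.7%
```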
01:42:01.000 I don't think it's so much about blaming the polls as it was blaming the overall tenor of the coverage, which made it seem like Clinton was inevitable.
01:42:10.000 Yeah, it was so shocking.
01:42:13.000 That's a hit I think that we can and should take.
01:42:17.000 We definitely, you know, I think we weren't giving the 20 or 30% chance a serious enough look.
01:42:25.000 What is your thought, as being someone who covers these things, what is your thought about the Electoral College?
01:42:30.000 Do you think that that's an antiquated idea?
01:42:33.000 I mean, it was kind of established back when you really needed a representative, because otherwise you would have to get on a fucking horse and ride into Washington, and it would take six months.
01:42:41.000 I can see that you can make very powerful arguments that it's a deeply problematic institution.
01:42:47.000 I can see the power of those arguments, for sure.
01:42:52.000 There are people who argue, make similar arguments about the United States Senate.
01:42:57.000 Yeah.
01:42:59.000 There was a piece that ran in the New York Times in their Sunday Week in Review, not long after the election, making the case that the angriest people in America actually should be those who live on the coast, because it's taxation without representation, that the people who live on the coast are paying more in taxes,
01:43:17.000 but they have less representation actually in Washington.
01:43:22.000 Again, that's not research that I've done, but it's an interesting idea.
01:43:26.000 Have you ever seen anybody present any sort of a logical argument that there really shouldn't be a president anymore?
01:43:32.000 That the idea of having one alpha chimp run this whole thing seems pretty outdated.
01:43:38.000 I'm just not sure that he runs the whole thing.
01:43:40.000 But he's got a lot of influence.
01:43:42.000 He has an enormous amount of influence.
01:43:43.000 But we're seeing, Donald Trump is seeing right now, the limits of presidential power.
01:43:49.000 He is.
01:43:50.000 He couldn't get his...
01:43:51.000 The health care bill, which didn't even make it, was in some ways not even close to what he promised on the campaign trail.
01:44:01.000 In other words, he couldn't get the bill that he wanted, and then he couldn't get that passed.
01:44:06.000 And now he's looking at having to watch his party employ the nuclear option in order to get his Supreme Court nominee seated.
01:44:16.000 So I – I don't know.
01:44:18.000 I think that the founders designed in many ways a really ingenious system.
01:44:23.000 And we put a lot of attention on the president because it's one person who's on our TV screens or our phones all the time.
01:44:31.000 But I'm not sure how much power is vested in that person.
01:44:34.000 Now, when it comes to foreign policy, It's a different kettle of fish.
01:44:38.000 Well, it's enough that the EPA has been sort of hobbled.
01:44:42.000 I mean, what they've done with the Environmental Protection Agency standards, especially in regards to emissions, he's rolled back emission standards.
01:44:50.000 I mean, if there's anything that we should be concerned about, it's the air that we breathe.
01:44:54.000 And we're moving in a direction.
01:44:56.000 We're clearly moving in a direction to get away from things like coal.
01:44:59.000 And he's going the opposite way.
01:45:01.000 Not only that, but what I've heard is that that's not even going to be effective.
01:45:06.000 Because most places have moved away from coal to the point where restarting coal production is not even going to recharge the economy in the way that would make it a viable option in the first place.
01:45:20.000 The issue of climate change is just...
01:45:24.000 I'll say, as a member of the media, an area where I feel, and I'm just speaking for myself here, really one of our biggest failures.
01:45:32.000 And I don't think history is going to judge us kindly.
01:45:35.000 And again, I'll put the blame on myself.
01:45:38.000 It's a hard story to get people just that interested in.
01:45:43.000 And especially for television, because it's a lot of sort of graphs and science, and there's only so many pictures of polar bears you can show.
01:45:54.000 And so I anguish about that because I do think there isn't a debate.
01:46:01.000 Climate change is real, almost certainly caused by humans.
01:46:07.000 And for too long we fell for that in the media where we presented it as a debate when it wasn't.
01:46:13.000 Now I think we're past that, but I still don't think we're covering it enough and as robustly as we should.
01:46:18.000 Well, climate change, I think, is a sort of almost abstract to people.
01:46:23.000 It's very difficult for them to wrap their head around, especially when they look at the ice core samples.
01:46:27.000 There's plenty of stuff online where you could sort of convince yourself that there's always been this rise and fall of the temperature on the earth, and in many ways that is true.
01:46:38.000 But pollution is another thing.
01:46:40.000 I think, actually, I just walked in, so I might have missed what you said there, but I think that's a crucial shift of emphasis because there is no...
01:47:08.000 I mean, just imagine if we had no pollution coming from the exhausts of all the cars out there, and no coal-fired power plants.
01:47:17.000 We just had solar and wind and safe nuclear technology powering the grid.
01:47:24.000 It would be fantastic from just a pure...
01:47:26.000 I mean, forget about the health.
01:47:28.000 Obviously, lung cancer and cardiovascular disease is a huge issue there.
01:47:38.000 Just aesthetically, it's so desirable.
01:47:40.000 I mean, there's no argument against it.
01:47:42.000 And I feel like people don't make that connection very much.
01:47:46.000 Because climate change is slow moving, you know, and it could be way out in the future before...
01:47:51.000 But pollution isn't.
01:47:52.000 I know, I know.
01:47:53.000 But pollution is much less controversial than climate change.
01:47:56.000 Well, that's why this coal thing is so disturbing.
01:47:59.000 Yeah.
01:47:59.000 You know, reigniting this production of coal.
01:48:02.000 Yeah, it's unbelievable.
01:48:03.000 It's ridiculous.
01:48:04.000 Yeah, I guess I have questions about whether it's even going to happen.
01:48:08.000 In other words, you can roll it back, but the coal industry, there are many, many factors to their decision-making.
01:48:18.000 Right.
01:48:18.000 For example, if they go and do the mining, is what they mine going to actually even be consumed?
01:48:26.000 Right.
01:48:27.000 And I think that even though they're making some pretty radical noises at the EPA, I'm not sure how far Pruitt can take some of this stuff, given the existing body of law, case law, that has formed around Obama's decisions.
01:48:43.000 So it actually gets more complicated the closer you look at it, from what I can tell.
01:48:50.000 And I won't claim to have studied it too, too closely.
01:48:52.000 But it seems to get more complicated the more you look at it. The headlines may be scarier, I guess, is my point.
01:48:59.000 Well, what's pretty clear, though, is emission standards.
01:49:01.000 Rolling back emission standards sends a very clear message that it's okay to pollute the air.
01:49:05.000 I mean, we were moving in a direction of going towards electric cars, going towards cars that pollute the environment less.
01:49:12.000 I mean...
01:49:13.000 Even cars that use gas, you know... Porsche has a 911 Turbo, and the emissions standards are so strong that what they've developed is a car that, when you drive it through LA, the exhaust that comes out is cleaner than the air it sucks in.
01:49:29.000 Hmm.
01:49:31.000 Imagine that.
01:49:32.000 So you're driving through...
01:49:33.000 It's an air filter.
01:49:34.000 Yeah, it literally is.
01:49:35.000 It's a bad air filter.
01:49:35.000 But it can be done.
01:49:36.000 Very expensive air filter.
01:49:37.000 I mean, it can be done.
01:49:38.000 I mean, that's...
01:49:40.000 I mean, and we can move further and further away.
01:49:42.000 I mean, obviously, the problem with that is it's taking in polluted air.
01:49:46.000 You know, that's the issue in the first place.
01:49:47.000 I mean, it's not clean.
01:49:48.000 You don't want to breathe the exhaust of a Porsche.
01:49:51.000 But...
01:49:52.000 What they've done is managed to make something so efficient that it actually does emit clean air coming out of it, or cleaner than the polluted air that it's sucking in.
01:50:02.000 Now, if these Environmental Protection Agency standards keep getting rolled back, I mean, we're going to go back to...
01:50:09.000 I mean, I don't know how far they're rolling it back, but...
01:50:12.000 What you said is so clear.
01:50:14.000 There's nothing good about polluting the air.
01:50:17.000 It's what we need to breathe.
01:50:19.000 And there's options.
01:50:20.000 The idea that business should take precedence over the actual environment that we need to sustain ourselves...
01:50:27.000 So let's not forget there are real human beings in coal country who have spent generations working in this industry, take great pride in it, and we've got to think about what we do.
01:50:41.000 But the numbers here are surprising and also little reported.
01:50:47.000 It's only 75,000 coal jobs we're talking about in the country.
01:50:52.000 And there's something like 500,000 clean tech jobs just in California alone.
01:50:57.000 I mean, the numbers are completely out of whack.
01:50:59.000 No, I think the clean tech industry offers an enormous amount of promise, but 75,000 families is not nothing.
01:51:07.000 But then give them money, right?
01:51:09.000 They don't want a handout.
01:51:11.000 They want to work.
01:51:12.000 But this goes to the question of meaning and, you know, what are we going to do?
01:51:16.000 Because the precipice we're getting to is: everyone, virtually everyone, is going to be in the position of these coal miners.
01:51:23.000 When we're talking about, and that's a good thing.
01:51:27.000 That's the thing.
01:51:27.000 I mean, that's, you know, why can't they figure out that they just want to learn new languages and spend more time with their kids and play Frisbee and have fun?
01:51:36.000 We need a new ethic.
01:51:40.000 And politics that decouples a person's claim on existence from doing profitable work that someone will pay you for.
01:51:49.000 Because a lot of that work is going away.
01:51:51.000 I mean, we could view it as an opportunity, and it is actually something that it does dovetail with this hobby horse that you and I have been on for a while about the power of meditation and what it can do to a human mind and the way you view the world and your role in it,
01:52:07.000 for sure.
01:52:08.000 Well, what are your thoughts on universal basic income?
01:52:10.000 Because bring it back to that, with this rise of the machines, if we do have things automated, I mean, some ridiculous number of people make their living driving cars and driving trucks.
01:52:20.000 Now, when those jobs are gone, I think it's millions of people, right?
01:52:23.000 Yeah, and I think in the States, it's the most...
01:52:28.000 A common job for white men, I think.
01:52:32.000 Something like 9 million white men are driving trucks and cars.
01:52:37.000 The problem with that is most people are like, fuck white men.
01:52:40.000 Tired of white men.
01:52:41.000 We're the patriarchy.
01:52:42.000 This is Trump's base.
01:52:45.000 Yeah.
01:52:46.000 No, it's...
01:52:47.000 These...
01:52:50.000 I think universal basic income...
01:52:51.000 There are reasons to worry that it's not a perfect solution because you do want...
01:52:56.000 You want to incentivize the things you want to incentivize.
01:52:59.000 You need to just understand the consequences of any system you would put in place.
01:53:04.000 But there's just no question that...
01:53:08.000 Viewed as an opportunity, this is the greatest opportunity in human history.
01:53:14.000 We're talking about canceling the need for dangerous, boring, repetitive work and freeing up humanity to do interesting, creative, fun things.
01:53:29.000 Now, how could that be bad?
01:53:30.000 Well, give us a little time, and we'll show you how we can make it bad.
01:53:34.000 And it'll be bad if it leads to just extraordinary wealth inequality that we don't have the political or ethical will to fix.
01:53:46.000 Because if we have a culture of people who think, I don't want any handouts, and I certainly don't want my neighbor to get any handouts, and I don't want to pay any taxes so that he can be a lazy bum, if we have this, you know, hangover from Calvinism,
01:54:01.000 you know, that makes it impossible to talk creatively and reasonably about what has changed.
01:54:10.000 Yeah, it could be a very painful bottleneck we have to pass through until we get to something that is much better or a hell of a lot worse, depending on where the technology goes.
01:54:22.000 And I think at a certain point the wealth inequality will be obviously unsustainable.
01:54:28.000 I mean, you can't have multiple trillionaires walking around living in compounds with razor wire and just moving everywhere by private jet and then massive levels of unemployment in a society like ours.
01:54:46.000 I mean, at a certain point, the richest people will realize that enough is enough.
01:54:54.000 We have to spread this wealth because otherwise people are just going to show up at our compounds with their AR-15s or their pitchforks and the society will not sustain it.
01:55:07.000 There has to be some level of wealth inequality that is unsustainable, that people will not tolerate.
01:55:14.000 And you begin to look more and more like a banana republic until you become a banana republic.
01:55:20.000 But now we're talking about the U.S. or the developed world where all the wealth is.
01:55:28.000 So redistribution is the endgame.
01:55:32.000 But that's a toxic concept for half of the country right now.
01:55:36.000 Right.
01:55:37.000 The idea of the welfare state.
01:55:38.000 The idea of perpetuating that and spreading it across the board.
01:55:42.000 Yeah.
01:55:42.000 But these are...
01:55:43.000 So, yeah.
01:55:45.000 I mean, whatever the solution is for coal mining, we should not be hostage...
01:55:50.000 For the coal miners, we should not be hostage to...
01:55:55.000 the idea that they need jobs so that whatever job they were doing and are still qualified to do, that job has to continue to exist no matter what.
01:56:05.000 No matter what the environmental consequences, no matter what the health consequences, no matter how it closes the door to good things that we want.
01:56:14.000 We don't do that with anything.
01:56:15.000 We didn't do that with, you know, the people who are making buggy whips or anything else.
01:56:20.000 Yeah.
01:56:21.000 I mean, there's just...
01:56:22.000 There's no...
01:56:25.000 At a certain point, we move on and we make progress and we don't let that progress get rolled back.
01:56:30.000 And when you're talking about developing technology that produces energy that doesn't have any of these negative effects, whether it's global climate change or just pollution, of course we have to move in that direction.
01:56:47.000 And the other thing that's crazy is that we're not talking honestly about how dirty tech is subsidized.
01:56:55.000 I mean, you have the oil people say, well, solar is all subsidized, right?
01:57:00.000 This is a government handout that's giving us the solar industry.
01:57:05.000 Well, one, that's not even a...
01:57:07.000 You have to produce an argument as to why that's a bad thing.
01:57:09.000 This is something we should want the government to do.
01:57:11.000 The government needs to incentivize new industries that the market can't incentivize now if they are industries that are just intrinsically good and are going to lead to the betterment of humanity.
01:57:23.000 But carbon is massively subsidized.
01:57:28.000 If we actually had the coal producers and the petroleum producers...
01:57:36.000 pay for the consequences of carbon and pollution, it would be much more expensive than it is.
01:57:44.000 So it's already subsidized.
01:57:47.000 We need a carbon tax, clearly.
01:57:50.000 The tax code should incentivize what we do.
01:57:54.000 Well, there's a ton of industries in this country that you could make that argument for.
01:57:57.000 The corn industry is one.
01:57:59.000 I mean, subsidizing the corn industry, when you find out that corn and corn syrup is responsible for just a huge epidemic of obesity in this country, the amount of corn syrup that's in foods.
01:58:12.000 If you as a polluter had to pay the consequences of your pollution all the way down the line, you had to compensate everyone who got emphysema or lung cancer because of what you were putting into the air.
01:58:30.000 Your industry would be less profitable, right?
01:58:33.000 And it might not be profitable at all.
01:58:35.000 And we haven't priced all of that in to any of these things, whether you're talking about the chemical industry or the cigarette industry.
01:58:45.000 I mean, we're addicted to the use of these fossil fuels the same way some people are addicted to the use of cigarettes.
01:58:52.000 I mean, the health consequences of those things, they're almost parallel in a lot of ways.
01:58:57.000 Yeah.
01:58:57.000 Well, and amazingly, you have the same... I mean, I think you turned me on to this documentary.
01:59:03.000 What was it?
01:59:04.000 Merchants of Doubt?
01:59:05.000 Yes.
01:59:05.000 Amazing.
01:59:06.000 The same PhDs.
01:59:08.000 You got like 10 guys who move from, you know, just toxic industry to toxic industry, defending whether it's big tobacco or the fire...
01:59:17.000 I'm in that documentary.
01:59:17.000 Oh, you are?
01:59:18.000 Oh, you are.
01:59:19.000 That's right.
01:59:19.000 I interviewed a guy named Fred Singer, who was one of these people, and he's a climate change denier, and there's just some...
01:59:30.000 It's been a while, but there were some key moments where I was, like, listing for him all the major scientific organizations that say that climate change is real and that humans are major contributors to it, and he basically just refused to accept it.
01:59:46.000 Well, those shows where you have the three heads, you and then the two experts, and they yell over each other, and then we'll be right back, and then you go to commercial.
01:59:54.000 Those fucking things don't solve anything.
01:59:56.000 Those weird moments where people are yelling at each other, and you can't figure out who's right or who's wrong.
02:00:02.000 I'm not a big fan of that, personally, and I think that especially it was a failure to do that with climate change, because it created the doubt.
02:00:12.000 It created the doubt, and I think that was a very successful strategy.
02:00:19.000 But that's in tension with what we just said about the New York Times, because if you take a position as a journalist, if you say, okay, actually one side of this conversation is full of shit, I think it's different to take a position.
02:00:37.000 It's not controversial for me to sit here and say, if you smoke cigarettes, the science strongly suggests you have higher odds of lung cancer.
02:00:45.000 Same thing with climate change, but it became politicized.
02:00:48.000 So I don't feel...
02:00:50.000 I don't...
02:00:50.000 You'll notice I've stepped out of some of the discussions that have been taking place because it's not my role as a journalist to come down on one side or another.
02:00:59.000 But with climate change, I feel absolutely comfortable saying the vast majority of scientists believe this is real and a big, big problem.
02:01:10.000 So I don't...
02:01:11.000 What you're talking about with the New York Times, and I don't really...
02:01:14.000 I'm not going to step out and take a view on it, but...
02:01:16.000 What I believe you're saying with the Times is that they were pro-Clinton and anti-Trump.
02:01:22.000 That's different from having a position on climate change.
02:01:25.000 Well, it's not...
02:01:26.000 Well, no, because to have a position on...
02:01:28.000 Just take the lens as climate change.
02:01:30.000 You have one candidate who's denying the reality of climate change or who's just claiming that climate change is a Chinese hoax.
02:01:38.000 Yeah, but you can report that without being anti-Trump.
02:01:41.000 Well, no, because when Trump says climate change is a Chinese hoax, You have to call him...
02:01:47.000 Has he really said that?
02:01:48.000 Yes.
02:01:48.000 He said it's a Chinese hoax?
02:01:51.000 Yes.
02:01:51.000 He should go to China and see those people that are walking around with gas masks on because they can't breathe in Beijing.
02:01:57.000 Okay, but that's the pollution argument, which I think is actually much stronger because there's just...
02:02:02.000 You can't go to Beijing and say, this is how we want the air to be.
02:02:07.000 I think something like 25% of the air pollution in California on some days is coming from China.
02:02:19.000 I could have that slightly wrong.
02:02:20.000 Jamie could check that out.
02:02:21.000 Please do.
02:02:22.000 There's some extraordinary amount of air pollution that we get from China.
02:02:26.000 That's insane.
02:02:27.000 I did not know that.
02:02:28.000 It just makes its way across the ocean?
02:02:30.000 Yeah, oh yeah.
02:02:30.000 There's no wall.
02:02:32.000 We've got to build that wall.
02:02:36.000 But no, at a certain point, there's some level of dishonesty and misinformation that's so egregious that if you're a journalist at all committed to being...
02:03:07.000 You can't split that baby.
02:03:09.000 But I don't think you have to create mock unnecessary debates around climate change.
02:03:15.000 What I do think is that with the Trump administration, that it is imperative that journalists call out when things are said that aren't true.
02:03:27.000 But I don't think it's constructive for mainstream journalistic organizations to have an openly hostile anti-Trump attitude or pro-somebody-else attitude, because then that just leads to further polarization, which is exactly what we don't need.
02:03:43.000 But you call him out on all the times that he's said things that are just absolutely not true.
02:03:46.000 That feels like you're anti-Trump.
02:03:48.000 Yes, it can.
02:03:49.000 There's no question.
02:03:50.000 There's no question.
02:03:51.000 And I think that's an issue we're dealing with.
02:03:53.000 But I think this is a time of real soul-searching in my industry, because I firmly believe there is a very powerful place in a functioning democracy for a press, for media, that people generally view as fair.
02:04:10.000 Well, it's also what really highlights the responsibility of getting accurate and unbiased information to people, because there's not a lot of sources of that left.
02:04:19.000 I mean, when you look at Fox News, and sometimes you look at CNN, and sometimes you look at MSNBC, and you're like, boy, how much of this is editorialized?
02:04:28.000 How much of this is opinion?
02:04:30.000 You need unbiased facts, and you almost need it not delivered by people.
02:04:37.000 The problem is when people are delivering the news, like when you talk to someone, you know they're educated in an Ivy League university and they speak a certain way, they act a certain way, you almost can assume that these intellectual people,
02:04:53.000 these well-read people are going to sort of lean towards one way or another.
02:04:58.000 I think that there is that issue.
02:05:00.000 I would argue, and again, I know this is self-serving, but I still believe it, that the three broadcast networks have actually fared quite well in what is an incredibly difficult environment right now.
02:05:13.000 So you don't count Fox as a broadcast network?
02:05:16.000 Fox News is a cable network, but Fox Broadcast does not have a news division the way ABC, CBS, and NBC do.
02:05:25.000 So just broadcast, meaning traditional old-school signal in the air that nobody uses anymore.
02:05:31.000 Nobody uses them, but we refer to them within the industry as the three broadcast networks.
02:05:35.000 Isn't that bizarre, though?
02:05:36.000 It is.
02:05:37.000 Well, some people, you know, they put up their rabbit ears and get the signal.
02:05:40.000 Who the fuck does that?
02:05:41.000 Cord cutters, extreme cord cutters.
02:05:43.000 Jamie, you got what?
02:05:43.000 It's 20 bucks.
02:05:44.000 I get like 30 channels and it's way better and it works.
02:05:47.000 There's no lagging.
02:05:47.000 There's no buffering.
02:05:48.000 So you do that and then you get like Netflix for TV shows?
02:05:52.000 Yeah, I get like Hulu and I have Sling.
02:05:53.000 But when I want to watch a basketball game like the NCAA basketball tournament last night, I had to watch it on that.
02:05:57.000 You watch it through the air.
02:05:58.000 It worked really good.
02:06:00.000 I understand it worked.
02:06:01.000 Interesting.
02:06:02.000 It's like having a vinyl collection, too.
02:06:06.000 What's the viewership on an evening news broadcast now?
02:06:10.000 Is it like five million on it?
02:06:11.000 No, it's around eight or nine.
02:06:12.000 Eight or nine, okay.
02:06:13.000 So it's still a really big number.
02:06:15.000 Is it up?
02:06:16.000 I don't follow all of the ticks when it comes to the numbers, so I can't answer that accurately.
02:06:24.000 But people have been predicting the demise of network news for decades now.
02:06:31.000 I've been at ABC News for 17 years.
02:06:33.000 I've been reading obituaries all throughout that time.
02:06:37.000 Eight million a night on each of the networks?
02:06:39.000 Still, that is a gigantic number, especially in an age of micro-information, of niche broadcasting and the internet.
02:06:50.000 Here are the numbers right there.
02:06:52.000 NBC. So ABC's in the lead.
02:06:55.000 ABC is the lead in total viewers.
02:06:57.000 So ABC and NBC, and this is just one week, March 27th, right around 8 million, and then CBS is 6.5 million.
02:07:04.000 So that's a lot of people.
02:07:05.000 It skews super old, though.
02:07:06.000 It does.
02:07:06.000 Look at that demographic.
02:07:07.000 It absolutely skews very, very old.
02:07:09.000 Well, it's all people.
02:07:11.000 The majority is people over 54 or under 25, but I don't think that's the case.
02:07:15.000 We're looking at 25 to 54, ABC has 1.6 million, as opposed to 8 million total viewers, NBC 7.8 million total viewers, 1.7 million, 25 to 54,
02:07:31.000 and then CBS 6.4 million total, 1.3 million, 25 to 54. It's funny, like, after 54, fuck you, but before 25, fuck you.
02:07:41.000 That's what they concentrate on.
02:07:42.000 It is what advertisers want.
02:07:45.000 It's funny.
02:07:46.000 People with vitality.
02:07:47.000 You can tell based on the ads.
02:07:49.000 The ads you run, it's like, you know, for catheters and anti...
02:07:56.000 It gets a little different on the morning, because in the morning shows where it's closer to 4 or 5 million for ABC and NBC and a little less for CBS, the percentage of that audience that falls within the demo, as we call it, 25 to 54,
02:08:12.000 is higher, and so the ads are kind of different.
02:08:13.000 But I guess my point is that you still have a really significant number of people, if you take the mornings and evenings on these broadcast networks, that are getting their news from these places.
02:08:24.000 Which just gets back to the polarized media atmosphere you were talking about before.
02:08:29.000 I think in this atmosphere, having the networks be seen to a certain extent, to the extent possible, as above the fray, actually is important for democracy.
02:08:39.000 How many does Alex Jones get?
02:08:43.000 What does Alex Jones get?
02:08:44.000 I think 40 million a week, but they include website hits in their big number, so it's a skewed number.
02:08:51.000 I've spent some time with him.
02:08:52.000 Me too?
02:08:53.000 Have you?
02:08:53.000 Oh yeah, I know him very well.
02:08:55.000 I've had him on the podcast.
02:08:56.000 I went down to Austin.
02:08:57.000 Did you interview him?
02:08:58.000 Yeah.
02:08:59.000 How long?
02:09:01.000 How long were you there for?
02:09:01.000 I basically spent a day hanging around his operation.
02:09:04.000 This was, I would say, in 2009?
02:09:07.000 Yeah, so it was a while ago.
02:09:09.000 Before the monolith that it is now.
02:09:11.000 Yeah, it was pretty big back then, but not what it is now.
02:09:13.000 In 1999, Alex Jones and I put on George Bush Sr. and Jr. masks, and we smoked pot out of a bong and then danced around the Capitol building in Texas for a stand-up video that I did.
02:09:30.000 Yeah, I've been friends with that guy since 1999. Well, yeah, I listened to your recent podcast with him, which was just...
02:09:38.000 Interdimensional child molesters that are infiltrating our airwaves.
02:09:42.000 Yeah, he went deep.
02:09:43.000 We got him high and drunk, and he went...
02:09:45.000 It was amazing.
02:09:46.000 He went as deep as he's ever gone before.
02:09:48.000 Do you find that he is different when he's not being recorded?
02:09:51.000 Uh, yeah, sort of.
02:09:53.000 There he is.
02:09:54.000 There he is, screaming and yelling.
02:09:57.000 That's me and him.
02:09:58.000 Oh, shit, I broke the mask.
02:10:02.000 That's him singing, by the way.
02:10:09.000 He made up all this.
02:10:12.000 Moloch the Owl God.
02:10:14.000 He made up these lyrics.
02:10:16.000 This song is all him.
02:10:18.000 I believe he might have ad-libbed it, too.
02:10:20.000 Give me some volume on this.
02:10:24.000 That was the name of my DVD, Belly of the Beast.
02:10:38.000 That's Alex.
02:10:39.000 He's so crazy.
02:10:41.000 I mean, he's definitely different when he's not being recorded, but he is that guy, you know?
02:10:46.000 It's sincere.
02:10:47.000 He believes all this stuff.
02:10:51.000 He believes a lot of it.
02:10:53.000 Whether he's correct or not, that's a different argument.
02:10:56.000 Whether or not he believes it...
02:10:57.000 I was struck.
02:10:59.000 I spent a full day with him, and then we had dinner with him afterwards, and we were not recorded.
02:11:05.000 I was struck by the difference in demeanor.
02:11:09.000 Off camera and not being recorded.
02:11:12.000 Yeah, there's definitely that.
02:11:12.000 I'm not saying he doesn't believe it.
02:11:14.000 I don't know.
02:11:15.000 I'm not in his mind.
02:11:17.000 He's a showman, for sure.
02:11:18.000 What's really disturbing is when he gets stuff right.
02:11:22.000 You know, like the World Trade Organization is a perfect example.
02:11:24.000 He was one of the ones that highlighted the use of agent provocateurs.
02:11:29.000 Now, what agency?
02:11:30.000 Who hires these people?
02:11:31.000 But what they do is they take something like the WTO, which was a big embarrassment, that people were protesting the WTO. And they hire people to turn this peaceful protest into a violent protest.
02:11:43.000 So these people come in, they wear ski masks, they break windows, and they light things on fire, do whatever they do that makes it violent.
02:11:49.000 And then they have the cops come in and break it up, because now it's no longer a peaceful protest.
02:11:54.000 And so it got to the point where people were trying to show up for work.
02:11:57.000 They had WTO pins on.
02:12:00.000 That aligned through it.
02:12:01.000 Well, they had created a no-protest zone.
02:12:03.000 And this is all on footage on the news.
02:12:06.000 They were telling people, you have to take the pin off.
02:12:08.000 You can't go through this area where you work with a WTO pin on your backpack or on your jacket.
02:12:16.000 I mean, that's crazy.
02:12:17.000 And he highlighted that, and that use of agent provocateurs has been documented.
02:12:24.000 This is a real thing.
02:12:25.000 And it's a real tactic that, again, what agency, what faction of the military, what faction of the government hires these people to do that, I don't know.
02:12:33.000 But it is a real thing.
02:12:34.000 And I did not know about that until Alex highlighted it on one of his videos.
02:12:40.000 It's been proven.
02:12:41.000 It's been proven that it's real.
02:12:42.000 And so that alone is disturbing.
02:12:44.000 When you stop and think about all the different things that he's informed people of that turned out to be real, like Operation Northwoods, when the Freedom of Information Act came out with the Operation Northwoods document where the Joint Chiefs of Staff had signed this.
02:12:58.000 And this was like something that they were really trying to implement.
02:13:00.000 They were going to arm Cuban friendlies and have them attack Guantanamo Bay.
02:13:04.000 They were going to blow up a drone jetliner.
02:13:06.000 They're going to blame it on the Cubans.
02:13:07.000 And they were trying to use this as impetus to get us to go to war... Yeah, but how do you feel about the things that he's talking about that are...
02:13:13.000 Well, okay, but let's talk about that first.
02:13:16.000 I mean, there are things that are true.
02:13:18.000 That's what gets really squirrely.
02:13:20.000 What gets really squirrely is when you find out that there have been things, like the Gulf of Tonkin incident.
02:13:27.000 There's many things that have happened where there have been false flags, where the government has conspired to lie to the American people, and people have died because of it.
02:13:35.000 I think, well, I don't know a lot about Gulf of Tonkin, although I know more about it than those other examples.
02:13:43.000 I mean, there are definitely cases where it's an additional interpretation to say that the government lied.
02:13:51.000 I mean, so to take even something that's closer to hand, like weapons of mass destruction as a pretext of going to Iraq, right?
02:13:57.000 Now, it's one thing to say that people knowingly lied about, that Bush and Cheney knowingly lied to the American people about that.
02:14:07.000 Or they were misled by people who were knowingly lying.
02:14:11.000 Or just everyone got it wrong.
02:14:14.000 It was totally plausible to everyone who was informed that he had a WMD program and they misinterpreted whatever evidence they thought they had and they were just wrong.
02:14:26.000 So that's a spectrum.
02:14:28.000 I'm not claiming to know which one of those is true.
02:14:30.000 I think probably the last is much closer to the truth.
02:14:35.000 And that explains many of these instances, but what's so corrosive about pure examples of lying is that—and we may have one case now that's just emerging in the news.
02:14:48.000 I don't know if maybe the story has been clarified while we've been talking, but it now seems that Susan Rice— At one point she said she knew nothing about the unmasking of Trump associates in this recent surveillance case,
02:15:06.000 and now it's claimed that she actually asked to have certain names unmasked.
02:15:13.000 This is being seized upon, again, just in the last few hours, as an example of a lie which seems very sinister.
02:15:26.000 But as though it equalizes the two sides here, right?
02:15:29.000 So let's say, worst case scenario, Susan Rice lied about having some knowledge of this investigation.
02:15:36.000 That doesn't...
02:15:37.000 It says something bad about Susan Rice.
02:15:39.000 It says something...
02:15:40.000 I mean, she has to deal with the consequences of that lie, but it doesn't exonerate all of the lying that Trump has done about everything under the sun, right?
02:15:47.000 So what's so destabilizing here is that the moment...
02:15:54.000 This is even true of honest errors.
02:15:55.000 The moment that a news organization like yours or the New York Times commits an honest error, that gets pointed to from those who want to treat the mainstream news media as just fake news as, see, everything's the same.
02:16:11.000 You're no better than somebody who's just manufacturing fake news on a laptop in his basement.
02:16:20.000 And the flip side of that is when Alex Jones gets something right, it seems to make him look like a dignified journalistic enterprise analogous to the New York Times or to ABC News.
02:17:11.000 Fake news.
02:17:13.000 Both sides do it.
02:17:14.000 Or here's a lie.
02:17:15.000 Here's Susan Rice's one lie that she told in the last 10 years, maybe, and got caught for.
02:17:20.000 And we have a president who lies every time he picks up his, you know, approaches a mic or picks up his Twitter.
02:17:26.000 She had other problems because she had gone on, I believe, on the Sunday morning talk shows after Benghazi with some outdated talking points.
02:17:33.000 Yes.
02:17:34.000 Yeah.
02:17:34.000 No, Susan looked like she was itching to get caught.
02:17:38.000 I mean, whether this is a case of...
02:17:41.000 Yeah.
02:17:41.000 No, I think your point is very, very important.
02:17:45.000 Because I think that when you have...
02:17:47.000 If you have one side that lies all the time, it's imperative that the other side don't lie at all.
02:17:53.000 Yeah, yeah.
02:17:54.000 If you're caught lying and say, look, everybody does it...
02:17:56.000 That's why it's a tense time for people in my line of work.
02:17:59.000 Because we...
02:18:00.000 We're under so much scrutiny.
02:18:03.000 And when we get it wrong, we really...
02:18:06.000 You know, we take a lot of heat.
02:18:08.000 But when we get it wrong, I think we're pretty quick to say, we got it wrong, here's the actual, here's the truth.
02:18:14.000 Do you know that Donald Trump Jr. said that Cernovich should get a Pulitzer?
02:18:17.000 No.
02:18:17.000 For exposing Susan Rice?
02:18:19.000 Really?
02:18:19.000 Yeah.
02:18:20.000 Right.
02:18:20.000 But that's why this drives me a little crazy, because it's not Cernovich, it's the fact that this has bled into the real world.
02:18:29.000 Why'd you lead him down the Cernovich thing?
02:18:31.000 You just gave him right back.
02:18:33.000 I had to move the yarn in front of the kitten.
02:18:36.000 We're only now discovering the consequences of this.
02:18:41.000 Let's come back in five years and see if we're talking about anything that makes any sense.
02:18:46.000 Well, it's all very blurry.
02:18:49.000 It really is very blurry because we've never had a situation like this where we have a president that people just don't trust, to be honest.
02:18:56.000 I mean, if someone lied about anything in the past, I mean, if Donald Trump got caught having his dick sucked in the White House on film, You know, he'd be like, look, I made a mistake.
02:19:05.000 I mean, it would be nothing.
02:19:07.000 It would be nothing.
02:19:09.000 Photoshop.
02:19:10.000 Fake news.
02:19:11.000 But don't you think, just again, just slot, to use Joseph Goldstein's phrase before about slotting into geological time and just looking at the broader scope of history.
02:19:19.000 You know, we've had really, we've had periods of time where we had the muckrakers, you know, where the media outlets didn't even pretend at times to not have an agenda.
02:19:34.000 What times were those?
02:19:36.000 In the early parts of our republic, where we had...
02:19:41.000 Was it Teddy Roosevelt who called the journalists muckrakers? Basically, he was saying that their job was to rake muck, to be working in filth.
02:19:52.000 And...
02:20:16.000 I'm maybe the Pollyanna here, but I think that the Republic will survive and that it doesn't inexorably lead to a time where truth doesn't matter.
02:20:27.000 Well, it seems to open up the door for a viable alternative.
02:20:31.000 It seems to open up the door for someone who comes along who is, in many ways, bulletproof.
02:20:36.000 That is a very ethical person.
02:20:39.000 It's opened a door.
02:20:39.000 I mean, now anyone could be president.
02:20:40.000 You're talking about somebody to run for office?
02:20:42.000 Yes.
02:20:43.000 I just don't think anybody's bulletproof.
02:20:45.000 Humans, I mean, with the exception of Joe Rogan.
02:20:48.000 Well, definitely not bulletproof.
02:20:49.000 You don't have to be bulletproof.
02:20:50.000 I mean, this guy's got nothing but bullet holes in him, and he's president.
02:20:54.000 No, but what Joe was saying is it opens the door to somebody who's unimpeachable.
02:20:57.000 A strong, viable alternative, like someone who is ethical.
02:21:01.000 I think the door is open to anyone.
02:21:03.000 I mean, who couldn't be president at this point?
02:21:05.000 But there's no one that steps forward right now.
02:21:07.000 I mean, do you think they're waiting?
02:21:09.000 I mean, like, who the fuck—first of all— You couldn't be president.
02:21:14.000 There's never been an open atheist.
02:21:16.000 They would attack your Twitter page, first of all.
02:21:19.000 Yeah, good luck.
02:21:20.000 They would hire— No, I mean, I think they're—you know, if you're going to use a conventional political calculus, well then, yeah, then— Being an atheist, having a history of psychedelic drug use, having edgy positions that alienate massive constituencies.
02:21:38.000 All of that's a deal-breaker, but you would never have predicted that someone this scandalous and inept and dishonest, and provably so, he literally can't get through an hour of the day without...
02:21:54.000 Something that would have been a scandal in some previous age of the earth coming out of his mouth.
02:22:02.000 It's just...
02:22:03.000 I think, yeah, I think there's no predicting who could be president in the future.
02:22:11.000 You need wealth and you need...
02:22:15.000 Charisma on some level.
02:22:16.000 You need to be able to get a tribe behind you, but I don't think you need any of what people thought you needed even a year ago.
02:22:26.000 Well, what he's done by circumventing the whole system and by being independently wealthy...
02:22:33.000 Is really kind of...
02:22:34.000 But even that was a sham.
02:22:35.000 I mean, what's amazing is that he's like...
02:22:38.000 I don't know how much money he actually put into his campaign, but it's just not nearly what you would have...
02:22:42.000 I mean, he really only had to pretend to be that wealthy.
02:22:46.000 And it was incredible what happened.
02:22:48.000 The flip side of it is what he achieved with very little campaign funds.
02:22:52.000 Oh, yeah.
02:22:53.000 It's incredible.
02:22:54.000 And his wits.
02:22:55.000 Yeah, it's incredible.
02:22:57.000 It's...
02:22:58.000 Like him or not, you have to give him credit for an unbelievable victory that very few people saw coming.
02:23:03.000 It's a popularity contest.
02:23:05.000 Isn't that part of the problem?
02:23:06.000 He made a popular person president of a popularity contest.
02:23:11.000 I mean, that's what it is.
02:23:11.000 He won.
02:23:12.000 He was running against somebody who wasn't popular.
02:23:14.000 So, I mean, let's put it in context.
02:23:17.000 And he was coming after eight years of a Democratic president who was controversial in many ways.
02:23:22.000 And it's always harder for somebody of the same party to run.
02:23:26.000 So there are a lot of dynamics.
02:23:28.000 Larger, impersonal dynamics that were working in his favor.
02:23:32.000 Yeah, it's just...
02:23:33.000 But just to get back to the argument about, like, are we in apocalyptic times or whatever, I guess, I don't know anything.
02:23:41.000 But my instinct is...
02:23:43.000 Just as a country, we've seen times of much worse division.
02:23:48.000 We had a civil war.
02:23:51.000 It's telling that you have to go back there for a compelling example.
02:23:56.000 What about in the 60s?
02:23:59.000 We had people in the streets, we had the weather underground, we had MLK and RFK assassinated.
02:24:07.000 It was pretty divided time.
02:24:10.000 What I'm worried about now is...
02:24:14.000 I think he's truly unique, and I think the way we're becoming unmoored from the ordinary truth testing in politics, I think is unique.
02:24:29.000 I mean, the fact that we have a president who seems—because there's a whole feedback mechanism.
02:24:33.000 Trump seems to get some of his information from InfoWars and from places like InfoWars.
02:24:39.000 Oh, he goes on all the time.
02:24:41.000 But you don't think he pays a price for his— No.
02:24:45.000 I mean, his popularity rating is not high.
02:24:48.000 Well, he's paying some price, but the question is, is it going to be enough?
02:24:52.000 And what happens with the next terrorist attack?
02:24:56.000 So the real fear is that if we have a Reichstag fire kind of moment, engineered by him or not, I mean, I'm not so paranoid as to think he's going to...
02:25:08.000 But I just think it's inevitable something is going to happen.
02:25:12.000 I mean, we've had 80 days, 100 days, whatever it's been of his presidency, where basically nothing has happened, and it's been pure chaos, right?
02:25:21.000 And the work of government's not getting done.
02:25:24.000 The government's barely even staffed.
02:25:26.000 And this has been a period of where nothing really has happened.
02:25:31.000 But imagine a 9-11 size event, or something really goes off the rails with North Korea or China or Russia.
02:25:40.000 It will be so...
02:25:42.000 I mean, the pressure...
02:25:44.000 One, the pressure to normalize him as commander-in-chief...
02:26:10.000 I mean, I think it's a scary moment, and I mean, the people who are...
02:26:20.000 Drawing analogies between him and the 30s in Germany, I think those analogies are misleading in ways, but again,
02:26:36.000 you just don't know how strange things could get with a big negative stimulus to the system.
02:26:43.000 What I think we know is that we have someone in charge who is a malignantly selfish con man.
02:26:52.000 I think that is an objective fact.
02:26:54.000 That's not a partisan thing to say.
02:26:57.000 That's as obvious about him as his hair.
02:27:01.000 And to have that much power in the hands of somebody whose ethical compass is that unreliable, or reliably bad, in my view...
02:27:14.000 And I wouldn't say this of Pence, who scares me for other reasons.
02:27:17.000 I mean, he's a theocrat.
02:27:19.000 I wouldn't say this of most Republicans, who I might disagree with.
02:27:24.000 But there's just something to put that much power in the hands of somebody who...
02:27:30.000 is so disconnected from facts and reasonable concern for the well-being of the rest of humanity and the future.
02:27:40.000 It's an incredible moment.
02:27:42.000 And I don't think we've ever had a president who you could look at and say that about so clearly.
02:27:48.000 I mean, not even Nixon.
02:27:49.000 Nixon was pathological in other ways.
02:27:53.000 You know, Nixon was giving us the Clean Air Act, right?
02:27:56.000 I mean, Nixon had some point of contact with terrestrial reality, which wasn't just about figuring out how to burnish his imaginary grandeur, you know.
02:28:09.000 So, yeah, I mean, I... I think we're one huge news story away from finding out how bad a president he could be.
02:28:22.000 And I think he could surprise even the people who are very pessimistic.
02:28:27.000 Well, he could also rise to the occasion.
02:28:31.000 Look at you, being objective.
02:28:33.000 That's the newsman in him.
02:28:34.000 See that?
02:28:35.000 I also like to...
02:28:37.000 I would be the first to give him or the system credit for that.
02:28:41.000 I mean, it's not that...
02:28:43.000 If he starts doing something good, like, let's say, a massive infrastructure project, right, that includes building out good things, not coal jobs, but, you know, if...
02:28:54.000 Fixing bridges.
02:28:55.000 Clean infrastructure, right?
02:28:57.000 If he starts doing that...
02:29:00.000 I mean, that'd be fantastic, right?
02:29:02.000 And it's possible he could start doing that for his own—his motives wouldn't even matter, ultimately, as long as he was doing the right things.
02:29:10.000 But his motives are so reliably— Self-involved.
02:29:19.000 What we need is a system.
02:29:22.000 Someone needs to be able to play him, this kind of narcissistic glockenspiel, well enough to get him to do the things that would be good for the world.
02:29:31.000 But I just don't see...
02:29:33.000 There's just too much chaos in the system for that to be a reliable game.
02:29:37.000 Well, like you, they try to keep him away from social media.
02:29:41.000 Yeah, well, that's working out.
02:29:42.000 I mean, that's a big thing they do, don't they?
02:29:44.000 They try to pull him away from it.
02:29:46.000 Yeah, but that's been impossible.
02:29:50.000 No one can do it.
02:29:51.000 Yeah.
02:29:53.000 Remember that crazy press conference that he had just a few weeks into being president?
02:29:58.000 That was unbelievable.
02:29:59.000 I got text messages from friends that are Republicans.
02:30:01.000 They were like, what the fuck?
02:30:03.000 Yeah, but you know, it's an interesting sort of Rorschach test because there are millions of Americans who watched that press conference and found it delightful.
02:30:13.000 They thought that he was pounding on the liberal media.
02:30:20.000 When you read a transcript of what actually comes out of his mouth, it is amazing.
02:30:27.000 The poverty of the passages in terms of information.
02:30:34.000 The fact that we have a president who speaks like this, I could never have foreseen that in our lifetime this was going to happen.
02:30:44.000 It's unimaginable to me.
02:30:47.000 Yeah, I think a lot of people, there are just indisputably millions of Americans who find it refreshing.
02:30:53.000 Did you see Bill Maher's thing about Anthony Weiner, that he thinks that Anthony Weiner should run against Trump?
02:30:59.000 We need our own crazy man.
02:31:01.000 Because Anthony Weiner, besides his obvious addictions to sexting with people, was a very powerful politician and a very good speaker.
02:31:12.000 He had a lot of very good qualities.
02:31:14.000 He was going after bankers, the same as Eliot Spitzer.
02:31:17.000 Eliot Spitzer did a lot of good things.
02:31:19.000 He was going after the bankers.
02:31:20.000 He was going after corruption.
02:31:23.000 Yeah, well, that's an example.
02:31:27.000 There's a flag planted that shows us how far we have gone from the normal.
02:31:33.000 I mean, Eliot Spitzer destroyed himself with one instance of hypocrisy.
02:31:39.000 Right.
02:31:40.000 Well, it was pretty bad.
02:31:42.000 Trump?
02:31:43.000 Level bad?
02:31:43.000 I mean, there's just no...
02:31:44.000 I don't know.
02:31:45.000 I mean, we don't have Trump breaking the law in this way.
02:31:49.000 I mean, he was with a prostitute.
02:31:51.000 Well, that's a stupid law.
02:31:52.000 The real problem was that he was going after prostitutes.
02:31:57.000 I know.
02:31:58.000 That was the real problem.
02:31:59.000 That's the hypocrisy.
02:31:59.000 Yeah, the hypocrisy wasn't that he did something illegal where he paid someone for sex.
02:32:04.000 That law seems to me to be so archaic and so ridiculous.
02:32:07.000 You can pay someone to massage you.
02:32:09.000 You can't pay someone to touch your genitals.
02:32:10.000 It seems ridiculous.
02:32:12.000 Right.
02:32:12.000 I guess the other piece you have to put in play there is that there are some percentage of people working in the sex trade who are not doing it voluntarily, who are coerced to one or another degree.
02:32:24.000 And that's a horror show.
02:32:26.000 And part of the problem with that is that it's illegal.
02:32:28.000 I mean, that's the big argument is that...
02:32:31.000 Yes, there is that argument.
02:32:34.000 And I agree with you that consenting adults should be able to do what they want to do, but the crucial variable there is consent.
02:32:42.000 Right.
02:32:43.000 And kids can't consent.
02:32:45.000 Agreed.
02:32:45.000 Agreed.
02:32:47.000 I just think that someone is going to have to be dynamic. They're going to have to engage people in a way that... I mean, they're going to have to deal with his attacks.
02:33:01.000 Or maybe what people will be thirsting for after four years is actually bland.
02:33:09.000 Yeah, right?
02:33:09.000 Maybe, right?
02:33:10.000 Yeah.
02:33:11.000 I could do with bland right now.
02:33:13.000 That would be fantastic.
02:33:15.000 Bland and actually super religious would be...
02:33:18.000 Ted Cruz?
02:33:20.000 You'd be happy for Ted Cruz?
02:33:21.000 Ted Cruz is not bland.
02:33:22.000 No.
02:33:22.000 But my enthusiasm for impeachment suggests that I'm happy with Pence, right?
02:33:27.000 Right.
02:33:27.000 So, like, if I could find the impeachment button, I would not hesitate to press it.
02:33:32.000 And Pence, given his religious commitments...
02:33:35.000 A few short years ago would have been among my worst nightmares.
02:33:38.000 I mean, I would be talking about the rise of the Christian right and the danger of theocracy.
02:33:45.000 But psychologically, he seems like a normal, predictable, solid American compared to what we have.
02:33:55.000 It's just...
02:33:57.000 Isn't the other question like Trump's age?
02:33:59.000 He's a 70-year-old man.
02:34:01.000 People don't really live that much past 70, especially people that are overweight and people that don't exercise.
02:34:08.000 He seems all too healthy to me.
02:34:10.000 He seems pretty healthy.
02:34:10.000 I mean, that campaign.
02:34:11.000 He does.
02:34:12.000 I can't believe it.
02:34:14.000 His amount of energy.
02:34:15.000 And to his credit, yeah.
02:34:15.000 I mean, that's an amazing thing to be able to do.
02:34:18.000 How do you think he does that?
02:34:19.000 It's the most punishing thing in the world.
02:34:21.000 I can't even imagine what his schedule was like.
02:34:22.000 He looked like he was having a good time.
02:34:24.000 Seriously.
02:34:25.000 I did.
02:34:27.000 It did.
02:34:28.000 And in that press conference we were talking about, I think it was a 77-minute press conference, he looked like he was having a blast.
02:34:34.000 Yeah.
02:34:35.000 He gets really energized.
02:34:38.000 Yeah, I mean, it's got to be the most punishing beatdown ever if you're not designed for it.
02:34:44.000 I mean, I can't imagine how tiring that campaign was.
02:34:48.000 I mean, you get exhausted after 30 minutes of looking at your at-mentions.
02:34:51.000 Exactly, yeah.
02:34:53.000 No, you have to be wired differently, but he clearly is...
02:35:01.000 He's wired that way, and it's worked for him.
02:35:03.000 But you need someone who's willing to submit to the punishment of running, and that's a rare person.
02:35:10.000 And the problem is that kind of selects for things that you don't actually want in a president, or at least I wouldn't think you would want.
02:35:18.000 I mean, it selects for a kind of narcissism and a sense that it really has to be you, right?
02:35:25.000 It doesn't select for the... In a normal intellectual space, you're constantly aware of the ways in which you are not the best guy or gal to be doing the thing, right?
02:35:37.000 Like, you want to defer to experts, and, you know, Trump is, only I can fix it, right?
02:35:44.000 And that worked.
02:35:46.000 And so you need...
02:35:47.000 There's something of that that creeps into the...
02:35:52.000 The headspace of most politicians, it seems.
02:35:56.000 So scientific humility and just a sense of the limits of any one person's expertise is not necessarily the right piece of software to have running when it comes time to run for president.
02:36:12.000 Now, I want to switch gears a little bit.
02:36:14.000 It's not totally related, but it is in some ways.
02:36:18.000 Headspace.
02:36:20.000 Talking about mindsets and talking about, like, we've brought this up but really haven't delved into it much at all, about meditation and about how much it's affected you and how it got you back on track.
02:36:31.000 And I know that you're a big proponent of it, and I am as well, although I think I'd probably do it differently than you guys do.
02:36:37.000 I'd love to hear how you do it.
02:36:39.000 I use an isolation tank.
02:36:41.000 Oh, really?
02:36:42.000 Like a sensory deprivation technique?
02:36:43.000 Yeah.
02:36:43.000 Yeah.
02:36:44.000 And I do a lot of yoga.
02:36:46.000 Those are two big ones for me.
02:36:48.000 You know, I think those alone have straightened out my brain in a great way.
02:36:53.000 Yoga in particular.
02:36:54.000 Yoga because it's not...
02:36:57.000 Yoga forces you to do it.
02:36:59.000 You're either doing it or you're not doing it.
02:37:01.000 There's no room for distraction.
02:37:03.000 You're essentially forced to deal with what these poses require of you.
02:37:08.000 And I think that in doing so and having a singular focus of trying to maintain your balance and stretch and extend and do all these different things while you're doing it, and concentrating almost entirely on your breath, which is a big factor in yoga, it has remarkable brain-scrubbing attributes.
02:37:27.000 Yeah.
02:37:28.000 I would say, and I don't know much...
02:37:31.000 Before I say this, let me just ask.
02:37:32.000 What are you doing in the isolation tank?
02:37:34.000 Like, what are you doing in your mind?
02:37:36.000 A lot.
02:37:36.000 Well, you know, I use it in a bunch of different ways.
02:37:40.000 I don't use it as much as I should, honestly.
02:37:42.000 But I... Concentrate on, sometimes I go in there with an idea, like I'll concentrate on material that I'm working on, or maybe jujitsu techniques that I'm having problems with, or some other things that I'm dealing with, you know, any sort of issues that I have.
02:37:58.000 Sometimes I do that.
02:37:58.000 Sometimes I just go in there and chill out, and relax, and breathe, and concentrate.
02:38:02.000 There's a lot of physical things that happen inside the tank.
02:38:06.000 There's the amount of magnesium that's in the water because it's Epsom salts.
02:38:10.000 It's really good for you physically.
02:38:12.000 It's very good for the muscles.
02:38:13.000 It loosens you up and relaxes you and that eliminates a lot of stress.
02:38:17.000 And that physical elimination of stress allows the brain to function with just less pressure.
02:38:23.000 It allows you to relax more.
02:38:25.000 It puts things in perspective better.
02:38:27.000 And it also, it gives you this environment that's not available anywhere else on the planet.
02:38:31.000 This weightless, floating, disconnected from your body environment where you don't hear anything, you don't see anything, you don't feel anything, you feel like you're weightless.
02:38:40.000 You have this sensation of flying because you're totally weightless in the dark.
02:38:46.000 You open your eyes, you don't see anything.
02:38:48.000 You close your eyes, it's exactly the same.
02:38:50.000 The water's the same temperature as your skin, so you don't feel the water and you're floating.
02:38:55.000 I'll say this.
02:38:56.000 I'm not sure how this is going to go down.
02:38:57.000 I actually don't have any questions about the benefits of being in an isolation tank, even though I don't know too much about it.
02:39:05.000 And I also think yoga is great, although I don't do much of it myself.
02:39:10.000 I think, though, that there may actually be a difference between those two activities and meditation.
02:39:16.000 Because there's a kind of—this is a highfalutin term—metacognition, sort of knowing what you know or knowing that you're thinking.
02:39:24.000 That happens in the kind of mindfulness meditation of which Sam and I are proponents that I think is a different thing.
02:39:34.000 Where you're seeing, and I'll just try to put this in English, when you're meditating the way we do, you're seeing how crazy you are.
02:39:41.000 You're seeing you're fucking nuts.
02:39:42.000 And that actually has a real value.
02:39:45.000 A systematic collision with the asshole in your head has a real value.
02:39:49.000 Because when the asshole offers you up a shitty suggestion in the rest of your life, which is basically its job, like, oh yeah, you should eat the 17th cookie or say the thing that's going to ruin the next 48 hours of your marriage or whatever, you're better able to resist it.
02:40:04.000 So what do you do?
02:40:06.000 So, I mean, the basic steps of mindfulness meditation are to sit.
02:40:09.000 Most people close their eyes.
02:40:11.000 You bring your full attention to the feeling of your breath.
02:40:14.000 You're not thinking about your breath.
02:40:16.000 You're just feeling the raw data of the physical sensations.
02:40:20.000 And then the third step is the biggie, which is, as soon as you try to do this, your mind's going to go bonkers.
02:40:25.000 You're going to start thinking about, you know, what's for lunch?
02:40:27.000 Do I need a haircut?
02:40:29.000 Where do gerbils run wild?
02:40:30.000 Whatever.
02:40:31.000 Blah, blah, blah.
02:40:31.000 You're just going to notice...
02:40:33.000 Oh, my mind's going crazy right now.
02:40:35.000 That noticing is the key moment.
02:40:37.000 It's, in fact, the victory.
02:40:38.000 It's interesting because this is when most people think they've failed.
02:40:42.000 Oh, I can't meditate because I can't clear my mind.
02:40:45.000 This is the biggest misconception about meditation.
02:40:48.000 You do not need to clear your mind.
02:40:50.000 That's impossible unless you're enlightened or dead.
02:40:53.000 The whole goal is just to notice when you become distracted and start again.
02:40:57.000 You return your attention to your breath.
02:41:06.000 Mm-hmm.
02:41:16.000 That you're a Homo sapiens sapiens.
02:41:18.000 In other words, that's how we're classified as a species.
02:41:21.000 The one who thinks and knows he or she thinks.
02:41:25.000 And knowing that you have this voice in your head... As Sam likes to joke, when he thinks about the voice in his head, he feels like he's been hijacked by the most boring person alive.
02:41:36.000 Just says the same shit over and over.
02:41:39.000 A joke that I steal from him all the time.
02:41:43.000 That is enormously powerful.
02:41:46.000 Because then you are not held hostage by this voice.
02:41:49.000 A similar thing happens in the tank.
02:41:52.000 I do a form of meditation in the tank.
02:41:55.000 Sometimes when I go in there without an idea, like if I'm not working on material or anything else, where I just concentrate on my breath in through the nose, out through the mouth, and I just literally concentrate on the breath, and the same thing happens.
02:42:05.000 That's meditation.
02:42:07.000 That's meditation.
02:42:08.000 It's not similar.
02:42:09.000 It's the exact same thing.
02:42:11.000 The difference is, in the tank, after about 20 minutes or so, that breaks loose into psychedelic states.
02:42:19.000 Wow.
02:42:20.000 Have you ever combined the tank with psychedelics?
02:42:22.000 Yes.
02:42:22.000 Altered state stuff?
02:42:23.000 Yeah.
02:42:23.000 Yeah.
02:42:24.000 It's a trip.
02:42:26.000 LSD or mushrooms?
02:42:27.000 Mushrooms.
02:42:28.000 But the big one is edible pot.
02:42:29.000 Edible pot seems to be as strong as anything in there.
02:42:32.000 I eat enough pot where I'm convinced I'm going to die, and then I climb in there.
02:42:37.000 Every time I do it, I go, don't do that again.
02:42:39.000 Because I just get out so terrified.
02:42:41.000 What's your motivation when you're eating the pot and climbing into the tank?
02:42:44.000 Let's see what happens.
02:42:45.000 Okay.
02:42:46.000 Yeah, just be scared.
02:42:47.000 Be terrified.
02:42:49.000 Really?
02:42:49.000 Yeah, because nothing ever happens.
02:42:50.000 You never die.
02:42:52.000 But goddamn, you're just convinced that the universe is imploding around you.
02:42:56.000 So it usually has the character of fear being a major part of it?
02:43:03.000 It's also about not embracing the fear, not letting the fear run rampant, and just sort of relaxing and giving in to the vulnerability.
02:43:10.000 The finite nature of your existence and just breathing and concentrating and letting the dance take place.
02:43:17.000 Because there's some sort of a weird...
02:43:18.000 One of the big misconceptions about edible pot is that it's like smoking pot.
02:43:25.000 It's an entirely different process physiologically.
02:43:28.000 Your liver...
02:43:29.000 The time course is very different, too.
02:43:31.000 You can stay stoned for three days.
02:43:32.000 Yeah.
02:43:33.000 You could fuck up and eat too many brownies and you'd be gone for a long time.
02:43:38.000 That sounds miserable.
02:43:39.000 It is, but it's not, because you get something out of it when it's over.
02:43:43.000 The process is excruciating, but when you come out of it, you just feel so happy.
02:43:49.000 Feel so happy it's over.
02:43:50.000 Yeah, it's like the joke about the dude who's banging his head up against the wall, and somebody says, why are you doing that?
02:43:55.000 He says, because it feels so good when I stop.
02:43:57.000 In a way, but there's no physical damage here, unlike banging your head up against the wall.
02:44:01.000 You're gonna hurt yourself.
02:44:02.000 This is true.
02:44:02.000 Yeah, I don't know. I think there might be easier ways to get to the same wisdom.
02:44:02.000 Maybe, but I think there's also creativity that gets inspired by the edible pot. There's something called the 11-hydroxy metabolite that your body produces.
02:44:19.000 It's so different that most people, when they eat pot, think they've been dosed.
02:44:24.000 Like anybody who's smoked pot before, you give them a brownie and they think, oh my god, there's something in that, and they're convinced, because reality itself just seems to dissolve. And especially inside the tank, there's something about the tank environment: in the absence of any external stimuli, your brain becomes sort of supercharged. Because what you're trying to do when you're just sitting down and concentrating and relaxing is focus on your thoughts,
02:44:50.000 but you're still aware of your body.
02:44:52.000 You're still aware of your elbows touching this desk, your butt touching the chair.
02:44:56.000 There's all these different factors that are...
02:45:02.000 There are stimuli coming into your senses.
02:45:02.000 Whereas in the tank, there's none of that, virtually.
02:45:05.000 It's almost completely eliminated.
02:45:06.000 There's some, but you can phase that stuff out.
02:45:10.000 Like, you can still feel the water a little bit if you think about it.
02:45:13.000 You can still...
02:45:15.000 Sometimes you bump into the wall and you have to center yourself and you have to relax again and make sure you're not moving so you don't touch things, which can kind of dissolve the experience.
02:45:26.000 There are experiences in meditation where you lose your sense of the body in the same way, but that usually comes with more concentration.
02:45:35.000 You have to be very concentrated for that to happen.
02:45:37.000 I feel like you would have that experience and it would be even more intense if you did the exact same thing that you do outside the tank in the tank.
02:45:44.000 I don't think you need any psychedelics in the tank.
02:45:46.000 It's one thing I tell people when they ask me, should I get high before I do it?
02:45:50.000 I'm like, no, just do it.
02:45:52.000 Just do it.
02:45:53.000 If you decide after a while, if you've done it three or four times, you're like, I wonder what it's like if I just take a little bit of a hit of pot and see where it takes me.
02:45:59.000 There's nothing wrong with that.
02:46:00.000 It's not going to hurt you.
02:46:01.000 You know, if you're a type of person who enjoys marijuana or whatever.
02:46:05.000 But the tank alone by itself, just the absence of sensory input, your brain goes to a very, very different place.
02:46:12.000 And as long as you can relax, as long as you don't think too much about the fact that you're in the tank, just concentrate entirely on your thoughts, entirely on your breath. And again, let all those crazy thoughts come, like, where do hamsters live?
02:46:24.000 Like, all that shit.
02:46:26.000 Let all that stuff run wild through your mind.
02:46:28.000 But I feel like, in the tank at least, that gets to a certain spot and it stops existing.
02:46:33.000 And then the psychedelic state takes over.
02:46:36.000 Yeah, well, it depends on what the goal is.
02:46:38.000 I think there can be many different...
02:46:41.000 goals of meditation or quasi-spiritual practice, and they're distinct.
02:46:49.000 So, I mean, the center of the bullseye for me is not suffering unnecessarily, right?
02:46:56.000 And so one thing that mindfulness gives you is that it's compatible with every experience you can have.
02:47:07.000 There's nothing in your experience that isn't an appropriate object of meditation.
02:47:11.000 Most people start with the breath because it's just a very convenient thing to start with.
02:47:15.000 But once you know how to do this particular practice, your goal is to just be clearly aware of whatever your experience is in each moment.
02:47:26.000 Emotions arise, thoughts arise, sounds come in.
02:47:31.000 Your attention is wide open to whatever your experience is.
02:47:35.000 So it's not like...
02:47:36.000 So nothing, in principle, is a distraction.
02:47:38.000 I mean, you could be meditating right next to a construction site, and the sound of the hammers is just as good an object of meditation as the breath or anything else.
02:47:47.000 So everything is included.
02:47:49.000 But the...
02:47:52.000 The superpower you're after, which you actually can acquire through this practice, is to realize that virtually all of your psychological suffering, and arguably,
02:48:08.000 virtually all of your physical suffering, or rather the difference between physical pain and suffering, which are not quite the same thing, is a matter of being lost in thought.
02:48:21.000 It's a matter of thinking without knowing that you're thinking.
02:48:23.000 And what mindfulness does, and really any technique of meditation ultimately should do, is teach you to break the spell of being lost in thought and to notice a thought as a thought.
02:48:35.000 The huge difference is, until you learn how to meditate or do something like meditation, you're just helplessly thinking every moment of your life.
02:48:44.000 You're having a conversation with yourself.
02:48:46.000 You're having content, whether it's imagistic or linguistic, pour forth into consciousness every moment and so incessantly that you don't even notice.
02:48:55.000 It's just white noise.
02:48:57.000 And not only does it completely color your experience moment to moment.
02:49:03.000 So if they're angry thoughts, you're angry.
02:49:05.000 If they're depressed thoughts, you're depressed.
02:49:07.000 If they're sad, you're sad.
02:49:08.000 So you become your thoughts, but you also feel identified.
02:49:14.000 You feel that you are the thinker of your thoughts.
02:49:17.000 You feel like a self.
02:49:18.000 And it's completely structured by this flow of mentation every moment.
02:49:22.000 And it produces everything you do.
02:49:26.000 It produces all of your intentions and your goals and your actions.
02:49:29.000 And he said this about me, and now I'm going to say this.
02:49:31.000 So it's like everything coming out of you is born of this same process.
02:49:36.000 And meditation is a way of recognizing that consciousness, and what you are subjectively, is this prior condition of just...
02:49:45.000 the awareness in which everything is showing up: sounds, sensations, and thoughts.
02:49:50.000 And thoughts can become just other objects of consciousness.
02:49:55.000 And so, I mean, to take even a very basic example of the difference between pain and suffering, you can feel...
02:50:06.000 very strong physical pain, unpleasant pain, and just be aware of it.
02:50:14.000 The sense that it's unbearable is virtually always untrue because in that moment you've already borne it.
02:50:20.000 The feeling that something's unbearable is really the fear of having to experience it in the next moment in the future.
02:50:28.000 Because you're always like, if someone drives a nail into your knee, well, that sounds like it's unbearable, but every moment you're feeling it, you're bearing it.
02:50:40.000 What you're thinking about is the last moment and the next moment, and you're thinking about when am I going to get some relief, and what's the cure, and how badly is my knee injured?
02:50:51.000 You're worried about the future, continuously, and you're not noticing the automaticity of thought that is amplifying the negativity of the experience in that moment.
02:51:05.000 You can have super intense sensation which is either pleasant or unpleasant depending on the conceptual frame you've put around it.
02:51:17.000 So for instance, if you had this massive sense of soreness in your shoulder, you would experience it very differently if it was, A, the result of you deadlifting more than you ever had in your life and you were proud of it,
02:51:33.000 right?
02:51:34.000 B, probably cancer and you're waiting for the biopsy results and you're worried about this is the thing that's going to kill you.
02:51:45.000 Or, C, you're getting Rolfed, you know, like some deep tissue massage, and it hurts like hell, but you actually totally understand the source of the pain, and you know it's going to be gone the moment the guy pulls his elbow back, right?
02:51:56.000 So it could be the exact same sensation in each one of those, but the conceptual frame you have around it totally dictates the level of psychological suffering, or it can dictate the total absence of psychological suffering.
02:52:09.000 Now, we were talking before the podcast started about your apps, and we were talking about the amount of different meditation exercises on the apps.
02:52:18.000 Like, what kind of different meditation exercises are there if you're talking about just concentrating on mindfulness and breathing?
02:52:26.000 As it turns out, you can iterate off of that basic exercise to infinity, essentially.
02:52:36.000 Because you talk about not only...
02:52:39.000 I don't want to get too ahead of myself.
02:52:42.000 But basically, the basic instructions are the ones we listed before: you're feeling your breath coming in, and then when you get lost, you start again.
02:52:49.000 But then you can add onto that.
02:52:52.000 So one big thing to add on is something called mental noting.
02:52:56.000 So you're breathing in and out, you're feeling your breath, and then you get distracted by a huge wave of anger.
02:53:03.000 Generally speaking, when we get hit by a wave of anger, we just inhabit the anger.
02:53:07.000 We become angry.
02:53:08.000 There's no buffer between the stimulus and our response to it.
02:53:12.000 But there's this little technique you can do of just making a little mental note of, oh, that's anger.
02:53:20.000 Welcome to my show!
02:53:46.000 Sam is going to be doing all the teaching on his app, and on my app, since I'm not a teacher, we have experts coming in, like Joseph Goldstein, who's, again, a friend of both Sam and me. And each teacher has their own emphasis,
02:54:04.000 and you then start talking about applied meditation.
02:54:07.000 So how do I use it in my everyday life?
02:54:11.000 How do I use it if really what I want to do right now is control my eating?
02:54:16.000 So meditation, for example, we have a course on the app that talks about using it to not overeat.
02:54:21.000 By the way, I'm terrible at this.
02:54:23.000 But you can use your mindfulness, your ability to know what's happening in your head in any given moment without getting carried away by it, to not overeat.
02:54:31.000 Notice, oh, I'm having this urge right now to eat, as I did last night, an entire bag of malted chocolate in my hotel room, but I can ride that urge and not do the thing that I know is stupid.
02:54:45.000 So anyway, that's just a little taste of how you can take meditation and bring it in kind of numerous directions.
02:54:53.000 Do you guys feel competitive?
02:54:55.000 With both of your apps?
02:54:56.000 No, his isn't even out yet.
02:54:57.000 Mine's not out yet.
02:54:58.000 No, I feel not yet.
02:55:01.000 When his comes out and completely cannibalizes mine, then yeah.
02:55:05.000 We'll see how happy he is for me.
02:55:07.000 10% happier?
02:55:08.000 I'll be like negative 500% happier.
02:55:11.000 What's 10% of zero?
02:55:13.000 Yeah, exactly.
02:55:14.000 No, I actually think I'm of the view, you know, now that I've been in this meditation app business for a little while, I don't think it's...
02:55:22.000 I don't think the business model is that there's just one huge app that everybody uses and maybe there's some distant second.
02:55:30.000 I actually think it's a little bit more like fast food.
02:55:32.000 I think there's going to be a bunch of big players and you may switch back and forth.
02:55:38.000 Does one need an app?
02:55:40.000 No.
02:55:41.000 No.
02:55:41.000 No, you don't.
02:56:04.000 And because distraction is just continually the problem, I mean, you're either meditating or you're distracted.
02:56:10.000 You're either aware of what's happening at that moment or you're lost in thought.
02:56:16.000 And that's true throughout your life.
02:56:18.000 I mean, you're either...
02:56:23.000 hearing what I'm saying right now, or you're thinking about something else and you don't know it, right?
02:56:23.000 Or you're either reading the book you're intending to read or your mind is wandering and you're going to have to read that paragraph again.
02:56:30.000 So this failure to concentrate, this failure to be able to focus on what you're intending to focus on is just this universal problem of human consciousness.
02:56:41.000 And so meditation trains that and other benefits follow, but the...
02:56:49.000 Having a voice in your head reminding you that you're even attempting to meditate is very powerful, even if it's your own voice.
02:57:01.000 Listening to a meditation that I recorded, just my own voice reminding me that I'm supposed to be meditating, works like any other voice.
02:57:14.000 So it's a feedback system that you can't really provide for yourself.
02:57:19.000 Although, obviously, you can meditate without an app, and most people do.
02:57:26.000 I've spent very little time meditating with apps.
02:57:28.000 I just think they're very useful.
02:57:30.000 But you, you know, both of us started meditating.
02:57:32.000 You started meditating well before I did, but we both started before apps were around.
02:57:39.000 So you can read a book, read a good book, and learn how to meditate out of the book.
02:57:45.000 Just basically remember the basic instructions and do it.
02:57:49.000 But it really is useful to have an app, especially for some people, because one of the biggest problems in meditation is this persistent fear that you're not doing it right.
02:57:58.000 And so to have a voice you trust in your ear, just reminding you of the basic instructions, which are so simple but very easy to forget, can be very useful.
02:58:10.000 I like the idea of it being like bicep curls for your mind.
02:58:15.000 Yeah.
02:58:15.000 I mean, you see it in the brain scans.
02:58:19.000 And Sam will correct me where I run afoul of scientific accuracy here.
02:58:25.000 But this simple act of sitting...
02:58:31.000 trying to focus on one thing at a time, and then, when you get distracted, knowing you're distracted and returning to your breath, is changing your brain.
02:58:41.000 You're boosting the muscles, and obviously I'm using the word muscles loosely.
02:58:50.000 You're boosting your focus muscle.
02:58:52.000 There was a study in 2010, I think it was done at Harvard, that took people who had never meditated before, and they scanned their brains.
02:59:00.000 And then they had them do eight weeks of, I think, a half hour a day of meditation.
02:59:03.000 At the end of the eight weeks, they scanned their brains again.
02:59:05.000 What they found was, in the area of the brain associated with self-awareness, the gray matter grew.
02:59:10.000 And in the area of the brain associated with stress, the gray matter shrank.
02:59:15.000 That, to me, is pretty compelling.
02:59:16.000 That is.
02:59:17.000 Beautiful.
02:59:19.000 Gentlemen, we just did three hours.
02:59:20.000 Wow.
02:59:21.000 Flew by.
02:59:22.000 Yes, it did.
02:59:22.000 Thank you very much.
02:59:24.000 Thank you.
02:59:24.000 My pleasure.
02:59:24.000 It was a lot of fun.
02:59:25.000 It was fun to meet you.
02:59:26.000 As always.
02:59:26.000 Yeah.
02:59:27.000 All right, everybody.
02:59:28.000 That's it.
02:59:29.000 Go do something else.
02:59:31.000 Bye.