The Joe Rogan Experience - June 21, 2013


Joe Rogan Experience #369 - Jason Silva, Duncan Trussell, Ari Shaffir


Episode Stats

Length: 1 hour and 59 minutes
Words per Minute: 207.66
Word Count: 24,815
Sentence Count: 2,400
Misogynist Sentences: 64
Hate Speech Sentences: 60


Summary

In this episode of the Joe Rogan Experience Podcast, the boys talk about the future of the internet and how it's going to change the way we live. They talk about a new service that lets you weigh packages at home with your own digital scale, and about how you can get cancer from licking stamps. They also talk about making and selling purses online, and why you should be careful about who you order them from. Joe talks about why he thinks the future is going to be better than the past, what the boys would like to see happen in the world of the future, and whether or not we're all going to end up in a post office. They also talk a little about how the future might differ from the present, and the pros and cons of using the internet as a way of life. This episode is brought to you by Stamps.com and Hover, and is sponsored by Shaffir's Legs. If you like what you hear, hit subscribe, leave a review on Apple Podcasts, share the episode with your friends and family, and tag us on Instagram. Thanks for all the love and support. We'll see you next week with a new episode of the Joe Rogan Experience Podcast.


Transcript

00:00:03.000 Hello, you dirty freak bitches.
00:00:05.000 Welcome to the new world.
00:00:07.000 Welcome to the future.
00:00:09.000 Welcome to the Joe Rogan Experience Podcast.
00:00:11.000 This episode is brought to you by Stamps.com.
00:00:15.000 If you go to Stamps.com...
00:00:18.000 Hello, you dirty freak bitches.
00:00:19.000 Wow, look how long it takes.
00:00:20.000 What a delay.
00:00:21.000 You guys are living in the past.
00:00:22.000 That's a long time.
00:00:23.000 You guys are living in the past.
00:00:24.000 People listening, this shit already happened.
00:00:27.000 If you go...
00:00:28.000 What?
00:00:30.000 Stamps.com is a service that allows you to measure stuff at home with your own scale and then...
00:00:40.000 Was it?
00:00:42.000 Yeah.
00:00:43.000 No, that's exactly what it is.
00:00:43.000 Send shit through the internet, man.
00:00:45.000 You guys are high.
00:00:46.000 This is rude.
00:00:47.000 You weigh things.
00:00:48.000 You measure things.
00:00:49.000 You measure the weight.
00:00:50.000 Is that rude to say measure the weight?
00:00:53.000 That's what it is.
00:00:54.000 You measure the weight, you fucks.
00:00:56.000 First of all, how dare you?
00:00:58.000 All of you.
00:00:59.000 I was thinking it sounded like a true term.
00:01:01.000 For some reason, when I'm typing in JRE, it's not sending me to that.
00:01:05.000 How are you getting there?
00:01:06.000 Joe?
00:01:07.000 What do you type it in?
00:01:09.000 You type JRE, right?
00:01:11.000 No, I just do stamps.com.
00:01:13.000 Stamps.com forward slash Joe.
00:01:14.000 Right.
00:01:15.000 No.
00:01:16.000 No?
00:01:16.000 No, that's not it.
00:01:17.000 No.
00:01:17.000 How did you get...
00:01:18.000 It's stamps.com.
00:01:19.000 You enter in the code word JRE, right?
00:01:21.000 Yeah.
00:01:23.000 If you buy any of the kitty cat t-shirts that Brian Redban sells, that's how he makes them.
00:01:28.000 He sends them, rather, through stamps.com.
00:01:30.000 If you have a small business and you're trying to go to the post office and wait in line and get your shit weighed and put different labels on boxes.
00:01:41.000 You're going to go crazy.
00:01:43.000 It's not healthy.
00:01:44.000 This way, you do it all on your home computer.
00:01:47.000 Super simple.
00:01:48.000 They give you a free weight with a scale.
00:01:50.000 They give you a scale.
00:01:52.000 It's a $110 value if you enter in the code word J-R-E. And there's no risk choice.
00:02:01.000 Say it again?
00:02:01.000 No risk trial, plus a $110 bonus offer, including the digital scale, and up to $55 of free postage.
00:02:09.000 Wow.
00:02:09.000 So they give you a sweet deal.
00:02:10.000 And if you're selling shit, it's a really easy way to deal with things, as opposed to hiring someone to handle it.
00:02:17.000 You can actually handle it yourself.
00:02:18.000 And if you're not selling shit, what, do you want to work for somebody forever?
00:02:22.000 Just go make some macrame and sell it.
00:02:24.000 Sell purses online.
00:02:25.000 It's the future.
00:02:26.000 Also, you can get cancer from licking stamps.
00:02:29.000 Oh, yeah.
00:02:30.000 Yeah, you want to wet a sponge, you fucks.
00:02:32.000 Don't be licking shit.
00:02:33.000 You remember when you had to lick stamps?
00:02:35.000 You freak bitches.
00:02:35.000 Yeah, I remember that.
00:02:37.000 1986. That was a bad time, man.
00:02:39.000 That was a bad time.
00:02:41.000 Stamps.com.
00:02:41.000 Everyone's mouths were sticking together.
00:02:43.000 Code word J-R-E. We're also brought to you by Hover.
00:02:47.000 Hover is an internet domain name company that is owned by the people that own...
00:02:55.000 Is it Hover.com forward slash Joe Rogan?
00:02:57.000 Is that what it is?
00:02:59.000 Hover.com forward slash Rogan.
00:03:01.000 Hover.com forward slash Rogan.
00:03:05.000 I think it's slash Joe.
00:03:06.000 You've got to get all these people to make the same one.
00:03:08.000 No, it is.
00:03:09.000 Just all make the same one.
00:03:10.000 Yeah.
00:03:10.000 I know.
00:03:10.000 They can get that together.
00:03:11.000 They can code that in no time.
00:03:12.000 Because they don't want to pretend that each other exists.
00:03:15.000 They're like, if you're a single guy and you've got a bunch of girlfriends, they don't want to know that they have other girlfriends.
00:03:19.000 Nobody wants to think that, why should we change our shit to JRE? You know, just because Squarespace.com uses that.
00:03:26.000 I mean, what the fuck, man?
00:03:27.000 Hover is the domain company that I use, actually.
00:03:31.000 And if you're like a techno idiot like myself, I do not know how to do anything correctly other than really easy, simple, intuitive shit like Facebook or Twitter or something like that.
00:03:42.000 I don't know how to, you know, if I can register a website, like super easy, and it has things that you normally have to pay for, like WHOIS domain name privacy, which is, that's key.
00:03:56.000 That's key if you order Ari Shaffir's legs, and then, you know, you put that shit online, and you don't want people to know.
00:04:03.000 You're just steady beating off to Ari Shaffir's legs.
00:04:06.000 You feel me, people?
00:04:08.000 I mean, they're sexy.
00:04:09.000 Or, it could be DickPartyInMyMouth.com, which we own.
00:04:14.000 Go to R-E-C-T-S, Joe.
00:04:16.000 Yeah, we own DickPartyInYourMouth.com.
00:04:18.000 Because we were just wanting to make the point that you could hide this.
00:04:22.000 Because it would be embarrassing if people knew that you had DickPartyInYourMouth.com.
00:04:26.000 Right.
00:04:26.000 Hover.com forward slash Rogan is the sweet spot.
00:04:30.000 If you go there, you will save 10% off your domain name registrations.
00:04:35.000 And like I said, it's the same company that owns Ting.
00:04:39.000 Very ethical company.
00:04:41.000 Very reasonable rates.
00:04:42.000 And I use them too.
00:04:43.000 So go.
00:04:44.000 They support the podcast.
00:04:45.000 And we support them.
00:04:47.000 And blah, blah, blah, blah, blah.
00:04:50.000 Anyway, Onnit.com, last sponsor, O-N-N-I-T. Use the code word Rogan.
00:04:55.000 Save 10% off any supplements.
00:04:57.000 That's it.
00:04:58.000 Fuck it.
00:04:58.000 Let's not say anything more.
00:05:00.000 I talk too much.
00:05:01.000 Boom!
00:05:02.000 We got Jason Silva.
00:05:04.000 Boom!
00:05:05.000 We got Ari Shaffir.
00:05:06.000 Boom!
00:05:07.000 We got Duncan Trussell.
00:05:09.000 This is the Uber Podcast.
00:05:11.000 It launches now.
00:05:17.000 The Joe Rogan Experience.
00:05:26.000 Praise Odin, Duncan Trussell.
00:05:27.000 Praise Odin.
00:05:28.000 Praise Odin, Jason Silva.
00:05:30.000 Praise Odin.
00:05:31.000 Praise Odin, Ari Shaffir.
00:05:33.000 We're finally together again.
00:05:35.000 This is a fucking super podcast.
00:05:37.000 Yeah.
00:05:38.000 This is really great.
00:05:39.000 This is as sexy as it gets for me.
00:05:40.000 I'm very happy to be here, man.
00:05:42.000 Thank you for having me back.
00:05:43.000 It was so cool running into you at the Global Future 2045 conference, man.
00:05:48.000 That was badass.
00:05:49.000 Dude, that was awesome.
00:05:50.000 That is your world, dude.
00:05:52.000 Singularity, man.
00:05:53.000 Yeah.
00:05:53.000 Well, I was actually talking with Duncan about this a little while ago.
00:05:56.000 What was your guys' impression?
00:05:57.000 Because, I mean, that's kind of like the Mecca of the Singularity.
00:06:00.000 You guys visited Mecca of the Singularity.
00:06:02.000 And because we visited for the show, we got to talk to some really cool people that probably would never sit down and talk to us.
00:06:10.000 You know, like Aubrey de Grey.
00:06:12.000 I got to talk to Aubrey de Grey, which we'll have that on the show.
00:06:14.000 And, you know, you guys got to talk to...
00:06:15.000 What was the gentleman's name of the robot?
00:06:18.000 Hiroshi Ishiguro.
00:06:20.000 Hiroshi Ishiguro?
00:06:21.000 Ishiguro.
00:06:23.000 Crazy robot man.
00:06:24.000 Really weird.
00:06:25.000 Did you talk to the one that didn't have limbs?
00:06:27.000 No.
00:06:28.000 The little baby one that talks back to you.
00:06:30.000 Like somewhat limbs.
00:06:32.000 That one freaked me out because that one responds to you in conversation.
00:06:36.000 It understands what you say and it responds to you.
00:06:38.000 So you're like, hey, how you doing?
00:06:39.000 And it goes, good, how are you?
00:06:41.000 And then it's like, good, man.
00:06:42.000 Where are you from?
00:06:43.000 He's like, I'm from Tokyo.
00:06:44.000 And I'm like, oh cool, did you like to travel here?
00:06:46.000 He's like, yeah, it was fun.
00:06:47.000 But here's the thing.
00:06:48.000 It's this like humanoid baby with no limbs, which feels like the beginning of like a creepy sci-fi movie that starts out with showing like the first AI, but it's limbless.
00:06:57.000 So it's this like weird thing.
00:06:59.000 Like, so we're going to build them, but they're going to be like a little circus freaks without limbs.
00:07:03.000 It just felt really honking.
00:07:05.000 Would it be weirder if it was limbless or if it had limbs that couldn't move?
00:07:10.000 Either or...
00:07:10.000 These can move.
00:07:11.000 They can hug you with his nubs.
00:07:13.000 Right, with his nubs.
00:07:13.000 But that's why it was freaky.
00:07:15.000 That is fucked up, man.
00:07:16.000 Because it responds to you like a human, and it's cute.
00:07:19.000 You immediately anthropomorphize it, and you start to respond to it like it's alive.
00:07:23.000 And at some point, it'll be so good at responding that by all measures that we know about when it comes to knowing if you have a subjective experience or if you have a subjective...
00:07:31.000 We'll just believe that they're conscious.
00:07:33.000 And at that point, are they just going to be these...
00:07:35.000 Living, thinking things that we keep limbless behind a rope so people can throw money and look at it.
00:07:42.000 I mean, it just freaked me out, the presentation of it.
00:07:44.000 I didn't think it was a compelling idea.
00:07:46.000 I think one thing that people haven't thought of with robots yet is that they think that they're going to be confined to one form.
00:07:52.000 I think the...
00:07:53.000 Reality is going to be that they're going to be able to disassemble into individual droplets if they want to.
00:07:59.000 They're going to be these self-assembling things that can kind of...
00:08:02.000 Like the Terminator, remember?
00:08:04.000 That's it.
00:08:05.000 That's what it's going to be.
00:08:06.000 But think about that.
00:08:07.000 In a way, it's kind of like time-lapsing reality because what are atoms if not self-assembling entities that link up with other atoms that become cells, and then those cells self-assemble into tissues and organisms.
00:08:20.000 I mean, the whole story of the universe, the opposite of entropy is extropy, right?
00:08:26.000 The things that move towards greater complexity and self-assemble.
00:08:29.000 So when people talk about robots or nanotechnology, it's like an acceleration of it so that it becomes discernible to us.
00:08:34.000 From this scale.
00:08:35.000 And this is funny because...
00:08:37.000 That already exists.
00:08:37.000 When McKenna talks about his DMT experience, he calls them self-transforming machine elves.
00:08:43.000 So it's almost as though he's come in contact with a future version of what these androids are going to be.
00:08:49.000 Well, just stop if you think if you accelerated life.
00:08:52.000 I mean, time is all...
00:08:53.000 It's all our perspective.
00:08:55.000 Because if you accelerated our life, human life, the experience of humanity...
00:08:59.000 But accelerated at times a billion.
00:09:02.000 And you got to see the first single cell become multicellular and on and on and on.
00:09:07.000 Cities rise and people fly and then the earth crash again.
00:09:10.000 You would get to see it all in some sort of a psychedelic trip that only takes three minutes.
00:09:15.000 And you would see the earth form out of the cosmos.
00:09:20.000 That's psychedelic.
00:09:22.000 That might as well be a mushroom trip.
00:09:23.000 100%.
00:09:24.000 It's just a perspective thing.
00:09:26.000 It's a time thing.
00:09:27.000 It's all perspective.
00:09:28.000 Even when you see time lapse of trees growing.
00:09:30.000 You see the tree and it aims towards the sun.
00:09:32.000 The plants aim towards the sun.
00:09:34.000 The flowers blossom.
00:09:35.000 It's kind of crazy.
00:09:36.000 It's almost like agency itself, which is what they call the life force, that there's agency.
00:09:41.000 Even plants seem to have it through time lapse.
00:09:43.000 You see Agency.
00:09:45.000 Intent.
00:09:46.000 It wants to go towards something.
00:09:48.000 Kevin Kelly, the trippy co-founder of Wired, in his book What Technology Wants, he calls technology the technium.
00:09:55.000 He says it's the seventh kingdom of life and that it also has wants and needs.
00:10:00.000 And he says that if we were able to zoom out and remove ourselves from being the co-participants in creating technology, it would actually look like the technology itself is self-assembling and has a direction, like the time-lapsing of plants.
00:10:11.000 Which is crazy.
00:10:12.000 And then when McKenna was tripping out and he starts talking about singularities, hello, hear the echo with Kurzweil and McKenna.
00:10:19.000 McKenna's tripping on DMT and talking about singularities.
00:10:22.000 He's talking about universes that engender novelty, universes that allow the sprouting of new possibility.
00:10:27.000 It's the same thing that Kevin Kelly's writing about in his technology book.
00:10:30.000 So you see like the respected technologists writing these books about what's happening.
00:10:35.000 You know, even Eric Schmidt, The Age of Augmented Humanity and Google.
00:10:38.000 Google represents the literalization of the psychedelic dream.
00:10:41.000 You know, the literalization of the idea that we are expanding our minds with these technologies, whether they be chemical technologies or whether they be these external technologies.
00:10:49.000 And what you're saying, this is why it's hilarious when you hear people start railing against, it's unnatural!
00:10:56.000 We shouldn't do it!
00:10:57.000 Because what you're saying is like, no, actually there appears to be some form of transcendent, invisible architecture that all things grow upon in a similar way, whether it's plants, technology, humans.
00:11:09.000 It just stretches on this invisible framework and reveals what's hidden and underneath all things, which seems to be this ever-perfecting, ever-complexifying, harmonious expression.
00:11:21.000 That's right.
00:11:22.000 And Pierre Teilhard de Chardin, who was a famous Jesuit priest, called it the Omega Point.
00:11:26.000 He called it the Omega Point.
00:11:27.000 He got pushed out of the church because he basically...
00:11:30.000 He sort of divinized the idea of the singularity, and he was using the language of God, but he was talking about this move towards complexity and the phenomenon of man, and man was the point in which evolution became self-aware and started directing its own evolution.
00:11:41.000 Doesn't that echo what we were talking about at the Futures Conference?
00:11:44.000 So you see these echoes, you see these patterns that connect, you know?
00:11:47.000 The whole idea of cyberdelics, cybernetics and computers, and then psychedelics and chemical technologies, and they collide in what's known as cyberdelic.
00:11:55.000 That all started in the 60s and 70s in Silicon Valley, when the computer scientists were tripping on LSD and working on creative problems.
00:12:03.000 Xerox PARC, Augmenting Human Intelligence.
00:12:05.000 There's a book by John Markoff called What the Dormouse Said, which talks about where that came from.
00:12:10.000 I mean, you have to think that these people were out of their minds when they were conceiving of a world in which these computers could be wirelessly sending our thoughts across time and space at the speed of light, and that we're all going to be connected and see our faces on these machines.
00:12:22.000 I mean, you have to be, in a way, psychologically or metaphorically tripping to even think so far outside the box.
00:12:29.000 That's why Da Vinci was so fascinating.
00:12:31.000 A lot of the stuff that he came up with really didn't come to fruition in that form, but you could see that he was thinking of these concepts way ahead of everybody else.
00:12:42.000 What a fascinating, fascinating guy that must have been.
00:12:45.000 Oh, completely.
00:12:46.000 Envisioning the future, knowing that he's just stuck with these fucking apes.
00:12:49.000 Do you think he could have any regular conversation with anybody?
00:12:52.000 I doubt it.
00:12:53.000 Do you think you can walk along and pretend that you cared about what happened to the Coliseum last night?
00:12:56.000 Did you hear about the potatoes all tainted?
00:13:00.000 The potatoes are no good.
00:13:02.000 We're going to riot.
00:13:03.000 We can fly!
00:13:05.000 He's drawing a fucking helicopter in his backyard.
00:13:07.000 Yeah.
00:13:08.000 Yeah.
00:13:09.000 Can you imagine?
00:13:09.000 That's so far ahead of the curve.
00:13:12.000 But it stands to reason that some people are born with bigger dicks and other people have larger ears.
00:13:17.000 Some people just have a part of the brain or the ability to tune into creativity.
00:13:22.000 Who's Da Vinci's teacher?
00:13:24.000 Was he a wealthy guy?
00:13:26.000 Did he come from a background where he had time for leisure?
00:13:29.000 I do not know.
00:13:29.000 I do not know any of the history of Da Vinci.
00:13:31.000 He didn't make any money at these things.
00:13:33.000 I was reading an article today that said even though human beings evolved about 200,000 years ago, the first art, the first signs of religion or contemplative thinking didn't appear until the cave paintings that are 70,000 years old.
00:13:44.000 So if we had the same brains for 200,000 years, but you didn't see the beginning of humanness or imagination until about 70,000 years ago, why did it take so long if we had the same brains?
00:13:53.000 And the idea is that it's like Maslow's Hierarchy of Needs.
00:13:56.000 Those first 100,000 years, we didn't even have enough food.
00:13:58.000 We didn't have, like, any kind of organized society.
00:14:01.000 It was only when we could afford the leisure time. It's like paying back a loan.
00:14:05.000 You mostly pay interest at first and then one cent on the...
00:14:08.000 Powerful Jew logic.
00:14:09.000 That's what it is.
00:14:10.000 And then eventually you're paying like only two cents of interest and the rest of the stuff.
00:14:13.000 Yeah, so you get it all.
00:14:14.000 So like they took 99% of their time just dealing with staying alive.
00:14:18.000 1% to get to wherever they were going.
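
A minimal sketch of the loan-amortization pattern being described here, assuming a hypothetical fixed-rate loan; the principal, rate, and term below are made-up illustrative numbers, not anything from the episode. It just shows how early payments are almost all interest and later payments are almost all principal.

    # Illustrative only: a hypothetical $30,000 loan at 6% APR over 10 years.
    principal = 30_000.0
    annual_rate = 0.06
    months = 120
    r = annual_rate / 12  # monthly interest rate

    # Standard fixed-payment formula: P * r / (1 - (1 + r)^-n)
    payment = principal * r / (1 - (1 + r) ** -months)

    balance = principal
    for month in range(1, months + 1):
        interest = balance * r            # interest owed this month
        toward_principal = payment - interest
        balance -= toward_principal
        if month in (1, 60, 120):         # sample the start, middle, and end
            print(f"month {month:3d}: interest ${interest:6.2f}, "
                  f"principal ${toward_principal:6.2f}, balance ${balance:9.2f}")
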
00:14:21.000 It's Maslow's hierarchy of needs.
00:14:22.000 And it's like as a collective human society at some point we were able to...
00:14:25.000 We had enough hunters and we had enough organization maybe that we could feed ourselves and it was the beginning of free time.
00:14:30.000 The beginning of eating psychedelic chemicals and making...
00:14:32.000 Cave paintings and shamanic dances and all of that.
00:14:35.000 Free time.
00:14:35.000 Whoever ran to free time is my god.
00:14:37.000 Think about countries today, civilizations today that are living like that.
00:14:40.000 Have you heard about these people in India?
00:14:42.000 There's an island, an uncontacted island off of India.
00:14:45.000 And recently, within the last few years, these fishermen...
00:14:48.000 They inadvertently got drunk on their boat and drifted into the shore, and these people killed them.
00:14:52.000 They killed them, and then the authorities were trying to figure out how to get to the bodies without getting shot at by these people without having to kill them, because there's not that many left.
00:14:59.000 There's maybe like 40 or 50, and they have no contact with other human beings.
00:15:03.000 They are barbarians.
00:15:05.000 They're total, 100% complete savages that are living off the land, fishing with homemade nets made with twine.
00:15:12.000 They're controlling themselves.
00:15:13.000 Who are you to put them under your thumb?
00:15:15.000 Yeah, well, it's not only that.
00:15:16.000 They won't even go in there to retaliate against murder, which is fascinating to me.
00:15:21.000 It's like the anthropologists are so concerned with keeping this intact and studying the civilization in some sort of a way.
00:15:29.000 Really, they were attacked.
00:15:30.000 They're almost like their own country, and they were attacked.
00:15:32.000 Well, no, no.
00:15:33.000 Some foreign person came on.
00:15:35.000 Boat washed ashore and they just killed the guy.
00:15:36.000 They didn't even talk to him first.
00:15:38.000 You know, I mean, he might have been a respected member of the community within six months.
00:15:41.000 You know, they didn't even give him a chance.
00:15:42.000 Hello, can you help me out?
00:15:43.000 My rudder seems broken.
00:15:45.000 And there's tales of cannibalism, but it's hard to substantiate.
00:15:49.000 But it's not outside the realm of possibility.
00:15:51.000 So you might be dealing with cannibals that we allow to kill people because they're so primitive that we don't want to fuck up what they've got.
00:15:59.000 Because there's only like 40 of them.
00:16:00.000 Why do we want to preserve that?
00:16:02.000 Isn't that interesting?
00:16:02.000 Because there is a sentimental instinct in humanity that wants to preserve everything.
00:16:08.000 A hamburger place goes under, everyone cries.
00:16:11.000 It's like an instinct that people want to do that.
00:16:13.000 And a lot of people say that's part of our humanity, is keeping intact people.
00:16:19.000 Cultures and religions and keeping intact all these ideas because you can see that with this thing that's emerging, this thing that's emerging is so far, is so much bigger than some old desert religion.
00:16:35.000 And the bigger it gets and the more this thing emerges, the more its light begins to shine so brightly that all these silly little superstitious ideas begin to seem...
00:16:45.000 Increasingly ridiculous.
00:16:46.000 Yeah, I mean, our consciousness is becoming so expanded that it's almost like, you know, we're able to, like, all of a sudden turn around and see ourselves.
00:16:53.000 And I know that sounds almost like an impossible shape, but like the first time that we can actually do that and we can see ourselves sort of out of context, just like in 1969 when astronauts first took a picture of the Earth from the vantage point of space, I mean, it's literally like the human mind was folding in on itself, because how is it possible for a human brain that emerged from the Earth to then see the Earth,
00:17:12.000 not from the Earth?
00:17:13.000 Do you know what I mean?
00:17:13.000 It's like the Earth looking at itself.
00:17:15.000 Like, that's what was happening, because we are like a seed of the planet.
00:17:18.000 And then we left the planet and turned around and took a picture.
00:17:21.000 How about the mind fucks of the shots from the Voyager from Orbit?
00:17:25.000 Oh yeah.
00:17:26.000 Where you see Earth as like a tiny little dot.
00:17:30.000 Have you seen Sagan's film?
00:17:31.000 Oh my god!
00:17:32.000 Yeah, yeah, yeah.
00:17:33.000 Amazing.
00:17:34.000 They sent something out there and it's still taking pictures of us.
00:17:37.000 And it's so fucking far away that we're like a little tiny dot.
00:17:41.000 And then you get a sense of what this thing really is.
00:17:43.000 We're nothing compared to all of it.
00:17:44.000 It's insanity.
00:17:46.000 It's insanity.
00:17:47.000 How do you reconcile yourself, though, to that?
00:17:49.000 Think about the average person.
00:17:51.000 How do you accommodate to yourself, to the idea that everything you know, the vast expanse, the repository of experience of your entire life, is a blink of a blink of a blink on a grain of a grain of a grain of a grain.
00:18:03.000 Well, instead of turning it that way, I mean...
00:18:06.000 Yeah, that's it.
00:18:07.000 You don't matter at all.
00:18:08.000 Why not just enjoy yourself?
00:18:09.000 Well, you do matter, though.
00:18:10.000 That's not true there.
00:18:11.000 But you want it to mean something.
00:18:12.000 You need a narrative.
00:18:12.000 It does.
00:18:13.000 Yeah, I want to fly.
00:18:14.000 No, it does.
00:18:15.000 It does mean something.
00:18:16.000 It means something to you right now.
00:18:17.000 It's tangible.
00:18:18.000 It's real.
00:18:18.000 It means something to the people that you're in contact with.
00:18:20.000 There's nothing unreal or unnatural about it because it's temporary.
00:18:25.000 But do you have attachments or you don't have attachments?
00:18:27.000 Well, you do.
00:18:28.000 It's natural.
00:18:28.000 But this connection that you have to this greater gigantic thing doesn't negate the small moments in your life.
00:18:36.000 It doesn't negate video games that you enjoy.
00:18:38.000 It doesn't negate finding the perfect porn to jerk off to.
00:18:41.000 It doesn't negate a sandwich that you enjoy.
00:18:44.000 That's what I'm saying.
00:18:44.000 Just have fun.
00:18:44.000 Just give yourself happiness until you're gone and we'll remember you.
00:18:48.000 But I don't agree that it doesn't matter.
00:18:50.000 Because it does matter.
00:18:51.000 It matters to you right now.
00:18:52.000 And although that seems ridiculous.
00:18:55.000 It's significant.
00:18:55.000 Yeah.
00:18:55.000 So does life itself.
00:18:56.000 So does every breath you take.
00:18:58.000 Well, hold your breath, stupid.
00:18:59.000 You don't think that breathing is important?
00:19:00.000 Hold your breath.
00:19:01.000 Of course it's important.
00:19:02.000 Don't be silly.
00:19:04.000 The existential ideas can overwhelm the reality of the situation.
00:19:08.000 And the reality of the situation is you would like to stay alive, and for the most part, it's fun.
00:19:12.000 And if it's not fun, you've managed your life incorrectly.
00:19:16.000 But you know what's interesting about what you just said is that you've eloquently stated something that's actually very difficult for most people to experience.
00:19:24.000 Most people either have to have, like, they have to be half asleep, which means ignore the overwhelming universe and just be barely present.
00:19:33.000 And then other people are awake to this overwhelming universe.
00:19:37.000 You know, Brian's right here and he can hear you.
00:19:39.000 Hey.
00:19:39.000 You know, you're talking about him like he's not here.
00:19:41.000 That's for real.
00:19:44.000 He's jabbing at Redban.
00:19:46.000 No, we're not.
00:19:47.000 We're just bringing him into the conversation.
00:19:48.000 No, but it's like, you know, Albert Camus, the existentialist, said life should be lived to the point of tears, right?
00:19:55.000 Everything has been figured out except how to live.
00:19:57.000 I heard this dude was into onions and hot sauce.
00:19:59.000 He was just a freak like that.
00:20:01.000 Oh, yeah?
00:20:01.000 Camus?
00:20:02.000 Yeah, it was a big misunderstanding.
00:20:03.000 He was at onions and hot sauce aficionado, and everybody was like, oh, he's just, like, really deep.
00:20:08.000 You know how he died?
00:20:11.000 He had a train ticket that he was going to take a train to get to his destination, and his friends were like, no, let us drive you.
00:20:17.000 And they got in an accident?
00:20:18.000 Yeah.
00:20:19.000 It was like a bad, terrible decision.
00:20:21.000 But go on with your Camus quote, because I love it.
00:20:23.000 My mother died yesterday.
00:20:25.000 It might have been the day before.
00:20:25.000 I can't remember.
00:20:26.000 What's that, Ari?
00:20:27.000 My mother died yesterday, or it might have been the day before, I can't remember.
00:20:30.000 Oh yeah, I remember that.
00:20:33.000 Just disconnected.
00:20:34.000 Completely.
00:20:35.000 I remember reading it and I found it really depressing.
00:20:37.000 Well, I guess my whole question is, okay, so in the face of an infinite universe, with our minds we can ponder something close to the infinite, yet the irony is that we're housed in these heart-pumping, breath-gasping, decaying bodies.
00:20:49.000 You know, Ernest Becker wrote The Denial of Death.
00:20:50.000 It says we are gods with anuses.
00:20:52.000 Think of how brilliant that is.
00:20:54.000 So the idea that we are these transcendent beings, but every single day we are reminded that we have metabolism.
00:20:59.000 It's funny that Becker thinks that God doesn't have an anus.
00:21:03.000 How does he know?
00:21:04.000 What an asshole.
00:21:05.000 He's so presumptuous.
00:21:07.000 But still, how do you reconsider?
00:21:09.000 It's the only species that can really lose sleep over the fact that we are mortal beings.
00:21:14.000 We can barely sustain the here and now because we know that one day we might be dead.
00:21:19.000 And so what do we do?
00:21:19.000 Do we just lose ourselves in diversions and sex and drugs?
00:21:23.000 Well, here's my problem with this whole line of thinking.
00:21:25.000 Here's my problem with this whole line of thinking.
00:21:27.000 What does the average person do?
00:21:29.000 What does the average person do?
00:21:29.000 How about turn the question inward and say, what do you do?
00:21:33.000 And tell everybody what you do because that's how we figure it out through you telling me how you're managing it, I tell you.
00:21:40.000 But when you start going, what does the average person do?
00:21:42.000 Well, the reality is we're all the average person.
00:21:44.000 We're all the average person.
00:21:45.000 The average person varies radically.
00:21:46.000 In our insignificance especially.
00:21:49.000 Yes, absolutely, in our insignificance we are all the average person, 100%.
00:21:52.000 Yeah, when someone says that all men are created equal, not really, but yes.
00:21:56.000 Not really in this experience, but yes.
00:21:59.000 There's Einstein's, there's Stephen Hawking's, there's fucking Lou Ferrigno's.
00:22:02.000 There's a lot of weird people in this world.
00:22:03.000 But I think there's a responsibility as technology emerges and science begins to show us the truth of reality, these responsibilities begin to emerge that create ethical dilemmas for societies, which is when you have large swaths of the human population being controlled by tyrannical,
fundamentalist religious people, who are basing everything they do on a phantasmal being that clearly doesn't exist and outdated rituals that are just rotting.
00:22:34.000 They made Galileo apologize.
00:22:35.000 Yes.
00:22:36.000 What do you do?
00:22:37.000 Because in those situations, at some point, it's like, well...
00:22:40.000 You do have a right.
00:22:42.000 Obviously, there's freedom of religion.
00:22:43.000 You want to give people freedom of religion.
00:22:45.000 But simultaneously, it's like, well, but why are you cutting girls' clitorises off?
00:22:49.000 You know what I mean?
00:22:50.000 It's almost like we need an upgrade.
00:22:54.000 Religion was a technology that at least informed people with the illusion of meaning so that they could...
00:23:01.000 As they say, you can live for a week without food, three days without water, but not a minute without hope.
00:23:06.000 So that gave us hope.
00:23:07.000 And Ernest Becker says that was the first solution.
00:23:08.000 I can hang on for a minute.
00:23:09.000 Yeah, that was the first solution.
00:23:10.000 Whoever said that never ate a pot brownie.
00:23:13.000 Because there's fucking hours with no hope.
00:23:15.000 And you get through it.
00:23:17.000 You get through that shit.
00:23:19.000 You just get through it and you learn when you hit the other side.
00:23:23.000 You learn when you hit the other side.
00:23:24.000 Well, that's interesting.
00:23:25.000 We talked about ayahuasca and DMT, which, as Eric Davis says, baseline reality dissolves.
00:23:31.000 There's a complete ego death and a new reality emerges.
00:23:34.000 Or it just fucks with your visual cortex and you add it all with your ego and your psyche and your creativity.
00:23:40.000 And you just, you have the ability to generate images inside the mind's eye with your creativity and you just create a fucking world of geometric patterns because that's how your eyeball works when it's over-flooded with too much DMT. That's how your visual cortex responds.
00:23:53.000 That's what happens.
00:23:53.000 When there's ten times the normal dose in.
00:23:55.000 Maybe that too.
00:23:56.000 Just like when you have a cut, it clots.
00:23:58.000 Yeah.
00:23:59.000 That's just how it responds to that.
00:24:00.000 This is what I always say when people say, how do you know whether or not a DMT trip is real?
00:24:05.000 You're pretending that that really happened, you really did speak with intelligent beings from the planet.
00:24:09.000 The reality is, whether or not you really did go to another dimension and speak with these super-intelligent beings who are made out of love, or whether it didn't happen at all, either way, you experience the same thing.
00:24:22.000 Right.
00:24:23.000 Sure.
00:24:23.000 You get to hear the message.
00:24:24.000 You get to see the exact same thing as if it was real.
00:24:27.000 Yes.
00:24:27.000 Like literally.
00:24:28.000 You know where that was explored brilliantly?
00:24:30.000 Do you guys remember the movie Contact?
00:24:32.000 Yes.
00:24:32.000 Based on the Carl Sagan book?
00:24:34.000 So she is a secular scientist.
00:24:36.000 She doesn't want to hang out with Matthew McConaughey who's a priest.
00:24:40.000 She doesn't believe in God.
00:24:40.000 In fact, at first they try not to let her go.
00:24:42.000 See, that's the brilliance of the Matthew McConaughey cock.
00:24:45.000 Because even though she didn't want any of that, he still fucked her.
00:24:48.000 Yeah, he did.
00:24:49.000 Boom, son.
00:24:51.000 That's right.
00:24:51.000 She risked pregnancy.
00:24:52.000 But think of what happened at the end of the movie.
00:24:53.000 She went, right, through the wormhole and went to this, saw these alien civilizations, had an experience that sounded like a religious experience, except it was, you know, a scientist.
00:25:01.000 She didn't go anywhere, though.
00:25:03.000 Well, from the perspective of Earth, through the wormhole, it just looked like the ship just ran right through.
00:25:08.000 So nobody believed her, but she had the experience.
00:25:10.000 So all of a sudden, she was sounding like the religious people or like the people that were tripping that said they saw the elves.
00:25:14.000 And all of a sudden, she, as the scientist, had to cast doubt on her own experience, because she saw the evidence.
00:25:20.000 Yeah, but like when you try DMT, you know already what that is.
00:25:23.000 It doesn't just hit you.
00:25:23.000 You're like, what the fuck?
00:25:24.000 And your mind just explodes.
00:25:25.000 You know you're taking something.
00:25:26.000 So you can make reason of it.
00:25:28.000 I could totally see like, well, what was it?
00:25:29.000 That must be God.
00:25:30.000 And let me tell other people about this.
00:25:32.000 I could totally see that as a way to start it.
00:25:34.000 There's scholars in Jerusalem, legitimate scholars now, that believe that that's what Moses saw when he saw the burning bush.
00:25:39.000 What?
00:25:39.000 That he saw the acacia tree.
00:25:41.000 The acacia tree, which is rich in DMT, and it's really common to that area.
00:25:45.000 That makes a lot of sense.
00:25:45.000 The idea is that this bush, or the extraction of this bush, was burning, and that's how he had this religious experience and saw God.
00:25:51.000 It's real dry there.
00:25:52.000 He had a DMT trip.
00:25:53.000 That makes a lot of sense.
00:25:55.000 He caught a puff of this tree on fire?
00:25:57.000 That's one of the theories.
00:25:58.000 Wow, I can see that.
00:25:59.000 That's just a theory, but the primary focus of this theory is that Moses most likely had a psychedelic experience.
00:26:07.000 Yeah.
00:26:07.000 Because we know that these substances are not new.
00:26:10.000 And plus, he said he got the tablets that God gave him, but he smashed them before he got down the hill.
00:26:15.000 He was mad because they made the golden calf.
00:26:17.000 He smashed them. Who ever saw the fucking tablets?
00:26:19.000 Yeah, he was tripping his dick off.
00:26:20.000 And that researcher does a lot of mushrooms.
00:26:22.000 You know?
00:26:22.000 Yeah.
00:26:23.000 He could have been like, yeah, or no, I smashed them.
00:26:25.000 Destroy the evidence.
00:26:26.000 Yeah.
00:26:27.000 Or destroy the non-evidence.
00:26:29.000 Everybody talking about it is high.
00:26:30.000 But that's very similar to when Terrence McKenna talks about the stoned ape hypothesis, obviously.
00:26:34.000 But there's a guy called Rich Doyle.
00:26:36.000 He wrote a book called Darwin's Pharmacy, and it's about sex, plants, and the evolution of the noosphere.
00:26:40.000 And he talks about psychedelic substances as information technologies that manipulate our ability to capture and manage attention.
00:26:47.000 So they create what he calls...
00:26:49.000 Infinite resonance with set and setting.
00:26:51.000 Say that again.
00:26:52.000 In other words, infinite resonance with set and setting.
00:26:54.000 That's why when people talk about psychedelic experiences, they're like, make sure you're in a good headspace, make sure you're in a good set and setting.
00:26:59.000 Because if you have infinite resonance with set and setting...
00:27:01.000 What's resonant?
00:27:02.000 Resonance.
00:27:02.000 Like, you become completely porous to whatever is around you.
00:27:05.000 This motherfucker's talking mumbo-jumbo.
00:27:07.000 No, I'm hearing it, but I want to hear it all.
00:27:10.000 You no longer have the ability.
00:27:12.000 It's like turning up the volume on existence.
00:27:14.000 Turning up the volume so loud that if you're in a magnificent place looking at a tree, you might think you're looking at God.
00:27:20.000 If you're in an uncomfortable situation, you're going to go down to the pits of hell.
00:27:23.000 It means you know that thing when a microphone gets too loud and you get feedback?
00:27:26.000 If you're in a shitty place tripping, you can get a feedback.
00:27:29.000 Infinite feedback just keeps going.
00:27:31.000 Or you could be in a beautiful place that elicits feelings of calm and sereneness, and then you can feel like you're getting licked by God.
00:27:40.000 Like infinite orgasm.
00:27:40.000 Happy Shroomfest, by the way, everybody.
00:27:43.000 That's going to be the real problem when people are able to decide what state of consciousness they experience at that moment, not even earning it by being so scared that you take mushrooms.
00:27:52.000 Because every time I've taken mushrooms, I've been scared.
00:27:54.000 You should be.
00:27:54.000 You're about to go through some change.
00:27:56.000 Yeah.
00:27:56.000 It should scare you.
00:27:57.000 I get scared when I eat a pot cookie.
00:27:58.000 You lose control.
00:27:58.000 When I eat the last crumb of a pot cookie, I'm like, oh shit.
00:28:02.000 Yeah, you gotta deal with this.
00:28:03.000 You have that moment.
00:28:03.000 I start tapping my feet going, motherfucker.
00:28:05.000 How bad is it gonna get?
00:28:05.000 What did I do?
00:28:06.000 Yeah.
00:28:07.000 Yeah, it's true.
00:28:08.000 Now, is that why?
00:28:09.000 Is that because it's like, if you think of the metaphor of skydiving, it's the moment where you've already jumped?
00:28:13.000 And you're just like, I don't know what it's gonna be like.
00:28:14.000 I honestly think that there's an accelerating process of self-development.
00:28:18.000 That we all go through.
00:28:19.000 And that if we are not improving as a human being, we feel shitty.
00:28:22.000 We don't feel good.
00:28:23.000 I don't feel good unless I'm getting my shit together.
00:28:26.000 That's just a fact.
00:28:27.000 I try to be a better person today than I was yesterday, for real.
00:28:30.000 And it sounds stupid, but it's because everybody says it and very few people legitimately, totally practice it.
00:28:36.000 They fall in and out.
00:28:37.000 But I feel like that's also one of the reasons why people aren't that happy.
00:28:41.000 I feel like if you're not improving yourself and getting rid of your bullshit in life, it's very difficult to feel good.
00:28:49.000 It's very difficult to be enjoying it if you have all these issues that you're not dealing with.
00:28:53.000 Like about you as a person or you with your job or you...
00:28:57.000 I hear more and more people talking about it when they say, like, and I know I've been bad at that and I'm actually trying to make a note to not be like that anymore.
00:29:04.000 Well, it might be telling the truth.
00:29:05.000 I mean, it's not a process where you either get it right or you don't get it right.
00:29:08.000 Some people fall back and forth.
00:29:10.000 I mean, how many people do we know that used to drink and then drank again and stopped drinking for a long time?
00:29:14.000 It's like a little battle sometimes with people to try to improve their shit.
00:29:18.000 Ram Dass compares it to floating in the ocean and your head's bobbing up and down.
00:29:24.000 Sometimes you see the shore and sometimes you don't.
00:29:26.000 That's what it's like.
00:29:27.000 It's like sometimes you're there and you see it, but you can't beat yourself up when you go down.
00:29:32.000 You have to have faith that you'll come back up again.
00:29:34.000 But what you're saying is dead on, man, because if I'm feeling like shit, what you're saying is not some broad, big thing you've got to do.
00:29:40.000 If I'm feeling like shit, nine times out of ten, it's just because I've got dishes in the sink, or I didn't go jogging, or I didn't, like, sweep my kitchen.
00:29:49.000 It's not like big...
00:29:50.000 But that would be, like, an example of set and setting right there.
00:29:53.000 Other people say that, you know, 99% of your problems will go away if you get a good night's sleep.
00:29:57.000 A lot of them do.
00:29:59.000 Yeah.
00:29:59.000 Because sometimes, you know, even when you're sleeping, you're thinking about things, and you're putting them into perspective.
00:30:04.000 Yeah.
00:30:04.000 I mean, how many times have you been really upset when you go to bed, and by the time the night is over, you're like, eh, whatever.
00:30:09.000 Oh, right.
00:30:10.000 You let it go.
00:30:10.000 Sleep on it.
00:30:11.000 Well, it has to do with regulating your emotions.
00:30:13.000 A gig you lost or something, something that happened.
00:30:15.000 Like, I'm sure you've had a few...
00:30:16.000 So don't try to go to bed before you make the phone call or whoever.
00:30:19.000 Talk to you about this, because you've had a few instances where, like, the amazing racist stuff, like, lost you gigs.
00:30:25.000 Yeah, I just had to get a thing canceled in London, Ontario.
00:30:28.000 That's right.
00:30:29.000 Yeah.
00:30:29.000 Somebody...
00:30:30.000 Yeah.
00:30:30.000 Now, that's got to be frustrating as fuck.
00:30:33.000 It's so frustrating.
00:30:33.000 How do you let go of that?
00:30:34.000 Like, when you're dealing with something like that, how do you let go of that?
00:30:37.000 Well, sometimes you say, I stop myself.
00:30:39.000 I go, stop.
00:30:40.000 Hold on.
00:30:41.000 Can griping about this change it in any way?
00:30:44.000 No.
00:30:44.000 All right.
00:30:45.000 All right.
00:30:45.000 Move on.
00:30:46.000 So it's like a little moment of dialogue with yourself and a decision.
00:30:48.000 Yeah.
00:30:49.000 If I'm in a long line at the airport and I'm going to miss my flight, I'm like, oh, come on.
00:30:55.000 I keep looking up ahead.
00:30:56.000 I'm like...
00:30:56.000 Are you going to go to the front and ask, say, I'm about to miss my flight, or are you not?
00:30:59.000 If you're not, then stop griping.
00:31:01.000 Then just let it wash over you.
00:31:02.000 That's very interesting.
00:31:02.000 I have a friend who was starting a company.
00:31:03.000 He wants to create a watch that regulates emotion because people talk about the age of the quantified self where we're going to have all these devices that are going to be... A watch that what, emotions?
00:31:12.000 That measures emotion.
00:31:14.000 Oh, really?
00:31:14.000 You have everybody know if you're annoyed.
00:31:16.000 Well, it's not going to tell other people.
00:31:18.000 It'll tell you.
00:31:18.000 You get to set the code.
00:31:19.000 It'll measure your biofeedback rhythms and it'll give you feedback to tell you how you're feeling so that then you can change your behavior if that's what needs to be done.
00:31:29.000 Feedback loop seems to be the best way to reprogram reflex responses.
00:31:33.000 They're saying that the best way to stop people from speeding is not from actually pulling them over, but from those sensors that show your speed and tell you that as you pass them by.
00:31:42.000 So receiving feedback is the best way to change behavior.
00:31:46.000 So in terms of moving towards experience design and the age of the quantified self, if you know you're eating something unhealthy, maybe you won't eat it.
00:31:53.000 If you're constantly reminded about how you're feeling and what you're doing, you can really kind of improve yourself.
00:31:59.000 Technologically enhanced mindfulness is what that is.
00:32:02.000 Because it's like, how often are you wandering around in a state of absolute terror, pretending everything's fine, I'm totally fine, having a great day, but inside you're like, Oh, fuck, man.
00:32:13.000 If I don't make enough money, I'm not going to make rent.
00:32:14.000 But you're pretending to be happy.
00:32:16.000 So if you have a watch that's flashing, you are terrified.
00:32:20.000 You are tense right now.
00:32:22.000 You are tense right now.
00:32:23.000 Talk about it.
00:32:23.000 Deal with it.
00:32:24.000 One of the most important things that anybody could ever understand in this life is what happens and what it feels like when you're out of debt.
00:32:32.000 Because how many of us, by the time we're 20, whatever the fuck we are, we have so much money that we can't pay off.
00:32:40.000 Credit cards, student loans.
00:32:41.000 I just last week paid off my college loans.
00:32:43.000 Congratulations.
00:32:44.000 I'm 39. Isn't that crazy?
00:32:45.000 What a crazy society we have.
00:32:47.000 That everybody, by the time they're 30 years old, is in some kind of debt unless you're super rich.
00:32:52.000 Slavery.
00:32:53.000 A lot of education debt, right?
00:32:55.000 A lot of education debt.
00:32:57.000 Medical debt.
00:32:58.000 Medical school debt is insane.
00:33:00.000 Not medical school, medical debt.
00:33:02.000 That too.
00:33:03.000 Those are two systems that definitely need an upgrade.
00:33:06.000 Fuck yes.
00:33:06.000 And this is where we can actually connect that to the ideas that Ray Kurzweil is talking about.
00:33:10.000 He's saying that healthcare is about to undergo the same transformation that information technology went through.
00:33:14.000 So that means that the whole idea of how people cure themselves or fix diseases, this is all going to become like something that's a part of our smartphone and part of our day-to-day life.
00:33:21.000 So it's going to change that broken system of healthcare.
00:33:24.000 And education through the internet, free education, people around the world coming online, joining the global conversation, getting free education.
00:33:30.000 There's a Harvard professor who's offering all his classes online.
00:33:34.000 Yeah, exactly.
00:33:36.000 It's only the beginning of that.
00:33:37.000 And you have the power of decentralized peer networks that can be leveraged to solve all these problems.
00:33:41.000 What does that mean, decentralized peer networks?
00:33:44.000 It's the same thing that we talk about self-organization and emergence when cells link up together and become organisms.
00:33:49.000 So you have electronic self-organization happening with social media.
00:33:54.000 When there's leaderless protests that just spontaneously self-organize.
00:33:58.000 And these technologies allow these decentralized peer networks that don't have leaders and don't have a head that you can cut off.
00:34:04.000 Potentially something like that.
00:34:06.000 But, so, Steven Johnson's book, Future Perfect, talks about how those can be leveraged to, like, solve problems, you know?
00:34:10.000 Like cure diseases, like leverage the collective intelligence.
00:34:13.000 How, what, the decentralized peer networks can be used?
00:34:15.000 Yeah, decentralized peer networks.
00:34:15.000 By the way, I love that term because I think one of the big fucking problems is the need that people have to claim responsibility.
for innovation. And this is one of the horrors of our age: that the thing which makes people get rich is what motivates people. People aren't motivated... Like, when people are trying to cure cancer, you'd like to believe that the reason they're trying to cure cancer is out of some kind of altruistic desire, until you see them going to the Supreme Court to try to patent genes because they want to profit off of their research. Yeah, that's pretty weird, right? It seems like the idea should be that the... I know,
00:34:50.000 but that's why Pfizer is giving the researchers the money to research, is so that eventually they'll be able to show a return on that investment.
00:34:57.000 Right.
00:34:57.000 It's just a strange.
00:34:58.000 Yeah, it is strange.
00:34:59.000 It's very strange.
00:35:00.000 And you know what the real problem with all of it is?
00:35:01.000 That it's not psychedelic.
00:35:03.000 That's the real problem is that you can make money and create things, but you have to have a psychedelic mindset in order for society to move forward like emotionally.
00:35:11.000 That's where they put it all behind.
00:35:12.000 Friendship-wise.
00:35:14.000 If it's not doing that, then it's going to get caught up in the ones and zeros, collecting the numbers.
00:35:18.000 It can be prosperous.
00:35:19.000 And still be ethical.
00:35:21.000 It's just, it's not.
00:35:22.000 And the reason why it's not is because when you have a corporation, you get that diffusion of responsibility thing going on, which is the opposite of what's psychedelic.
00:35:32.000 Yes, absolutely.
00:35:33.000 Exactly, man.
00:35:34.000 It's where the individual has no responsibility for the mass of individuals, whereas the psychedelic experience is connected to all individuals.
00:35:41.000 The mass of individuals is connected entirely.
00:35:43.000 Yes.
00:35:43.000 That's well put.
00:35:44.000 I'm going to Joshua Tree on Sunday.
00:35:45.000 Dude, tell me what you're going to do there, son.
00:35:48.000 Let me guess.
00:35:48.000 I think you're probably going to do your taxes.
00:35:51.000 I'm not going to do my taxes.
00:35:51.000 I'm going to do mushrooms under the supermoon.
00:35:53.000 Get your freak on.
00:35:54.000 Is there a supermoon on Sunday?
00:35:56.000 Sunday night.
00:35:56.000 Middle of Shroomfest.
00:35:57.000 Let me tell you something, son.
00:35:58.000 When you do mushrooms, it's over the supermoon.
00:36:01.000 There's no non-super moon.
00:36:03.000 When you do mushrooms and you realize that there is a fucking planet one quarter of the size of ours and it's literally floating above our heads.
00:36:14.000 Are you doing a tent?
00:36:15.000 Are you doing a tent?
00:36:17.000 Hotel room?
00:36:17.000 No, tent.
00:36:18.000 Tent out in Joshua.
00:36:20.000 I've never been there too.
00:36:21.000 It should be cool.
00:36:22.000 Duncan, what are you getting, buddy?
00:36:23.000 Getting beer for you and me.
00:36:24.000 Yeah, alright.
00:36:25.000 That's what I'm talking about.
00:36:26.000 I'll have one.
00:36:27.000 Beer?
00:36:28.000 Try one of these Black Butte porters.
00:36:30.000 They're delish.
00:36:31.000 Yeah.
00:36:32.000 Delicious.
00:36:33.000 I tried drinking with a 23-year-old recently.
00:36:36.000 Those people are straight out of college.
00:36:37.000 They're in training.
00:36:38.000 Damn.
00:36:39.000 They bring shots around like it's nothing, and I'm barfing in the street, and they're still going.
00:36:42.000 They don't give a fuck.
00:36:43.000 God, I can't.
00:36:45.000 Yeah, your liver's old, son.
00:36:46.000 Yeah.
00:36:47.000 It's almost over for you.
00:36:47.000 Have you ever had Eric Davis on the show?
00:36:50.000 Eric Davis is...
00:36:51.000 He wrote a book called TechGnosis that I think you guys would love.
00:36:55.000 He kind of writes about the mystical undertones of technology.
00:36:59.000 So, again, like the whole psychedelic cybernetics thing.
00:37:04.000 And I just...
00:37:05.000 I don't know.
00:37:05.000 I wondered if you guys ever had him.
00:37:07.000 No, I never heard of him.
00:37:08.000 Now I know.
00:37:09.000 Boom.
00:37:10.000 Sounds good.
00:37:11.000 Sounds like perfect stuff.
00:37:12.000 Right up my alley.
00:37:13.000 Yeah, there's so many people like that now.
00:37:15.000 Thank you, sir.
00:37:15.000 That's the beautiful thing about this time.
00:37:17.000 It's like, you know, every day there's some new guy who's got a new video or a new song or a new...
00:37:21.000 There's so many fucking pieces of something that are being created.
00:37:25.000 Oh, yeah.
00:37:26.000 There's more comedians now than ever.
00:37:27.000 There's access to all of it.
00:37:29.000 But there's also an infinite amount of content.
00:37:30.000 How do you decide what to pay attention to?
00:37:32.000 It's really hard.
00:37:33.000 To get banned with anxiety?
00:37:34.000 Well, that's the beauty of something like the Death Squad, which was the stupid nickname that we all call ourselves.
00:37:41.000 You know that if Ari tells you someone's funny, he's not going to be lying about it.
00:37:45.000 Right.
00:37:45.000 You guys see so many people that aren't funny, you know.
00:37:48.000 Yeah, but that's good.
00:37:49.000 If Ari tells me to watch some guy, I know he's really funny.
00:37:52.000 So the audience knows that too.
00:37:54.000 And that's sort of like the beauty of having a bunch of people that are like-minded.
00:37:59.000 You tune in.
00:38:00.000 And we're all different.
00:38:01.000 There's a lot of alt people that would hate our humor.
00:38:05.000 And the last thing they want to do is be hanging around with us.
00:38:07.000 And they're not wrong.
00:38:08.000 Yeah, they're just not into it.
00:38:09.000 They have a different thing.
00:38:12.000 But you find your thing, whatever it is.
00:38:14.000 Whether it's Johnny Cash or Taylor Swift.
00:38:17.000 You fucking follow it.
00:38:18.000 And then, you know, whatever's connected to her.
00:38:21.000 Those people are going to find it for you.
00:38:23.000 Yeah, I think that's how you do it in today's day and age.
00:38:26.000 And that's the beauty of us being able to introduce people like Bert Kreischer or anybody else that we brought onto our podcast that all of a sudden other people can go, oh, that guy's really funny.
00:38:37.000 Like, oh, and he's friends with this guy.
00:38:39.000 Oh, that guy's really funny too.
00:38:40.000 Jason Silva?
00:38:41.000 Yeah, yeah, yeah.
00:38:41.000 How many people discovered you?
00:38:43.000 This sort of stuff.
00:38:44.000 This was one of the biggest things I ever did.
00:38:46.000 Come on your podcast, dude.
00:38:48.000 That's amazing.
00:38:49.000 It's a year later, and I still get people saying I'd like to come back and hang out with you guys and have a mind job, which has been amazing.
00:38:54.000 Well, it's a two-way street, though, because the whole reason why the podcast is interesting is because people like you are interested in coming on.
00:39:00.000 Oh, right.
00:39:01.000 If you only had just me talking after a while, I'd be repeating stories like a motherfucker.
00:39:05.000 I'd just be spouting nonsense at this point.
00:39:07.000 But you know what's interesting about what you're saying?
00:39:10.000 There's a book that talks about the importance of an information diet, you know, because we live in a world now where there is like an infinite amount of content out there, more than 10,000 hours of video uploaded to YouTube every hour, some crazy number like that.
00:39:22.000 The most difficult thing, I think, becomes deciding who are going to be your information diet filters.
00:39:29.000 Like in this case, the death squad, the peer networks that you are connected to, the people you follow on Twitter.
00:39:36.000 A lot of people trust NBC. By the way, if you only watch traditional media, you realize how limited it is.
00:39:42.000 Brian Redband's leaving us, everybody, so he's got to go do an Icehouse show.
00:39:47.000 When is the new shirt coming out?
00:39:49.000 Pre-order should be up next week, so next two weeks or so.
00:39:51.000 It's my favorite of all time.
00:39:53.000 It's head and shoulders.
00:39:54.000 It's awesome.
00:39:55.000 And the other one is awesome, too.
00:39:57.000 But this new one is on another level.
00:39:59.000 It's dope.
00:40:00.000 I want it.
00:40:00.000 I want a pair of underwear.
00:40:02.000 I want it.
00:40:02.000 I want a pair of underwear with that on.
00:40:04.000 Jesus Christ.
00:40:05.000 I don't give a fuck, dude.
00:40:06.000 I'll wear cloth over my dick.
00:40:08.000 Not on the podcast, Joe.
00:40:10.000 I'll let you smell them.
00:40:13.000 Jason, what kind of car do you drive, dude?
00:40:15.000 I don't have a car anymore.
00:40:17.000 Are you one of those motherfuckers?
00:40:19.000 I'm in New York.
00:40:19.000 I used to have one here.
00:40:21.000 You don't drive cars?
00:40:24.000 I mean, when I lived here, I had a car.
00:40:25.000 I wanted to ask you about that Tesla.
00:40:27.000 Oh, I don't have it.
00:40:28.000 Oh my god, is that the coolest thing of all time?
00:40:30.000 I know, it's kind of incredible.
00:40:31.000 And they're only going to get better.
00:40:32.000 I think they're about to release the third or fourth generation now?
00:40:34.000 I don't know.
00:40:35.000 Martine Rothblatt had one.
00:40:36.000 And when I interviewed her and I got to meet her robot, Bina48.
00:40:40.000 You ever seen that?
00:40:41.000 No.
00:40:41.000 Her spouse is a robot.
00:40:43.000 Martine Rothblatt, fascinating story.
00:40:45.000 Was a man, was a man, founded Sirius Satellite Radio, got a sex change, became a woman, and then created a robot.
00:40:52.000 That is a direct copy, a duplicate of her spouse.
00:40:56.000 And it's creepy how good it looks.
00:40:59.000 Where's the spouse?
00:40:59.000 Well, there.
00:41:00.000 She's there, too.
00:41:01.000 She's there as well.
00:41:02.000 She just loves her, so she made a robot for her.
00:41:04.000 That's so sweet.
00:41:06.000 Yeah, it's interesting.
00:41:06.000 It's like Liberace making that guy change his face to him.
00:41:09.000 Whoa.
00:41:10.000 Did he really do that?
00:41:11.000 He made him get plastic surgery to look more like Liberace.
00:41:15.000 That's hilarious.
00:41:16.000 Is that in the HBO thing?
00:41:18.000 Yeah.
00:41:18.000 I always wondered why Matt Damon and Michael Douglas were willing to do that.
00:41:21.000 They have pictures of those two blowing each other.
00:41:24.000 That's the only thing that makes sense.
00:41:27.000 Like if they came up to you and you were Matt Damon and they said, hey, listen, man.
00:41:31.000 I know those Bourne Identity movies, they're a really big movie.
00:41:34.000 Listen, you're going to play Liberace's Butt Buddy.
00:41:38.000 And best of all, made for TV. Yeah, made for TV. No, it's on HBO. Liberace's longtime lover.
00:41:45.000 Oh, I see, not in the movies.
00:41:46.000 Butt Buddy's very offensive, by the way, to my gay friends, and I apologize for that.
00:41:48.000 It's Butt Pirate.
00:41:50.000 You guys are so immature.
00:41:52.000 I mean, if you said that a girl was your vagina pal, would that be rude?
00:41:57.000 Vagina pal.
00:41:58.000 No, it would just be embarrassing.
00:41:59.000 That would be rude, but isn't it interesting that a girl can call a guy a dick and there's no repercussions at all?
00:42:04.000 There are to my heart.
00:42:06.000 And she can even say, I'm here to get some good Jason Silva dick, and you wouldn't have any problem with that.
00:42:12.000 You'd be like, yep, I'm dishing it out, honey.
00:42:14.000 But if a guy says, oh, I'm here to get some sweet Mary pussy, she'd be like, what the fuck?
00:42:19.000 It's because they're the gatekeepers.
00:42:20.000 Really?
00:42:20.000 That's who I am?
00:42:21.000 I'm sweet Mary pussy?
00:42:22.000 You don't think that's gross?
00:42:24.000 And you're like, oh my god, you're not really my friend.
00:42:27.000 Can't even joke around with you.
00:42:28.000 Jason, what do you think is going to happen with these sex androids?
00:42:31.000 What do you see as the future?
00:42:33.000 Well, I think the sex technology will probably be the pioneer.
00:42:37.000 I mean, just like the porn industry pioneered DVDs.
00:42:40.000 When DVDs were first a thing, who do you think was doing the most advanced, multi-angle, interactive DVD experiences? The porn industry.
00:42:47.000 As soon as they said phones weren't going to go with that old style of video, all of porn was like, cool, we'll update.
00:42:52.000 The iPhone never did.
00:42:54.000 The rest of them were like, yep.
00:42:55.000 We're always going to be driven by our sexual desires.
00:42:58.000 Kurzweil says we'll be able to tap into each other's nervous systems and become each other.
00:43:02.000 When we have sex with our girlfriend.
00:43:03.000 Oh, that's combining with someone.
00:43:04.000 Like, imagine actually merging.
00:43:06.000 Well, but no, but some people say, you know, the Kama Sutra talks about we've been wanting to merge with our lovers at the beginning of time.
00:43:11.000 We want to become one.
00:43:12.000 But imagine if we can actually scramble our nervous systems together because we have the Demolition Man device or whatever.
00:43:18.000 Remember that?
00:43:18.000 That could be a real mindfuck, though, if you just find out that you are, like, the worst in bed ever.
00:43:23.000 You feel what it's like to be fucked by you.
00:43:25.000 Holy shit!
00:43:26.000 Look at that!
00:43:27.000 That's very eloquent.
00:43:28.000 Look at me.
00:43:28.000 I'm this...
00:43:29.000 Oh, why am I doing that?
00:43:31.000 That's going to be for 16 to 25 years.
00:43:33.000 You can grow from that experience.
00:43:34.000 Even better.
00:43:35.000 It would be like self-knowledge.
00:43:36.000 Take it!
00:43:37.000 Even better.
00:43:38.000 Take it!
00:43:39.000 So you don't like it.
00:43:41.000 Take it!
00:43:42.000 Your eyes would pop over like, why am I choking myself?
00:43:46.000 Why won't you let me breathe, me?
00:43:48.000 Or even worse, what if you get into her mind, like if you can access your girlfriend's needs and desires, and you go, I want to know what you want, and you get into her mind.
00:43:56.000 It's just a river of black cock.
00:43:59.000 Black cock!
00:44:00.000 Just a sleepy river of disembodied black cocks just shooting sperm like a psychedelic dream.
00:44:07.000 You're riding a river of dark black cocks.
00:44:10.000 It's not black cock, it's slippery like eels.
00:44:12.000 It's a river of pit bull cocks.
00:44:14.000 Yeah.
00:44:14.000 Well, it's good to allow for a lot of creativity in sort of our sexual consciousness.
00:44:19.000 Yeah, it's on the side.
00:44:20.000 Let's get ahead with it.
00:44:21.000 It's fragmented to a multiplicity of dimensions that we can't even imagine.
00:44:24.000 It's the psychedelicization of sexuality.
00:44:27.000 Isn't it amazing, though, that that drives a lot of our technology lately, but we still have to repress it societally.
00:44:33.000 Sex?
00:44:34.000 It still has to be looked on as embarrassing.
00:44:37.000 The reproductive force.
00:44:38.000 It's the wind in the sails of humanity.
00:44:41.000 It drives everything.
00:44:44.000 There's actually a book about this.
00:44:45.000 It's called The Mating Mind.
00:44:47.000 It was written by Jeffrey Miller.
00:44:48.000 He says that the brain's extraordinary capacities for creativity, for discourse, for everything we do, even building airplanes and iPhones, are ultimately our glorified version of the peacock feather.
00:44:58.000 It's our version of the bird song.
00:45:00.000 It's just us charming to capture and manage the attention of those potential mates.
00:45:05.000 It's a way of saying, I'm poetic, I built that skyscraper, or I wrote you the song.
00:45:08.000 So is every joke every comedian's ever told.
00:45:10.000 Everything.
00:45:11.000 But what's interesting is that the side effect of this sexual creativity is also responsible for everything wonderful we've created.
00:45:18.000 So it talks about sexuality ultimately as a creative act.
00:45:22.000 It is.
00:45:23.000 Because it's about reproduction.
00:45:24.000 But ultimately on a cultural level and on an idea sex level, like the whole fucking thing about reproduction seems to be like...
00:45:31.000 Right.
00:45:32.000 We used to think we're getting lured by nature into making babies, but now we see we're being lured by nature into making spaceships.
00:45:38.000 That's why the pill changed so much in society too.
00:45:42.000 That's why the singularity is a cosmic orgasm, is the best way to describe what the singularity is.
00:45:47.000 It's the universe waking up.
00:45:48.000 It's us impregnating the universe with intelligence.
00:45:50.000 That's the great Marshall McLuhan quote.
00:45:52.000 You know that quote?
00:45:53.000 Human beings are the sex organs of the machine world.
00:45:55.000 Wow.
00:45:56.000 Brilliant.
00:45:57.000 That's brilliant.
00:45:58.000 Marshall McLuhan nailed that shit in the 60s.
00:46:00.000 Of the machine world, like the machines are controlling us.
00:46:02.000 Before computers.
00:46:02.000 He figured that out before computers.
00:46:04.000 He was an actual genius.
00:46:06.000 He also said, first we build the tools, then they build us.
00:46:09.000 Right.
00:46:09.000 Think about that.
00:46:09.000 It's happening, that's true.
00:46:11.000 Of course it's happening.
00:46:11.000 It's growing through.
00:46:12.000 And again, much like Da Vinci, what a mindfuck it must have been to be operating like that back in the 50s.
00:46:16.000 You know what Da Vinci also did?
00:46:17.000 He figured out how to draw curves.
00:46:19.000 How to draw rounded edges in art.
00:46:22.000 No one knew how to make it.
00:46:24.000 It's like to go around and it loses...
00:46:27.000 You know how, like, a path would go up to nothing?
00:46:29.000 Nobody knew how to do that?
00:46:30.000 He was the one who...
00:46:31.000 I don't know if I remember from high school.
00:46:33.000 Maybe, right?
00:46:35.000 Eh, whatever, let's attribute it to him anyway.
00:46:36.000 Something along those lines.
00:46:37.000 He was a good guy.
00:46:38.000 He deserves it.
00:46:39.000 He put in his hours.
00:46:42.000 I love that stuff, though, because it's like, well, what aren't we doing now that we can do?
00:46:46.000 Oh, yeah, that we'll have, like, for granted, we'll take for granted in a hundred years.
00:46:50.000 Like, what?
00:46:50.000 What do you mean?
00:46:51.000 They didn't walk through walls.
00:46:53.000 Why didn't they?
00:46:54.000 It's true.
00:46:54.000 They walked around every time?
00:46:55.000 Yeah.
00:46:56.000 Why?
00:46:57.000 What a strange world it must have been back then, man.
00:47:00.000 When you could just die if you got sick.
00:47:02.000 Most people just died.
00:47:04.000 50% mortality rate for children.
00:47:07.000 Completely.
00:47:07.000 And there's a lot of people, though, today that think that things are getting worse in the world.
00:47:13.000 It's another one of those mistakes that people make.
00:47:14.000 We're living longer than ever.
00:47:15.000 We're living longer than ever.
00:47:16.000 And there's a guy called Hans Rosling who has these amazing videos on the internet that show every nation in the world, by every measurable indicator of quality of life, has been rising over the last hundred years.
00:47:24.000 No, but it just shows that, contrary to what the media, with its if-it-bleeds-it-leads approach, feeds us, because we have these overactive, fear-based amygdalas that only pay attention to what's wrong, we fail to see everything that's going right.
00:47:36.000 You know what's the most confusing shit?
00:47:38.000 Really hot newscasters telling you horrible things.
00:47:41.000 That's not confusing at all.
00:47:43.000 She's got big tits and I'm scared out of my mind.
00:47:45.000 My dick is hard and I'm ready to run.
00:47:48.000 Everything is together.
00:47:50.000 So rude.
00:47:52.000 Put someone ugly for bad news.
00:47:54.000 Yeah, you're going to tell me some bad news.
00:47:55.000 Get my high school teachers.
00:47:57.000 And then switch them in when they go, and this just in a puppy found alive and healthy.
00:48:01.000 Ah, and then the big tits come out and they're ready to party.
00:48:04.000 But you could tell a lot about Bill O'Reilly.
00:48:07.000 I promise you that Bill O'Reilly loves getting tied up.
00:48:10.000 I can't say that for sure.
00:48:12.000 That's a guess.
00:48:13.000 Whatever legal stuff is, I guess.
00:48:15.000 But I would imagine Bill O'Reilly, because he's always on his show, it's always these beautiful, yet dominating, hot girls that surround him.
00:48:24.000 He likes to be around these types of girls.
00:48:27.000 Yeah, he loves it.
00:48:28.000 You never see the women around him on that show as being submissive to him.
00:48:32.000 They're always kind of tough.
00:48:34.000 Guaranteed they scrub him down.
00:48:35.000 When you think about this, do you masturbate?
00:48:37.000 Scrub him down.
00:48:37.000 You masturbate thinking about Bill O'Reilly?
00:48:39.000 All the time.
00:48:40.000 Do you imagine ever that you were some really fucking stupid guy like Bill O'Reilly?
00:48:43.000 A smart, stupid guy.
00:48:45.000 Like, he's a Harvard graduate dummy.
00:48:47.000 You know, he's one of those guys.
00:48:48.000 I'm gonna go with Jesus.
00:48:49.000 What does he say, his thing about Jesus?
00:48:51.000 He's like, I'm gonna go with the Jesus guy.
00:48:53.000 Why does the tide come in?
00:48:54.000 Why does it go out?
00:48:55.000 Oh, that Dawkins interview when Dawkins, the smile on Dawkins' face is the smile of the Lord of the Rings necromancer as he's crushing just like a little imp or something.
00:49:06.000 What was he saying?
00:49:08.000 The look on his face because he's like, oh.
00:49:09.000 Dawkins, though, for my taste, gets a little too upset.
00:49:13.000 What's that?
00:49:13.000 Richard Dawkins.
00:49:15.000 For my super-genius, atheist, reasonable people talking to cuckoo heads, I like my atheists to be a little bit more relaxed.
00:49:26.000 Completely like...
00:49:26.000 I think it was a great reaction.
00:49:28.000 He's a little on the cunty side.
00:49:30.000 What did he say to him?
00:49:32.000 Dawkins has got a hammer smile.
00:49:35.000 O'Reilly deserves a...
00:49:36.000 Oh, for sure.
00:49:37.000 Don't get me wrong.
00:49:38.000 But I think that Dawkins, at his age, is such a statesman, such a well-respected, brilliant man.
00:49:45.000 He's probably showing off, too.
00:49:46.000 Maybe, perhaps.
00:49:47.000 And maybe also he feels it's his duty to maybe mercenary, go after those guys.
00:49:53.000 Because he is the intellectual voice for the atheists.
00:49:56.000 He's super important in that way.
00:49:58.000 I just wish he would chill.
00:49:59.000 And you know one of the things that I found out about him is?
00:50:01.000 What?
00:50:01.000 No psychedelic drugs in his background.
00:50:03.000 That's true.
00:50:04.000 And he talked about how he maybe would be interested in taking LSD under a very clinical setting to explore the merits of the drug.
00:50:13.000 You know what that says to me?
00:50:14.000 That's where the hole is.
00:50:16.000 That's where the hole's in his game.
00:50:17.000 That's why he comes off cunty.
00:50:18.000 Well, it's interesting because another one of our atheists, intellectuals, Sam Harris, has done psychedelics.
00:50:24.000 In fact, he wrote an essay called Psychedelics and the Meaning of Life.
00:50:27.000 Which was actually a very brilliant piece.
00:50:30.000 He's an interesting guy because he's an atheist, but he has some radical insights about spiritual, subjective experience.
00:50:36.000 He gets lumped in with Islamophobes.
00:50:39.000 Really?
00:50:39.000 I don't find that his writing comes across that way.
00:50:42.000 A lot of people argue that it does.
00:50:43.000 He's a brilliant man and a friend.
00:50:45.000 I really like the guy very much.
00:50:47.000 And I really enjoy talking to him because he's got such a fucking stupid smart mind.
00:50:52.000 Yeah.
00:50:53.000 His brain is just, like, firing, like, at a million hertz.
00:50:57.000 Oh, yeah.
00:50:57.000 But the thing about the label of Islamophobe is, like...
00:51:03.000 The reality is all ideologies that force people into doing violent things are crazy.
00:51:09.000 And to try to pretend they're not to make some people who aren't violent happy seems like intellectually dishonest.
00:51:16.000 And that's where the guy has courage.
00:51:18.000 Of course he has courage.
00:51:19.000 It's not that he's an Islamophobe.
00:51:20.000 Not at all.
00:51:20.000 If Islam was Buddhism, okay, think about Buddhism.
00:51:23.000 And by the way, this is a radical new sect of Buddhism, apparently, that's involved in ethnic cleansing.
00:51:28.000 Oh, yeah.
00:51:29.000 I know what you're talking about.
00:51:30.000 Oh, they're fucking up Buddhism.
00:51:31.000 It's the last safe place after being a Mormon.
00:51:34.000 You know, go to the last safe cult.
00:51:37.000 Yeah, Buddhists get angry.
00:51:37.000 But I mean, apparently.
00:51:39.000 Malaysia?
00:51:40.000 Where is it?
00:51:40.000 Humans are imperfect, man.
00:51:42.000 I think it's something like that.
00:51:42.000 I don't really remember where it was, but humans are so imperfect.
00:51:46.000 You know?
00:51:46.000 And the idea that there's anything wrong with saying that ancient ideologies that involve killing people if they don't believe are fucking bad.
00:51:56.000 I mean, doesn't it say that somewhere?
00:51:58.000 Well, you can't be tolerant of intolerance.
00:52:01.000 I mean, that's the problem with moral relativism and with this fear of, like, passing any kind of judgment because it's a different religion.
00:52:07.000 So what if they...
00:52:08.000 Beat each other to death.
00:52:10.000 Yeah, but it's like, you want to be like, do whatever you want, but don't let someone do something against someone's will to them.
00:52:14.000 100%.
00:52:14.000 You can't tolerate intolerance, and behavior like that obviously is intolerance.
00:52:18.000 That's where the buck stops.
00:52:19.000 Bill Maher has spoken about this a lot.
00:52:20.000 You know what's weird, though?
00:52:22.000 Here's what's weird.
00:52:22.000 You said Bill Maher.
00:52:23.000 Bill Maher gets labeled as an Islamophobe, which I find fascinating because progressives, for some reason, it's almost like they're bullied, so they want to make friends with the bully.
00:52:35.000 So there's this weird progressive thing where you don't criticize Islam, and if you do, you become an Islamophobe.
00:52:41.000 Or if someone is criticizing other religions, that's the first thing they say.
00:52:45.000 Oh, we never criticized Islam.
00:52:46.000 You never criticize Islam.
00:52:47.000 How come you never say shit about Islam?
00:52:49.000 So it becomes this weird sort of polarity.
00:52:52.000 Because they're so gangster, they'll kill you if you draw pictures of Muhammad.
00:52:56.000 They take shit to the next level.
00:52:58.000 So the natural inclination of the biggest pussies on earth, which are the liberals for the most part.
00:53:04.000 Let them do it.
00:53:05.000 They not only let them do it, but support them.
00:53:07.000 You're Islamophobic.
00:53:09.000 You know, Bill Maher is an Islamophobe.
00:53:11.000 Guess what?
00:53:12.000 You should be an you-believe-a-phobe.
00:53:14.000 Whether it's UFOs or Bigfoot or Chupacabras or Islam or Joseph Smith or Jesus Christophobe.
00:53:21.000 You're not a Christophobe because you're against them raping little boys.
00:53:24.000 If you believe anything that you haven't seen yourself or watched on TV, you're an idiot.
00:53:28.000 Well, this is a thing, Joe, this is a thing you were talking about earlier when it comes to the DMT experience or the psychedelic experience.
00:53:35.000 And the question is, does it matter if this is real or not?
00:53:38.000 And I think it matters more than anything if it's real.
00:53:42.000 We must distinguish objective reality from subjective reality.
00:53:47.000 If we can, we should try to understand it.
00:53:50.000 For example...
00:53:52.000 Who was it?
00:53:52.000 I can't remember who it was talking about.
00:53:54.000 Only when it affects someone else, I would argue.
00:53:56.000 But I'm saying it doesn't matter because it's the same experience.
00:54:00.000 The experience is not a tangible, rock-solid, carbon-based, touch-a-table experience.
00:54:06.000 The experience is this spiritual, which I fucking hate to use, but there's no other way to use it, disembodied consciousness.
00:54:13.000 It's a disembodied consciousness experience.
00:54:15.000 Why would that be the same as an experience that's real where you can touch paper?
00:54:19.000 Here's why.
00:54:20.000 Here's why.
00:54:21.000 If you take, let's take, I can't, where is it people go to get healed?
00:54:24.000 Lourdes, I think is what it's called.
00:54:26.000 The water where you go.
00:54:27.000 The water.
00:54:27.000 Sinai is also like that.
00:54:28.000 Yeah.
00:54:29.000 I can't remember who wrote this.
00:54:30.000 I think it might have been Sagan.
00:54:31.000 No, it might have been Feynman.
00:54:33.000 I can't remember which one.
00:54:34.000 Talking about how it's important to understand if this phenomena is real or subjective.
00:54:39.000 Because if it's real, then that means that we should understand what are the properties of these waters?
00:54:44.000 Is it something in the land?
00:54:46.000 Is it something in the air?
00:54:47.000 And if we can understand that, then we can...
00:54:49.000 Help the whole planet with us.
00:54:50.000 In the same way, if the DMT or the psychedelic experience is taking us into a state that is non-subjective, that is external, is actually introducing us to entities or intelligences that somehow exist outside of our own being, it's incredibly important to begin to communicate with them in a real way.
00:55:08.000 But what is real?
00:55:12.000 That's where it becomes.
00:55:13.000 When you're talking about an...
00:55:15.000 Outside of the body experience.
00:55:17.000 An experience that transcends the physical flesh.
00:55:20.000 It could still be real, but not be measurable.
00:55:23.000 It could still be real, but you can't put it in a bucket and throw it on a scale.
00:55:26.000 It's not measurable.
00:55:26.000 It doesn't mean it's not real.
00:55:27.000 It could be a chemical gateway.
00:55:29.000 It could be something we don't have an instrument to measure.
00:55:32.000 We don't have a conceptual framework understanding.
00:55:34.000 But it's still real if it happens.
00:55:35.000 That's what my point is.
00:55:37.000 The idea of the imagination.
00:55:38.000 You imagine something.
00:55:40.000 The imagination is responsible for every fucking thing that a human has ever made.
00:55:44.000 Clothes, this microphone that I'm talking to, this computer that I'm on, clothes that I'm wearing, the car that drove me here, it's all manifested out of the imagination.
00:55:52.000 So the imagination is fucking real as shit.
00:55:54.000 100%.
00:55:55.000 And before you created those things, when you just imagined them, you were conjuring up something that didn't exist.
00:56:02.000 And the fact that we then brought it into existence proves, well, it at least existed as a potentiality.
00:56:06.000 It was allowed by the laws of physics.
00:56:08.000 So then it makes you wonder, what are you tapping into when you're having that kind of vision?
00:56:13.000 That disembodied, you know, reconceptualization of reality.
00:56:16.000 When you live in a world where there is no airplanes and you think that you could build a machine that will fly over the ocean and get you to this other place.
00:56:24.000 Like, to imagine that, to even fantasize about it, if we can utter it, it means that it's possible.
00:56:31.000 Gene Roddenberry invented the iPad.
00:56:33.000 Did he really?
00:56:33.000 Does anybody even know how cell phones work?
00:56:36.000 Yeah, but what did they do in those things?
00:56:37.000 Can you explain to me how a cell phone works?
00:56:38.000 How a cell phone works?
00:56:39.000 Look at stuff.
00:56:40.000 I don't know.
00:56:40.000 Did they swipe?
00:56:42.000 Yeah.
00:56:42.000 They swiped?
00:56:43.000 Yeah, they did it all.
00:56:44.000 Picard.
00:56:45.000 Picard didn't swipe.
00:56:45.000 Oh, Picard wasn't Roddenberry, though, was it?
00:56:46.000 Yeah, I think he did that and then died.
00:56:48.000 Oh, poor guy.
00:56:50.000 Yeah.
00:56:50.000 Well, he had a lot of success.
00:56:51.000 But he didn't get to see the future.
00:56:53.000 You don't get to see Deep Space Nine, you're right.
00:56:55.000 Those motherfuckers.
00:56:56.000 I just think that if something...
00:56:58.000 I think we should try...
00:57:00.000 Be mourning Roddenberry's life because he never saw Deep Space Nine.
00:57:04.000 He never saw the new Battlestar Galactica, which fucking, he's lucky.
00:57:08.000 It's so good.
00:57:09.000 That show kicked his show right in the dick.
00:57:11.000 Oh, yeah.
00:57:11.000 I was going to say it was bad.
00:57:12.000 Star Trek is such shit compared to Battlestar Galactica.
00:57:17.000 Oh, my God.
00:57:17.000 I never watched.
00:57:17.000 Every problem you have, every time you're thinking, like, oh, they're doing this is lame.
00:57:21.000 Within two seasons, it'll pay off.
00:57:23.000 You'll be like, oh.
00:57:24.000 Dude.
00:57:24.000 Battlestar Galactica on the sci-fi channel was the greatest sci-fi show ever.
00:57:31.000 Dealt with it in a real way.
00:57:33.000 Real situations.
00:57:35.000 Game of Thrones has whores and murders.
00:57:37.000 By the way, how hot is that robot bitch?
00:57:39.000 The blonde one?
00:57:40.000 In what show?
00:57:42.000 The hottest.
00:57:43.000 Ridiculous.
00:57:44.000 You take it.
00:57:45.000 I've burned holes in socks.
00:57:46.000 Oh, the Cylons?
00:57:48.000 The Cylons become the hot chicks.
00:57:50.000 That's part of the plot.
00:57:52.000 I don't want to...
00:57:52.000 Spoiler.
00:57:53.000 Spoiler alert.
00:57:53.000 If you haven't seen the DVD series, you've got to get that.
00:57:56.000 Get it right now!
00:57:57.000 How do you have time?
00:57:58.000 I was incredulous.
00:57:59.000 Brian Callen told me about that.
00:58:00.000 I'm like, that's going to suck, dude.
00:58:01.000 It's a remake of a show that was kind of hokey.
00:58:04.000 This show is not hokey at all.
00:58:05.000 Joe, in all seriousness, man, because you're one of the busiest people I know, how do you find time to watch Battlestar Galactica, and Breaking Bad, and Game of Thrones?
00:58:16.000 Well, right now, I don't have hardly any time.
00:58:19.000 But you already watched Battlestar.
00:58:21.000 It was a long time ago.
00:58:22.000 You know, when I was just doing Fear Factor in the UFC, I had way more time.
00:58:26.000 Because back in the days of Breaking Bad, it just started.
00:58:28.000 Things were different back then.
00:58:29.000 I watched most of Breaking Bad while getting tattooed.
00:58:32.000 I watched the first season while getting my left arm done.
00:58:37.000 Right arm.
00:58:38.000 Sorry.
00:58:38.000 Yeah, I'm trying to catch up on Breaking Bad.
00:58:41.000 I fucking love it.
00:58:42.000 It's so good.
00:58:42.000 You have to catch up.
00:58:44.000 You know what's better?
00:58:44.000 So you can talk about it with people.
00:58:45.000 Homeland.
00:58:46.000 Homeland's pretty good.
00:58:47.000 I heard that was better.
00:58:47.000 I watched season one.
00:58:48.000 I'm up to season two.
00:58:50.000 Stunningly good.
00:58:52.000 Stunningly good show.
00:58:53.000 What do you guys think of Walking Dead?
00:58:54.000 It's awesome and sucks at the same time.
00:58:56.000 It's hokey as fuck sometimes.
00:58:57.000 No shit.
00:58:58.000 Every time they resolve a conflict, it's over the way they describe their emotions.
00:59:02.000 This guy...
00:59:03.000 And now it's all done.
00:59:04.000 The guy, the...
00:59:05.000 Spoiler alert.
00:59:06.000 Spoiler alert.
00:59:07.000 The bad guy, the number one bad guy in the third season.
00:59:09.000 The man.
00:59:10.000 Come on, dude.
00:59:10.000 I'm a governor.
00:59:11.000 You guys are teetering on the edge of fucking this whole thing up.
00:59:14.000 You need to regroup, get together as a group of writers, do some mushrooms, and figure out where the fuck you're going from here.
00:59:19.000 Duncan, you told me that.
00:59:20.000 That they had these other writers, and then it got really emotional for a while, and then they came in and said, guys...
00:59:24.000 Everybody get the fuck out, and they hired legit action guys.
00:59:27.000 Here's the problem, I think.
00:59:29.000 The problem is TV, not the writers.
00:59:31.000 It's what TV tends to do to creativity.
00:59:33.000 If you read the comics, the comics are some of the most bleak, horrific things that you've ever seen, where it's like every few pages is a gut punch, where you're like, what the fuck?
00:59:44.000 It's not like this emotional kind of sappy thing.
00:59:48.000 It's like...
00:59:49.000 You are existing in a world where you are going to die probably by being eaten by the undead.
00:59:56.000 Yeah, that's what they always say in Apocalypse.
00:59:57.000 Wait, but wait, but sorry.
00:59:58.000 So what are you saying about that versus the world of TV? I'm saying what happens because TV is like...
01:00:04.000 Oh, we can't make it too dark.
01:00:06.000 We can't kill that character.
01:00:07.000 Well, you think because there's an established business model and we have to keep some kind of stability in the system and we don't want to stuff this up to make you question too much?
01:00:14.000 They didn't like Homicide.
01:00:15.000 Homicide didn't last because they didn't have any clear-cut victories for the good guys and bad guys.
01:00:19.000 I'm going to do a spoiler.
01:00:20.000 Is that why they call it programming?
01:00:21.000 Let me do it.
01:00:22.000 Not with a series, but with the comic books.
01:00:24.000 Can I do a spoiler?
01:00:25.000 Why don't you spoil it?
01:00:26.000 The comic books?
01:00:27.000 Don't spoiler.
01:00:28.000 You just spoiler alert.
01:00:29.000 Say spoiler alert.
01:00:30.000 Spoiler alert.
01:00:31.000 It's a spoiler to me.
01:00:32.000 You're not going to read the comics.
01:00:33.000 How dare you.
01:00:34.000 How dare you pretend I don't read the comics.
01:00:35.000 What are you going to get to them?
01:00:36.000 No, listen to this, bastards.
01:00:37.000 I had comics on my iPad once on a plane, and you mercilessly made fun of me for the entire flight about the fact that I had comics on my iPad.
01:00:48.000 You know why I kept going?
01:00:49.000 Why?
01:00:50.000 Because you responded.
01:00:52.000 I couldn't help myself.
01:00:53.000 You responded.
01:00:54.000 I'm like, oh, I got him dancing.
01:00:57.000 You gotta play dead in front of a bear, man.
01:01:00.000 You can't fight back.
01:01:01.000 No, I'm a bear.
01:01:02.000 I always thought of myself as a twink.
01:01:05.000 I guess my time has come.
01:01:08.000 Yeah, you were fucking dancing.
01:01:11.000 You ever read The Road?
01:01:12.000 That's pretty bleak.
01:01:13.000 The Road is a fucking...
01:01:15.000 You might as well just get punched in the face.
01:01:16.000 Just get kicked in the stomach instead of reading that book.
01:01:18.000 I had to regroup for about a year.
01:01:20.000 Also, this is what movie sent me into depression more than anything.
01:01:24.000 Which one?
01:01:25.000 Revolutionary Road.
01:01:26.000 What's that?
01:01:27.000 That's brilliant, dude.
01:01:28.000 That was with DiCaprio and Kate Winslet.
01:01:31.000 It's about a married couple, and it flashes back and forth between the banality of day-to-day life, like what happens after you get what you want, versus the hopes and dreams of when they first met.
01:01:41.000 God, it made me feel bad about existence for a while.
01:01:44.000 It just didn't read about the bushes.
01:01:46.000 It made you collide against frustrated ambitions and a life of having to settle and settle and settle and settle and to become a stale facsimile of what you wanted to be.
01:01:59.000 It makes you go find your dreams after that.
01:02:00.000 It's one way or the other.
01:02:01.000 The sequence is when they get excited about Paris is the best part of the movie.
01:02:04.000 You're just like, oh yes, they're going to move to Paris.
01:02:06.000 It's going to be amazing.
01:02:07.000 It's going to be like a Richard Linklater film.
01:02:08.000 They're going to be in Europe.
01:02:09.000 It's going to be so good.
01:02:10.000 And then it doesn't happen.
01:02:11.000 I know!
01:02:12.000 I was like, yes!
01:02:13.000 It's going to be awesome for you!
01:02:14.000 By the way, I want to give a report.
01:02:17.000 I'm going to report on what you've been looking at on the laptop during this thing.
01:02:20.000 It's now gone from a series of videos of weird vintage cars, Porsches.
01:02:27.000 At one point, he was just looking at, what do you call it, an accelerometer?
01:02:30.000 You're just looking at a speedometer accelerating.
01:02:35.000 That was like five minutes.
01:02:36.000 You're just looking at a car speeding, and now you're looking at... Well, I don't know if you know this, Duncan, but I'm crazy.
01:02:43.000 What are you looking at?
01:02:44.000 Is he looking at pool cues?
01:02:45.000 He's looking at people playing pool.
01:02:47.000 I have an ADD that you couldn't possibly understand.
01:02:51.000 There's no rhyme or rhythm to it.
01:02:52.000 I need 13 different things going on in my life.
01:02:54.000 It's like the drums on War Pigs.
01:02:56.000 It makes no sense.
01:02:57.000 I am what I am, son.
01:02:58.000 Do you want to see a two-minute...
01:03:01.000 A two-minute trippy video?
01:03:02.000 Yeah.
01:03:03.000 I did a new video that says, we are already cyborgs.
01:03:05.000 It's kind of about the stuff from the conference we were just at.
01:03:08.000 Yeah, but, but, but, let's pass that joint around before we get to that.
01:03:12.000 No gay stuff either, right?
01:03:14.000 No, no, no.
01:03:15.000 Well, I'll get you one.
01:03:16.000 I'll get you your own.
01:03:17.000 How about that?
01:03:17.000 Oh, thank you.
01:03:18.000 Oh, yeah.
01:03:18.000 We live in a world of abundance.
01:03:21.000 We're the new Romans.
01:03:23.000 Yes!
01:03:24.000 Thanks, John.
01:03:25.000 Broken Studios is the new Roman Empire for weed.
01:03:27.000 I don't give a fuck, dude.
01:03:28.000 We're so gangster.
01:03:29.000 We'll barf and come back for more.
01:03:30.000 Yeah, we're like vomitoriums.
01:03:31.000 Did you ever go to one of those?
01:03:32.000 A vomitorium?
01:03:33.000 They exist?
01:03:34.000 No, they just show the old ones in Israel and Jerusalem.
01:03:37.000 Oh, no.
01:03:37.000 They show you where you have to go to vomit and they come right back to party.
01:03:41.000 They left out the part about fucking kids.
01:03:43.000 Vomit?
01:03:44.000 Wait, what's a vomitorium?
01:03:46.000 You just eat and party and drink, just keep going and going, and you're like, oh, I can't eat anymore.
01:03:50.000 You know that moment where you can't eat anymore?
01:03:51.000 And then they vomit?
01:03:52.000 You go to this room where you just get to vomit.
01:03:54.000 And then you come back.
01:03:56.000 It's like a urinal, but instead of peeing, you just vomit out as much as you can of the booze and the food, and you go back to drinking and eating.
01:04:04.000 Yeah, the Romans supposedly did it with feathers.
01:04:06.000 Really?
01:04:07.000 Have you ever heard this before?
01:04:10.000 Romans wanted to party so hard that they were willing to just eat as much as they wanted to and then throw up so they could eat again.
01:04:17.000 The way Itskov thinks of living forever, they thought of partying forever.
01:04:20.000 Right.
01:04:21.000 Well, it's also because mortality was at such a high level back then.
01:04:25.000 Infant mortality was 50%.
01:04:27.000 People were dying left and right in sword fights.
01:04:30.000 Good things.
01:04:31.000 That was some crazy ass times.
01:04:33.000 Yeah, yeah, yeah, yeah, yeah.
01:04:34.000 But how much different is that?
01:04:36.000 How much different?
01:04:37.000 Shit.
01:04:39.000 Emergency.
01:04:40.000 Your wife stood up with free t-shirts.
01:04:42.000 I love it.
01:04:43.000 That's a nice shirt.
01:04:44.000 Oh, is it?
01:04:46.000 That's a big deal.
01:04:47.000 It's a HirePrimate.com shirt.
01:04:49.000 Available at HirePrimate.com.
01:04:51.000 They're also great for mopping up spills.
01:04:53.000 Yeah, you can mop up booze with them.
01:04:56.000 They wash right off.
01:04:57.000 Those salt crystals are great, man.
01:04:59.000 I gotta get some of those.
01:05:00.000 They're cool if you want to bang yoga teachers.
01:05:04.000 Yeah, nobody wants to do that.
01:05:06.000 If you had a house and it was totally set up, you were like, Sat Nam, come into my presence.
01:05:14.000 You had Om on the wall, and these are beside your bed.
01:05:17.000 Dude, you're in!
01:05:18.000 And all you need is some quote from some really obscure Indian guy on the wall.
01:05:23.000 Like, oh, he's my guru.
01:05:24.000 I just have a video of Duncan playing behind the bed.
01:05:27.000 Oh, Duncan gets to start singing shit.
01:05:28.000 It's my apartment.
01:05:29.000 Oh, it's your apartment?
01:05:30.000 I'm describing his apartment to a T.
01:05:33.000 I'm trying to fuck with him.
01:05:34.000 You're like, dude, why are you telling people all my stuff?
01:05:37.000 If Duncan could actually sing them songs, you could sing.
01:05:40.000 You have chants in your head, right?
01:05:42.000 I have chants in my head.
01:05:44.000 The one I'm chanting right now is a great chant because it sounds exactly the way...
01:05:51.000 Nitrous oxide sounds.
01:05:53.000 When you do nitrous oxide, if you chant it long enough, it's the sound of when you get super high.
01:05:58.000 And so the chant is R-A-M, Ram.
01:06:01.000 It's simple.
01:06:02.000 So the chant just goes, Ram, Ram, Ram, Ram, Ram, Ram.
01:06:07.000 Oh, that is when you do whippets.
01:06:12.000 You hear that when you do whippets?
01:06:14.000 When you do whippets, yeah.
01:06:16.000 You're hearing the ohm.
01:06:17.000 What you're hearing is your brain cells committing suicide.
01:06:20.000 Please.
01:06:21.000 You're hearing your brain cells dance.
01:06:24.000 You're hearing your brain cells tap dance.
01:06:26.000 Doesn't that give you brain damage?
01:06:28.000 For just a short amount of time.
01:06:29.000 Nitrous oxide?
01:06:30.000 I'm pretty sure nitrous oxide is not good for you.
01:06:32.000 It's the same thing dentists give you.
01:06:33.000 I'm pretty sure going to dentists is not good.
01:06:35.000 Not good for you.
01:06:35.000 You're right about that.
01:06:36.000 But that's a simple, great chant that you could do at any time.
01:06:40.000 Yeah, but what is that chant that you do, that crazy one you have memorized?
01:06:44.000 That's what I'm talking about.
01:06:45.000 You know that whole...
01:06:46.000 You have it memorized now.
01:06:50.000 That's the chant.
01:06:50.000 That goes...
01:07:15.000 Shit.
01:07:17.000 Do you still do that thing with the bed that you used that in?
01:07:20.000 That's amazing.
01:07:21.000 That's a chant that you say at the beginning.
01:07:23.000 You might pray if you were into bhakti yoga.
01:07:25.000 You would pray that prior to reading the Bhagavad Gita.
01:07:30.000 And that's a prayer that is basically...
01:07:33.000 The first verse is very beautiful.
01:07:36.000 It goes...
01:07:37.000 I was born into the darkest of ignorance, but my spiritual master opened my eyes with a torch of knowledge, which I love that a lot.
01:07:45.000 But it's like basically the idea is like...
01:07:49.000 When you come into contact with truth, which is what any of the sutras are, by the way, I love the Bhagavad Gita, but I just started reading the Yoga Sutras of Patanjali, which are fucking great, man!
01:07:59.000 They blow the Bhagavad Gita out of the water, as far as I'm concerned.
01:08:01.000 You don't find them pretentious at all?
01:08:02.000 What?
01:08:03.000 The second thing you said?
01:08:05.000 I forget it, sorry.
01:08:07.000 Which part?
01:08:08.000 I don't know.
01:08:09.000 I was joking.
01:08:11.000 I think it can seem pretentious.
01:08:13.000 And I think that people can use it as a tail feather, as you mentioned.
01:08:16.000 And I will fully admit that I've used it as a tail feather before.
01:08:20.000 But I think in the same way that you were talking about how reproduction kind of lures you into creating robots or advances society.
01:08:27.000 In the same way, I think people get drawn to philosophy for reasons that are just like, well, this will make me seem smart.
01:08:33.000 I don't think there's anything wrong with tail feathers.
01:08:35.000 I really don't.
01:08:36.000 And I think that people worry about it and other people.
01:08:39.000 But what's crazy about this stuff is that once you get into it for weird reasons, but once you get into it, then it starts deconstructing you.
01:08:48.000 It starts breaking you apart because it's going to this very micro level of the way that we tend to work subjectively, which is what...
01:08:55.000 But that subjective experience is the only thing that ultimately matters in terms of your interior world, right?
01:09:01.000 I mean, you talk about truth.
01:09:03.000 I think it was Werner Herzog, the documentary filmmaker, that was talking about the difference between ecstatic truth and factual truth.
01:09:09.000 And he said, you know, if facts were the most interesting thing in the world, then the phone book would be the world's most interesting book.
01:09:15.000 But obviously there's this other experience that we still call truth.
01:09:18.000 Maybe it's italicized or whatever it is, but it's that ecstatic truth.
01:09:21.000 It's subjective truth.
01:09:22.000 It's the truth of the poet.
01:09:24.000 You know, a journalist may be more accurate in describing the facts of an event, but a poet may nevertheless reveal...
01:09:30.000 LeBron James dug deep and found the way to overcome the Spurs.
01:09:34.000 Whatever it is, the poet reveals deeper truths that have no place in the other's literal grit.
01:09:38.000 Well, this is...
01:09:39.000 Werner Herzog's...
01:09:40.000 Interesting cat.
01:09:41.000 Very interesting.
01:09:42.000 Oh, I fucking love him, man.
01:09:43.000 I just saw him in Spring Break.
01:09:44.000 Wait, no, not Spring Breakers.
01:09:46.000 What did I see?
01:09:46.000 He played a villain.
01:09:47.000 No, Jack Reacher.
01:09:48.000 Yeah, he's so weird.
01:09:49.000 He was great, but wasn't it weird?
01:09:51.000 Milky Eye.
01:09:52.000 He's got that Milky Eye.
01:09:53.000 He's a good actor.
01:09:54.000 He's a great actor, which is weird.
01:09:56.000 Yeah, I love Werner Herzog, man.
01:09:57.000 He's the shit.
01:09:58.000 I would really love to get him off the record to give his opinion on Grizzly Man or whether or not he knew that he was making a comedy.
01:10:06.000 No, Werner Herzog knows he's making comedy.
01:10:08.000 Because in all of his documentaries is an element of comedy.
01:10:11.000 He's mocking the person that is...
01:10:12.000 He is smart enough to know what the person watching his movie is thinking.
01:10:17.000 And he knows when he does this stuff, he knows that we are thinking this has got to be a comedy.
01:10:22.000 Werner Herzog's hilarious.
01:10:23.000 If that's the case...
01:10:25.000 And cynical.
01:10:25.000 In that sense, then...
01:10:26.000 That might be, Grizzly Man might be the greatest creation in all of comedy.
01:10:30.000 Yeah, real subtle.
01:10:31.000 It's brilliant.
01:10:31.000 It's a wonderful comedy.
01:10:32.000 So subtle and so goddamn brilliantly crazy.
01:10:35.000 It's so, it's wonderful.
01:10:37.000 It like celebrates people and all our wackiness.
01:10:39.000 And there's like a certain comfort in watching a film about a guy who's completely off the rails, that's living with grizzlies.
01:10:47.000 Yeah.
01:10:47.000 Like it as we, you know, we might not like it, but it makes us feel better about ourselves, man.
01:10:51.000 When we got a guy who's way more fucked up than us, it makes us feel better about ourselves.
01:10:55.000 Did you find him also kind of fascinating?
01:10:58.000 Fuck yeah!
01:10:59.000 I thought it was actually fascinating because that's the thing about when you watch a movie.
01:11:01.000 I mean, part of what happens when you're watching a movie is the same thing.
01:11:04.000 They've done REM on people when they...
01:11:06.000 They've done fMRI scans of people when they watch movies, and they say it's very similar to when you're dreaming.
01:11:10.000 So the self, the self-awareness disappears.
01:11:13.000 So that's why you're able to become the character that you identify with.
01:11:16.000 They call it the deictic shift, when you assume the viewpoint of one of the characters.
01:11:20.000 So you watch a film like Grizzly Man, it allows you to actually enter the consciousness, perhaps, of this person.
01:11:24.000 And that's what the whole thing about cinema allows us to do.
01:11:27.000 That's why a film like that might be fascinating.
01:11:28.000 Say that name for the shift again.
01:11:30.000 The deictic shift.
01:11:31.000 See, I love that term, man, because enlightenment is the ultimate shift, which is where you do the deictic shift from yourself to the whole.
01:11:39.000 That's the idea.
01:11:40.000 It's like we are always on the precipice of this final shift.
01:11:43.000 And we're terrified to make that shift because we want to be an individual.
01:11:46.000 And the idea of going backwards that one time...
01:11:49.000 Of taking off the neurological VR goggles, the sociological VR goggles.
01:11:53.000 The consensus trance, the cultural operating system, the whole thing.
01:11:56.000 We don't want to do it!
01:11:57.000 We don't want to do it!
01:11:58.000 It's easier to become another person in the movie than it is to become the whole.
01:12:02.000 Yes!
01:12:03.000 Just you explaining it makes my heart race.
01:12:06.000 It's terrifying.
01:12:06.000 No, but the reason I did it is because I love movies a lot.
01:12:10.000 Since I was a little kid, I would watch movies.
01:12:11.000 And one of the coolest experiences is that you became Indiana Jones.
01:12:14.000 Like, for two hours, you were Indiana Jones.
01:12:17.000 So what is happening?
01:12:18.000 Like, how come sometimes you become the movie and other times you don't?
01:12:21.000 And when you don't, like, life sucks, right?
01:12:24.000 Like, oh, I'm watching this movie.
01:12:25.000 It's not sucking me in, right?
01:12:26.000 So then I wanted to study that.
01:12:27.000 And they say that, you know, movie-watching and dreaming are strangely familiar existing...
01:12:34.000 Familiar experiences, similar experiences, but apparently it has to do with your self-awareness, the lateral prefrontal cortex.
01:12:40.000 The same thing that turns off when people are in flow states, when rappers are freestyling.
01:12:44.000 The self-editing, the self-consciousness disappears.
01:12:47.000 And we love transcending our self-consciousness because it's the moment in which we see that there's an infinite amount of subjective experiences that we can have.
01:12:54.000 We can be Indiana Jones, we can be anybody we want.
01:12:58.000 We're not bound by our individuated state, which as amazing as it is, it's still limited.
01:13:03.000 Is that lateral prefrontal cortex that you're talking about, is that the neocortex?
01:13:08.000 Or is that, like, is the neocortex...
01:13:10.000 I have no idea.
01:13:11.000 It says lateral, so I imagine it's on this side.
01:13:13.000 Oh, yeah, yeah.
01:13:14.000 But this is, the reason I saw this is because it was talking about it in the article about movies and you blurring, blending into the films, but also another article was talking about flow states and when they did fMRI scans on freestyle rappers versus memorized rhymes, and it was like the same thing.
01:13:27.000 But this is a terrifying thing for people.
01:13:28.000 The flow state that you're talking about, if you have identified yourself...
01:13:33.000 With a level of suffering or with a level of control or with a level of always being the thing driving the car, then this flow state you're talking about is a form of death.
01:13:42.000 You don't want to be there.
01:13:43.000 A lot of people, the people who suck in bed are the ones who are the most wanting to be in control.
01:13:50.000 The people who have the most awful marijuana trips are always the control freaks.
01:13:55.000 But think about it.
01:13:57.000 The reason that movies are so good at it is because first they sit you in a really comfortable place.
01:14:01.000 It's a comfortable seat.
01:14:03.000 You're in the dark.
01:14:04.000 The phones are off.
01:14:05.000 They make sure that you are comfortable so that they can ease you in.
01:14:08.000 And when the movie starts, you're still yourself.
01:14:10.000 You're still fidgeting.
01:14:10.000 You might have to pee.
01:14:11.000 But as soon as it starts, they guide you with music.
01:14:14.000 The set and setting inform the direction that your consciousness is going.
01:14:17.000 And before you know it, you're on a ride.
01:14:19.000 Like the roller coaster has started and all of a sudden you forget yourself.
01:14:23.000 You are the story.
01:14:23.000 Just like mushroom trips.
01:14:24.000 And then at the end of the day, it becomes the best experience ever, right?
01:14:28.000 Because when the great movie is done, you're like...
01:14:30.000 Wow, that was awesome!
01:14:31.000 I don't know where I went, but I loved it, right?
01:14:34.000 But when the movie sucks, it was a really unpleasant experience.
01:14:36.000 So we love losing ourselves, but it's also what we're most terrified of.
01:14:40.000 So this is the idea that when we die, the exact same experience happens where you're like, Holy shit!
01:14:46.000 That was fucking amazing!
01:14:48.000 I thought I was a human?
01:14:51.000 Wow!
01:14:52.000 No kidding, right?
01:14:54.000 Maybe it's just an extended period of dreaming.
01:14:57.000 Maybe it's a movie within a movie within a movie.
01:14:59.000 The dream state of eight hours becomes a dream state of 80 years.
01:15:01.000 That's why the movie Inception is so brilliant.
01:15:03.000 When they go into limbo.
01:15:05.000 Limbo was 80 years.
01:15:06.000 Our entire life could be one of those limbos that we forgot that we decided to go to sleep.
01:15:11.000 We could be in the dream within dream within dream.
01:15:12.000 Inception didn't really lock me in.
01:15:13.000 I don't know why.
01:15:14.000 You should have been more high when you saw it.
01:15:16.000 I don't know.
01:15:17.000 Maybe.
01:15:17.000 I wouldn't have understood it.
01:15:19.000 I think Inception was a little too refined.
01:15:25.000 You had to follow it a little too close.
01:15:27.000 He was explaining it to you.
01:15:29.000 Like a Rubik's Cube or something.
01:15:30.000 I like that though.
01:15:30.000 I'm sure you like that.
01:15:32.000 I'm sure your mind likes that.
01:15:33.000 It wasn't like The Matrix, which is sort of more of a visceral thing, but still they're both pointing to the same idea, which is that whatever your experience of reality is may in fact just be a dream state or some kind of hallucination or an aspect of a simulation that you've become absorbed into.
01:15:51.000 It's interesting you mention that because I actually brought something to read to you guys about The Matrix.
01:15:56.000 I'm going to load it up.
01:15:58.000 Oh, cool.
01:15:58.000 And it's exactly about this conversation we're having.
01:16:00.000 It's almost like I thought at some point we're going to start talking about blending into movies and breaking the ego and that whole thing.
01:16:07.000 So I'm just going to load it up.
01:16:08.000 What ever happened to that lady that was suing the person that made the Matrix?
01:16:13.000 Do you remember that whole thing?
01:16:14.000 I thought she lost.
01:16:15.000 Did she really?
01:16:16.000 There was a settlement maybe.
01:16:17.000 I thought there was a settlement.
01:16:18.000 Yeah, I think I remember that.
01:16:19.000 Settlement.
01:16:19.000 Yeah, that never means anything, does it?
01:16:21.000 It's not a loss.
01:16:23.000 You're just bored of dealing with it?
01:16:25.000 Yeah, sometimes it's that.
01:16:26.000 It doesn't mean you're going to win.
01:16:27.000 How about we just stop this right now?
01:16:28.000 Yeah, sometimes it's that, too.
01:16:30.000 It's different things.
01:16:33.000 Maybe she had an original idea and they took it to a different place, but maybe they can trace the origins of that idea.
01:16:39.000 Get the fuck out of here.
01:16:40.000 Bitch!
01:16:41.000 Who is this?
01:16:41.000 There was a woman who sued the brothers for making the Matrix.
01:16:46.000 Former Wachowski brothers.
01:16:46.000 Now they're Wachowski siblings.
01:16:48.000 Really?
01:16:48.000 Yeah.
01:16:49.000 I feel like her name is Gloria Larson.
01:16:50.000 That's lame.
01:16:51.000 Gloria Larson?
01:16:52.000 You're just going from memory on that?
01:16:54.000 That'd be great if you got it right.
01:16:55.000 What about the story about Sophia Stewart and her own Matrix?
01:16:59.000 No, that's not it.
01:16:59.000 Matrix?
01:17:02.000 Let's hear it.
01:17:03.000 Okay.
01:17:03.000 Let me read you this while you search for that.
01:17:05.000 So, this is an article by Eric Davis, and he's talking about Descartes and The Matrix and the false reality genre of filmmaking.
01:17:12.000 So, films that reveal a crack in your reality, the possibility of a hidden door, of a rabbit hole to fall through.
01:17:18.000 And so he says, you know that scene in The Matrix when he's in the hotel room and they're about to give him the pill?
01:17:23.000 Yes.
01:17:23.000 Okay, that's the craziest part of the movie.
01:17:24.000 So he says, we too are in that decrepit hotel room with Laurence Fishburne's Morpheus, who is really speaking to us when he addresses Neo.
01:17:32.000 The ever-wooden Keanu Reeves.
01:17:33.000 You know something.
01:17:35.000 What you know you can't explain, but you feel it.
01:17:37.000 You felt it your whole life.
01:17:39.000 You felt that something is wrong with the world.
01:17:41.000 You don't know what, but it's like there, like a splinter in your mind.
01:17:44.000 And establishing that itch, which I suppose most of us share, however we interpret it, Morpheus offers to scratch.
01:17:50.000 He will give Neo nothing more than knowledge of the truth, i.e., no solutions to the problems posed by said truth.
01:17:57.000 And then he goes on and he says, like a serpent in the Garden of Eden...
01:18:02.000 Okay, so hold on.
01:18:03.000 Wow!
01:18:26.000 Right.
01:18:44.000 Cool.
01:18:45.000 That's great, man.
01:18:47.000 I mean, the fact that Eric Davis reads this deeply into the film, that's why you guys gotta chat with him.
01:18:52.000 So let's listen to Stewart.
01:18:53.000 Tell me.
01:18:54.000 The case.
01:18:54.000 It was dismissed when she failed to show up for a preliminary hearing of her case.
01:18:59.000 Oh.
01:19:00.000 So that can mean one of two things.
01:19:01.000 Either it means she's crazy, or they paid her to not show up.
01:19:05.000 No, she would just like...
01:19:06.000 No.
01:19:07.000 Why not?
01:19:07.000 They made a settlement with her off the record.
01:19:09.000 Yeah, but they don't just not show up.
01:19:10.000 Then maybe they scared the shit out of her and told her not to show up.
01:19:14.000 Something happened.
01:19:14.000 She didn't show up.
01:19:15.000 You think they scared her?
01:19:16.000 The Wachowskis?
01:19:17.000 I mean, maybe she had no case at all and she was just crazy.
01:19:19.000 Remember the guy who sued me because I was your lawyer?
01:19:22.000 Oh, that's right.
01:19:22.000 The angel of God.
01:19:24.000 The angel of God, yeah.
01:19:25.000 I was his lawyer.
01:19:25.000 I had to get a lawyer to go to court.
01:19:27.000 How long were you in court with that thing?
01:19:30.000 About a year.
01:19:31.000 Are you kidding?
01:19:32.000 Yeah.
01:19:32.000 How many times did you have to go to court?
01:19:34.000 I had to keep responding.
01:19:35.000 I had to keep responding to them.
01:19:36.000 He sued me for being a false prophet.
01:19:39.000 No, no, I'm sorry.
01:19:40.000 He wanted to sue me.
01:19:41.000 He's got a point.
01:19:41.000 Yeah, and being a bad lawyer.
01:19:45.000 And the Better Business Bureau came after me.
01:19:48.000 So you lost the case?
01:19:49.000 No.
01:19:51.000 This girl Lisa helped me fight the case and had to show her.
01:19:54.000 He would sue me for all the riches in the world.
01:19:58.000 Ha ha ha!
01:20:00.000 And then when they...
01:20:01.000 Because she had to read it to the judge.
01:20:03.000 Like, this guy's crazy.
01:20:04.000 She's like, what do you mean?
01:20:05.000 She goes, read that.
01:20:05.000 And she did.
01:20:06.000 The judge read it.
01:20:07.000 And he was like, oh.
01:20:08.000 And so they made him rewrite it.
01:20:09.000 And he goes, okay, that $800 billion was what he said before.
01:20:13.000 All the riches in the world.
01:20:15.000 That leaves a lot on the table.
01:20:16.000 There's a lot of room for negotiation.
01:20:18.000 I mean, assuming that you're going to continue to be more successful.
01:20:22.000 He did promise me, though, that when he did eventually become king of kings, he could repay me with...
01:20:28.000 Untold riches.
01:20:30.000 King of Kings is a big, big title for a guy living in a homeless shelter.
01:20:35.000 I represent him pro bono!
01:20:36.000 I love this guy's, like, levels of riches.
01:20:39.000 He has all the riches in the world, but that's exceeded by untold riches.
01:20:43.000 No, all the riches in the world is more.
01:20:45.000 Well, then if you pay him all the riches in the world and he pays you back in untold riches, you're getting ripped off.
01:20:49.000 Yeah, yeah.
01:20:51.000 That's what he's suing me for.
01:20:53.000 I denied him.
01:20:54.000 He's not really paying you back.
01:20:55.000 He's giving you a little bit of what you gave him.
01:20:57.000 He's giving me some for helping him.
01:20:58.000 That's about right.
01:20:59.000 You're just saying riches are confined to the world, Ari.
01:21:01.000 I disagree.
01:21:03.000 Untold riches could be all the riches in the universe.
01:21:07.000 You should have had me as an attorney.
01:21:09.000 Maybe I'll hire you to represent me.
01:21:12.000 That would have been amazing.
01:21:13.000 Fake attorney Duncan represents fake attorney Ari being sued by crazy guy.
01:21:18.000 He kept trying to use bigger lingo because I would sometimes.
01:21:20.000 So he'd be like, wherefore the plaintiffs who are attacking me?
01:21:26.000 He also wanted to sue San Diego State Hospital, and I think they really fucked him up.
01:21:30.000 I think he got in the psych ward there.
01:21:32.000 Oh, really?
01:21:32.000 Yeah, I think.
01:21:33.000 Really?
01:21:33.000 He wanted to sue that guy, Dean.
01:21:35.000 Dean.
01:21:36.000 Galber.
01:21:36.000 No, who talked to the dead?
01:21:39.000 Dean.
01:21:40.000 Dean something.
01:21:41.000 Dean Edwards?
01:21:42.000 John Edwards?
01:21:42.000 John Edwards, yeah.
01:21:43.000 John Edwards.
01:21:43.000 Dean Edwards is a comic.
01:21:44.000 Because it was just like the guy who ran for president.
01:21:47.000 Dean...
01:21:47.000 Edwards.
01:21:48.000 Yeah, and I remember saying, he said he wanted to sue him for being a false prophet.
01:21:52.000 And I was like, why would you get that money?
01:21:54.000 Hey, wait, can I change the subject?
01:21:56.000 You're talking logic with a crazy guy.
01:21:57.000 Congratulations.
01:21:57.000 Let me change the subject for two seconds.
01:21:59.000 You're talking about, just because we were talking about riches outside the world, this meteor harvesting thing, do you know about this?
01:22:05.000 Planetary resources.
01:22:06.000 Yeah, it's fucking crazy.
01:22:07.000 Oh yeah, Peter Diamandis is behind it, dude.
01:22:09.000 They just launched a million-dollar Kickstarter project to create a space telescope for public use, because you know they're launching a whole fleet of tiny space telescopes to scan for near-Earth asteroids that we can then land on and leverage for resources.
01:22:23.000 Pull back and mine.
01:22:24.000 A typical one has like a trillion dollars worth of plutonium, for example.
01:22:28.000 Oh my god, insane.
01:22:28.000 Because that's why they're doing it.
01:22:29.000 They're not doing this when they're talking about how NASA wants to grab an asteroid or a meteor.
01:22:34.000 It's a meteor or an asteroid.
01:22:35.000 Asteroid.
01:22:36.000 Asteroid.
01:22:36.000 When they say this, you know they're not doing this.
01:22:39.000 You know it's not just scientific reasons.
01:22:41.000 They don't just want to harvest this shit.
01:22:43.000 It's riches beyond all previous limits, but it's okay because that's just an incentive.
01:22:48.000 Yeah, if they can really pull it off, that helps everybody.
01:22:50.000 That's why they would go do that.
01:22:52.000 That's why there's technology.
01:22:53.000 So they're saying there's plutonium in those things.
01:22:55.000 So that somebody can make money and then...
01:22:58.000 How do they know there's plutonium in those things?
01:23:00.000 Oh, they know.
01:23:00.000 They know the chemical composition.
01:23:02.000 Is it based on asteroids that have fallen to Earth?
01:23:04.000 Just like they can make estimations about what Jupiter is composed of.
01:23:07.000 I mean, that's kind of insane about humanity.
01:23:08.000 That's exciting.
01:23:09.000 We can actually use our brains to extend our sensory apparatus beyond Earth.
01:23:12.000 What's the mechanism of determining the contents of an asteroid?
01:23:16.000 I have no idea.
01:23:17.000 The last few elements of the periodic table, they knew how much they'd weigh.
01:23:21.000 That's insane.
01:23:22.000 They knew where they'd fit in in that chart.
01:23:23.000 They're like, we haven't discovered them, but we know exactly what they'll weigh and what their mass is.
01:23:26.000 The ability of the human brain to acquire such knowledge about the building blocks of the physical world.
01:23:31.000 What does that say about us as this unique?
01:23:35.000 You keep lumping me into that group, and that shit's preposterous.
01:23:39.000 Those aren't even related to me.
01:23:40.000 Those are totally different kind of animals, those people that are figuring that out.
01:23:43.000 But I don't think they are, because the fact that we can have this conversation means we can acknowledge some kind of understanding of what we're talking about.
01:23:50.000 It's childlike.
01:23:51.000 It's a childlike understanding.
01:23:53.000 Compared to the dude who figured out what a quark-gluon plasma particle would weigh and then made it.
01:23:59.000 Made something that if you get a sugar cube of it, it'll fall straight through the center of the earth because it'll weigh like 400 billion pounds or something fucking crazy.
01:24:07.000 So you think it's just a different kind of animal?
01:24:08.000 You think it's just a different kind of brain?
01:24:09.000 You think like if we went and sat with him for a couple weeks, he could explain it to us?
01:24:13.000 Yeah, maybe.
01:24:15.000 I think people are remarkably adaptable and people go down certain paths, I think, in life.
01:24:20.000 And if you meet a guy who's been a ballet dancer since he was four years old and now he's 25 and doing these twirls in the air and shit, you would look at that guy moving.
01:24:29.000 I don't know why I chose ballet dancer.
01:24:31.000 But you would look at that guy moving and going like, that guy is so far down the path I could never possibly catch up to him.
01:24:38.000 But I think human beings have a capacity for continuing down a path in a very far way to the point where they're almost unrecognizable from when they first started.
01:24:49.000 An insane specialization, like a malleability, like an ability to transform.
01:24:53.000 I know that from martial arts especially.
01:24:54.000 It's amazing how you can see that about, let's say, a ballet artist dancing and stuff.
01:24:58.000 Like, wow, I can never do that.
01:24:59.000 And you could easily say, there's no way I could.
01:25:01.000 But everybody thinks they can do stand-up.
01:25:03.000 Isn't that funny?
01:25:04.000 With no training at all.
01:25:06.000 Isn't it funny that they don't really know what we're doing?
01:25:08.000 It's like they think that we're just telling jokes, and we absolutely are, but it's all about where to put them, how to say them, how to structure them.
01:25:14.000 I think comedians are philosophers.
01:25:15.000 I think they're modern philosophers.
01:25:17.000 They're stand-up philosophers.
01:25:18.000 It's also hypnotism.
01:25:19.000 Because there's some weird thing that you're doing where you can get them into the way you're thinking.
01:25:24.000 And you get them tuned into the way you're thinking by giving them shit that they want to listen to.
01:25:28.000 And if you can find that rhythm where it's a thought that they would entertain themselves, then they'll allow your mind to work for them.
01:25:36.000 Because like, oh, this guy's got a very aware mind.
01:25:39.000 I'm curious to hear how he looks at things.
01:25:41.000 I'll allow him to think for me.
01:25:43.000 It's the same thing presidents do.
01:25:45.000 Yeah, and in those moments, those people are plugging into you the same way when you watch a movie, you become the character in the movie.
01:25:50.000 Those people are plugging into you.
01:25:51.000 And in that moment, when you enter that flow state, do you feel like a conductor in an orchestra?
01:25:55.000 Absolutely.
01:25:55.000 Like, literally, you move in the orchestra?
01:25:57.000 Sort of.
01:25:57.000 Yeah, conductor in an orchestra.
01:25:58.000 That's exactly how it is.
01:25:59.000 Yeah.
01:25:59.000 I mean, it's like you guys are in a flow.
01:26:00.000 You become in sync.
01:26:01.000 Something there is interesting that's happening.
01:26:03.000 And I don't know if we can even measure that.
01:26:04.000 Like when people sync up like that.
01:26:05.000 Whether it's lots of people to one person or lots of people to lots of other people.
01:26:08.000 Yeah, you don't have a scale for that.
01:26:10.000 You don't have a scale for that feeling.
01:26:11.000 I think you can study the way that metallic particles react to magnets.
01:26:17.000 I think you can look at the way sound waves affect water.
01:26:20.000 And we use those metaphors.
01:26:21.000 We say he's so magnetic when he's on stage.
01:26:24.000 I mean, we use those metaphors to explain something for which we have no instrumentation or way to quantify or measure, yet we employ those capacities.
01:26:31.000 We use those capacities.
01:26:32.000 We pay people millions of dollars Because they're charismatic.
01:26:35.000 Well, how do you measure charisma?
01:26:36.000 Is there a little machine that measures it like radiation?
01:26:38.000 Like, he has 97 Kelvins of charisma.
01:26:41.000 So we employ these things, but we can't measure them.
01:26:43.000 They exist, but we can't measure them.
01:26:44.000 But you can focus attention.
01:26:46.000 And attention is a specific pattern of neural activity.
01:26:50.000 So the idea is that you have this group of people and you're transforming their neural activity to match some intention that you have.
01:26:57.000 Whether it's because you want them to listen to your speech about hotels.
01:27:01.000 Don't you wish you could see that?
01:27:02.000 Don't you wish there was a special light you could use that could show that energy transfer?
01:27:06.000 Yeah, exactly.
01:27:06.000 Oh, I'm getting it.
01:27:06.000 Okay, I know where I am now.
01:27:08.000 We could see the cell phone signals going through us right now.
01:27:10.000 That would be cool.
01:27:10.000 If I could see how my attention is being captured.
01:27:12.000 But go and watch a group of people dancing who are all in ecstasy.
01:27:15.000 Or look at the way fish move around a coral reef.
01:27:18.000 Yes.
01:27:19.000 Or look at, you know, you see this exact same undulating quality.
01:27:22.000 Insane!
01:27:23.000 Like a hidden order to things.
01:27:24.000 Well, there's an essay, dude, called Virtual Reality and Hallucination, written by Diana Slattery on Reality Sandwich.
01:27:30.000 And it's all about that.
01:27:32.000 She says the capture and management of attention is a vital component, a state of immersion, a state of absorption is a vital component in any kind of interpersonal transformation or education or influence of any capacity or growth.
01:27:46.000 In other words, you need to be completely sucked into whatever it is that's going to really transform you and get inside of you.
01:27:52.000 So it all has to do with the capture and management of attention.
01:27:55.000 And what are psychedelics if not attention technologies?
01:27:58.000 Rhetoric technologies.
01:28:00.000 What is language, if not a technology, to capture attention and shift awareness?
01:28:03.000 In a sense, any discipline is a psychedelic experience in the long haul.
01:28:07.000 Yes.
01:28:07.000 Because disciplines transform you.
01:28:09.000 Yes.
01:28:09.000 Yes.
01:28:09.000 And they focus attention.
01:28:11.000 Tarantino said for Pulp Fiction, he just wanted people to put the laundry away while they were watching.
01:28:15.000 Just not fold shit.
01:28:16.000 Just look at it.
01:28:17.000 It's all he wanted.
01:28:19.000 What?
01:28:19.000 So you get lost in it.
01:28:21.000 So you get people on your side.
01:28:22.000 Focusing attention is the key to everything.
01:28:24.000 If we had the power to decide at any given moment to focus our attention on the best possible thing that we could focus our attention on, our life would be like a living, breathing sculpture.
01:28:32.000 It would be like a dream constantly rendering.
01:28:34.000 Or it would just be you jerking off in new socks.
01:28:37.000 But think about how...
01:28:38.000 Sounds like a dream world.
01:28:40.000 But think about how profitable being able to grab people's attention is.
01:28:44.000 Oh my god, it's everything!
01:28:46.000 It's everything!
01:28:46.000 It's the currency of this new age.
01:28:49.000 Attention is the new limited resource.
01:28:51.000 Attention is the new oil.
01:28:52.000 In a world of social media, of infinite media, infinite channels, who succeeds but the one who is most physically able to capture and manage attention?
01:29:01.000 And that's where you see the phenomenon of something like this.
01:29:04.000 I'm always talking about like, it's been a year and people are still saying, Come back and have this conversation.
01:29:09.000 I mean, that just means that you've tapped into a nerve that millions are feeling.
01:29:13.000 And when you consider that 10,000 hours of content is uploaded to YouTube every hour, that you still have millions of people that come and join this conversation, shows the power of that.
01:29:24.000 It's like, you know, shining a little bit brighter than the other 10,000 hours.
01:29:29.000 But it's a funny thing when the attention doesn't tune in.
01:29:32.000 Like, when you see Obama in Germany recently, did you see that shit?
01:29:35.000 No.
01:29:35.000 The speech he gave?
01:29:36.000 Ooh, it's creepy.
01:29:37.000 That's like Alex Jones-level creepy.
01:29:39.000 Why?
01:29:40.000 Because nobody paid attention?
01:29:41.000 Well, no.
01:29:41.000 Aside from the fact that there was only, like, the first time he came there, it was packed.
01:29:45.000 Hordes of people came to see him.
01:29:47.000 This time it was sparse and empty, but what was really creepy was his echoey message about how we have to give up freedom for security.
01:29:56.000 And you hear this coming out.
01:29:57.000 It's like- Obama was saying it?
01:29:59.000 Oh yeah, yeah.
01:30:00.000 Oh yeah, he was saying he was fucking sticking up for the goddamn NSA because- Can you pull up that, Jamie?
01:30:05.000 See if you can find that.
01:30:06.000 It might be a long speech, but it was spooky.
01:30:08.000 Let's listen to some of it, at least.
01:30:08.000 He's out there baking in the heat, sweating, and giving this proclamation of how there's a balance between...
01:30:15.000 He was just talking about the importance of the security state.
01:30:18.000 And, by the way, there's a logical part of my brain that considers the...
01:30:23.000 What they're saying, I have to allow myself to give consideration to what they're saying.
01:30:27.000 Me too, of course.
01:30:27.000 You know what I mean?
01:30:29.000 Well, especially because I'm thinking, well, if this allows them to stop somebody from blowing themselves up in the subway, then cool, you know?
01:30:34.000 What a pickle you're in.
01:30:35.000 What a pickle you're in if you know that if I have this much width, breadth, when it comes to monitoring, then I can stop people from getting blown up.
01:30:47.000 What happened at the Boston Marathon?
01:30:48.000 I can stop a kid from getting turned into fucking... Right.
01:30:52.000 So now you're basically saying, well, what do we do here?
01:30:55.000 Are we going to just – do we just say, okay, well, I guess the cost of people's privacy is that from time to time children get evaporated?
01:31:02.000 Or do you say, no, we've got to grow up to the fact that – We're an interconnected system.
01:31:08.000 We're all cells in a bigger organism.
01:31:10.000 It's tough.
01:31:10.000 And we don't want to give up our security as a people.
01:31:13.000 I don't think the price is worth it.
01:31:14.000 We can't trust the government.
01:31:16.000 Not only that, I think we're looking at the thing incorrectly.
01:31:19.000 I think the thing they should be concentrating on is the mental health issue.
01:31:23.000 What makes people willing to lash out and kill large numbers of people?
01:31:27.000 That's another thing.
01:31:28.000 Instead of investing in defense, you can invest in research for mental health and you would solve a lot more murders.
01:31:35.000 We know when babies are born.
01:31:36.000 Potentially, I agree with that.
01:31:37.000 We know we have birth certificates.
01:31:39.000 We know where people are living, social security numbers.
01:31:42.000 Why can't we find out whether or not people are doing really bad?
01:31:45.000 Why can't we find out whether or not people are losing their minds?
01:31:48.000 Well, it's hard when they're in Yemen.
01:31:49.000 No, I mean, even in America, we can't find out.
01:31:51.000 I think we do not have an accurate account of our citizens, yet we can pretend that we're some sort of a community.
01:31:57.000 But we don't have an accurate account of the health of our citizens.
01:32:00.000 When SAG told me they weren't going to cover my mental health anymore, because of some type of Obamacare that went into action.
01:32:05.000 They said, you can't carry everybody.
01:32:06.000 You can't carry them all.
01:32:07.000 So only plan one gets it.
01:32:08.000 Plan two gets none.
01:32:09.000 And I was like, I hope another Jared Loughner goes into your building and shoots every one of you.
01:32:13.000 Did you say that?
01:32:14.000 Yeah.
01:32:14.000 Did you say that?
01:32:15.000 Yeah.
01:32:15.000 What are they going to pull?
01:32:16.000 Right after that, they're going to pull everyone's mental health insurance?
01:32:19.000 Yeah, because that crazy guy lost his health insurance.
01:32:21.000 Well, that's what you're turning us into.
01:32:22.000 One of us is going to do that.
01:32:24.000 But let me ask you something.
01:32:25.000 Do you think that when a person implodes like that, if we're talking about what happened to that person on a human scale, we might say, okay, well, maybe...
01:32:32.000 You know, years of disaffection and radicalization and propaganda and mediation from the wrong influences and his focused attention on the wrong place.
01:32:39.000 It could be mental health as well.
01:32:41.000 It could be a real issue.
01:32:42.000 A medical issue.
01:32:43.000 Right.
01:32:43.000 So let's zoom out for a little bit and think of that person as a cell in the bigger organism.
01:32:47.000 Is he like a cancer cell?
01:32:48.000 Is he the equivalent of when a cancer cell starts to replicate without concern for the rest of the cells in the system?
01:32:54.000 Is that what it is?
01:32:55.000 Is it a broken thing?
01:32:57.000 Could it be fixed?
01:32:58.000 Just like we want to make advances in medicine to detect cancer cells before they metastasize, can we find human beings before they metastasize into that?
01:33:07.000 I think it's not an either or.
01:33:09.000 I think sometimes yes, sometimes no.
01:33:11.000 I think sometimes it's probably a medical issue.
01:33:13.000 Yeah, what about research in terms of being able to find it?
01:33:16.000 To find the genetic markers that predispose you to that?
01:33:19.000 But isn't culture ultimately the technology that does that?
01:33:23.000 Isn't culture, like when we call TV, we call it programming.
01:33:27.000 It programs you.
01:33:28.000 It teaches you about right and wrong and what's legal and what's not legal.
01:33:31.000 If you're part of the pop culture, you're programmed into a kind of mainstream consensus trance that basically says, we're moderately free as long as you don't physically hurt me and I don't physically hurt you.
01:33:42.000 You don't do certain things, but we have these kind of frameworks to impose some kind of an order so that the system can have some kind of function.
01:33:51.000 It just makes you wonder.
01:33:53.000 Especially with the privacy things.
01:33:54.000 Are they really spying on me?
01:33:55.000 Or is it more like I'm a billion lines of code mixed with a billion other lines of code and just a bunch of algorithms against gates?
01:34:01.000 And then they just detect when there's weird behavior associated with violence that they would zoom in on something.
01:34:06.000 Or they will use anything you've done wrong as an excuse to go and really go after you.
01:34:11.000 They don't have the manpower for that.
01:34:12.000 They don't have the manpower for that.
01:34:13.000 No, but if they already want to fuck with you, they can look at your stuff and say, oh, he owns too large a lobster, which is a federal offense.
01:34:19.000 Ari, let me ask you.
01:34:20.000 Whether or not you bought it from anyone, then they can use that to say he's broken the law.
01:34:23.000 Why would they want to fuck with you when there's people that want to blow themselves up in subways?
01:34:25.000 Don't you think that that's going to occupy most of their attention?
01:34:27.000 Let's say this.
01:34:28.000 This is why.
01:34:29.000 Because some guy who works there, you fucked his ex-girlfriend.
01:34:32.000 Oh, snap, son.
01:34:33.000 Goddamn.
01:34:34.000 Shit just got real.
01:34:35.000 You know what I mean?
01:34:35.000 And then you're giving that guy the power to abuse it.
01:34:40.000 That's what I don't like.
01:34:42.000 And that's what this guy is saying is absolutely possible.
01:34:45.000 This guy is saying that the people that work at the organization, like him, they keep referring to him as a high school dropout, which is hilarious.
01:34:53.000 He was your coder?
01:34:54.000 Didn't you hire him?
01:34:56.000 Are you pretending you don't know this guy?
01:34:58.000 Mm-hmm.
01:34:59.000 By the way, when has it ever worked?
01:35:01.000 This is the question we have to ask.
01:35:03.000 You know that saying, if we don't understand history, we're doomed to repeat it.
01:35:07.000 Let's look in the past.
01:35:09.000 At what point has a government gained full access to the information flow of its citizenry where it hasn't gone wrong?
01:35:18.000 Show me where.
01:35:19.000 What nation has it been where it's like, oh yes, that was that one government that knew, that studied all the correspondence of all its citizens, and it didn't go wrong at all.
01:35:29.000 It didn't tighten down.
01:35:30.000 It didn't become a security state.
01:35:31.000 It was a utopia.
01:35:33.000 No.
01:35:33.000 No.
01:35:34.000 It doesn't exist.
01:35:35.000 It's always bad.
01:35:36.000 It's North Korea.
01:35:37.000 Yes.
01:35:37.000 Okay.
01:35:38.000 But here's the counter-argument to that.
01:35:40.000 Do you think that if given the proper instruments, the citizenry could police themselves?
01:35:44.000 Could we have a society that becomes like Airbnb, where everybody can rent their own place and everybody else judges everybody else?
01:35:52.000 You don't need someone now to tell you about a good dentist.
01:35:54.000 You can just look online and you're connected to all the citizens who have told you by four and a half stars, right?
01:35:59.000 I was asking my brother, I was like, how do you know somebody from Airbnb is not going to be some serial killer who's going to cut me into pieces?
01:36:04.000 He's like, well, because you can look at the 50 other people that stayed at his house, and they rate his cleanliness, and they rate his...
01:36:09.000 So you can go anywhere in the world and have this...
01:36:11.000 Same way hookers work now on aeros.com?
01:36:13.000 Networks that regulate each other.
01:36:15.000 Self-regulating networks.
01:36:16.000 So it's almost like we are connecting to each other, and it's like a homeostasis is being formed, where the system is self-regulating.
01:36:23.000 We made a government in a time where we didn't have that ability.
01:36:25.000 Right.
01:36:26.000 And now we do.
01:36:26.000 Right.
01:36:27.000 So the government is becoming obsolete.
01:36:29.000 Right.
01:36:29.000 So it's not...
01:36:30.000 The government's an appendix.
01:36:30.000 It's not utopia, but these decentralized peer networks that self-regulate each other with no top-down management, but just lateral, is leaning towards a kind of, like, space in which we can...
01:36:41.000 We have Congress because we didn't have the ability to send someone...
01:36:43.000 Represent ourselves.
01:36:44.000 Yeah, except from California, we can't speak in Washington.
01:36:47.000 So we need to send some guy as our congressman to speak for us.
01:36:51.000 But now we can speak for ourselves.
01:36:53.000 Right.
01:36:53.000 We don't need that.
01:36:54.000 It's so outdated.
01:36:54.000 A lot of people doing democracy 2.0.
01:36:56.000 I mean, we need literally, like, if iOS doesn't get an upgrade for our iPhone every six months, we freak out.
01:37:01.000 Like, we need to upgrade literally the way the whole governmental system works to use these new technologies.
01:37:06.000 It should be online.
01:37:08.000 It's really simple.
01:37:09.000 The idea that anybody controls it is ridiculous.
01:37:11.000 It should be online.
01:37:13.000 And there should be some sort of anonymous type group that controls the code to make sure that nobody can fuck with it.
01:37:18.000 Like Bitcoin.
01:37:19.000 Or someone who's on top of shit.
01:37:22.000 Maybe the AIs.
01:37:23.000 Develop a global ethic amongst the hackers.
01:37:25.000 Anonymous is us.
01:37:26.000 By the way, no one's giving this power up.
01:37:31.000 Exactly.
01:37:32.000 They're just taking it.
01:37:33.000 This is where I think this fervent form of naive futurism comes into being, and not just futurists, spiritualists and a lot of other people think, oh, you know, these, as you're saying, these, what do you call it, peerless networks?
01:37:45.000 Yeah, these decentralized peer networks.
01:37:48.000 And by the way, when I say you, I mean me too, because I do have this hope that somehow this thing is just like an escalation towards bliss and it's all going to work itself out.
01:37:55.000 And I do think that.
01:37:56.000 But again, if you look back at history, you will see that even if a thing has become archaic and antiquated, it doesn't mean that the people running that thing are going to give it up.
01:38:05.000 No, no, they don't want to give it up.
01:38:06.000 They're not going to.
01:38:07.000 The only way they give it up is through violence.
01:38:10.000 Bam!
01:38:11.000 That's our inevitability here.
01:38:13.000 We're in for a revolution here in our lifetime.
01:38:16.000 Kurzweil says that that's actually not the case, and he says that, you know, like...
01:38:22.000 The radio industry didn't want TV to become a thing.
01:38:25.000 There didn't need to be violence for TV to become a thing.
01:38:29.000 No, we're not talking about...
01:38:30.000 Ari Shafir is ready to start a goddamn revolution.
01:38:33.000 No, I'm ready to cheer it on.
01:38:34.000 I'm a coward.
01:38:35.000 I'm not going to get involved.
01:38:37.000 But I'm ready to say, here's some water, you guys.
01:38:39.000 Go out there and fight for us.
01:38:40.000 But I think that these technologies will meet resistance from the establishment, but I don't think that it requires violence for transformation to occur.
01:38:47.000 I mean, we're seeing it through social media.
01:38:49.000 What are you talking about?
01:38:51.000 Syria, Liberia, the other one.
01:38:56.000 I think you mean Libya.
01:38:57.000 Yeah, that's the one.
01:38:58.000 It always gets overthrown.
01:39:02.000 They don't give up.
01:39:03.000 They don't give up.
01:39:03.000 And they have the guns, so the only way to get them to give it up...
01:39:05.000 Taking the guns.
01:39:07.000 Which other ways?
01:39:08.000 When does it work?
01:39:09.000 Younger people who are growing up with the internet who have a different understanding of the future.
01:39:14.000 I don't think that a representative government is impossible.
01:39:17.000 I just think that we have to have more accountability.
01:39:19.000 And the thing that gives more accountability than anything is the internet.
01:39:23.000 It forces accountability.
01:39:25.000 So I think ultimately it's an age thing.
01:39:27.000 You can't record cops anymore?
01:39:29.000 You can, though.
01:39:30.000 You can.
01:39:30.000 Yes, you can.
01:39:31.000 They can arrest you?
01:39:32.000 No, they can't.
01:39:33.000 No.
01:39:33.000 I mean, individual places have passed laws trying to make that real.
01:39:37.000 But there's a bunch of videos online that show people telling cops.
01:39:41.000 They told me today I couldn't take a picture of a TSA. I saw a baby being left on the counter.
01:39:44.000 So it's funny.
01:39:45.000 So I took a picture.
01:39:46.000 Sir, you can't take a picture of the checkpoint.
01:39:47.000 I'm like, that's wrong!
01:39:49.000 What you're telling me is not true!
01:39:51.000 I'm going to take more pictures.
01:39:53.000 I got to say, I don't want there to be violence, man.
01:39:57.000 I don't think there needs to be.
01:39:58.000 I think the dream, the naive dream of the futurists is, it's like you look in the animal kingdom.
01:40:05.000 You look in, not just the animal kingdom, but you look at any massive change that has ever happened is always surrounded by a release of energy.
01:40:14.000 When things rapidly change, there's a release of energy, and energy releases are always violent.
01:40:19.000 Bam!
01:40:20.000 They're called explosions.
01:40:22.000 Well, it doesn't have to be, though.
01:40:23.000 It can be a psychic explosion.
01:40:26.000 It can be a consciousness explosion.
01:40:28.000 It doesn't require Hiroshima.
01:40:30.000 I think we're experiencing that.
01:40:31.000 It's just, again, the psychedelic state of seeing the whole history unfold in a matter of a few seconds.
01:40:37.000 I think this is just, it's slow, so we're not really understanding what's happening, but we're seeing all these...
01:40:43.000 Well, the psychedelics are coming back.
01:40:45.000 We're seeing all these things.
01:40:47.000 We're seeing all these paradigms crumble in front of ourselves.
01:40:51.000 And the reason being is because they're being exposed.
01:40:53.000 They're being exposed by the internet.
01:40:54.000 It's just happening to us too slowly.
01:40:57.000 Or it's too confusing as to which direction it's going to go.
01:41:00.000 There's too much peril in it.
01:41:02.000 It's the last few people that are just like, hey, it's changing.
01:41:04.000 They're like, no, no, no, no, no.
01:41:05.000 How do the robots make the Ayatollah understand that it's not good to put women in beekeeper outfits?
01:41:14.000 You know, I... I completely concur with that, but I had an interesting experience.
01:41:19.000 I was just in Berlin for a couple days, and I went to this steam room and spa over at the Soho house there, and it's co-ed.
01:41:29.000 So beautiful naked girls are walking around and showering in front of you in Germany, and it's perfectly normal.
01:41:36.000 And of course, I'm loving it, but I'm also slightly like, what's happening here?
01:41:39.000 And then I think, we're just as...
01:41:42.000 Primitive here with separating the men's room and the female room, compared to them, we're like the beekeeper suit.
01:41:49.000 Do you know what I mean?
01:41:50.000 Indeed.
01:41:51.000 I know what you're saying.
01:41:52.000 And I would think of myself as, no, we're liberal here in America, but look how shamelessly those women are just walking around naked.
01:41:58.000 It's like, relax, we're not fucking.
01:42:00.000 I totally agree with you.
01:42:02.000 I think that what you're seeing is a spectrum of men kind of controlling women.
01:42:07.000 Like if a woman takes her shirt off at the beach, she gets arrested in the United States.
01:42:11.000 Crazy!
01:42:12.000 But I think that if I'm going to be on some part of that spectrum, I definitely want to be on the part of the spectrum where the definition of shirt is like a shoelace or something.
01:42:22.000 I'd rather it be no shoelace at all.
01:42:24.000 But in that way, all I'm saying is I don't understand the solution to fundamentalists of any religion controlling massive populations or, in the case of North Korea, people controlling massive – I don't see how – Some android Jesus.
01:42:42.000 I don't see how Kurzweil's manifestation of full brain emulation and the subsequent empathic connection that happens with people.
01:42:50.000 I don't understand how that makes a person who's wielded control over a chunk of land using a false god.
01:42:57.000 I don't understand how that's going to make them be like, you know what?
01:43:01.000 I guess I was wrong.
01:43:03.000 But our society has been transformed by a change in consciousness.
01:43:08.000 I mean, you could argue that the 60s fundamentally changed the way we think.
01:43:11.000 I mean, this place used to be a lot more puritan than it is now.
01:43:14.000 Iran used to be a lot more liberal.
01:43:17.000 Okay, so you have both things happening.
01:43:20.000 You have cycles, you have things.
01:43:21.000 Iran, Lebanon.
01:43:22.000 I hear it's very liberal now amongst the people.
01:43:25.000 Yeah, the actual citizens are very secular, but the government is what's really military.
01:43:30.000 There's a new guy who got in office, but there's always going to be the Ayatollah that's a supreme leader.
01:43:35.000 Yeah, exactly.
01:43:36.000 It's really a weird situation.
01:43:37.000 It's very strange.
01:43:38.000 And the guy who's the Ayatollah has been the Ayatollah since like 89 or something like that?
01:43:42.000 Really?
01:43:43.000 Yeah.
01:43:43.000 Something crazy like that.
01:43:45.000 Well, you know what, man?
01:43:46.000 Let me just say this, because I don't want to come off sounding like Ari, like I want some violent revolution.
01:43:51.000 I don't want one.
01:43:52.000 I see it inevitable.
01:43:53.000 Have you ever had Steven Pinker on the show?
01:43:55.000 I don't think it's inevitable.
01:43:56.000 So Steven Pinker...
01:43:58.000 He wrote a book called The Better Angels of Our Nature.
01:44:00.000 He has a TED Talk called The Myth of Violence.
01:44:02.000 And he actually went up there and explained that the chances of a man dying at the hands of another man today are the lowest they've ever been in all of human history.
01:44:09.000 Wow.
01:44:09.000 The world has actually never been less violent.
01:44:12.000 Mm-hmm.
01:44:13.000 Than it is today.
01:44:13.000 But look at Syria.
01:44:15.000 100,000 people.
01:44:17.000 It's a lot, but I guess what he's saying is that it used to be worse.
01:44:21.000 The Mongols killed a million in a day.
01:44:23.000 Right, right, right.
01:44:23.000 I see what you're saying.
01:44:24.000 On a horse.
01:44:25.000 Again, it's that spectrum.
01:44:26.000 It's a spectrum.
01:44:27.000 In a day?
01:44:28.000 Yeah, some insane number.
01:44:29.000 I might be exaggerating.
01:44:30.000 Well, there were battles like that in World War II. 72 hours that killed a million people with horses.
01:44:35.000 On horseback.
01:44:36.000 Awful.
01:44:37.000 No nuclear bomb.
01:44:38.000 Just swords.
01:44:39.000 They must be so tired after that.
01:44:42.000 You're chopping arms.
01:44:44.000 That's when you gotta run, when they're that tired.
01:44:47.000 After they killed a million people.
01:44:50.000 Yeah.
01:44:53.000 A million.
01:44:55.000 We have to really sort of grasp.
01:44:57.000 It's hard to, but we have to try to grasp.
01:45:00.000 Just what a short period of time we haven't been barbarians.
01:45:03.000 Yeah.
01:45:04.000 I mean, even the Catholic Church, I mean, you go back to the 1500s, they were drinking, they had like mistresses, the popes had mistresses.
01:45:14.000 They even raped kids back then.
01:45:15.000 They had armies.
01:45:16.000 Yeah, they raped kids.
01:45:17.000 They had armies.
01:45:18.000 They had full armies.
01:45:19.000 The pope had armies.
01:45:20.000 The Catholic Church had armies.
01:45:21.000 The Vatican had an army behind it.
01:45:23.000 It's crazy.
01:45:24.000 You're talking about really nutty shit.
01:45:25.000 They asked for the Pope's help once in fighting off the Mongols.
01:45:29.000 Really?
01:45:30.000 Yeah, they didn't want to send any troops.
01:45:33.000 They got lucky the fucking Mongols died off.
01:45:38.000 Here's the thing, man.
01:45:39.000 I want to be pessimistic again.
01:45:40.000 Because I keep thinking about this.
01:45:43.000 So it's like, okay.
01:45:45.000 In the same way that it used to be, if a person had a great idea and wanted to transmit it, he'd have to get a printing press and, like, you know, get the idea around.
01:45:53.000 It took a long time.
01:45:55.000 The idea would have to go by boats, right?
01:45:57.000 So, now it's like, if you want to build, like, a nuclear bomb, It's really hard.
01:46:04.000 You've got to have centrifuges, plutonium.
01:46:06.000 It's a fucking bitch.
01:46:08.000 You can't just do it if you want to build a nuke.
01:46:10.000 But as 3D printers begin to become more and more advanced, and they start working at the atomic level, then in the same way that we have accessibility to instantaneous communication, people are going to have accessibility to instantaneous creation of all kinds of fucked up weapons.
01:46:27.000 That's when we're going to need an account of all the people in our community.
01:46:30.000 That's when we're going to need an account of them in a loving way.
01:46:33.000 No, but all it'll take is one person.
01:46:35.000 We have society looking out for each other.
01:46:37.000 All it'll take is one guy in his basement with a nuclear bomb.
01:46:40.000 You're right, you're right.
01:46:40.000 But the idea is that every baby has the potential to become an awesome human being.
01:46:45.000 That's the idea.
01:46:46.000 But some of them just get shit rolls of the dice.
01:46:49.000 And they wind up with two asshole meth head parents who fucking leave them in a basement one day for 24 hours and they starve to death.
01:46:56.000 You know, this shit like that happens to kids.
01:46:58.000 You just get a shit roll of the dice.
01:47:00.000 Or you can get an awesome roll of the dice.
01:47:02.000 Yes.
01:47:02.000 You know, I mean, that's possible too.
01:47:04.000 I think it's our job collectively as a human species to concentrate on the least fortunate amongst us.
01:47:11.000 I think it's the thing that everyone takes for granted, everyone ignores, and every single guy who runs for president doesn't bring it up.
01:47:20.000 They never say, look, our society is only as strong as its weakest link.
01:47:25.000 We've got a bunch of people that are being ignored, and they're an awesome resource.
01:47:28.000 If we educated them and helped them and moved them forward in some way, who knows what kind of great benefit you could get out of this community of people.
01:47:36.000 That's true.
01:47:36.000 I just think of that guy on Locked Up Raw.
01:47:40.000 That one, I can't remember which guy it is.
01:47:43.000 The people that you see in Locked Up Raw were just like...
01:47:47.000 I've thought, oh, I know the way to fix that.
01:47:51.000 You flood the prisons with LSD and give these people house therapy.
01:47:56.000 It would definitely help a little.
01:47:57.000 You're eventually going to have a 12 Monkeys situation.
01:47:59.000 I think you can do a psychedelic therapy.
01:48:01.000 Somebody's going to have so much effect on the rest of the population.
01:48:05.000 With a toxin or a bunch of nuclear weapons that it'll just drastically change everything.
01:48:09.000 Well, you know what?
01:48:09.000 I think society has always thought that this was coming or not.
01:48:13.000 It's like there's always people that believe that the apocalypse is around the corner, and there's always people that had faith.
01:48:18.000 And like Jason said, this is the safest time to be alive ever, but yet we're still like, fuck!
01:48:23.000 The sky is falling.
01:48:24.000 It's almost like a part of being a human to recognize all the flaws around us to make it glaringly obvious that we're aware of having to focus more attention on them and hopefully slow down the progress of the evil.
01:48:35.000 Us being scared and neurotic and almost negative has been biologically selected for to warn us against potential consequences.
01:48:43.000 The early caveman who was chilling out looking at the sky got eaten. The one that was scared of impending doom survived, so it's no coincidence that we have evolved that. But until we start, like, playing with our own genomes, we can't change our basic dispositions,
01:48:59.000 which is to pay attention to whatever we think is dangerous.
01:49:02.000 But I think it does have, you know, knowledge is power.
01:49:04.000 When you appreciate, oh really?
01:49:06.000 The world has never been safer?
01:49:07.000 That's interesting.
01:49:08.000 Not to say there's not a lot of things to worry about.
01:49:10.000 It's not to say that there's not school shootings and 3D printing guns could be dangerous.
01:49:14.000 But let's look at the actual facts as it is today.
01:49:17.000 Yeah, but all it would take, but in the beginning, if one guy out of a hundred of them went a little nuts, he would punch some people.
01:49:22.000 Right.
01:49:22.000 And then as technology got better, he stabbed two people and then got stopped.
01:49:25.000 And then as technology got better, he set off a bomb.
01:49:28.000 And then as technology got better, he flew a plane into a building.
01:49:30.000 And as technology gets better, you can affect millions and millions and millions of people all at once with just one...
01:49:37.000 Just one guy that falls in the cracks.
01:49:38.000 This is very much like the internet.
01:49:40.000 But it's funny, Ari, what you're saying goes against your hate of the surveillance state.
01:49:45.000 Because you could see how in an accelerating technology where people can blow each other up with increasing competency, you can see the necessity for perhaps an observation.
01:50:00.000 That's what I'm saying.
01:50:01.000 It's a pickle, man.
01:50:02.000 Yeah, but I think that's going to happen anyway.
01:50:04.000 You're never going to stop it fully.
01:50:05.000 You can't monitor everybody at all times.
01:50:07.000 But just because something's going to happen doesn't mean you don't try to stop it.
01:50:09.000 You delay it.
01:50:10.000 The trend seems irreversible.
01:50:12.000 The trend towards everybody having access to everything at all times.
01:50:17.000 I think we're going to have a real problem with money because money right now used to be based on gold and now we've got this ones and zeros thing that we're rocking that doesn't really make any sense at all.
01:50:27.000 I think that Bitcoin's not going to last but I think the thing that comes right after Bitcoin is going to be the one.
01:50:31.000 Oh, have you guys heard about this shit?
01:50:33.000 Bitcoin's going to be the front story.
01:50:33.000 But I wonder.
01:50:34.000 I mean, I think it's...
01:50:36.000 We live in a really strange time because as access to information gets more and more transparent, more and more free, where we all have access to everything...
01:50:46.000 Well, then what exactly is financial resources?
01:50:50.000 It's going to become obsolete.
01:50:52.000 Where are those ones and zeros?
01:50:53.000 Where do they go?
01:50:56.000 I'm just saying there will be a permeation point where transparency gets to a point where everyone has access to everything at all times.
01:51:04.000 You're not going to be able to store any secrets, so you're not going to be able to have money.
01:51:07.000 You're not going to be able to put ones and zeros all in your bank account.
01:51:10.000 It doesn't mean anything.
01:51:11.000 Someone will take your ones and zeros.
01:51:12.000 But there also might not be a reason for anybody to hurt each other.
01:51:15.000 Might not.
01:51:16.000 It doesn't have to be that way.
01:51:17.000 Right.
01:51:17.000 McKenna says we're all going to move into universes of our own construction.
01:51:20.000 But then, like, Kurzweil and Peter Diamandis just say, you know, technology is a resource-liberating mechanism.
01:51:26.000 The whole idea of scarcity is just contextual, you know?
01:51:28.000 We fight over 1% of the fresh water in the world when this is a water planet.
01:51:33.000 Desalinization revolution could give us all the water we could ever need.
01:51:35.000 We fight over energy.
01:51:37.000 We get 10,000 times more energy from the sun than we would ever need.
01:51:39.000 With nanotechnology, matter becomes a programmable medium.
01:51:43.000 We can turn anything into anything.
01:51:44.000 When we have infinite abundance, potentially, what would we fight about?
01:51:49.000 There would be no incentive to fight.
01:51:50.000 Pussy.
01:51:52.000 We could clone pussy.
01:51:54.000 We could clone.
01:51:55.000 We could live in virtual reality.
01:51:56.000 You know what?
01:51:57.000 It's like diamonds.
01:51:57.000 Girls don't want fake diamonds.
01:51:59.000 They want real diamonds that came from coal.
01:52:01.000 And the guys are going to want real pussy that you earn.
01:52:03.000 Oh, she's a real person?
01:52:04.000 Well, we can have it in virtual reality.
01:52:06.000 We'll have virtual games where we can be the heroes in our own universe.
01:52:09.000 The flaw in your argument, and the terrifying flaw in your argument, is the assumption that people do things for a reason.
01:52:15.000 You're saying people do things because they don't have enough or they do things because of this or that.
01:52:18.000 Some of the most horrible things are done for no reason at all.
01:52:22.000 They're done because the biocomputer that somebody's running clicked the wrong way and they decided it'd be fun to hear the sound of a teenage boy's neck snap when they're painted like a clown.
01:52:32.000 That's a cancer cell.
01:52:33.000 That's a thing that's not moving towards complexity and organization in the sublime like the rest of the evolutionary process.
01:52:41.000 That's why there's the urge to kill it.
01:52:42.000 Right.
01:52:43.000 There's the urge to kill it, but the interesting thing, and I think this, I heard, I think McKenna said this, there's this relay, there's a race happening right now.
01:52:50.000 There's a race happening.
01:52:51.000 Because it's not as though these two things can't exist at the same time.
01:52:54.000 It's like, we were talking about this, and I've been thinking about it a bunch since, how, you know, in phones, of course, there are conflict minerals.
01:53:01.000 What's the name of that shit in phones?
01:53:02.000 What's it called?
01:53:03.000 Coltan.
01:53:04.000 So in phones, it's Coltan.
01:53:05.000 So we know that in our phones, in this device that's allowing us this greater connectivity, is the suffering of children in African mines.
01:53:15.000 So we see in nature that there is this intertwining of people.
01:53:21.000 Very interesting.
01:53:22.000 And to think that somehow technology is going to make things all light is to say that we will actually rewire the universe, when in fact it seems like what's happening is an acceleration on both sides of the scale.
01:53:35.000 And as that acceleration happens, there will be an equivalent amount of this orgasmic, utopian, Teilhard de Chardin omega point with the other side of the thing, which is the absolute obliteration of all humanity through nuclear weapons or bioweapons.
01:53:51.000 Now, here's the hopeful thing is what Martin Luther King said, which is the universe bends in the direction of justice.
01:53:57.000 And there is this hope that there's a refraction in this lens where things are going towards the direction of creation instead of...
01:54:05.000 Yeah, well, Steven Johnson says it's not utopia, but it's leaning that way.
01:54:09.000 So, you know, you could argue the things are better.
01:54:11.000 They're not perfect.
01:54:12.000 They're better.
01:54:13.000 If technology amplifies the good in us, it amplifies the bad in us, but maybe it amplifies the good a little bit more than it amplifies the bad, so that eventually it might subvert completely.
01:54:24.000 The light might swallow the darkness.
01:54:28.000 Extropy might transcend entropy completely.
01:54:30.000 We might become immortal gods living outside of time, and maybe that's the singularity.
01:54:36.000 This sounds a lot like the battle between hell and heaven.
01:54:38.000 It does, doesn't it?
01:54:40.000 It's the same thing.
01:54:42.000 It's not a coincidence.
01:54:43.000 We need those archetypes to make sense of what's happening.
01:54:45.000 We've used the same archetypes.
01:54:47.000 Religion, salvation, transcendence, the same things.
01:54:49.000 The difference is that religion never produced what technology produced.
01:54:52.000 Religion never let us fly through the air.
01:54:54.000 Religion never gave us cell phones.
01:54:56.000 It never gave us the internet.
01:54:57.000 Technology does.
01:54:58.000 So in our desire to believe that these things are going to help us transcend our limitations, you know, technology is actually delivering a little bit more than the previous stuff.
01:55:09.000 Oh, a lot more.
01:55:10.000 How dare you with your a little bit...
01:55:12.000 Well, there you go.
01:55:13.000 A lot more.
01:55:13.000 Cyborg arms.
01:55:15.000 Trying to make friends.
01:55:15.000 Kids are hearing again.
01:55:17.000 They made Galileo apologize.
01:55:19.000 Yeah, they said you did.
01:55:20.000 You were wrong.
01:55:20.000 The church.
01:55:22.000 What do you have to do with it?
01:55:24.000 You guys want to see a trippy video about us being cyborgs?
01:55:26.000 Okay, we're going to wrap this up with this trippy video because I've got to get out of here.
01:55:29.000 I've been working all day.
01:55:30.000 Let's do it.
01:55:31.000 Howdy, Christian.
01:55:31.000 Power to the people.
01:55:32.000 Happy Mushroom Fest to everyone who's participating.
01:55:34.000 Shroom Fest, bitches.
01:55:35.000 God is love.
01:55:36.000 Everything's going to be fine.
01:55:37.000 Forget all that explosion shit.
01:55:39.000 Yes, and before we go, we would just like to thank Hover.com.
01:57:42.000 Go to Hover.com forward slash Rogan.
01:57:44.000 Get 10% off of your domain name purchases. And Stamps.com.
01:55:51.000 If you click on the microphone, enter in the code word JRE, get yourself a special offer.
01:57:57.000 And Onnit.com. That's O-N-N-I-T. Use the code name ROGAN. Save yourself 10% off any and all supplements.
01:56:05.000 I've had a wacky few, actually like a month and a half lately, ladies and gentlemen.
01:56:11.000 And I wouldn't have been able to do all this stuff if I didn't enjoy the shit out of it.
01:56:14.000 And it's a fascinating experience.
01:56:16.000 So I want to thank all of you.
01:56:18.000 And I want you guys to follow all my friends on Twitter, including Jason Silva.
01:56:24.000 What is it?
01:56:25.000 At Jason Silva.
01:56:26.000 At Jason Silva.
01:56:26.000 At Jason Silva.
01:56:28.000 At Ari Shafir.
01:56:29.000 Watch my Storyteller show online, on YouTube.
01:56:32.000 Yeah, there's a couple of them now.
01:56:33.000 The first one was really good, too.
01:56:35.000 Where'd you film those?
01:56:36.000 Cheetahs.
01:56:37.000 Oh, that's hilarious.
01:56:38.000 Yeah.
01:56:39.000 That's what it looked like.
01:56:40.000 It looked like some sort of a strip club type of environment.
01:56:42.000 It's perfect.
01:56:43.000 And TJ Miller's one was really good, too.
01:56:45.000 That was really funny.
01:56:47.000 And it's a really interesting sort of setup, the way you have it.
01:56:49.000 And I love that Comedy Central...
01:56:51.000 Have the balls to put that online like that.
01:56:53.000 Just produce it.
01:56:54.000 Make it like a real show.
01:56:55.000 Like a legit TV show.
01:56:56.000 Put it online.
01:56:57.000 And if it gets a good reaction, they'll probably wind up doing something like that on television.
01:57:01.000 Yeah, exactly.
01:57:03.000 You do what you feel is right.
01:57:05.000 Yeah.
01:57:06.000 So go and check it out and support it.
01:57:08.000 Joey Diaz is coming.
01:57:08.000 Joey Diaz is coming.
01:57:09.000 Support it, folks, because Ari Shaffir is a bad motherfucker.
01:57:12.000 I'd like to do it on television for you guys, too.
01:57:13.000 The concept is sound.
01:57:14.000 So pass it around to each other.
01:57:15.000 And he would like to get some of that sweet, sweet TV money.
01:57:18.000 Mostly I really just want to put on a good show for people.
01:57:20.000 You do.
01:57:20.000 You do.
01:57:21.000 I want to check it out, man.
01:57:22.000 Your intentions are 100% pure.
01:57:23.000 You're a real legit comic, Ari Shaffir. It's an honor watching you grow from being a dude who was just sort of getting on stage for the first time to being a legit headliner.
01:57:32.000 It's awesome.
01:57:32.000 It's cool to see.
01:57:33.000 Thanks, man.
01:57:34.000 Wow, cool.
01:57:35.000 Same to you, Duncan Trussell, you sexy bitch.
01:57:37.000 YouTube, Comedy Central, This Is Not Happening.
01:57:38.000 All right, you fucks.
01:57:40.000 Follow Duncan Trussell on Twitter.
01:57:41.000 Duncan, D-U-N-C-A-N, Trussell, T-R-U-S-S-E-L-L. This is my podcast, Duncan Trussell Family Hour.
01:57:48.000 Double S, double L, that's Duncan Trussell.
01:57:52.000 Awesome.
01:57:52.000 All right, you fucks.
01:57:53.000 If I may add, I would love everyone to check out Brain Games on National Geographic.
01:57:59.000 Yes, yes.
01:57:59.000 I have a new series.
01:58:00.000 It airs Monday nights at 9. We're doing season 2 now in the fall, but currently we're still airing it.
01:58:05.000 Next Monday is the final episode, actually, the 12th episode.
01:58:08.000 So please check out Brain Games on National Geographic.
01:58:10.000 And I also launched a new YouTube channel called YouTube.com slash Shots of Awe, where I'm releasing new episodes of my crazy espresso psychedelic videos every week.
01:58:20.000 So YouTube.com slash Shots of Awe.
01:58:22.000 If Red Band made that video, it would be Shots of Awe, just A-W, and it would just have kittens all day long.
01:58:28.000 Aww.
01:58:30.000 Well, this one is Shots of Awe, A-W-E, but if you check it out, that new video is called We Are Already Cyborgs.
01:58:36.000 That's the one I want you to check out.
01:58:38.000 Hopefully with that.
01:58:39.000 Jason Silva.
01:58:40.000 Do you have that thing queued up?
01:58:42.000 YouTube.com slash Shots of Awe, and it's the first one down.
01:58:45.000 It's called We Are Already Cyborgs.
01:58:48.000 How dare you?
01:58:50.000 We're already cyborgs.
01:58:51.000 We can't get a goddamn YouTube video to play.
01:58:53.000 You're fired!
01:58:55.000 Destroy the planet!
01:58:56.000 How dare you?
01:58:57.000 All right, well, let's just recommend that people watch it.
01:58:59.000 Yeah, sure.
01:59:00.000 It's called Shots of Awe on YouTube, and We Are Already Cyborgs, with which I concur, sir.
01:59:07.000 Awesome.
01:59:07.000 I concur.
01:59:08.000 You're a bad motherfucker.
01:59:08.000 Cosmic Dick Slinger.
01:59:10.000 Oh, yes.
01:59:10.000 The great Jason Silva.
01:59:12.000 Thanks for that.
01:59:12.000 Thank you for having me.
01:59:14.000 Powerful Duncan Trussell.
01:59:15.000 Powerful Jamie.
01:59:16.000 We're out of here, you dirty fucks.
01:59:17.000 We'll see you next week with new shenanigans.
01:59:20.000 And Ari Shaffir and I go to Alaska to conquer the Great White North.
01:59:23.000 Yeah.
01:59:23.000 Go fishing like a motherfucker.
01:59:25.000 Get a reindeer dog.
01:59:26.000 All right, you fucks.
01:59:26.000 Awesome.
01:59:27.000 We'll see you soon.
01:59:28.000 We love the shit out of you.
01:59:29.000 All right.
01:59:29.000 Big kiss.