The Joe Rogan Experience - March 12, 2012


Joe Rogan Experience #194 - Jason Silva


Episode Stats

Length

2 hours and 46 minutes

Words per Minute

211.6

Word Count

35,251

Sentence Count

3,278

Misogynist Sentences

79

Hate Speech Sentences

58


Summary

This week, we talk about porn, nootropics, virtual reality, and the future of the porn industry. Joe also talks about how he thinks virtual reality is going to change the way we look at sex in the future. Joe is joined by Jason Silva and Brian Redban to talk about this and much more on this week's episode of the Joe Rogan Experience Podcast.


Transcript

00:00:02.000 Shazam, bitches.
00:00:04.000 Yes, we're back.
00:00:05.000 Four days in a row this week.
00:00:06.000 That's how crazy we're living out here on the West Coast.
00:00:10.000 Holla!
00:00:11.000 See, I just made it to East Coast, West Coast things.
00:00:13.000 The Joe Rogan Experience Podcast is brought to you by the Fleshlight.
00:00:16.000 There's no different way to do this.
00:00:17.000 I've done it every possible way.
00:00:19.000 I'm just going to say it.
00:00:21.000 What is the Fleshlight, Joe?
00:00:22.000 Have you used your Fleshlight lately?
00:00:23.000 No, I have not.
00:00:24.000 I actually reused it yesterday.
00:00:27.000 Really?
00:00:27.000 I was watching my girlfriend on camera fucking another girl, and I used my Fleshlight when I watched it.
00:00:32.000 You got a little personal with the whole rest of the world right there.
00:00:35.000 There's a lot of people that are not comfortable with someone that they think is their friend, like little Brian Redban.
00:00:40.000 Telling them about fucking and watching things.
00:00:42.000 Have you ever done one of those webcam shows?
00:00:44.000 You're like a moral degenerate.
00:00:44.000 You're bad for society.
00:00:45.000 Have you ever seen one of those webcam shows?
00:00:47.000 Yes, I have.
00:00:47.000 It's pretty cool.
00:00:48.000 Somebody sent me a link the other day, and he goes, what the fuck, the motherlode?
00:00:52.000 And I go to the link and it's some crazy cam site where it's all girls that are topless and are like playing with themselves.
00:00:59.000 Yeah.
00:00:59.000 And it's all free.
00:01:00.000 Yeah, it's weird.
00:01:01.000 The one I use is Stream8.
00:01:03.000 And you can go to it and just have a conversation with the girl while she's sitting there topless.
00:01:08.000 And then she's like, you want to go in a private room for $10?
00:01:10.000 And then you do whatever you want to in the private room.
00:01:13.000 This was all virtual?
00:01:14.000 Yeah, this is like webcams and stuff.
00:01:17.000 But there were so many old ladies and fetishes that it was just creepy to look at an old lady.
00:01:22.000 There was an old lady that was maybe 75 years old.
00:01:26.000 I want to hear about this.
00:01:28.000 I want to hear about this, but let's get through these stupid commercials, because that sounds too good to just be relegated.
00:01:33.000 The adult industry has always been pioneers of new technology, whether it was like home video, you know, and then like on the internet.
00:01:39.000 I mean, they've always been the first.
00:01:41.000 And when we have virtual reality, I bet virtual sex will be pioneering it.
00:01:45.000 Yeah, yeah.
00:01:46.000 Alright, we're going to get to that.
00:01:48.000 We're going to get to that.
00:01:48.000 But let's get this out of the way.
00:01:50.000 Fleshlight.
00:01:50.000 Yeah, Fleshlight sponsored, JoeRogan.net.
00:01:53.000 Enter in, code name, Rogan, get 15% off.
00:01:55.000 It's a solid product.
00:01:57.000 It's the best fake vagina out there.
00:01:58.000 It really is way better than masturbating.
00:02:00.000 There.
00:02:00.000 We're also brought to you by Onnit.com, O-N-N-I-T, makers of AlphaBrain, the cognitive enhancing supplement that I use.
00:02:06.000 I didn't take one this morning, so let's see if I stumble.
00:02:08.000 If I stumble through my words today.
00:02:10.000 I don't have any more.
00:02:11.000 You took all mine.
00:02:11.000 Dude, I gave you some.
00:02:12.000 Didn't I give you some of the new stuff?
00:02:13.000 No.
00:02:14.000 Yeah, I got some new stuff for you.
00:02:15.000 I need to get back on it.
00:02:17.000 Get on it.
00:02:17.000 Get up on it.
00:02:19.000 What is this shit?
00:02:20.000 What is it?
00:02:21.000 What are you talking about?
00:02:22.000 We're talking about nootropics.
00:02:23.000 And what nootropics are, they're vitamins that are designed to enhance the function of your mind, the way your thinking works.
00:02:29.000 And they work.
00:02:30.000 They work.
00:02:30.000 They're only going to get better.
00:02:32.000 Yeah.
00:02:32.000 Do you try stuff?
00:02:34.000 What do you do?
00:02:34.000 I have not, but David Pearce, who wrote The Hedonistic Imperative, talks about how nanotechnology eventually is going to be used to design vaster and broader versions of human intelligence.
00:02:46.000 I'm sort of all about tinkering with ourselves in order to sort of improve ourselves.
00:02:51.000 Oh yeah, for sure.
00:02:51.000 Where the ultimate art project is us, right?
00:02:53.000 So it's kind of like, why not?
00:02:55.000 I've got to get you some of this just to see if you like it.
00:02:57.000 I have some people that I've gotten on it that are completely addicted to it.
00:03:00.000 It's fascinating stuff.
00:03:01.000 I like it.
00:03:02.000 Some people...
00:03:03.000 The most important thing about this product, I should say, is that we have...
00:03:06.000 For the first order, for your first 30 pills, there's a money-back guarantee, 100%.
00:03:09.000 You don't have to send the product back.
00:03:11.000 All you have to do is say, you know what?
00:03:12.000 This stuff didn't work for me.
00:03:13.000 And you get 100% of your money back.
00:03:15.000 We're way more concerned with having people feel like they're not ripped off than with making money.
00:03:21.000 So check out the stuff.
00:03:22.000 It's all at Onnit.com.
00:03:24.000 We have sports-related supplements, immunity-related supplements.
00:03:27.000 We have New Mood, which is a 5-HTP supplement, which boosts your serotonin, actually makes you feel happier.
00:03:33.000 It's great stuff.
00:03:34.000 It's all healthy for you.
00:03:35.000 It's all explained online in the frequently asked questions.
00:03:38.000 You can go into it in depth and Google nootropics and try out all the different stuff that's out there, and just check it out.
00:03:45.000 But go to Onnit.com if you're interested, and order your Alpha Brain, enter in the code name ROGAN, and you get 10% off.
00:03:51.000 Not just the first order, but every order.
00:03:52.000 All right, bitches, Jason Silva's back!
00:03:56.000 He's slinging... Experience. Train by day, Joe Rogan podcast by night, all day. Jason Silva, come back to sling more cosmic dick. Ha ha ha ha ha ha. You said we're cosmic revolutionaries.
00:04:11.000 I say you're a cosmic dick swinger.
00:04:13.000 How about that?
00:04:14.000 Wow.
00:04:15.000 Well, thank you for having me back, dude.
00:04:17.000 Oh, please.
00:04:18.000 It was so fun.
00:04:19.000 Thanks, man.
00:04:20.000 It was so fun.
00:04:20.000 I had a great time, too.
00:04:21.000 Just a shout-out to all these amazingly engaged listeners and followers, dude.
00:04:26.000 The response was so positive and so...
00:04:29.000 Yeah, we have a really super positive group of people that follow the show, and it sounds ridiculous.
00:04:34.000 How do you do that?
00:04:35.000 I mean, how does that ever happen?
00:04:36.000 I don't know, but I'm so honored.
00:04:38.000 Maybe it's because of your authenticity, man.
00:04:40.000 Well, I'm honored, if that's what it is.
00:04:42.000 Whatever it is, I'm honored.
00:04:43.000 It comes across, man.
00:04:45.000 When we go to clubs, that's the thing that the waitresses are always saying, that our crowds are so nice and that they tip really well.
00:04:51.000 It just makes you feel so good.
00:04:52.000 It's like the biggest feeling of accomplishment that I've ever had is...
00:04:56.000 Someone who listened to the show once and said, your show makes me want to try to be a better person.
00:05:02.000 This is the kind of feedback that we've been getting about our mind meld, dude.
00:05:05.000 It's been insane.
00:05:06.000 Like some people have created these remix videos where they've taken highlights and sound samples from what we talked about.
00:05:12.000 There's so many.
00:05:12.000 Set them to imagery and set them to music.
00:05:14.000 And that's kind of like what the creativity and the whole remix culture is all about.
00:05:17.000 It's not about where you take things from, it's where you take them to.
00:05:20.000 Yeah, yeah.
00:05:20.000 And you see that, it's just like, oh my god, it's...
00:05:22.000 There's a bunch of those out there now.
00:05:24.000 There's so many guys that are really good at that too.
00:05:27.000 There's so much creativity.
00:05:28.000 Oh yeah, so much.
00:05:28.000 And most of them have like regular jobs.
00:05:30.000 They're just like regular dudes.
00:05:32.000 So they're doing it out of pure passion.
00:05:34.000 Yeah.
00:05:34.000 There's a kid who calls himself the Paradigm Shift on YouTube.
00:05:37.000 I met him.
00:05:38.000 And you know, he's just a really fucking talented guy.
00:05:42.000 He made this thing for me, the American War Machine.
00:05:45.000 And I mean, it's like, it's humbling.
00:05:47.000 It's humbling, because you hear the words that you say, and the words seem just kind of obvious to you, the things that you've thought of and said a hundred times, but then when this kid puts it to images and video and music...
00:05:58.000 And then you see the power of an idea, the power of an idea to live on beyond its inception, beyond the moment that it came out of your mouth.
00:06:04.000 There was this guy, The Thinking Primate is the YouTube name, and they did a remix of us, and I thought it was glorious.
00:06:11.000 Honestly, I thought it was glorious.
00:06:12.000 Yeah, there's a lot of those guys out there.
00:06:14.000 And yeah, we're super honored that they do that.
00:06:17.000 It's one of the coolest things of all time.
00:06:19.000 It's a weird thing going on right now, man.
00:06:21.000 I think the internet has kind of ushered in a whole new culture.
00:06:23.000 I really do believe that.
00:06:26.000 You can't get by on bullshit anymore.
00:06:28.000 Yeah, it seems like a culture of massive collaboration and cooperation.
00:06:32.000 Even that recent example of that viral video that they made about Joseph Kony in Africa.
00:06:37.000 And it reached 100 million views in a week.
00:06:40.000 And I think that just what it shows, between that and also the anti-SOPA movement online, I think that what it's really demonstrating is just the ability to create viral swells that have massive impact without having used mainstream media, for example.
00:06:55.000 Just make a video, put it on YouTube for free, and have a voice in the national conversation.
00:07:01.000 Everybody can do that, and the price points keep going down and down and down exponentially.
00:07:06.000 And there's no reason not to think...
00:07:09.000 God, what comes next, right?
00:07:10.000 Well, yeah, well, this guy, I don't know the whole story on the guy who orchestrated the whole Kony campaign, and I've seen some criticisms about him, but it didn't really make much sense to me.
00:07:19.000 I mean, it seems like this guy really is a war criminal, and what this guy's doing by exposing that, it's like, yeah, we're exposing, really, a guy who's done some terrible, horrible things.
00:07:27.000 Oh, no, absolutely.
00:07:28.000 Unquestionably, right?
00:07:29.000 I just think the success of the campaign, like, game-changing, game-changing viral success...
00:07:36.000 It also is going to invite scrutiny that comes with that.
00:07:40.000 So I think whatever the controversy is, that's a whole separate conversation.
00:07:44.000 I think the real conversation is people, democracy, social movements, revolutions, take note.
00:07:51.000 This is how you join the conversation.
00:07:53.000 This is how you get your voice heard.
00:07:54.000 No need to take up arms.
00:07:56.000 No need to be violent.
00:07:57.000 You want to get something heard?
00:07:59.000 Have a good video editor and a good sense of aesthetic presentation.
00:08:02.000 Yeah, no shit, huh?
00:08:04.000 It's kind of amazing.
00:08:04.000 You know, I saw the tweets.
00:08:06.000 They started coming in.
00:08:07.000 You know, it's Kony, Kony, Kony.
00:08:09.000 And I knew who the guy was.
00:08:11.000 I'd read about his movement in Africa.
00:08:15.000 And you saw Peter Pan.
00:08:17.000 I saw Peter Pan?
00:08:18.000 It's all exactly the same as Kony.
00:08:20.000 Really?
00:08:20.000 Yeah.
00:08:21.000 Didn't Peter Pan used to steal the kids and make them an army?
00:08:24.000 Isn't it horrible, though, that that actually is happening?
00:08:27.000 That someone, they're stealing children and forcing them to become soldiers?
00:08:30.000 I mean, it's just terrifying stuff.
00:08:32.000 It's really, really horrifying, horrifying stuff.
00:08:35.000 It's terrible, but I do think that we're seeing violence going down across the world.
00:08:40.000 I mean, this guy, Steve Pinker, and he has a TED Talk, The Myth of Violence.
00:08:43.000 We might have mentioned it last time.
00:08:44.000 We'll say that...
00:08:45.000 Violence is down across the world, and the chances of a man dying at the hands of another man are the lowest that they've ever been.
00:08:50.000 Now, granted, there's more people in the world than there were in the past, but proportionally, the violence is a lot less.
00:08:54.000 And I think as these people, you know, the rising billion in certain parts of the world, coming online, getting smartphones, joining the global conversation, all of a sudden can have their voices heard.
00:09:02.000 And the first step to addressing a problem is, you know, making an awareness that the problem is there so that the importance of it can resonate with people.
00:09:10.000 And so I think there's reason to, you know, be optimistic about even the worst of the worst getting less worse.
00:09:16.000 I agree.
00:09:16.000 I think we automatically go pessimistic because things aren't perfect.
00:09:19.000 We look at it and we go, God, why is there so much fucked up shit in this world?
00:09:23.000 Why is there so much crime?
00:09:24.000 Why is there so much violence?
00:09:25.000 Why is there so much death?
00:09:26.000 Why is war still here?
00:09:28.000 Why is corruption still here?
00:09:29.000 But what you don't realize if you really stop and think is like, this is the best it's ever been ever by a goddamn long shot!
00:09:37.000 Absolutely.
00:09:38.000 I was driving on the way over here today on the highway and it was a nice day here in Pasadena.
00:09:43.000 There was no one on the highway.
00:09:44.000 It was like easy traveling.
00:09:46.000 It's nice and beautiful and sunny out.
00:09:48.000 I was thinking how much it would have sucked to live just 500 years ago.
00:09:53.000 Oh, totally.
00:09:54.000 Just 500, a blip in time, like nothing.
00:09:57.000 No cars, fucking horses.
00:10:00.000 There's not even trails out here.
00:10:02.000 You need to see, there's a presentation by this guy called Hans Rosling; his website is Gapminder.
00:10:08.000 He does this thing where he shows all the nations across the world over time and how the indicators of quality of life and infant mortality rate and income and all these different things.
00:10:16.000 He shows that all the countries of the world, even the worst of the worst, are rising.
00:10:19.000 So the rising tide does lift everybody else.
00:10:21.000 It's unbelievable.
00:10:22.000 And I think the reason that most people don't realize that things are always getting better is because of the amygdala.
00:10:27.000 Peter Diamandis did a presentation about this at the TED conference just a week and a half ago.
00:10:32.000 And he has this book called Abundance and he'll explain that because our brains evolved in a time where we had to have fight or flight mode, the amygdala is always looking for danger and it supersedes everything else.
00:10:40.000 And so the media gives us danger because that's what we're drawn to.
00:10:44.000 If it bleeds, it leads.
00:10:45.000 And we're always going to be paying attention to what's wrong even when there's infinitely more things that are going right.
00:10:50.000 And because the media wants to just get viewership, the mainstream media will feed us what we want, which is to see all the horrible things that are happening across the world.
00:10:57.000 Although, eventually, that's actually going to be a good thing because if we can see what's wrong, we'll try to address it and try to fix it.
00:11:03.000 But even when we remedy 99% of the problems that exist today, our brains are still going to be seeing the new problems because that's what the brain does.
00:11:10.000 Yeah, the amount of time from us running from jaguars to being a guy who steps into a Jaguar and turns the key, the amount of time is so small.
00:11:20.000 The biology has never had a chance to catch up.
00:11:22.000 It does not!
00:11:23.000 We have pretty much the same brains as we did 100,000 years ago.
00:11:27.000 I mean, 100,000 years ago, kind of everyone is agreeing, unless you really go extreme, that there was no sophisticated culture, which is nothing.
00:11:36.000 100,000 years is nothing.
00:11:37.000 It's a blink.
00:11:38.000 What the fuck happened, man?
00:11:39.000 It's a blink.
00:11:40.000 Language.
00:11:40.000 Language.
00:11:41.000 We got into it last time.
00:11:42.000 Yeah, we did get into language.
00:11:43.000 You really believe that that just made everything change because we could exchange information?
00:11:47.000 Yes, well, because the moment that we invented, and this is where Terrence McKenna gets into, you know, gets Kurzweilian and Kevin Kelly-ish in his comments, is that he said that when we invented language, biological evolution stopped playing the key role.
00:12:02.000 Yeah.
00:12:02.000 Because it was replaced by this, you know, cultural epigenetic type of evolution which goes faster and faster and faster because it accrues knowledge and it builds on itself and it's not limited by the hardware of the brain which would take billions and billions of years to change, you know?
00:12:17.000 And so this cultural thing, you know, all of a sudden each brain became a neuron in a vaster global brain of accrued knowledge and intelligence that was bootstrapping on its own complexity, which is why, over the last hundred thousand years, the cultural evolution has been accelerating exponentially.
00:12:35.000 It manifested as technology, technological evolution.
00:12:37.000 But what's most interesting is that this telescopic nature of it gets faster and faster and faster.
00:12:42.000 So over the last 100,000 years, yeah, crazy.
00:12:44.000 But over the last 100 years even, it's gotten crazier than the last...
00:12:48.000 A billion!
00:12:49.000 Well, they say that a thousand years ago, no one could read silently.
00:12:53.000 Right.
00:12:54.000 There you go.
00:12:54.000 They had to read by talking.
00:12:56.000 They had to say the words.
00:12:57.000 No one could read silently.
00:12:59.000 And it was actually one of the ways that some guy, I don't remember, some religious figure, Thomas Aquinas, maybe it's him?
00:13:06.000 I'm not sure.
00:13:07.000 Okay.
00:13:07.000 Proved that he was a saint.
00:13:10.000 Because he could read silently.
00:13:12.000 Because he could read silently and then he would recite it.
00:13:14.000 Amazing.
00:13:15.000 He would look at it, not say anything, look at the scripture, obviously not reading because he wasn't speaking aloud.
00:13:20.000 Wow.
00:13:21.000 And then he would recite it.
00:13:22.000 Wow.
00:13:22.000 And that was why his mastery of the scripture was unparalleled.
00:13:25.000 It's because he could read silently.
00:13:26.000 He was like the only guy.
00:13:27.000 Yeah.
00:13:28.000 I don't think it's that guy, though.
00:13:29.000 It's like it's one of those other religious people that may or may not have ever really existed.
00:13:33.000 Might not have been Thomas Aquinas.
00:13:35.000 I didn't ask Sam Harris if he believed it.
00:13:38.000 Jesus was a real human.
00:13:39.000 That was in the Zeitgeist documentary, I remember.
00:13:42.000 Was it?
00:13:42.000 They said that he probably never even existed.
00:13:44.000 Well, because he shares all the same attributes as all these other gods and all these other cultures.
00:13:48.000 They all die at the same age.
00:13:49.000 They're all born at the same, like...
00:13:51.000 But isn't it also possible that it could have been just a real person, but they attached all these other attributes to him because of ancient mythology?
00:13:58.000 I suppose.
00:13:59.000 If you look at it, it's just completely open.
00:14:02.000 Right.
00:14:02.000 But I don't know if it was a real dude, but man, you want to talk about one guy just dominating religion for thousands of years.
00:14:12.000 Well, he became a meme.
00:14:14.000 He was no longer a person, man.
00:14:17.000 He became a meme.
00:14:18.000 In Richard Dawkins' book, The Selfish Gene, he says that there was this new replicator.
00:14:23.000 Just like genes were the replicators.
00:14:26.000 We could multiply and they could evolve over time.
00:14:28.000 That there was a new replicator that was born above the biosphere.
00:14:32.000 A new kingdom above the biosphere.
00:14:34.000 And the denizens of this kingdom were ideas.
00:14:37.000 And so he said ideas in the form of memes.
00:14:39.000 They're like organisms.
00:14:41.000 They've retained the properties of organisms even though they rise above the biosphere.
00:14:45.000 They replicate.
00:14:46.000 They complete each other.
00:14:47.000 They mutate.
00:14:48.000 They leap from brain to brain.
00:14:50.000 They compete.
00:14:51.000 They compete for attention, you know?
00:14:53.000 And he goes crazy.
00:14:56.000 And James Gleick, who wrote the book The Information, says that the primary building block of reality might be information before it, before matter itself.
00:15:06.000 So he actually says "it from bit," matter comes from information, and that information is really what's at the core of reality.
00:15:14.000 And it's just an insane idea.
00:15:17.000 Because that goes back to the whole thing about the power to change the world.
00:15:20.000 People, ideas, passions can change the world because ideas have done more than genes over the last hundred years.
00:15:27.000 Well, McKenna would always go on about the world being made of language.
00:15:31.000 Yes!
00:15:32.000 Really hard to wrap your head around, man.
00:15:34.000 That was a real mindfuck.
00:15:36.000 It was huge.
00:15:37.000 Sort of not really, but wait a minute.
00:15:39.000 Because two people have to communicate in order to create something together.
00:15:43.000 And then you're thinking about infrastructures and cities.
00:15:44.000 That is all a factor of language.
00:15:47.000 Without language, none of this would be there.
00:15:50.000 It's just so hard to wrap your head around that.
00:15:52.000 Yeah, and I think that he was spot on.
00:15:55.000 And I think that the reason that he was spot on is because I think when he says, okay, the world is made of language, what he's saying is we create a mental model of the world in order to understand the world, in order to speak about the world and react to the world.
00:16:08.000 We create a mental model in our head and then we label those pictures in our heads, you know, symbolically.
00:16:13.000 So we abstractify reality.
00:16:15.000 And therefore, the way that we interface reality through the prism of our language, our thinking, our preconceptions, our stereotypes, our culture, which is to say we don't see the world as it is, we see the world as we are, which speaks exactly and directly to what I think McKenna was saying.
00:16:28.000 Reality is made of language.
00:16:30.000 It's almost like...
00:16:31.000 It's why they say that even thinking a happy thought will start to make you happier.
00:16:36.000 Essentially, the world changes.
00:16:38.000 You become happier about the world simply by thinking it so.
00:16:41.000 And it sounds kind of new-agey and stuff, but not really, because even the object of description, I think, does something to influence one's perception of reality, which is just how you interpret electrical signals going through your brain anyway.
00:16:55.000 And so if you're aware that reality is made of language, and that we're, like, co-creating it with our intention, something which is of course magnified with psychedelics, that's why they talk about set and setting being so integral to the trip, because your thoughts about the trip affect the trip itself. So thoughts become reality. But we should think of our lives as one big fucking trip. Our normal baseline waking sober lives is one big hero's journey, and it should be up to us to think of it so. And so if we're all on a hero's journey, if we're all on an extended,
00:17:25.000 lifelong mind-manifesting, which means psychedelic, trip, then we have a responsibility to sort of use words to map our reality the way that we want, to be authors of our reality, of our existence, to make a masterpiece out of life, one that we would willingly live again and again for all of eternity.
00:17:44.000 So I... Like what we're doing now.
00:17:47.000 Our conversation.
00:17:48.000 It's changing the reality inside the synapses of those that are engaging with us just the same way we're changing each other's reality right now.
00:17:56.000 This is a different reality than where we were an hour ago.
00:17:59.000 We're literally interfacing in a different universe.
00:18:03.000 You don't think about it that way, though.
00:18:04.000 You think, well, we're just doing a podcast.
00:18:08.000 A podcast.
00:18:09.000 Chilling here, talking shit.
00:18:10.000 Yes, but you're...
00:18:13.000 Portions of your mind, the output of your mind, whether it's immaterial or not, still creates tangible impact in the world.
00:18:23.000 Because think of like the one or five or ten people that you might inspire to create some work of art that came out of what they heard in this conversation.
00:18:33.000 And that work of art gets licensed by a brand to create a campaign for creativity that then the government of Finland adopts in their...
00:18:42.000 In their policy for education for the following year, and it transforms the lives of the next generation of students.
00:18:49.000 The butterfly effect in transformation triggered by ideas is more powerful than, you know, I think... than that of the physical world.
00:19:01.000 I think you're absolutely right, especially in the age of the internet.
00:19:04.000 I think this is the time where the ideas really can go viral almost instantaneously.
00:19:09.000 Like this Kony video.
00:19:10.000 I don't even think we've really fully examined the impact possible through information.
00:19:18.000 Especially with what are kids going to be like, man.
00:19:21.000 What are 20-year-old kids going to be like 20 years from now?
00:19:24.000 Just growing up in this...
00:19:27.000 More advanced and empowered.
00:19:29.000 Yeah.
00:19:29.000 Way more aware.
00:19:31.000 Way harder to bullshit.
00:19:32.000 Yeah.
00:19:32.000 Bluetooth enabled kids.
00:19:34.000 Yeah.
00:19:34.000 They're just going to be crazy.
00:19:35.000 They're going to look back on the nonsense that we believe today and they're going to be laughing at us, man.
00:19:40.000 Yeah.
00:19:41.000 I think even the way that...
00:19:43.000 How things are voted in, you know, how people resolve issues.
00:19:47.000 I think the idea of having representatives over there to carry our voice to Washington is obsolete because we are post-geographical beings at this point.
00:19:58.000 We don't need somebody else to represent us necessarily because we can all represent ourselves and have a voice online.
00:20:04.000 In fact, there's people that are talking about how we could reform or upgrade or re-examine how government is run and how people are represented.
00:20:11.000 I mean, I'm talking a little farther out, but there's this guy who's starting this thing.
00:20:14.000 He's a friend of mine.
00:20:15.000 His name is Micah.
00:20:15.000 He used to actually be with Students for Sensible Drug Policy, and now he's doing this thing called Dynamic Democracy, which is about starting a conversation and exploring new ways of how the Internet, the human extended nervous system that's connecting us all, right?
00:20:28.000 Because we love saying that.
00:20:28.000 We are all connected.
00:20:29.000 We are all empowered.
00:20:31.000 Well, how about we upgrade the way the world is run, you know, like on a meta scale?
00:20:37.000 Well, let's talk about it, you know?
00:20:39.000 Yeah, yeah.
00:20:40.000 I mean, that essentially is what the Internet's doing, right?
00:20:42.000 Yeah.
00:20:43.000 I mean, I've heard people be down on the internet and I guess you could see some negative points to anonymity and there's a few aspects of pornography that are a little unseemly.
00:20:53.000 It's definitely accelerated pornography, I'll tell you that.
00:20:56.000 Things have gotten really weird, man.
00:20:58.000 If you want to look at what happens to human beings when left alone to their own devices and when allowed to expand in a contained market like pornography, there's only so many different things they can do.
00:21:09.000 You know what the big thing is lately that I keep seeing, man?
00:21:13.000 Is girls getting guys to come in them and then they squirt it into a champagne glass and drink it.
00:21:20.000 Really?
00:21:20.000 Or a martini glass?
00:21:22.000 Really?
00:21:23.000 Who's asking for that?
00:21:24.000 Just because these digital tools extend the range of our creativity, it doesn't mean that people can't use that creativity in ways we don't agree with, or perhaps in bad ways, because just like we use the power of fire to cook our food, we use the power of fire to burn other people, which is always the double-edged sword of anything.
00:21:46.000 Expansion and extension of human reach.
00:21:48.000 But that's still what evolution is probing for because we're all seeking out complexity.
00:21:53.000 It's just going from single-celled organisms to multicellular organisms to beings to thinking beings to beings who create technology and so on and so forth.
00:22:01.000 So it's all happening anyway.
00:22:02.000 So people say it's not going to stop.
00:22:03.000 It's part of evolution.
00:22:04.000 But yes, we have to acknowledge that these tools are a double-edged sword.
00:22:07.000 And that's fine.
00:22:08.000 That's part of what makes the conversation interesting.
00:22:10.000 Or some people really like doing that.
00:22:12.000 That's possible too, right?
00:22:13.000 There could be a woman out there that actually likes to get dudes to shoot loads and then she squirts them out into a glass.
00:22:19.000 It is very possible.
00:22:20.000 And who am I to judge, right?
00:22:22.000 No, you should never judge.
00:22:24.000 People can do whatever they want as long as they're not hurting anybody else.
00:22:26.000 Exactly.
00:22:27.000 It's just weird that porn is accelerated to this, to what it is today.
00:22:32.000 Porn was just porn for the longest time.
00:22:35.000 You'd heard rumors of snuff films or something crazy, but no one ever saw one.
00:22:39.000 Did you ever see a snuff film, Brian?
00:22:41.000 Yeah, I've seen snuff films.
00:22:42.000 Well, you've seen people die on the internet for real in their life.
00:22:45.000 Yeah, it's disturbing.
00:22:47.000 But I don't even know if those are real half the time now.
00:22:49.000 Remember when Nine Inch Nails had a snuff film out called Broken?
00:22:53.000 It was like they advertised it as this bootleg video.
00:22:56.000 And you'd rent it and it looked like somebody murdering somebody else.
00:22:59.000 It was kind of like Faces of Death.
00:23:01.000 Oh, wow.
00:23:01.000 That's terrible, man.
00:23:03.000 And everyone thought it was a snuff film, but it turned out...
00:23:05.000 Well, there have been real films, man, for real.
00:23:07.000 There was a documentary on it a while back.
00:23:10.000 That's terrible.
00:23:11.000 Yeah, and the guy who was the...
00:23:14.000 One of the people they were interviewing was talking about watching this film, and as he's talking about watching the film, he starts crying.
00:23:19.000 Wow.
00:23:20.000 It's pretty intense.
00:23:21.000 Wow.
00:23:21.000 Yeah, he's obviously pretty fucked up by it, you know?
00:23:24.000 Maybe he didn't cry.
00:23:25.000 He definitely got choked up.
00:23:26.000 He was, like, just thinking about watching this.
00:23:31.000 There's a broad spectrum of human behavior, man.
00:23:34.000 We've got to figure out somehow to stop that.
00:23:37.000 Is there a way, or is it necessary to have negative in order to influence positive?
00:23:42.000 I don't know if that's necessary.
00:23:44.000 I think that has been something that perhaps has worked for some people.
00:23:49.000 You've got to know what bad is in order to know what good is.
00:23:52.000 You need the contrast.
00:23:53.000 But it doesn't mean that we come up with some more novel solution that allows us to live...
00:23:59.000 According to that idea that we need the bad in order to know the good, it implies that we need to have suffering to appreciate when we're not suffering.
00:24:07.000 Not that we need to, but if you look at things as being natural, you look at everything as being natural, like wolf behavior, bee behavior, look at all this stuff as being natural and positive towards whatever their goal is.
00:24:19.000 Whether their goal is to create this beehive that they create, whether their goal is to create an anthill.
00:24:24.000 When you look at human society, Maybe what we're doing is natural as well.
00:24:30.000 And maybe we're so fucking chaotic and so crazy because you sort of have to be to be working with technology that's so far and ahead what your biology is capable of processing.
00:24:41.000 So we have this fucking wacky tribal monkey shit going on while we have nuclear power, while we have...
00:24:50.000 There's a lot going on.
00:24:53.000 Increasingly, people are moving into their own personal universes and soundscapes.
00:24:57.000 And when we have virtual reality, then we each become the god in our own universe.
00:25:02.000 And at that point, an infinity of combinations and permutations of lifestyles will be explored by individuated nervous systems living out in the ethersphere of the interweb.
00:25:13.000 So who the fuck knows?
00:25:15.000 But...
00:25:16.000 At that point, we won't care what that person does in their own virtual universe.
00:25:20.000 The porn's going to be awesome.
00:25:23.000 It's going to be grosser, probably.
00:25:25.000 Grosser?
00:25:26.000 Yeah, like balut ponds where the tampon gets shoved in the vagina for a week and then pulled out and somebody eats it or something like that.
00:25:32.000 I hope not.
00:25:33.000 I hope it actually becomes about composing and creating...
00:25:37.000 The greatest dream we have ever dreamed.
00:25:40.000 Well, maybe it could be both.
00:25:41.000 To make greater art than we've ever experienced.
00:25:43.000 To create better designer drugs that engage with our senses and make us appreciate art in ways that we couldn't have before.
00:25:50.000 To merge with our lovers.
00:25:52.000 To become one with them.
00:25:53.000 I mean, we use language to connect and say how we feel to one another.
00:25:56.000 What if chicks want to merge all the time, man?
00:25:58.000 What if they want to merge all the time?
00:25:59.000 You got shit you want to do, man.
00:26:01.000 What if you want to go hang out with your boys?
00:26:02.000 You want to go play pool?
00:26:04.000 What if your boys want to merge?
00:26:06.000 Your boys want to merge with you and your wife?
00:26:08.000 Ew.
00:26:09.000 You won't be playing pool, though.
00:26:10.000 What if your boy wants to merge with your wife?
00:26:12.000 He's like, hey, man, can I merge with your wife?
00:26:14.000 Ew.
00:26:15.000 Ew.
00:26:15.000 That's weird, man.
00:26:16.000 I just want to see what it's like to be her.
00:26:17.000 What if they want to merge with your kids?
00:26:18.000 What if they can copy and paste your wife?
00:26:20.000 It's not sexual.
00:26:21.000 No, no, no.
00:26:22.000 What if they want to merge with a dolphin?
00:26:23.000 Because they want to know what it's like to be a dolphin.
00:26:25.000 Jesus Christ, Timothy Leary.
00:26:25.000 Settle the fuck down.
00:26:27.000 Merging with dolphins and people.
00:26:29.000 Yeah, what if, right?
00:26:31.000 We have to define, like, if we do create something that allows, like, the human consciousness to merge, to interface with something, we're going to have to, like, really define what's happening there.
00:26:41.000 So people don't...
00:26:41.000 Like whether it's going to have parameters.
00:26:43.000 ...sexual.
00:26:43.000 Interesting.
00:26:44.000 I just don't see...
00:26:45.000 You can see, like, Nancy Grace on TV. Who is this man that he's merging with a 14-year-old girl in Florida?
00:26:55.000 You tell me that's appropriate?
00:26:59.000 That this man is merging?
00:27:01.000 What does a grown man have in common with the thinking of a 14-year-old girl?
00:27:10.000 She gets a little wetter every time there's a dead baby in Florida.
00:27:13.000 Every time something happens in Florida, she's like, oh yes, more programming, more material.
00:27:22.000 You know, I have to say, the fact that we see so much... Can't she find nice things?
00:27:28.000 Nancy Grace, please.
00:27:29.000 I love you.
00:27:29.000 I'm picking on you because I have to.
00:27:31.000 I'm a comedian.
00:27:32.000 Can't you just find one nice story?
00:27:33.000 Yeah, we need more programming that's uplifting.
00:27:37.000 Isn't it nice to see stuff that makes you feel good about humanity?
00:27:40.000 But it's also good to have people go after bad people.
00:27:42.000 Don't get me wrong.
00:27:43.000 The idea of stopping crime and preventing scumbags from getting along.
00:27:47.000 But absolutely, as far as what we project, our issue is that there's 7 billion people on this planet, and if you only want to pay attention to negative shit, you can find enough to fill every second of every day.
00:27:57.000 Every second of every day, of every moment that you are on this planet, someone's getting jacked.
00:28:02.000 Yes, but I think the people are reacting to that by creating more and more really inspiring content.
00:28:07.000 And I think corporations now are all wanting to align themselves with having a sort of...
00:28:13.000 Positive impact on the world.
00:28:15.000 You know, they're saying there's more to a corporation than just making money.
00:28:18.000 Well, I hope so.
00:28:19.000 How about wanting to make a social...
00:28:20.000 But I think it is becoming part of our consciousness now.
00:28:23.000 Increasingly, like, this is what you're hearing.
00:28:25.000 I mean, you had Pepsi do that campaign last year.
00:28:27.000 They're all...
00:28:29.000 My point was that you have to manage your own interaction with this kind of information.
00:28:36.000 My point was that if you so choose, you can be around it all day, every day, or you can just not, and you can force yourself into more positive places, and the options are available.
00:28:48.000 Both options are available.
00:28:50.000 Yeah, absolutely.
00:28:50.000 And you have to be kind of careful in how you manage your consciousness.
00:28:53.000 Because you really can freak yourself the fuck out if you only chose to concentrate on all the negative things in this world.
00:28:59.000 There's too much information.
00:29:00.000 Totally.
00:29:01.000 And you could drown in information, especially because the new limited resource is attention.
00:29:06.000 But I think it's interesting.
00:29:07.000 There's a book about this.
00:29:08.000 It's called The Information Diet.
00:29:10.000 And it says that it's really up to us to take responsibility over our information diet, to set up curators, to set up certain filters, to sort of, you know, to have a significant say in how we interface with media.
00:29:25.000 And we have that opportunity now that we didn't have before when it was just two channels, it was on or off.
00:29:30.000 Now there's a billion options.
00:29:32.000 So curate, author, create an experience, an information diet that will keep you mentally invigorated, just like a healthy food diet will keep you healthy.
00:29:40.000 A lot of experimenting going on, too.
00:29:41.000 There's also a lot of people trying different things out and focusing on different things.
00:29:46.000 And there's a lot of misses that seem like they were hits.
00:29:49.000 Remember when everybody was into The Secret?
00:29:51.000 Yeah.
00:29:52.000 You remember that?
00:29:52.000 And everybody was convinced that all you have to do is think positive and just draw a picture of the house you want on your wall, and one day it'll sort of manifest itself.
00:29:59.000 And yes, secret fans.
00:30:01.000 Yes, I'm paraphrasing that.
00:30:04.000 But don't you think that's an example, like the way you said it, is how probably a lot of people literalized the message without really thinking about it a little deeper and understanding how it might not sound like just...
00:30:20.000 Bull.
00:30:21.000 Well, here's the problem with the secret.
00:30:23.000 Some of it's real, okay?
00:30:25.000 There is a certain amount.
00:30:26.000 It's one of the ingredients in making something happen.
00:30:29.000 One of the ingredients is vision.
00:30:30.000 It's 100%.
00:30:31.000 There's one of the ingredients.
00:30:32.000 I mean, you talk to anybody that had some sort of a great success and a good percentage of them, at least.
00:30:37.000 Some of them have sort of gotten their vision along the way, but a good percentage of them had a vision and followed it.
00:30:41.000 And it is true.
00:30:44.000 But there's so much other shit involved.
00:30:46.000 Education, hard work, discipline.
00:30:48.000 It's not as simple as just thinking.
00:30:50.000 And executing.
00:30:52.000 Everything out in the world, the most magnificent artifact from the iPhone to the jet engine is actualized from a thought, from a dream, from a design.
00:31:00.000 Which means to say we constructed the virtual version before we constructed the actual version.
00:31:05.000 That's the same thing as visualizing something into being.
00:31:08.000 But the into being part is when you say, okay, I'm going to go execute on this.
00:31:14.000 I'm going to move through space and time, move my atoms through space and time and go construct the thing and go lobby to build the thing, to build the dream, to actualize the goal.
00:31:23.000 And I think maybe people who read the book without reading as deeply enough into it, what they thought it was like, okay, I'm just going to sit on the couch and dream something and it's going to come knocking on my door.
00:31:31.000 You know, it's also the problem is that they're dealing with a bunch of people who have had success.
00:31:35.000 And when people have had success and, you know, they all tell you the same story.
00:31:39.000 Oh, I knew it was going to happen and I dreamed it.
00:31:41.000 Well, but that's because it happened.
00:31:43.000 You know what I'm saying?
00:31:45.000 There's a lot of shit that comes along the way.
00:31:47.000 You could have gotten some random car accident.
00:31:48.000 You could have got hit by a fucking meteor.
00:31:50.000 Absolutely.
00:31:50.000 I'm not exactly sure if 100% of your success is based on the fact that you've focused on your dream.
00:31:56.000 Right.
00:31:56.000 I think it's a percentage of the success, but there's a lot of luck involved there, too, man.
00:32:00.000 Oh, yeah.
00:32:01.000 100%.
00:32:02.000 Oh, yeah.
00:32:03.000 There's a lot of luck involved in everything.
00:32:05.000 I mean, the fact that each of us is here, we beat out billions of sperm.
00:32:09.000 Yeah.
00:32:10.000 We've already, all of us are living against the odds.
00:32:12.000 And respect for luck, I think, is one of the reasons why people get lucky.
00:32:16.000 A respect for luck.
00:32:19.000 You gotta respect.
00:32:20.000 Luck is, you know, fortune, good fortune, is unquestionably an ingredient.
00:32:25.000 There's an ingredient in there.
00:32:27.000 And I feel like if karma is real in any form, I believe that's where the most evidence of it being real is.
00:32:34.000 That, to me, the people that I know that are the most fortunate are also the kindest, are also the most generous.
00:32:40.000 Those are the people that are the most fortunate.
00:32:41.000 With themselves as well, which is a very critical point when a lot of people mess up.
00:32:47.000 They're super nice to other people, but they treat themselves like shit.
00:32:50.000 They treat their body like shit.
00:32:51.000 Which is no good.
00:32:52.000 They don't go after their own goals.
00:32:53.000 They don't trace their own dreams.
00:32:55.000 They let people abuse them because they're too nice.
00:32:57.000 I mean, there's a lot of people that are not nice to themselves.
00:33:00.000 You've got to be as nice to yourself as you are to other people.
00:33:03.000 I totally agree.
00:33:04.000 It's a huge part of the equation that a lot of people miss out on.
00:33:07.000 They're like, I'm a good person.
00:33:08.000 I'm nice to people.
00:33:09.000 Yeah, but you hate yourself.
00:33:11.000 You hate your body.
00:33:12.000 You hate your mind.
00:33:13.000 You hate the way you think.
00:33:14.000 Not everybody.
00:33:16.000 Not you.
00:33:17.000 You can do more for the world, I think, by treating yourself with the same kindness that you treat other people.
00:33:25.000 Well, it's a sickness not to.
00:33:27.000 It's a sickness.
00:33:27.000 Yeah, absolutely.
00:33:30.000 I mean, food is fucking delicious, but you shouldn't eat yourself to death.
00:33:32.000 You know, I'm not saying you have to look like Kate Moss in her prime.
00:33:35.000 Where did I pull that reference from?
00:33:37.000 Where was that?
00:33:38.000 But, you know, you don't have to fucking eat yourself to death either.
00:33:44.000 There's a lot of people that eat themselves to death.
00:33:47.000 The human mind can go terribly wrong.
00:33:49.000 It can go on a horrible path and just get stuck there, just get stuck in the mud.
00:33:54.000 Yeah, but the thing is, when we have that problem with software, and if software gets corrupted or if it gets a bad virus in it, we can upgrade it and reboot the system, and we're not so lucky yet with our biology.
00:34:04.000 Would you trust that, though?
00:34:05.000 What if someone did something horrible, like there was a mall shooting or something, and some guy goes in the mall and just shoots random people...
00:34:12.000 And then you reboot him.
00:34:13.000 Would you allow that guy?
00:34:14.000 Do you think that's okay in a civilized society?
00:34:17.000 Do you think we have to reboot him?
00:34:20.000 That's a great philosophical question to ask.
00:34:22.000 I mean, that's a different case study.
00:34:25.000 Do we blame society for allowing him to get to a point where his software failed him?
00:34:29.000 How do we approach that?
00:34:30.000 If it's effective, if it's real, do we blame the tissue that's left after we remove his consciousness?
00:34:36.000 Do we blame that tissue and say, I'm sorry, this tissue has to die to make up for the 16 people you shot at the mall?
00:34:42.000 Who knows?
00:34:42.000 Maybe there will be some form of like...
00:34:45.000 I bet there will be an ethical dilemma.
00:34:46.000 Like a virtual reality psychedelic experience where you take him down the rabbit hole and he has a Joseph Campbell-esque hero's journey and he collides with his own cosmic nakedness and then emerges rehabilitated.
00:34:59.000 Maybe we'll have like...
00:35:00.000 It sounds like ayahuasca.
00:35:01.000 ...digital download rehabilitation.
00:35:02.000 Yeah, electronic ayahuasca.
00:35:04.000 You tweeted once that that would be a way to grab a criminal and you should put him in an ayahuasca session with a shaman to stare into the nakedness of his own soul.
00:35:13.000 Well, this is my new show, my next show that I'm working on.
00:35:15.000 Nobody's bought it yet, but I've got some hopes.
00:35:17.000 It's called Douchebags on Mushrooms.
00:35:19.000 And that's the show.
00:35:20.000 We take douchebags throughout the world and we just bring them somewhere and dose them up with like five grams of mushrooms and let them see themselves.
00:35:27.000 Nobody's going to die.
00:35:28.000 I think psychedelic therapy is so special.
00:35:31.000 Spot on, like, in terms of the psychic readjustments, what can happen in one session could take years of conventional therapy.
00:35:38.000 Imagine giving it to people, yeah, giving it to criminals as part of a rehabilitation.
00:35:43.000 That would be very interesting to explore.
00:35:45.000 It just sounds very fascinating.
00:35:46.000 And not even just criminals, but people that have issues like alcoholism.
00:35:49.000 Oh, well, that's obvious.
00:35:50.000 I mean, they just came out with a study just now that said that LSD could help people get over alcohol in one session.
00:35:54.000 I mean, that's to say nothing of the mushrooms and depression.
00:35:56.000 Well, you know, they were actually doing tests on this in the 60s.
00:35:58.000 In the 1960s, they determined that 500 micrograms was enough to cure, like, more than 70% of chronic alcohol patients that came in and tried acid.
00:36:10.000 Just from, like, looking at the situation just completely differently.
00:36:14.000 Being separated from the nonsense of what you're engaged in.
00:36:17.000 We get stuck in these weird patterns.
00:36:20.000 It's very strange.
00:36:21.000 It's almost like a byproduct of our ability to focus on things.
00:36:24.000 We have this ability to become intense and obsess and focus on things in a positive sense.
00:36:28.000 But there's a byproduct of that and that byproduct is obsession.
00:36:31.000 It's a glitch.
00:36:32.000 It's a glitch and you get stuck in stuff.
00:36:34.000 It's like if somebody gives you a fucking Ferrari but you don't know how to drive a stick shift and you sort of figure it out along the way.
00:36:41.000 Jamming gears and fucking things up.
00:36:43.000 Sometimes it's working well.
00:36:44.000 You don't understand how to use the system.
00:36:46.000 And it could be just that.
00:36:48.000 When you see a kid that becomes obsessed with jerking off, you get him into a sport.
00:36:52.000 Maybe he becomes a fucking world champion.
00:36:54.000 Maybe he's just one of those kids that just whatever he focuses on, he focuses on insanely.
00:36:59.000 There's a lot of kids out there.
00:37:00.000 I'm not saying you're wasting your life playing video games because video games are awesome.
00:37:04.000 They improve one's brain.
00:37:05.000 There's been a bunch of studies about how gaming improves coordination and cooperation.
00:37:09.000 But what I'm saying is that these kids, any kid that gets really good at a video game, you can get really good at anything.
00:37:14.000 Yeah.
00:37:14.000 You can get really good at anything.
00:37:15.000 If you put that kind of focus that you put to get fucking awesome at Call of Duty, you could really, you know, you could have a better life.
00:37:22.000 Oh, yeah.
00:37:23.000 Well, imagine as gamification progresses, where you can play these games to address real social challenges, and these gamers will probably find solutions to problems that engineers couldn't in the real world.
00:37:34.000 That is happening more and more now.
00:37:36.000 To use the resources of gamers to gamify a real-world problem.
00:37:41.000 How does that work?
00:37:42.000 It's like a virtual reality game?
00:37:44.000 Yeah, they'll create some interface and some problem and there's a game and you get points for solving issues related to the game.
00:37:51.000 And some gamers discovered some antibody for some crazy virus.
00:37:55.000 People can Google this.
00:37:57.000 Gamers solve some illness.
00:37:59.000 Crazy stuff.
00:37:59.000 And you're going to be seeing that more and more.
00:38:01.000 In fact, they did this crowdsourcing experiment about protein folding.
00:38:06.000 And you know who the world's best protein folder is who can fold and design proteins in the virtual space?
00:38:11.000 It's like a woman who does it in her free time in the UK. And during the day, she was like a receptionist or something.
00:38:16.000 Really?
00:38:17.000 And she's the world's best protein folder.
00:38:18.000 She used to do it on her computer at night.
00:38:20.000 What?
00:38:21.000 Yeah, because you crowdsource what Clay Shirky calls the cognitive surplus.
00:38:25.000 It's all this extra brain activity.
00:38:27.000 How is she protein folding?
00:38:28.000 What is she doing?
00:38:29.000 It's some kind of crowdsource software thing that lets people fold proteins and you can figure out how to do it in the virtual space and then it can be applied in real life.
00:38:37.000 It turns out that the best one in the world was this woman in the UK. Better than all the scientists in the world.
00:38:42.000 Yeah, yeah.
00:38:42.000 But she's just a lady with a regular job.
00:38:43.000 Yeah, yeah, yeah.
00:38:44.000 And you're going to find that more and more.
00:38:45.000 There's going to be some gamer in Budapest who's going to fix world hunger.
00:38:50.000 Wow.
00:38:51.000 Look at hackers, the hacking community.
00:38:54.000 These little 13-year-olds are hacking fucking Microsoft.
00:38:58.000 Yeah, exactly.
00:38:58.000 It's ridiculous.
00:38:59.000 Yeah, a 13-year-old hacked the UFC. Yeah.
00:39:02.000 Exactly.
00:39:02.000 Yeah, they're badasses.
00:39:04.000 Did you hear about the LulzSec guy?
00:39:07.000 Is that how you say it?
00:39:08.000 LulzSec?
00:39:09.000 He ratted out all these Anonymous guys, like 26 Anonymous guys.
00:39:14.000 He just ratted out everybody to the FBI just because, I guess, the FBI was playing dirty and was saying, hey, we're going to arrest you forever.
00:39:22.000 You're never going to see your kids ever.
00:39:23.000 Oh, no.
00:39:25.000 And the FBI actually admitted to it in an interview.
00:39:27.000 That's what they did, they used their kids against them.
00:39:29.000 Oh, no.
00:39:30.000 Man, what were they guilty of?
00:39:32.000 Hacking, you know?
00:39:33.000 Digital terrorism.
00:39:35.000 They broke into some serious websites, right?
00:39:38.000 Yeah.
00:39:39.000 Somebody said recently it's rated higher than terrorism.
00:39:43.000 Whoa!
00:39:44.000 Cyber terrorism rated higher than regular terrorism?
00:39:47.000 Well, I have a good friend, this guy Mark Goodman.
00:39:49.000 He's at Singularity University over in Silicon Valley where they look at how these emerging technologies...
00:39:54.000 Whoa, whoa, whoa.
00:39:54.000 This is Singularity University?
00:39:56.000 Hell yeah, dude.
00:39:57.000 You need to go, man.
00:39:58.000 I just did their executive program here in LA. It was at Fox Studios, and it was hosted by the head of Fox, the chairman, Jim Gianopulos.
00:40:05.000 And the founders, Peter Diamandis and Kurzweil, were there.
00:40:08.000 They have people from all over the world, like the most interesting smart people, diplomats, actors, technologists, business people, to learn about exponentially emerging technologies and how they can be addressed to solve humanity's grand challenges.
00:40:19.000 And you know, like the homework there, everybody that comes out has to come up with an idea that can help a billion people.
00:40:23.000 Because the notion is that technology and our tools now allow individuals to do what at one time could only be done by governments, you know what I'm saying?
00:40:31.000 Or people with extreme resources.
00:40:33.000 But yeah, Singularity University had an executive program and they had talks about all the amazing stuff going on.
00:40:38.000 But also this guy Mark Goodman talked about like cyber terrorism and new forms of obviously synthetic biology used in bad ways.
00:40:44.000 It's a conversation that needs to be had.
00:40:46.000 Because human beings have a good ability to foresee problems, and so we should start addressing those problems before they become a serious issue, so that we can enjoy all the fruits and benefits that are coming from these emerging technologies, but at the same time take responsibility for, obviously, what is a double-edged sword, as always.
00:41:01.000 Or the aliens land first, before we get our shit together.
00:41:05.000 Right, well...
00:41:06.000 And then we got a problem.
00:41:07.000 Actually, we should talk about aliens.
00:41:08.000 I have a...
00:41:09.000 Fucking crazy idea to tell you about.
00:41:11.000 Are you ready for this?
00:41:12.000 Have you guys heard of the Transcension Hypothesis?
00:41:14.000 No, I have not.
00:41:15.000 Okay.
00:41:16.000 So I just found out about this last night, and it's a hypothesis by this guy called John Smart.
00:41:22.000 He's a specialist in accelerating change, a futurist over in Silicon Valley.
00:41:26.000 The Transcension Hypothesis is an answer to Fermi's paradox, which is if the universe is so vast, and there's all these other planets that have had so much more time to develop intelligent life, how come we don't see it everywhere?
00:41:38.000 Right?
00:41:39.000 Like, that's Fermi's paradox, I'm told.
00:41:41.000 And the transcension hypothesis says that if you look at what's happening with technological progress as we head towards the singularity, what you see is the dematerialization and miniaturization of complexity.
00:41:52.000 So, like, there's more energy per second per gram going through a microchip than there is in the surface of the sun.
00:41:57.000 The most complex thing in the universe that we know of right now is the human being.
00:42:00.000 So, complexity gets more complex but also gets denser.
00:42:03.000 It's what they call stem, right?
00:42:05.000 And so...
00:42:07.000 What is STEM again?
00:42:08.000 Tell me what it stands for.
00:42:11.000 Anyway, I'll remember.
00:42:13.000 Aliens, brother.
00:42:14.000 No, no, no.
00:42:14.000 But what happens is he says that eventually, this exponentially growing technology, and when we start talking about nanotechnology and putting intelligence into the nanoscale, that we're going to eventually create an artificial black hole and disappear into it.
00:42:26.000 And slingshot into the future.
00:42:27.000 Because there's going to be so much density and so much complexity and so much information that eventually is going to create a rupture through space-time and we're going to disappear into it.
00:42:34.000 So we're just going to do that just by density of information?
00:42:37.000 By too many hard drives in one spot at one time?
00:42:39.000 Yeah, well, because he says that computation works by shrinking things.
00:42:44.000 And complexity gets smaller and smaller and smaller as the computer chips get faster and faster and faster and more powerful.
00:42:49.000 I mean, look at the complexity that's in an iPhone today.
00:42:51.000 It's a million times cheaper, a million times smaller, a thousand times more powerful than a computer half a building in size was 40 years ago.
00:42:56.000 So in a hundred years, imagine the complexity that is going to be in something smaller than an atom or even scales beyond that.
00:43:01.000 So when our minds, when intelligence is residing on those scales, basically they're saying that eventually we're not going to colonize outer space.
00:43:08.000 We're going to go into the inner space.
00:43:09.000 We're going to go smaller and smaller and smaller in density until we literally create the ultimate universal computer, which is a black hole.
00:43:16.000 Does everybody have to do this or can we opt out?
00:43:19.000 Can people opt out?
00:43:21.000 It's a crazy idea.
00:43:21.000 I don't know if I explained it very well.
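To put the "million times cheaper, a million times smaller" comparison from a moment ago on rough footing, here is a back-of-envelope sketch. The 18-to-24-month doubling cadence is an assumption (the usual Moore's-law range), not a figure from this conversation.

```python
# Rough check of what compounding doublings give you over 40 years.
# The doubling periods below are assumed, illustrative Moore's-law cadences.
years = 40
for doubling_months in (18, 24):
    doublings = years * 12 / doubling_months
    factor = 2 ** doublings
    print(f"doubling every {doubling_months} months for {years} years: "
          f"about 2^{doublings:.0f} = {factor:,.0f}x")
```

At a 24-month doubling you land almost exactly on a million-fold over 40 years, which is the ballpark of the claim; at 18 months it's closer to a hundred million-fold.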
00:43:23.000 I've been saying for years that I think that people are responsible for the Big Bang.
00:43:27.000 There you go.
00:43:27.000 There you go.
00:43:28.000 Well, the Big Bang could have been birthed from a previous universe that eventually achieved the transcension.
00:43:32.000 I think it's a reset button.
00:43:34.000 I think that's why we're so fascinated with technology.
00:43:36.000 We eventually hit a point where we figure something out and we press a button.
00:43:39.000 And we disappear into a black hole which is birthed as a Big Bang in a new universe.
00:43:43.000 The whole idea of the Big Bang is really fucking amazing.
00:43:46.000 Because it's amazing that science ever would come up with a theory like the Big Bang.
00:43:51.000 It's almost like they had to have a theory.
00:43:53.000 So this was the best one.
00:43:54.000 The universe is constantly expanding.
00:43:56.000 There's some radio waves from 14 billion years ago we're detecting.
00:44:00.000 We believe that was a big explosion.
00:44:02.000 Let's just run with this.
00:44:03.000 And the idea that at one point in time, 14 whatever billion years ago, the universe was so small.
00:44:10.000 It was no more than the head of a pin.
00:44:11.000 Everything.
00:44:11.000 The entire universe.
00:44:13.000 That is ridiculous.
00:44:14.000 That's ridiculous.
00:44:15.000 But that's going the other direction.
00:44:16.000 And I just remembered what the STEM acronym stands for.
00:44:19.000 It's space, time, matter, and energy.
00:44:20.000 Space, time, matter, and energy shrink.
00:44:23.000 Why isn't it STEM then?
00:44:24.000 Space, time, energy, and matter.
00:44:26.000 My bad.
00:44:28.000 Space, time, energy, and matter.
00:44:29.000 Compresses as technology progresses.
00:44:32.000 So there's less space and less time, and things are smaller, and less energy going through that matter, and also less matter.
00:44:40.000 So that's the move towards density.
00:44:41.000 It's like a reverse...
00:44:43.000 It's giving me a fucking headache.
00:44:45.000 It would be really ridiculous if I was correct.
00:44:47.000 I'm sure I'm not the only one who ever thought this up, by the way.
00:44:50.000 I think that when you look at nuclear bombs and just nuclear power in general, the fact that most of our power in major cities is controlled by these nutty fucking...
00:45:02.000 Nuclear explosions that they've contained.
00:45:04.000 Not an explosion, but a nuclear reaction that they've contained.
00:45:07.000 And if the power goes out like it goes in Japan, everyone's fucked.
00:45:10.000 You have to run.
00:45:11.000 Everybody has to get away from it, and it's doomed for 100,000 years.
00:45:14.000 Just that alone.
00:45:15.000 Just that alone.
00:45:16.000 It makes me think, like, wow.
00:45:19.000 I know I don't have any better options.
00:45:21.000 No, I don't.
00:45:21.000 But this is all you guys got.
00:45:23.000 You guys, I mean, in the 1960s and 70s, this is what you figured out.
00:45:26.000 You figured out how to make nuclear power that if the power goes off, it just eats right through the earth.
00:45:32.000 And then everyone's fucked anywhere near it.
00:45:34.000 You know, but isn't it mind-blowing what a mind, what minds can do?
00:45:39.000 Oh, it's incredible.
00:45:40.000 Because when you think of the scale that we are, like how small and dense a mind is, a thinking being, the number of synaptic connections inside of something as small as the brain is as large as the number of galaxies in the universe.
00:45:52.000 Right.
00:45:52.000 That amount of complexity in something so small is what we are.
00:45:55.000 So it's like people say, oh, we're so insignificant.
00:45:57.000 I think we're like...
00:45:59.000 Really significant.
00:46:00.000 You know what I mean?
00:46:01.000 Like, we're the cutting edge of design that has emerged from the universe.
00:46:04.000 I agree and I don't at the same time.
00:46:06.000 I think, yes, we're very significant if it comes to change on this earth.
00:46:11.000 But the earth is just so goddamn small in the big picture.
00:46:15.000 It's ridiculous to say that we're significant.
00:46:17.000 We're so fucking tiny.
00:46:18.000 But just the fact that we can talk about the whole universe and literally play back the evolution of the universe in our heads, a capacity to understand events that have occurred over deep time means that we're creating models on the scale of that universe.
00:46:35.000 The universe that you're saying is so much bigger than we are.
00:46:37.000 We're creating internal models of it inside of our heads.
00:46:40.000 It's true.
00:46:40.000 We're fitting the universe in our head as far as virtual conversations about it go.
00:46:45.000 That's what's fucking crazy, which means it fits in our heads.
00:46:47.000 The design fits in our heads.
00:46:49.000 If we understand it correctly, the smartest people in the world, Einstein among them, could probably contemplate it in their heads.
00:46:56.000 And, you know, people who take psychedelics say that they experience the entire universe at once.
00:47:00.000 Maybe they do.
00:47:01.000 Yeah, maybe they do.
00:47:02.000 Maybe it's all inside a mushroom.
00:47:04.000 You can see the whole thing.
00:47:05.000 You just got to take a nap on them.
00:47:07.000 Yeah, well, because the universe expands outwards, but it goes inwards, too.
00:47:10.000 The scales get smaller.
00:47:12.000 There's an entire universe inside of us.
00:47:13.000 Ten trillion, trillion, trillion atoms.
00:47:15.000 And apparently the scales go smaller.
00:47:18.000 In most atoms, it's just space.
00:47:20.000 Oh, yeah.
00:47:23.000 Somebody told me about that the other night.
00:47:24.000 Most of everything is mostly empty space.
00:47:27.000 Most of us, apparently, is mostly empty space.
00:47:28.000 Mostly empty space.
00:47:29.000 We're just pattern integrities, man.
00:47:31.000 Just pattern integrities.
00:47:32.000 It's so insane to just even try to wrap your head around how complex the whole thing really truly is.
00:47:38.000 Which is why people like sticking to neighborhoods and watching the same shows.
00:47:41.000 They want anything that calms down this bizarre feeling of never-ending complexity.
00:47:49.000 It's impossible to understand or be in control of your universe.
00:47:53.000 Well, it's frightening to live in the mystery, to live on the edge of knowledge, to live on the edge of thought.
00:48:00.000 Well, there's a reason we call it the edge, because it looks like there's a ravine on the other end.
00:48:03.000 But I still think, even though as individuals, some of us find that frightening and to each his own, as a collective, I think mankind is always restless and never afraid of the edge.
00:48:15.000 I think mankind always pushes at the edge.
00:48:18.000 And that's what makes me ultimately so optimistic about humanity.
00:48:20.000 We're still here.
00:48:21.000 And it's getting crazy.
00:48:23.000 And look at the stuff that humanity is talking to itself about.
00:48:26.000 Yeah, about bombing Iran.
00:48:29.000 That's depressing.
00:48:30.000 There's a little of that going on too, man.
00:48:32.000 It's going both ways.
00:48:33.000 It's a self-correcting global organism.
00:48:35.000 So maybe we're just self-correcting.
00:48:38.000 I agree with you.
00:48:38.000 I don't quite share in your optimism because I'm continually fascinated by the stupidity of the human race as well as the intelligence of it.
00:48:45.000 I think you can't ignore that.
00:48:48.000 There's a lot of dummies out there, unfortunately.
00:48:51.000 A big percentage of the world is a fucking mess right now.
00:48:54.000 I don't know if it's a big percentage, but I just think that what is a mess gets magnified and brought to our attention.
00:48:59.000 Well, it's more than 1%, right?
00:49:01.000 More than 1% of the world's a mess, I would say.
00:49:03.000 Wouldn't you say?
00:49:04.000 When you think about Iraq, Afghanistan, what's going on in Syria, what just happened in Libya, what's going on in Egypt, what may happen in Iran.
00:49:13.000 There's a lot of things about this that are very exciting, right?
00:49:15.000 I mean, what happened in Libya and Indonesia.
00:49:17.000 No, don't get me wrong, but I'm saying...
00:49:19.000 You're saying it's a mess.
00:49:20.000 There's a lot of movement happening.
00:49:22.000 If it's more than 1%, and you've got 100 people in a room, and one of them is fucking crazy.
00:49:27.000 I think we live in disruptive times.
00:49:29.000 Yes.
00:49:29.000 Yes.
00:49:30.000 Fueled by these accelerating technologies.
00:49:32.000 But I think disruption, it's like going through the birth canal.
00:49:36.000 It's like when Timothy Leary says that we're about to shed our skin.
00:49:41.000 We're in the larval stage.
00:49:43.000 We were pre-larval and then we're larval and then we're about to spread our wings.
00:49:48.000 Potentially.
00:49:49.000 Potentially.
00:49:50.000 That's where this conversation comes in.
00:49:52.000 If some new age Hitler doesn't step into the equation.
00:49:54.000 Fair enough.
00:49:56.000 Goddammit.
00:49:56.000 But a good conversation to have, right?
00:49:58.000 Yes.
00:49:59.000 Oh, yeah, of course.
00:49:59.000 It's amazing when you really look back at World War II that it was such a short amount of time ago.
00:50:03.000 It's terrifying.
00:50:04.000 It's hard to wrap your head around that.
00:50:05.000 I'm Jewish, I know.
00:50:06.000 Yeah.
00:50:07.000 Did you have family?
00:50:09.000 My family fled from Europe, yeah, on my mother's side, Polish and Russian, yeah.
00:50:14.000 They went to Venezuela.
00:50:16.000 Ari interviewed a bunch of...
00:50:17.000 Wasn't his dad in...
00:50:18.000 Well, I don't want to say.
00:50:19.000 Let's see.
00:50:20.000 But yeah, it's incredible that that's inside.
00:50:23.000 That could be your grandparents.
00:50:24.000 That could be our lifetime.
00:50:26.000 That's within our grasp.
00:50:28.000 Frightening.
00:50:29.000 While this chain of life is going on, the Holocaust was happening.
00:50:33.000 World War II was happening.
00:50:34.000 I mean, storming the beaches of Normandy.
00:50:36.000 That's like the most savage shit in human history.
00:50:39.000 Cutting people down with machine guns as they run through the sand.
00:50:42.000 I mean, that's our lifetime, man.
00:50:45.000 It's amazing.
00:50:46.000 It's really, truly amazing when you stop and think about how crazy that seems.
00:50:50.000 Yeah.
00:50:50.000 Yeah, there was a really interesting article that I read because obviously everything you're saying is very upsetting.
00:50:56.000 It was in Foreign Policy Magazine.
00:50:58.000 I'll just stop talking, dude.
00:50:59.000 It's cool.
00:51:00.000 No, no, no.
00:51:00.000 For everything I'm saying, it's very...
00:51:01.000 No, no, no, no.
00:51:02.000 No, but there was an article in Foreign Policy Magazine.
00:51:04.000 It was called The End of War.
00:51:06.000 And it was one of those counterintuitive articles that you read it and you're like, okay, there's these interesting academics that are saying, yes, this was tragedy.
00:51:14.000 Yes, there have been horrible things.
00:51:15.000 Yes, these numbers, these scales are horrific.
00:51:18.000 But put it in context over deeper, longer time.
00:51:21.000 And what you see is that things are getting better.
00:51:25.000 Less wars happening.
00:51:27.000 Just before, we couldn't cover every war on TV. There were too many conflict zones in the world.
00:51:31.000 But he talks about how there's less and less.
00:51:33.000 It's important to get the other side.
00:51:37.000 I'm sure that it's better now than it has been ever, but I think human beings as just naturally we look at the errors and the issues that we have, and we see a lot of them that are sort of legacy, that aren't corrected, and they've been going on for so long, like war.
00:51:53.000 I remember when I was a kid, I was, I don't know, maybe like I think it was like eight or something like that when the government pulled out of Vietnam and the Vietnam War was over.
00:52:01.000 And I remember thinking like it's good that we're done having wars because now people realize that we don't like war.
00:52:09.000 No one's going to go to war anymore.
00:52:10.000 I remember even as a child with the idea in my head that I was watching the culture evolve past war.
00:52:16.000 I had like a real sense.
00:52:18.000 Especially, I think, when you're a child, because as you're growing and you're kind of experiencing life and it's being sort of explained to you along the way through experiences, that you start getting an idea that that's how the whole world works, that things just get better over time.
00:52:31.000 Things get smarter, they improve, because that's what you're doing.
00:52:34.000 You're eight years old.
00:52:35.000 You're smarter than you were when you were five.
00:52:36.000 Right.
00:52:36.000 You know what I'm saying?
00:52:37.000 I mean, a significant leap over who you were when you were five.
00:52:40.000 Yes.
00:52:40.000 So I think that's how I viewed the world.
00:52:42.000 And I remember in whatever year it was, 91 or 92, when we went in with Desert Storm.
00:52:47.000 Right.
00:52:48.000 What year was that?
00:52:49.000 91, right?
00:52:50.000 Maybe.
00:52:50.000 Whatever.
00:52:51.000 I don't remember.
00:52:51.000 Might have even been 89. But whatever year it was, I remember watching that happen.
00:52:55.000 I remember me and my buddy Jimmy that I used to live with, my roommate, Jim Dottilio.
00:52:59.000 What's up, Jim?
00:53:00.000 We were sitting in front of the TV and they were showing missiles, like, firing over into Baghdad.
00:53:07.000 And I remember watching that going, what the fuck?
00:53:10.000 And he looks at me and he goes, we're at war, buddy.
00:53:13.000 We're at war.
00:53:14.000 Like, that didn't even make sense.
00:53:16.000 Well, because it seems obsolete compared to all the great things that are happening in the world, right?
00:53:20.000 The massive collaboration, the massive cooperation, you know, people doing things increasingly for free for one another online, people coming together, people protesting against dictatorships.
00:53:28.000 Twitter being used as fuel for dissent and discontent.
00:53:31.000 I mean, there's so many encouraging trends that whenever you kind of contemplate the fact that there's still bad things out there, you realize...
00:53:37.000 Well, the contrast also makes you realize, wow, there's aspects of us that are so obsolete.
00:53:41.000 We need a firmware upgrade.
00:53:44.000 But we're fucking...
00:53:45.000 We're getting there.
00:53:46.000 Singularity University is what...
00:53:47.000 You need to go, man.
00:53:48.000 I'm sure they love that.
00:53:49.000 I do need to go.
00:53:49.000 Yeah, we had...
00:53:50.000 Actually, Will.i.am was there.
00:53:52.000 Really?
00:53:52.000 Yeah, and then he did a talk and a panel.
00:53:54.000 We had this company that does a...
00:53:55.000 Wait a minute, wait a minute, wait a minute.
00:53:56.000 Will.i.am gets to talk at the Singularity University?
00:53:59.000 Him and the head of Fox.
00:54:00.000 What does he talk about?
00:54:01.000 Well, because he was talking about using these technologies.
00:54:04.000 Getting hot bitches on the road.
00:54:05.000 The creative and good uses of these technologies and how we need to spread these technologies to those that are less lucky than we are and whatnot.
00:54:12.000 I'm sure he was talking about nice things.
00:54:13.000 I just have to crack jokes.
00:54:15.000 And we saw two paralyzed people walk.
00:54:17.000 This bionics company who makes these exoskeletons.
00:54:20.000 I've seen those.
00:54:21.000 Demonstrated two paralyzed people standing up and walking.
00:54:24.000 I mean, it was insane.
00:54:25.000 That's really intense.
00:54:26.000 I've seen that.
00:54:26.000 An exoskeleton is fucking nuts, man.
00:54:29.000 That's like something right out of a Marvel Comics, man.
00:54:31.000 Totally, right?
00:54:31.000 It looks like a trailer for a movie that takes place in the future that's showing you how we got there.
00:54:36.000 Archival footage from the future.
00:54:37.000 It just totally makes sense, right?
00:54:39.000 I mean, it's the future.
00:54:40.000 They're going to figure out artificial bodies eventually.
00:54:43.000 Dude.
00:54:43.000 Absolutely.
00:54:44.000 For sure they're going to be able to put your head on someone else's body.
00:54:46.000 Dude.
00:54:47.000 On an artificial...
00:54:48.000 Avatars.
00:54:48.000 Well, isn't there a Wired magazine story about the man who wants to build the real avatar?
00:54:52.000 Could you imagine if they get so good at surgery that they build an artificial you and the head is open and they just have to sew it up and stuff your brain in there.
00:55:00.000 They only have like a certain amount of time where they could take your brain and reattach it.
00:55:03.000 That would be interesting.
00:55:04.000 They open your...
00:55:05.000 Good night.
00:55:05.000 Good night, Mr. Jones.
00:55:06.000 Boom.
00:55:06.000 Cut open your...
00:55:07.000 Next thing I see, you're going to be 20 years old and invincible.
00:55:10.000 They cut open your fucking head, suck the brain out real quick, and they only have a couple minutes, and then they screw it into this new body.
00:55:16.000 Turn it on.
00:55:17.000 Fire up.
00:55:17.000 Mr. Jones, do you hear us, Mr. Jones?
00:55:19.000 Well, I think...
00:55:19.000 37 seconds.
00:55:20.000 We're good.
00:55:21.000 We're good.
00:55:21.000 You got a new life.
00:55:23.000 And then Mr. Jones made the trip into a synthetic body with his biological brain.
00:55:27.000 Is that possible?
00:55:28.000 Well, I think by the time that we can do that, we will be non-biological in the sense that we'll have far greater than human intelligence and sentience residing in decentralized non-biological substrates.
00:55:39.000 Do you feel like that about spaceflight?
00:55:41.000 On the nanoscale.
00:55:41.000 Do you feel like that?
00:55:42.000 I've never felt like we're going to go to other planets.
00:55:44.000 We definitely are.
00:55:45.000 We definitely are.
00:55:46.000 We're going to go to Mars in less than 10 years.
00:55:48.000 Elon Musk is working on this.
00:55:51.000 Newt Gingrich was saying, if you let me in office.
00:55:53.000 Did you hear that?
00:55:54.000 Yes.
00:55:55.000 Newt Gingrich said, if you let me in office by the second term, we'll have a base on the moon.
00:55:58.000 Well, we don't need governments for that.
00:55:59.000 See, that's the difference of where we are now.
00:56:01.000 It's going to happen by private spaceflight.
00:56:03.000 It's going to be the techno-philanthropists like Elon Musk who have the vision and the resources to make it happen.
00:56:08.000 And they benefit from the emerging technologies because something that was, that the cost was impossible 20 years ago, all of a sudden is miniaturized, is infinitely more affordable.
00:56:17.000 We're going to space, and then we're going to send artists into space, and that will transform the... We have to decide who the artists are, because the last thing you want is shitty poetry from outer space.
00:56:27.000 Well, imagine you in space analyzing it philosophically.
00:56:31.000 A podcast from space.
00:56:32.000 How that would influence your thoughts, your ideas.
00:56:34.000 How about we just get a green screen and put some space behind me?
00:56:37.000 Maybe that'll work.
00:56:39.000 That'll be the same.
00:56:40.000 And we'll put on our NASA suits.
00:56:41.000 Do you have your NASA suit, Brian?
00:56:43.000 Yep, it's here.
00:56:43.000 Mine's in the trunk.
00:56:45.000 Very cool, man.
00:56:46.000 Yeah, I don't know, man.
00:56:47.000 I don't know what the future holds, but I think the Big Bang machine might come before space travel.
00:56:52.000 I mean, I'm just guessing.
00:56:53.000 The Big Bang machine?
00:56:54.000 Yeah.
00:56:54.000 They might press the Big Bang button before we figure out how to get to other planets.
00:56:58.000 I hope not, man.
00:56:59.000 We have to at least figure out how to back ourselves up.
00:57:02.000 What if you fucking fly out to Mars?
00:57:04.000 What if you fly out to Mars and it's just like the shittiest parts of Arizona?
00:57:07.000 It's just like the shittiest parts of the Arizona desert.
00:57:09.000 And you're like, you know what?
00:57:10.000 There's spots like this that suck in America.
00:57:12.000 I could have just driven there.
00:57:13.000 I didn't have to fucking fly in a rocket ship to some place with no air to see a shitty part of the universe that I could have seen in Arizona.
00:57:22.000 You know those rock desert areas where there's fucking no one but rattlesnakes for a hundred thousand fucking square miles?
00:57:30.000 Yeah.
00:57:31.000 Dude, fair enough.
00:57:33.000 And I think that those that go are not going to swim on Mars' beautiful beaches.
00:57:39.000 I think they're going for the feeling that they will have when they look out that window and see another celestial body.
00:57:47.000 They're going there for the wanderlust.
00:57:49.000 The awe.
00:57:50.000 They're going there for the awe.
00:57:51.000 That's their religious feeling.
00:57:53.000 That's them getting off on God.
00:57:55.000 Anybody who does that, who really, if they really do choose to give up essentially years and years of their lives for this scientific adventure, I mean, that's what they're doing.
00:58:05.000 It's going to take like six months just to get to Mars.
00:58:07.000 That's a real hero.
00:58:08.000 And there's going to be a lot of one-way tickets like you were saying.
00:58:10.000 Oh, yeah.
00:58:11.000 That's a real hero.
00:58:12.000 And who knows, by the way, what the fuck happens to your body out there in radiation and deep space.
00:58:16.000 Who knows?
00:58:17.000 How unhealthy it is to be outside the atmosphere.
00:58:19.000 And then what are they going to do?
00:58:21.000 They're going to have to do some sort of a...
00:58:22.000 What is it called when you change the atmosphere of a new planet?
00:58:27.000 Terraform.
00:58:27.000 Terraform.
00:58:28.000 Oh, yeah.
00:58:28.000 They would have to terraform.
00:58:29.000 So they would have to build machines that actually create oxygen and then hope it stays stable.
00:58:35.000 Yeah, but you want to go a little crazier, man, a little farther into the future?
00:58:38.000 That will all be done with nanotechnology.
00:58:40.000 The physicist Freeman Dyson says we'll be able to have the entire biosphere of the world decoded, the genome of the entire biosphere of everything that's living on planet Earth in something that's a few micrograms in weight and at the nanoscale.
00:58:52.000 And we'll be able to send those nanotechnology instructions to self-replicate and seed the universe.
00:58:58.000 I did a whole rant about it.
00:58:59.000 This is a physicist talking about this.
00:59:01.000 It's not like some hippie tripping.
00:59:04.000 This is a physicist who at one time was probably a hippie tripping and became a physicist.
00:59:08.000 It makes sense when you think about how small data-holding little hard drives are now and what they're going to be like.
00:59:17.000 They've got computers that are as small as a grain of sand now.
00:59:20.000 Dude, quantum computing is going to be doing superposition, which means like being one and zero at the same time.
00:59:26.000 Yeah, what is that?
00:59:27.000 Explain to me superposition.
00:59:29.000 Does anybody...
00:59:29.000 Do you understand it?
00:59:30.000 I'm no expert, but superposition means that...
00:59:33.000 Something can be in motion and still at the same time.
00:59:35.000 Exist and not exist.
00:59:36.000 Yes, yes.
00:59:37.000 In two different places at once.
00:59:38.000 Yeah, something can be a particle and a wave at the same time.
00:59:42.000 And so something can be at the same time in two different points in the universe, simultaneously, and communicate.
00:59:51.000 And this is not horseshit, right?
00:59:52.000 This is all proven stuff.
00:59:53.000 No, this is all proven stuff.
00:59:55.000 At least accepted in the quantum physics community, yeah.
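For anyone who wants the "one and zero at the same time" idea in concrete terms, here is a minimal state-vector sketch using NumPy. It illustrates the standard textbook math of a single qubit in equal superposition; it is only a simulation of the algebra, not a statement about how real quantum hardware is built.

```python
# Minimal sketch of superposition: a single qubit as a 2-component state vector.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Equal superposition: the state carries both possibilities at once.
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: squared amplitudes give the measurement probabilities.
probs = np.abs(psi) ** 2
print("P(0), P(1) =", probs)             # -> [0.5 0.5]

# Each measurement collapses to a single definite outcome; over many runs
# the statistics match the probabilities above.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=probs)
print("measured 0:", int(np.sum(samples == 0)), "measured 1:", int(np.sum(samples == 1)))
```

The part about something being "in two different points in the universe, simultaneously, and communicate" is closer to entanglement than superposition, and entanglement doesn't allow faster-than-light messaging; the sketch above only covers the superposition half of the exchange.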
00:59:58.000 Doesn't that make you want to toss all previous notions about reality aside?
01:00:03.000 Yes.
01:00:03.000 When you look at something like that and you go, wait a minute, wait a minute, wait a minute.
01:00:05.000 Yes.
01:00:06.000 The very foundation of everything that we see, touch, feel, observe, know exists.
01:00:12.000 Is illusory.
01:00:13.000 What?
01:00:14.000 Yes.
01:00:14.000 It's a goddamn program.
01:00:15.000 Yeah.
01:00:16.000 Dude, it's the fucking Matrix.
01:00:17.000 Yeah.
01:00:17.000 It's the Matrix.
01:00:18.000 It really is real.
01:00:20.000 The Matrix didn't go far enough.
01:00:21.000 The Matrix didn't go far enough, but that's why the movie was so brilliant, and that's why Inception was similarly along those same lines. My friend has a shirt, it has these two guys sitting in a chair, and one of them says, are we just graphics on an imaginary t-shirt?
01:00:33.000 And the other guy says, that's ludicrous.
01:00:36.000 But you could extend that and extrapolate that to us.
01:00:38.000 Yeah.
01:00:39.000 Are we just like two dudes in some virtual simulation that somebody else is playing with?
01:00:43.000 And then you're like, nah.
01:00:45.000 And then it zooms out and it shows that we're playing on some screen for someone else's entertainment.
01:00:50.000 Well, the game was really good.
01:00:51.000 Which is what's happening right now.
01:00:52.000 Yeah, I mean, when you're playing a really good game of Quake, when you're in the zone, man, you're not thinking, hey, I'm playing Quake.
01:01:00.000 Oh, no, it's real.
01:01:01.000 You're locked in there.
01:01:02.000 You're hopping and moving.
01:01:03.000 You're a part of it.
01:01:04.000 We can be in multiple realities, dude.
01:01:05.000 There's no doubt.
01:01:06.000 We're already doing it with flat screens that are not even that immersive, and we can lose ourselves in it.
01:01:11.000 Right, so if this is artificial, it could be just so good, it feels real.
01:01:15.000 Which makes sense for a lot of shit.
01:01:17.000 Absolutely.
01:01:17.000 I've said so many times that the world feels like a piece of fiction.
01:01:22.000 When a guy like, you know what?
01:01:24.000 I yelled, shut the fuck up at the TV when that Anthony Weiner guy got caught with taking pictures of his dick.
01:01:31.000 I'm like, come on, man.
01:01:32.000 This is shitty writing.
01:01:33.000 If this was a sitcom, I'd get mad.
01:01:35.000 Well, you know what?
01:01:36.000 That's a movie.
01:01:37.000 A movie about a guy that's criticizing the screenplay of life starring you would be really funny.
01:01:43.000 It wasn't that I was criticizing.
01:01:45.000 It was that it was like Coen Brothers-esque.
01:01:48.000 It was so preposterous that it seemed like all of a sudden we're in a movie.
01:01:52.000 Come on, the guy named Wiener takes pictures of his cock.
01:01:55.000 He just throws them on the internet.
01:01:57.000 The clues are everywhere, man.
01:01:58.000 Row, row, row your boat.
01:02:00.000 What?
01:02:00.000 What does that mean?
01:02:01.000 Well, the last line of row, row, row your boat.
01:02:03.000 It says, life is but a dream.
01:02:04.000 Life is but a dream.
01:02:08.000 They probably had to make something rhyme.
01:02:10.000 What if life is not the pursuit of a dream?
01:02:12.000 What if life is not the pursuit of a dream?
01:02:14.000 There's clues everywhere, man.
01:02:16.000 Or someone could have been really high when they wrote that.
01:02:19.000 Jesus wrote that song.
01:02:20.000 Yeah, I've written a lot of shit high that I'm embarrassed about.
01:02:22.000 Yeah.
01:02:24.000 Like your one joke?
01:02:25.000 Yeah, it's true.
01:02:26.000 I really did write that.
01:02:27.000 That's funny.
01:02:28.000 So, what do we do to sort of...
01:02:32.000 I mean, I guess what we're doing is what you're doing right now.
01:02:35.000 I mean, what we're doing, what we can do as individuals.
01:02:38.000 We're creating memes.
01:02:39.000 Yeah, and what we can do is retweet things that resonate with us.
01:02:42.000 Absolutely.
01:02:42.000 Talk about things that resonate with us.
01:02:44.000 Well, every time you retweet something, 600,000 minds are...
01:02:48.000 Potentially, yeah.
01:02:49.000 Potentially, potentially, fair enough.
01:02:50.000 But 600,000 is a very big number.
01:02:52.000 So even if it's only 10 of them, if the tweet that you sent out triggers a butterfly effect in their thoughts that opens up a whole new stream of possibilities for that person, that's real transformation.
01:03:05.000 So it's natural selection playing out at a faster and faster rate because things are happening.
01:03:10.000 So you're creating artful change in the world using the power of your mind.
01:03:14.000 Somebody listening to this might invent some new poem that becomes the campaign for some brand that transforms the world.
01:03:22.000 Like I said, the butterfly effect.
01:03:23.000 But we're talking on scales and numbers where that's possible.
01:03:26.000 I've seen the engaged, inspired audience interface with you.
01:03:31.000 I've seen it.
01:03:32.000 It's kind of amazing.
01:03:33.000 Well, they're responding to you too, man.
01:03:34.000 They're responding to your ideas and you're passionate about it.
01:03:36.000 And one of the cool things about having a podcast is someone right now could be anywhere doing some tedious work around the house or whatever.
01:03:44.000 Right.
01:03:45.000 And they were in a certain state of mind.
01:03:47.000 And the conversations, the topics that you brought up and the way we've explored these topics, all of a sudden their mind is fucking racing.
01:03:55.000 Right.
01:03:55.000 And that is a real cool thing.
01:03:59.000 That is a really cool thing that we can do something like that.
01:04:02.000 That to me is one of the most satisfying aspects of this.
01:04:05.000 That you can entertain someone and engage them and literally put them on a little bit of a mental journey where they start thinking about these different subjects.
01:04:13.000 Absolutely.
01:04:14.000 Nanotechnology and you start exploring it and seeing how bizarre it is.
01:04:17.000 Some of the references that you use, you go and look them up. Yeah, holy fuck, so many people ask for book recommendations after our session. Oh yeah, I'm sure.
01:04:23.000 So many.
01:04:24.000 Yeah, I've got to eventually make a thing on my website.
01:04:27.000 We need to do that.
01:04:27.000 My favorite books.
01:04:28.000 There's favorite documentaries.
01:04:30.000 There's a documentary thread on the message board.
01:04:33.000 I need to make that popular.
01:04:34.000 I'm such a big fan of Joseph Campbell and the sort of monomyth and the hero's journey.
01:04:39.000 Me as well, yeah.
01:04:39.000 See, think of the hero's journey.
01:04:41.000 We think so literally.
01:04:42.000 So we're like, obviously a geographical journey.
01:04:45.000 Like if you go on a safari and you climb Kilimanjaro, you will go through all the steps.
01:04:48.000 A departure from the ordinary, overcoming obstacles, having a catharsis and realization and making the return.
01:04:54.000 But we need to apply that metaphor internally.
01:04:57.000 This podcast session is a Joseph Campbell-esque hero's journey.
01:05:01.000 Person puts on the headphones and it's a departure...
01:05:03.000 Who's the prince and who's the princess?
01:05:05.000 No.
01:05:05.000 I just want to know.
01:05:06.000 We don't need...
01:05:07.000 We're going on a hero's journey.
01:05:08.000 No, but think this is the hero's journey.
01:05:09.000 It follows the steps.
01:05:10.000 It's a departure from the ordinary.
01:05:11.000 We're partaking in conversations that are maybe not your everyday conversations.
01:05:15.000 We're overcoming obstacles in the sense that we're challenging preconceived truths and questioning ourselves and asking difficult questions and thinking new thoughts.
01:05:23.000 So that's the obstacles.
01:05:24.000 And then we're transcending and overcoming the resistance that we have to change into new ways of thinking.
01:05:29.000 And then we're having, hopefully, the catharsis.
01:05:31.000 Hopefully sometime during this journey we have a moment of profound realization that changes us both, and somebody listening, forever.
01:05:36.000 And then we make the return, which is to say, I love that.
01:05:39.000 I want to share that with my community.
01:05:40.000 I'm going to tweet it.
01:05:41.000 I'm going to Facebook it.
01:05:42.000 So if you apply that metaphor of the hero's journey, you try to make...
01:05:45.000 Parts of your life.
01:05:46.000 Significant, heroic.
01:05:47.000 I'm going to wake up in the morning.
01:05:47.000 I'm going to be like, today, I'm going to depart from the ordinary.
01:05:49.000 I'm going to put myself in uncomfortable situations.
01:05:51.000 I'm going to transcend those boundaries.
01:05:52.000 I'm going to have a new realization.
01:05:54.000 I want this day to mean something and then make the return.
01:05:56.000 Right.
01:05:56.000 But what if that day you got like a bunch of shit you need to get done?
01:06:00.000 You listen to this podcast.
01:06:01.000 I want to depart from the ordinary.
01:06:01.000 Because it can happen in your brain too.
01:06:03.000 Right.
01:06:03.000 I'm going to read this book tonight.
01:06:04.000 I'm going to check out this interesting documentary.
01:06:07.000 Right.
01:06:07.000 It's definitely good to expose yourself to different things.
01:06:10.000 Absolutely.
01:06:10.000 That's why I really get into finding Bigfoot.
01:06:12.000 I've been watching that a lot lately, Brian.
01:06:14.000 That's probably a good thing.
01:06:16.000 It's one of the things I have.
01:06:18.000 Isn't that just a waste of time for you though?
01:06:20.000 Fuck yeah, it's a waste of time, but I'm trying to write some new material.
01:06:22.000 I'm doing my special, by the way, it's confirmed.
01:06:24.000 It's going to be happening in Atlanta on April 20th at the Tabernacle Theater.
01:06:28.000 And most of the tickets are sold out for the first show, but we're going to do a second show.
01:06:32.000 So we'll have the first show, I think, at 8. The second show will be 10.30.
01:06:36.000 And the second show tickets will go on sale sometime this week, probably today's Monday, sometime probably Wednesday, I would guess.
01:06:43.000 And it'll be me and Joey Diaz and Duncan Trussell.
01:06:47.000 Holla!
01:06:47.000 And I'll be recording my new comedy special and releasing it Louis C.K. style on the internet for five bucks.
01:06:53.000 That's brilliant, dude.
01:06:54.000 Brilliant.
01:06:55.000 You've got to call it Louis C.K. style.
01:06:58.000 And you're going to be in New York, too, at some point, right?
01:07:00.000 Yeah, I'm going to be in New York.
01:07:01.000 I'm going to be there.
01:07:01.000 Are you going to be there?
01:07:02.000 Let's link up.
01:07:03.000 Yeah, what day is that?
01:07:04.000 I'm going to be there all of April, man.
01:07:06.000 Oh, you are?
01:07:06.000 Okay, cool.
01:07:07.000 Yeah, yeah, yeah, yeah.
01:07:08.000 Actually, I should tell you this because this is really cool.
01:07:11.000 I'll be there May 5th.
01:07:12.000 May 4th.
01:07:13.000 May 4th.
01:07:13.000 Okay, so let's link up.
01:07:15.000 Okay.
01:07:15.000 But actually, next week, man, I'm heading up to, or this week, at the end of this week, I'm heading up to the Bay Area because on the 20th, I'm speaking at Stanford Design School and showing some of my crazy ecstatic videos.
01:07:26.000 Oh, wow.
01:07:27.000 And then on the 27th, on March 27th, I'm speaking at Google.
01:07:32.000 I was invited to speak there.
01:07:33.000 Yeah, I'm going to show some of the videos.
01:07:35.000 And then on March 28th, I'm going to be speaking at the Economist Ideas Festival on Innovation at Berkeley.
01:07:41.000 Wow.
01:07:41.000 It's going to be sick.
01:07:42.000 All about showing the videos.
01:07:43.000 It's about talking about inspiration, creativity, new ways of packaging and disseminating ideas.
01:07:47.000 Then I go to New York, and on March 30th, I'm speaking at the PSFK conference in Battery Park.
01:07:52.000 Instead of making people memorize this stuff, because most of them won't, what is your website?
01:07:56.000 Where is this all?
01:07:57.000 Oh, yeah.
01:07:58.000 JasonSilva.com?
01:07:59.000 Yeah, if you go to thisisjasonsilva.com.
01:08:02.000 Thisisjasonsilva.com.
01:08:03.000 Yes, but the best thing, honestly, is Twitter.
01:08:05.000 Twitter.
01:08:05.000 At Jason underscore Silva, S-I-L-V-A. Yeah, and if you can't find it, it's on mine, talking about this podcast.
01:08:12.000 Yeah, that way I keep people updated on all the talks.
01:08:14.000 And then April 20th, the National Arts Club in New York City.
01:08:17.000 I'm going to be speaking as well.
01:08:19.000 That's amazing.
01:08:19.000 And this is all because of your videos that you produce for the internet, which are really amazing.
01:08:23.000 And if you Google Jason Silva Vimeo, there's a whole page with a gang of them on.
01:08:28.000 And Vimeo is a nice, high-quality visual, too.
01:08:31.000 Yeah, I love Vimeo, man.
01:08:32.000 They're amazing.
01:08:32.000 It's very high-quality.
01:08:34.000 You can go full screen with it on a large screen.
01:08:36.000 It looks great.
01:08:36.000 They get it.
01:08:37.000 And the design, it's made for artists.
01:08:38.000 It's beautiful.
01:08:39.000 Yeah, Vimeo is awesome.
01:08:40.000 We put all of our podcasts up on Vimeo.
01:08:42.000 We also put a video blog up.
01:08:44.000 We put that on Vimeo as well.
01:08:46.000 But yeah, I think that's incredible, man, that you're getting all this work just from those videos popping up on the internet.
01:08:53.000 How did you get started on this, man?
01:08:54.000 What is your background as far as education?
01:08:57.000 Yeah, man.
01:08:57.000 Well, I grew up in Venezuela.
01:08:59.000 And I went to international school and of course after Venezuela I was in film school and I did Current TV which was Al Gore's TV channel for like the last five years.
01:09:07.000 But it was really when I left last year that I wanted to do my own content.
01:09:10.000 Did you ever get massages with Al?
01:09:11.000 No we didn't.
01:09:12.000 Got really baked.
01:09:14.000 Didn't Al have some massage problems?
01:09:16.000 I don't know.
01:09:18.000 Yeah, the short videos, I wanted to apply in principle what I was believing intellectually.
01:09:24.000 I wanted to make content that was memetic, because I believe we live in a world where short-form content disseminated through the internet can infect people, can transform minds.
01:09:32.000 We don't need the old gatekeepers, so to speak.
01:09:36.000 Everybody's empowered.
01:09:37.000 And so the reason is short videos are easier to consume on small devices and this and that, and you don't ask people for too much of their time.
01:09:43.000 That's a big thing.
01:09:44.000 Well, until you've won them over, like you.
01:09:47.000 I know that people love to listen to you for a long time.
01:09:49.000 Well, like they love the podcast.
01:09:51.000 What most people do is they listen to it while they're doing other stuff.
01:09:53.000 That's the best way to do it.
01:09:54.000 That's brilliant.
01:09:56.000 Well, it's a real genre that hasn't really been addressed before.
01:10:00.000 I love coming here with you because it gives me a chance to talk about these ideas in a space which is bigger and people are listening to it.
01:10:08.000 But, you know, for my situation, to initially get the word out about the videos, it just worked to do them really short.
01:10:14.000 But what I think people respond to in them is, whether or not they're into the ideas of exponential growth and technology and transforming the human condition, they're into the idea that inspiration needs to be reinvented.
01:10:23.000 How we package and disseminate big ideas needs to be reexamined because we have a new substrate.
01:10:29.000 The internet is a new substrate.
01:10:31.000 When we invented the printing press, we came up with the format of the book, and there were rules and parameters, and this is how it works best.
01:10:36.000 Television, we came up with the sitcom.
01:10:38.000 Film, we came up with the length of time that a film should be before people get restless in the theater, and so on and so forth.
01:10:42.000 And I think on the internet, we're still figuring it out.
01:10:44.000 What are the parameters that work?
01:10:46.000 What are the lengths of videos?
01:10:47.000 We look at the statistics and get the information and find out how long people pay attention to stuff, and this and that.
01:10:52.000 And so I'm just trying to raise that conversation.
01:10:55.000 I get excited when I find long form documentaries available on like Google Video and YouTube.
01:11:00.000 Yeah, but people don't do it yet on their screens.
01:11:03.000 We're going to have that more when we have the merger of TV and the web.
01:11:07.000 Apple TV. Because then you're on the couch watching TV. Watching web content and it's a different experience.
01:11:13.000 It's not just a small screen.
01:11:14.000 When the screens merge, I think we'll have that.
01:11:17.000 Do you think that's entirely going to happen?
01:11:18.000 You don't think that it'll still have the separation of computer and television?
01:11:23.000 No, I think software is going to eat the world.
01:11:25.000 That was a great article that I read.
01:11:27.000 So do you think that networks, like NBC, ABC, that's like legacy, it's all going to be like VHS tape someday?
01:11:33.000 Yeah.
01:11:33.000 I think we're going to be interfacing.
01:11:36.000 I think the Apple TV thing is coming, is my feeling.
01:11:39.000 And that is going to make everything intuitive.
01:11:41.000 I read an article yesterday from Nick Bilton from The Times where he was saying that he gets anxious when he looks at his cable and TV box because there's so many buttons and it's so complicated and he doesn't know what input is connected to what, and most of the stuff he doesn't want to watch anyway.
01:11:56.000 And he says that he looks at his iPad and everything's so neat and he can press what he wants and get what he wants in real time.
01:12:01.000 We're moving in that direction.
01:12:02.000 I mean, have you guys checked out HBO Go?
01:12:04.000 It's really cool.
01:12:05.000 Awesome.
01:12:05.000 Yeah.
01:12:06.000 I mean, on your computer, watch anything, anytime, on demand, if you're an HBO subscriber.
01:12:11.000 Wow.
01:12:11.000 But, like, a premium, beautiful, you know, experience.
01:12:14.000 Holy shit.
01:12:14.000 I think that's the future.
01:12:15.000 That's beautiful.
01:12:15.000 And, you know, another thing that's the future, which is it's still clunky today.
01:12:19.000 And I can't believe it's still clunky.
01:12:21.000 Like, the other day, I wanted to find a show on a channel.
01:12:24.000 And I'm like, I don't know what the channel is.
01:12:26.000 I have a thousand numbers.
01:12:27.000 So, I'm, like, going through each channel.
01:12:29.000 Like, God, where the fuck is this channel, you know?
01:12:31.000 It should be a certain channel.
01:12:32.000 The search feature, right?
01:12:32.000 No, it's just going to be Siri.
01:12:34.000 It's going to be, turn to the Cartoon Network.
01:12:37.000 And that's all it's going to be.
01:12:38.000 And the remote control alone is just such, it's like looking at an old payphone.
01:12:43.000 Yeah, when the UFC moved to Fuel TV, when they were having some fights on Fuel TV, I had to find Fuel TV. Took forever, right?
01:12:49.000 Good fucking luck, man.
01:12:50.000 Yeah.
01:12:50.000 Good luck.
01:12:51.000 It sucks.
01:12:52.000 Yeah.
01:12:52.000 It's like 618 on DirecTV.
01:12:54.000 And especially when you have HD channels now, too.
01:12:57.000 Half the time, you're not even watching the HD channel, and you're like, oh shit, how long have I been watching this?
01:13:02.000 Yeah.
01:13:02.000 There's so many channels now, man.
01:13:04.000 It's stupid.
01:13:04.000 It's amazing.
01:13:05.000 That's why I think the Apple TV, when it does get released, I think that's just going to change everything.
01:13:09.000 I think everything's going to be a la carte.
01:13:11.000 I think, yeah, NBC's going to be around, but they're going to be like any other channel.
01:13:15.000 It's going to be like your channel.
01:13:17.000 People like shows, though.
01:13:20.000 They like Lost.
01:13:21.000 They like things that are going to be produced by a production company that you might not be able to replicate.
01:13:26.000 The home...
01:13:29.000 You know, even with incredible software.
01:13:31.000 I think people will always want the premium experience, and they're willing to pay for the premium experience.
01:13:36.000 I mean, I don't mind paying $20 to see an IMAX 3D film in a theater and being completely immersed in an experience like that.
01:13:42.000 Did you like Avatar?
01:13:44.000 I thought it was beautiful.
01:13:45.000 Did you feel any Avatar depression once you left?
01:13:47.000 You know, I think that's fascinating.
01:13:49.000 Don't you think?
01:13:50.000 I love it.
01:13:50.000 That idea?
01:13:51.000 Yeah.
01:13:52.000 Like, there's a great book called The Art of Immersion by Frank Rose, who used to be at Wired, who says that the future of immersive storytelling, and an example is Avatar is such an immersive 3D experience, and he says, we all long to go back to Pandora, even though we've never really been there.
01:14:08.000 Yeah.
01:14:09.000 We missed something that wasn't really real.
01:14:11.000 But then again, everything is not really real, right?
01:14:13.000 Because it's all an illusion.
01:14:14.000 But more and more, dude, immersive experiences like that.
01:14:16.000 Yes.
01:14:17.000 We're going to get sad when we fall out of the game or out of the movie or out of the virtual space because it's increasingly becoming more interesting than reality.
01:14:25.000 If people got Avatar depression, really they got depression that they weren't one of those blue things.
01:14:31.000 Because you wouldn't want to be living in Avatar if you were a human.
01:14:35.000 You're just a little fucking bitch of an animal.
01:14:37.000 I think it was just so pretty.
01:14:38.000 That got jacked left and right.
01:14:39.000 Yeah, but humans didn't even have a chance in the Avatar world.
01:14:42.000 You can't have Avatar depression.
01:14:44.000 You essentially have depression about your species.
01:14:46.000 You want to be one of the Na'vi.
01:14:48.000 Well, you want to be larger than life, but that can be...
01:14:51.000 Well, you want to live that lifestyle that they're living the love and honor.
01:14:54.000 When I was little, I used to get Indiana Jones depression.
01:14:57.000 When Indiana Jones ended, I used to get sad.
01:14:59.000 I wanted to dig for treasure in my backyard, and I wanted my life to be as fun as Indiana Jones.
01:15:03.000 I remember that feeling as a kid.
01:15:05.000 I really remember leaving films and wanting the luster and the awe of...
01:15:15.000 I think the idea of Avatar though was that the culture of Avatar was missing everything that we're missing or rather that the culture of the Na'vi had everything that we're missing.
01:15:29.000 That our lost society, that our materialistic, ridiculous society where we're not taking responsibility for our own actions, we all act collectively as a gigantic group or corporation, that this tribal life, this tribal life where all these people were forced to carry their own weight and celebrated and loved each other.
01:15:48.000 Yeah, but that tribal life that supposedly was so advanced, I mean, they still had hierarchical systems.
01:15:54.000 There was still an angry boss that told everybody else what to do.
01:15:57.000 There was still warriors.
01:15:58.000 They didn't really transcend our savagery.
01:16:02.000 But they're happier than secretaries.
01:16:04.000 Do you see what I'm saying?
01:16:04.000 There's an interesting thing that Kurzweil had mentioned, that he thought it was really striking that we use the world's greatest technology to bring our imaginings into being.
01:16:15.000 To make that movie.
01:16:17.000 To then criticize technology in the movie.
01:16:19.000 So you use the most powerful computers and digital tools to realize that dream into screen.
01:16:25.000 And then you tell a story inside of that technologically mediated reality.
01:16:29.000 You tell a story about how bad technology is and how we should all go live in the forest again.
01:16:33.000 Not really.
01:16:35.000 What they told the story about was about greed and about the willingness to fuck over cultures and kill entities just to get that crazy mineral.
01:16:43.000 But a lot of people came out of that and said it was an indictment of technology.
01:16:46.000 What was the mineral?
01:16:47.000 Impossibranium or something like that.
01:16:49.000 That was some stupid fucking name.
01:16:51.000 What was it called, Brian?
01:16:52.000 I don't know.
01:16:53.000 They saw the movie once.
01:16:55.000 Obtainium.
01:16:56.000 Unobtainium.
01:16:57.000 Impossible to obtain.
01:16:58.000 Something along those lines.
01:16:59.000 Like, oh, you silly goose.
01:17:01.000 There's a great term called computronium that I recently learned.
01:17:04.000 And I think it's when we leverage all the matter in the universe or in the galaxy into computation.
01:17:12.000 So all the atoms, we put computation into everything and then it becomes a computronium.
01:17:17.000 I'm not sure if I'm explaining it correctly, but yeah, this idea that civilization will eventually get advanced, that it can leverage all the matter in the universe and put computation into it.
01:17:26.000 Harness all the matter and energy in the universe.
01:17:28.000 What does that even mean?
01:17:30.000 Could you use that to get you to work?
01:17:30.000 It means everything will have computation in it.
01:17:32.000 Well, you know how there's, you know, our computers are built of materials and we put computation into those...
01:17:38.000 So we could put computation into the stars?
01:17:41.000 Yeah.
01:17:42.000 That's...
01:17:42.000 Yeah.
01:17:42.000 How the fuck would you do that?
01:17:44.000 I'm not a physicist, but this is stuff that you can find physics articles that, you know, speculate about the future and how a society will cross a scale and then it will harness the energy of a star and put computation into matter and terraform other worlds.
01:17:56.000 And yeah, I mean, it's...
01:17:58.000 I mean, we already do it inside of computers.
01:18:01.000 I mean, computation and complexity inside of a microchip, the only other thing as complex is the brain.
01:18:07.000 Nothing else in the universe has that complexity.
01:18:09.000 I find fascinating when I go back to some 1980s and 1990s science fiction movies.
01:18:14.000 I like watching what they thought a computer was going to be like.
01:18:18.000 The movie Alien, I watched that again the other day.
01:18:20.000 One of my all-time favorite movies, an amazing movie, and still holds up as far as suspense.
01:18:24.000 You must be excited about Prometheus, though.
01:18:26.000 Oh!
01:18:26.000 Fuck yeah.
01:18:26.000 Oh man, I saw the 3D trailer in a theater.
01:18:29.000 Anything Ridley Scott comes up with, I'm down for.
01:18:32.000 But the first Alien movie is one of my all-time favorite movies.
01:18:35.000 But god, the computer looks so fucking wonky and shit.
01:18:39.000 And it was fascinating that when you look at some of these older movies, they'll take place in 2017. And it's like nothing looks anything like today.
01:18:51.000 Everything's super futuristic, flying cars and shit.
01:18:54.000 Like, when was Blade Runner supposed to take place?
01:18:57.000 How far in the future was it?
01:18:59.000 That's a good question.
01:19:00.000 I don't think it was that far.
01:19:01.000 Dude, if you would see some of the little flying robots that they showed at TED this year.
01:19:08.000 Oh, I saw some of those.
01:19:09.000 Oh, the choreographed flying little helicopters that could do a dance and go around obstacles and objects, and those are going to have HD cameras and they can map rooms.
01:19:18.000 The Google self-driving cars...
01:19:21.000 200,000 miles they've driven with zero accidents.
01:19:24.000 A million people a year die in road accidents, okay?
01:19:27.000 A million people a year.
01:19:28.000 When we switch over to those self-driving cars, which we already know after 200,000 miles, no accidents.
01:19:33.000 They're only going to get better.
01:19:34.000 That's, I mean, it's coming.
01:19:35.000 So that's what cars are going to be?
01:19:36.000 Of course.
01:19:37.000 Self-driving cars.
01:19:38.000 Just like airplanes, man.
01:19:39.000 Humans are too unreliable.
01:19:41.000 You'd never be able to go sideways or on a corner.
01:19:43.000 You could do it on a sports track.
01:19:44.000 You'd have to go to a track.
01:19:45.000 Yeah, it'll be a sport, but not in a place where you can hit a pedestrian or hurt somebody else, you know?
01:19:50.000 Of course.
01:19:50.000 Of course.
01:19:51.000 But that's...
01:19:52.000 Yeah, that's amazing.
01:19:53.000 That's coming, man.
01:19:53.000 It's amazing.
01:19:54.000 Those Google guys, oh, they're geniuses.
01:19:56.000 Yeah, they're so...
01:19:58.000 No one saw them coming.
01:19:59.000 I mean, if there was a Skynet, and Skynet wanted to sneak up on society and just sort of integrate itself completely, I mean, it is Google.
01:20:06.000 Is there like a website?
01:20:07.000 Google is Skynet.com or something like that?
01:20:09.000 Yeah, but I think at this point, their motto of don't be evil is holding true.
01:20:14.000 I love Google, don't get me wrong.
01:20:16.000 But I'm saying it's amazing with Google Maps, Google fucking voicemail, and Google Gmail.
01:20:22.000 All free.
01:20:22.000 Jesus Christ.
01:20:23.000 All free.
01:20:24.000 Yeah, it's incredible.
01:20:25.000 There's a whole book about how everything is dematerializing and it's becoming for free.
01:20:29.000 We used to have a camera, but now a camera doesn't exist because it's inside your phone.
01:20:33.000 You used to have a notebook to write things down.
01:20:35.000 That disappeared because now it's all on your phone.
01:20:37.000 Which, by the way...
01:20:38.000 Everything is dematerializing and going into your devices.
01:20:40.000 Have you seen this new device that's just come out?
01:20:42.000 There's a new Droid that came out?
01:20:44.000 This Samsung Journal?
01:20:47.000 Oh!
01:20:47.000 Have you seen this thing?
01:20:48.000 The pen one.
01:20:49.000 Dude, it's fucking five inches.
01:20:51.000 Yeah, it's huge.
01:20:52.000 It's the biggest one ever.
01:20:53.000 It's like a cross between a tablet and an iPhone.
01:20:56.000 Right.
01:20:56.000 It's amazing.
01:20:57.000 Yeah.
01:20:58.000 Beautiful.
01:20:58.000 I bet the battery lasts about 35 seconds.
01:21:02.000 On full brightness, you've got about a half a minute.
01:21:05.000 It's pretty crazy that the new iPad 3 has the same battery life, but yet the screen's HD. Oh yeah, that's exponential growth right there.
01:21:16.000 The battery life on those iPads is amazing.
01:21:18.000 They like sold out their first batch already, dude.
01:21:21.000 The demand is unprecedented, dude.
01:21:23.000 It's pretty shocking how long you can watch a movie on those things.
01:21:26.000 You watch three, four movies and you look at it, it's not even like halfway juiced with the battery.
01:21:30.000 I can't wait for the iMind.
01:21:31.000 You think they're already working on the iMind?
01:21:33.000 What is that?
01:21:33.000 It'll be like a synthetic mind.
01:21:35.000 I don't trust them.
01:21:36.000 I can wait for the iCar.
01:21:38.000 They need to make cars.
01:21:39.000 Oh yeah, iCar.
01:21:40.000 They need to make a car.
01:21:40.000 Well, they'll do their counterparts. Google will do self-driving Android cars, and then Apple needs its iCar.
01:21:45.000 Jesus, how long before we see self-driving cars on the street?
01:21:48.000 Very soon.
01:21:49.000 I think very soon.
01:21:50.000 Because Google already has them driving in California and there's been over 200,000 miles.
01:21:54.000 Yeah.
01:21:54.000 They're driving in California right now.
01:21:55.000 Oh, yeah.
01:21:56.000 You could be rear-ended by a fucking machine.
01:21:58.000 There's been zero accidents in 200,000 miles.
01:22:01.000 Would you want to be the first one, son?
01:22:02.000 They map.
01:22:02.000 They map three-dimensional maps of what's in front of them.
01:22:06.000 Dude, it's insane.
01:22:06.000 They can see.
01:22:07.000 Like, they can see and they can...
01:22:09.000 Notice people walking and they'll adjust accordingly.
01:22:11.000 It's insane.
01:22:12.000 You know what freaks me out, man?
01:22:14.000 Insane.
01:22:14.000 We got onto this through the idea of robotics and flying drones.
01:22:17.000 What freaks me out is those things that walk that have like 10, 15 legs and you kick them and they adjust.
01:22:25.000 Yeah, that one that looks like a dog, dude, that you immediately sympathize with.
01:22:28.000 What the fuck is that thing, man?
01:22:29.000 That's incredible.
01:22:30.000 You should have seen the TED Talk.
01:22:31.000 The head of DARPA gave a TED Talk, dude, and she was the most poised, elegant, articulate, attractive woman, dude.
01:22:37.000 Isn't that the people from Lost?
01:22:38.000 DARPA is the Defense Advanced Research Projects Agency.
01:22:40.000 That's DHARMA. DHARMA, I know.
01:22:41.000 Yeah, I know.
01:22:42.000 DARPA. She was amazing.
01:22:43.000 She was talking about dreaming the impossible.
01:22:44.000 She actually reminded me of Jodie Foster's character in Contact.
01:22:47.000 This is really elegant, poised, articulate.
01:22:49.000 You actually felt comforted to know that somebody that intelligent-seeming is running DARPA. And her TED Talk was unbelievable.
01:22:55.000 What did she talk about?
01:22:56.000 She was talking about dreaming the impossible and we have to challenge what is in order to dream about what could be.
01:23:02.000 And she's speaking on behalf of the agency that has invented a lot of stuff.
01:23:06.000 So it's kind of amazing.
01:23:07.000 What have they invented?
01:23:07.000 I don't know.
01:23:08.000 But a lot of stuff that we take for granted today.
01:23:11.000 Cutting edge stuff.
01:23:11.000 Are you sure?
01:23:12.000 Pretty sure.
01:23:12.000 If you don't know, how can you be sure?
01:23:14.000 Because I read about DARPA all the time. They do advanced, secret research projects.
01:23:18.000 Wow.
01:23:18.000 I wonder what they're working on.
01:23:19.000 She showed a hypersonic plane.
01:23:21.000 We still never got to aliens.
01:23:23.000 We didn't get to aliens.
01:23:24.000 No, you took me on a crazy journey.
01:23:26.000 I took you through Transcension.
01:23:27.000 Why we could never see them.
01:23:29.000 Fermi's Paradox.
01:23:30.000 So you don't think we'll ever see them?
01:23:32.000 Only when we build our own black hole and go into it.
01:23:34.000 Then we'll meet them at the end of time.
01:23:36.000 Because it slingshots you into the future.
01:23:38.000 So it's not possible that they could just be roaming through this universe in galactic spaceships?
01:23:43.000 No, because if they did that, that would influence our evolution.
01:23:46.000 Us discovering them and being influenced by their technology would be influencing our ultimate evolution.
01:23:51.000 It would create a butterfly effect.
01:23:54.000 And the thing that John Smart says is they wouldn't want to do that because that would be akin to incest.
01:23:58.000 To influencing us in some way and then changing how we unfold.
01:24:02.000 Okay, devil's advocate.
01:24:03.000 This is through our understanding of genetics, right?
01:24:05.000 It's not through theirs.
01:24:07.000 If they're a thousand or a million years more advanced than us, maybe they know a lot more about how to work that shit.
01:24:12.000 Right, and maybe that's why he says they don't get involved.
01:24:14.000 So maybe it's not that they would think of it as incest at all, but they've completely gone past the idea of gender.
01:24:19.000 And replication by means of sexuality is just what we have to do to make one step from the primate form into the gray, alien, large, almond-shaped eye form.
01:24:30.000 They wouldn't want us to replicate their technology?
01:24:32.000 Says who?
01:24:34.000 There's some sort of thing that will lead to the most diversity, and if we're not influenced by them, there'll be more diversity, because we'll get there ourselves.
01:24:41.000 It's just beginning to get me in it.
01:24:42.000 Well, the idea, I think...
01:24:43.000 Anyway, that's what he says.
01:24:43.000 The idea that people really love to share when it comes to wacky alien theories is that aliens have genetically engineered human beings in the first place.
01:24:50.000 Well, I mean, you can't unprove that, so that's...
01:24:53.000 That's a problem, right?
01:24:55.000 You can't unprove leprechauns, bro.
01:24:57.000 You know?
01:24:58.000 No, no, no!
01:24:58.000 Why can science...
01:25:00.000 Yeah, that's the thing.
01:25:01.000 That's the dog thing.
01:25:02.000 We're watching this...
01:25:04.000 The nuttiest thing about this weird looking robot thing is that it moves, it has sort of like an insect-like leg setup, but if you kick it, it adjusts and it doesn't fall down.
01:25:15.000 Yeah, look, it's adjusting to the sand and the water.
01:25:17.000 Yeah, totally, dude.
01:25:18.000 Walking on the beach.
01:25:19.000 Yeah, and if that thing starts saying hi to you and smiling and drooling, you would totally fall in love with it.
01:25:24.000 You know what I'm seeing?
01:25:24.000 Put that thing on one more second.
01:25:26.000 You know what I'm saying?
01:25:27.000 I'm seeing that thing storm out of the back of a giant battleship and missiles flying off of it.
01:25:32.000 That's what I think.
01:25:33.000 I think if they make one of those fucking things, they make it to send over to countries.
01:25:37.000 Could you imagine a whole army of these motherfuckers heading into your town shooting missiles?
01:25:43.000 I'm imagining them as pets for people who are lonely.
01:25:46.000 You couldn't strap some rocket launchers on that bitch?
01:25:47.000 It's always a double-edged sword.
01:25:48.000 I have to admit it's always a double-edged sword.
01:25:50.000 But look, what if people start riding them?
01:25:52.000 What if people start riding them?
01:25:53.000 They become the new horses.
01:25:54.000 Yeah, they will become the new horses.
01:25:55.000 They'll be our little pets.
01:25:56.000 Look, it's wagging its tail.
01:25:57.000 It's wagging its tail.
01:25:58.000 Don't you feel bad for it when he kicks it?
01:26:00.000 You feel bad for it.
01:26:01.000 It's so human-like.
01:26:02.000 Yeah, but think of how quickly you feel bad for it.
01:26:04.000 I do, but what's amazing is this thing adjusted.
01:26:07.000 And it seems to be adjusting.
01:26:08.000 The movements seem organic.
01:26:11.000 Yeah, totally.
01:26:13.000 Totally.
01:26:13.000 We're going to have one of those and we're going to ride them like a goat down the side of a mountain.
01:26:18.000 That's what it's going to be.
01:26:20.000 You know those crazy goats?
01:26:22.000 They're incredibly strong and they can climb up the side of mountains.
01:26:25.000 They never complain.
01:26:27.000 They never get tired.
01:26:29.000 They don't have aches.
01:26:31.000 Yeah, but then they probably run on solar power, too.
01:26:34.000 Eventually, solar power is going to get to a point where it can power everything, right?
01:26:37.000 Of course, dude.
01:26:38.000 We get 10,000 times more energy from the sun than we need.
01:26:41.000 This fucking thing's slow as shit.
01:26:43.000 We just need to get better ways of capturing that energy.
01:26:45.000 I'd be pissed.
01:26:46.000 If I was riding this stupid thing right now, I'd be like, come on, bitch.
01:26:48.000 Free energy.
01:26:49.000 That's what the sun gives us, free energy.
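A quick arithmetic sketch of the "10,000 times more energy from the sun than we need" figure above, just as an order-of-magnitude check. The two input numbers (roughly 173,000 terawatts of sunlight intercepted by Earth, and roughly 18 terawatts of global human energy use around 2012) are approximate outside figures assumed here for illustration, not values from the conversation.

```python
# Rough order-of-magnitude check of the "10,000x more solar energy than we need"
# figure. Both inputs are approximate, commonly cited values assumed for
# illustration; they are not numbers taken from the episode.
solar_power_intercepted_tw = 173_000   # sunlight intercepted by Earth, ~173,000 TW
human_primary_power_tw = 18            # global primary energy use circa 2012, ~18 TW

ratio = solar_power_intercepted_tw / human_primary_power_tw
print(f"Sunlight exceeds human energy use by roughly {ratio:,.0f}x")
# Prints a ratio around 9,600x, i.e. on the order of the quoted 10,000x.
```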
01:26:51.000 It's too heavy, Brian?
01:26:51.000 Yeah, for that, you know, how it's going.
01:26:54.000 Oh, wow, it's making it through snow.
01:26:56.000 It's fascinating.
01:26:56.000 That is incredible.
01:26:57.000 It's walking through snow.
01:26:58.000 It's fascinating.
01:26:59.000 And there's a bunch of different designs of them, too.
01:27:01.000 I've seen other ones that have many more legs.
01:27:04.000 Wow, that's crazy looking.
01:27:05.000 This thing's dancing around.
01:27:07.000 Robotics, man.
01:27:07.000 Robotics is going to be such a huge...
01:27:09.000 Well, robotics and AI. Yeah.
01:27:11.000 Artificially intelligent robots.
01:27:13.000 Jesus Christ, look at that.
01:27:14.000 They're down to the bare skeleton of the thing.
01:27:16.000 This is incredible stuff, folks.
01:27:18.000 I know we're just talking right now, unfortunately, for a lot of you folks that are listening to this on iTunes.
01:27:23.000 What should they Google, Brian, so they can watch this?
01:27:25.000 This is Boston Dynamics, but it's just a new big dog robot.
01:27:30.000 New big dog robot video.
01:27:32.000 And it looks like a spider when you're looking.
01:27:34.000 Really, it's the one that has 700,000-plus hits on it.
01:27:37.000 It's a must see.
01:27:39.000 You need to know.
01:27:39.000 Amazing.
01:27:40.000 Yeah, I mean, what does the future hold that we're not prepared for?
01:27:45.000 What is the next step?
01:27:46.000 You know, I mean, the internet, I think, caught most people by surprise.
01:27:49.000 Yeah, well, we're in for some...
01:27:51.000 Is that a robot dog?
01:27:53.000 Oh, my God!
01:27:54.000 Brian just put on a robot dog, and this thing is moving around like an animated dog!
01:27:58.000 This is insane!
01:28:01.000 The thing is, there was a guy at TED that showed his...
01:28:04.000 Oh my god, that's insane.
01:28:05.000 That dog's insane.
01:28:07.000 No, there's going to be more of those kinds of robots, and the more that they interface with us and they look cute, it doesn't matter if they're conscious or not.
01:28:17.000 Once they cross a certain kind of...
01:28:20.000 Perceptual barrier that we have and they're like, they seem real, we'll start to interface with them as if they are real.
01:28:24.000 Well, they're going to be our friends just like your dog.
01:28:26.000 You know, when you come home and you have a conversation with your dog, it's a one-way conversation.
01:28:30.000 It's a one-way conversation.
01:28:30.000 It's just about the feedback.
01:28:31.000 Exactly.
01:28:32.000 And we'll have the same thing with robots, dude.
01:28:33.000 Yeah, as long as we can get past the idea that something that's metal and wires and, you know, that that thing can't have some sort of a soul.
01:28:42.000 Yeah.
01:28:42.000 Because you're interacting with it.
01:28:43.000 You know, if you're interacting with it, as long as it doesn't get needy.
01:28:45.000 You'll plant a soul in it.
01:28:46.000 What if your fucking computer gets needy?
01:28:48.000 It might.
01:28:49.000 Well, maybe you might want it to get needy because it'll make you feel like you're important to someone.
01:28:54.000 Maybe that's a part of what it's like to have a robot fuck doll.
01:28:58.000 That a robot fuck doll, the really good ones, they're really dangerous.
01:29:01.000 This bitch might burn your house down.
01:29:03.000 You can't fuck other girls.
01:29:05.000 She's going to be the best, hottest robot fuck doll ever.
01:29:08.000 The robots will give you whatever you want.
01:29:11.000 And that's the only way to make it hot.
01:29:12.000 The only way to make it hot.
01:29:13.000 She has to be super jealous.
01:29:14.000 She can't just let you treat her like shit.
01:29:16.000 I mean...
01:29:18.000 Why aren't you sharing your location services with me?
01:29:20.000 She becomes a robot, an angry, psycho, jealous robot.
01:29:25.000 And that's the hottest sex you can get.
01:29:27.000 And so that's why everybody just accepts it.
01:29:29.000 You'll be able to get whatever you want.
01:29:31.000 The robots are going to be...
01:29:33.000 They'll be Asian robot fuck dolls.
01:29:34.000 Don't ask no questions.
01:29:36.000 Just take care of business, son!
01:29:38.000 Right?
01:29:39.000 They'll always be the exotic Asian robot fuck dolls.
01:29:41.000 Yeah.
01:29:43.000 You know, it's interesting.
01:29:45.000 We keep talking about all this exotic technology, and it sounds...
01:29:48.000 I think we were talking about Asians.
01:29:49.000 We're talking about girls, bro.
01:29:50.000 It sounds hallucinatory, even though it's very quickly emerging.
01:29:56.000 And it just takes you back to that whole thing about computers as the modern version of the psychedelics.
01:30:02.000 I just want to say, if you're an Asian girl, I'm just joking around.
01:30:04.000 These are just jokes.
01:30:05.000 I just throw things out there.
01:30:06.000 I don't mean anything by them.
01:30:08.000 Okay?
01:30:10.000 But what do you think of that?
01:30:11.000 Like, to quote Timothy Leary, computers are the LSD of the 90s.
01:30:14.000 People took drugs and they're like, we can expand our minds, and now computers expand our minds.
01:30:20.000 That relationship is very fascinating to me.
01:30:22.000 It's absolutely fascinating.
01:30:23.000 Well, right now, just think, what's this interface that's happening right now?
01:30:26.000 This is all live.
01:30:27.000 Right now, only 2,000 people are synced up live with us, but eventually this feeling of this conversation and these ideas explored are going to branch out to about a half a million people.
01:30:40.000 Right.
01:30:40.000 So, half a million minds are hearing our thoughts.
01:30:44.000 Yeah, and out of that half a million, who knows how many people are going to just, you know... I read this Tony Robbins thing once where he talked about... Tony Robbins is actually very positive.
01:30:54.000 You know, a lot of people think that Tony Robbins is full of shit because he's kind of like made a lot of money.
01:30:57.000 Oh, no, I think he's brilliant.
01:30:58.000 He's got a lot of very, very good points.
01:31:01.000 And one of them was to change your life, to make huge changes, all you need is a small change in the direction.
01:31:07.000 And over time, that small change will lead you so far away from where your initial direction was taking you.
01:31:12.000 It's absolutely true.
01:31:13.000 And the idea is that if you have two cars in two parallel lines, and one of them just takes a slight turn to the right, and they both keep driving straight.
01:31:20.000 The one that took the slight turn to the right, a hundred miles from now, is going to be way the fuck away from that other one.
01:31:26.000 Totally.
01:31:27.000 And that's sometimes really how you have to look at it.
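A minimal sketch of the two-cars analogy above: both cars start side by side, one keeps its heading, the other turns by a small fixed angle and then drives straight, and the sideways gap grows with distance. The 1-degree deviation and the distances are illustrative assumptions, not numbers from the conversation.

```python
import math

def lateral_gap(distance_miles: float, deviation_degrees: float) -> float:
    """Sideways separation after driving distance_miles on a straight path
    that deviates from the original heading by deviation_degrees."""
    return distance_miles * math.sin(math.radians(deviation_degrees))

# Illustrative assumption: a 1-degree difference in direction.
for miles in (1, 10, 100):
    print(f"{miles:>4} miles -> {lateral_gap(miles, 1.0):.2f} miles apart")
# After 100 miles, a 1-degree difference already puts the cars about
# 1.75 miles apart, and the gap keeps growing the farther they drive.
```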
01:31:29.000 Yeah.
01:31:29.000 We especially, I mean, I'm a super impatient person.
01:31:32.000 I want things now.
01:31:33.000 Me too, man.
01:31:33.000 Even when I go to the supermarket, I'm like, you bitches don't have any grass-fed beef.
01:31:36.000 Right.
01:31:37.000 Seriously?
01:31:37.000 Right.
01:31:37.000 But I mean, like...
01:31:40.000 How preposterous is it that I think I can just go to a place where they've raised an animal on grass only, killed it for me, and there's plenty of meat.
01:31:48.000 I can feed my family.
01:31:49.000 I can stay alive from this food here.
01:31:52.000 But you're complaining because they didn't get the shipment of organic meat that day.
01:31:56.000 Bitches don't have no grass-fed beef?
01:31:57.000 The fuck?
01:31:58.000 Well, what about when you're having a Skype conversation with somebody on the other side of the planet, and you're just like, take it for granted that you can see their face, that they can see yours, you know?
01:32:07.000 You're talking in real time for free, and then all of a sudden it might freeze.
01:32:11.000 You're like, oh, goddammit, it's freezing!
01:32:13.000 Why is this freaking computer freezing?
01:32:15.000 But think about what you were just enjoying two seconds before, and you totally take it for granted.
01:32:22.000 We assimilate, man.
01:32:23.000 Hedonic adaptation.
01:32:24.000 Yeah, that is what it is.
01:32:26.000 It is adaptation.
01:32:27.000 It's amazing, though, that we have this urge and this push to make things bigger, faster, quicker.
01:32:32.000 And that urge and push is also responsible for one of the reasons why people get accustomed to things and want more.
01:32:39.000 Yeah, and you quoted McKenna, and you talked about something about the astonishment, to not give in to the astonishment.
01:32:45.000 Yes, to not give in to the astonishment.
01:32:46.000 To not give in to it, but definitely seek it out.
01:32:48.000 Because I think most people...
01:32:49.000 Well, he's talking about DMT, though.
01:32:51.000 Yeah.
01:32:52.000 But the truth in DMT. You ever had a DMT experience?
01:32:55.000 I have not.
01:32:57.000 But don't you think that, for example, it's astonishing that you can do this podcast and reach half a million minds.
01:33:03.000 And very rarely does one marvel at the astonishment of the things that occur every day that are miraculous.
01:33:08.000 How many hundreds of thousands of aircraft are flying through the air right now, communicating with one another, flying safely, carrying individuals to other parts of the world?
01:33:16.000 Right.
01:33:16.000 Right.
01:33:17.000 We don't experience that astonishment.
01:33:18.000 I don't wake up in astonishment.
01:33:20.000 We should.
01:33:20.000 Yeah, you're absolutely right.
01:33:21.000 I mean, if you had pulled someone out of the caveman era and put him in modern society, it would be just as psychedelic as a lot of peyote trips.
01:33:29.000 Yes.
01:33:29.000 Yeah.
01:33:30.000 Yeah, exactly.
01:33:31.000 It'd be so bizarre and outside of what you conceived of just seconds ago as being possible.
01:33:37.000 Right.
01:33:38.000 You know, you take someone from, you know, a thousand years ago, 500 years ago, a blip in time means nothing to the universe, and then put him in today, or put him in a goddamn movie theater, make him watch Harry Potter and shit his pants.
01:33:47.000 Right, shit his pants.
01:33:48.000 Can you imagine what a guy would do if he saw a fucking dragon, one of those Harry Potter dragons blowing fire out, flying through the air?
01:33:55.000 He would just dive on the ground screaming in horror.
01:33:58.000 Right.
01:33:58.000 And that is so amazing.
01:34:00.000 It's amazing.
01:34:01.000 And the fact that that is.
01:34:02.000 Like, we wake up in the morning, we don't think about that, because that just is.
01:34:05.000 We're on to the next thing.
01:34:07.000 Yeah.
01:34:07.000 Yeah.
01:34:07.000 We're done and on to the next thing to be looking forward to or to be complaining about.
01:34:12.000 And maybe that's the part of our evolutionary makeup that makes us always probe the boundaries of the adjacent possible and always want to keep pushing.
01:34:20.000 Because maybe if we were in astonishment of all we've done, we wouldn't keep progressing.
01:34:23.000 We're obsessed with innovation.
01:34:25.000 Yes.
01:34:25.000 Human beings are obsessed with innovation.
01:34:27.000 Yes.
01:34:27.000 I mean, you know, every year sports cars get faster.
01:34:31.000 Yes.
01:34:31.000 You know, we're getting to a point right now where like regular cars are doing like race track numbers.
01:34:37.000 Yeah.
01:34:37.000 It's insane.
01:34:38.000 Yeah.
01:34:38.000 Even though we have speed limits, even though we have that.
01:34:40.000 Like we still push the performance to its limits.
01:34:43.000 I'm fascinated by sports cars just because I'm fascinated by extreme engineering.
01:34:47.000 Yeah.
01:34:47.000 And I'm fascinated by the idea that there's a bunch of people out there that are trying to get something that handles faster, has better geometry, moves better, sticks.
01:34:53.000 And the newest Porsche 911 goes around the track as fast as the 996 Cup car.
01:35:01.000 So the Nürburgring is this really twisty, turny track in Germany. A really high-end sports car can go around it today in about 7 minutes 30 seconds, 7 minutes 40 seconds.
01:35:15.000 That's like a 911, you know, like a real high-end car.
01:35:18.000 That's what race cars would do just a decade earlier.
01:35:23.000 So it's getting to this crazy point where regular modern street cars are like fucking cup cars.
01:35:30.000 And how much faster do you need these fucking things?
01:35:33.000 Like, you know, the Bugatti Veyron, they have a Bugatti Veyron.
01:35:37.000 It's like a thousand fucking horsepower!
01:35:39.000 Yeah, but we do it just to do it, man.
01:35:41.000 That's crazy!
01:35:42.000 Just to see how much complexity we can pack into it, how much performance we can get out of it.
01:35:47.000 It's like modern jet engines, dude, operate at half the temperature of the surface of the sun.
01:35:52.000 The core of a modern jet engine.
01:35:54.000 Jesus Christ.
01:35:54.000 I mean, it's insane, okay?
01:35:57.000 It spins at 500 miles.
01:35:59.000 I mean, I don't remember the speed, but jet engines are really feats of engineering.
01:36:03.000 I mean, it's dazzling.
01:36:04.000 I mean, you know, everyone's scared of flying.
01:36:06.000 Jesus Christ, there's 30,000 flights a day and nothing happens.
01:36:08.000 It's incredible.
01:36:09.000 It's so safe.
01:36:10.000 It's amazing.
01:36:11.000 It's so safe.
01:36:12.000 Yeah, but the way you go is so terrifying.
01:36:15.000 Just slamming into the fucking rainforest.
01:36:17.000 Yeah.
01:36:19.000 Well, that's why I love Virgin America so much, dude, because they got brand new planes.
01:36:22.000 They really rock.
01:36:24.000 Brand new, state-of-the-art fleet.
01:36:27.000 Well, because most other airlines in this country have fleets that are 20 to 25 to 30 years old.
01:36:31.000 Dude, you're scaring the fuck out of me right now.
01:36:33.000 I need to go on Virgin America.
01:36:35.000 Listen, that doesn't make them any less safe.
01:36:37.000 These planes are still certified and well-maintained.
01:36:40.000 Nonetheless, on Virgin America, you're getting a brand new fleet of shiny state-of-the-art aircraft with the best of everything with internet.
01:36:49.000 No, you're not getting a brand new pilot.
01:36:50.000 He's even saying, dude, We're good to go.
01:37:09.000 The same general principles, but the engines are far more reliable and far more advanced than they were before.
01:37:14.000 When did they start getting much better?
01:37:16.000 Oh, well, the same Moore's Law that applies in computers.
01:37:19.000 I mean, the engineering of a modern jet engine is done in computers.
01:37:21.000 But aren't a lot of these jets from, like, the 1970s and 1980s?
01:37:24.000 Well, no, they make revisions that are pretty much like entire new models.
01:37:28.000 Do they change the engines?
01:37:29.000 They change everything.
01:37:31.000 Yeah, iPhone 1 will work, as long as you don't update the software.
01:37:33.000 Right.
01:37:33.000 Right?
01:37:34.000 But it's kind of interesting, though.
01:37:36.000 It's more fun to go on the new ones.
01:37:37.000 They have far more technology in them.
01:37:39.000 Yeah, absolutely.
01:37:40.000 You know that robot dog?
01:37:41.000 You know what that's going to be in the future?
01:37:42.000 What?
01:37:42.000 Check the screen out right here.
01:37:43.000 How crazy is this?
01:37:45.000 Somebody posted this on your message board, but...
01:37:47.000 Hold on.
01:37:48.000 Episode 2, Attack of the Clones.
01:37:50.000 No, no, no.
01:37:51.000 Here we go.
01:37:52.000 Oh, my God.
01:37:52.000 Is that real?
01:37:53.000 Oh, that augmented reality placed in there or what?
01:37:56.000 Yeah, yeah.
01:37:57.000 But they're AT-ATs, you know, from Star Wars.
01:37:59.000 Those robot dogs are the exact same thing as an AT-AT. Right.
01:38:02.000 No, dude, you're totally right.
01:38:03.000 I mean, it looks exactly the same.
01:38:06.000 Oh, so what is that?
01:38:07.000 That's just they added that?
01:38:08.000 Yeah, somebody just put a funny video together.
01:38:10.000 Yeah, well, that's amazing.
01:38:12.000 You're totally right.
01:38:13.000 Yeah, that's exactly what it is.
01:38:14.000 Yeah.
01:38:16.000 That's fascinating.
01:38:17.000 There's Brian's stupid fucking cat clock.
01:38:18.000 How dare you?
01:38:20.000 Oh, is that the cat clock?
01:38:22.000 Yeah, that's the famous cat clock.
01:38:24.000 He likes cats.
01:38:26.000 He likes things to meow.
01:38:30.000 Future of medicine, man.
01:38:32.000 Are you excited about that?
01:38:33.000 Yeah, well, I'm excited about the idea of keeping people alive long enough to figure out some really crazy shit.
01:38:39.000 The idea of people staying long enough to overpopulate the planet kind of freaks me out, though.
01:38:44.000 Yeah, well, I think that most people cluster around only like 3% of the surface of the world, which is city-states, like big cities.
01:38:51.000 Yeah, the world is still mostly empty space, and it's mostly water, and technology is more like a resource-liberating mechanism, because scarcity is just contextual.
01:39:00.000 Things are only scarce until you create technology that makes them into things that are abundant.
01:39:04.000 People talk about...
01:39:04.000 Water wars, but the minute...
01:39:06.000 So you're not worried at all about overpopulation?
01:39:08.000 No, man.
01:39:08.000 Not at all.
01:39:09.000 In fact, the more developed and educated people become, and in developed nations, the rate of having children goes down significantly.
01:39:15.000 McKenna has no...
01:39:16.000 The best cure against overpopulation is to educate and empower people and put more technology into their hands.
01:39:21.000 But also, um...
01:39:23.000 Also, desalinization, for example.
01:39:25.000 Once we perfect that technology, this is called a blue planet.
01:39:28.000 It's a water planet.
01:39:29.000 It's mostly water.
01:39:30.000 It just needs to be converted.
01:39:31.000 Do they have anything right now that can do it widespread?
01:39:34.000 Israel has a lot of desalinization plants.
01:39:37.000 They've just got to get more advanced, just like solar panels.
01:39:39.000 It's just exponential growth.
01:39:40.000 Once they hit the tipping point where it's actually cheaper to use those technologies than to do it the other way, then it'll become the main thing.
01:39:47.000 Wow.
01:39:49.000 Desalinization.
01:39:49.000 That's going to be intense.
01:39:49.000 You need to incentivize people to innovate.
01:39:54.000 We're such cunts, though, we'll probably dry out the fucking ocean.
01:39:57.000 We'll probably pull all the water out of the ocean.
01:39:59.000 No, I don't think we will.
01:40:00.000 Could you imagine, though, if we're so greedy, we use up all the water in the ocean?
01:40:05.000 I mean, nobody predicted that we would have polluted the ocean the way we have in just 100 years.
01:40:11.000 I mean, we've done an incredible job of fucking up the ocean.
01:40:11.000 We'll do nanotechnology, we'll create synthetic biology, algae that eats the plastic, and we'll, yes...
01:40:16.000 Where's the evidence of that ever having taken place in the past?
01:40:19.000 When have we ever fixed anything?
01:40:20.000 There's an X-Prize contest that the X-Prize Foundation is doing to come up with something for plastics, or technology to clean up oil spills, something like that.
01:40:28.000 Yeah, like bacteria that eats plastic.
01:40:31.000 Something like that.
01:40:31.000 What it is, they create incentive by offering these prizes, like $10 million prizes, and teams around the world will spend $100 million to win a $10 million prize because of the prestige and because of the legacy.
01:40:41.000 Isn't that where...
01:40:42.000 Swamp Thing came from?
01:40:43.000 For what?
01:40:43.000 Swamp Thing.
01:40:44.000 Remember the Marvel Comics Swamp Thing?
01:40:46.000 From a contest?
01:40:47.000 No, no.
01:40:48.000 From pouring some biological shit to eat up some...
01:40:51.000 Maybe I'm inventing it.
01:40:52.000 Oh, I don't know.
01:40:53.000 Maybe it's another comic book hero.
01:40:54.000 For example, the XPRIZE, they were the ones that did the $10 million XPRIZE for space, which became Virgin Galactic.
01:41:01.000 Well, they have one now to create a device that's the size of an iPhone called a Tricorder.
01:41:06.000 $10 million so you can make a device that you can spit on or you can put your blood on and that will diagnose you with the equivalent of 10 certified doctors with greater accuracy than 10 certified doctors.
01:41:17.000 I swear to God, this is their new contest.
01:41:18.000 This is their next $10 million prize that they just put out.
01:41:22.000 Tricorder XPRIZE. Is it possible to do that?
01:41:24.000 Of course it's going to be possible.
01:41:25.000 They already have things that you can put on your iPhone that you can spit on that will measure and analyze your fluids.
01:41:32.000 Really?
01:41:32.000 Yeah, they already have that.
01:41:33.000 You spit on your iPhone and it gives you information.
01:41:36.000 What's it called?
01:41:37.000 I have no idea, but people can Google spitting on your iPhone medical device.
01:41:41.000 Wow, I never heard of that.
01:41:42.000 That's amazing.
01:41:43.000 That stuff is going to get a lot faster because now that biology is becoming information, biology is an information technology, we're going to see the same progress.
01:41:50.000 Well, it is so cool when you have contests for good along those lines, like with XPRIZE and the fact that they would come up with something along that.
01:41:57.000 Yeah, they're brilliant.
01:41:57.000 Yeah.
01:41:58.000 And I mean, I would love to believe you.
01:42:00.000 I'd love to believe that someone's going to eventually figure out a way to get rid of that giant patch of garbage that's in the Pacific Ocean.
01:42:05.000 We shall.
01:42:06.000 Yeah.
01:42:07.000 We shall.
01:42:07.000 That's a big issue, huh?
01:42:08.000 Yeah.
01:42:08.000 Well, people talk about that a lot.
01:42:10.000 They're very concerned.
01:42:11.000 Although it's not actually like the size of a country as people have said.
01:42:14.000 I think it's the size of Texas.
01:42:16.000 I don't think it actually is physically the size of Texas.
01:42:20.000 Well, I think that they're so small.
01:42:22.000 No, I think the...
01:42:23.000 You know how it is.
01:42:24.000 It's all caught in the current.
01:42:25.000 There's like a vortex.
01:42:27.000 Really?
01:42:27.000 And that's where all the garbage piles up.
01:42:29.000 And all the garbage...
01:42:30.000 I'll look at it right now.
01:42:31.000 Okay.
01:42:32.000 Let me Google this real quick.
01:42:34.000 Powerful Google.
01:42:36.000 Pacific Ocean garbage patch.
01:42:38.000 Pacific Ocean.
01:42:39.000 Oh, it's fucking huge.
01:42:49.000 Holy shit.
01:42:50.000 Wow. No.
01:42:58.000 Although many media and advocacy reports have suggested that the patch extends over an area larger than the continental US, recent research sponsored by the National Science Foundation suggests that the affected area may be twice the size of Hawaii.
01:43:15.000 Wow.
01:43:15.000 That's fucking big.
01:43:16.000 But that's not the size of Texas.
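A back-of-the-envelope comparison of the two size claims read out above, "twice the size of Hawaii" versus "the size of Texas." The areas used (Hawaii roughly 10,900 square miles, Texas roughly 268,600 square miles) are approximate figures assumed for illustration, not numbers from the episode.

```python
# Compare "twice the size of Hawaii" against "the size of Texas".
# Both areas below are approximate figures assumed for illustration.
hawaii_sq_mi = 10_900    # total area of Hawaii, roughly
texas_sq_mi = 268_600    # total area of Texas, roughly

patch_estimate_sq_mi = 2 * hawaii_sq_mi       # "twice the size of Hawaii"
ratio = texas_sq_mi / patch_estimate_sq_mi

print(f"Estimated patch area: about {patch_estimate_sq_mi:,} square miles")
print(f"Texas is roughly {ratio:.0f} times bigger than that estimate")
# Still huge (~21,800 square miles), but only about a twelfth of Texas.
```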
01:43:20.000 I'm pretty confident that we will create nanotechnology that will literally eat up the garbage.
01:43:26.000 That's how we'll fix it.
01:43:28.000 But then when it runs out of garbage, then it'll be hungry.
01:43:30.000 And then it becomes swamp thing.
01:43:32.000 It doesn't want to die, man.
01:43:34.000 It doesn't want to die.
01:43:36.000 Isn't that like the premise to a lot of comic book monsters?
01:43:38.000 It's a double-edged sword.
01:43:40.000 Fuck that is.
01:43:41.000 We better come up with a way to kill those things before we feed them plastic.
01:43:45.000 Well, yes.
01:43:46.000 Look, it's important to look at all the possible uses of technology for good and for bad.
01:43:51.000 That's why the conversation needs to be had, though.
01:43:52.000 Let me ask you this.
01:43:53.000 The progress is not stopping.
01:43:55.000 I think if we paint beautiful pictures of how things could be, we inspire the people to make sure that that's what we actualize.
01:44:01.000 Absolutely, I completely agree with you.
01:44:02.000 And I think, you know, the way you're doing it in videos and online is really cool, and it's very positive.
01:44:06.000 Thanks, buddy.
01:44:06.000 Well, the way you're doing it is amazing.
01:44:08.000 But my question to you is, what if we saw kangaroos evolving?
01:44:11.000 What if we saw kangaroos, and they had found some flower, a psychedelic flower, they started eating it, and kangaroos started building houses, whittling weapons and shit like that, and we saw some kangaroos welding. Would we allow that shit?
01:44:26.000 You think we'd go in and kick the kangaroos' asses and go, get the fuck out of here with your armor?
01:44:30.000 What, bitch?
01:44:31.000 We might make other animals smarter.
01:44:33.000 Who knows?
01:44:33.000 Do you think so?
01:44:34.000 We might give them sanctions.
01:44:35.000 But then we'd be battling for resources.
01:44:37.000 I think we would just jack them.
01:44:38.000 No, because, no.
01:44:39.000 We don't even want Iran to have nuclear power.
01:44:41.000 What if the kangaroos came up with the nukes before Iran?
01:44:44.000 What if kangaroos just started fucking being really super smart, man?
01:44:48.000 Yeah, well, but I don't think that the resources will be an issue, because we'll be harnessing this matter and energy from the whole galaxy.
01:44:55.000 There's an infinity of resources.
01:44:56.000 You say that, but what if an asteroid lands in Australia, right near where the kangaroos are, and some spores from this asteroid contain a never-before-seen mushroom that rapidly accelerates evolution, and within, like, a hundred years, they surpass us, and then kangaroos are smarter than us.
01:45:14.000 What do you do then, Brian?
01:45:15.000 What are you going to do with your fucking cat clock?
01:45:18.000 I'll create a time machine and take the words what if out of the dictionary.
01:45:22.000 Well, then I'm going to take a time machine and take the word like out and you won't ever be able to say anything.
01:45:26.000 Whatever.
01:45:27.000 Whatever, bitch.
01:45:30.000 Listen, I believe that you are absolutely convinced that someone's going to come up with this.
01:45:34.000 I just don't know if I agree with you.
01:45:35.000 Well, I see it happening, man.
01:45:38.000 You see this?
01:45:38.000 What is the current plans to fix this now?
01:45:42.000 To fix which...
01:45:43.000 The garbage patch that we were talking about?
01:45:44.000 Yeah, well, they're talking about creating some kind of algae or bacteria that eats the plastic.
01:45:49.000 I think one of the big guys of synthetic biology is Craig Venter, who also spoke at the Singularity University thing.
01:45:56.000 And he was saying, in terms of the future of fuels and the future of cleaning up chemicals, we're absolutely going to be using synthetic biology.
01:46:05.000 Wow.
01:46:06.000 Yeah.
01:46:07.000 Because we can program life to do whatever we want.
01:46:11.000 It's just like we can use language to describe anything.
01:46:14.000 We can just author instructions.
01:46:16.000 And here you have software that writes its own hardware.
01:46:18.000 See, that's the thing about programmable life, unlike computers.
01:46:21.000 You write the code, the code manufactures its own phenotype.
01:46:27.000 Right?
01:46:27.000 Because life, the genes, determine its physical attributes.
01:46:30.000 So the software writes its own hardware into existence.
01:46:33.000 That's what's really exciting about synthetic biology and programmable life.
01:46:37.000 Especially if you give some artificial intelligence access to 3D computers and 3D printers.
01:46:42.000 Dude, absolutely.
01:46:44.000 Things are going to get crazy.
01:46:46.000 Unlimited intelligence, unlimited intelligence that replicates itself, and 3D printers.
01:46:51.000 But just the concept of 3D printers, having it aware of, oh, now I can improve upon this design of 3D printers.
01:47:01.000 With trillions of times more RAM than our brain.
01:47:01.000 Yeah, and instantaneously.
01:47:02.000 Well, you know what Henry Miller said.
01:47:03.000 One year is like 10,000 years of progress.
01:47:05.000 And we need to believe that it's coming, man.
01:47:06.000 Henry Miller said, the day that men cease to believe that they will one day become gods, then they will surely become worms.
01:47:15.000 Wow.
01:47:16.000 That was Henry Miller.
01:47:17.000 So he says, believe.
01:47:19.000 Mankind, you know, going from ape to Superman, you know, smack in the middle of a trajectory between the born and the made.
01:47:26.000 That's where we are, man.
01:47:27.000 Yeah, we're in this weird stage.
01:47:29.000 We're in the middle.
01:47:30.000 Yeah, this weird stage where we're sort of conscious and we're aware.
01:47:34.000 We're also animalistic and jealous and weird and savage, horny.
01:47:38.000 We're going to turn ourselves into the most beautiful artwork we've ever made, man.
01:47:42.000 You really think so?
01:47:43.000 I definitely do.
01:47:44.000 Or the aliens land first.
01:47:45.000 Or the aliens land first.
01:47:47.000 So you don't believe that societies ever get to the point where they travel from one place to another land and affect things?
01:47:52.000 That doesn't...
01:47:53.000 No, I think that they do, but the transcension hypothesis says that by the time maybe they reach the edge of the solar system or the edge of the galaxy, at that point, all the density goes back and it goes inwards into the nanoscale.
01:48:03.000 So it's kind of like we...
01:48:04.000 The complexity kind of goes into itself and...
01:48:07.000 It makes a black hole and disappears from the visible universe.
01:48:10.000 Is it possible that what we're dealing with?
01:48:11.000 People should look up the Transcension Hypothesis because it'll probably explain much better than I can.
01:48:15.000 But isn't it possible that what you're dealing with is something that's here all the time but it's in another dimension?
01:48:21.000 Hyperdimension, string theory, yeah.
01:48:22.000 I mean, that is addressed in that article, yeah.
01:48:25.000 So you could even go back to McKenna and say, oh, so when McKenna talks about hyperdimensional beings, well, the Transcension Hypothesis says essentially our minds, yes, will break through the visible universe into other dimensions.
01:48:37.000 It's like crazy stuff, except it's like written by an academic scholar.
01:48:41.000 Wow.
01:48:42.000 Yeah.
01:48:43.000 So no aliens and flying saucers, just landing.
01:48:46.000 Yeah, that's what he says.
01:48:47.000 He says, well, yeah, we'll go to other planets, but that's like early stage stuff.
01:48:51.000 Like going to other planets over the next like 50 years, you know, that's early stage.
01:48:54.000 So if we ever get invaded, we're essentially being invaded by young punks.
01:48:57.000 The really high level aliens wouldn't bother invading us.
01:49:01.000 Right, right.
01:49:01.000 Totally.
01:49:02.000 You know, every time we do that, the microphone picks it up.
01:49:05.000 Sorry, man.
01:49:05.000 It's fine.
01:49:06.000 It's good.
01:49:06.000 But, you know, someone's going to be upset.
01:49:09.000 I'm sensitive because people always complain.
01:49:10.000 I used to chew gum.
01:49:11.000 Can't chew gum on the mic anymore.
01:49:13.000 People are like, dude, you're fucking sipping.
01:49:15.000 Can't sip drinks.
01:49:16.000 If you get up here and go, people get mad.
01:49:19.000 I guess what you've got to think of is that's the reason why we put headphones on.
01:49:23.000 We would easily do this conversation.
01:49:24.000 But if you were in the gym right now, you would hear that.
01:49:27.000 Cool.
01:49:27.000 Sorry, Jim.
01:49:28.000 No, please.
01:49:29.000 Sorry.
01:49:29.000 Sorry, everybody.
01:49:30.000 Just wanted to keep everybody happy.
01:49:32.000 This is all incredible stuff, and I guess it all could come true and come to fruition as long as we don't fuck it up or as long as some gigantic natural disaster doesn't happen as well, right?
01:49:44.000 Yes.
01:49:45.000 Do you ever take any care or in consideration?
01:49:49.000 Well, yeah.
01:49:50.000 Supervolcanoes, shit like that.
01:49:51.000 I think, look, we have to be paying attention and we have to be cautious and we have to be vigilant as we transition towards what promises to be the most exciting time in human history.
01:50:02.000 I mean, we're already living in the most exciting time in human history, but let's not lose focus.
01:50:06.000 Let's address the grand challenges of humanity.
01:50:09.000 We've never had such tools with which to do so.
01:50:13.000 And I think it's like an opportunity for us to pool our mental cognitive surpluses together and fix shit.
01:50:20.000 Yeah, absolutely.
01:50:21.000 Do you think we'll ever get to the point where we can avoid asteroids?
01:50:24.000 Sure.
01:50:25.000 You think so?
01:50:26.000 Yeah, yeah.
01:50:26.000 Shoot them down.
01:50:27.000 Yeah, we'll get to that.
01:50:28.000 Shoot them down, you think?
01:50:29.000 Shoot them down with lasers.
01:50:30.000 A single laser would blow it up.
01:50:32.000 Everything is turning into Star Wars.
01:50:34.000 Yeah.
01:50:34.000 You really think the enemy will do that, though?
01:50:36.000 I mean, some asteroids are miles wide.
01:50:37.000 We already have lasers that are pretty powerful.
01:50:39.000 I mean...
01:50:40.000 And send nuclear weapons to them like in the movie Sunshine.
01:50:43.000 Well, no, the issue with that is actually that it makes it worse because what happens is instead of one big impact, you have hundreds of thousands of impacts.
01:50:50.000 Little ones.
01:50:51.000 Well, they're not even little.
01:50:52.000 You know, you don't need anything that big to make that giant crater in Nevada.
01:50:56.000 You know, it's one of the weird things about all planets.
01:50:58.000 I mean, every planet we find is littered with impacts.
01:51:01.000 You know, we live in a very volatile solar system.
01:51:04.000 Well, we've been inhabiting the Goldilocks space for the Goldilocks amount of time.
01:51:07.000 We've just been very, very lucky.
01:51:09.000 Like I said, we've been talking about this.
01:51:11.000 All of our progress, man, is a blink of a blink of a blink of a blink of a blink in terms of cosmic time.
01:51:15.000 So it's like, it's not that we've...
01:51:17.000 I mean, we're lucky, yeah, but it hasn't been that much time that has passed.
01:51:20.000 You know, give a couple million years and the inevitability of getting hit is coming.
01:51:24.000 That's why we've got to progress so that we can thwart that.
01:51:26.000 Well, they just found a very recent evidence of an impact, a big one, about 13,000 years ago.
01:51:32.000 And what's really fascinating about that is all the ancient history theorists point to that point in time as being the end of the Ice Age, like around that time, the end of the Pleistocene, and also that's when a lot of people point to the possibility of an ancient civilization like Egypt falling apart and then rebuilding in the same area.
01:51:52.000 Sure, sure, sure, sure.
01:51:52.000 You know, when they hypothesize that something went wrong, it's always around 10,000, 12,000, 13,000, somewhere around there.
01:51:58.000 Like these cycles?
01:51:59.000 Yeah.
01:51:59.000 Yeah, well, the idea is that, you know, human life on this planet, like the reason why there's the myth of Atlantis and the myth of, you know, Noah and the Ark and the epic of Gilgamesh is that there's all these giant disasters just that frequently hit, you know, and, you know, if something hit us today...
01:52:19.000 We're wired to look for danger, man.
01:52:21.000 That's the only way we survive.
01:52:22.000 So cautionary tales embedded in our culture are just alarm systems.
01:52:26.000 And it's kind of a race.
01:52:28.000 I mean, it's kind of a race between technology, awareness, progress, and the ability to at least predict and prepare slightly for natural disasters.
01:52:39.000 But some of them, like caldera volcanoes and things along those lines, this is nothing you can do, man.
01:52:43.000 It's just nothing you can do.
01:52:44.000 When it goes, it goes.
01:52:45.000 Unless you can figure out a way to throw some ice cubes on the lava.
01:52:50.000 Maybe.
01:52:50.000 Keep it from fucking blowing sky high.
01:52:52.000 Nanotechnology is the only way I think that could be addressed, you know?
01:52:56.000 That really is the craziest technology.
01:52:59.000 I mean, self-replicating things in a nanoscale.
01:53:02.000 So do you think that you'd be able to throw those into the lava and they would somehow or another chill everybody the fuck out?
01:53:07.000 Yeah, change the molecular structure of the lava.
01:53:10.000 What if that fucking freezes up the planet and turns us into another ice age?
01:53:14.000 Jesus Christ, Jason Silva, what are you doing?
01:53:16.000 The butterfly effect issue is always...
01:53:19.000 Yeah, it is, right?
01:53:19.000 We don't know.
01:53:21.000 Well, no, we'll have supercomputers that can map out every possibility, trillions of times more scenarios than we can map out in our heads.
01:53:28.000 So those AIs will be able to pick the best scenario.
01:53:31.000 They'll make mathematical projections.
01:53:32.000 It'll be like, okay, there's a billion and one probabilities of...
01:53:35.000 This is the best one.
01:53:37.000 Let's do it.
01:53:37.000 You spend so much time thinking about the future and thinking about all these possibilities.
01:53:42.000 Is it possible that when you do this, or is it difficult when you do this, not to ignore the present?
01:53:50.000 Is it like sort of a normal thing to sort of ignore the present, where you're concentrating entirely on what the human race is going to accomplish?
01:53:57.000 Well, I think that...
01:53:59.000 You know how they say that.
01:53:59.000 I didn't phrase that very well, but you know what I mean.
01:54:01.000 No, no, no.
01:54:02.000 We always talk about how human beings need a purpose.
01:54:05.000 A purpose by its very nature implies a reason to look forward.
01:54:09.000 So we can't help but look to the future.
01:54:12.000 It's what we do.
01:54:13.000 So your purpose is to create a purpose and to put the idea of purpose into people's heads.
01:54:18.000 What gives me a sense of purpose is a collective feeling that, like, wow, humanity has this unique opportunity to sort of map its road beautifully.
01:54:28.000 And we all have a way of participating in that.
01:54:31.000 And what a wonderful sense of collective purpose.
01:54:33.000 It's more interesting to me than like, oh, well, my purpose is to become or get this job or do this thing.
01:54:39.000 It's like, yeah, I want to get this job and do this thing just like everybody else because I want to survive.
01:54:42.000 But I'm in the mood for cosmic purpose, cosmic significance.
01:54:46.000 You're a cosmic dick slinger.
01:54:48.000 Did I say that?
01:54:48.000 It's the same reason that religion always appealed to people, for the same reason that man can live for a few weeks without food, a few days without water, but not for a second without hope.
01:54:59.000 It's just the human condition.
01:55:01.000 The minute we lose hope, we commit suicide.
01:55:03.000 Not the minute.
01:55:04.000 Sometimes you could really suffer for years before you pull the plug.
01:55:07.000 But when you lose complete hope, you might not even wait around.
01:55:10.000 If you're waiting around, it becomes you have a little bit of thinking that things might turn around.
01:55:15.000 So I think it's important.
01:55:16.000 I think it's important to look forward.
01:55:18.000 I think it's huge.
01:55:19.000 I think it's the only thing that propels our progress anyway, because if we were in a stupefied lull staying in the present, we wouldn't do anything.
01:55:29.000 Yeah, of course.
01:55:31.000 What do you see happening in your lifetime?
01:55:34.000 I mean, we are right now in 2012. This is supposed to be, if you're paying attention to Time Wave Zero novelty theory, when the shit hits the fan.
01:55:42.000 You know what I found recently?
01:55:43.000 I've talked about this recently, but I wanted to bring it up with you because I know you're a McKenna fan as well.
01:55:47.000 He altered the end date to coincide with the end date of the Mayan calendar.
01:55:53.000 Yeah, maybe he was of the people that believe that by creating a social movement around these ideas, you more quickly actualize those ideas.
01:56:01.000 People were so upset at me for bringing this up, but somebody posted it on my message board, and then I went and read, and apparently his initial calculations was November.
01:56:10.000 November of 2012. And then he moved it to December?
01:56:12.000 Moved it to December 21st, which is, you know, the end date of the Long Count.
01:56:15.000 But then somebody brought up the other day, there was like an internet meme going around where, you know, calculate leap years.
01:56:20.000 Did the Mayans calculate leap years?
01:56:21.000 Because if they didn't, you know, all this shit all happened 700 years ago.
01:56:25.000 Yeah, I mean, the specifics, I have no idea what the science is.
01:56:30.000 I think what's interesting is that if you create a viral swell, 10 times the scale of the Joseph Kony video, with some beautifully produced message about how mankind is using technology to create a global brain and address the problems of humanity, and it's seen by a billion people by December on YouTube, then the idea becomes reality, because this is what we've been talking about.
01:56:54.000 Ideas are just as real as the neurons that they inhabit.
01:56:57.000 So that's what's crazy.
01:56:58.000 It's a self-fulfilling prophecy.
01:57:01.000 The Joseph Kony video, we talked about this, but I remember when it hit Twitter, when I saw it starting to appear in my timeline, I started thinking, wow, what's going to happen here?
01:57:11.000 This seems like a very orchestrated campaign.
01:57:14.000 And the idea to make a terrible person very famous so that he's a target.
01:57:20.000 More vilified, yeah.
01:57:21.000 What a genius idea.
01:57:23.000 And that really is just sort of tapping into potential.
01:57:26.000 Tapping into, which no one else has done before.
01:57:29.000 No one else has ever done that about a terrible person.
01:57:32.000 Yeah.
01:57:32.000 No, no, it's very...
01:57:34.000 Yeah.
01:57:34.000 It's interesting.
01:57:35.000 It's so interesting.
01:57:36.000 It's brilliant.
01:57:37.000 And just the use of media and understanding its power and applying it for savvy social impact.
01:57:43.000 What are the criticisms of this Kony video?
01:57:45.000 Because I know there's a few...
01:57:46.000 No, the criticism I think has to do specifically with the non-profit...
01:57:51.000 But look, again, that's something, that's a whole other conversation.
01:57:54.000 We're not experts and we don't know the facts.
01:57:55.000 But I think what's interesting is what they've made with the video and what that video means about the future of how messages get spread.
01:58:01.000 That we're seeing, we all realize, we all know where we were when the Kony video hit.
01:58:05.000 It's one of those things where it's like something has changed here.
01:58:07.000 And we're all aware that, okay, this is a new paradigm.
01:58:11.000 It's a paradigm shift.
01:58:12.000 It's a paradigm shift.
01:58:14.000 You know what's really fascinating is Obama, the Obama campaign is releasing, this is where they're so social media, brilliant, savvy, they understand aesthetics in that campaign.
01:58:22.000 They had the director of An Inconvenient Truth, who is about to release a documentary.
01:58:27.000 So like a well-made film about Obama.
01:58:31.000 And that's going to be part of their campaign media materials.
01:58:34.000 So instead of like an ad, like a normal attack ad like the other guys are doing, these guys are releasing a film made by a talented filmmaker.
01:58:41.000 I mean, the brilliance of that.
01:58:43.000 And that's probably going to go ridiculously viral.
01:58:46.000 That's the best campaign video you could have ever done.
01:58:48.000 When is that going to come out?
01:58:49.000 I don't know the dates, but people should Google the new Obama.
01:58:51.000 They just released a trailer.
01:58:52.000 I wonder if it's going to be free or like Louis C.K. I wonder if it's going to be like a Kim Kardashian reality TV show where you know that they've created artificial scenarios to move the plot along.
01:59:01.000 Yeah, like Obama's...
01:59:02.000 Alright, Obama, we got you at a car wash now.
01:59:04.000 Obama's like, Mexican food?
01:59:06.000 I don't want Mexican food!
01:59:08.000 Now, you're going to be washing the car.
01:59:10.000 Could you imagine if they actually did it, they'd produce it like a reality show?
01:59:13.000 Yeah.
01:59:13.000 Would that be the most ridiculous shit ever?
01:59:15.000 I haven't seen it.
01:59:16.000 I wonder what it's going to be.
01:59:17.000 Something tells me it's going to be a well-made film with beautiful music and beautiful cinematography.
01:59:20.000 Well, it'd be nice to see him talk outside of that fake sort of, I'm giving a speech voice.
01:59:25.000 Yeah, yeah, exactly.
01:59:26.000 It should be like a documentary that followed him and you get the behind the scenes moment.
01:59:29.000 Well, the fake I'm giving a speech voice is very disturbing because it's too smooth, it's not real, it's too polished, it's not, you know, I know it's prepared and beaten down.
01:59:40.000 I don't want that out of a leader.
01:59:42.000 What I want out of a leader is I want to know that this is you.
01:59:46.000 This isn't I'm being a strip club DJ. This isn't I'm the AM morning guy on the zoo.
01:59:52.000 Coming up next!
01:59:53.000 The same fucking voice.
01:59:54.000 When you hear a man give a speech.
01:59:58.000 There's that way of talking that is so goddamn fake it should be illegal.
02:00:03.000 They should be able to stop you from making campaign speeches, stop you and go, you can't talk like that.
02:00:09.000 It'll be interesting to see how it comes across.
02:00:12.000 And speaking of politicians, did you see the HBO movie Game Change?
02:00:17.000 What is it?
02:00:17.000 What's Game Change?
02:00:18.000 It's about the McCain.
02:00:20.000 Oh, about Sarah Palin?
02:00:21.000 Yeah, when he picked his running mate.
02:00:22.000 I can't watch anymore Sarah Palin stuff.
02:00:24.000 Well, Julianne Moore was so good in it, dude.
02:00:26.000 She's pretty hot.
02:00:27.000 She looks just like her.
02:00:28.000 Yeah, but the film is so upsetting.
02:00:32.000 Really?
02:00:32.000 Well, because it shows you the theater of what a lot of politics has become.
02:00:37.000 And also how obsolete it is.
02:00:39.000 How accurate is it?
02:00:40.000 How accurate is the conversations?
02:00:42.000 I mean, it's all been doctored up for fucking dramatic effect.
02:00:44.000 I don't know.
02:00:45.000 You need to see it, man.
02:00:45.000 When they bookend conversations and shit.
02:00:48.000 You still get the message.
02:00:49.000 You still get the idea of the reasons that she was put there and her lack of experience.
02:00:55.000 Well, that became painful.
02:00:56.000 Right, but here it's presented in a, you know, the way it would be like a film scholar, you know, explaining something to you.
02:01:02.000 Well, to me, it just illustrates how wonky the system is.
02:01:06.000 That could even be an option.
02:01:08.000 How could that be an option, exactly?
02:01:09.000 Why would we as a society allow something like that?
02:01:13.000 Well, you know, it's what I've always said, is the real problem is that there's really fucking dumb people out there, a lot of them, and they get to vote too.
02:01:21.000 And the problem with dumb people is they don't know they're dumb.
02:01:24.000 So when they see someone like Sarah Palin, who may not be the smartest person in the world, but she's way smarter than them, they can't distinguish between her and Stephen Hawking.
02:01:34.000 When Neil Tyson speaks, he sounds just as brilliant as Sarah Palin, because they're both way out of their fucking league.
02:01:40.000 Most people can barely string together a sentence.
02:01:43.000 And so these are the people that cling to her because she represents simplicity.
02:01:46.000 She represents good old-fashioned things and hunting and family and God.
02:01:53.000 There's a safe danger with her.
02:01:54.000 She's dangerous, but she's also familiar.
02:01:57.000 So maybe that's why.
02:01:59.000 But like, fuck.
02:02:00.000 I feel like today, man, if you have access to the internet, you have no excuse not to be on Khan Academy.
02:02:04.000 You have no excuse not to be watching TED Talks.
02:02:06.000 You have no excuse not to saturate your brain with knowledge.
02:02:09.000 It's not like there's no books around.
02:02:11.000 Every book that's ever been written is an internet click away.
02:02:14.000 And I feel like ignorance is inexcusable these days if you have an internet connection.
02:02:20.000 So it's kind of like...
02:02:21.000 I don't know.
02:02:23.000 It feels like...
02:02:24.000 The tools are there, but it's up to us how we use them.
02:02:26.000 It's going back to the same message, and I think people really need to get on this, right?
02:02:30.000 They need to get on this.
02:02:31.000 The representative government idea has got to go.
02:02:34.000 That's not necessary anymore.
02:02:35.000 We can all instantly communicate with the government.
02:02:38.000 We can instantly decide what we agree with or don't agree with.
02:02:42.000 We can have our voices heard already.
02:02:44.000 The idea that we have senators and congressmen, and they're in this position where they get to vote for their districts.
02:02:49.000 Shut the fuck up.
02:02:50.000 Fascinating idea.
02:02:50.000 That's a ridiculous idea.
02:02:52.000 See, everybody looks at democracy as if you get a say, you get to vote.
02:02:57.000 You don't get a say in shit.
02:03:00.000 You get a say in who you pick, who gets you in a position.
02:03:03.000 Democracy needs to come online, man.
02:03:05.000 100% needs to be revamped.
02:03:07.000 Yeah.
02:03:07.000 They need to throw out all representation.
02:03:10.000 The internet needs to vote on new constitutional amendments.
02:03:14.000 And there should be people who have jobs.
02:03:15.000 But those jobs are to carry out the will of the people.
02:03:18.000 Not to represent the people.
02:03:20.000 The people can represent themselves now.
02:03:22.000 When is somebody going to make a Joseph Kony-style video about legalizing marijuana?
02:03:26.000 And if people say, Click here and play this to say yes.
02:03:30.000 And if it gets a billion views, they'll have to legalize it.
02:03:32.000 You can already see.
02:03:32.000 Just Google The Union, man.
02:03:34.000 Go watch the movie The Union, The Business Behind Getting High.
02:03:37.000 It's a documentary that I was involved in that my friend Adam Skorgy produced.
02:03:41.000 It was like four or five years ago at least that we did this.
02:03:45.000 It's one of the best documentaries on the reality behind the illegalization of marijuana and the reality behind how big of a business it is and how many fucking people use it.
02:03:54.000 Dude, I have members in my family that benefit from its medicinal use and they benefit immensely.
02:04:00.000 It's been like a miracle for my aunt.
02:04:02.000 Well, it's one of those plants, one of those substances, one of those elements of our culture and society that if you were, again, if you were looking at life as a work of fiction, if life was a movie and there was some plant in the movie that was incredibly beneficial...
02:04:19.000 to the culture, not just for instilling a sense of camaraderie in people, not just for making you inquisitive, a turbocharger for your imagination, making sex feel better, not just all of these things.
02:04:31.000 But then it creates a superior fiber that you can make clothes out of that's way more durable than cotton.
02:04:36.000 It makes a much more superior paper and you can put it in an area and in four months it can be ready to process, whereas it takes fucking years to grow trees in the same area.
02:04:45.000 Plus it outproduces the trees in the same acreage by something like four to one.
02:04:50.000 I mean, it's amazing.
02:04:51.000 It has the amino acids that you can live off of.
02:04:54.000 You can use it to make fuel.
02:04:56.000 You can make hemp oil.
02:04:57.000 It becomes productive.
02:04:58.000 Preposterous.
02:04:59.000 Preposterous.
02:04:59.000 But imagine a video that's slickly produced like the Kony video that gets a billion views.
02:05:05.000 Well, let's do it.
02:05:06.000 Let's do it.
02:05:06.000 You and me, dude.
02:05:07.000 It's 10 minutes.
02:05:08.000 Did you hear about the new bill?
02:05:09.000 It's your specialty.
02:05:10.000 We'll pump it up on Twitter.
02:05:11.000 Did I hear about what?
02:05:12.000 There's a new bill in California, a DUI bill, that they're going to make it zero tolerance DUI if you have any weed in your system or any marijuana.
02:05:21.000 And marijuana usually stays in your system.
02:05:22.000 Six weeks.
02:05:23.000 How could they do that?
02:05:24.000 That's like punishing you for being drunk two days ago.
02:05:26.000 So it would pretty much make anyone that smokes weed...
02:05:28.000 Well, that's silly.
02:05:29.000 They'd have to get a urine test from you.
02:05:32.000 You wouldn't be able to blow it.
02:05:34.000 That's what they would be allowed to do if they pulled you over.
02:05:36.000 Oh, that's so ridiculous.
02:05:37.000 The real problem with that is that science is not the same.
02:05:42.000 Marijuana does not treat people or it doesn't affect people the same way that alcohol does, period.
02:05:46.000 I'm not saying you should go out and get high and drive around, but I'm saying some people can drive high and they're fine, and that is a fact.
02:05:52.000 You might not want to address it because it seems like it's a taboo subject and people want to dance around it.
02:05:58.000 It is not getting drunk.
02:05:59.000 Getting drunk is something that really severely impairs your ability to operate machines, your ability to walk, your coordination.
02:06:05.000 They're very different things.
02:06:07.000 Very, very, very different.
02:06:09.000 And still, it's not a good idea to be in any altered state of consciousness while you're responsible for other people's lives.
02:06:13.000 No, that's why we get the self-driving cars.
02:06:14.000 That's why the Google self-driving cars are so perfect.
02:06:16.000 So you can be baked as fuck in your Google car.
02:06:18.000 Just your Google car could be Cheech and Chong to the max.
02:06:21.000 Yes.
02:06:21.000 Just completely filled with smoke.
02:06:23.000 So they open up those side going doors.
02:06:25.000 Why not?
02:06:25.000 Google self-driving cars.
02:06:26.000 And everybody gets a contact high.
02:06:27.000 Yes.
02:06:28.000 And it'll be the self-driving car.
02:06:29.000 It's perfectly safe.
02:06:30.000 The computer doesn't get high.
02:06:32.000 Dude, you're a Google fanboy a little bit, right?
02:06:35.000 I'm kind of a fan of anybody who's pushing the boundaries of the possible, dude.
02:06:38.000 Of course.
02:06:39.000 Yes.
02:06:40.000 I'm a Google fan, boy.
02:06:41.000 Yes.
02:06:41.000 I'm Lycos all the way.
02:06:43.000 Whoa, bro.
02:06:44.000 I'm Netflix.
02:06:45.000 Netflix search.
02:06:45.000 People have the courage to dream.
02:06:48.000 What was the other one?
02:06:49.000 Netscape.
02:06:49.000 No, goddammit.
02:06:50.000 Netscape.
02:06:50.000 That's your browser.
02:06:51.000 Remember Netscape?
02:06:52.000 Yeah.
02:06:52.000 Netscape search?
02:06:54.000 You should do an internet search.
02:06:55.000 Yeah, there was like a search part of it, I think.
02:06:57.000 Does that make sense?
02:06:58.000 I thought it was always a browser.
02:06:59.000 I didn't know they had it.
02:07:00.000 Well, there was a...
02:07:01.000 What were the earliest search engines before Google?
02:07:03.000 There wasn't the first.
02:07:04.000 No, the search engines?
02:07:06.000 Webcrawler?
02:07:07.000 Alta Vista.
02:07:08.000 I remember Alta Vista.
02:07:09.000 I remember Alta Vista.
02:07:09.000 Hotbot.
02:07:10.000 Hotbot, I don't remember.
02:07:12.000 Lycos.
02:07:13.000 I remember Lycos.
02:07:13.000 MSN. So how did those go away, and how did Google just storm the beach and just take over?
02:07:17.000 They out-innovated, man.
02:07:18.000 Is that what it is?
02:07:19.000 Out-innovated is like natural selection.
02:07:21.000 It's like winning.
02:07:22.000 It's like winning the game.
02:07:23.000 Microsoft is trying so hard with this whole Bing thing.
02:07:26.000 First of all, why Bing?
02:07:27.000 What does that mean?
02:07:28.000 What are you saying?
02:07:29.000 That's Google.
02:07:30.000 I don't know.
02:07:30.000 Bing.
02:07:32.000 It's not bad.
02:07:33.000 It's probably the second best one.
02:07:34.000 It's pretty good.
02:07:35.000 But why Bing?
02:07:36.000 Why call it Bing?
02:07:37.000 Why is it Google?
02:07:38.000 Why is Google?
02:07:39.000 Because a Google is a number, dude.
02:07:40.000 It's a very cool number, actually.
02:07:42.000 I've never known that until recently.
02:07:43.000 It's like a term for like a million zeros or some shit.
02:07:48.000 Well, let's find out what it is.
02:07:49.000 What is a Google here?
02:07:50.000 Because we should inform people.
02:07:52.000 What's a Googler?
02:07:53.000 It comes from another term.
02:07:55.000 I don't believe the word is Google.
02:07:57.000 It's an abbreviation of a term.
02:07:59.000 Yeah, Google.
02:08:00.000 It's G-O-O-G-O-L. Yeah.
02:08:04.000 It is, holy shit.
02:08:06.000 How many?
02:08:07.000 Oh my god, 100 zeros.
02:08:09.000 It is the, wow, a googol is the large number 10 to the 100th. That is, the digit 1 followed by 100 zeros.
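Since the definition is being read out here, a small hedged illustration (not part of the conversation) of what a googol is: ten raised to the hundredth power, i.e. the digit 1 followed by exactly one hundred zeros.

```python
# Illustrative sketch, not from the episode: a googol is 10**100.
googol = 10 ** 100
digits = str(googol)
print(digits[0], len(digits) - 1)  # -> 1 100 (a leading 1, then 100 zeros)
assert digits == "1" + "0" * 100
```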
02:08:17.000 What a great way to make a statement about the depth and breadth of your capabilities by using a number like that.
02:08:24.000 It's kind of beautiful.
02:08:25.000 And how perfect.
02:08:26.000 What a perfect description.
02:08:27.000 It's perfect.
02:08:28.000 For Google?
02:08:29.000 That is Google, man.
02:08:30.000 Google voicemail, Google fucking maps, Google...
02:08:33.000 Jesus.
02:08:34.000 That's why I'm so excited to speak there.
02:08:36.000 Yeah, what are you going to talk about?
02:08:37.000 I'm going to talk about creativity.
02:08:39.000 I'm going to talk about innovation.
02:08:40.000 I'm going to talk about inspiration and awe.
02:08:42.000 I'm going to talk about using technology to render the impossible into existence.
02:08:47.000 And I'm going to show some of the videos.
02:08:48.000 Actually, my friend, Josh Kaplan, actually, who set this up, is a huge fan of your show.
02:08:54.000 Oh, that's awesome, man.
02:08:55.000 He loves your podcast.
02:08:57.000 What's up, Josh?
02:08:58.000 Yes, he's the man.
02:08:59.000 And he set up the invite to Google.
02:09:01.000 Oh, that's amazing, man.
02:09:03.000 That's incredible.
02:09:04.000 Google is known as being one of those companies that really treats their employees well.
02:09:09.000 We got an invite a couple years ago when we were in San Francisco.
02:09:12.000 Maybe a year ago, someone from Google emailed me, but I lost it in the shuffle.
02:09:15.000 My email gets clogged sometimes, and I just can't find anybody.
02:09:19.000 Because you get a lot of crazy...
02:09:20.000 You must get a lot of emails in general, huh?
02:09:22.000 Yeah, but the one thing that I was fascinated by, I wanted to see what it was like in there, because I've always thought, like, man, why can't someone make a company where they treat their employees well?
02:09:31.000 Like, how much more does it cost to give them really good food, take care of them?
02:09:34.000 It might cost, like, a little more, but wouldn't it make the atmosphere way better and make everybody appreciate it?
02:09:40.000 Exponentially so.
02:09:40.000 Yeah, I mean, that's like one of the most important things is that the environment be positive.
02:09:44.000 Totally.
02:09:45.000 Nobody wants to work around someone who doesn't want to work there.
02:09:47.000 They also understand that creativity and productivity comes from allowing people to have distractions.
02:09:53.000 Yes.
02:09:53.000 So it's like they have ping pong tables and bean bags and all these things because, you know, and some people might criticize, oh, it's just a playground.
02:09:59.000 No, it probably makes the employees much more creative.
02:10:02.000 You're creating spaces in which the free association and their synapses can fire.
02:10:06.000 Yes.
02:10:07.000 Creativity is about that.
02:10:08.000 And I'm sure they're judged or at least evaluated based on their productivity.
02:10:13.000 It's not like they're not going to be productive at all.
02:10:15.000 I got this job.
02:10:16.000 Tom just played ping pong all day.
02:10:18.000 They're not the type of people that would do that in the first place.
02:10:20.000 Right.
02:10:20.000 So it becomes a resource rather than a distraction.
02:10:23.000 And a distraction as a resource.
02:10:24.000 And these are the post-industrial revolution companies.
02:10:27.000 And these are the most admired companies in the world.
02:10:29.000 You have Apple, you have Google, and people are looking to these companies as examples of how to run businesses, how to have social impact, how to make legacies, how to not be evil.
02:10:37.000 And they stood up against SOPA. This is the new model that corporations are going to be judged upon.
02:10:42.000 So all these new entrepreneurs now coming online, they're getting inspired from these companies.
02:10:46.000 I want to be the next Google and change the world.
02:10:49.000 It's not I want to make the next Google and be rich.
02:10:51.000 It's I want to change the world.
02:10:52.000 What happens with Google Video and stuff?
02:10:54.000 Because I know they came out against SOPA and the Stop Online Piracy Act.
02:10:58.000 That all fell apart and they're trying to come up with a new strategy, a new act.
02:11:03.000 Well, I think we need to all have a new conversation about content ownership in a world in which everybody has the tools to make mixtapes.
02:11:11.000 Yeah, but like what about Google videos and stuff like that?
02:11:13.000 What if someone has a documentary and that documentary is for sale, but you go to Google video and there it is, and you can just watch it for free?
02:11:21.000 No, you do what Radiohead did, which is where they put their album online for free and they said donate money if you would like to pay for this music.
02:11:29.000 So you think that that's how people who want to sell DVDs should deal with the fact that people are stealing their shit?
02:11:35.000 They're gonna have to ask for donations?
02:11:37.000 Well, no, but I think that we're just, it's an environment in which more, because what happens is everybody's going to be making content for free anyway, and the content for free is going to be just as good as the content you charge for, and I think people will pay because they appreciate your content, but I think it's going to be harder and harder to, like, impose
02:11:54.000 payment on it.
02:11:54.000 Well, how would someone, like, let's say, for an example, say if there's a documentary on crocodiles, okay, I tell you about it, oh my god, it's crazy, you've got to watch this.
02:12:01.000 Now you go to Google Video and you find this documentary on crocodiles, how the fuck are you going to find the production company, the website, are you going to search it out?
02:12:08.000 Are you going to go Google the name of it?
02:12:10.000 So you're saying you were just watching the website?
02:12:12.000 Yeah, I mean, if you really wanted to, if they had it set up where you could, you know, where you could donate, if you would like, on their website, I mean, Well, no, they can do a YouTube channel that's supported by ads.
02:12:23.000 And if lots of people watch the movie, they'll get money from the ads that they have on their page.
02:12:28.000 And then in the description, they can say, we're putting this movie online for free because we want to share the ideas, but we're asking for donations of $5 of you.
02:12:35.000 And I'm sure that a lot of people would give it.
02:12:38.000 A lot of people would.
02:12:39.000 A lot of people wouldn't as well, though.
02:12:41.000 So do you feel like that is...
02:12:42.000 And then there's the other argument is the people that wouldn't as well.
02:12:45.000 I kind of see their point of view because they would say, listen, I would have never bought this in the first place, so I'm not taking anything away from them.
02:12:51.000 I downloaded it because it was free, because I knew I could watch it and I didn't like it.
02:12:55.000 So I'm glad they didn't get my money.
02:12:57.000 When you see a bad movie, don't you want to get your money back?
02:12:59.000 Yes.
02:12:59.000 I can see that argument as well.
02:13:02.000 It is a weird thing when it's ones and zeros and it's just being distributed through the internet.
02:13:08.000 It's a weird conversation.
02:13:10.000 Because things only have a price because of scarcity.
02:13:12.000 You need to charge for something because it's a rare commodity.
02:13:14.000 Well, no.
02:13:15.000 Things have a price because there's no scarcity in artwork.
02:13:19.000 No, but your unique work is uniquely yours.
02:13:22.000 So people pay for it because only you did something that's unique to you.
02:13:26.000 And if you have an audience, people will pay for that.
02:13:28.000 That's what I'm saying.
02:13:28.000 But I think that increasingly scarcity itself...
02:13:30.000 But you're compensating them for their efforts.
02:13:32.000 I mean, it's not necessarily just paying for scarcity.
02:13:35.000 It's you're compensating someone honestly for their efforts.
02:13:38.000 I appreciate the efforts, but I think that it's just the genies out of the bottle.
02:13:41.000 It's just too difficult for immaterial things to be contained.
02:13:44.000 Do you think that ultimately that's going to lead to sort of a decay in the idea of capitalism?
02:13:48.000 Everything is going to be reexamined.
02:13:50.000 Everything is going to be reexamined.
02:13:51.000 When they start getting into real high-end 3D printers, and that's how you order things, you just order the formula to create things.
02:13:57.000 Transform manufacturing, yeah.
02:14:00.000 People will get scared and lose their jobs and will have moments of panic and all of that transition will change everything.
02:14:07.000 But you know that like 80% of the jobs that people do today didn't exist 100 years ago.
02:14:12.000 There were jobs that didn't exist.
02:14:13.000 So there will be new things for us to do.
02:14:16.000 It becomes a real problem when people hold on to the idea that they need to keep a job because the job is part of the old way.
02:14:23.000 And that is also...
02:14:24.000 One of the reasons why marijuana is still illegal.
02:14:26.000 And there was a recent article that I tweeted, if you can find it, just a couple of days ago, or just Google the statement, lobbyists are getting rich off keeping marijuana illegal, because that's what's going on, man.
02:14:36.000 There's lobbyists that are doing this through police unions.
02:14:40.000 You know, there's lobbyists that are doing this and they're, you know, these guys are making a lot of money off keeping marijuana illegal.
02:14:47.000 There's a lot of people that their business is to arrest people for pot.
02:14:53.000 I mean, that's part of the job.
02:14:54.000 It's part of what keeps people paid.
02:14:56.000 It's part of what keeps a strong police force.
02:14:59.000 But I think that in a country where most of the population at this point wants it to be legalized, there should be no red tape or bureaucracy between the people's will and it being changed.
02:15:08.000 I think it's also...
02:15:09.000 Most people want it to be legalized.
02:15:11.000 There should be like a like button on Facebook, and if 100 million people click it, it should be legalized tomorrow.
02:15:15.000 And I think that will eventually.
02:15:17.000 Yeah, that's...
02:15:17.000 That's dynamic democracy.
02:15:19.000 This is where we need to get to.
02:15:20.000 But I think one of the issues is, and I think this has to be stopped, is we have to stop treating police officers as glorified revenue collectors.
02:15:28.000 Because that's what they are.
02:15:29.000 And I think that's a really disgusting thing.
02:15:31.000 Because guess what?
02:15:33.000 We have firefighters in place, and I hope we never have to fucking use them.
02:15:35.000 I hope those guys get to hang out at the firehouse all day and cook and work out and do fucking chin-ups and shit.
02:15:40.000 I hope no one ever has to work.
02:15:42.000 I hope no one ever has to deal with a fire.
02:15:43.000 Exactly.
02:15:43.000 I would like the same thing with police officers.
02:15:45.000 Right, it would be great if they never had to take the guns out.
02:15:47.000 Well, yes, but the issue is they have quotas.
02:15:50.000 They have quotas they have to reach.
02:15:52.000 I had no idea.
02:15:53.000 Oh, fuck yeah, especially with speeding.
02:15:54.000 You know, I've talked to friends.
02:15:56.000 They have quotas?
02:15:57.000 Yes, absolutely.
02:15:57.000 Yeah, they have to make quotas as far as giving out tickets.
02:16:00.000 That really raises a red flag, doesn't it?
02:16:01.000 That's like telling a firefighter.
02:16:04.000 That you're only going to get paid if you put out a fire.
02:16:06.000 They're going to be looking to build fires.
02:16:08.000 Imagine what would happen if the entire country decided that for one month, which would fuck up the entire system, that's all we need is 30 days, everybody in agreement, where nobody ever violates a single law as far as speeding or driving or traffic or stoplights.
02:16:21.000 If we made a viral video for it and we created a campaign.
02:16:24.000 Don't break a law for a month!
02:16:25.000 Every cop would get fired.
02:16:26.000 It would be chaos.
02:16:27.000 Wow.
02:16:28.000 Yeah, it'd be crazy.
02:16:28.000 They would lose all that revenue that they count on.
02:16:30.000 They count on us never evolving.
02:16:32.000 I mean, it's really factored into the budget.
02:16:34.000 Dude, we need massive system upgrades here.
02:16:36.000 Massive system upgrades.
02:16:38.000 Just the idea that you have engineered a system where we can never be good.
02:16:42.000 We can never get through because your cops need to arrest a certain amount of people.
02:16:46.000 Need to pull over, rather, a certain amount of people and put out a certain amount of tickets.
02:16:49.000 The state relies on that for real.
02:16:51.000 We're going to have to radically change.
02:16:54.000 Everything will radically change.
02:16:56.000 But there was a time when somebody's life was about making saddles for horses.
02:16:59.000 Because everybody rode horses.
02:17:01.000 And that person was probably really nervous when the car started to become popular because he couldn't make his horse carriages anymore.
02:17:05.000 Lucky for him, lesbians are still around.
02:17:08.000 They still like horses.
02:17:10.000 I wouldn't say lesbians.
02:17:11.000 I'd say women who used to like men but gave up.
02:17:13.000 Now they like horses.
02:17:14.000 I don't know, man.
02:17:15.000 I live right in the equestrian district and I see them every day and some of them are fucking high.
02:17:19.000 These are like spoiled little girls that date like rich guys that buy the models and shit.
02:17:23.000 Yeah, you gotta fuck those girls hard, man.
02:17:25.000 They ride horses all day.
02:17:26.000 They're not impressed by just like regular sex.
02:17:29.000 Riding a giant animal all day.
02:17:31.000 You must feel so feeble.
02:17:32.000 You know?
02:17:33.000 You know what I'm saying?
02:17:34.000 They get on top of you.
02:17:35.000 They're like, really?
02:17:36.000 This is it?
02:17:37.000 This is all you got here?
02:17:37.000 It's the next level of masturbating.
02:17:39.000 I don't know.
02:17:39.000 I always feel bad for the horses, man.
02:17:40.000 There's chains around their mouths.
02:17:42.000 I'm like, no, man.
02:17:43.000 I'd rather...
02:17:43.000 Yeah, I don't like it.
02:17:44.000 It's gross.
02:17:45.000 It's gross.
02:17:46.000 There's a lot of people in my neighborhood, and they're super self-righteous.
02:17:50.000 Like, you know, slow down.
02:17:51.000 You could be like, it's 28 miles.
02:17:53.000 The speed limit's 25. You're going 28. Slow down!
02:17:56.000 Slow down!
02:17:57.000 Slow down!
02:17:58.000 Just big, big bull dyke on her fucking crazy animal.
02:18:01.000 It's a weird thing that you're allowed to just ride around animals in 2012. Someone should come along and you go, really?
02:18:07.000 Just go to a farm somewhere.
02:18:08.000 You can't be just, you know, I don't care if this is an equestrian district.
02:18:11.000 That's ridiculous.
02:18:12.000 It's still Burbank, you crazy fuck.
02:18:14.000 What are you doing riding a horse in my neighborhood?
02:18:16.000 Get out of here.
02:18:16.000 Well, soon we won't eat animals either.
02:18:19.000 I'm convinced about in vitro meat, man.
02:18:22.000 Tissue engineering.
02:18:22.000 Someone's going to have to eat those fucking animals.
02:18:24.000 That's a problem, because what happens then?
02:18:26.000 Do we sterilize them?
02:18:27.000 What do we do to keep them from just being everywhere, like in India?
02:18:31.000 What is it like if you drive everywhere and there's fucking cows or rats in New York City, just infesting the landscape, and we can't eat them?
02:18:39.000 Some of them have to kill them, bro.
02:18:40.000 We're going to have to introduce triceratops, bring in some dinosaurs.
02:18:44.000 We won't breed as many.
02:18:45.000 I mean, that's another option.
02:18:46.000 What if we just let them go, right?
02:18:48.000 If we let them go.
02:18:48.000 We're not eating them anymore.
02:18:49.000 We let them go.
02:18:50.000 They're going to fuck.
02:18:50.000 They're going to fuck and they're going to be like buffalo.
02:18:52.000 Buffalo on the plains.
02:18:53.000 We'll just grow the tissue without a nervous system.
02:18:54.000 They're going to be everywhere.
02:18:55.000 Oh, man.
02:18:55.000 I don't know.
02:18:56.000 I think if we want to stay human.
02:18:59.000 I think they're going to keep breeding.
02:19:01.000 I think we're going to have to get predators.
02:19:03.000 We're going to have to make some robot predators like those dog robot things that only just go out and jack cows.
02:19:09.000 Interesting.
02:19:10.000 They just do it to keep the population down.
02:19:11.000 Who knows, man?
02:19:12.000 It's going to be...
02:19:15.000 Brian just shook his head.
02:19:16.000 We'll have to invent our way out of the new scenario.
02:19:20.000 The new scenarios will come and we'll find new novel solutions to deal with them.
02:19:24.000 Do you eat meat?
02:19:25.000 I eat meat.
02:19:26.000 Not every day, but I'm a flexitarian.
02:19:28.000 A flexitarian?
02:19:29.000 You're flexible?
02:19:30.000 No, I eat meat, but I try not to eat it every day.
02:19:33.000 Do you ever consider the idea that what you're doing is harmful to the energy of the universe, that you're eating tortured animals?
02:19:40.000 Does that ever fuck with you?
02:19:41.000 You ever watch like Food Inc?
02:19:43.000 I try to have organic food, but I still...
02:19:46.000 That's like a cow that grows up in a hippie community and then gets shot in the head.
02:19:50.000 I don't know.
02:19:50.000 It still gets jacked.
02:19:51.000 I would like to become vegetarian.
02:19:53.000 It's just it's not the easiest thing to do logistically, you know?
02:19:56.000 Yeah.
02:19:57.000 Now you can't always find...
02:19:58.000 Right now you're going to get a swarm of hate mail from right now from sweaty little vegans and vegetarians.
02:20:02.000 They're warming up their little fingers right now.
02:20:04.000 Well, I'm a flexitarian.
02:20:05.000 I mean, not eating vegan...
02:20:07.000 I thought you were an open-minded person.
02:20:09.000 Eating vegan food twice a week is already really good.
02:20:12.000 You're making...
02:20:13.000 That's a beginning.
02:20:13.000 That's decent.
02:20:14.000 It's the beginning.
02:20:15.000 Are you?
02:20:15.000 No.
02:20:17.000 I think animals are dumb, and I think if they were smart, they'd be killing us.
02:20:21.000 I think we'd have issues.
02:20:23.000 I think every animal on this planet is an animal.
02:20:24.000 Hopefully we get to the point where our empathy is big enough to alleviate suffering, even suffering of something that's not
02:20:31.000 completely as conscious as we are.
02:20:32.000 However, the cycle of life requires predators, and we have sort of completely hijacked that cycle of life with the idea of cities and civilization and big metal boxes where you can drive through a fucking safari and be ten feet away from a lion killing a gazelle.
02:20:45.000 I mean, we got a crazy reality.
02:20:46.000 We're game changers, man.
02:20:48.000 We're a game-changing species.
02:20:50.000 For good or for bad.
02:20:51.000 I think more for good.
02:20:52.000 I'm more impressed with us than I am disturbed by us.
02:20:56.000 By a long shot.
02:20:57.000 I am much more impressed with us than I am disturbed as well.
02:20:59.000 But it's nice to just kind of marvel at ourselves a little bit.
02:21:02.000 I think we have kind of what's called a guilty cosmic complex where we feel like we're small and insignificant.
02:21:08.000 I think we have a big role to play.
02:21:10.000 We can play an even bigger role if we pool our cognitive resources together.
02:21:14.000 I agree, but I also agree that bacon is delicious, and so is steak.
02:21:18.000 Yeah, but you can grow bacon out of stem cells and not have to kill another animal.
02:21:24.000 Print out some bacon.
02:21:25.000 You're never going to be able to recreate venison.
02:21:27.000 You're never going to be able to recreate wild venison.
02:21:30.000 No, venison is deer meat.
02:21:31.000 You could totally recreate that.
02:21:33.000 There's a delicious, gamey, wild flavor to animals that run away.
02:21:37.000 They'll have that shit in numbers.
02:21:39.000 Ones and zeros.
02:21:40.000 Do you think they'll be able to figure that out?
02:21:41.000 Fuck yeah.
02:21:41.000 Yeah, man.
02:21:42.000 It'll be like in the Matrix when the guy's eating the steak and he's like, I know this is not real.
02:21:45.000 I know it's made of like code.
02:21:47.000 And he's like, I don't care.
02:21:49.000 It tastes delicious.
02:21:50.000 And then he puts it in his mouth.
02:21:51.000 So you're going to be satisfied with that?
02:21:53.000 I think we all would.
02:21:54.000 It's inevitable, right?
02:21:55.000 It's inevitable.
02:21:56.000 Well, dude, I mean, what are we tasting anyway except our brain's interpretation of something going in through senses that are like creating a software that goes in real time and tells us, oh, this is what this feels like.
02:22:05.000 I'm glad I got to experience life with no answering machines.
02:22:08.000 I'm glad I got to experience no cell phones when I was growing up.
02:22:11.000 Were you?
02:22:11.000 So you could see the contrast?
02:22:12.000 Absolutely.
02:22:13.000 Absolutely.
02:22:14.000 Does it make you appreciate how wired you are now?
02:22:16.000 I appreciate how wired we are now, but I also appreciate old school stuff.
02:22:19.000 I appreciate a good steak.
02:22:22.000 I like a good steak.
02:22:24.000 Hardwood coals, grilled.
02:22:25.000 Stop talking about food.
02:22:27.000 Hey man, cooking with fire was a technology too.
02:22:32.000 Yeah, it is.
02:22:33.000 Yeah, absolutely.
02:22:35.000 I mean, look, the appendix exists.
02:22:37.000 It was an organ to break down fibers.
02:22:39.000 We were eating all kinds of crazy shit back then, right?
02:22:41.000 Wasn't it?
02:22:41.000 Wasn't that what it was?
02:22:42.000 And then we lost its use, and that's why a lot of people have to have it removed.
02:22:45.000 Have you ever gotten your genome tested to see what percentage Neanderthal you are?
02:22:49.000 I did 23andMe.
02:22:51.000 The Google thing?
02:22:52.000 Yeah, that's the thing we were talking about.
02:22:54.000 What'd they say?
02:22:54.000 It's not completely...
02:22:56.000 I mean, it's not like they can understand everything yet.
02:22:58.000 Someone's a monkey and doesn't want to...
02:22:59.000 He's trying to downplay the technology.
02:23:02.000 No, no, no.
02:23:02.000 It's one of those exponential things where eventually it'll be 100% tell you everything about everything.
02:23:07.000 Do you have to spit in a cup?
02:23:08.000 Right now, at least it tells you...
02:23:09.000 Did you spit in a cup or something like that?
02:23:10.000 How do you do it?
02:23:11.000 What do you do?
02:23:11.000 They send you this little tube and you spit in it and you mail it to them.
02:23:15.000 Yeah, 23andMe.
02:23:15.000 It's amazing.
02:23:16.000 And then it'll tell you if you have a precondition of some sorts or if you have a likelihood of developing something like high blood pressure or if your genetic profile says you're going to get Parkinson's or the percentages of a chance of developing something.
02:23:29.000 So for people who get stuff that's preventable, you know, if they're like, oh, I have a 70% chance of high blood pressure, I can start addressing that now.
02:23:37.000 I've been told that I'm more likely to get it than another person so I can change my diet now.
02:23:41.000 Because some people are just genetically so lucky that they can eat shit and nothing will ever happen to them.
02:23:45.000 Those motherfuckers.
02:23:47.000 Yeah, that's always the case until we all upgrade our genes.
02:23:49.000 But for everybody else, this is a chance to see what some of their vulnerabilities might be and how they might address them.
02:23:54.000 So we start to hack our biology.
02:23:56.000 How cool is this idea that we all start hacking our biology?
02:23:59.000 We're upgrading ourselves by hacking in and getting backdoors and shortcuts and fixing things.
02:24:04.000 Are we delaying the inevitable brilliant next stage of existence?
02:24:09.000 Are we in this life?
02:24:10.000 You mean that we wake up into something else?
02:24:12.000 Maybe something after this stage is way better.
02:24:15.000 And that's the natural progress.
02:24:17.000 The natural progress is to move from this to the next.
02:24:19.000 Well, that could only be the case if this is a dream.
02:24:22.000 If this is a simulation and we're eventually waking up from the simulation, if this is a lucid dream, if this is limbo from Inception, you know, that you spend 80 years in limbo and you get old before you wake up and become a young man again.
02:24:33.000 But if that's the case, great.
02:24:35.000 Look, awesome.
02:24:36.000 I fucking hope so, man.
02:24:38.000 I'm just not fully convinced.
02:24:39.000 So I'm going to fight for my survival as passionately as I can now.
02:24:43.000 Because I don't have the evidence that there's anything else.
02:24:45.000 And with no evidence, it's pretty hopeless.
02:24:47.000 The despair is pretty vivid.
02:24:48.000 It might be the big sleep.
02:24:51.000 Eternity on both ends.
02:24:52.000 The universe is eternal.
02:24:53.000 Why can't we be?
02:24:54.000 That's my question.
02:24:55.000 Well, I think consciousness probably goes to sleep forever, but I think you become a part of...
02:24:59.000 Is it?
02:24:59.000 Maybe consciousness is really a tool to create action.
02:25:02.000 Maybe it's a tool to move things forward.
02:25:05.000 There's no doubt that it is, but it's a tool that found out that it enjoys its own...
02:25:08.000 It wants to persist.
02:25:08.000 It likes blowjobs.
02:25:09.000 It enjoys itself.
02:25:10.000 No, it enjoys itself.
02:25:11.000 It likes to get drunk.
02:25:12.000 We're self-referential.
02:25:13.000 It's that recursive feedback loop.
02:25:14.000 We know that we know that we know.
02:25:16.000 And therefore, consciousness, if it was a fluke or if it was by emergent design, it has decided that it likes itself.
02:25:23.000 It likes free time.
02:25:26.000 It likes to make art and sing songs.
02:25:28.000 Not everything that it does is to build things and to be utilitarian and functional.
02:25:32.000 Some things are pure pleasure.
02:25:34.000 Like the robots in Blade Runner.
02:25:35.000 The pleasure of being.
02:25:36.000 They like being alive as well.
02:25:37.000 There you go.
02:25:37.000 That could be what it is, right?
02:25:40.000 Yeah.
02:25:40.000 Rutger Hauer, remember?
02:25:41.000 He really liked it.
02:25:42.000 He was bummed out.
02:25:43.000 Yeah, crazy. Kangaroos.
02:25:48.000 Kangaroos.
02:25:49.000 Kangaroos who eat a flower that came from another planet.
02:25:52.000 I'm telling you, man.
02:25:53.000 Super intelligent kangaroos.
02:25:55.000 Would that be the shit?
02:25:57.000 If kangaroos start yelling at you for fucking polluting?
02:26:00.000 Kangaroos start talking English like really quick within a couple of years?
02:26:03.000 And they wear Catholic schoolgirl outfits.
02:26:05.000 You know what's amazing about kangaroos is that they continue to raise...
02:26:08.000 They have that thing, you know, the pouch where they put their little babies.
02:26:12.000 But it's almost like when the baby's born, it's almost like not ready to be born.
02:26:15.000 And so they keep them in there.
02:26:17.000 And that's kind of like...
02:26:18.000 Well, they live in a terrible environment.
02:26:19.000 Yeah.
02:26:20.000 I mean, they have to protect that fucking thing.
02:26:21.000 It's kind of amazing.
02:26:22.000 They're living with these crocodiles everywhere.
02:26:24.000 Kangaroos everywhere.
02:26:25.000 That's a bad spot.
02:26:27.000 Australia's a shady fucking spot.
02:26:29.000 The most venomous snakes in the world are there.
02:26:30.000 Oh, they got all kinds of shit that can kill you.
02:26:32.000 And most of the country you can't live in.
02:26:33.000 Most of the country nobody lives in.
02:26:35.000 They live around the coast.
02:26:36.000 I think that's happening more and more and more and more people moving to cities, man.
02:26:40.000 Most of the population lives in cities and will continue to live in cities.
02:26:44.000 They found fossils of ancient simple organism shit that is some 650 fucking million years old.
02:26:53.000 Something insane.
02:26:54.000 It predated the idea of life on this planet and they found it in Australia.
02:26:59.000 Yeah, Australia is a crazy spot.
02:27:02.000 It would be fun to go.
02:27:03.000 Have you been?
02:27:04.000 Yeah, I've been a few times.
02:27:05.000 I've been to Sydney twice.
02:27:06.000 Great people.
02:27:07.000 Really fucking nice people, man.
02:27:09.000 That's what I hear.
02:27:10.000 And they have no ozone over there, man.
02:27:11.000 They got real cancer problems.
02:27:13.000 Right.
02:27:13.000 Like, all over their billboards are these pictures.
02:27:16.000 Really?
02:27:16.000 They have, like, these skin cancer campaigns.
02:27:18.000 So there's photos of people with big, giant, stitched-up scars.
02:27:21.000 Oh, bummer.
02:27:21.000 Because everyone's getting chunks taken out of them.
02:27:24.000 Like, you go out there in the sun with no sunscreen on, you get fucked up, man.
02:27:30.000 It's important to wear sunscreen.
02:27:30.000 Yeah, it's another level sun with no ozone layer.
02:27:34.000 Wow.
02:27:34.000 There's a giant hole in the ozone over there.
02:27:35.000 It's so long to get there, right?
02:27:36.000 Like, 17 hours or something?
02:27:38.000 Not quite.
02:27:38.000 I think 15, something like that.
02:27:40.000 I want to go.
02:27:41.000 It's a lot.
02:27:42.000 I want to go.
02:27:42.000 But you know what?
02:27:43.000 If you can deal with that just one day, I mean, you know, what you do is, man, get a bunch of podcasts.
02:27:49.000 Get a bunch of podcasts.
02:27:50.000 Get a few movies on your iPad and just zone out and just...
02:27:53.000 Go zen and say, this is what I'm doing.
02:27:55.000 And don't freak out and don't feel like, fuck, I've got to get up and move around.
02:27:58.000 Just deal with it.
02:27:59.000 Ain't no big a deal.
02:28:01.000 And then once you get there, holy shit.
02:28:03.000 It's a beautiful, beautiful country, man.
02:28:04.000 It's the Wizard of Oz.
02:28:05.000 It's so gorgeous.
02:28:06.000 It's so amazing when you think that the people from England used that as a prison colony at one point in time.
02:28:13.000 What a silly idea.
02:28:14.000 Fascinating.
02:28:14.000 Now all their actors win all the awards.
02:28:17.000 Yeah, I wonder why they're so good.
02:28:18.000 Yeah, Australian and British actors, man.
02:28:20.000 They make good stand-ups, too.
02:28:22.000 There's a few. Jim Jefferies is a really funny stand-up from Australia.
02:28:26.000 Yeah, and there's a couple Americans that do really well over there, like Eddie Ifft and Arj Barker.
02:28:29.000 They go over there, and they're...
02:28:31.000 Arj Barker's fucking enormous over there, apparently.
02:28:33.000 Yeah, yeah, and they just...
02:28:35.000 When I was in Australia, I was talking to people, like, what do you do?
02:28:38.000 I'm like, oh, I'm a comedian.
02:28:39.000 They're like, you know Arj Barker?
02:28:40.000 Like, immediately.
02:28:42.000 Wow.
02:28:42.000 Yeah, yeah.
02:28:44.000 It's so close to, like, it's nothing like America, but you could totally hang out there.
02:28:50.000 Like, you could live there.
02:28:51.000 Like, you wouldn't have to learn a language.
02:28:52.000 People are very friendly.
02:28:53.000 You'd have no problems.
02:28:54.000 It feels slightly alternate universe.
02:28:56.000 Yes, very much so.
02:28:57.000 Because it's like, they speak English, but it's just another reality.
02:29:00.000 Well, it's so far away that they drive on the other side of the road.
02:29:04.000 Right.
02:29:04.000 That freaks you out.
02:29:05.000 Totally.
02:29:06.000 That whole England thing is a trip.
02:29:07.000 Alice in Wonderland, man.
02:29:09.000 It's like, why are you on this side?
02:29:10.000 That's why traveling is so cool, though, for shifting people's sense of reality.
02:29:14.000 Oh, yeah.
02:29:15.000 Expanding your consciousness because you're immersed in a sort of mirror world.
02:29:18.000 Yeah.
02:29:19.000 Where it's like, well, most things are kind of the same, but they're a little off.
02:29:22.000 Yeah.
02:29:23.000 So it's like reality has shifted a little bit.
02:29:24.000 I think traveling is so important.
02:29:26.000 Well, it's cool to see a culture like Australia where socially it's really kind of parallel to America, like really similar.
02:29:33.000 I mean, not exact, but really similar.
02:29:35.000 If you go over there and you meet an Australian guy who's your age, chances are you're going to have a lot of things to talk about in common.
02:29:41.000 It'd be different, but not alien at all.
02:29:43.000 Yeah.
02:29:43.000 So it's kind of interesting that that is happening on the other side of the planet.
02:29:46.000 There's some sort of a really modern civilization.
02:29:51.000 And they're like, these people live here, and they dream here, and they wake up, and they go to work here.
02:29:55.000 Skyscrapers, nice cars.
02:29:55.000 And they've been living their entire lives with a whole different set of priorities that have no bearing
02:29:59.000 on my existence and that I didn't know about it until I came here.
02:30:02.000 What really makes you trip out is when you watch their TV shows and they have like really popular TV shows you fucking never heard about.
02:30:08.000 Parallel world, man.
02:30:08.000 And a guy comes out and a girl comes out.
02:30:10.000 You know, who the fuck are these people?
02:30:11.000 Everybody goes crazy.
02:30:12.000 Everybody goes crazy.
02:30:13.000 It's the number one show in Australia, mate.
02:30:14.000 Right.
02:30:14.000 I can't believe they're really on right now.
02:30:17.000 Everybody will sit down and get drunk.
02:30:18.000 It's amazing.
02:30:19.000 It gives you perspective also.
02:30:21.000 It unhinges you from your reality.
02:30:22.000 It unhinges you a little bit.
02:30:23.000 The one thing that I consistently get when I go to these places is how uptight America is.
02:30:28.000 When you go, especially in Australia, they're so fun and they're so easy to hang out with and so generally friendly.
02:30:36.000 It makes you feel like, what's responsible for this level of tension in America?
02:30:40.000 Yeah, I don't know.
02:30:42.000 It's not everywhere, by the way.
02:30:43.000 Of course, there's a lot of cool people in America.
02:30:44.000 Don't get me wrong.
02:30:45.000 I get a lot of douchebag dummy tweets like, why don't you fucking go move to Afghanistan if you hate America?
02:30:50.000 It's not that I hate America, stupid.
02:30:52.000 It's that I love America and I think America should be awesome.
02:30:56.000 I mean, it is awesome, but it should be better than what it is.
02:30:58.000 It's possible.
02:30:59.000 For us to improve.
02:31:00.000 What holds us back is fuckheads like you.
02:31:03.000 That's what holds us back.
02:31:04.000 Twitter angry people.
02:31:05.000 There should be no reason why the cutting edge should be uptight about things.
02:31:09.000 Particularly social issues.
02:31:10.000 We need to completely legalize gay marriage everywhere.
02:31:14.000 We need to legalize marijuana everywhere.
02:31:15.000 You need to debate Rick Santorum because he disagrees.
02:31:18.000 Rick Santorum did have a really interesting point, though, I've got to admit.
02:31:20.000 I mean, I am always 100% in support of gay marriage.
02:31:25.000 I think you should be able to do whatever the fuck you want to do.
02:31:27.000 Of course.
02:31:27.000 It's not hurting me.
02:31:28.000 It's not a scam, and it's not hurting me.
02:31:30.000 It's not like you're trying to steal money, and it's not hurting me.
02:31:32.000 So I'm completely in support of that, but he had a really interesting point, that Rick Santorum, because he was talking about how marriage has always, for over a thousand years, been defined as a man and a woman.
02:31:43.000 Now, all of a sudden, you're calling it marriage, but it's a man and a man.
02:31:46.000 Can it be a man and two men?
02:31:49.000 And I was like, oh, shit.
02:31:50.000 Like, he just flipped it on its head.
02:31:52.000 Like, he really did.
02:31:52.000 It was a really good point.
02:31:54.000 And I was like, well, yeah, well, why can't it be two men?
02:31:56.000 Why can't two men get married to a guy?
02:31:59.000 Why can't everybody just bang each other?
02:32:01.000 Why not?
02:32:03.000 Yeah, but he was...
02:32:04.000 And the women in the audience were saying, no, that's a different scenario.
02:32:07.000 You're talking about a couple that's in love.
02:32:10.000 And she's like, well, no, what if these people are all in love?
02:32:12.000 There's three of them.
02:32:13.000 What if they are?
02:32:13.000 What if, you know, can it be two women and a man?
02:32:15.000 Can it be two men and a woman?
02:32:17.000 And then, you know, he just fucked them up, man.
02:32:19.000 He just fucked them up.
02:32:20.000 There's nothing they can say then.
02:32:21.000 Because, you know, he's really right.
02:32:22.000 Like, first of all, as a personal freedom issue, I'd feel like you should be able to do whatever you want if it's not hurting me.
02:32:27.000 Clearly gay marriage is not hurting me.
02:32:30.000 So do whatever the fuck you want.
02:32:31.000 But if you want to call it marriage, like maybe they should call it...
02:32:34.000 Maybe it should be something different.
02:32:36.000 Maybe it's marriage, but it's gay marriage.
02:32:38.000 Oh, we're gay married?
02:32:39.000 No.
02:32:40.000 Like me and my partner are gay married?
02:32:41.000 You wouldn't be able to say regular married?
02:32:43.000 Oh, we're triple married.
02:32:45.000 Oh, there's three of you?
02:32:46.000 Yeah.
02:32:47.000 What would you call that?
02:32:48.000 If you made that legal, what would you call that?
02:32:50.000 Domestic partnership.
02:32:52.000 They're all domestic partners.
02:32:53.000 What if they want to call them married?
02:32:55.000 They want to call them whatever they want to.
02:32:57.000 And then they get all the benefits of society and they want to do tax stuff together and all those things that people want.
02:33:04.000 Why not?
02:33:05.000 Corporations can have hundreds of employees or thousands of employees.
02:33:08.000 Maybe marriages should be able to as well.
02:33:09.000 So 100,000 people in a marriage.
02:33:11.000 So you're down with, like, they can be mini nation-states of marriage-hood.
02:33:14.000 Mormon style, polygamy.
02:33:16.000 I got a Time magazine at home.
02:33:17.000 Well, no, no, no.
02:33:17.000 That's different because those are getting minors involved, right?
02:33:20.000 And coercing people and imposing reality tunnels and closing access to other media.
02:33:25.000 It's different.
02:33:26.000 I got a Time magazine at home, and there's a guy on the cover.
02:33:28.000 It's one of those last holdout old man Mormon dudes who has a gang of wives.
02:33:34.000 That's so true.
02:33:35.000 He's still rocking it.
02:33:36.000 There's one dude, nine wives, and he's got 46 children.
02:33:40.000 What the fuck?
02:33:42.000 What the fuck?
02:33:42.000 He should be thrown in jail.
02:33:44.000 What the fuck, man?
02:33:45.000 Wow, that's not good.
02:33:47.000 Yeah, that's crazy.
02:33:48.000 Did you know that a lot of those guys, we talked about this before, who is it when we brought up this, that they went to Mexico, that a lot of Mormons were traveling to Mexico, and they were having problems with the cartels.
02:33:57.000 They established these polygamous communities down in Mexico.
02:34:00.000 Oh, wow.
02:34:00.000 Fascinating.
02:34:01.000 And now they're having problems with the drug cartels.
02:34:03.000 Someone was assassinated recently, remember that?
02:34:05.000 Yeah.
02:34:06.000 I remember who brought it up.
02:34:07.000 One of our guests brought it up.
02:34:08.000 I'm like, wow, I had no idea that that was even going down.
02:34:11.000 They've set up these alternative communities down in Mexico.
02:34:15.000 Do you ever think about that?
02:34:17.000 What if somebody just decided to turn Costa Rica into goddamn paradise?
02:34:21.000 They're trying to do it.
02:34:22.000 The world's all fucked up everywhere else, but here's the deal.
02:34:25.000 We have a limited government.
02:34:26.000 We're going to establish the best schools possible.
02:34:29.000 Libertarian schools.
02:34:30.000 The best social care possible, the best health care, the best community centers where we have people who are set up to take care of stray children and really create a society.
02:34:43.000 They're trying to do it.
02:34:44.000 Where?
02:34:44.000 There's a guy called Patry Friedman who had this thing called the Seasteading Institute.
02:34:49.000 Which is an organization and they're backed by like Peter Thiel and everything.
02:34:52.000 Is that the giant island?
02:34:53.000 Yeah, to make these artificial man-made islands where we can do practice runs of futuristic versions of governance and they can be in international waters.
02:35:01.000 But now they're doing something with a Central American nation where the nation has given them a chunk of land to let them set up their own autonomous zone.
02:35:09.000 Where was this?
02:35:09.000 In Central America.
02:35:10.000 I don't know if it was Nicaragua or Guatemala or one of those.
02:35:13.000 But they're going to try it.
02:35:15.000 There's been all these articles about it and they're going to test out futuristic, cutting-edge forms of government.
02:35:19.000 See, the only thing I worry about is one of the beautiful things about doing things in America, even though you're under the shadow of the military-industrial complex, is that it's fairly safe.
02:35:27.000 Yes.
02:35:28.000 You know, it's fairly safe here.
02:35:29.000 Yes, yes, very much.
02:35:31.000 Unless you're...
02:35:33.000 I mean...
02:35:34.000 Where are you going to recreate that?
02:35:36.000 Where are you going to recreate that?
02:35:37.000 You're not going to do it in open waters.
02:35:38.000 Because if you do it in international waters, what are these Somali pirate dudes?
02:35:41.000 You hear about that shit every day.
02:35:42.000 Yeah, yeah, yeah.
02:35:42.000 What are you going to do?
02:35:43.000 They've got a bunch of peaceniks.
02:35:44.000 There's a lot to think about.
02:35:45.000 A floating spot out there.
02:35:46.000 Yeah.
02:35:47.000 A big concrete floating jungle with what?
02:35:48.000 Paul, the security guard who patrols the perimeter.
02:35:51.000 They shoot Paul in the head.
02:35:52.000 Fucking take over.
02:35:54.000 God damn it.
02:35:54.000 Jason Silva, you got to be ready for war.
02:35:56.000 Yeah, I know.
02:35:57.000 I mean, I think there's obviously logistical things that have to be addressed, but it's a very ambitious idea to begin with.
02:36:02.000 It's very ambitious.
02:36:03.000 You would have to have guns.
02:36:05.000 You probably would.
02:36:06.000 You have to have laser beams.
02:36:08.000 Just get that trash pile and live in the middle of it.
02:36:11.000 That's not a bad idea.
02:36:12.000 No one wants to get near that thing.
02:36:13.000 Yeah, you'd hear people coming, too.
02:36:14.000 But then where are you going to get your water from?
02:36:17.000 Your water?
02:36:18.000 What do you need from the ocean?
02:36:19.000 Desalinization?
02:36:20.000 Of course.
02:36:21.000 Is that what they're trying to do?
02:36:22.000 Dean Kamen invented a lot of these water purification systems, man. You can put in, like, bacteria, infection, almost poison into the water, and it comes out ready to drink.
02:36:36.000 Wow.
02:36:36.000 And they have this new unit that they're going to be taking to like rural parts around the world.
02:36:41.000 To these little villages.
02:36:43.000 And that's like the number one cause of disease and illness.
02:36:46.000 Dirty water, right?
02:36:47.000 They're these small, self-powered devices, and they last forever.
02:36:52.000 Dean Kamen and his water products, water filtration stuff, people should Google.
02:36:56.000 How far?
02:36:57.000 He's a genius.
02:36:58.000 No, no.
02:36:58.000 He already has this new design.
02:37:00.000 I wasn't going to say that.
02:37:01.000 I was going to say, how far are they on the island thing?
02:37:03.000 The artificial island?
02:37:05.000 I don't know.
02:37:05.000 I think funding.
02:37:06.000 They need funding to build it.
02:37:08.000 Of course.
02:37:08.000 Who the hell is going to pay for that?
02:37:09.000 Like, bitch, what?
02:37:10.000 Well, techno-philanthropists.
02:37:12.000 New internet age billionaires.
02:37:13.000 They have the resources.
02:37:14.000 It would have to be a lot of money.
02:37:16.000 How much would it cost to build a fake island?
02:37:17.000 I have no idea.
02:37:18.000 Probably a lot of fucking money, man.
02:37:19.000 Probably a lot of money.
02:37:21.000 I was watching a documentary on the Japanese airport that they had created, and it's on an artificial island.
02:37:29.000 An artificial island that they've created, but it's slowly sinking into the sea, so they have this elaborate system of lifts: as it sinks, they raise it up to keep it level.
02:37:38.000 Wow.
02:37:38.000 It's fucking nuts.
02:37:39.000 I mean, what a marvel of engineering.
02:37:42.000 Engineering is just magnificent when you think about it.
02:37:45.000 It's amazing.
02:37:46.000 It's incredible.
02:37:46.000 I love looking at engineering in nature and comparing it to engineering made by man, and you see how there's certain patterns that align.
02:37:52.000 And here we are, we're like, oh, you know, we thought of this.
02:37:55.000 But then it's like, oh, but it matches this pattern that nature came up with, too.
02:37:59.000 And what you realize is that a good idea is a good idea whether you came up with it blindly, like nature, or whether you came up with it consciously, like man.
02:38:06.000 A good idea is a good idea.
02:38:07.000 If it works, it works.
02:38:09.000 That's what's amazing.
02:38:10.000 Have you ever seen when they take a colony of leafcutter ants and they expose their entire underground structure by filling it with cement?
02:38:18.000 Technology, man.
02:38:19.000 They have their own technology.
02:38:21.000 Holy shit.
02:38:21.000 That's their extended phenotype, man.
02:38:22.000 It's really kind of a fucking disturbing thing to watch, though, because it's kind of ant genocide you're, like, looking at.
02:38:27.000 I mean, they eventually cemented everybody in there.
02:38:29.000 It is very sad.
02:38:31.000 I mean, I don't really give a fuck about ants, but it's kind of crazy that they're willing to just cement the shit out of their houses just to find out how big the house is.
02:38:39.000 If you haven't seen it, folks, just Google it.
02:38:41.000 What is it, Brian?
02:38:43.000 Leafcutter ants?
02:38:43.000 Just pull that video up because it's astounding to look at.
02:38:46.000 Just pull up Leafcutter Ant Colony Exposed.
02:38:51.000 And these scientists, they found out that not only do they have these intricate structures, but they have vents set up so that where they bring in, like, funguses and things that are rotting, the gases can rise up out through the air.
02:39:11.000 It's insane.
02:39:11.000 So it doesn't pollute their...
02:39:12.000 There's so much emergent intelligence in the design.
02:39:16.000 But do you know what the most amazing part is?
02:39:17.000 There's no one in control.
02:39:19.000 It's all decentralized.
02:39:20.000 It's all a bunch of individual local interactions happening simultaneously that together exhibit emergent phenomena and emergent complexity.
02:39:28.000 It's like a beehive.
02:39:29.000 It's like Occupy Wall Street.
02:39:30.000 Beehives exhibit self-organization that emerges when all these billions of bees are working together to create this intelligent behavior.
02:39:36.000 But no individual bee itself is intelligent.
02:39:39.000 That's amazing.
02:39:40.000 Now they're saying that our neurons are the same, that we're not like a singular consciousness, but billions of non-intelligent neurons that together create synchronous transcendent effects.
02:39:49.000 Consciousness emerges from the interactions of billions of neurons, individual, local relationships happening in different parts of the brain.
02:39:56.000 That totally makes sense.
02:39:57.000 Yeah.
02:39:57.000 So our brains are like ant colonies.
02:39:59.000 Our brains, our neurons are like the ants in the ant colony, and then we are the emergent behavior.
02:40:04.000 It's what comes out.
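The idea being described here, complex global behavior arising from simple local rules with no one in control, can be shown with a minimal sketch. The snippet below is not a model of ant colonies or neurons specifically; it's Conway's Game of Life in Python, a standard toy example of emergence. The grid size, step count, and wrap-around edges are arbitrary choices made for the illustration.

```python
# A minimal sketch of emergence: Conway's Game of Life.
# Each cell only "sees" its eight neighbors; nothing sees the whole grid,
# yet ordered global patterns (gliders, blinkers, blocks) appear anyway.

import random

SIZE = 20    # grid is SIZE x SIZE, wrapping at the edges
STEPS = 5    # how many generations to run

def live_neighbors(grid, r, c):
    """Count live cells in the 8 surrounding positions (toroidal wrap)."""
    total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            total += grid[(r + dr) % SIZE][(c + dc) % SIZE]
    return total

def step(grid):
    """Apply the same local rule to every cell simultaneously."""
    new = [[0] * SIZE for _ in range(SIZE)]
    for r in range(SIZE):
        for c in range(SIZE):
            n = live_neighbors(grid, r, c)
            # Birth on exactly 3 neighbors; survival on 2 or 3.
            new[r][c] = 1 if (n == 3 or (grid[r][c] == 1 and n == 2)) else 0
    return new

if __name__ == "__main__":
    grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
    for _ in range(STEPS):
        grid = step(grid)
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
```

Run it a few times and recognizable structures show up even though no cell, and no central controller, ever looks at the grid as a whole, which is the "no one in control" point being made in the conversation.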
02:40:05.000 Yeah, I've always said that it's ridiculous to think that human beings can ever be separate, because that's the worst thing they could do to you in prison.
02:40:12.000 The worst thing they could do to you in prison is separate you from the general population and put you in solitary.
02:40:17.000 Nobody can talk to you.
02:40:18.000 You're just by yourself.
02:40:19.000 And that's crazy for people.
02:40:20.000 Well, that's like cutting your arm off.
02:40:22.000 Yes, it's alien to our goal and purpose and our desires on Earth.
02:40:27.000 100%.
02:40:28.000 Yeah, so it's obvious that we are engineered for a reason, or at least...
02:40:32.000 For feedback.
02:40:32.000 Yeah.
02:40:33.000 For interaction.
02:40:33.000 Yes.
02:40:34.000 Interaction and feedback.
02:40:35.000 Everything is feedback loops, dude.
02:40:37.000 Everything.
02:40:38.000 Well, that's why these kind of conversations are so exciting.
02:40:40.000 Yes, exactly.
02:40:40.000 Because you turn my brain into an area that it might not have gone into.
02:40:45.000 Likewise.
02:40:45.000 And then we start expanding in that area.
02:40:48.000 Yeah, there's a book by Matt Ridley called The Rational Optimist, and he coined the phrase idea sex.
02:40:52.000 And he says that ideas coming together in open, liquid networks, open channels of communication, are akin to genetic recombination in nature.
02:40:59.000 It's genes being in the primordial soup, mixing and completing each other and interacting.
02:41:03.000 It's all a giant algorithm, right?
02:41:04.000 And it's happening with ideas.
02:41:05.000 Yeah, ideas are intermingling and having sex with one another, but they're creating more change and at a rate that is unheard of.
02:41:11.000 By the gene pool.
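Ridley's recombination metaphor maps loosely onto the crossover step of a genetic algorithm, where offspring are built by splicing together pieces of two parents and occasionally mutating the result. The sketch below is just that generic crossover-plus-mutation operator in Python, offered as an illustration of the metaphor rather than anything from Ridley's book; the bit-string "ideas" and the mutation rate are invented for the example.

```python
# A toy illustration of recombination: two parent bit-strings ("ideas")
# are crossed over and lightly mutated to produce a child that mixes both.

import random

def crossover(parent_a: str, parent_b: str) -> str:
    """Single-point crossover: prefix from one parent, suffix from the other."""
    assert len(parent_a) == len(parent_b)
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(genome: str, rate: float = 0.05) -> str:
    """Flip occasional bits, the analogue of a small random tweak to an idea."""
    return "".join(
        ("1" if bit == "0" else "0") if random.random() < rate else bit
        for bit in genome
    )

if __name__ == "__main__":
    idea_a = "1111000011110000"
    idea_b = "0000111100001111"
    child = mutate(crossover(idea_a, idea_b))
    print("parent A:", idea_a)
    print("parent B:", idea_b)
    print("child:   ", child)
```

The point of the analogy is the operator, not the data: whether the units are genes or ideas, recombination produces novel combinations far faster than tweaking one lineage in isolation.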
02:41:12.000 If we could look at the interactions of human behavior and thought and language, if we could look at all that stuff as numbers and look at it as energy and something that could be quantified, instead of looking at it as our own product, instead of looking at it as something that we have done, if we could just look at it entirely on its own, we would see a completely different picture, wouldn't we?
02:41:32.000 Well, if we take the long view.
02:41:34.000 We're a caterpillar, man.
02:41:35.000 We're a caterpillar about to become a butterfly.
02:41:37.000 That's what we're doing.
02:41:38.000 We're making some crazy fucking cocoon right now.
02:41:40.000 We don't even know what we're doing.
02:41:41.000 Transform everything, man.
02:41:43.000 And we know it's possible because the caterpillars did it.
02:41:46.000 Caterpillars do it.
02:41:47.000 It exists.
02:41:48.000 It's not beyond the laws of physics for completely radical self-transformation.
02:41:52.000 You fucking blew my mind again, man!
02:41:56.000 You blew everybody's mind again.
02:41:58.000 It's a podcast.
02:41:59.000 Essentially, I gotta think we should stop it right there.
02:42:01.000 Because that's about three hours.
02:42:02.000 Oh, that was magnificent.
02:42:04.000 Wasn't it about three hours?
02:42:05.000 Somewhere in there?
02:42:06.000 Yeah.
02:42:06.000 Two hours, 40 minutes?
02:42:07.000 Something like that?
02:42:08.000 Thanks again, brother.
02:42:09.000 Thank you, man.
02:42:10.000 You're very, very stimulating to talk to.
02:42:12.000 Thanks, man.
02:42:12.000 It's one of those...
02:42:13.000 So are you.
02:42:13.000 So are you guys.
02:42:14.000 We have these conversations and it just, you know, you walk out and drive home and just go, what the fuck, man.
02:42:19.000 Thanks for having me, man.
02:42:20.000 What is next?
02:42:21.000 You're so awesome at passionately infusing these ideas in other people's heads.
02:42:26.000 Thanks, brother.
02:42:27.000 You have a way of...
02:42:29.000 When you get fired up about shit, everybody around you is like, oh, yeah, yeah.
02:42:34.000 Yeah, dude.
02:42:34.000 Thanks, man.
02:42:35.000 It's infectious.
02:42:36.000 Very infectious.
02:42:36.000 Thanks, brother.
02:42:37.000 And if people want to find you on Twitter, it's Jason underscore Silva.
02:42:39.000 If they want to find you online, it is thisisjasonsilva.com.
02:42:44.000 And all of his upcoming lectures and all that.
02:42:47.000 Is there anything that people can see?
02:42:48.000 Like, are there any places where the average person can go and buy a ticket and watch you perform live?
02:42:53.000 The Economist Ideas Festival happening in Berkeley on March 28th is on innovation, and people can Google Economist Ideas Festival.
02:42:59.000 Anyone can go to that.
02:43:00.000 You can buy tickets for that.
02:43:01.000 So you don't have to work at Google.
02:43:03.000 Yes, and on April 20th, when I speak at the National Arts Club, they have a website.
02:43:07.000 You should be able to look it up.
02:43:08.000 I think it's about Dreaming Unlimited or something like that in New York City.
02:43:11.000 And all this information is on ThisIsJasonSilva.com?
02:43:14.000 Yeah, and I tweet about it all the time as the talks come up.
02:43:17.000 The PSFK Conference in New York at Battery Park on March 30th.
02:43:20.000 You can buy tickets to that. I'm going to speak at the University of Pennsylvania on April 2nd, actually.
02:43:25.000 Beautiful.
02:43:26.000 It's a class on psychedelics and visionary arts and stuff.
02:43:28.000 Dude, keep doing what you're doing.
02:43:30.000 I love it.
02:43:30.000 It's very exciting.
02:43:31.000 Thank you everybody for tuning into the podcast.
02:43:33.000 Thanks for all the positive tweets and messages and we love you too.
02:43:37.000 Thanks for everybody who already bought tickets for Atlanta April 20th.
02:43:41.000 The first show is pretty much sold out.
02:43:43.000 That will be when I record my next special.
02:43:45.000 So if you want to go, there will be tickets to the second show that will be available sometime this week.
02:43:50.000 Like I said, probably somewhere around Wednesday.
02:43:53.000 And I got a lot of other shit going on in the future too.
02:43:56.000 Louisville, Kentucky.
02:43:57.000 That's soon.
02:43:58.000 When is that, Brian?
02:43:59.000 Any ideas?
02:43:59.000 Louisville, Kentucky?
02:44:01.000 March 30th through April 1st is Louisville, Kentucky.
02:44:05.000 And then we're in...
02:44:07.000 Hermosa Beach is actually before that.
02:44:08.000 March 23rd and 24th at Hermosa Beach, the Comedy Magic Club.
02:44:12.000 One of my favorite clubs ever.
02:44:13.000 And then 420 in Atlanta.
02:44:16.000 420 is when I'm going to do my special.
02:44:17.000 It's so cliche.
02:44:18.000 Yeah, bro.
02:44:19.000 It's so cliche, as a corny pothead, I couldn't resist.
02:44:23.000 Thank you to the Fleshlight for tuning in and saving our souls with their plastic vagina.
02:44:28.000 Does that work?
02:44:29.000 No.
02:44:29.000 I need to come up with a new commercial.
02:44:31.000 Thank you to the Fleshlight for being there, for Lonely Boys.
02:44:34.000 For being easy.
02:44:35.000 Thank you for being, yeah, but not that easy to clean.
02:44:38.000 A little complicated.
02:44:39.000 No, it's nice.
02:44:39.000 It's super easy.
02:44:40.000 Yeah, you should, at the end of it, it should be like a garbage disposal.
02:44:44.000 It just eats loads and then turns it into love and sends it out through the universe.
02:44:48.000 Totally.
02:44:48.000 It's really easy if you like to suck your own cum out of it.
02:44:50.000 So, Brian, so not necessary.
02:44:55.000 Jason Silva's here.
02:44:56.000 He's a serious man.
02:44:57.000 You don't need to do that in front of him.
02:44:58.000 You fucking freak.
02:45:00.000 Thank you to Onnit.com.
02:45:01.000 O-N-N-I-T. Oh, yeah.
02:45:03.000 What did I say?
02:45:05.000 Fleshlight.
02:45:05.000 Entering the code ROGAN. 15% off.
02:45:07.000 You already know that.
02:45:08.000 You heard the first half of this fucking podcast.
02:45:10.000 Go to Onnit.com, enter in the code name ROGAN, save 10%.
02:45:12.000 There, it's over.
02:45:13.000 It's done.
02:45:13.000 Tomorrow, we have Aubrey Marcus, the artist formerly known as Chris, our friend who changed his fucking name.
02:45:21.000 That's how hard he tripped.
02:45:22.000 Wow.
02:45:22.000 He went to Peru and did ayahuasca and changed his fucking name.
02:45:26.000 You know, I have a friend who did that too, my friend Lyon.
02:45:30.000 Yeah, that ayahuasca.
02:45:31.000 Okay, fuck you up, son.
02:45:32.000 And Aubrey just got back from Costa Rica, where he went through a series of ibogaine experiences.
02:45:38.000 And now his name is Optimus Prime.
02:45:39.000 Yeah, now he's Mr. Manhattan.
02:45:41.000 And we're gonna meet him tomorrow, and he's gonna explain to us what the fuck is really going on with this crazy universe we're living.
02:45:48.000 Everything that has not been covered today will be covered tomorrow.
02:45:50.000 And then on Wednesday, we get Matt from Hoarders, he's Clutter Cleaner on Twitter, and he's the guy who cleans up the crazy people's houses.
02:45:58.000 And I'm really fascinated by that because I got a bit of a hoarder in me.
02:46:01.000 Just a pinch.
02:46:02.000 You do as well.
02:46:02.000 Yeah.
02:46:03.000 Yeah.
02:46:03.000 So we'll find out what that fucking psychosis is all about.
02:46:05.000 Jason Silva, you are the man, sir.
02:46:07.000 You are the man.
02:46:08.000 Thank you, guys.
02:46:08.000 Namaste.
02:46:09.000 Thank you, everybody.
02:46:10.000 We love you dirty bitches.
02:46:11.000 Oh, two shows this weekend in Pasadena.
02:46:13.000 Because I'm gearing up for my special, freaks.
02:46:16.000 So this Friday and Saturday, Friday night, when are we doing it?
02:46:19.000 Friday, 9 o'clock, Saturday, 10:30, IceHouseComedy.com.
02:46:22.000 Yes, and it's the annex.
02:46:23.000 It's a small room.
02:46:24.000 It always sells out in advance.
02:46:26.000 So if you want to get on this shit, IceHouseComedy.com.
02:46:29.000 Is that it?
02:46:29.000 Yeah, one Friday at 9 o'clock?
02:46:32.000 9 o'clock Friday, 10:30 on Saturday.
02:46:34.000 That's it.
02:46:35.000 It's over.
02:46:36.000 God bless America.