The Joe Rogan Experience


Joe Rogan Experience #284 - Daniel H. Wilson


Summary

This week on the Joe Rogan Experience, Joe and Brian sit down with author and roboticist Daniel H. Wilson to talk about his novel Robopocalypse, which is being adapted into a movie, along with drones, landmines, biomimetic robots, neural implants, and prosthetics. There's also a tangent about Onnit's new killer bee honey and whether the bees are going to come take it back. This episode is brought to you by Audible.com, DeathSquad.tv, and Onnit, makers of New Mood, Alpha Brain, Shroom Tech Sport, and Shroom Tech Immune. Use the code ROGAN to save 10% off Onnit supplements.


Transcript

00:00:03.000 Hey everybody, what the fuck?
00:00:07.000 Are we live, Brian?
00:00:08.000 Yes, we are.
00:00:09.000 Is this legitimate?
00:00:09.000 Uh-huh.
00:00:10.000 Ladies and gentlemen, the Joe Rogan Experience Podcast is brought to you today.
00:00:14.000 Today we have one of our regular sponsors, which is Audible.com.
00:00:19.000 Audible.com, if you haven't used it.
00:00:21.000 Oh, sweet.
00:00:22.000 If you use the website, audible.com forward slash Joe, it takes you to this website where it gives you free Audible for 30 days and you get a free audio book.
00:00:38.000 Welcome to my show!
00:00:57.000 And something where you're enjoying the book so much you don't even give a shit.
00:01:02.000 I can get completely caught up in an audiobook in like a four or five hour trip and the time just flies by and I actually enjoy it.
00:01:11.000 It's like literally a difference between not liking something, being like annoyed by something and actually enjoying it just because your time is just sitting in a chair taking in this book.
00:01:23.000 So it's great for commuting.
00:01:25.000 It's great for the same things I said, again, like podcasts are great for.
00:01:29.000 It's great for if you're just going to ride a bike for 40 minutes.
00:01:32.000 You're going to get on a stationary bike just to do exercise.
00:01:35.000 That shit can be really boring.
00:01:36.000 But if you could do it with either some great music or a great audio book, you can kill that time a lot better.
00:01:43.000 It's a great way to get in some interesting information.
00:01:46.000 And again, if you go to the website...
00:01:50.000 Audible.com forward slash Joe.
00:01:52.000 They give you 30 free days.
00:01:54.000 Look who I found on here.
00:01:56.000 Oh, the Robo Apocalypse.
00:01:57.000 That's by our friend Daniel H. Wilson.
00:01:59.000 And he's on the podcast today, folks.
00:02:03.000 You could talk during any of this.
00:02:05.000 It doesn't matter.
00:02:06.000 I'm not an apostle.
00:02:08.000 We're also brought to you by DeathSquad.tv.
00:02:11.000 That's Brian's website.
00:02:13.000 People always ask where to get these crazy cat t-shirts that say DeathSquad on them.
00:02:18.000 These are all Brian's creations, and you can get them on DeathSquad.tv.
00:02:23.000 The money goes to support.
00:02:38.000 We had Bobcat on last night and he just finished up on a Bigfoot movie.
00:02:41.000 We talked all about it.
00:02:43.000 You're going to love it, dude.
00:02:44.000 The whole time, I'm just like, Joe's going to freak out.
00:02:46.000 I'm going to download it on my phone before I leave here.
00:02:48.000 I'm going to listen to it on the car home.
00:02:49.000 I love that you can just do that.
00:02:51.000 I just love this world that we live in.
00:02:53.000 You know, that you can just do that.
00:02:54.000 I can just throw that on my phone.
00:02:57.000 You know, over the Wi-Fi.
00:02:58.000 It'll take a couple of minutes.
00:02:59.000 Go, get in my car.
00:03:00.000 Boom!
00:03:01.000 Listen to it all the way home.
00:03:02.000 We live in awesome times.
00:03:04.000 This is fucking badass.
00:03:06.000 We don't tolerate boredom.
00:03:07.000 Yeah, we don't tolerate it.
00:03:08.000 And you don't tolerate ignorance anymore either.
00:03:11.000 Get your shit together, stupid.
00:03:14.000 All right?
00:03:14.000 The fucking information's out there.
00:03:16.000 We're also sponsored by Onnit.com.
00:03:18.000 That's O-N-N-I-T. Makers of New Mood, Alpha Brain, Shroom Tech Sport, Shroom Tech Immune.
00:03:24.000 There's too many things to list now.
00:03:25.000 It gets silly if I try to list all of them because you're not really paying attention anymore.
00:03:29.000 You're just getting annoyed by me.
00:03:30.000 But what Onnit is, is a supplement company, a nutrition and fitness supplement company.
00:03:37.000 I think that's the best way to describe it.
00:03:39.000 And what we're trying to do is just sell you the very best shit possible at very reasonable rates.
00:03:45.000 Everything that we sell is literally the best quality supplements we can buy, whatever it is, whatever we find the best quality hemp.
00:03:54.000 It's more expensive than regular hemp powder, but it is the best shit.
00:03:57.000 It's the highest protein ratio.
00:03:59.000 It tastes the best.
00:04:01.000 We mix it with raw cocoa and maca.
00:04:04.000 And it's delicious.
00:04:05.000 Oh yeah, that's a new thing we have.
00:04:07.000 It's killer bee honey.
00:04:08.000 Oh, dude.
00:04:09.000 Yeah, it's killer bee honey.
00:04:10.000 Why?
00:04:11.000 I don't know.
00:04:11.000 Why not?
00:04:12.000 Dude, that seems scary.
00:04:14.000 That seems scary as fuck.
00:04:16.000 Yeah, I don't know.
00:04:17.000 It's probably not even good for you.
00:04:18.000 Who the fuck knows?
00:04:20.000 Evil bitches.
00:04:21.000 They're so evil, those bees, man.
00:04:23.000 We got their honey.
00:04:25.000 Ah, we ganked them.
00:04:26.000 It's like some gangster honey.
00:04:28.000 Imagine if they smelled it and they came and fucked you up because it was in your cupboard.
00:04:32.000 Oh, man.
00:04:32.000 Shit.
00:04:33.000 The killer bees realized, what, this motherfucker's stealing our honey in some sort of a way.
00:04:36.000 They try to mate with you?
00:04:37.000 All the bees try to fuck you in the ass because they're trying to mate with you?
00:04:40.000 I don't think they would do that, Brian.
00:04:41.000 They might.
00:04:42.000 I think they're a little small for that.
00:04:43.000 But the idea that they might come fuck you up because you stole their honey, that's a possibility.
00:04:48.000 Or it can make you immune to their stings.
00:04:51.000 I don't think it would do that.
00:04:52.000 I think you have to actually get venom.
00:04:54.000 The honey and the venom aren't related.
00:04:55.000 If you get a little bit of venom, I think you get used to it, right?
00:04:58.000 Isn't that what happens?
00:04:59.000 It's getting scary.
00:05:02.000 Scary?
00:05:02.000 What's scary?
00:05:05.000 Dude, don't be scared.
00:05:06.000 They're just little bees.
00:05:06.000 You smack them out of the sky.
00:05:09.000 I really don't think you have to worry.
00:05:10.000 I think we're talking with great ignorance, Brian.
00:05:13.000 Don't be scared.
00:05:14.000 It's just honey.
00:05:14.000 These fucking bees have no idea that you have this honey.
00:05:18.000 They're stupid as fuck.
00:05:19.000 They don't know about jars.
00:05:22.000 I don't know about none of that shit, man.
00:05:24.000 They all work together and spin the wheel?
00:05:26.000 Dude, you do not have to worry about that.
00:05:27.000 Get the cap off?
00:05:28.000 Get yourself some Vitamin Sun, some Alpha Brain, throw down some kettlebells and get all manly if you want to buy kettlebells.
00:05:35.000 We have the best quality kettlebells that we can sell at the cheapest possible rates.
00:05:40.000 It's a very problematic thing to sell kettlebells, to be a small company and try to sell them, because it's all about the shipping.
00:05:48.000 You're sending these giant steel balls in the mail.
00:05:52.000 But we're getting the best quality shit when you buy it.
00:05:56.000 You don't ever have to replace it.
00:05:59.000 They last forever.
00:06:00.000 These fucking things are amazingly well made.
00:06:03.000 And they're some of the best workouts you can ever do as far as functional strength and fitness.
00:06:08.000 As far as the ability to apply it in sports and athletics, I think the kettlebells are the best.
00:06:14.000 I think it's one of my favorite things to do.
00:06:16.000 You don't really need that much exercise equipment.
00:06:19.000 You can do bodyweight squats, chin-ups, and kettlebell exercises.
00:06:23.000 It's really almost all you need.
00:06:25.000 If you want to be a bad motherfucker like Brian Redban.
00:06:28.000 Aren't you kettlebelly, Brian?
00:06:30.000 No.
00:06:30.000 No kettlebelly?
00:06:31.000 I'm trying to make sure my cat's not peeing on my couch right now.
00:06:33.000 That's the hard thing in Brian's life.
00:06:37.000 Use the code name ROGAN and save 10% off any and all of the Onnit supplements.
00:06:43.000 Alright, you dirty bitches.
00:06:45.000 Daniel Wilson's here.
00:06:46.000 We're going to get down to business.
00:06:47.000 We're going to find out about robots and shit and whether or not we should be freaking out right now.
00:06:51.000 Cue the music, Brian.
00:06:53.000 Check it out!
00:06:54.000 The Joe Rogan Experience.
00:06:56.000 Train by day!
00:06:57.000 Joe Rogan Podcast by night!
00:06:59.000 All day!
00:07:01.000 Powerful Daniel H. Wilson.
00:07:04.000 How are you a robotics expert?
00:07:07.000 Was it all in research for this book?
00:07:09.000 Do you have a background in robotics?
00:07:11.000 Yeah, as a kid I got really into robots and then I studied computer science.
00:07:16.000 And then while I was doing that I found out that instead of just programming computers, you could actually teach them how to learn the answer on their own, artificial intelligence.
00:07:26.000 That there was science fiction that you could study for real, the nerd in me really went for it.
00:07:33.000 It was exactly like if you're playing a role-playing game and you have a character sheet and you're picking the skills.
00:07:39.000 I saw roboticist and I said, oh well, I'll level up in that.
00:07:43.000 I think that would be great.
00:07:45.000 So it was just like you had a dream job as a little kid and you just got lucky that it was an actual real job.
00:07:51.000 Yeah, basically.
00:07:52.000 I mean, I like science fiction, I like the science, and then whenever I got...
00:07:56.000 I finished this computer science thing and I didn't want to...
00:07:59.000 I mean, forget going into the real world and getting a job.
00:08:02.000 Yeah, fuck that.
00:08:03.000 I went to grad school and studied robotics.
00:08:05.000 When I finished that, I started writing books about robots.
00:08:08.000 Wow, that's awesome.
00:08:10.000 This book that you wrote, Robopocalypse, is that what it is?
00:08:13.000 Yeah.
00:08:14.000 This is going to get turned into a movie?
00:08:16.000 Yeah.
00:08:16.000 Are you going to scare the fuck out of people, man?
00:08:18.000 You know, I've had this described to me from very credible sources as saving Private Ryan with robots.
00:08:25.000 Oh, Jesus Christ.
00:08:26.000 It's going to be fucking intense.
00:08:29.000 Is this something that you just wrote about because it's a fascinating piece of fiction to pursue?
00:08:36.000 Or is this something that you think is actually possible?
00:08:39.000 Do you ever consider the idea that robots could try to take over the Earth?
00:08:43.000 Well, man, I have really mixed feelings about this because I spent all this time with roboticists.
00:08:48.000 We were building robots, we all had our own research, and we were definitely trying to help people.
00:08:54.000 None of us were evil that I knew of, you know?
00:08:57.000 And so then I go and I get done and I write this book where robots are killing everybody.
00:09:01.000 So I made it as realistic as I could based on everything I know.
00:09:06.000 So there are no robots from outer space.
00:09:09.000 There's no time travel.
00:09:11.000 This is all based on stuff that we either have already or we're going to have soon.
00:09:15.000 So it's the most realistic version that I could come up with.
00:09:18.000 But that said, I don't really think that the robots are going to, you know, join together under a sentient artificial intelligence and then try to wipe us out as a species.
00:09:29.000 I always wonder, because I always felt like there were certain things that, the instincts that human beings had that lead us to war and lead us to feats of ego and craziness and psychosis, and I always felt that they were,
00:09:44.000 a lot of them were wrapped around breeding, around the necessary things that need to be in place in order to reinforce the idea that it's competition to breed.
00:09:57.000 And that these things wouldn't exist in a computer because it wouldn't need them.
00:10:01.000 It wouldn't be inherent to the system the same way greed and ego is almost inherent to the human system to promote sexual conquest or to promote competition.
00:10:13.000 Yeah, I mean if you look at any of us that are sitting here that are alive, you gotta think that every single one of our ancestors, by hook or by crook, they lived long enough to make babies and to keep the babies safe.
00:10:24.000 And so you don't make it that long, like 200,000 years of homo sapien, without being a badass, right?
00:10:33.000 Anybody that was a little too soft, they're not here.
00:10:36.000 They didn't make babies.
00:10:38.000 And so that is a part of our DNA, literally, as human beings.
00:10:42.000 And the thing about building robots is...
00:10:45.000 I mean, you can make them any way you want, right?
00:10:48.000 If you want to build a robot that's going to have a sense of self-preservation, you can do that.
00:10:54.000 Well, there's also the crazy thought that in this pursuit, this mad pursuit of success that pushes people to do war and pushes people with great feats of ego, it's almost like that's necessary to ensure that there's some form of competition to make things move in the right direction.
00:11:19.000 But it doesn't seem like that would be inherent in a computer system.
00:11:23.000 I think, like, the douchey human behavior, like, we shouldn't think that it would, like, say, oh, we've got to wipe out all these people.
00:11:31.000 We have to take over and wipe out all these people.
00:11:32.000 It doesn't even seem like it would have, like, a desire to compete.
00:11:36.000 See, here's the deal.
00:11:37.000 Like, human beings, I feel like we've got all this machinery that's in place, has been for a long time, because it works, right?
00:11:43.000 There's a reason we're all still alive.
00:11:45.000 And so, we have a nature.
00:11:47.000 We have a human nature that we can't change.
00:11:50.000 And sometimes...
00:11:51.000 It pushes us to do terrible things in order to survive or, you know, like you're talking about conquest.
00:11:57.000 And that's scary, right?
00:11:58.000 That we're each of us fighting against sort of a dark nature and we've all got the potential to have a good nature.
00:12:04.000 I think what's even more scary than that is that a robot can be a total blank slate.
00:12:09.000 Let's say you tell a robot to solve a problem.
00:12:11.000 You tell it to get from point A to point B. Well, if that involves, like, stepping on babies... A robot doesn't have any nature.
00:12:19.000 It doesn't have any instinct.
00:12:20.000 It'll do that.
00:12:20.000 It'll commit in an amoral way without any good or evil associated with it.
00:12:25.000 It can commit atrocities, you know?
00:12:27.000 And that's pretty scary.
00:12:29.000 And that means you've got to be careful whenever you build these things.
00:12:32.000 Yeah, that's...
00:12:33.000 I mean, you can build into it.
00:12:35.000 Please avoid humans.
00:12:36.000 Avoid, you know, unnecessary loss of human life.
00:12:39.000 But at a certain point in time, if you're going to use, like, one of those for war, like, we're kind of doing with...
00:12:46.000 With drones.
00:12:47.000 I mean, that's kind of essentially what a drone is, right?
00:12:49.000 It's like a robot that flies.
00:12:51.000 Yeah, a drone has a lot of autonomy.
00:12:53.000 It makes some decisions on its own, but they're not pulling the trigger themselves.
00:12:57.000 I don't think they're doing that.
00:12:59.000 There are, by the way, people get all upset about drones.
00:13:03.000 Autonomous weapons have been around for decades and decades.
00:13:07.000 I mean, the very first drone, I think, they used it in the Korean War.
00:13:11.000 And in Vietnam, they had them.
00:13:13.000 Now they're getting cheap.
00:13:14.000 Because we got the cheap sensors, cheap processing.
00:13:17.000 Well, one of the particulars in Operation Northwoods, I think, was that they were going to use a drone jetliner.
00:13:26.000 And they were going to blow it up and blame the Cubans.
00:13:28.000 Whoa.
00:13:29.000 Yeah.
00:13:30.000 The Cubans blew up a jetliner.
00:13:32.000 And back then, they could just fucking change your life.
00:13:35.000 You're now Joe Hill, and you live in Montgomery, Alabama.
00:13:38.000 You just move there.
00:13:40.000 They had drones back then, which is nuts.
00:13:45.000 That's like 1962 or something like that.
00:13:48.000 61, whatever it was.
00:13:50.000 Yeah, you just load up a vehicle with a bunch of explosives and put a rock on the gas pedal.
00:13:55.000 I wonder if they knew how to land them or they just knew how to take them off.
00:13:59.000 You know what I mean?
00:14:01.000 It's a drone-like jumbo jet.
00:14:03.000 I wonder what they actually knew how to do back then, if they could actually land it.
00:14:07.000 One thing I've always wondered is if you're driving a drone all day, you know, and you, like, are the human being that does pull the trigger, like, wouldn't it be nicer for the government to put, like, a big black sensor bar, like, over it before you see all the little people get turned into chunks of meat?
00:14:22.000 Yeah.
00:14:22.000 Just for your own sanity?
00:14:24.000 I don't think...
00:14:24.000 Well, do they look at it through that night vision thing that we see?
00:14:27.000 Yeah, that heat, that thermal stuff, right?
00:14:29.000 Dude, that's weird, too, right?
00:14:30.000 Have you watched those videos?
00:14:31.000 They're horrific.
00:14:32.000 The crazy thing is you hear...
00:14:35.000 The guys, they speak so clinically, right?
00:14:38.000 They're like, okay, engage target.
00:14:39.000 And they sound like airline pilots.
00:14:42.000 And then they pull that trigger and you hear the guns firing because you're there inside the Apache helicopter or whatever.
00:14:49.000 And then it actually takes like five seconds for the bullets to go like two miles.
00:14:54.000 And then all the little people start, you know, flying around.
00:14:57.000 It's pretty crazy to watch.
00:14:58.000 It's bizarre.
00:15:00.000 It's bizarre.
00:15:01.000 I mean, it's just the weirdest way of eliminating people.
00:15:04.000 It's so clinical.
00:15:06.000 It's so detached.
00:15:09.000 It's weird.
00:15:10.000 We're killing people from a mile away.
00:15:12.000 What's a landmine, though?
00:15:14.000 A landmine is basically a robot, right?
00:15:17.000 To define robots, when people ask me, I always say: it's any kind of mechanical artifact that senses the environment, thinks about what to do, and then acts on its own, doing the whole sense-think-act deal.
00:15:28.000 And that's what a landmine does.
00:15:29.000 It sits around and senses, waits, and whenever a human steps on it, makes a simple decision to explode.
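Wilson's sense-think-act definition can be sketched as a tiny control loop. This is a purely illustrative Python sketch of the pattern, not real control code; every function name and threshold here is a hypothetical invented for the example:

```python
# Purely illustrative sketch of the sense-think-act loop described above.
# All names and thresholds are hypothetical, invented for this example.

def sense(environment):
    """Sense: read a pressure value from the environment (stub sensor)."""
    return environment.get("pressure_kg", 0.0)

def think(pressure, threshold=20.0):
    """Think: decide whether the trigger condition is met."""
    return pressure >= threshold

def act(triggered):
    """Act: carry out the decision; here we just return a label."""
    return "trigger" if triggered else "wait"

def sense_think_act(environment):
    """One pass through the whole sense-think-act cycle."""
    return act(think(sense(environment)))

print(sense_think_act({"pressure_kg": 75.0}))  # a footstep: "trigger"
print(sense_think_act({}))                     # nothing there: "wait"
```

The same three-stage loop covers everything from a landmine (trivial sensing, trivial decision) up to the acoustic-signature-matching submarine mines described below.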
00:15:37.000 Wow.
00:15:38.000 Yeah.
00:15:39.000 How many of those are allowed?
00:15:41.000 Aren't there a bunch around from the Vietnam War that they still haven't detonated yet?
00:15:45.000 I don't think the United States...
00:15:47.000 It's allowed.
00:15:48.000 I think they're outlawed, and I'm pretty sure that we conform to that in most circumstances.
00:15:53.000 But yeah, there's a ton of them that are left.
00:15:55.000 And there's not just landmines, right?
00:15:58.000 There's underwater mines for submarines, right?
00:16:01.000 So they hang out on the bottom of the ocean, and they are able to... This is what's interesting.
00:16:06.000 These are old, too.
00:16:07.000 These are from the 60s.
00:16:08.000 They can target nuclear submarines, and these landmines are complicated as shit.
00:16:12.000 So they sit there and they listen for the acoustic signature of whatever comes past, and then they identify what type of craft it is.
00:16:20.000 Based on that, they choose different loitering strategies about how to follow it and how to get close enough to detonate.
00:16:26.000 And then they chase, and then they eventually explode.
00:16:30.000 They do all this stuff.
00:16:31.000 Targeting.
00:16:32.000 Oh my god.
00:16:33.000 And some of them are still down there.
00:16:36.000 There's some areas where submarines can't go.
00:16:38.000 I imagine, but I'm not an expert on that.
00:16:42.000 I've read about all this equipment.
00:16:44.000 It's cool stuff.
00:16:45.000 The landmines are crazy.
00:16:47.000 The idea that you're just going to go to war with anything that touches these things and just blow them up.
00:16:52.000 That would be an ethical idea.
00:16:55.000 So now, in order to make them more ethical, there are lots of new landmines that I've read about.
00:17:01.000 The self-healing minefield is one of my favorites.
00:17:04.000 It's a minefield where the landmines can locomote a little bit, so you spread them out, and then if something comes through, they basically set up a local area network, so each landmine kind of has basically Wi-Fi, and they're talking to each other, so they kind of know where each other are at.
00:17:18.000 Then if some of them get blown up, other ones are able to hop, and they just do these little hops until they evenly distribute themselves again.
00:17:28.000 And so then you've got this what they call a self-healing minefield.
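The redistribution behavior described here, where surviving mines share positions over a local network and hop until coverage is even again, can be sketched in one dimension. This is a hedged, hypothetical Python illustration of the idea only, not a description of any real system:

```python
# Hypothetical 1-D sketch of the "self-healing minefield" idea above:
# after some mines are destroyed, the survivors hop to new positions
# so that coverage along the line is evenly distributed again.

def redistribute(survivor_positions, span=100.0):
    """Return evenly spaced target positions along a line of length
    `span` for the surviving mines; each survivor, in rank order,
    hops to its assigned slot."""
    n = len(survivor_positions)
    if n == 0:
        return []
    spacing = span / (n + 1)
    return [spacing * (i + 1) for i in range(n)]

# Three survivors after two detonations, bunched toward one end:
survivors = sorted([5.0, 12.0, 80.0])
print(redistribute(survivors))  # evenly spread again: [25.0, 50.0, 75.0]
```

In a real distributed version each mine would compute its own slot from the positions its neighbors broadcast, rather than relying on a central function like this one.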
00:17:32.000 But that's nothing compared to the crab mines, the ones that are designed to be dropped offshore.
00:17:36.000 And they've got crab legs and they scuttle up on the bottom of the ocean up to the...
00:17:42.000 up through the beach, basically.
00:17:44.000 And then onto land, and that's how you mine beaches.
00:17:47.000 Like those walking mines, or bombs, from Mario Brothers.
00:17:50.000 Exactly.
00:17:51.000 Bob-ombs.
00:17:52.000 Another example of video games becoming real.
00:17:55.000 Dude, dude, dude, dude, dude.
00:17:56.000 You don't want to get hit by a turtle shell.
00:17:58.000 What happens?
00:17:59.000 These crabs go in the water, and they run up onto the beach?
00:18:04.000 Yeah, I mean, it's a good example of what they call biomimetics.
00:18:07.000 How big are they?
00:18:09.000 Well, they're about the size of a crab, I mean.
00:18:13.000 I've seen descriptions of that.
00:18:16.000 I've never seen a real one of those, and I don't think those are really in use.
00:18:19.000 But it's a great example of if you want to build a machine that's going to operate in a certain environment, you think of the environment as a problem, right?
00:18:28.000 The problem is how do you locomote on the bottom of the ocean in the pounding surf?
00:18:33.000 How do you do that?
00:18:34.000 Well, there are answers to all these problems.
00:18:36.000 There are animals, right?
00:18:38.000 A lobster is the answer to that problem.
00:18:40.000 A crab is the answer to that problem.
00:18:42.000 And so you go and you study the animals, and then you take the basic principles about how they locomote and how they do whatever, and then you distill them down, you stick them into a robot.
00:18:52.000 So no matter where you want to go, there's usually a solution to that problem in the form of an animal that you can study.
00:18:59.000 And then I push up my glasses onto my nose.
00:19:03.000 Jesus!
00:19:04.000 The idea that you could make like a million little robot mine crabs and just unleash them on a beach.
00:19:10.000 Dude, that's like the third chapter of Robopocalypse.
00:19:12.000 Is it really?
00:19:13.000 Well, hell yeah, that's too cool not to put into a book, right?
00:19:16.000 Oh, fuck yeah.
00:19:17.000 I got this part where they're all walking across Boston, like they're walking across this plaza.
00:19:22.000 Spoiler alert!
00:19:22.000 They sense vibration through their feet, right?
00:19:25.000 And so whenever they sense the vibration, they raise their little feet up in the air and they all kind of do it at a pat...
00:19:30.000 I don't know what's going to be in this movie, but there are some things that, God, I just want to see it.
00:19:36.000 Really?
00:19:36.000 That's awesome, man.
00:19:38.000 Wow.
00:19:39.000 I can't wait to read it.
00:19:41.000 When you wrote this, is this something that was in your head for a while before you wrote it?
00:19:46.000 Yeah, man.
00:19:47.000 These are ideas that have just been sort of percolating around for a really long time.
00:19:50.000 My first book was How to Survive a Robot Uprising.
00:19:53.000 I was just making fun of all the Hollywood BS where robots are killing.
00:19:58.000 I specifically made fun of...
00:20:00.000 Stuff I love, like The Matrix and Terminator.
00:20:02.000 Of course I love it.
00:20:03.000 I'm thinking about it.
00:20:04.000 But then I kind of went to the dark side, you know?
00:20:07.000 Wrote that fiction, you know, science fiction.
00:20:11.000 Killer robots.
00:20:12.000 They're sexy, man.
00:20:13.000 They got biblical themes built in.
00:20:15.000 Like, they make for great drama.
00:20:18.000 Do you look at the idea of technology and robots as like a life form?
00:20:25.000 That human beings are responsible for igniting and starting in the world.
00:20:30.000 Marshall McLuhan, I think it was, someone just told me this quote the other day.
00:20:34.000 It's a brilliant quote.
00:20:36.000 He said that humans are the sex organs of machines.
00:20:42.000 Isn't that a beautiful quote?
00:20:45.000 There's ways of looking at...
00:20:47.000 See, this is one thing that, as a roboticist, you have to do this.
00:20:50.000 You have to look at a human being from a totally alien perspective, right?
00:20:54.000 And I have a point, but...
00:20:56.000 If you look at how to do speech recognition or emotion recognition, from a robot's perspective, we are just moving pieces of flesh around on our faces into different configurations, and then that conveys some sort of inner chemical state...
00:21:12.000 I mean, it's not intuitive.
00:21:14.000 It's not easy to figure out.
00:21:16.000 You can look at humankind as like, all we do is we cover the earth in lawns.
00:21:22.000 We are slaves to grass.
00:21:25.000 If it wasn't for us, grass wouldn't really exist everywhere that it does.
00:21:30.000 So if you look at human beings from sort of these alien perspectives, sometimes cool shit falls out.
00:21:36.000 And some people do think that it's our destiny to create the next intelligent life form, you know, and to set it free.
00:21:45.000 And then those people usually think that it's our turn to retire.
00:21:50.000 Hans Moravec.
00:21:51.000 He says, you know, once we do this, children of the mind, right?
00:21:54.000 Once we make the robots, then we're done.
00:21:57.000 We're finished.
00:21:57.000 Like, we achieved our goal.
00:21:59.000 I disagree.
00:22:01.000 It's so terrifying, but the idea that there's a reason why dinosaurs aren't around anymore.
00:22:05.000 Because, you know, they...
00:22:07.000 They got wiped out.
00:22:08.000 Something better came along.
00:22:09.000 We like to think.
00:22:11.000 And it's called people.
00:22:11.000 Now, we're the new head of the planet.
00:22:13.000 But why would we think that we can hold this spot forever?
00:22:16.000 I mean, it doesn't make any sense.
00:22:17.000 Well, I think that we're going to evolve with our tools.
00:22:20.000 Yeah.
00:22:20.000 So, are we going to become a part of a computer?
00:22:22.000 Are we going to become a symbiotic organism?
00:22:25.000 You know, I just...
00:22:25.000 So, I wrote a book about it.
00:22:28.000 It's called Amped, and it came out this year.
00:22:30.000 And it's basically...
00:22:31.000 Yeah, it's about thinking about, like...
00:22:34.000 What happens when we start integrating this technology into our own bodies?
00:22:38.000 And just not thinking about crazy science fiction, far out stuff, just thinking about right now.
00:22:44.000 What's really cool about this to me is that the people who are getting this, are people that have real serious disabilities.
00:22:52.000 People who are willing to have a hole drilled in their skull and a neural implant placed on the surface of their brain to improve their quality of life.
00:23:01.000 And it's not like Tony Stark or rich kids that are getting a leg up in school.
00:23:08.000 It's like the most vulnerable, challenged people in our society are getting this technology.
00:23:15.000 And in some cases, it's making them... It's not bringing them back to normal.
00:23:19.000 It's taking them past normal.
00:23:21.000 And so people with disabilities are like becoming people with super abilities.
00:23:25.000 And it's a pretty cool trend to watch.
00:23:28.000 Wow.
00:23:28.000 That's amazing.
00:23:29.000 Because, I mean, what's it going to be like when you realize that, hey, if you really do want to be the fastest sprinter on the planet, you've got to cut your legs off.
00:23:36.000 Because, like, there's just no way to do it.
00:23:39.000 Oh, my God.
00:23:43.000 That hurts my brain.
00:23:44.000 Or be lucky enough to be born.
00:23:45.000 You know why that hurts?
00:23:46.000 Because I know someone's going to do it.
00:23:48.000 That's, you know, you think about...
00:23:50.000 Don't you think someone's going to do it?
00:23:52.000 Armstrong and, you know, I don't know.
00:23:55.000 You mean Neil Armstrong?
00:23:56.000 No.
00:23:56.000 The guy who went on the moon?
00:23:57.000 No.
00:23:59.000 Oh, that Lance guy?
00:24:00.000 That Lance guy.
00:24:00.000 Well, do you make that akin to hacking your legs off, taking performance-enhancing drugs?
00:24:05.000 I think that's a little bit crazier.
00:24:07.000 No, there's a leap, right?
00:24:08.000 I mean, there's definitely a leap, which is you've got to go back to go forward, right?
00:24:13.000 Because without performance-enhancing drugs, you're still not disabled.
00:24:20.000 Like Oscar Pistorius, he competed in the London 2012 Olympics, right?
00:24:24.000 The guy has no legs below the knees.
00:24:26.000 He ran alongside able-bodied athletes on prosthetics, super advanced prosthetics.
00:24:31.000 If you take those prosthetics off of him, he ain't going anywhere, right?
00:24:35.000 And that's the difference.
00:24:36.000 I mean, if you take somebody off of performance-enhancing drugs, they're still capable of doing whatever it is that they were doing, just slower or not as well or whatever.
00:24:47.000 Yeah, it's an interesting thing, the performance-enhancing drug thing, because by keeping it secret and by hiding it and by sneaking around it...
00:24:58.000 Take all the moral arguments out of it.
00:25:02.000 When it comes to cheating and achieving victory through unseemly methods...
00:25:08.000 Most of the people that do that, they think that they have to do it in order to compete.
00:25:12.000 They think there's hidden rules, right?
00:25:13.000 And they're keeping with the hidden rules.
00:25:15.000 But I think it's also really dishonest to the actual...
00:25:19.000 The issue becomes...
00:25:21.000 It's dishonest to the actual idea of human ability.
00:25:26.000 Because human ability under these incredibly enhanced conditions is quite a bit more than a regular person's ability.
00:25:33.000 So we have these distorted perceptions of records that have been achieved through...
00:25:39.000 We need to know.
00:25:41.000 If we're going to all decide that they're going to take...
00:25:44.000 We're looking for bedrock.
00:25:46.000 We want that definition of human, straight up natural human.
00:25:49.000 I want to compare myself to Babe Ruth and have that shit be a straight comparison.
00:25:55.000 Yeah, and it gets to be a weird situation when you've got a bunch of people introducing all these alien things into the body.
00:26:03.000 And eventually it probably will be parts.
00:26:06.000 It probably will be.
00:26:07.000 That's why this is scaring me so much when you said that.
00:26:09.000 Because I'm thinking of someone actually chopping their legs off and giving themselves robot legs.
00:26:14.000 Well, you know, I don't...
00:26:16.000 Somebody would fucking do it, man.
00:26:17.000 There's a cool book called Machine Man by Max Berry, which...
00:26:20.000 I love any book that's written from the perspective of, like, an autistic person.
00:26:25.000 Because a person with Asperger's, you know, because if you read books that are from those perspectives, they tend to be really Hemingway-esque.
00:26:32.000 Like, the short sentences, they're easy to read.
00:26:34.000 I really like them.
00:26:35.000 But anyway, there's a book where that happens.
00:26:37.000 It's a good one.
00:26:38.000 Machine Man.
00:26:39.000 But...
00:26:40.000 Do you think that it would be as simple as we would engineer them so we could shut them off?
00:26:47.000 Robots?
00:26:47.000 Yeah.
00:26:48.000 Is it possible?
00:26:49.000 Well, no.
00:26:49.000 I mean, think about...
00:26:50.000 Okay, sure.
00:26:51.000 Think about that.
00:26:51.000 You shut them off.
00:26:52.000 I mean, so if you have a robot that...
00:26:56.000 You know, you don't shut an elevator off whenever it's got like 20 people on it.
00:27:00.000 It's 100 stories up.
00:27:02.000 Like, there are circumstances where...
00:27:05.000 You know, you have to have, what do they call it, graceful failure or graceful degradation or something, where it needs to fail gracefully.
00:27:14.000 It can't just fail all at once.
00:27:15.000 So if you shut it down, it needs to shut down its stages so that it doesn't hurt people.
00:27:20.000 But yeah, I think kill switches, emergency stops, those are a huge aspect of building safe robots, you know, but it's not the whole story.
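The staged shutdown Wilson is describing can be sketched in a few lines. The stage names and ordering below are made-up placeholders for illustration, not any real robot's API:

```python
# A minimal sketch of "graceful degradation": an emergency stop that steps
# through safe stages in order instead of cutting power all at once.
# Stage names here are illustrative assumptions.

class GracefulShutdown:
    # Ordered stages: each one leaves the system in a safer state than the last.
    STAGES = [
        "stop_accepting_new_tasks",
        "finish_or_safely_abort_current_motion",
        "move_to_safe_posture",
        "cut_actuator_power",
    ]

    def __init__(self):
        self.completed = []

    def emergency_stop(self):
        # Walk through every stage in order; never jump straight to power-off.
        for stage in self.STAGES:
            self.completed.append(stage)
        return self.completed

elevator = GracefulShutdown()
print(elevator.emergency_stop()[-1])  # power is cut only as the final stage
```

The point of the ordering is the elevator example above: the unsafe action (cutting power) is always last, after the system has reached a state where it can't hurt anyone.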
00:27:31.000 It seems like the real issue would be how much of the human ideal of life would be programmed into it.
00:27:39.000 If you were going to engineer a life form, which essentially you would be able to do if you had a robot and you turn this computer into some sort of a sentient being, what aspects of the human psyche would you engineer into it?
00:27:53.000 Would you engineer into it a sense of survival?
00:27:56.000 Would it be able to understand that that's illogical and override it if it was that strong and if you gave it some autonomy?
00:28:03.000 Or would it just go with it?
00:28:05.000 Yeah, you know, the thing that's scary about this is there are an infinite number of minds that you could really create.
00:28:13.000 You could create totally alien minds that just meditate on things that we could never comprehend.
00:28:19.000 You know, you could create minds...
00:28:22.000 I'm freaking everybody out.
00:28:23.000 You're freaking me out.
00:28:24.000 Totally freaking me out.
00:28:26.000 I think there's some story I read where there's...
00:28:32.000 Actually, I wrote a story where there was basically a robot that was designed to paint happy faces on things and then it goes nuts and runs amok and it just paints happy faces on everything and it ends up destroying the universe, painting happy faces because that's the way it sees the world.
00:28:46.000 I think there are a lot of people who think a lot about Ray Kurzweil, for instance.
00:28:54.000 He's really obsessed with this idea that we're going to upload our brains into machines, that we will basically have a machine that simulates every neuron in your brain, and then you'll live forever inside the box, right?
00:29:10.000 That's a great example of giving a human experience, like a human life to a machine so that it knows what things are like from our perspective.
00:29:19.000 So it knows you don't step on babies when you're walking across the room.
00:29:24.000 That innate nature that we have.
00:29:27.000 I think it's important that we do convey our ethics to the machines that we build so that they behave in a way that allows us to co-exist.
00:29:39.000 Hopefully, or we're done, son.
00:29:41.000 It's one or the other.
00:29:42.000 Because if it's us versus them, they're going to be able to figure shit out so quickly.
00:29:46.000 They're going to be able to make better versions of themselves very quickly.
00:29:50.000 Yeah, and that's the idea of the singularity, right?
00:29:52.000 It just bears definition.
00:29:55.000 We create a machine that's smart enough to make itself smarter, and then you kind of have a runaway feedback loop, and it gets smarter and smarter.
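That runaway feedback loop is easy to put in toy-model form. The 5% per-generation improvement rate below is an arbitrary assumption purely for illustration:

```python
# A toy model of the "runaway feedback loop": a machine whose ability to
# improve itself grows with its current intelligence. The improvement rate
# (5% per generation) is an arbitrary assumption.

def self_improvement(initial_iq=1.0, rate=0.05, generations=20):
    iq = initial_iq
    history = [iq]
    for _ in range(generations):
        iq += iq * rate   # smarter machines make bigger improvements
        history.append(iq)
    return history

curve = self_improvement()
gains = [b - a for a, b in zip(curve, curve[1:])]
# growth compounds: each generation's gain is larger than the last
print(all(later > earlier for earlier, later in zip(gains, gains[1:])))
```

Because each step's gain is proportional to the current level, the absolute improvement per generation keeps growing, which is the "smarter and smarter" runaway Wilson describes.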
00:30:03.000 But I don't think that that's something that is really going to happen anytime soon.
00:30:08.000 But isn't it a possibility?
00:30:09.000 Yeah, I absolutely think that's why.
00:30:11.000 So why would you be confident in thinking that it's not going to happen?
00:30:13.000 Well, because it's sort of like if we were alive with Jules Verne and we were discussing, like, well, what shape should the capsule be when we go to the moon?
00:30:22.000 You know, it's like we're not close enough to solving that problem to sort of make informed decisions about how we should solve it.
00:30:29.000 You see what I mean?
00:30:30.000 Right, I see what you're saying.
00:30:31.000 Yeah.
00:30:31.000 We're going to have other robots in our lives a lot sooner, and those have more immediate problems to solve.
00:30:37.000 Ethical problems, too.
00:30:39.000 Yeah, like robot fuck dolls.
00:30:40.000 Right?
00:30:41.000 For sure.
00:30:42.000 For sure, that's going to be an ethical issue.
00:30:45.000 It's going to be like a slavery-type issue.
00:30:47.000 If you have a really good form of intelligent life that you've created, it's an artificial intelligence, and you have it as a sex slave, that would be fucked up, man.
00:30:57.000 The sentient thing, like the thing where they're smart enough to...
00:31:01.000 For us to worry about all these ethical things.
00:31:03.000 I think that's far away.
00:31:05.000 You think so?
00:31:05.000 No, it's all about human beings, man.
00:31:08.000 So for instance, the real problem when you make a fuck doll is do you allow a person to...
00:31:15.000 What happens to the people who interact with it?
00:31:18.000 So if you have a guy who goes home every day and he beats the crap out of this doll, is that okay?
00:31:24.000 If you built the doll, should the doll call the police?
00:31:28.000 Should the doll take it?
00:31:29.000 And your decision there, as a roboticist, as a scientist, product designer, whatever you are, you're designing an ethical interaction between your product and a human being.
00:31:41.000 And your decision can affect whether this guy hurts a real woman, you know?
00:31:49.000 Or if you're building toys for children or really lifelike pets, you know, and a kid sticks the fake dog into the microwave, like, what happens, you know?
00:31:58.000 Because my gut feeling is not okay, right?
00:32:02.000 Yeah.
00:32:03.000 The animal should react.
00:32:05.000 And there's actually research on this.
00:32:08.000 My wife's a child psychologist. She's in Portland, where I live, but before that she was at the University of Washington.
00:32:11.000 And they have this research there where they basically took...
00:32:18.000 Let me see if I can get this straight, because it's kind of fascinating.
00:32:21.000 They take these kids and have them play tic-tac-toe against a virtual head, right?
00:32:26.000 It looks like a person.
00:32:27.000 And then they have a scientist walk in, and in half of the experiments, the scientist says to the robot head, "Hey, that was a really dumb-ass move.
00:32:36.000 You're stupid."
00:32:37.000 And then half the time, the robot says...
00:32:40.000 Nothing.
00:32:41.000 And the other half the time, the robot says, "Hey, it's not okay for you to talk to me like that."
00:32:45.000 And when they asked the kids later whether the virtual head deserved to be treated with respect or whether it was smart...
00:32:55.000 They asked these kids all these questions.
00:32:57.000 If the robot demanded moral treatment, then the kids thought that the robot deserved moral treatment.
00:33:04.000 If it didn't, they thought it was okay to abuse it.
00:33:07.000 And so it comes right back down to it.
00:33:10.000 When you build a robot, you're building an ethical interaction.
00:33:13.000 Wow.
00:33:14.000 You can mess it up, I feel like.
00:33:16.000 That's fascinating.
00:33:20.000 Yeah, you can't allow something to just be brutalized and to take punishment.
00:33:27.000 You can't make a person.
00:33:28.000 You can't make an artificial person.
00:33:30.000 Can you imagine that future, right?
00:33:31.000 Where people are followed around with perfectly human-like robots and they just abuse them?
00:33:35.000 Just beat the shit out of them.
00:33:36.000 Yeah, man.
00:33:38.000 That would very quickly warp, I think, the psyche.
00:33:40.000 Oh yeah, of humans.
00:33:42.000 It would create sociopaths.
00:33:43.000 Because, yeah, we acclimate to whatever we're around, you know?
00:33:47.000 Yeah, especially if it's indistinguishable.
00:33:48.000 Unless we're already robots.
00:33:50.000 And we don't know it.
00:33:51.000 What do you buy with this simulation theory thing, man?
00:33:55.000 Do you ever wrap your head around that?
00:33:56.000 Do you ever fuck with that?
00:33:59.000 No, lay it on me.
00:34:00.000 It's this new thing, and these are legitimate scientists that are considering this.
00:34:05.000 It's the Matrix thing?
00:34:05.000 Yeah, yeah, yeah.
00:34:05.000 The damn Matrix again?
00:34:08.000 The idea that we're in some sort of an incredible computer program, and that it's so complicated.
00:24:15.000 And so well done that it's indistinguishable from real life, and that we're interacting with things. But the more they study string theory, in the computations of string theory, they keep finding this self-correcting computer code.
00:34:32.000 I don't understand what that means, but I understand it's a very specific type of computer code that we didn't even figure out until the early 20th century.
00:34:40.000 I think they said we figured it out in the 40s or 50s or something like that, if I remember correctly, but...
00:34:46.000 The idea that this is in these string theory equations that they're putting together.
00:34:53.000 I don't understand mathematics, but what I think they're trying to say is like, there's an eerie code to all this.
00:35:03.000 It's not just like, we don't know what the code is, but we can see that there's some repeating patterns.
00:35:08.000 No, no, no, no.
00:35:09.000 It's a very specific type of code.
00:35:13.000 And that's when they're studying the nature of reality.
00:35:17.000 So the nature of reality, and it's like one of its smallest measurable forms, is very obviously computer code.
00:35:24.000 Seven.
00:35:26.000 42. It doesn't seem like if technology moves in the direction it's going right now.
00:35:36.000 If our computers get more and more powerful or our CGI shit is more and more believable, we've got to keep moving until one day we reach a point where we can simulate reality.
00:35:49.000 It's going to happen.
00:35:50.000 And if it's really fucking good, it's going to be just like this.
00:35:56.000 Dude, is that a...
00:35:58.000 I'm trying to think.
00:35:58.000 Running Man.
00:36:00.000 Is that a Running Man?
00:36:01.000 That's whenever they recreate the fight, you know, out of the computer models.
00:36:05.000 I mean, before all that happens, we're going to be able to...
00:36:08.000 I can't wait until the day when I can say, all right, television, whatever you are now.
00:36:13.000 I want to watch, like, Running Man, but I want, instead of Arnold, you know, I want Jeremy Renner, and I want, like...
00:36:21.000 No, Megan Fox.
00:36:23.000 I want an all-female cast of...
00:36:26.000 Yes.
00:36:26.000 I want, yeah.
00:36:28.000 And you have to follow them with a reality show camera as well.
00:36:31.000 And it just maps them.
00:36:32.000 Get them all in a house together, make them live together.
00:36:34.000 Oh, that's awesome.
00:36:34.000 It just maps their bodies onto it, you know?
00:36:37.000 And that would be like the ultimate ironic one to watch because they do that in the movie.
00:36:41.000 How long before we have our first robot action hero?
00:36:46.000 Wait, so what do you mean by robot action?
00:36:48.000 Robocop?
00:36:48.000 Well, like a guy in the movies.
00:36:50.000 A robot actor?
00:36:52.000 Yeah, like a robot actor.
00:36:53.000 You know, so I... You would have, like, all Ryan O'Neal's qualities with, you know, a little...
00:36:58.000 Arnold Schwarzenegger's masculine bravado and...
00:37:03.000 You know, Brad Pitt's acting chops or whatever.
00:37:05.000 You know, just mix it all up together and make the perfect robot action hero.
00:37:09.000 Joe's pouty lips.
00:37:10.000 My lips are not pouty.
00:37:11.000 Yeah, they are, dude.
00:37:12.000 They're so pouty.
00:37:14.000 Son of a bitch.
00:37:15.000 Son of a bitch.
00:37:16.000 Do you get injections in them?
00:37:18.000 I just punch myself in the face.
00:37:20.000 I hate myself.
00:37:22.000 Yeah, I'm sorry.
00:37:23.000 All I'm sitting here thinking, I'm just thinking of Calculon from Futurama.
00:37:27.000 Calculon!
00:37:28.000 Sorry, he's like, he's an actor.
00:37:30.000 A robot actor.
00:37:31.000 He's a robot actor!
00:37:33.000 Is that possible?
00:37:34.000 Could they ever make, like, you know, are we going to have human beings that are that indistinguishable?
00:37:40.000 The question is, like, would you bother, because you could just create them in CG, right, and make them perfectly realistic, would you bother creating the real world version?
00:37:50.000 But I have to say, I'm super excited because I have a short story that got picked up by this director.
00:37:55.000 He's sort of a budding guy in London, right?
00:37:59.000 But he's making a short film based on this thing and he's negotiating with...
00:38:03.000 It's a story about a guy and a robot boy that he lives with because his real son has passed away.
00:38:10.000 And we got Lambert Wilson, the Merovingian, is going to be in this thing.
00:38:15.000 And then he's working with a university to get an actual robot.
00:38:20.000 There's this thing called the NAO humanoid.
00:38:23.000 It's the size of an eight to ten year old kid.
00:38:25.000 If you Google it, I mean, it moves like a real human being.
00:38:28.000 And they're going to have it as an actor.
00:38:30.000 It's just a short film, but, I mean, it's going to be pretty cool.
00:38:34.000 It's going to be a real robot acting in a movie.
00:38:36.000 Whoa.
00:38:37.000 Goddamn, man.
00:38:39.000 I saw that there was a very intricate Japanese robot.
00:38:43.000 It was a woman.
00:38:44.000 Have you seen that one?
00:38:46.000 It was very emotive.
00:38:48.000 Well, it was a woman head.
00:38:49.000 Yeah, it was just a head.
00:38:50.000 Okay, because there's the actroid.
00:38:53.000 Is that the NAO? Is this it?
00:38:55.000 See, that's not the one I was thinking of.
00:38:57.000 I got the wrong name.
00:38:58.000 This is so freaky.
00:38:59.000 This is so weird.
00:38:59.000 There's a robot on the screen.
00:39:01.000 Folks that are listening to this on iTunes, there's this robot on the screen and it's walking around.
00:39:05.000 It's so weird.
00:39:08.000 Oh my god, they're so good now.
00:39:10.000 I mean, they really are like what we hoped robots would be when we were kids.
00:39:17.000 Remember Lost in Space?
00:39:18.000 That stupid fucking garbage can?
00:39:20.000 That was a garbage can robot.
00:39:22.000 That was the dumbest looking robot ever!
00:39:25.000 Danger, Will Robinson!
00:39:26.000 Danger!
00:39:27.000 All he had was those little Tyrannosaurus arms that were just out on him.
00:39:31.000 Yeah, he had arms that would clip, and he had to do their bidding, always.
00:39:35.000 People loved him.
00:39:36.000 People still loved him.
00:39:37.000 Yeah, they loved that robot.
00:39:37.000 People are really obsessed with that robot.
00:39:39.000 What was that robot's name?
00:39:41.000 He was Robbie.
00:39:42.000 Robbie the Robot?
00:39:43.000 Really?
00:39:43.000 Yeah, from Lost in Space, but he was in a lot of stuff.
00:39:45.000 Oh, really?
00:39:46.000 Yeah, and there were different versions of him.
00:39:48.000 He's always hard for me to pin down in my head, because he was around for a while, but...
00:39:52.000 Brian, pull up Robbie the Robot.
00:39:54.000 Let's see what that looks like.
00:39:56.000 I haven't looked at that in years.
00:39:57.000 So one kind of cool thing about robots that I've noticed.
00:40:01.000 No.
00:40:02.000 That's Robbie and his friend Jeff.
00:40:07.000 No.
00:40:08.000 Robbie the Robot.
00:40:09.000 What were they doing to each other?
00:40:11.000 This is Robbie the Robot, man.
00:40:13.000 It looks like that.
00:40:13.000 You see that, Brian?
00:40:16.000 Okay.
00:40:16.000 He was actually pretty badass.
00:40:19.000 His arms would just come out of the center of his chest.
00:40:22.000 But he always had to listen, no matter what.
00:40:25.000 Will Robinson would be like, listen, bitch, we're gonna go exploring.
00:40:28.000 He was like the first gay robot, right?
00:40:30.000 No, no, no, no.
00:40:31.000 The other dude was the gay guy.
00:40:33.000 Because Lost in Space was a weird show.
00:40:35.000 They had a guy who acted really obviously gay.
00:40:38.000 Look at that.
00:40:38.000 That looks like a fucking disco floor.
00:40:40.000 Look at that, though.
00:40:41.000 Seriously, those are gay robots.
00:40:43.000 What was the dude's name?
00:40:44.000 Was it Vincent Price or somebody?
00:40:47.000 No, no, no.
00:40:49.000 How dare you?
00:40:50.000 So do you like this TV? Oh, I love this TV. Do you like this show?
00:40:56.000 The original Lost in Space was a crazy show, man.
00:40:59.000 It was from 1965 to 1968. They would land on planets and shit.
00:41:04.000 Well, they were marooned.
00:41:05.000 It was like Gilligan's Island, except they were on some tiny planet.
00:41:09.000 Yeah.
00:41:11.000 There have been some great ones.
00:41:13.000 I mean, I always used to love Small Wonder when I was a kid, you know?
00:41:16.000 That was one with Vicky, the robot who, the dad is a roboticist and he brings her home to test her out.
00:41:22.000 She sleeps in his son's closet.
00:41:25.000 Jonathan Harris was Dr. Zachary Smith.
00:41:29.000 Dr. Zachary Smith was like the first, like, pretty obviously gay character on television.
00:41:36.000 That might have just been all the spandex unitards that they were wearing on that show.
00:41:40.000 Well, he was very eccentric.
00:41:43.000 I mean, in this really feminine, apologizing, sobbing way.
00:41:47.000 He was the villain, right?
00:41:48.000 Sort of.
00:41:48.000 He was always the scheming bad guy.
00:41:50.000 He was always dumb.
00:41:52.000 He was ruining things.
00:41:54.000 Stupid.
00:41:55.000 Stupid Dr. Smith.
00:41:56.000 And his ego would always be his folly.
00:41:59.000 That was a great show, man.
00:42:03.000 Well, they're really fun to watch today.
00:42:05.000 Because it's like, you know, they thought this was going to be like 1990. Like, that's how we're going to be living.
00:42:10.000 Like, living in colonies in space.
00:42:12.000 And so this sort of idea of, like, man, space travel never really materialized the way a lot of people thought it was from the 1960s.
00:42:21.000 So you get this weird idea of what they thought the future was going to be from these shows.
00:42:25.000 It's really fascinating.
00:42:26.000 You know, I think they actually misjudged human nature.
00:42:30.000 Like, there's this idea that we've accomplished this amazing feat by going to the moon, right?
00:42:35.000 We planted a flag.
00:42:36.000 We walked around.
00:42:38.000 We went into the uncharted depths, the wilderness.
00:42:42.000 It's an amazing thing to do.
00:42:44.000 And you think that that sort of raw awe-inspiring event is going to propel mankind into the stars.
00:42:50.000 But actually, we were just competing with the Russians.
00:42:53.000 And it was just, it was enough to put a flag there.
00:42:56.000 And the instant that we need to go beyond that, I think, is the instant that some other nation plants a flag next to ours or knocks it down.
00:43:04.000 Or plants a flag on Mars.
00:43:06.000 Yeah, I mean, it's all about just like pissing on trees, you know?
00:43:10.000 The trees just keep getting further away from Earth.
00:43:12.000 Well, it's also to let you know, like, look, if we can go to the moon, we can launch missiles on your head from space, dude.
00:43:17.000 I'm going to let you know.
00:43:18.000 Yeah, I'm going to let you know.
00:43:19.000 I'm going to let you know how we run and shit from orbit, bitch.
00:43:23.000 I mean, to be the first person to set up a base on the moon would be an incredible military advantage.
00:43:29.000 I think the Chinese are going to do it.
00:43:30.000 I think it's going to be a great thing because it's going to excite those...
00:43:34.000 Dude, could you imagine...
00:43:37.000 If we actually start having bases on the moon, if people actually started doing that, if they developed the technology to one day have bases on the moon, and we would go there, like going to fucking Hawaii.
00:43:47.000 It's like a 24-hour trip in the shuttle.
00:43:49.000 I totally believe that.
00:43:50.000 Dude.
00:43:50.000 Because, I mean, it's like something tragic is going to happen.
00:43:53.000 Like, you know, like a lot of the land is going to be poisoned or something.
00:43:56.000 Well, the moon is fucked, man.
00:43:58.000 They would have to terraform on the moon because the moon doesn't have an atmosphere.
00:44:02.000 Yeah, that ain't going to happen.
00:44:03.000 They could do that someday.
00:44:05.000 That's the whole premise of Prometheus, that they were terraforming that planet.
00:44:08.000 They were trying to make it habitable for their life form.
00:44:12.000 They're going to be able to do that, for sure, one day.
00:44:15.000 People always argue that we're going to screw Earth up so bad that then we're going to have to go to Mars, the moon.
00:44:20.000 But the resource expenditure to leave Earth with a lot of people, even just to go to the moon, much less Mars, is so out of whack.
00:44:31.000 I mean, you'd be better off like almost doing anything to fix the Earth or just eking it out here.
00:44:36.000 I mean, let's face it.
00:44:37.000 If you poison the Earth's atmosphere so that you can't even walk outside anymore or you screw up the atmosphere so that we're getting radiation from...
00:44:46.000 Well, guess what?
00:44:47.000 That's Mars.
00:44:48.000 Why go to Mars to build a fucking dome, right?
00:44:53.000 Just build a dome in Arizona, right?
00:44:55.000 You don't have to go to Mars first.
00:44:56.000 No shit.
00:44:57.000 Fix this, stupid.
00:44:59.000 I've always thought that was bullshit, the idea that we're going to mess up Earth so bad we have to go to Mars.
00:45:03.000 The only way I could think that we could do that, though, is like nuclear shit.
00:45:07.000 Yeah, okay.
00:45:07.000 If there's a massive nuclear holocaust all across the country, across the world, we all launch bombs at each other and just wipe out.
00:45:15.000 They would make so many areas radioactive.
00:45:17.000 They would almost be like, you can't live here anymore.
00:45:20.000 You know, that would still be pretty tough to do, I think.
00:45:22.000 Just because, like...
00:45:23.000 Really?
00:45:24.000 Yeah, because you'd have to launch all of them at the exact same time, right?
00:45:27.000 And then they'd have to land all over Europe.
00:45:29.000 I mean, if you think about when we first got nukes, we were doing crazy shit.
00:45:34.000 We were launching them all the time in space, underground, in the water.
00:45:39.000 On the show, we showed a video of all the nuclear explosions from 1947. It's online.
00:45:44.000 You can get it.
00:45:45.000 It's on...
00:45:46.000 I forget the website, but it shows the first tests, and then it shows Hiroshima and Nagasaki, and it shows all these different ones that they did in Nevada.
00:45:55.000 Nevada's crazy!
00:45:57.000 Oh, dude, they were nuking the shit out of everything.
00:45:59.000 So my favorite is Project Plowshares.
00:46:01.000 This is like Iowa or something.
00:46:04.000 And here's the notion.
00:46:05.000 We're gonna mine for natural gas...
00:46:10.000 The gas is trapped in all these...
00:46:12.000 Now we run water through it.
00:46:13.000 It's called fracking, right?
00:46:14.000 But they're like, we're going to free up all this natural gas and harvest it by detonating a nuke under the ground.
00:46:19.000 So they drill this giant hole.
00:46:22.000 They drill this giant hole.
00:46:23.000 They put a nuke at the bottom of it.
00:46:25.000 They fill the hole up.
00:46:26.000 And then they detonate the nuke.
00:46:29.000 Here's what happens.
00:46:30.000 Highly irradiated natural gas shoots out of this chimney.
00:46:34.000 It ejects all the crap they put in there.
00:46:38.000 It goes into the atmosphere.
00:46:40.000 All of the natural gas is poisonous.
00:46:44.000 It's got radiation in it.
00:46:46.000 And they're like, oh, well, that's not a good idea.
00:46:49.000 So much for turning swords into plowshares, right?
00:46:52.000 When did they do this?
00:46:54.000 This is like...
00:46:56.000 This was not that long ago.
00:46:57.000 I would say the middle of the 20th century, but I don't know for sure.
00:47:01.000 Here's why I know about it.
00:47:02.000 Do you know what the name of the operation is called?
00:47:05.000 I think it was called Project Plowshares.
00:47:07.000 Project Plowshares.
00:47:08.000 Yeah, look it up.
00:47:10.000 The reason I like it is because I was thinking, and this is a little bit of a spoiler, but while I was writing Robo-Apocalypse, I was thinking, look, if I was a super intelligent AI, right, I'm not going to hang out and have all my processors in a place where humans are going to be comfortable,
00:47:25.000 right?
00:47:26.000 So what's my fortress of solitude, right?
00:47:28.000 So when these nukes detonate, they vaporize a spherical chunk of, like, rock underground, and they create this chamber.
00:47:37.000 And the walls turn to glass, right?
00:47:41.000 Bubble.
00:47:42.000 Deep under the ground.
00:47:43.000 Highly radioactive.
00:47:45.000 Oh my god.
00:47:45.000 It is the ultimate fortress of solitude.
00:47:47.000 And it's of course where my bad guy lives.
00:47:49.000 Oh my god.
00:47:50.000 Sorry, I kind of blew that for anybody that's going to read.
00:47:52.000 Dude, that's dope though.
00:47:52.000 I want to read that again.
00:47:54.000 That is a dope idea that we would create a place where the bad guy lives because we're so stupid we blew up nukes in a hole that we dug into the ground.
00:48:02.000 That is true.
00:48:02.000 You know, it's always, it's what with the hubris.
00:48:04.000 You know, all the hubris.
00:48:06.000 That's insanity.
00:48:08.000 I can't even believe.
00:48:10.000 Now, is that a case of people just having too much power because it's a military project to do whatever they want to do and the scientists are allowed to say, hey, let's try this, like some wacky scientist?
00:48:22.000 This may not be a popular viewpoint, but I think it's awesome.
00:48:26.000 I think it's optimism.
00:48:28.000 I think it's like...
00:48:30.000 The human race is a little kid who has stumbled across this awesome new toy, and they're like, what can we do with it?
00:48:38.000 Oh, let's try everything!
00:48:39.000 Because in the 50s, we had used technology to end World War II. People were high on technology.
00:48:47.000 They were like, man, technology is going to do everything for us.
00:48:50.000 We're going to eat it.
00:48:51.000 It's going to be food pills.
00:48:52.000 We're not even going to have to eat this bullshit that Earth creates for us.
00:48:56.000 We've come full circle now.
00:48:57.000 It's all about organic and...
00:48:59.000 We're afraid of chemicals and afraid of technology.
00:49:02.000 But, you know, in the 50s, that was like the golden age.
00:49:05.000 It was like, we're going to have atomic pens, you know?
00:49:08.000 I don't know why we need atomic energy in our pens, but damn it, we're going to have it.
00:49:12.000 And I wrote a book called Where's My Jetpack that covers all this stuff.
00:49:16.000 I'm sorry, I keep dropping all these book names.
00:49:18.000 Dude, your books sound fucking awesome.
00:49:20.000 I know, right?
00:49:21.000 I even had a joke about you'll never have weed and jetpacks together at the same time.
00:49:29.000 Inebriated human torpedoes.
00:49:31.000 Yeah, that's what happens.
00:49:32.000 Plus, no one would work.
00:49:34.000 Because why would you show up at work when you could smoke pot and fly?
00:49:38.000 There'd be very few things that you would be having more fun at than smoking pot and flying around in a jetpack.
00:49:45.000 Or would we just be in a future where Louis C.K. stands in front of an audience and says, Why is everybody complaining about jetpacks?
00:49:52.000 You're flying, people!
00:49:54.000 Well, that would definitely happen.
00:49:56.000 Yeah, we would just get so sick of it.
00:49:58.000 Oh, fucking jetpacks.
00:49:59.000 I've always got bugs in my teeth.
00:50:00.000 I'm tired of getting my natural gas to fuel my jetpack.
00:50:04.000 You know, why can't we make a fucking solar jetpack?
00:50:06.000 It's 2037. Yeah, we'll always complain, for sure.
00:50:10.000 But that's why things get better.
00:50:12.000 Exactly, I agree.
00:50:13.000 It's really important.
00:50:14.000 You've got to be able to silence even the most bitter of critics.
00:50:17.000 There's certain things to do.
00:50:19.000 A badass jetpack, I think, would be that.
00:50:21.000 You could actually have a compact, sort of light, backpack-sized thing that you can actually fly around with.
00:50:27.000 So I can give you the lowdown on the jetpack, because I remember all this stuff.
00:50:32.000 So in the 40s, Wendell Moore got a grant from the Army to build a jetpack, and so he's working for Bell Aerosystems.
00:50:39.000 This is a guy who specializes in building small rocket engines, basically.
00:50:43.000 For airplanes that are flying really, really high.
00:50:45.000 There's not a lot of atmosphere, so they can't plane off of it.
00:50:48.000 So in order to change direction, they fire these little rockets.
00:50:52.000 It's like a spacecraft type deal.
00:50:54.000 So he took one of these little rockets and he literally strapped it onto his back and tethered himself to the ground and just turned it on.
00:51:02.000 And he broke his knee.
00:51:04.000 He shattered his knee, actually, immediately.
00:51:06.000 But it worked.
00:51:06.000 So then he literally hired the kid who mowed his lawn, a guy named Bill Suitor, who was like 19, hale and hearty, to test the jetpack.
00:51:18.000 And they had a working model of the jetpack.
00:51:21.000 It's called a rocket belt.
00:51:23.000 It's literally a rocket.
00:51:24.000 And that's the problem, is that you just take the hydrogen peroxide and silver, or whatever, and you put them together.
00:51:30.000 And then it creates an explosion.
00:51:32.000 And the real hard thing is to throttle it so you don't just explode.
00:51:37.000 And then it runs out of juice in like 30 seconds.
00:51:41.000 It's got a terrible fuel to weight ratio.
00:51:43.000 And that's the problem.
00:51:45.000 They put a buzzer in the helmet.
00:51:48.000 Because they showed that Bill Suitor would take these things to the Olympics.
00:51:51.000 They were in that Bond movie.
00:51:53.000 They would do this for demonstrations to make money after the program was scrapped.
00:51:58.000 And they put a buzzer in the helmet that would buzz after 20 seconds, basically saying, you better land right now.
00:52:05.000 You got 10 seconds before you just turn into ballast.
00:52:11.000 I was at WPBI. My friend Willie has a radio show there, and they did this thing in the parking lot where a dude flew a jetpack.
00:52:19.000 And we got a video of it.
00:52:21.000 What did he throw?
00:52:21.000 Were you there for that one?
00:52:23.000 No.
00:52:24.000 It's loud.
00:52:24.000 He flew it for like 30 seconds, I think.
00:52:27.000 That's like all it had.
00:52:29.000 You can only do it for like a minute.
00:52:31.000 I think they got it up to a minute or something.
00:52:33.000 Yeah.
00:52:33.000 It's pretty crazy, but this guy's fucked his legs up.
00:52:36.000 Both of his knees are blown out from doing this.
00:52:39.000 His ACLs are gone, so he's all fucked up from just crashing with this thing.
00:52:43.000 The thing you would have to worry about is anything that has enough power to get you to be flying in the air has enough power to...
00:52:50.000 Speed you into something.
00:52:52.000 People would be smacking into each other.
00:52:55.000 There would be no way they would let you just have a jetpack.
00:52:59.000 You would have to have some sort of a bumper car outer shell.
00:53:04.000 I think that's a great example of applying a current mindset to a technology that's in the future.
00:53:12.000 If they were really going to make it, there's no way to make it safe.
00:53:14.000 It would have to be controlled by a robot, basically.
00:53:18.000 It would have to be like Google cars.
00:53:20.000 Those Google cars.
00:53:21.000 Those Google cars are essentially driving around now.
00:53:25.000 A lot of people in Seattle were talking about how they see them all the time on the freeway just driving around like a guy in the back seat and no one in the front seat.
00:53:32.000 Jesus!
00:53:33.000 What the fuck is that like for that dude in the back seat?
00:53:36.000 This car's hitting the blinkers and shit.
00:53:39.000 It knows when to change lanes.
00:53:41.000 I got into one of those at Carnegie Mellon where I went to school and I got into their autonomous car...
00:53:46.000 And drove with it in the back seat while it was driving, going around like a test track, right?
00:53:52.000 And it was, okay, first of all, the wheel is turning by itself, right?
00:53:56.000 And you're in the back.
00:53:58.000 So that's weird, right?
00:54:00.000 But then it starts to kind of feel like, I mean, it doesn't wreck immediately.
00:54:04.000 And so you sort of start to loosen up a little bit.
00:54:06.000 And you get the feeling that you're like on a ride, you know, like, and you start to trust it.
00:54:11.000 Like, when you're on a roller coaster, no one's driving it.
00:54:14.000 You feel like you're safe.
00:54:16.000 But then what happens is...
00:54:19.000 I started realizing this car, it didn't know I was inside it, right?
00:54:24.000 So the tolerances for a car with nobody inside of it are very different than the tolerances of a car with somebody inside.
00:54:31.000 So it would sort of sense that there was something there that wasn't, you know, and it would kind of swerve.
00:54:39.000 And it would throw you around like really hard.
00:54:42.000 And you realize...
00:54:43.000 It doesn't give a shit.
00:54:45.000 It isn't going to be like doing the mom thing where it throws the hand out in front of you like, oh, sorry, dear.
00:54:50.000 This thing is just driving to the tolerance of the engineering that it's designed for.
00:54:54.000 And we're just being thrown around in the back.
00:54:57.000 Was the car losing traction at any point?
00:55:00.000 It was always safe.
00:55:02.000 It was seeing how fast it could go because they're competitive.
00:55:05.000 The DARPA Grand Challenge, the DARPA Urban Challenge...
00:55:07.000 They had to go as fast as they could, and they didn't have a human in them at all.
00:55:14.000 That's scary.
00:55:15.000 So now Google's different.
00:55:16.000 You know, Google just bought Stanford's team, essentially.
00:55:18.000 There's a guy named Sebastian Thrun who got me into school at CMU, and then he worked on the autonomous cars at Stanford, because Stanford bought him.
00:55:26.000 And then Google bought him. They took Sebastian, and now, I mean, this guy, Sebastian Thrun, he's going to change the world.
00:55:35.000 He's going to introduce autonomous vehicles.
00:55:37.000 It's going to change our cities.
00:55:38.000 It's going to save lives.
00:55:39.000 It's really cool.
00:55:41.000 I'm really proud of him.
00:55:42.000 That's awesome, man.
00:55:43.000 That's amazing.
00:55:45.000 So do you think that's the future, that everyone will have their own personal autonomous vehicle and they'll queue in on the highways and whatever, and you'll be able to read the newspaper and you won't be in control of your little vehicle anymore?
00:55:58.000 Well, the first thing, I'm not sure if people are really doing this where they're in the back outside of a closed course.
00:56:06.000 These cars require a person to be behind the wheel to take the blame if it wrecks.
00:56:12.000 Oh, I see.
00:56:12.000 So you're just sort of like a big, meaty piece of blame machinery.
00:56:19.000 Like, that's the only part you play in this whole thing.
00:56:23.000 That's going to be a job in the future, though, like a fry guy, but you're going to be sitting in the backseat of a car being the blame me.
00:56:28.000 Yeah, I'm a blame guy.
00:56:30.000 In Amped, there's a truck driver, and all he does is he sits in the front seat and he's just there to take the blame.
00:56:37.000 That's his whole job.
00:56:38.000 Wow.
00:56:38.000 And he just drives cross-country 24-7 in these trucks.
00:56:43.000 It will be a job, you know?
00:56:45.000 Yeah, there will be a time where there are these robot machines just moving back and forth across the country carrying goods.
00:56:52.000 Yeah, and you think about it, and it's like, what is it that you can't put into the car?
00:56:56.000 And it's morality, right?
00:57:00.000 Yeah, and he has to make certain decisions, like what do you do if a horse is in front of you?
00:57:05.000 Yeah.
00:57:05.000 Do you hit it?
00:57:06.000 Or do you slam on the brakes and risk losing control of the vehicle?
00:57:11.000 Or do you fuck it?
00:57:12.000 Or do you just slow down, put your arm outside the window and say, hey, hey, what you doing?
00:57:18.000 Yeah, like people have said that.
00:57:20.000 About small animals.
00:57:22.000 If you see a squirrel, you should just hit them.
00:57:24.000 Because if you swerve...
00:57:26.000 People die all the time from swerving to get away from squirrels.
00:57:30.000 But if it's a baby carriage...
00:57:32.000 Yeah, it's a different thing.
00:57:34.000 This is what got Will Smith all uptight in I, Robot, right?
00:57:39.000 Where at the end, the big reveal for his character is he's like...
00:57:42.000 A robot saved my life from a car, but it let the little girl next to me die because the probability of saving her life was lower.
00:57:49.000 And it's like, how are you going to blame the robot for that?
00:57:52.000 That's like saying I put my finger in a pencil sharpener and it cut me.
00:57:56.000 That's what it's designed to do.
00:57:57.000 You're not going to hate the pencil sharpener.
00:58:00.000 Well, you are for a movie, though, because the middle of America is not going to understand.
00:58:05.000 I was bummed when that happened.
00:58:07.000 I was digging that movie, and then I was like, oh, that's weak.
00:58:10.000 The CGI in that movie was dope.
00:58:11.000 Yeah, you know what cracked me up?
00:58:14.000 Amazing work.
00:58:15.000 All those robots go nuts, right?
00:58:16.000 And what do they do?
00:58:17.000 What do they do?
00:58:18.000 They turn red.
00:58:20.000 There's that part where the robot is like, there's an old lady in her house, and she's like, can you make me some toast or something?
00:58:28.000 And the robot turns around and it's glowing bright red.
00:58:30.000 And I'm like, what a thoughtful roboticist.
00:58:34.000 To include a red LED in there, just in case they all turn evil, to indicate to people, oh, your robot has flipped over into evil mode.
00:58:42.000 You're going to want to plan accordingly.
00:58:44.000 Dude, robots are scary.
00:58:46.000 Robots would beat the shit out of you.
00:58:47.000 Those robots in that movie, if there was a whole bunch of them and they had their own thoughts and ideas, I think that's very possible.
00:58:55.000 No, those robots are bullshit.
00:58:56.000 The robot scares the shit out of me.
00:58:58.000 How can you say they're bullshit?
00:58:59.000 Because if you're making a domestic robot that's going to operate in people's homes, right?
00:59:03.000 Right.
00:59:03.000 It's going to be in people's homes.
00:59:05.000 Old people, young people.
00:59:06.000 It's going to be on the street walking dogs.
00:59:08.000 I mean, I thought that was awesome, right?
00:59:09.000 Here's the deal.
00:59:11.000 You're going to want to make a safe robot.
00:59:12.000 It's a safe consumer product, okay?
00:59:14.000 So just think of this even just from the beginning for like one second from the perspective of a person who's actually building a domestic humanoid robot to sell.
00:59:22.000 Okay, first of all, you know that a human being, anything you put in their environment at home, they are going to put their fingers in it, they're going to try to have sex with it, they're going to get in the bathtub with it, they're going to find a way to like kill themselves using this...
00:59:37.000 A toaster is really hard to build. Think about building this domestic robot.
00:59:43.000 I think the first thing that you're gonna do is you're gonna make it incapable of hurting people.
00:59:47.000 You're gonna make it small and light so that it can't walk through a plate glass window.
00:59:52.000 So that if it loses its batteries and goes all George Bush Segway and just falls over, then it won't crush your baby.
01:00:01.000 Is that what happened to George Bush?
01:00:02.000 He was on one of those Segways and the battery died?
01:00:04.000 Yeah, either that or it wasn't even on.
01:00:07.000 I'm not sure what was going through his head, but he tried to mount up and it wasn't happening.
01:00:12.000 Oh, no.
01:00:14.000 Well, those things need to be on.
01:00:15.000 They have a gyro.
01:00:17.000 But anyway, the point I'm trying to make is if you're going to build a real domestic robot that's in someone's house, it's not going to be capable of crushing your fist with its hand or leaping through a glass window and falling three stories and denting the concrete.
01:00:32.000 What the fuck?
01:00:32.000 What a fucking waste of money, right?
01:00:34.000 First of all.
01:00:35.000 Well, what if it's like how cars are today?
01:00:38.000 Like if you buy, say, a brand new Mustang, like a Ford Mustang GT. In the old days, they used to have like 300 horsepower.
01:00:49.000 They weren't that fast.
01:00:51.000 The new ones, just the standard Mustang GT, you can get, they're over 400 horsepower.
01:00:57.000 They're insanely fast.
01:00:58.000 For this, I think it's like $30,000 or $35,000 or something like that.
01:01:03.000 That is insane amounts of performance compared to what existed in the 1960s.
01:01:07.000 If the robots keep getting better and better, people are going to want to have a Ferrari or a Ducati robot.
01:01:15.000 Well, that's true.
01:01:15.000 So people are willing to lay down a premium.
01:01:17.000 And in fact, again, Carl Rinsch made this short film called The Gift, and these people have this snooty-looking butler robot that is a total badass, like totally physically over-engineered for the job he does, and it's an awesome,
01:01:33.000 awesome little film.
01:01:35.000 And then they gave him 47 Ronin.
01:01:37.000 I read this comic book when I was a kid.
01:01:39.000 When I was a kid, I was super into those black and white, really cool comic books, like the Creepy and Eerie series.
01:01:47.000 Have you ever seen the illustrated stuff?
01:01:49.000 Well, The Crow was kind of...
01:01:50.000 Yeah, sort of.
01:01:52.000 Straight up black and white.
01:01:54.000 And this, there was like, all these compilation ones, Creepy and Eerie, were all these different stories.
01:02:00.000 And I remember one of them, from one of those types of comic books, was about a robot that wound up fucking this dude's wife.
01:02:07.000 And it was really heavy-duty, man, because the guy tried to fight the robot, and the robot snapped his arm and broke it in front of him, and the robot had this giant dick.
01:02:16.000 Just dominated him?
01:02:17.000 Yeah, it was really creepy, because it was like this guy couldn't do anything about it, and this robot was taking over and fucking his wife.
01:02:25.000 It was a really fucked up video, or a comic book, rather, because I remember reading it when I was like, God, I couldn't have been more than eight or nine.
01:02:34.000 You know, that's when I was, like, really into comic books.
01:02:36.000 That was my comic book era.
01:02:38.000 And so, this image of this bald, giant robot with this giant cock snapping this guy's arm after he got done fucking his wife was, like, so disturbing.
01:02:51.000 I was like, could you imagine if that's what you have to deal with?
01:02:53.000 Does this robot start coming along and fucking people's wives and snapping dudes' arms and shit?
01:02:59.000 I think that's kind of the underlying fear, right?
01:03:01.000 Of course.
01:03:02.000 They're going to be better than us at everything.
01:03:04.000 And if they're black robots.
01:03:05.000 Yeah, we get some of that mixed in.
01:03:08.000 Yeah, at least it wasn't fucking him.
01:03:12.000 Yeah, well, that was probably next.
01:03:14.000 Yeah, that's like episode two.
01:03:16.000 Fuck him with his own broken arm.
01:03:18.000 Pull that thing off and stuff it up his ass.
01:03:20.000 Literally, it could do that.
01:03:21.000 That's some scary shit, man.
01:03:23.000 Yeah.
01:03:23.000 You know, what's weird for me to think about is that, you know, we know that you can't beat a robot at chess.
01:03:29.000 I mean, like, we can't.
01:03:30.000 I mean, maybe you could if you were, like, a master, right?
01:03:33.000 You can't beat a robot at Jeopardy!
01:03:34.000 It whooped everybody's ass.
01:03:36.000 You know that you can't beat them on the battlefield.
01:03:38.000 Like, robots have dominated in a lot of different areas, like, since we've been kids.
01:03:43.000 Right.
01:03:43.000 And we at least remember a time when you could beat a robot at chess and just be a normal person who likes to play chess every now and then.
01:03:53.000 The next generations, they've never lived in a time when a robot didn't dominate them at a lot of intellectual tasks and increasingly physical tasks.
01:04:03.000 Like, there's gonna be people born that don't remember a time when cars didn't drive themselves better than humans can drive cars.
01:04:10.000 And it's interesting to think like...
01:04:12.000 Well, how about the fact that right now we have people that have never known a life without the Internet?
01:04:18.000 Yeah, or being able to look things up on smartphone.
01:04:20.000 That's crazy.
01:04:20.000 I want to know what I would be like if I was one of these kids.
01:04:23.000 You know what I mean?
01:04:24.000 I think you got in very early, though.
01:04:29.000 Yeah.
01:04:30.000 As far as most people, you were tuned into the internet when you were in your 20s, right?
01:04:36.000 When you were in your 20s?
01:04:37.000 Well, I mean, we had computers growing up.
01:04:39.000 It was more like the internet was probably 18, but I've had computers my whole life.
01:04:44.000 What do you think, coming from Ohio, coming from a place like Columbus, what do you think is the biggest impact that the internet has to a place like that?
01:04:52.000 Because now when you go back there, do you find these kids to be more tuned in than you were when you were their age?
01:04:58.000 It's definitely communication.
01:05:00.000 Back in the day, the only way I would know anybody from another school or whatever is if I met them at a game or if I went to a roller skating rink and they were in the men's bathroom and they told me to come in a stall.
01:05:12.000 I'm just kidding.
01:05:13.000 It's got gays.
01:05:14.000 Fuck.
01:05:14.000 What happened there?
01:05:14.000 Are you okay?
01:05:16.000 Do you need to call somebody?
01:05:17.000 No, but now I think it's like, look at Death Squad Ohio, when we were in Death Squad.
01:05:20.000 There was a huge group of people that are all friends now.
01:05:24.000 Good friends.
01:05:25.000 They're staying at each other's houses.
01:05:27.000 They're flying in and getting each other at the airport, and they're a gang now.
01:05:30.000 Right, and that's all because of the internet.
01:05:33.000 The internet connecting all these different things in a way that's never happened before allows all these areas that used to have no culture coming into them.
01:05:43.000 It allows them to experience an incredible variety of different things right out of their fucking computer.
01:05:48.000 Yeah, but on specific details, like really specific things that you're into where you wouldn't be able to get a critical mass of people that are into it if you had to be co-located.
01:05:57.000 Yeah.
01:05:57.000 If you're a Bigfoot hunter and you live in any fucking town, try finding a fellow Bigfoot hunter.
01:06:04.000 That shit is hard, man.
01:06:06.000 You can't order a cup of coffee and then, so what do you guys think about Bigfoot?
01:06:10.000 No one's going to talk to you about Bigfoot.
01:06:11.000 Now you go on the internet for five minutes, you get your own show.
01:06:13.000 A lot of foot fetish guys trying to come to your house.
01:06:16.000 Yeah, if you go, yeah, right?
01:06:18.000 I've had guys offer to massage my feet.
01:06:21.000 Are you a Bigfoot lover too?
01:06:23.000 Yeah.
01:06:25.000 Wait a minute.
01:06:25.000 No, no, no.
01:06:26.000 Sasquatch.
01:06:27.000 Sasquatch.
01:06:28.000 That's what they call me, honey.
01:06:31.000 Yeah, it's really tough to be a Bigfoot person out there in the real world.
01:06:36.000 But on the internet, it's easy as fuck.
01:06:38.000 You just join a forum.
01:06:40.000 There's a new Bigfoot video every couple weeks nowadays.
01:06:43.000 On the other end of that spectrum, though...
01:06:45.000 Don't you think it also sort of cheapens your relationships?
01:06:48.000 How's that?
01:06:49.000 Well, so you can't have, like...
01:06:51.000 I don't know about other people.
01:06:54.000 Maybe I'm just a dweeb, but I can maintain about two good friends.
01:06:59.000 Like, real buddies.
01:07:00.000 And then, like...
01:07:02.000 Anything beyond that is really, there's a very narrow stripe of like acquaintances and then it's all, you know, you have all this time that you spend interacting with people online and it's a thousand people.
01:07:13.000 Yeah.
01:07:13.000 And so, you know, and not to assign any value judgment to it.
01:07:17.000 It's a different type of interaction.
01:07:21.000 Right.
01:07:37.000 It's interesting.
01:07:38.000 I think it's also something that we have to get used to managing.
01:07:43.000 It's managing the amount of information that comes into your life and managing the amount of people that you're interacting with.
01:07:50.000 Just any sort of social interaction through a message board or Facebook or Twitter, you could get so absorbed in communicating with all these different people that you will never get anything done.
01:08:02.000 It will eat up all your time.
01:08:04.000 Yeah, and you'll be the guy looking at his damn vibrating monster cell phone all the time while you're out and your friends are just like, I hate this guy.
01:08:12.000 This immersion, this human immersion to technology, disturbs me the most when I see people that are really, really addicted to role-playing games.
01:08:19.000 That's where I see, like, wow, you could really get stuck in a black hole and lose your fucking life.
01:08:26.000 Have you ever met anybody that's known anybody?
01:08:28.000 Of course.
01:08:28.000 I have friends that are super into role-playing games, but also board games people get into, but then also video games make role-playing games so much easier, especially the massively online multiplayer stuff.
01:08:40.000 A lot of times that seems to me to be just simple escape.
01:08:44.000 Your life sucks.
01:08:46.000 It doesn't suck, but it's just more fun.
01:08:49.000 It's way cooler.
01:08:52.000 Magic and shit, going out and getting gold and bitches.
01:08:55.000 In real life, you're smelling cat piss in your house.
01:08:59.000 The fuck is that smell?
01:09:01.000 I hope not.
01:09:01.000 The fuck is that cat piss smell?
01:09:04.000 It's such an ordeal.
01:09:05.000 Cat toilets.
01:09:08.000 There was horrible shit online about parents who were neglecting their children because they were just completely absorbed in these online games.
01:09:15.000 It goes the other way.
01:09:16.000 My wife's a child psychologist.
01:09:18.000 She had a kid who was addicted to, you know, one of the online games.
01:09:22.000 And in order to deal with him, she had to find his guild master, who is some...
01:09:27.000 This is a kid who's like 13 or 14. She found his guild master, who's like a 30-year-old dude, you know, like in Eugene or something.
01:09:35.000 And he comes up to Portland.
01:09:36.000 For a meeting and says, hey man, like, you know, you've got a problem and I'm going to have to limit like the amount of raids you can go on.
01:09:43.000 And she had to, but she had to make that human element real, you know, because this kid had a real relationship with his guild master.
01:09:50.000 They hung out for hours and hours and hours, you know, they were tight, but they never met.
01:09:55.000 You know, and so they weren't able to, you know, she had to bring him in to look out for this kid.
01:09:59.000 Wow.
01:10:00.000 Did you see the lines for the new Call of Duty game last night?
01:10:03.000 No.
01:10:03.000 Oh man.
01:10:03.000 Jesus Christ.
01:10:04.000 There was people there at one in the afternoon outside of a GameStop just like sitting in these lawn chairs and I'm like, what?
01:10:10.000 Who are these people?
01:10:11.000 Savages?
01:10:12.000 Yeah.
01:10:12.000 And then last night I was on Sunset, four blocks of just straight mobs of people waiting in line for a video game.
01:10:19.000 I'm like, what the fuck?
01:10:20.000 I don't get that because I woke up this morning and it was on my front porch.
01:10:23.000 Thanks, Amazon.
01:10:24.000 You guys are like, what?
01:10:25.000 Oh, you got an extra seven hours or something?
01:10:28.000 Well, yeah.
01:10:29.000 You're not an addict, though.
01:10:31.000 You understand.
01:10:31.000 Well, and that's also the community, right?
01:10:32.000 Yeah.
01:10:34.000 Yeah, that's true.
01:10:35.000 Yeah, totally.
01:10:36.000 Look, that's a fucking fun game.
01:10:39.000 That game scares the shit out of me.
01:10:41.000 Oh, it's great.
01:10:42.000 I've watched Bruce Buffer play it.
01:10:43.000 He plays it at the airport.
01:10:44.000 And I was like, you better keep that fucking game away from me.
01:10:46.000 I'll lose my life.
01:10:47.000 I'll have no life.
01:10:49.000 There was a Forbes article about a guy named Peter Singer who's a...
01:10:53.000 He's basically a consultant for anybody, the CIA, all the military infrastructure, and they had him design all the robotic weaponry that's in the new Call of Duty.
01:11:05.000 And the shit is all super legit and really realistic.
01:11:09.000 Jesus, those games are so immersive.
01:11:11.000 You've got to have 12-year-old reflexes.
01:11:14.000 It's like I'm not going to go out and try to start a career as a gymnast right now.
01:11:18.000 I'm too old.
01:11:19.000 That's the way I feel about Call of Duty.
01:11:21.000 I can't.
01:11:22.000 Oh, you would just have to get absorbed.
01:11:24.000 I was really terrible when I first started out at Quake.
01:11:29.000 But I got really good after a while.
01:11:31.000 That was my addictive item.
01:11:32.000 Hey, Castle Wolfenstein, man.
01:11:34.000 Dude, that was old school.
01:11:35.000 That was like the first one, right?
01:11:37.000 Wasn't that the first?
01:11:39.000 Yeah, and you couldn't jump or anything.
01:11:41.000 It was flat.
01:11:43.000 Well, there was Duke Nukem.
01:11:45.000 I think Castle Wolfenstein was the first, and then came Doom.
01:11:48.000 Doom, right, right, right.
01:11:50.000 And Doom, they named Doom from that line in The Color of Money with Tom Cruise.
01:11:55.000 I remember Tom Cruise went out to play this guy and Tom Cruise is like the best player in the country.
01:12:00.000 And he had this crazy pool cue.
01:12:01.000 And the guy said, what you got in the case, boy?
01:12:03.000 And he goes in here.
01:12:05.000 And he opens it up and he goes, doom.
01:12:08.000 And the idea was that their game was so badass that that's what they were going to do to the whole video game industry.
01:12:13.000 We were going to drop some doom on them.
01:12:16.000 They did.
01:12:17.000 And it was, yeah.
01:12:17.000 The 90s, it was all articles about those guys buying shitloads of Ferraris.
01:12:21.000 I know Carmack, man.
01:12:23.000 I've been over his studios a few times.
01:12:26.000 Those guys are really cool guys.
01:12:29.000 And Carmack is like a real rocket scientist, like in his spare time.
01:12:33.000 That's what I love is that these guys are all really, really smart.
01:12:35.000 Dude.
01:12:36.000 He's beyond smart.
01:12:37.000 He's one of those guys where I talk to him.
01:12:39.000 And I'm like, I'm not convinced we're the same species.
01:12:42.000 I'm not convinced.
01:12:43.000 He's so past, just computer-wise, what he's doing all day, the computations that he's making, the way he's redesigning these first-person shooters.
01:12:55.000 He's like a super, super fucking genius.
01:12:58.000 Yeah.
01:12:58.000 Having lived in Seattle, you meet these guys, right?
01:13:03.000 The ones that have started companies and are mad intelligent.
01:13:07.000 And for me, they're always about five or ten years older than me.
01:13:13.000 And I'm always pissed.
01:13:15.000 I'm always like, damn!
01:13:17.000 If I was 10 years older, I would have been on that scene.
01:13:20.000 I would have had a chance to go nuts with it.
01:13:22.000 I have a buddy who works for Bungie.
01:13:25.000 He makes Halo.
01:13:27.000 It's a sweet job.
01:13:28.000 But when he was in college, he wrote a textbook on how to program graphics.
01:13:34.000 And you know what that got him?
01:13:36.000 A job at Bungie.
01:13:37.000 But it didn't get him a Ferrari garage.
01:13:42.000 It didn't get him...
01:13:43.000 Every generation has these smart guys and girls.
01:13:48.000 And when they fall in, it's all about how much the field's been blown open.
01:13:53.000 There's a lot of stuff, like low-hanging fruit waiting on you.
01:13:57.000 Of course, he'd probably bust my ass if he heard me say this.
01:13:59.000 He'd be like, I earned it!
01:14:02.000 Don't trivialize my accomplishments.
01:14:03.000 He certainly did innovate in a big way in the first-person shooter world.
01:14:08.000 It was him, and there was that other guy that was with him with Doom, and then that guy left when he made Quake, who had long hair, had fabulous hair.
01:14:18.000 He was a very controversial video game designer himself, and those were the original id guys.
01:14:24.000 But they went on to make Quake, like Quake 2, Quake 3. Each one of them got better and more intense with the graphics. The amount of hours of entertainment they provided with those video games.
01:14:37.000 It's crazy to think about.
01:14:38.000 It's staggering!
01:14:38.000 And this Call of Duty is on...
01:14:40.000 That's the next level shit.
01:14:42.000 That's the highest level.
01:14:43.000 What I love is it puts Hollywood to shame.
01:14:46.000 Movies don't make this much money.
01:14:48.000 Movies pale in comparison now to video games.
01:14:51.000 They're way more exciting.
01:14:52.000 You get so locked in on one of those fucking games, and you're creeping around and shooting at people and shit.
01:14:58.000 I don't understand why movies aren't loss leaders to video games now.
01:15:02.000 It's always like, yeah, we're making a movie, and we're going to have a video game.
01:15:06.000 It's like, dude, the money is in video games.
01:15:08.000 Make a movie just as a PR campaign to get people into your video game, because that's...
01:15:14.000 Where the cash money is at.
01:15:15.000 Yeah, not a bad idea.
01:15:17.000 Especially if you develop one of these Call of Duty type franchises where everyone plays it.
01:15:24.000 Ice-T is always on TV playing.
01:15:25.000 Doesn't he play that?
01:15:26.000 He plays Gears of War too, right?
01:15:28.000 Yeah.
01:15:28.000 You know what I did?
01:15:29.000 Gears of War is another one.
01:16:30.000 I bought me a Nintendo, dude.
01:15:31.000 I went to the thrift store and I bought an old TV and an old VCR. And you know what I found?
01:15:38.000 Watching old school movies on a VCR and on a little bitty 4x3 TV, the graphics weren't as good, right?
01:15:49.000 They didn't have CGI and all that stuff.
01:15:50.000 I'm thinking Lost Boys and Terminator 2 and stuff.
01:15:55.000 Pretty good graphics, but not the...
01:15:56.000 You watch it on that little screen, it's convincing.
01:16:00.000 Really?
01:16:00.000 It's low resolution.
01:16:01.000 There's a lot of stuff that just goes right by.
01:16:05.000 Well, have you ever seen an old movie that's been brought to Blu-ray?
01:16:11.000 Yeah, I think, but I can't.
01:16:13.000 Aliens, the second Alien movie, was brought to Blu-ray.
01:16:17.000 And when you watch it on a modern television with a Blu-ray player, it's so hokey-looking.
01:16:24.000 Yeah.
01:16:24.000 The fucking background.
01:16:25.000 There's a scene where there's a jet that's parked there, or one of those, you know, whatever.
01:16:30.000 Dropships.
01:16:31.000 Spaceships.
01:16:32.000 Five bucks.
01:16:32.000 And then the background is supposed to be like this whole warehouse area.
01:16:35.000 It's so fucking fake.
01:16:37.000 I mean, it's so obviously fake.
01:16:40.000 It's like a painted set.
01:16:40.000 It looks terrible.
01:16:42.000 I mean, it looks so bad.
01:16:43.000 It's so hokey.
01:16:44.000 You're like, oh my god!
01:16:45.000 You're not supposed to have this at this resolution.
01:16:47.000 When the director made it, he made it for a certain resolution.
01:16:51.000 When you deal with special effects, you've got to respect that.
01:16:53.000 You can't pump it up to Blu-ray, you greedy bitches.
01:16:56.000 You know why I bought the Nintendo?
01:16:57.000 I have a daughter.
01:16:58.000 She's two and a half, right?
01:16:59.000 And she is not allowed in the room when I'm playing video games because it's intense.
01:17:04.000 I'm playing Skyrim and it's very realistic.
01:17:06.000 What is Skyrim?
01:17:07.000 What is Skyrim?
01:17:08.000 I'm sorry.
01:17:09.000 Way of life.
01:17:09.000 I had to run away from the video games, man.
01:17:11.000 I'm a fellow junkie.
01:17:13.000 It's in the Oblivion series.
01:17:14.000 It's one of those role-playing games, but it's single player.
01:17:17.000 So you can finish it and it can leave your life.
01:17:20.000 But it's incredibly realistic.
01:17:22.000 Oh, so you're playing against the computer, you're not playing online.
01:17:24.000 You're playing in a total immersive world.
01:17:25.000 I mean, I could talk about Skyrim forever.
01:17:27.000 And you're running it through your PC or your computer.
01:17:30.000 Yeah, I actually bought a new gaming PC to run this game.
01:17:33.000 Oh my god.
01:17:33.000 For a while, it was more fun to actually download all the mods to like...
01:17:38.000 Hype up all the texture mapping, all the water, the fire, the blood, the atmosphere, the weather.
01:17:46.000 And what are you doing?
01:17:46.000 Are you fighting people in this game?
01:17:48.000 Yeah, I mean, you know, it's your typical Lord of the Rings rip-off.
01:17:51.000 Like, you're in a world where...
01:17:53.000 You know, it's like that medieval stuff.
01:17:55.000 There's magic and all that.
01:17:56.000 It's really, really fun, but it's intense.
01:17:58.000 And if I was playing it, my two-year-old, no way is she allowed in the room, right?
01:18:02.000 And so one day she says to me something along the lines of, like, you know, video games are scary.
01:18:07.000 And I'm like, that's bullshit, man.
01:18:10.000 Video games are...
01:18:11.000 Super Mario Brothers is not scary, right?
01:18:13.000 So that's why I bought the Nintendo.
01:18:15.000 I showed her this.
01:18:16.000 I was like, this is a video game, honey.
01:18:18.000 Boop, boop, boop, boop, boop, boop, boop.
01:18:20.000 Just so that she's not terrified by what video games have sort of become.
01:18:25.000 Especially if you're watching something that's geared towards men, or young men.
01:18:32.000 I mean, obviously, I guess there's a few women that want to go run around playing Call of Duty.
01:18:36.000 It must be.
01:18:37.000 I mean, it's so popular.
01:18:38.000 It must be women playing it as well, right?
01:18:39.000 There's tons of women.
01:18:40.000 But are there games that are geared specifically towards women?
01:18:44.000 Someone's going to be waiting outside to kick you in the balls if you keep going down this road of questioning whether women are playing these games.
01:18:47.000 Because they are, in a big way, right?
01:18:54.000 I'm just guessing.
01:18:55.000 I don't know the community.
01:18:56.000 I know there were a lot of women Quake players, though. There were a lot of girls who were really good, who would play one-on-one duels with dudes and fuck them up, and it was embarrassing as shit, because you'd just get jacked in Quake by a chick.
01:19:07.000 Have you seen the zombies in the new Call of Duty?
01:19:10.000 They have a zombie mode like and it's crazy as fuck.
01:19:13.000 Can you pull up a video?
01:19:14.000 Yeah. What was that?
01:19:19.000 So here, this just shows you a little bit of the zombies, from Machinima.
01:19:25.000 This is like the trailer for it.
01:19:26.000 We're looking at the trailer right now.
01:19:28.000 I'm getting really scared you guys.
01:19:30.000 Wow.
01:19:31.000 Graphics are amazing.
01:19:34.000 It's just like, what they can do now in a vid- just a- the CGI opening for a video game is incredible.
01:19:39.000 This is this, like, really dingy, post-apocalyptic bus scenario.
01:19:45.000 It's probably generated by the game engine, you know?
01:19:47.000 Yeah.
01:19:48.000 Oh, wow.
01:19:51.000 Oh, this is great.
01:19:52.000 Are those soldiers?
01:19:54.000 Yeah, and here's- Is there anything more fun than shooting zombies?
01:20:04.000 That is a negatory.
01:20:06.000 Running them over, yeah, right?
01:20:08.000 I had a zombie dream last night, man.
01:20:11.000 The Walking Dead gets me dreaming about zombies all the damn time now.
01:20:15.000 And they just come at you in hordes.
01:20:16.000 And you're pretty much just like constantly having to try to get the fuck away from these zombies.
01:20:21.000 Isn't it funny that one of the most fearsome things that we can conjure up is a human being that's dead and wants to get you?
01:20:27.000 Yeah.
01:20:29.000 Robots are kind of similar, right?
01:20:30.000 I mean, we're afraid of the human form, man.
01:20:32.000 Well, we're afraid of the human form in a diseased manner too, whether it's psychologically diseased or whether it's that 28 Days Later, that epidemic, that rage shit that got out.
01:20:43.000 That was one of the scariest movies ever.
01:20:45.000 I got a whole horror movie theory about this.
01:20:47.000 My theory is that the reason that like a werewolf is scarier than like a wolf is because the werewolf, because it has human traits, has the capability of being evil, right?
01:21:01.000 Because a wolf or like an animal or just nature is not good or evil.
01:21:06.000 Like you don't blame the wolf for killing somebody.
01:21:08.000 It's like that's what a wolf does.
01:21:09.000 But as soon as you inject some human into it, then you have something that's capable of just really being evil and just doing something for evil's sake.
01:21:19.000 Yeah, I think that's a really accurate representation.
01:21:22.000 If you stop and think about it, it's very rare that a wolf would...
01:21:25.000 Actually, I have a friend, now that I think about it, I have a friend who had these wolves.
01:21:31.000 They were his pets, and they were like seven-eighths timber wolf, and they had like a little bit of husky or something else in them, and they were essentially wolves, man.
01:21:40.000 And he didn't really have good control over these things.
01:21:43.000 And they got out and they killed a bunch of the neighbor's farm animals.
01:21:46.000 And they didn't just kill one and eat it.
01:21:48.000 They killed them for pleasure.
01:21:50.000 Most of the time, I think wolves, when they're killing, they're killing out of starvation.
01:21:56.000 They're killing because they want to eat.
01:21:57.000 But I think they can kill a lot for pleasure, too.
01:22:03.000 Wolves actually kill for pleasure.
01:22:06.000 If they were fattened up, too, they might just fuck with you and jack you.
01:22:12.000 But the idea of a human...
01:22:14.000 Humans seem to be a lot more capable of evil than wolves.
01:22:17.000 Well, again, it's the competition thing, you know?
01:22:19.000 It's the complexities of the possibilities of emotions that can be conjured up raising a child and doing a shitty job of doing it and putting the kid in horrible situations.
01:22:31.000 And then all of a sudden, what that person is at the most evil...
01:22:36.000 Merges with a wolf.
01:22:37.000 You know, the worst characteristics ever of a human being merged with a wolf, that's what a werewolf would be: a just horrible, psychotic killer animal.
01:22:46.000 It's about knowing what you're doing, right? Like, Hannibal Lecter freaks me the hell out because he's so aware of exactly what he's inflicting.
01:23:04.000 And what amplifies whatever evil act he's doing is the level of satisfaction that he's getting out of it.
01:23:07.000 And that really, you know, makes it worse.
01:23:09.000 I don't know.
01:23:10.000 Yeah, it does.
01:23:11.000 It was always disturbing how, like, well-read he was and how aware of how fucked up he was, but he didn't care.
01:23:16.000 He's specific about that stuff.
01:23:17.000 Yeah, he was one of the most terrifying guys ever.
01:23:21.000 Yeah, and there was two versions of him, too.
01:23:26.000 Did you ever see that first movie?
01:23:28.000 God, it was Dragon's Blood or something like that.
01:23:31.000 There was a prequel to that.
01:23:34.000 Right, it came out later, but it was...
01:23:36.000 I think I had enough when I saw someone eating their own brain.
01:23:39.000 I was like, I'm never going to forget this.
01:23:41.000 It's going to haunt me for the rest of my life.
01:23:43.000 The Red Dragon, I think that's what it was.
01:23:46.000 And they made that a movie later.
01:23:48.000 They eventually...
01:23:50.000 Yeah, he had that tattoo on his back.
01:23:52.000 Oh, that might have been the one that freaked me out.
01:23:53.000 That's where he's torturing the old rich guy.
01:23:59.000 I think so.
01:24:00.000 God, I can't remember.
01:24:01.000 I think it was Gary Oldman.
01:24:02.000 I can't remember.
01:24:04.000 I don't remember who played Hannibal Lecter either, but he was very subtle.
01:24:08.000 It was a very different take.
01:24:09.000 Oh, it was a different actor?
01:24:11.000 Yeah, it wasn't Hopkins, no.
01:24:13.000 It was some other guy.
01:24:17.000 But, yeah, that idea of the genius that wants to kill you, that doesn't have any remorse whatsoever and is doing it because it's the only thing that gives them any sort of a feeling.
01:24:28.000 Yeah.
01:24:28.000 That's a terrifying thought.
01:24:30.000 That's way scarier than robots, right?
01:24:32.000 I think it's why people are the scariest thing, you know?
01:24:35.000 So we have to worry about cyborgs, not robots.
01:24:37.000 Robots maybe will save us from the cyborgs.
01:24:41.000 It's all on the table at the end.
01:24:42.000 Can you imagine?
01:24:44.000 What about the first country that commits their army to becoming robot cyborgs?
01:24:52.000 I mean, this is like the question of whether we should implant ourselves and use neural implants to do things. Because you think to yourself, well, okay, we have all this bioethics, and we've decided that it's not ethical for people to do this, because everyone would have to get one in order to compete, right?
01:25:11.000 So we're going to outlaw them.
01:25:13.000 And then it's like, oh, in China, they're mandated by the state.
01:25:18.000 And they're getting real productive over there in China.
01:25:21.000 And it's like, oh, you know, you think about sort of you have a macrocosm view and then you have like a microcosm.
01:25:28.000 The microcosm is: everybody in my kid's classroom went to the doctor and got diagnosed with ADHD, and now all these kids have a neural implant, and they're all way smarter than my kid.
01:25:38.000 And the macrocosm is like that but applied to a whole nation that we're competing with.
01:25:44.000 Dude, that would freak you out if you were the only kid that had a natural brain in class and all the other kids had chips in their heads and you couldn't fuck with anything they were saying.
01:25:51.000 Dude, there's a super sweet...
01:25:52.000 You'd be like, what?
01:25:53.000 There's a really sweet Outer Limits episode that's all about that, where...
01:25:56.000 There's this one kid whose brain doesn't allow him to get this, so he's basically not on all the internet and social networks.
01:26:02.000 And then all the other kids get their brains fried by a virus, and he's the only one that's cool.
01:26:08.000 He reads books and stuff, and he saves everybody.
01:26:11.000 I like the Outer Limits.
01:26:12.000 The Outer Limits is an awesome fucking show.
01:26:15.000 I love the idea of the possibilities that science fiction presents.
01:26:18.000 I just love that there's so many different...
01:26:20.000 When you stop and think, especially when we were talking about Lost in Space, when they really had no idea what the future was going to be like.
01:26:27.000 And you get to see what their vision of it was.
01:26:30.000 It's so fascinating to me.
01:26:31.000 Almost more fascinating than when we...
01:26:33.000 Like Prometheus, to me, what we're going to be capable of.
01:26:36.000 It's like, I see what you're doing.
01:26:37.000 A little bit of Microsoft Touch in there.
01:26:39.000 Yeah.
01:26:39.000 You know, she created a whole different environment.
01:26:43.000 Now she thinks she's in the desert.
01:26:44.000 I haven't seen it, by the way.
01:26:46.000 I know I should see it.
01:26:47.000 I got a two-year-old.
01:26:48.000 I can't see anything.
01:26:49.000 You know what, man?
01:26:50.000 It's hard to live up to the Alien series.
01:26:55.000 And it doesn't really.
01:26:56.000 It stumbles a little bit.
01:26:58.000 But still, I've seen it more than once.
01:27:00.000 I've seen it again in a hotel room.
01:27:01.000 I was bored.
01:27:02.000 So I watched it again.
01:27:03.000 It's a badass movie.
01:27:04.000 I mean, visually, it's stunning.
01:27:06.000 Visually alone, it's worth watching because there's some incredible scenes in it, just visually.
01:27:10.000 But the future, like the technology they present, doesn't seem much different than what we're capable of right now.
01:27:16.000 That stuff really does, you know, influence actual science.
01:27:20.000 Like, you know, people will take clips from these movies and everything and show them during their presentations and say, this is like what we're doing.
01:27:27.000 Wow.
01:27:28.000 Specifically, Minority Report, where he's doing that.
01:27:31.000 That was huge for human-computer interaction and HCI people.
01:27:35.000 Suddenly, that clip was showing up everywhere at conferences.
01:27:40.000 People were like, this is what we're doing.
01:27:42.000 Do you think we're going to get away from the keyboard interface in the near future?
01:27:47.000 There's so much research that's going into natural human interfaces.
01:27:51.000 We have all this machinery upstairs that's...
01:27:54.000 Got us geared to interacting with other human beings the way we interact.
01:27:58.000 Face recognition, speech recognition, gesture recognition, emotion recognition.
01:28:02.000 You know what I'm really impressed with is the speech recognition.
01:28:05.000 The iPhone has a native app that comes with it called Notes.
01:28:10.000 And this Notes, when you go to enter into a new note, it has this little...
01:28:15.000 This little button that you can press that looks like an old-school microphone, and you can just press it, and you talk into it, and you go, Daniel H. Wilson is a bad motherfucker.
01:28:28.000 So that's it, and boom, it reads it.
01:28:31.000 Dude, mine doesn't do that.
01:28:32.000 I have notes.
01:28:33.000 I gotta figure out where that button's at.
01:28:35.000 It's right at the bottom of the keyboard to the left of the space bar.
01:28:38.000 Do you see how awesome that is?
01:28:40.000 Yeah.
01:28:40.000 I mean, I just said it for the folks at home who are listening to this.
01:28:43.000 I just said that into my phone and it's instantly printed up exactly what I said.
01:28:48.000 Daniel H. Wilson is a bad motherfucker.
01:28:50.000 It is even spelled bad motherfucker correctly.
01:28:54.000 Have you done this yet?
01:28:57.000 Text Paris Hilton, I want to eat her asshole.
01:29:00.000 Hey man, that's just rude.
01:29:03.000 What if Paris Hilton was listening to this?
01:29:05.000 Here's your message to Paris Hilton.
01:29:08.000 Ready to send it?
01:29:09.000 And it says, I want to eat your asshole.
01:29:11.000 And I'll send it.
01:29:12.000 Okay, I'll send it.
01:29:16.000 Paris Hilton is in your phone?
01:29:18.000 Yeah.
01:29:19.000 Or does Siri just know?
01:29:21.000 Siri just looked it up.
01:29:22.000 You can't see it.
01:29:25.000 Well, what you just did was very rude, young man.
01:29:27.000 If you really did, send her that.
01:29:28.000 You were abusing technology.
01:29:30.000 You were part of the problem.
01:29:31.000 Whatever Siri did, I didn't do it.
01:29:32.000 Even if she wanted her asshole licked, she probably doesn't want you bringing it up like that.
01:29:37.000 It's like, Jesus, can we talk about it in private after a couple cocktails?
01:29:42.000 Yeah, I like it, okay?
01:29:44.000 I like it when I'm clean, but I don't need it to be on a podcast, bitch.
01:29:51.000 You're rude, dude.
01:29:53.000 Every morning at 7.30 before she leaves the house.
01:29:55.000 Your book, Robopocalypse, is a New York Times bestselling book.
01:30:00.000 That's a pretty dope thing attached to your name.
01:30:03.000 New York Times bestselling author.
01:30:05.000 I know, man.
01:30:06.000 I love it.
01:30:06.000 I got it on my Twitter handle.
01:30:08.000 You're legit.
01:30:08.000 I'm going to eventually get over it, but for now, I'm still really into it.
01:30:11.000 No, why would you get over that?
01:30:12.000 You're legit.
01:30:13.000 That's like what everybody wants.
01:30:15.000 That makes you like a black belt in writing.
01:30:18.000 I think it really is important because when you're writing for a living, it's really hard to convey to people whether you're doing all right or not.
01:30:27.000 Because people, you'll be at a party and no one knows who the hell you are.
01:30:30.000 You're a writer, right?
01:30:31.000 Why would they?
01:30:32.000 It doesn't really matter how successful you get.
01:30:34.000 They're not going to know who you are.
01:30:35.000 And you're at a party and someone says, what do you do?
01:30:37.000 And you're like, I write science fiction for a living.
01:30:39.000 And they're like, well, good luck with that, man.
01:30:42.000 I hope that works out.
01:30:43.000 And it's like, well, no, I'm doing okay.
01:30:46.000 We can talk about it.
01:30:48.000 It's not embarrassing.
01:30:49.000 Oh, that's funny.
01:30:50.000 So they automatically assume that you were failed?
01:30:52.000 Yeah, I think so.
01:30:54.000 Is it because you're not Stephen King?
01:30:55.000 It's because it's really hard to make a living writing science fiction.
01:30:58.000 I mean, it's actually scary.
01:30:59.000 I mean, the understanding is if you say, I write science fiction for a living, the understanding is you do not get paid.
01:31:05.000 You have another job.
01:31:06.000 That's interesting, because I would think that there would be a big market for that.
01:31:10.000 There is, right?
01:31:11.000 Yeah, it seems to me there's a lot of people that are science fiction buffs.
01:31:14.000 Science fiction's killing it, but yet it's still sort of got all this bad...
01:31:20.000 Image problems from like the pulp era, you know?
01:31:23.000 So, like, Doubleday, my publisher, they promoted the book as a techno-thriller.
01:31:29.000 Not science fiction.
01:31:30.000 It's a techno-thriller because there's like a science fiction ghetto.
01:31:34.000 There's like a science fiction sub-genre.
01:31:37.000 Distinction.
01:31:37.000 Yeah.
01:31:38.000 And if you fall into that, then you go into the science fiction.
01:31:40.000 You don't go into the mainstream popular.
01:31:43.000 What?
01:31:43.000 You don't go on the front table.
01:31:45.000 You go into the science fiction area.
01:31:46.000 That's so weird.
01:31:47.000 And you know what pisses me off?
01:31:49.000 It's like science fiction is a bad word.
01:31:53.000 The most popular book is Fifty Shades of Grey.
01:31:57.000 You're going to be embarrassed about science fiction?
01:31:59.000 What do you think that's about, man, as a writer?
01:32:02.000 You see this poorly written Bondo, Bondage book, and it's doing really well.
01:32:08.000 Bondo?
01:32:08.000 Not Bondo, that's the shit for your cars.
01:32:11.000 Bondage, sadomasochistic, weird sexual thing.
01:32:15.000 There's a puddle on the floor.
01:32:16.000 It's about a really handsome guy who likes to hurt girls.
01:32:18.000 Doesn't he do it on purpose or something like that?
01:32:21.000 I haven't read it.
01:32:21.000 I heard he's gone.
01:32:22.000 You read it, dude.
01:32:23.000 I saw the look in your eyes.
01:32:24.000 He was trying to lie to me.
01:32:25.000 I've seen a lot of people reading it in public.
01:32:28.000 That's so strange.
01:32:29.000 That's like a porno.
01:32:30.000 Why not just have a penthouse in public?
01:32:32.000 I know, but I kind of like it.
01:32:34.000 You like it?
01:32:34.000 When you see a cute girl reading it in public, you're sort of like, you dirty, dirty, dirty, dirty bitch.
01:32:40.000 Yeah, you're right.
01:32:41.000 I didn't think about it that way.
01:32:42.000 Yeah, maybe it's girls like putting out a signal like they're down.
01:32:45.000 I had a friend who dated this girl and they were normal and he said she got this book, read the book, and then the next time they were together she asked him to spit in her mouth.
01:32:56.000 And he was like, what the fuck?
01:32:58.000 She goes, do it.
01:32:58.000 Spit my mouth.
01:32:59.000 And he was like, what?
01:33:00.000 What the fuck are you doing?
01:33:01.000 I've been reading this book, Fifty Shades of Grey.
01:33:03.000 And I'm just getting really excited about it.
01:33:05.000 And he was like, what the fuck are you reading?
01:33:06.000 So he started reading what she was reading and trying to figure out why she wanted him to spit in her mouth.
01:33:12.000 I was at a...
01:33:13.000 I went to Oricon, the science fiction convention in Portland, right?
01:33:19.000 So we have a hotel room for the sci-fi convention.
01:33:23.000 And my friend rents porno, right?
01:33:26.000 On the TV. And at the beginning, and I've never seen this before, right before they played the porn, there's a message that says: these are unrealistic scenarios.
01:33:38.000 Do not try this.
01:33:40.000 These people are, like, trained.
01:33:42.000 It was, like, the whole, like, do not try this at home thing.
01:33:44.000 And then, of course, it's just typical pornography, right?
01:33:47.000 Very large objects going in very small places.
01:33:50.000 And it's true, like, I don't know if that's just a hotel thing or if that's just, like, on all porn.
01:33:56.000 But you can imagine the hotels are, like...
01:33:58.000 We really need to have a little disclaimer at the beginning because people are hurting each other.
01:34:03.000 They're getting rowdy.
01:34:06.000 We're spending a fortune on sheets.
01:34:08.000 If you talk to anybody that's in the medical profession, that's done any work in the emergency room, they would tell you about all the various things that people stuffed in their body and then got stuck up there.
01:34:18.000 And it's crazy, man.
01:34:20.000 I had a friend who had to pull a light bulb out of this dude's ass.
01:34:23.000 One of those twisty light bulbs.
01:34:25.000 You know those slightly thicker glass because they're kind of twisty?
01:34:28.000 At least it wasn't a fluorescent.
01:34:29.000 It had broken in his ass.
01:34:34.000 They had to pull this out of his ass.
01:34:38.000 Dude, they gingerly had to walk him, because he was afraid that if he clenched down on his ass, it would shatter inside of his ass, and it'd just be shards and blood.
01:34:47.000 And there's a video of a guy doing that very thing.
01:34:49.000 There's a video of a guy with a cup.
01:34:51.000 Oh my god, it's horrific.
01:34:53.000 This time, he doesn't do it with a light bulb, but he does it with a cup, which is even scarier because it's a heavy, thick glass jar.
01:35:00.000 And it breaks off inside of him, and he's pulling these chunks of glass out with blood, and it's all...
01:35:27.000 Wait for it.
01:35:28.000 It's like the new faces of death.
01:35:30.000 Yeah, and you never know, right?
01:35:32.000 It could just be like some old lady crossing the street or like a guy taking a drink of milk and then compound fracture.
01:35:37.000 You want to see a broken toe, a horrifically broken toe?
01:35:39.000 I'm not going to look at it.
01:35:40.000 You're not going to look at it?
01:35:41.000 Is it yours?
01:35:42.000 No, Anthony Parash.
01:35:43.000 He's a fighter that fights in the UFC and he just had to withdraw from a fight because of a broken toe.
01:35:48.000 It's pretty incredible.
01:35:50.000 You want to see it?
01:35:51.000 No.
01:35:51.000 No?
01:35:52.000 Do you want to see it?
01:35:53.000 I like to use the VCR method and lower the resolution by squinting.
01:35:57.000 Yeah.
01:35:58.000 Okay, then I'm not going to show it to you.
01:36:00.000 I'm not going to show it to you, but it's one of the craziest things I've ever seen.
01:36:03.000 It doesn't look good.
01:36:05.000 Or I do the thing where I'm just looking at the bottom left of the screen, so I could see it like an arm.
01:36:10.000 Yeah, you remember the first time you saw someone die online?
01:36:14.000 Like the first video of the Bud Dwyer video or something like that?
01:36:17.000 You're like, oh wow, that's a guy dying.
01:36:18.000 That's what it looks like when someone dies.
01:36:20.000 Well, you used to have to go through all the trouble to get Faces of Death, and then it was a huge deal.
01:36:26.000 Yeah.
01:36:26.000 Now it's like you're waiting for the bus.
01:36:28.000 It's like, I think I'm going to watch 15 people die.
01:36:30.000 Yeah, Faces of Death was the same era for me as a friend who had one of those barnyard porns.
01:36:37.000 There was a few of those porns that were going around that were like so grainy.
01:36:42.000 And seedy.
01:36:43.000 Yeah.
01:36:43.000 Oh, just horrendous.
01:36:45.000 You just got to think about what it was like to be this poor girl that was just blowing this donkey.
01:36:49.000 And the actual VHS tapes are always sticky and the stickers ripped off.
01:36:54.000 Sometimes the tape would get fucking broken and spool up.
01:36:59.000 There were horrible, horrible videos.
01:37:02.000 The resolution was terrible.
01:37:05.000 But at least there was a little bit of pageantry.
01:37:07.000 Maybe that's the wrong word.
01:37:09.000 But gravity associated with the whole act of witnessing this terrible thing.
01:37:15.000 It wasn't so easy.
01:37:16.000 Yeah, you're right.
01:37:18.000 It was a big deal.
01:37:19.000 Like, we planned it out.
01:37:20.000 Like, one guy even had to watch the door.
01:37:23.000 Right.
01:37:23.000 Like, one guy watched the door and we watched it downstairs on this, like, it was at the bottom.
01:37:27.000 Like, he was, like, he stood, like, halfway up the stairs so he could, like, still lean down and look over at this.
01:37:32.000 It was a little shitty-ass fucking television we were watching this on.
01:37:36.000 And when we were watching that girl blow the horse, we were like, what the fuck? We thought we could go to jail at any moment.
01:37:41.000 Someone could kick the door in.
01:37:44.000 I don't remember how old we were, but it wasn't more than high school age.
01:37:49.000 It was somewhere around high school age.
01:37:50.000 It was just the craziest thing to see ever, to watch this chick have sex with animals.
01:37:55.000 There was a dog.
01:37:56.000 She blew a dog.
01:37:58.000 She blew a donkey.
01:37:59.000 There was a couple different animals that she had to have sex with.
01:38:01.000 It's so weird.
01:38:03.000 That was so hard to do.
01:38:04.000 And we felt bizarre for days afterwards.
01:38:10.000 Kids today just laugh about that shit.
01:38:12.000 They think it's hilarious.
01:38:14.000 Yeah, it rolls off of them.
01:38:15.000 They witness so much more, so much faster than we ever did.
01:38:20.000 I don't know.
01:38:21.000 Yeah, there's a weirdness to this life for them that just never existed, as far as we know, for any human ever.
01:38:27.000 This weird connection to almost anything.
01:38:30.000 If you leave your kid out in the world with an iPhone, you send him out in the world, you're giving them access to fucking everything.
01:38:36.000 That's the whole world.
01:38:37.000 They're not going to probably get taken by internet scammers from their iPhone, but almost everything else.
01:38:42.000 There's a Times article right now where they're interviewing people, like a bunch of different people, about porn, you know, and the impact on society.
01:38:50.000 And there's somebody that says exactly that.
01:38:51.000 They're like, look, you know, kids have access to everything.
01:38:55.000 It's really impossible to limit that access.
01:38:58.000 And they're talking about how all this easy access to porn has affected people's sex lives.
01:39:04.000 And apparently, you know, certain...
01:39:07.000 They go through and they catalog which sex acts occur in the majority of videos.
01:39:11.000 So they can say, like, 88% of videos have a facial, you know?
01:39:15.000 And then they look and there's a direct correlation.
01:39:17.000 What happens in those videos is what people want to do at home.
01:39:23.000 That's funny.
01:39:24.000 So like butt sex goes way up.
01:39:25.000 It must be.
01:39:26.000 They said like 80% had butt sex and then 4% of people actually had butt sex.
01:39:32.000 What do you mean?
01:39:33.000 80%?
01:39:34.000 In the video...
01:39:35.000 In the video, 80% of the people did it, but in real life 4% of the people did it?
01:39:38.000 Yeah, because there's a correlation between demands and what's in the videos.
01:39:42.000 Not so much a correlation between what actually happens.
01:39:45.000 Because of the gatekeepers in this whole process.
01:39:48.000 Are the females.
01:39:49.000 Yeah.
01:39:50.000 Who are actually like, go fuck yourself.
01:39:52.000 Yeah.
01:39:53.000 We talked about it must be so crazy in the gay community because there's not that female influence.
01:39:59.000 Yeah, I said the same thing to my wife.
01:40:00.000 I said the same thing to her.
01:40:01.000 I said, thank God that you guys are slowing us down.
01:40:05.000 And God knows what's happening in that community because there's no break.
01:40:08.000 Yeah, believe me.
01:40:09.000 There's no way it would be the same.
01:40:11.000 If men and women were the same, as far as our voracity, it would be just like gay people.
01:40:17.000 And it's not.
01:40:18.000 The men have to tone down for the women.
01:40:20.000 It doesn't exist in the gay community.
01:40:22.000 They have a totally different, bizarre operating dynamic.
01:40:26.000 Yeah, yeah.
01:40:27.000 Well, I'm sure that everybody acclimates to it.
01:40:30.000 I guess they have less suppression, though, that way.
01:40:33.000 Certainly, they know each other.
01:40:34.000 They understand each other better.
01:40:36.000 Yeah, and I think that also that relationship, whenever you have gay friends and you see how their relationships are, you know, to the extent that you see that if you're a heterosexual, I mean, obviously you're not seeing the whole story.
01:40:50.000 They're not seeing your whole story in your bedroom either.
01:40:52.000 But I think that those types of relationships, each person has their own specific one and it affects each other.
01:40:57.000 You see that, like, you know, my friends that are gay tend to have, like, relationships where they trust each other a lot and they're a little looser.
01:41:06.000 It's not a lot of jealousy and stuff.
01:41:08.000 Maybe I just have weird friends.
01:41:09.000 I don't know.
01:41:10.000 But it's interesting.
01:41:12.000 You rub off on each other.
01:41:14.000 Yeah, it's interesting that they...
01:41:17.000 It's just a different dynamic.
01:41:19.000 And for whatever reason, the real issue lies in the fact that it's not legal everywhere, that they don't share the same rights as people everywhere.
01:41:26.000 That, to me, is really bizarre.
01:41:28.000 Because if two people want to pretend to be the husband, what do you give a fuck, as long as they're together?
01:41:33.000 If one guy wants to pretend he's a woman, what do you care?
01:41:36.000 Why do you care?
01:41:36.000 They're just two people that want to get married.
01:41:38.000 The idea that you would have to have a certain sexual proclivity in order to engage in this is just so bizarre.
01:41:46.000 And if you allow any sort of bizarre, any sort of discrimination that doesn't make any sense objectively, if you allow any of that in our world, then it's going to come at you too, man.
01:41:56.000 And if you don't take a stand for gay people that want to get married and for whatever reason they're being persecuted by numbskulls and overly religious crazy people, if you don't take a stand for them, then who's going to take a stand for you when it comes in your direction?
01:42:09.000 Who's going to take a stand for humanity?
01:42:11.000 Because it's just a person who happens to like men.
01:42:14.000 Why do you let them marry each other, you fuck?
01:42:17.000 What do you care?
01:42:18.000 Yeah, I agree.
01:42:18.000 It doesn't make any sense.
01:42:19.000 I think, you know, there are a lot of like cultural mores or whatever, right?
01:42:23.000 However you pronounce that word.
01:42:25.000 Like sort of stuff that people just think is obvious, you know?
01:42:28.000 And it's just accepted.
01:42:29.000 But times change, you know?
01:42:31.000 People, our relationships, how human beings interact with each other.
01:42:35.000 You know, even just like you think about people interacting over the computer, stuff like that.
01:42:39.000 All that stuff changes, and so all of the cultural stuff has to change too.
01:42:43.000 And sometimes I'm glad that we do have a full spectrum of some people are just willing to embrace anything new, right?
01:42:52.000 Especially with technology, you have all kinds of new shit getting thrown at us all the time.
01:42:56.000 Other people dig in their heels and they say, let's take it slow.
01:42:58.000 Let's not go nuts here.
01:43:00.000 And I think that's a good thing because we have so much change coming at us so fast that we need...
01:43:06.000 Some people that are willing to embrace it and experiment with it.
01:43:09.000 Other people that are...
01:43:10.000 And I'm off of the gay thing now, by the way.
01:43:12.000 I'm on to technology.
01:43:13.000 Thank God, dude.
01:43:14.000 I was going to talk to you about this.
01:43:17.000 I'm on to people who are saying that...
01:43:23.000 That we shouldn't experiment with stem cells.
01:43:28.000 All of this scientific discovery that's out there and people are saying, no, no, no, we shouldn't study it.
01:43:33.000 We should slow down.
01:43:35.000 It is nice to have some people that want to slow things down.
01:43:39.000 Well, it's bizarre when religion actually tries to interfere with science in ways that don't make any sense.
01:43:46.000 One of my favorite ones was the Pope talking to Stephen Hawking, and he told him that it was okay to explore the nature of the universe, but it wasn't okay to explore the origins of the Big Bang, because that would be like questioning God himself.
01:44:04.000 Isn't that hilarious?
01:44:05.000 Yeah, you know...
01:44:07.000 Religions, to me...
01:44:09.000 Like, he was, like, telling them what...
01:44:11.000 I mean, this was, like, our lifetime.
01:44:12.000 I mean, Hawking's still alive, so this was in our lifetime.
01:44:16.000 The Pope was telling this guy...
01:44:17.000 He's still studying the Big Bang.
01:44:19.000 It's so silly that that could ever get to a point where it could happen, where this nutty cult member actually could get into a position.
01:44:27.000 And in this day and age, that could...
01:44:29.000 I mean, it didn't have an influence, other than to make a humorous anecdote.
01:44:33.000 But still, what if he listened?
01:44:35.000 Well, it's to reinforce his notions, really.
01:44:37.000 And who knows what his own psychological proclivity might be?
01:44:41.000 There are some people that are subject to the influence of people that you wouldn't be or I wouldn't be.
01:44:46.000 And, you know, in certain cases, like someone who's in a big position of power, like a pope.
01:44:52.000 Well, you know, for thousands and thousands of years, religion is the bedrock.
01:44:56.000 It's what keeps people united.
01:44:57.000 It's what keeps them alive.
01:44:59.000 I mean, to have a shared culture...
01:45:01.000 I have Native American in my background, right?
01:45:05.000 Growing up, I was always interested in reading the history and thinking about why did Native Americans get basically wiped out, and also Aborigines and things.
01:45:16.000 And you think about the fact that in Australia, all the different tribes in Australia, they all spoke different languages, and they all had a certain amount of land, and they were kind of in stasis.
01:45:28.000 I mean, they fought with each other, but everything always kind of ended up the same.
01:45:32.000 There was no one group that conquered all of Australia.
01:45:36.000 And then you think about Europe and other more bellicose places.
01:45:41.000 Here you've got places where they enforce one culture on massive groups of people, right?
01:45:48.000 Through religion.
01:45:49.000 And they're incredibly effective.
01:45:51.000 They travel all over the world and wipe people out.
01:45:54.000 They work together.
01:45:55.000 They build cities.
01:45:56.000 There's great utility in having people think alike like this.
01:46:01.000 But then, you know, there's also a drawback.
01:46:04.000 And the drawback can be that it's resistant to change and it doesn't...
01:46:10.000 You know, adapt for new circumstances.
01:46:13.000 So it's almost like you could look at religion as an operating system on a cell phone.
01:46:18.000 It allows you to get things done, but if you don't update it, and continually update it with the latest versions...
01:46:27.000 Well, science naturally updates itself.
01:46:30.000 You can always go back and prove something wrong, prove it for yourself, and fix it, update it with new information.
01:46:36.000 I mean, that's the strength of science, but that's scary because there is no bedrock.
01:46:41.000 You know how much you don't know.
01:46:43.000 And you really have to trust yourself a lot, I think, to really trust science because you are acknowledging that you don't know everything.
01:46:52.000 You're allowing a thought into your mind that a lot of people do not find comforting.
01:46:55.000 It's really scary.
01:46:56.000 And that is that no one is watching this thing.
01:46:57.000 No one's paying attention to this.
01:46:59.000 We essentially live in a soup of madness.
01:47:02.000 And that is the world that we actually exist in.
01:47:05.000 And we want to pretend that we're in this strange Sandra Bullock movie where everything's going to be okay and everything's going to be normal.
01:47:12.000 And one of the best ways to do that is to think that there's a guy that lived a long time ago that came back from the dead and he absolved you of all your bullshit.
01:47:20.000 You just got to take him into your heart and you're good, no matter how bad you've been in the past.
01:47:24.000 And they go with that.
01:47:25.000 It's not about you believing it.
01:47:26.000 It's about...
01:47:27.000 Lots of people believing it, so you draw strength from other people.
01:47:30.000 Yes, good point.
01:47:32.000 I think it's a natural human thing to do.
01:47:34.000 I don't think there's anything wrong with it.
01:47:34.000 I think it's an operating system.
01:47:35.000 I think it's like a scaffolding for morals.
01:47:38.000 I think of it as a policy.
01:47:39.000 Like, it's a set of behaviors, you know, for certain situations we'll behave this way.
01:47:46.000 And yeah, it glues people together.
01:47:49.000 Sort of like when a guy wears a suit and a tie, he's going to be a gentleman.
01:47:52.000 Would the gentleman care for a drink?
01:47:55.000 I mean, nobody says that if you're wearing Muay Thai shorts and flip-flops and you're sweating.
01:48:01.000 They don't want to have anything to do with you.
01:48:03.000 They wouldn't say, would the gentleman like a cocktail?
01:48:05.000 They'd be like, sir, you can't come in here naked with flip-flops on.
01:48:08.000 We want you to look in a very specific, uniform way.
01:48:13.000 And that's how I know I can predict your behavior.
01:48:16.000 You're going to behave like a gentleman.
01:48:17.000 Well, that...
01:48:18.000 Believe it or not, I can make anything tie back to robotics.
01:48:21.000 I believe you could.
01:48:23.000 That actually ties back in.
01:48:24.000 When you build a robot, right, what you choose to make it look like is a promise.
01:48:29.000 It's a guarantee to the person that's going to interact with it.
01:48:31.000 So if you build a robot that looks just exactly like a human being, then anybody that walks up to that robot is going to damn well expect that that robot is going to be as smart as a human being.
01:48:40.000 If you say, what's up, buddy, it's going to say, hey there, pal.
01:48:44.000 And if it doesn't, people get mad, you know?
01:48:46.000 Which is why I don't think we're gonna see super realistic, like, you know, androids anytime soon, because we don't have the full package, you know?
01:48:56.000 We may be able to make it look really realistic, but we can't make it behave in a really realistic way.
01:49:01.000 Why is that?
01:49:02.000 I mean, just because that's the hard part.
01:49:04.000 But is that, I mean, it's a temporary hurdle, isn't it?
01:49:08.000 I mean, with the way science continually grows in this exponential manner, it seems that if they could figure out, they could sort of figure out how to mimic various aspects of the actual human.
01:49:20.000 Yeah, I mean, we're making progress.
01:49:22.000 Do you think they'll actually engineer in egos and things like that?
01:49:25.000 Yeah, yeah.
01:49:26.000 Self-respect?
01:49:27.000 Or they'll copy it, you know, they'll mimic it to the extent that you can't tell the difference.
01:49:31.000 What if robots start revenge beatings, waiting for people behind cars, when people talk shit to them in a meeting?
01:49:39.000 Socks full of doorknobs.
01:49:40.000 Yeah, beat the fuck out of them.
01:49:42.000 I think that would be a malfunction.
01:49:43.000 Yeah, it would be.
01:49:45.000 Is there a way you could ever, in the Alien series, like, we've been engineered to never harm a human being.
01:49:53.000 I love those aliens, bro.
01:49:55.000 I love Bishop.
01:49:56.000 He gets ripped in half.
01:49:59.000 The first guy was good, too, man.
01:50:01.000 The first guy was good, too.
01:50:02.000 Spoiler alert.
01:50:03.000 Spoiler alert.
01:50:04.000 In Prometheus, there's another one.
01:50:06.000 Oh, yeah.
01:50:07.000 What's his name?
01:50:08.000 The most substantial.
01:50:09.000 That awesome actor.
01:50:10.000 Yeah, yeah, yeah.
01:50:10.000 That dude.
01:50:11.000 I don't remember his name.
01:50:12.000 David is what they named him, too, which is weird.
01:50:15.000 All right, I'll tell you who he is.
01:50:16.000 We'll give the guy his props.
01:50:18.000 Some credit, yeah.
01:50:19.000 Yeah, but he's amazing in it.
01:50:24.000 What the fuck's the dude's name?
01:50:26.000 IMDB, son.
01:50:28.000 What did IMDB say? It got a 7.3 rating.
01:50:33.000 That's pretty fucking good.
01:50:34.000 That is.
01:50:36.000 Alright, homeboy, what's his name?
01:50:40.000 So they announced we got Anne Hathaway is going to be in Robopocalypse.
01:50:44.000 Anne Hathaway.
01:50:46.000 That became official yesterday.
01:50:48.000 Wow.
01:50:49.000 Yeah, I don't even know what part she's going to play, but I'm kind of excited.
01:50:52.000 She's sort of a badass.
01:50:54.000 Is she really?
01:50:55.000 Well, I mean, as Catwoman and stuff, she pulled it off.
01:50:58.000 Any of that superhero stuff, if you can dress up in a gimmicky costume and pull it off and make it seem real, then you've got my respect.
01:51:06.000 Right.
01:51:07.000 I can't find this dude.
01:51:10.000 I should know his name.
01:51:12.000 Goddammit.
01:51:12.000 Is it Rafe Spall?
01:51:14.000 No.
01:51:14.000 No, that's not it.
01:51:18.000 Motherfucker.
01:51:19.000 Alright.
01:51:20.000 I give up.
01:51:21.000 I give up.
01:51:22.000 Props to him, whoever he is.
01:51:23.000 Yeah, whoever he is.
01:51:23.000 He's a bad motherfucker.
01:51:25.000 They don't seem to have his photo here in the cast.
01:51:28.000 He was the robot guy.
01:51:29.000 It's not Guy Pearce.
01:51:31.000 Mm-mm.
01:51:31.000 I don't think so.
01:51:36.000 I give up.
01:51:37.000 Sorry.
01:51:38.000 Lots of people are screaming his name, right?
01:51:40.000 Yeah, I'm sure someone on Twitter will tell me who the guy's name is, but yeah, he played an amazing robot.
01:51:45.000 He did the best job, in my opinion, in any robot movie, of walking that creepy line where you think he almost feels slighted, but doesn't because he really doesn't have any programmed emotions, but he's recognizing that you're trying to slight him.
01:52:00.000 It's very fascinating.
01:52:02.000 It's worth seeing, really, for some of his scenes alone.
01:52:06.000 Especially, I can't tell you.
01:52:08.000 It's too much of a spoiler.
01:52:09.000 For a really realistic humanoid, or android, I mean, there's a momentum behind the emotions, right?
01:52:17.000 So even if it doesn't feel pissed off, if it looks and in all other ways behaves completely...
01:52:31.000 Yeah.
01:52:32.000 Yeah.
01:52:45.000 Well, also, the guy's name is Michael Fassbender.
01:52:47.000 Michael Fassbender, yeah.
01:52:49.000 Spoiler alert.
01:52:50.000 Also, he actually acts in almost a vindictive manner to one of the guys who had been fucking with him in this movie.
01:52:58.000 It was kind of interesting.
01:52:59.000 I gotta see this.
01:53:00.000 How they had had it set up.
01:53:02.000 You know, like I said, it wasn't a terrible movie.
01:53:05.000 It just wasn't what I was hoping.
01:53:06.000 Michael Fassbender, that's what it is.
01:53:08.000 He played David.
01:53:09.000 He was the second one.
01:53:10.000 There it is.
01:53:11.000 His hair is a different color in this photo.
01:53:13.000 I got confused.
01:53:14.000 But yeah, he's an amazing actor, man.
01:53:16.000 That's a skill that's going to be difficult when they start doing CGI movies, like completely CGI, like really replicating an actual human's emotions and the way a human like that guy can act.
01:53:29.000 That's going to be really difficult to do.
01:53:31.000 It's all the little things that the actors learn.
01:53:35.000 I've been watching all the Chris Hemsworth movies because he's also supposed to be in Robopocalypse.
01:53:40.000 Who's Chris Hemsworth?
01:53:41.000 He's Thor, Mighty Mighty Thor.
01:53:43.000 Oh, cool.
01:53:43.000 And he's in Red Dawn, the remake, and he's in Snow White and the Huntsman.
01:53:47.000 So I was watching that, and I can't remember the name of the actress.
01:53:51.000 She's the one from Twilight.
01:53:53.000 Kristen Stewart?
01:53:54.000 Yeah.
01:53:54.000 And she's really pretty, and she does this thing where...
01:53:58.000 You start to notice that it's almost her trademark.
01:54:03.000 She breathes really hard and her collarbones sort of stick out.
01:54:07.000 That's like an acting thing.
01:54:09.000 Or when you notice their little nostrils are flipping around and you're like, that's why they get paid the big bucks.
01:54:15.000 That's hilarious.
01:54:16.000 Their nostrils are doing SeaWorld shit and flipping out of the water.
01:54:19.000 Yeah.
01:54:20.000 That's funny.
01:54:22.000 Some people, they're the same guy.
01:54:24.000 Like Christopher Walken, he's the same guy in almost every movie ever.
01:54:28.000 I read an interview with him.
01:54:30.000 He said that they started like...
01:54:31.000 Now he has a clause in his contract that says they can't rewrite it to make it weirder when they hire him because he's tired of all the roles shaping to fit him instead of him having to act.
01:54:44.000 Oh, that's funny.
01:54:45.000 Like, so as he gets it, they go, look, we're going to have Christopher Walken and he's going to be on acid 24-7.
01:54:50.000 And he only wears, like, flippers and golf shoes and shit like that around the house.
01:54:56.000 And he was just supposed to be someone's, like, dad or something.
01:54:58.000 Yeah.
01:54:59.000 Yeah, I wonder if, yeah, like the variety of roles that he gets offered.
01:55:03.000 Everyone's crazy.
01:55:04.000 Every role that he does has got to be a crazy guy.
01:55:06.000 And once you get typecast, I guess.
01:55:08.000 Al Pacino, like he's got to scream.
01:55:10.000 There's got to be at least one ranting, screaming scene in every movie.
01:55:15.000 I think after a while, these dudes probably, like, especially after they get paid a few times, they really sell out hard and make some fat cash on a terrible movie.
01:55:27.000 And they're like, wow, that was pretty easy.
01:55:28.000 It's one of those good problems, you know?
01:55:31.000 I keep getting hired and getting paid a lot of money to be myself.
01:55:34.000 But Robert De Niro did a bunch of stinky-ass movies, man.
01:55:38.000 He did that one movie where he was a gremlin or a warlock or something like that.
01:55:44.000 Wasn't there a movie where it was like one of those Hobbit-type rip-offs?
01:55:48.000 Oh.
01:55:49.000 Yeah, yeah, yeah.
01:55:50.000 I feel like...
01:55:51.000 Oh, God, I can't remember the movie.
01:55:52.000 Well, you know, I saw...
01:55:53.000 But nobody saw it.
01:55:55.000 In Snow White, there were all these dwarves, right?
01:55:57.000 And I thought that they were little people.
01:55:59.000 I shouldn't say nobody.
01:56:00.000 And instead, they were real actors.
01:56:03.000 I couldn't figure out how it was happening.
01:56:06.000 Yeah, because they were famous actors.
01:56:09.000 That was a big point of contention with the little people community.
01:56:13.000 Because they were like, what the fuck, man?
01:56:14.000 These are the only gigs that we could take.
01:56:17.000 And instead, you think it's cute to show off with...
01:56:19.000 There's plenty of little people actors that would have loved those roles.
01:56:24.000 So it's kind of a...
01:56:25.000 That was a little touch-and-go situation.
01:56:27.000 Like, what is it?
01:56:30.000 Bud Hot...
01:56:31.000 What's his name?
01:56:32.000 Hotskins?
01:56:33.000 What is the guy's name?
01:56:35.000 What roles does he play?
01:56:36.000 He was one of the little people.
01:56:38.000 Very famous.
01:56:39.000 Fuck.
01:56:39.000 God damn it.
01:56:40.000 Now I'm going to make me IMDB this motherfucker, too.
01:56:43.000 I might IMDB more today than I ever have in my life.
01:56:48.000 But then there was the other movie...
01:56:49.000 You can tell IMDB was one of the first websites because it's got...
01:56:53.000 Because they would never name it that, right?
01:56:55.000 It would be like...
01:56:56.000 They would give it a...
01:56:58.000 Movie poo or something.
01:57:00.000 Movie time!
01:57:01.000 Something like that.
01:57:03.000 Yeah.
01:57:05.000 Why IMDB? Internet Movie Database.
01:57:08.000 I guess it's easy to remember.
01:57:09.000 Plus it's four characters, man.
01:57:10.000 You can't get a website that's four characters.
01:57:13.000 Bob Hoskins.
01:57:15.000 That's homeboy's name.
01:57:17.000 He's been in a gang of movies.
01:57:19.000 Well, is he Willow?
01:57:20.000 Bob Hoskins.
01:57:20.000 Bob Hoskins was in...
01:57:22.000 Wasn't he in Roger Rabbit?
01:57:24.000 Wasn't he in that, too?
01:57:26.000 Yeah, he was.
01:57:26.000 There's a guy who was in Willow and Star Wars and, like, everything.
01:57:29.000 Dude, this guy's been in fucking everything.
01:57:31.000 This guy's been in a lot of shit.
01:57:33.000 He's been in a lot of shit.
01:57:35.000 But he's in that movie, too.
01:57:37.000 See, I feel the same way about...
01:57:39.000 In Robopocalypse, it's all Native Americans.
01:57:42.000 Like, so I have this feeling that if the federal government were to fall apart...
01:57:47.000 There are all these sovereign tribal governments that are around that have jails and cops and hospitals and everything that you need, right?
01:57:54.000 Only it's smaller.
01:57:55.000 And in the book, everybody ends up there.
01:57:58.000 And so in the movie, I don't know.
01:58:00.000 I haven't read the script.
01:58:01.000 I don't know anything about it.
01:58:03.000 I'm wondering, are they going to hire Native actors?
01:58:06.000 You would hope so, right?
01:58:08.000 Because it's an Osage army, basically.
01:58:12.000 They would just have to maybe look a little bit more carefully, but I'm sure there's plenty of qualified people.
01:58:17.000 Acting is one of those things, too, man.
01:58:18.000 There's a lot of people that you don't know that can do it.
01:58:21.000 It's one of those things where there's so many people that have done theater, and they can fucking act.
01:58:26.000 They know how to act.
01:58:27.000 It's beautiful.
01:58:27.000 Spielberg's good at getting people that aren't big names and scouting and making them big names, but at least while they're with him.
01:58:33.000 Well, you know, I think when you've been in the movie business that long, you trust casting agents, too.
01:58:39.000 And a casting agent can tell you, hey, there's this kid.
01:58:41.000 He's amazing.
01:58:43.000 You've got to check him out.
01:58:44.000 Nobody knows who he is because he's really got something special.
01:58:47.000 He's perfect for this one particular role that he might have.
01:58:50.000 But he's good at it.
01:58:52.000 But some people just...
01:58:53.000 It's kind of funny that we have this thing with movies where we accept that we're seeing the same guy over and over and over again in a bunch of different lives.
01:59:03.000 I mean, in one movie, he's the last American samurai, whatever the fuck he is.
01:59:07.000 In another movie, he's a vampire.
01:59:10.000 Now he's going to be Jack, the guy from the, who is it, Clive Cussler or something?
01:59:15.000 Oh, man.
01:59:16.000 From what movie?
01:59:17.000 We're terrible, dude.
01:59:17.000 Yeah, we're fucking up today.
01:59:19.000 Anyway, yeah, he's...
01:59:20.000 I need some bulletproof coffee.
01:59:22.000 He's playing a guy who is actually, like, 6'4", linebacker.
01:59:26.000 Oh, he's playing...
01:59:27.000 No, I was just thinking of Matthew...
01:59:30.000 No, Matt Damon.
01:59:31.000 Didn't Matt Damon play a rugby player, a famous rugby player?
01:59:34.000 Didn't he?
01:59:35.000 No?
01:59:36.000 I think he did.
01:59:37.000 It seems like he could pull that off.
01:59:38.000 Yeah.
01:59:39.000 No, I mean, he played the Bourne Identity movies.
01:59:42.000 He's great.
01:59:43.000 Yeah, I bought that.
01:59:44.000 Although I bought the new guy more.
01:59:46.000 I bought the new guy...
01:59:47.000 I believe the new guy was more, like, natural as a killer.
01:59:51.000 Who's the name?
01:59:51.000 Jeremy Renner?
01:59:52.000 Is that his name?
01:59:52.000 Renner.
01:59:53.000 Yeah.
01:59:54.000 We knew somebody's name today.
01:59:55.000 Yay!
01:59:57.000 That guy's a badass.
01:59:58.000 What I found was amazing about that movie, though, was that one of the things about that movie is that this guy can endure incredible cold and pain.
02:00:07.000 He can do all these amazing things physically, and yet he also had incredible discipline and he never chased pussy.
02:00:14.000 It's like they made him the most unrealistic superhero ever.
02:00:19.000 Because like I was thinking about they were going over James Bond today and they were talking about on the Ron and Fez show, they were talking about all the different names for the James Bond women and which one was the hottest and which girl, which James Bond girl.
02:00:30.000 Is that a weakness?
02:00:31.000 Because they portray that as his great strength.
02:00:33.000 Yes, it is his great strength.
02:00:34.000 He's seducing women.
02:00:35.000 He's doing this to them.
02:00:36.000 Oh, yeah.
02:00:37.000 They're powerless.
02:00:38.000 Instead, that's how he gets portrayed.
02:00:40.000 Well, he's just not real.
02:00:42.000 He's not even trying to get laid.
02:00:44.000 He has no needs whatsoever sexually, but yet he flips through the air and lands on top of the roofs and beats the shit out of everybody and carries this woman the whole way.
02:00:52.000 But she just wants, like, that is like the ideal man.
02:00:54.000 Women want a vampire that won't suck your blood and can go out in the daytime.
02:00:59.000 He just sparkles.
02:01:00.000 They want, I mean, they want this super badass who doesn't want any pussy at all.
02:01:04.000 He's just your guard dog that takes care of you everywhere.
02:01:07.000 He doesn't even try to fuck you.
02:01:08.000 Like, this woman is unbelievably beautiful, and he's saving her through the entire movie, and they never even so much as make out.
02:01:14.000 This is Bond?
02:01:14.000 No, no, it's not Bond.
02:01:15.000 Oh, okay.
02:01:15.000 It's Jeremy Renner, one of the new Bourne Identities.
02:01:18.000 Right, because I was, okay.
02:01:18.000 But I'm saying, at the same time, if you go back and look, they had a Bond movie called Octopussy.
02:01:23.000 I mean, that is the most unsubtle play on words ever.
02:01:27.000 I remember my grandfather had that VHS tape and I was like, oh yeah!
02:01:33.000 I grabbed that one and popped it in as soon as he was out the door.
02:01:36.000 Bond is always getting hot freaks.
02:01:37.000 That was part of being Bond.
02:01:39.000 He got laid like crazy.
02:01:41.000 And it was a good thing back then.
02:01:43.000 But not anymore.
02:01:44.000 A superhero can't do that anymore.
02:01:45.000 Is the new Bond a philanderer?
02:01:47.000 I don't know.
02:01:48.000 Does he go and get some ass in this new movie?
02:01:51.000 He must.
02:01:52.000 I mean, it's his trademark.
02:01:52.000 He's going to have a martini.
02:01:53.000 He's going to fire a neat gun.
02:01:56.000 Yeah, he's going to win, always.
02:01:58.000 In the end, he'll be fine.
02:01:59.000 He might have a little cast on his hand or some shit like that, but in the end, he'll meet the queen or something.
02:02:04.000 So I heard Heineken paid a bunch of money so that he only drinks Heineken beers during the...
02:02:09.000 During the movie.
02:02:10.000 It's unsubstantiated, but...
02:02:12.000 Well, have you seen the video?
02:02:13.000 The commercial, rather?
02:02:14.000 That's a pretty well-done commercial for a beer commercial.
02:02:14.000 And Heineken was going that way already, because they had all those commercials where the dude's flipping through the air, and he's just kind of a Bond-esque kind of badass.
02:02:23.000 Yeah, so I guess that's okay.
02:02:26.000 It's not like Heineken sucks.
02:02:28.000 If they could pay him to only drink Heineken.
02:02:30.000 What's wrong with that?
02:02:30.000 I like Heineken.
02:02:33.000 Yeah, it's the principle of it.
02:02:35.000 What is the principle?
02:02:35.000 The principle is that it's product placement in a Bond movie which is semi-sacrilegious.
02:02:40.000 Is that it?
02:02:40.000 Is it?
02:02:41.000 I mean, I don't know.
02:02:42.000 Kind of.
02:02:42.000 Does it matter what movie it is?
02:02:43.000 Yeah, it kind of does.
02:02:45.000 There's certain things like Bond that are, like, sacred.
02:02:47.000 You know, I mean, Bonds, you gotta go Sean Connery, Roger Moore.
02:02:50.000 If you wanna do that shit, go Mission Impossible if you wanna only drink, like, if you wanna whore out your main character.
02:02:56.000 But they already whored out Bond during the Pierce Brosnan era.
02:02:59.000 With cars, right?
02:03:00.000 That was like, wait a minute, but he wasn't Bond.
02:03:02.000 Daniel Craig is Bond, okay?
02:03:04.000 Daniel Craig looks like a real English bad motherfucker that could snap your neck.
02:03:09.000 Pierce Morgan was like, or what's his name?
02:03:12.000 Not Pierce Morgan.
02:03:12.000 Pierce Brosnan.
02:03:13.000 Pierce Brosnan.
02:03:14.000 Pierce Brosnan.
02:03:15.000 He seemed like a very nice guy, and I bet he's a hell of a good actor.
02:03:18.000 He seemed like an English gentleman.
02:03:19.000 Yeah, he did.
02:03:20.000 I did not buy him kicking people's asses.
02:03:22.000 I buy this Daniel Craig guy fucking people up and killing people.
02:03:25.000 I totally buy it.
02:03:26.000 He's more compact and like, Daniel Craig?
02:03:29.000 Yeah, Craig.
02:03:30.000 He's built like a pit bull.
02:03:31.000 Yeah, he's thick.
02:03:32.000 That guy looks like an athlete.
02:03:34.000 When you see him with his shirt off and he's got a gun in his hand, you believe that's a guy who kills people for a living.
02:03:39.000 That's a real special agent.
02:03:40.000 Whereas with Pierce Brosnan, I'm like, bitch, I'm going to take your gun.
02:03:43.000 You can't hold on to that gun.
02:03:45.000 Give me that thing.
02:03:45.000 I'm going to kill your hair.
02:03:46.000 Yeah, someone's going to hold you down and pee on you.
02:03:49.000 I just don't buy it.
02:03:50.000 Roger Moore, at least we got the tongue-in-cheek and it was a different era.
02:03:54.000 Pierce Brosnan was in this weird sort of like 90s drama era.
02:03:59.000 Transitioning into an action movie.
02:04:00.000 I got a theory about this too.
02:04:02.000 I got a lot of theories, by the way.
02:04:04.000 My theory on this, there's 80s and 90s.
02:04:07.000 Late 80s, early 90s, there's all these movies where the action scenes consist of people holding submachine guns and going, like spraying bullets at each other, right?
02:04:16.000 Or they'll be like way up on the catwalk and they'll have a shotgun and they're like, boom, boom, shooting somebody like way the fuck across the warehouse.
02:04:22.000 And my theory is that you can't get away with that shit anymore because every kid plays Call of Duty.
02:04:28.000 They know that a shotgun is ineffective at long range.
02:04:32.000 This is like in the DNA of every 14-year-old boy.
02:04:36.000 You just know that you can't shoot somebody with a shotgun from across a football field.
02:04:41.000 You may have never touched a shotgun.
02:04:43.000 And so the whole spraying, the submachine gun thing, that stuff is just gone now.
02:04:49.000 You've got to be way more brutal and, like, accurate and realistic about it.
02:04:54.000 Well, that's why martial arts movies don't look like those early, you know, Jean-Claude Van Damme and those early, like martial arts movies.
02:05:03.000 It's harder to buy this guy, you know, flipping you over his head and grabbing you by the wrist when you see like an MMA fight.
02:05:10.000 Yeah, so when you realize what really happened.
02:05:12.000 Yeah, I was watching Demolition Man and like Wesley Snipes is, like, fucking doing, like, roundhouse kicks, like, four, five, six times.
02:05:20.000 The guy's still standing there, like, really connecting with a real roundhouse kick.
02:05:25.000 Like, when does that even ever happen, right?
02:05:27.000 But, you know, it's just like...
02:05:29.000 It's like watching Home Alone or something.
02:05:31.000 Yeah.
02:05:32.000 Yeah, that brick would have caved in his skull and you would have immediately thrown up all over the floor.
02:05:37.000 The police would have come.
02:05:39.000 There was a bunch of movies, those martial arts movies that were completely ridiculous.
02:05:45.000 Yeah.
02:05:45.000 Where dudes would just get, stand in a circle, and guys would just charge at him to the left, and he would kick him, and then the guy would charge from the right, he would kick him, and nobody ever rushed him all at one time.
02:05:57.000 Norris, man, Norris.
02:05:58.000 This is his bread and butter right here.
02:06:01.000 Chuck Norris was in some of the greatest movies ever.
02:06:03.000 Walker, Texas Ranger.
02:06:04.000 Oh, it's a beautiful show.
02:06:05.000 The whole episode is just, they're like, keep going!
02:06:08.000 Kick him a couple more times in the stomach.
02:06:11.000 Dude, there was an episode where a bomb went off and he lost his sight and he went and he meditated and he got his sight back at the end and saved somebody.
02:06:20.000 He had to get his sight back to save somebody.
02:06:22.000 I was like, oh my god.
02:06:24.000 He gets a little spiritual about it.
02:06:26.000 Fuck yeah, he does.
02:06:28.000 Chuck Norris is very high on Jesus.
02:06:30.000 I met Chuck Norris.
02:06:32.000 It was one of the proudest moments of my life that Chuck Norris knew who I was and gave me a hug.
02:06:36.000 I was like, holy shit, because Chuck Norris watches the UFC. I was like, goddammit, why don't I have a camera?
02:06:41.000 The photo with Chuck Norris in the martial arts community, that's akin to a photo with Elvis if you're a singer.
02:06:47.000 Blow it up.
02:06:48.000 Yeah, have you got a photo with Chuck Norris?
02:06:50.000 You know you want to just hit those kettlebells and just stare at the photo.
02:06:53.000 Chuck Norris and me together, yeah.
02:06:56.000 Just motivate me to train hard.
02:06:58.000 Kiss me.
02:07:00.000 Chuck Norris had a bunch of unbelievably preposterous movies.
02:07:04.000 I mean, Chuck Norris was a bad motherfucker, but some of those movies were hilarious.
02:07:09.000 You know, Roadhouse, right, is one of the great movies.
02:07:11.000 The classic.
02:07:12.000 That is the classic.
02:07:12.000 I had some friends in Portland who created a Roadhouse musical, and this is one of the best things I've ever seen.
02:07:19.000 A Roadhouse musical?
02:07:20.000 It's people acting out Roadhouse, then occasionally there's songs, and there's a narrator that's like...
02:07:26.000 Oh my god.
02:07:27.000 And I was like, did you guys get permission?
02:07:29.000 And they're like, ah, you know.
02:07:31.000 They just did it?
02:07:32.000 They just did it, you know?
02:07:33.000 Are they doing it for profit or is it just for fun?
02:07:35.000 Yeah, well, you know, it was for profit, but they sold out.
02:07:38.000 I mean, these are real actors.
02:07:40.000 I don't think anybody would find it as anything other than...
02:07:43.000 Ideally, people would look at it and say, that's a tribute.
02:07:47.000 And it's what keeps people buying the DVDs and keeps them invested in the movie.
02:07:53.000 That was a great, bad movie.
02:07:55.000 It's a great, bad movie.
02:07:57.000 Dude, it's got everything.
02:07:58.000 Yeah, it's an amazing movie.
02:08:00.000 It's like a perfect storm.
02:08:02.000 Every sort of genre, you have...
02:08:05.000 Movies that perfect different elements of the genre, and then you'll have one or two perfect-storm movies that get everything, and then, you know, it dissipates, the times change. Code of Silence was one super legit movie that Chuck Norris did.
02:08:20.000 I was trying to remember it.
02:08:21.000 It was a movie, almost like a Charles Bronson type movie where he played a cop.
02:08:26.000 It's just two hours of him meditating?
02:08:28.000 These motherfuckers only gave it a 5.6 in the ratings.
02:08:31.000 How dare you IMDB? I disagree.
02:08:34.000 I think you gotta look at it in the time that it was made.
02:08:37.000 But in the time that it was made, it was like Chuck Norris had very little martial arts in it for a Chuck Norris movie.
02:08:43.000 It was more of, you know, just a good action drama movie.
02:08:48.000 I mean, I think he threw like one sidekick in the whole movie.
02:08:51.000 Yeah, it was more like...
02:08:52.000 Contractually obligated to...
02:08:53.000 Yeah, probably.
02:08:54.000 Probably, yeah.
02:08:55.000 You can't have a Chuck Norris movie where he doesn't kick at least one dude.
02:08:58.000 How does Chuck Norris look now?
02:08:59.000 Looks great.
02:09:00.000 He's healthy.
02:09:01.000 He's a martial artist.
02:09:01.000 Have you ever seen him in those commercials where he does those...
02:09:05.000 What is that thing?
02:09:05.000 It's a pulley system?
02:09:06.000 They do those exercises?
02:09:08.000 So he's legit then.
02:09:09.000 He's legitimately...
02:09:10.000 Because, you know, a lot of the action heroes from that era that are also movie stars, you know, there's a lot of sort of surgical stuff that starts creeping in and they start looking a little funny.
02:09:20.000 Well, Chuck Norris is...
02:09:22.000 I think he's in his 60s.
02:09:23.000 Let's look here.
02:09:25.000 But I know he definitely still trains.
02:09:29.000 He's...
02:09:29.000 He was born in 1940. Whoa.
02:09:33.000 Yeah.
02:09:34.000 So, that means he's 72?
02:09:36.000 Stang!
02:09:36.000 Wow.
02:09:37.000 God dang.
02:09:38.000 Oh my god.
02:09:39.000 Well, he's doing alright.
02:09:40.000 Healthy living.
02:09:40.000 That's amazing.
02:09:41.000 That's amazing.
02:09:42.000 Well, he looks incredible for 72. That's incredible.
02:09:45.000 Am I retarded?
02:09:46.000 No, I'm not.
02:09:47.000 That's exactly what he is.
02:09:48.000 Wow.
02:09:49.000 He's still very fit, though.
02:09:51.000 If you see him in these...
02:09:51.000 Whatever these things are, it's him and Christie Brinkley.
02:09:54.000 I don't remember what they're called.
02:09:55.000 These cable...
02:09:56.000 It's essentially a low-impact cable pulley system workout, and he promotes those things.
02:10:02.000 And he's a black belt in jiu-jitsu, and I'm sure he still probably does some form of martial arts still, too.
02:10:08.000 But he looks great.
02:10:09.000 For 72, he's incredible.
02:10:11.000 Yeah, yeah.
02:10:12.000 But, yeah, a lot of those dudes, they, you know, fucking your body doesn't last, son.
02:10:18.000 Shit's gonna go.
02:10:19.000 And that's where robotics comes in.
02:10:20.000 Hey, that's why I'm saving mine.
02:10:22.000 I'm not working out because you only get so many movements, you know, so I'm saving them all up.
02:10:28.000 Imagine if it worked that way, if you were born a fucking superhero, and every day of your life, you used your physical points, and the more physical activity you did, the lower your life got.
02:10:38.000 The brighter the candle burns, the faster.
02:10:40.000 Yeah, so your life would be almost like a video game.
02:10:44.000 Everybody would have the equal number to start with.
02:10:47.000 Well, they made that movie where you have a certain amount of time to live.
02:10:50.000 Did you know they're remaking Logan's Run?
02:10:52.000 Are they really?
02:10:53.000 I sat at the science fiction convention, I sat next to this guy...
02:10:56.000 And he's sitting next to me and he's bitching, you know, about, like, there's not any people here.
02:11:00.000 And I was like, he was older.
02:11:02.000 And I was like, so what, you know, what's your deal?
02:11:05.000 Why are you bitching so hard, you know?
02:11:06.000 And he's like, basically, it comes out.
02:11:08.000 He's like, I wrote Logan's Run, you know?
02:11:11.000 Oh, wow.
02:11:11.000 And I'm like, you are so lucky that you had a movie made out of your book in the 70s.
02:11:17.000 And I think they bought the rights in the 60s.
02:11:19.000 That people still remember, right?
02:11:21.000 I mean, how easy for that to just go, you know, just be gone.
02:11:25.000 And he was just bitter?
02:11:26.000 Just wasn't a happy guy?
02:11:26.000 No, he actually seemed cool.
02:11:28.000 He was sort of like, he was at peace with the fact that in this world, you can show up and a hundred people will be in line to get a book signed.
02:11:38.000 Or you can show up and it'll just be like, like nobody.
02:11:41.000 Or like that one guy that wants to talk about...
02:11:44.000 Every aspect of Star Trek and then he sort of wanders off without buying a book.
02:11:48.000 Oh, that's got to be brutal for those guys.
02:11:49.000 I just invested 20 minutes in you, buddy.
02:11:52.000 That's hilarious.
02:11:53.000 That's hilarious.
02:11:54.000 Yeah, that's got to be annoying for those guys because they're really trying to be friendly to people but they're also trying to sell some shit and get some things signed.
02:12:00.000 Well, I mean, at the end of the day, yeah, I mean, you're trying to sell a book.
02:12:02.000 It's got to be weird when you...
02:12:04.000 I know the book Logan's Run.
02:12:06.000 I remember it, but I don't remember anything about it.
02:12:09.000 I don't remember the movie.
02:12:10.000 I don't remember the book.
02:12:11.000 The movie was about all these people that basically...
02:12:13.000 It's like 70s.
02:12:14.000 Everybody's in unitards.
02:12:15.000 And you die whenever you're 30. And if you don't agree to die, they hunt you down and kill you.
02:12:21.000 You have like the little gem in your hand.
02:12:22.000 It starts flashing.
02:12:23.000 So Logan doesn't want to die, basically.
02:12:27.000 I mean, I haven't seen the movie in forever and ever, but he makes a run for it.
02:12:30.000 I mean...
02:12:31.000 So how much different is it from that Justin Timberlake movie?
02:12:35.000 Yeah, right.
02:12:36.000 That's what got me thinking about it, where you have a certain number of minutes to live.
02:12:39.000 Life points.
02:12:40.000 Yeah.
02:12:40.000 Yeah, not very.
02:12:41.000 It's not very different.
02:12:43.000 That theme has been explored.
02:12:44.000 Well, people are always concerned about the idea that one day, due to the fact that we can keep everybody alive...
02:12:51.000 And the fact that populations are exploding, we're continuing to figure out new diseases and how to cure people when they're sick and people are staying alive longer.
02:12:59.000 At what point in time does it become an issue?
02:13:02.000 Do they ever need to control it?
02:13:03.000 How do you even go about doing that?
02:13:05.000 So all these scenarios, like the Justin Timberlake movie or Logan's Run, pop up where the evil government forces you into a termination process.
02:13:15.000 I don't know.
02:13:16.000 Yeah, I mean, again, I think at its heart it's about, like, being afraid that technology is going to change human nature.
02:13:22.000 Like, there are certain things that are innate about being human.
02:13:24.000 Like, we're born and we die, right?
02:13:26.000 I mean, come on.
02:13:27.000 Every single human being has lived and they're all going to die or they did die already.
02:13:32.000 And if you change that, then, I mean, that's scary stuff, right?
02:13:35.000 You're going into a completely unknown territory.
02:13:39.000 And ultimately, that's what everybody is shooting for with the height of technology.
02:13:44.000 The height of technology is to ascend past the physical body.
02:13:48.000 To get yourself into a position where you truly become immortal because you become part of some computer program.
02:13:55.000 See, man, I don't...
02:13:56.000 I'm not down with this.
02:13:57.000 Ray Kurzweil, he doesn't want to die.
02:13:59.000 He doesn't believe that he should have to die.
02:14:01.000 Yeah, no, this is what he says.
02:14:02.000 I mean, I understand.
02:14:03.000 He doesn't want to die.
02:14:04.000 And he says, you know, one way to avoid it is to have a machine that's going to basically take a snapshot of every neuron in your brain, which they're all just little switches, right?
02:14:13.000 And it's going to figure out exactly what's going on in your brain, and then it's going to continue to simulate that.
02:14:19.000 But, like, part of me is just saying, yo!
02:14:23.000 When you die, even if you simulate yourself perfectly, you're freaking dead.
02:14:29.000 You're a pile of meat.
02:14:30.000 You're dead.
02:14:30.000 And there's the other possibility that when you're dead and this other life goes on, it's going to be completely disconnected from reality and who knows how it's going to progress.
02:14:44.000 Just because it's a copy of the operating system and all the information and all the traits that your brain...
02:14:50.000 How the fuck do you know what that thing is once it's on its own?
02:14:53.000 You remember Robocop where they put the criminal guy's mind into a machine and he's like...
02:14:58.000 They turn him on and he's just like...
02:14:59.000 This demonic face, like he's totally in agony and screaming, and he doesn't know what's happened to him.
02:15:05.000 He's lost his embodiment as a human being.
02:15:07.000 I mean, if you lose your limb, that mentally screws you up because your brain has a map of who you are, right?
02:15:15.000 And then if you took your whole brain and stuck it into something that wasn't even a human body, I mean, that shit would mentally traumatize you.
02:15:22.000 It might drive you insane, you know?
02:15:25.000 Yeah, it's a good point.
02:15:26.000 And then you'd hunt down Robocop and try to give drugs to children.
02:15:29.000 Well, we don't know what the impact of it would be psychologically to all of a sudden be trapped in this immortal machine.
02:15:36.000 And what if there became ethical considerations to whether or not you should be able to shut yourself off?
02:15:41.000 Because we can commit suicide, but can a person who's downloaded their consciousness into an eternal machine...
02:15:47.000 Can they figure out?
02:15:48.000 Are they alive even?
02:15:49.000 Or could you get some crazy guy like George Foreman who named all his kids George?
02:15:55.000 What if you like cloned yourself a billion times?
02:15:57.000 You're just like one nutty dude who just decided to make a million of you and set yourself loose on Cleveland.
02:16:03.000 There's a lot of science fiction that covers...
02:16:05.000 This is all post-singularity stuff and there's a lot of sci-fi that covers this because you...
02:16:11.000 You lose grasp on basic human things like what does it mean to die if there's a thousand other copies of you?
02:16:17.000 What does it mean to have a daughter if you have a thousand copies of you?
02:16:20.000 What does it mean to be you?
02:16:21.000 How did you become you?
02:16:23.000 Are you a collection of genes?
02:16:24.000 Are you a reaction to your environment?
02:16:27.000 Yeah.
02:16:28.000 There's certain people that you meet them and you go, wow, you're fucking cool.
02:16:32.000 Like, where did you come from?
02:16:34.000 You know, I love talking to you.
02:16:35.000 And you could run into 500 people and not feel that and then run into one where you just can't stop talking to them.
02:16:43.000 And it's like, what makes that?
02:16:44.000 How do you make that?
02:16:45.000 What combination of things?
02:16:48.000 And how would you hold on to that if you transferred them?
02:16:51.000 Whatever that magic is to us, that magic compelling sort of charm is to us, that's completely lost in the worlds of cold ones and zeros and machines.
02:17:01.000 I agree.
02:17:01.000 Yeah, my whole outlook on science and what it's there for...
02:17:05.000 It's not a battle to defeat human nature or to escape from our meat or our bodies or our fates.
02:17:14.000 I think it's there to amplify what we've already got.
02:17:17.000 I think there's a lot of value.
02:17:18.000 We certainly can use it that way, yeah.
02:17:20.000 We're intellectual creatures, right?
02:17:22.000 We can live our lives in our heads and we can try to ignore our bodies or think of our bodies as impediments.
02:17:29.000 I don't work out.
02:17:31.000 I don't push my body.
02:17:33.000 I don't compete physically with people.
02:17:36.000 But I acknowledge that I eat.
02:17:40.000 I shit.
02:17:41.000 I have sex.
02:17:42.000 Tell us more.
02:17:43.000 I've made babies.
02:17:44.000 Dude, you're crazy.
02:17:45.000 These are all things that human beings do, right?
02:17:47.000 Right.
02:17:48.000 And to try to run away from that I think is crazy.
02:17:52.000 Because you've got to admit.
02:17:54.000 You've got to acknowledge you're embodied as a human being.
02:17:56.000 And if you...
02:17:57.000 Give in to that, then you, in some sense, are sort of fulfilling what you're there for.
02:18:03.000 It feels good to eat.
02:18:05.000 It feels good to go to the bathroom.
02:18:06.000 Because it's natural.
02:18:08.000 Just to play devil's advocate.
02:18:10.000 What we're talking about is the concept of you becoming some sort of an immortal thing that doesn't do all the things that a human does.
02:18:17.000 But what if you do do all the things that a human does?
02:18:19.000 It's just that we've recreated it in a quote-unquote artificial form.
02:18:24.000 Instead of it being a carbon-based life form that occurred naturally, it's something that we engineered to occur.
02:18:32.000 But it is the same goddamn thing.
02:18:35.000 So it's not that you wouldn't shit.
02:18:37.000 It's not that you wouldn't eat.
02:18:39.000 But you'd be doing it in a new body.
02:18:41.000 Yeah.
02:18:41.000 That would be the ultimate form of it.
02:18:43.000 If you're not in a human body, right?
02:18:46.000 But it would be a human body.
02:18:48.000 If you get transferred into just a young human body?
02:18:51.000 I think they're going to be able to make human bodies.
02:18:55.000 With our idea of a robot, I think we'll certainly be able to make something that looks exactly like a human being.
02:19:01.000 But I think they're also going to be able to make a human being.
02:19:04.000 See, you're asking me to put my money where my mouth is basically because...
02:19:08.000 No, not really.
02:19:08.000 No, no, no.
02:19:08.000 But really I think you are because...
02:19:10.000 And it's interesting to me because I haven't thought about it this way.
02:19:12.000 But you're saying, look, Wilson, if you think that because you're embodied as a human, you're obligated to experience life as a human and do all the things humans do.
02:19:22.000 Well, guess what all humans do?
02:19:23.000 They die.
02:19:24.000 Right.
02:19:25.000 Right?
02:19:25.000 So if you're saying that you really are putting your money where your mouth is, then you have to be willing to die, right?
02:19:31.000 Right.
02:19:31.000 And acknowledge that that's a natural part of what humans do and a part of accepting your embodiment as a human.
02:19:39.000 I don't know.
02:19:40.000 I don't know if it came down to it.
02:19:41.000 I might not want to go over the rollercoaster.
02:19:44.000 Yeah, I think we don't want to leave behind the ones we love.
02:19:47.000 That's one of the big things.
02:19:49.000 We don't want to leave behind...
02:19:51.000 That would be one of the saddest things, your family being remorseful that you weren't around.
02:19:56.000 But the idea of what you're saying is you wouldn't want to be downloaded into some machine where you didn't experience all the joys an actual person experiences.
02:20:04.000 What I'm saying is I think they're going to be able to create artificial human beings that literally you will get a whole new body to download yourself into.
02:20:13.000 And you will drink wine, and you will enjoy it, and you will like blowjobs, and you will like water parks, and going skiing, and you will like doing all the things that a person likes.
02:20:22.000 You would just be doing it in this completely new physical existence that they've created, an artificial human being.
02:20:28.000 With boobs and a vagina.
02:20:29.000 There would be this one freak group on a forum somewhere, and Brian would be a part of it.
02:20:37.000 You have a penis and a vagina.
02:20:39.000 I'm into my soul.
02:20:41.000 You know, I still think doing that would violate a lot of what it means to be a human being.
02:20:47.000 Brian would have an excuse to be fat.
02:20:49.000 What he would do is he would put the vagina like an inch below his belly button so he could fold it over and literally fuck himself.
02:20:55.000 Or just fuck your belly button.
02:20:57.000 Get your belly button widened.
02:20:58.000 Yeah, you could do that.
02:20:58.000 Throwing himself on doorknobs.
02:21:00.000 Wouldn't be as good though.
02:21:02.000 Bellybutton wouldn't be as good.
02:21:04.000 Yeah, it's gonna be really interesting to see what form all of this, you know, quote-unquote, progress and technological innovation, where it goes.
02:21:14.000 Because no one from, obviously, from the Lost in Space days, we were just looking at that.
02:21:19.000 No one saw this coming.
02:21:20.000 They didn't see the Internet coming.
02:21:21.000 They didn't see, you know, Twitter or, you know, Wi-Fi.
02:21:25.000 They didn't see any of that stuff.
02:21:26.000 Who knows what we're missing?
02:21:28.000 Who knows what, like, what...
02:21:29.000 One big thing that's going to change the whole ball of wax.
02:21:33.000 Yeah.
02:21:33.000 Like just drones.
02:21:34.000 You know, there was no drones in those old movies.
02:21:37.000 Like the Star Wars, there was like a few things that would fly around, like jets and shit.
02:21:41.000 Yeah, the little reconnaissance.
02:21:43.000 But not like what is going to be in our cities in just a few decades.
02:21:47.000 Looking to the military is not a bad idea.
02:21:50.000 Yeah.
02:21:51.000 The ARPANET was a DARPA project.
02:21:53.000 It turned into the internet.
02:21:56.000 GPS satellites, all that stuff, that was all military tech that eventually went public.
02:22:01.000 It would be so awesome if they weren't killing innocent people.
02:22:04.000 The Autonomous Vehicles is from a DARPA project.
02:22:09.000 That was from the Future Combat Initiative that has since gone away.
02:22:13.000 They wanted a lot of autonomous caravans because they were tired of humans getting blown up with IEDs.
02:22:19.000 So, looking to the military is not a bad idea if you want to see what's coming next for civilians.
02:22:25.000 The real spooky thing is the Air Force drone aviary, where they're working on all these different sized drones, flapping wing drones, drones that look like a bird.
02:22:37.000 They literally flap their wings and move like a bird.
02:22:39.000 There was a video where they showed all the different ones they have now, these dragonfly ones that fit on the tip of your finger.
02:22:45.000 It's incredible.
02:22:47.000 If these things were flying around, you would have no idea.
02:22:49.000 You would really think that that was a bug.
02:22:51.000 Yeah.
02:22:52.000 And that's just a couple years away.
02:22:54.000 They're going to be everywhere, filming everything.
02:22:56.000 Micro air vehicles, when they get smaller than six inches, all the flight dynamics change.
02:23:02.000 And so you have to start looking at insects instead of birds.
02:23:05.000 Oh, really?
02:23:06.000 So that's why the design changes like that?
02:23:08.000 Yeah, yeah.
02:23:08.000 And so that's all.
02:23:09.000 I mean, they study the dynamics of how insects fly, and then they try to replicate it.
02:23:14.000 Yeah.
02:23:15.000 Yeah, you really wouldn't be able to keep up those RPMs at the higher mass.
02:23:19.000 That's why a pterodactyl wouldn't fly like a dragonfly.
02:23:24.000 Yeah, I'm not an expert on this in particular, but from what I understand, the way a fly actually flaps its wings is it's more of a scoop, because it's dealing with such a small number of particles of air, and it actually becomes almost more like a fluid,
02:23:39.000 the way that it's interacting.
02:23:41.000 Yeah, you don't have to do that on a larger scale.
02:23:44.000 You can get lift without having to do that.
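[Editor's note: the scale argument above, flight dynamics changing below about six inches and a fly's wing stroke behaving more like scooping a fluid, is essentially a Reynolds-number effect. A minimal back-of-the-envelope sketch, using assumed ballpark body lengths and speeds for a fly and a bird, not measured values:]

```python
# Rough Reynolds-number sketch: why sub-six-inch flyers live in a different
# aerodynamic regime than birds. All figures below are illustrative assumptions.

RHO_AIR = 1.2    # air density, kg/m^3 (sea level, approximate)
MU_AIR = 1.8e-5  # dynamic viscosity of air, Pa*s (approximate)

def reynolds(speed_m_s: float, length_m: float) -> float:
    """Re = rho * v * L / mu: ratio of inertial to viscous forces."""
    return RHO_AIR * speed_m_s * length_m / MU_AIR

# Assumed ballpark figures: a housefly (~7 mm body, ~2 m/s)
# versus a pigeon (~0.35 m body, ~15 m/s).
re_fly = reynolds(2.0, 0.007)
re_bird = reynolds(15.0, 0.35)

print(f"fly  Re ~ {re_fly:,.0f}")   # low Re: viscosity dominates, air feels "thick"
print(f"bird Re ~ {re_bird:,.0f}")  # high Re: inertia dominates, bird-style lift works
```

At the fly's scale the Reynolds number is hundreds of times smaller, so viscous effects dominate and the air really is closer to a fluid being scooped, which is why MAV designers copy insects rather than birds.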
02:23:47.000 The agility of a fly, if you really wrap your head around it, like when you try to swat a fly and it darts away from you, it is mind-blowing how well those fuckers can move.
02:23:58.000 Crazy setup.
02:23:59.000 This little round body and these giant clear wings.
02:24:03.000 They don't have a lot of high-level thought going on.
02:24:05.000 They're just wired straight to their nervous system to avoid obstacles.
02:24:09.000 Bees are way easier to swat out of the sky than flies are.
02:24:13.000 Killer bees.
02:24:14.000 You don't want that.
02:24:15.000 Killer bee honey.
02:24:16.000 When you swat a bee out of the sky though, it's really not that much of an accomplishment.
02:24:21.000 If you can actually swat a fly out of the sky, like whoa.
02:24:24.000 Bees kind of...
02:24:24.000 You've got to anticipate a fly.
02:24:26.000 Yeah.
02:24:27.000 Flies are robots, that's why.
02:24:29.000 Flies and hummingbirds.
02:24:30.000 You know what, man?
02:24:31.000 If flies were big, we'd be fucking terrified.
02:24:33.000 If flies were from another planet and we found them, if we tuned into another planet and we sent a probe, a light year away or whatever, and they found giant insect forms, if they were giant flies, like flies the size of bulls...
02:24:47.000 Bulls, that's pretty big.
02:24:49.000 I was reading about this on io9.com.
02:24:53.000 They had a deal about this, because insects used to be a lot bigger.
02:24:56.000 And it's apparently, you know, they breathe through these little holes.
02:24:59.000 I can't remember what they're called.
02:25:01.000 But that's the way that they get oxygen, right?
02:25:03.000 And it used to be that there was more oxygen in the atmosphere.
02:25:06.000 And that's why the bugs were all a shitload bigger.
02:25:10.000 Is that why dinosaurs are so big too?
02:25:12.000 Because it would support their weight better?
02:25:14.000 I don't know.
02:25:14.000 I don't think so.
02:25:15.000 No?
02:25:15.000 No.
02:25:16.000 I'm not sure though.
02:25:17.000 But I know that the bugs, because of the way they breathe specifically, they don't have...
02:25:23.000 Like lungs, like we do.
02:25:25.000 They have just little holes, and I don't know exactly how it works, but apparently whenever there's a lot of high oxygen, you get a ratio that's better for them.
02:25:34.000 Well, yeah, what I said doesn't make sense.
02:25:37.000 But I think there was some question.
02:25:39.000 I took it from...
02:25:40.000 There was some question that the environment was thicker, that the atmosphere was thicker.
02:25:45.000 It was a very recent thing.
02:25:46.000 They were trying to figure out why they were so big.
02:25:50.000 Like, what led them to grow so large?
02:25:54.000 I forget what the article was about.
02:25:57.000 But I think that was part of the...
02:26:02.000 Part of the scenario that they're proposing was that something about the atmosphere was much different and that it was easier to support giant forms of life.
02:26:10.000 Then the other thing was, I guess, the trees and the vegetation was different back then.
02:26:14.000 A lot of these animals were vegetable eaters.
02:26:17.000 And then if you're going to have a giant brontosaurus, if something's going to eat it, it's got to be like a T-Rex.
02:26:21.000 It's got to be a giant, fucking, even bigger, crazier thing.
02:26:24.000 So it's almost like the more plants you have to eat, the bigger the things are that eat the plants and the bigger the things that eat the...
02:26:30.000 Things.
02:26:30.000 Yeah.
02:26:31.000 We were talking about this documentary about an area of Africa that's cut off.
02:26:36.000 The river changed its course just a hundred years ago.
02:26:39.000 And this area has been isolated.
02:26:41.000 Straight up King Kong action?
02:26:42.000 Well, sort of.
02:26:43.000 It's lions and water buffaloes.
02:26:45.000 And the lions have grown larger because they have to only take down water buffaloes.
02:26:49.000 Right.
02:26:49.000 So there's like one pack specifically of female lions that are as big as male lions.
02:26:55.000 They're enormous.
02:26:56.000 And it's all because they have to take down water buffaloes.
02:26:58.000 Like they've adapted to this one particular area.
02:27:01.000 Every animal is a solution to a problem.
02:27:03.000 Yeah, but do robots, that's the question, do robots, and if they're going to have that engineered into their system, aren't we the first shit they're going to get rid of, man?
02:27:12.000 I don't think so.
02:17:13.000 I think that they're just, you know, robots are tools, and if we design them well, they'll... The fuckers are going to take over, dude.
02:27:18.000 Do you believe in a scenario where there's going to be people that don't even know that they're robots?
02:27:23.000 I hope.
02:27:24.000 Yeah, you hope?
02:27:25.000 Yeah.
02:27:25.000 You're going to marry one of those?
02:27:27.000 Bring one of those in?
02:27:29.000 Look, she's just going to stay with us for a while.
02:27:32.000 It just makes out with you when no one's around.
02:27:35.000 And then she cleans.
02:27:37.000 Yeah, it's like we're going to eventually have to deal with the moral aspects of ordering them around, having them as slaves, making them rust, sex tools.
02:27:48.000 I think that robots have the potential to be more human than we are, to be more moral than we are.
02:27:55.000 And to be great examples for our children, to raise our children.
02:28:00.000 I think we're going to become very, very, very intimate with these machines.
02:28:06.000 Until I hack your robot.
02:28:08.000 Yeah, and then it's going to make your robot fuck you.
02:28:10.000 Yeah, and kiss your kid.
02:28:11.000 Do you think it's ever possible that robots can be funny?
02:28:14.000 Will they ever figure that out?
02:28:16.000 Is that a mathematical...
02:28:17.000 Yeah, is that a problem?
02:28:19.000 Is that what that is?
02:28:20.000 I think, yeah, you could build a learner that would try to figure out what's funny specifically and tell jokes.
02:28:26.000 I'm sure people have done that.
02:28:27.000 I bet if you look for that, you'll find it.
02:28:29.000 I wonder if a robot could craft a bunch of things that were specifically funny.
02:28:35.000 Because...
02:28:37.000 We always want to think that, well, there's certain things you can't do, you can't recreate talent.
02:28:41.000 Maybe we can.
02:28:42.000 They can paint paintings, they can create symphonies.
02:28:45.000 Maybe we're just very rudimentary right now in our thoughts on what robotics are or what artificial intelligence is, but why would we think that we're so special that we can't be recreated?
02:28:53.000 I mean, it's really kind of arrogant.
02:28:55.000 I know, and it's a losing battle, right?
02:28:57.000 It's like every year there's a new thing that they can do, and we're slowly shrinking down.
02:29:02.000 And you know, at the end of the day, the only thing that might be left is that morality, that ability to be good or evil.
02:29:08.000 That lump of meat that has to sit behind the steering wheel to take the blame.
02:29:13.000 That might be all we have to offer.
02:29:14.000 And we have to figure out what is that for, even?
02:29:17.000 Why does that exist?
02:29:18.000 What is it showing us about the human race where there's a positivity and there's energy to be derived from good behavior and from healthy behavior?
02:29:29.000 It's just very difficult to teach that to people.
02:29:31.000 And if you negatively reinforce it and give them a lot of negative energy in their life and a lot of negative experiences, then they recreate that sort of energy and they go after it over and over again.
02:29:40.000 They get addicted to a certain pattern.
02:29:44.000 That's the number one issue with engineering human beings, period.
02:29:48.000 Forget about robots.
02:29:49.000 We haven't even figured out how to get the human meat machine to operate in the correct manner.
02:29:54.000 That's the great thing about humans, right?
02:29:56.000 We think at least that we have the free will and then we can be good or evil.
02:30:00.000 If you program a robot to be good, no matter what, then it's not good.
02:30:04.000 It's just doing what it's programmed to do.
02:30:05.000 That's interesting.
02:30:06.000 So that's like a lot of trust.
02:30:07.000 I mean, if we ever have a machine that's capable of doing that, that's like creating life.
02:30:14.000 Because that's when you say, go out and do good or evil and then I want to see what you do.
02:30:19.000 That's some more biblical themes right there.
02:30:21.000 It really is, right?
02:30:23.000 Yeah.
02:30:23.000 And that really is biblical.
02:30:25.000 I mean, you really are creating a life form.
02:30:27.000 If you were 52 years old and you had been divorced several times and you had almost no money left and somebody gave you a beautiful robot sex slave that didn't want to vote and had no personality of its own, just existed to fulfill your sexual needs, maybe then you'd understand.
02:30:42.000 But right now you're a young man with hope in your eyes and dreams for the future.
02:30:46.000 If you were broken by a steady stream of bad relationship choices and divorces and you were living in a fucking shack outside of Palmdale with a car that's broken down, you wouldn't think twice about that robot fuck doll.
02:30:56.000 You'd be like, I'm out of the dating scene for life.
02:30:59.000 Do you think that would make people happy?
02:31:02.000 No, certainly not.
02:31:04.000 They'd probably...
02:31:06.000 Drown her and kill themselves in the process.
02:31:11.000 I don't know.
02:31:12.000 I think that it's very hard for people to be happy.
02:31:15.000 And one of the things that people need in order to be happy, in my opinion, is that you need happy people in your life.
02:31:22.000 You need to surround yourself with happy people.
02:31:23.000 That alone is very difficult to find, because it's rare to find a group of people who have managed to maneuver and carve a path through life where the majority of the people they're encountering enjoy their company.
02:31:38.000 The majority of it is a positive experience.
02:31:41.000 Goals have been fulfilled.
02:31:43.000 Health is in order.
02:31:44.000 All those different things, all those variables that have to be in place in order to find a truly happy person, it's really difficult to accumulate a bunch of people like that and get together.
02:31:53.000 So occasionally, like many of us in our lives, have known there are certain people we have to cut off.
02:31:58.000 There's a certain point in time, you know, okay, this person is an energy vampire.
02:32:02.000 They're never getting their own shit together.
02:32:04.000 Hold on, you gotta cut me off for one second.
02:32:05.000 You gotta pee.
02:32:06.000 I gotta pee, man.
02:32:07.000 See, son, my bladder's strong.
02:32:09.000 Mike Goldberg, you hear that?
02:32:10.000 Yeah, please go.
02:32:11.000 Go through right there at that door, and there's a door on the left.
02:32:14.000 See, dude, I ramble so hard I make people pee their pants.
02:32:18.000 That's how I do, son.
02:32:20.000 People are requesting a Brian Redband and Joe Rogan-only podcast for number 300. Oh, yeah.
02:32:25.000 Is that coming up?
02:32:26.000 Yeah, it's coming up, dude.
02:32:27.000 I think we only have a couple more weeks.
02:32:28.000 Yeah.
02:32:29.000 What number is this?
02:32:31.000 284, I think.
02:32:32.000 That's ridiculous, son.
02:32:33.000 We should see if we could coincide it somehow or another.
02:32:36.000 We time them out right.
02:32:38.000 We can coincide it with the end of the month, which is when we started in the first place, right?
02:32:44.000 We started December 31st.
02:32:47.000 Yeah, it was like New Year's Eve, I think.
02:32:51.000 They'll probably know before us.
02:32:52.000 But it seems like that'll wind up around there.
02:32:55.000 Yeah.
02:32:55.000 You know?
02:32:55.000 We should probably time it to that.
02:32:57.000 We should.
02:32:57.000 Yeah.
02:32:58.000 That'd be cool.
02:32:58.000 And so we'll do a three-year anniversary.
02:33:02.000 We've been doing this for three years.
02:33:03.000 That's crazy.
02:33:04.000 Isn't that ridiculous?
02:33:04.000 Yeah.
02:33:05.000 I got a pee, too.
02:33:05.000 Do you have any robot questions for this dude?
02:33:07.000 I don't know.
02:33:07.000 He's probably taking a big stinker in there.
02:33:09.000 He's taking a long front pee.
02:33:10.000 He's snapping one off.
02:33:11.000 He might be beating off all this talk about sex and robots and robot fuck dolls in the Palmdale Desert.
02:33:16.000 Let's check the toilet cam.
02:33:18.000 I think robot fuck dolls is probably going to be one of the first things that people invent besides maids.
02:33:25.000 If you can make a Tara Patrick looking maid, you're going to fuck it.
02:33:28.000 It's going to clean your house and you're going to fuck it.
02:33:30.000 I think it's going to start off as a pet.
02:33:31.000 At first it's going to be some kind of animal, like a cat.
02:33:34.000 I got a pee too, man.
02:33:35.000 We're talking about Tara Patrick as a robot that you can fuck.
02:33:40.000 Elaborate, please.
02:33:42.000 Tara Patrick.
02:33:43.000 Oh, I'm taking over now, huh?
02:33:45.000 You and you, Brian?
02:33:45.000 Yeah.
02:33:46.000 So what do you think about the Olive Garden?
02:33:51.000 I think the Olive Garden is pretty nice.
02:33:54.000 You know that there's a list of companies, including the Olive Garden and Red Lobster, where they're cutting everyone's hours so they don't have to give them health care.
02:34:01.000 Have you heard about this?
02:34:02.000 It's like Red Lobster, Olive Garden, all these legit companies.
02:34:07.000 And because of that Obamacare, they're cutting everyone's hours.
02:34:10.000 And it's pretty fucking weird watching these companies do that.
02:34:14.000 Yeah, it's sort of weird because you see that the corporations have rebounded.
02:34:19.000 They've got all this money, but they're still not quite trusting that the economy is better, so they're sticking with all the cuts they made and riding that as long as they can.
02:34:32.000 Do you collect robots?
02:34:34.000 Do you have a collection of toys?
02:34:38.000 People give me robots a lot because I'm the robot guy.
02:34:42.000 I've been studying them.
02:34:44.000 I have a lot of them around the house, but yeah.
02:34:51.000 Yeah.
02:35:10.000 And like metal pieces that you could put together.
02:35:12.000 And one of the things that they give you instructions for is how to build a robot.
02:35:16.000 And this robot was just like, I think it had like eyeballs that were light bulbs and it moved and stuff.
02:35:22.000 But as a kid, it was like, I built my own robot.
02:35:24.000 There were like a lot of 80s robot kits that you could buy.
02:35:27.000 And they were like, look, it's a robot.
02:35:30.000 You're going to get a robot.
02:35:31.000 And then you get it and you're like, I don't know.
02:35:33.000 It's like Revenge of the Nerds, you know, they have that robot.
02:35:35.000 And you're like, I don't think it can really... You remember Napoleon Dynamite where the uncle got a time machine and he kept trying to use it?
02:35:41.000 Yes, it's like that stuff.
02:35:44.000 I told a friend to meet me at 5, but I gotta text him real quick.
02:35:47.000 I'm so sorry.
02:35:48.000 Oh, that's okay, man.
02:35:49.000 We actually have 5 minutes.
02:35:51.000 Oh, okay.
02:35:52.000 Is this really...
02:35:53.000 We're almost done.
02:35:54.000 Time flies, man.
02:35:55.000 10 minutes?
02:35:56.000 Yeah.
02:35:56.000 It was a long-ass podcast, dude.
02:35:58.000 It was fun, though.
02:35:59.000 Listen, if anybody wants to get a hold of you, they can get a hold of you on Twitter.
02:36:04.000 Your Twitter is...
02:36:05.000 What is it again?
02:36:06.000 It's Daniel Wilson PDX. Daniel Wilson PDX. I have to repeat it just in case they didn't hear you, even though you were very clear.
02:36:14.000 Daniel Wilson PDX, folks.
02:36:17.000 And you can also go to his website, which is DanielHWilson.com.
02:36:23.000 And the book is Robopocalypse.
02:36:27.000 Did I say it right?
02:36:28.000 Robopocalypse.
02:36:29.000 In Germany, it's called Robocalypse because the Po means ass.
02:36:33.000 Oh, really?
02:36:34.000 The German editors were like, hey, so we don't really want it to be like robot ass apocalypse.
02:36:39.000 That sounds perfect.
02:36:40.000 It's a whole different book.
02:36:41.000 That sounds better, man.
02:36:42.000 Why am I... I might have to do a rewrite.
02:36:45.000 Yeah, they're censoring you, man.
02:36:47.000 The man is trying to keep you from selling books.
02:36:49.000 That's fucked up, dude.
02:36:51.000 And Amped is the other book, right?
02:36:55.000 Yeah.
02:36:55.000 It's on Kindle and all that stuff, right?
02:36:57.000 Yeah, they're in bookstores.
02:36:58.000 They're on Amazon.
02:36:59.000 Oh, it's actually on Audible.
02:37:00.000 I got one of those Barnes & Noble Nooks.
02:37:03.000 It's pretty badass, dude.
02:37:05.000 I got it when I was in San Francisco.
02:37:06.000 I like it.
02:37:07.000 Tell me when you get a real tablet.
02:37:09.000 What are you talking about?
02:37:10.000 I like that too, man.
02:37:11.000 But I like for reading, you know what I like about it?
02:37:13.000 That it looks like print on paper.
02:37:16.000 It doesn't have that look of like an LCD screen.
02:37:18.000 You know, the way the Kindles look.
02:37:21.000 You know, the Kindles and the Nooks, they both have that sort of same aspect to it.
02:37:25.000 I like the paper look, yeah.
02:37:27.000 Yeah, I like that.
02:37:28.000 I think it's less eye strain for reading books.
02:37:30.000 And you can get your shit on that, right?
02:37:32.000 Yeah, yeah, absolutely.
02:37:33.000 And what I love about that is on Kindle, if you highlight something in the book, then that stuff can be shared.
02:37:40.000 And so I can actually see what people are highlighting in the book and see what they like.
02:37:45.000 And people comment about it.
02:37:46.000 So let's say, you know...
02:37:47.000 That's some stupid shit.
02:37:48.000 How do they do that?
02:37:49.000 When they highlight it and share it, what do they do?
02:37:51.000 I think you have to have your settings to share.
02:37:54.000 Oh, that's beautiful.
02:37:54.000 What a great idea.
02:37:55.000 It's pretty cool as an author to be able to go onto Amazon and see what people are reading.
02:38:02.000 That's some new stuff.
02:38:03.000 Yeah, and see the exact quotes that people are responding to.
02:38:06.000 It's a stand-up comic.
02:38:07.000 Sometimes you'll say something at a show and the next night someone will quote it and they'll be laughing and you go, oh yeah, I forgot I even said that.
02:38:13.000 You know, it becomes these quoted things.
02:38:17.000 When you see your stuff highlighted and you see like the things that people really enjoy or did enjoy, how much does that affect your next writing?
02:38:27.000 Do you really look a lot at the feedback and try to like see it from their point of view?
02:38:33.000 No, man.
02:38:34.000 It affects my readings because I'll try to go read.
02:38:37.000 When I do a reading, I'll try to read the part that people like, so they think I'm smart and good at writing.
02:38:46.000 So it affects which excerpts you choose for a reading thing.
02:38:50.000 But when I'm writing, man, when I'm nerding out and I'm all super excited, then I know I'm doing it right, basically.
02:38:57.000 Isn't that the coolest thing in the world?
02:38:59.000 Like when you're writing and an idea comes into your head and you're just following it down and it's like building and growing like right before your eyes.
02:39:07.000 Being able to create something and being able to, you know, come up with some shit that didn't exist before and then boom, then all of a sudden it does.
02:39:13.000 It's such an unbelievably satisfying experience.
02:39:16.000 And then it's there.
02:39:17.000 It hangs around.
02:39:18.000 You got it.
02:39:19.000 You're like, I did this.
02:39:20.000 This is what I did in 2009. It's right there.
02:39:24.000 It's a thing I made.
02:39:25.000 If you have a job where you're able to create and also make some kind of artifact that you have, you can say, I did that.
02:39:34.000 New York Times bestselling book.
02:39:35.000 And on top of that, people can continue to buy it.
02:39:38.000 It's always available.
02:39:39.000 They can always get it.
02:39:40.000 They can get it in a hard form.
02:39:42.000 They can get it in a digital form.
02:39:43.000 They can keep getting it.
02:39:44.000 I'm pretty curious what this movie is going to do for you.
02:39:47.000 Oh, it's going to blow through the roof.
02:39:49.000 If you just only had a few bitches getting rape-choked and gagged and ball-gagged and mouth-fucked, you just kind of have them abused a little bit more.
02:39:58.000 Do your books go in order at all, if I'm ready to buy them?
02:40:01.000 No.
02:40:01.000 So Robopocalypse and Amped are two different standalone books.
02:40:05.000 What are your theories on this Fifty Shades of Grey thing and why women are into getting their mouths spit in and stuff like that?
02:40:11.000 What is this change?
02:40:13.000 What is what's going on here?
02:40:14.000 Yeah, I mean, what do you think, right?
02:40:16.000 Because, well, it's interesting that it started out as fan fiction, right?
02:40:19.000 Really?
02:40:19.000 So this was Edward and Bella.
02:40:22.000 You didn't know that?
02:40:22.000 Oh, no, I did not.
02:40:23.000 This book was fan fiction.
02:40:25.000 Shut the fuck up.
02:40:26.000 Fifty Shades of Grey started out with Twilight?
02:40:28.000 Yes.
02:40:29.000 All they did, literally, was change the names of the characters.
02:40:32.000 Oh, my God.
02:40:33.000 And then they published it.
02:40:34.000 That is ridiculous.
02:40:36.000 That's the most silly thing I've ever heard in my life.
02:40:38.000 It's the real deal.
02:40:39.000 So it was someone who was really into that and they just followed on that tone and created like a sexual bondage version of it.
02:40:48.000 You wonder, maybe that's what's the undercurrent, right?
02:40:51.000 I mean, so it's like...
02:40:53.000 Twilight has a story, you know, it has all this stuff, but there's obviously a deep sexual undercurrent.
02:40:59.000 It's a love story, right?
02:41:01.000 And so this lady just went for the jugular.
02:41:04.000 She said, forget all the bullshit.
02:41:06.000 Fuck vampires.
02:41:08.000 Okay, let's say he's a billionaire.
02:41:09.000 He's a financial vampire.
02:41:10.000 And then it's all just about fucking, you know?
02:41:12.000 Wow.
02:41:13.000 Maybe that's it.
02:41:13.000 You know, it's just humans are not as complicated as we like to think.
02:41:17.000 Maybe that's what it's really about.
02:41:18.000 I certainly think that's the case, but I also think that women want to hear shit that's in their voice.
02:41:25.000 So, like, most pornography, I think, is from the male voice.
02:41:29.000 What is that?
02:41:29.000 I just bought his book.
02:41:30.000 Nice.
02:41:31.000 How beautiful is that?
02:41:32.000 You got Ernie Klein next to me.
02:41:33.000 I'm in good company.
02:41:34.000 Powerful applications.
02:41:34.000 Oh, yeah.
02:41:35.000 Have you read that book?
02:41:36.000 Sure.
02:41:36.000 I love it.
02:41:37.000 I blurbed it.
02:41:38.000 My blurb's on that book somewhere.
02:41:39.000 Yeah, I'm reading it right now.
02:41:40.000 Do you think Apple will ever shrink that bitch down so it fits in your pocket?
02:41:43.000 It does fit in my pocket.
02:41:44.000 Your, like, your pants pocket?
02:41:45.000 Yeah, check this out.
02:41:46.000 He's got Apple brand pants.
02:41:48.000 Can you get some apple pants?
02:41:52.000 Okay, if you sit down, that is going to break and it's going to cut your dick in half.
02:41:55.000 Oh my god, it's one man, one iPad.
02:41:57.000 It's so going to cut your dick.
02:42:00.000 You will be one man, one iPad.
02:42:02.000 Don't ever take that out when you're drunk.
02:42:04.000 But if you have a nice man purse, you can carry that around easily.
02:42:07.000 I'm sitting down normal.
02:42:08.000 No, you're not.
02:42:09.000 You're going to die.
02:42:09.000 Your voice is high.
02:42:10.000 You should commit to a man purse.
02:42:12.000 Commit to a man purse.
02:42:14.000 So you're a big proponent of that, using that, because you can get online with it.
02:42:18.000 I do.
02:42:19.000 Photos, everything.
02:42:19.000 It does pretty much everything.
02:42:20.000 I got the non-cellular one because I have an iPhone that has a hotspot.
02:37:24.000 But the more I'm using the hotspot feature on the iPhone, the more I think if you were going to get it, do get the one with the cell phone service built into it.
02:42:32.000 It's just kind of more of a pain in the butt, like, oh, I've got to turn on my hotspot.
02:42:35.000 Right, that's what I was thinking.
02:42:36.000 I was thinking that it would probably be better to wait for the other version of it.
02:42:41.000 But as compared to the two iPads, I have both of them.
02:42:45.000 Definitely I love this one better.
02:42:47.000 It's enough.
02:42:49.000 It's what you need, right?
02:42:50.000 What was crazy was when we were in Seattle, we streamed a Ustream show from it.
02:42:55.000 Yeah.
02:42:56.000 We lost Daniel.
02:42:57.000 Sorry.
02:42:59.000 Do you give a fuck about this kind of stuff?
02:43:01.000 Or are you just holding into robotics?
02:43:03.000 I'm sorry.
02:43:04.000 Whatever.
02:43:05.000 No worries, man.
02:43:06.000 I don't really collect a lot of gadgets.
02:43:08.000 No?
02:43:09.000 Really?
02:43:09.000 I have this thing that is like the magnet doodle, you know?
02:43:12.000 Where you use the little magnet thing and you draw on it.
02:43:16.000 And that's like the closest thing to an iPad that's like at my house right now.
02:43:20.000 It's really sad.
02:43:20.000 The one with the clown?
02:43:21.000 Or what's the one with the clown?
02:43:22.000 It is a toy from my kid, yeah.
02:43:26.000 Lightbrite's pretty badass.
02:43:27.000 I keep meaning to.
02:43:29.000 Lightbrite?
02:43:29.000 Lightbrite?
02:43:30.000 Honestly, I'm just scared because it changes everything.
02:43:33.000 Every time you get one, you know, like already, if I look at the stupid phone, you saw that, it sucks you in.
02:43:39.000 You totally immediately forget about everything.
02:43:41.000 I'm like a little retarded with this.
02:43:43.000 Yeah, when someone has a new one, five minutes, we're almost done.
02:43:46.000 If someone has a new one, man, I'm drawn to it.
02:43:49.000 Like a baby.
02:43:50.000 And that's the other thing.
02:43:51.000 Little kids love it, right?
02:43:53.000 So if you bring it, it's like bringing crack home.
02:43:56.000 It's amazing.
02:43:57.000 And you're done.
02:43:57.000 And I'm kind of afraid of how it will affect my kids, too.
02:44:00.000 Well, there's educational shows on there.
02:44:01.000 And there's educational games that they can count.
02:44:03.000 They learn how to count things.
02:44:05.000 Wasn't I hearing earlier about a certain feline companion that enjoys using the iPads?
02:44:10.000 Yeah, oh yeah, my cat plays with my iPad and pisses all over my couch.
02:44:13.000 And that's not a euphemism for anything?
02:44:16.000 My cat.
02:44:17.000 My cat plays with my iPad and then she pisses all over my couch.
02:44:20.000 Oh no, I do live with a black guy.
02:44:22.000 A black gentleman.
02:44:24.000 My cat.
02:44:24.000 He's an old cat.
02:44:27.000 Was.
02:44:28.000 Name is William.
02:44:29.000 Go by the name of William.
02:44:31.000 He likes iPad.
02:44:32.000 Cat was born in the 40s.
02:44:35.000 Yeah, right?
02:44:37.000 He's an old one-pocket player.
02:44:39.000 It's actually Cat Williams.
02:44:40.000 Oh.
02:44:41.000 Cat Williams.
02:44:42.000 Okay.
02:44:43.000 Enough said.
02:44:44.000 Enough said.
02:44:45.000 Anyway, ladies and gentlemen, I think we gave out all the information.
02:44:48.000 We probably could.
02:44:49.000 Daniel H. Wilson PDX on Twitter.
02:44:53.000 Daniel H. Wilson.
02:44:54.000 No, it's Daniel Wilson PDX on Twitter, not Daniel H. Wilson.
02:44:58.000 So Daniel Wilson PDX on Twitter, DanielHWilson.com.
02:45:02.000 Thanks a lot, man.
02:45:02.000 It's been really fun.
02:45:03.000 It was a fun conversation, really interesting stuff.
02:45:05.000 And if you ever have anything you want to promote, you want to come back, do it again.
02:45:08.000 We'd be happy to have you on.
02:45:09.000 Dude, I had a great time.
02:45:11.000 Sorry I had to urinate.
02:45:12.000 That's so human of me.
02:45:14.000 I had it too.
02:45:15.000 Once you did it, you broke the ice.
02:45:16.000 I had to go do it too.
02:45:18.000 But we know.
02:45:19.000 Look, it's fucking three hours.
02:45:20.000 We don't even take commercial breaks.
02:45:22.000 But I think it makes for a better conversation that way.
02:45:24.000 It's hard to...
02:45:25.000 It's hard to take a commercial break and then come back and just pick up where you were.
02:45:28.000 It just seems awkward, you know?
02:45:29.000 So we do it this way, and it requires people to hold their bladder and keep it together.
02:45:33.000 I think you did an excellent job, though.
02:45:35.000 You had a big thing of water.
02:45:36.000 You lasted like two hours.
02:45:38.000 That's pretty strong.
02:45:39.000 Thank you guys so much for having me on.
02:45:40.000 You're welcome, man.
02:45:40.000 Thank you.
02:45:40.000 I had a great time.
02:45:41.000 And one more time, the two books, your books are Robopocalypse and Amped, and those are the ones that you're promoting, right?
02:45:50.000 But you have more than that, right?
02:45:51.000 Yeah, I have some.
02:45:51.000 How many books do you have altogether?
02:45:53.000 Eight.
02:45:53.000 Eight.
02:45:54.000 Go buy them, bitches.
02:45:55.000 Go get them on Audible.
02:45:57.000 Yeah.
02:45:57.000 Do you have them on Audible?
02:45:59.000 Yeah.
02:46:00.000 Beautiful.
02:46:01.000 Go to audible.com.
02:46:02.000 And if you go to audible.com forward slash Joe, you can get a free book and you get a 30-day free membership to one of the best services that I think...
02:46:16.000 I love Audible.com.
02:46:18.000 I love the idea behind it.
02:46:20.000 And they have a massive selection, including this young man's fantastic robot books.
02:46:25.000 So go get it, you dirty bitches.
02:46:27.000 Thanks to Onnit.com.
02:46:29.000 That's O-N-N-I-T. Go and get yourself some New Mood, bitch, or some Alpha Brain.
02:46:34.000 If you have any questions about these supplements, there's a 100% money-back guarantee on the first 30 pills.
02:46:39.000 You don't even have to return the product.
02:46:41.000 No one's trying to rip you off.
02:42:42.000 They're just trying to sell you the best shit possible.
02:46:46.000 All the stuff that I would use.
02:46:48.000 Use the code name Rogan and you will save yourself 10%.
02:46:52.000 Alright, this fucking show's over.
02:46:53.000 Oh, if you use the code name Sandy, we'll take that 10% and we'll donate it towards Hurricane Relief.
02:46:57.000 We actually decided to go with the Salvation Army because in this case, the Salvation Army is using 100% of the proceeds for Hurricane Relief.
02:47:04.000 It's like, with a lot of them...
02:47:05.000 It can get down to as low as 30% that actually goes to the people and the victims.
02:47:09.000 But Salvation Army in this case, it's 100%.
02:47:11.000 So we're going with them if you use the code name Sandy.
02:47:15.000 All right, fuckers.
02:47:16.000 We might see you tomorrow.
02:47:17.000 We got a lot of work to do.
02:47:18.000 We're going to be at the new studio, tightening shit down.
02:47:22.000 Brian and I are just starting to set that place up.
02:47:24.000 It's not quite done yet.
02:47:26.000 Takes time, bitches.
02:47:27.000 Takes time.
02:47:28.000 But I'll see you guys all in Montreal for sure this Friday at the Metropolis with Duncan Trussell.
02:47:33.000 All right, go fuck yourselves.
02:47:34.000 See ya.
02:47:34.000 Bye.