The Joe Rogan Experience - December 17, 2019


Joe Rogan Experience #1402 - Boyan Slat


Episode Stats

Length

1 hour and 44 minutes

Words per Minute

161.48

Word Count

16,885

Sentence Count

1,375

Misogynist Sentences

23

Hate Speech Sentences

10


Summary

The Great Pacific Garbage Patch is a huge accumulation of plastic floating in the Pacific Ocean between California and Hawaii, twice the size of Texas. Cleaning it up means catching the plastic and removing it from the water, and the biggest challenge is that it's spread out over a vast area, making it impossible to collect all at once. Boyan Slat's answer is an artificial coastline that concentrates the floating plastic so it can be removed periodically. After six years of development and some major setbacks, including a structural failure that forced the first system back to shore, it's finally up and running: the first two shipping containers of ocean plastic were landed in port just before this recording. In this episode, Joe Rogan catches up with the founder of The Ocean Cleanup to hear how the ocean system fared in the wild, how the new solar-powered river Interceptor works, and whether half the patch can really be cleaned up in five years.


Transcript

00:00:03.000 What's up, fella?
00:00:04.000 How are you?
00:00:05.000 Good to see you again, man.
00:00:06.000 Likewise.
00:00:06.000 I've been reading that you are having some great success with your machine, finally.
00:00:11.000 Everything's up and running.
00:00:13.000 Last time we talked, you had yet to implement it, actually out in the wild.
00:00:18.000 Explain to us what happened.
00:00:19.000 You had some bumps in the beginning, right?
00:00:22.000 Yes.
00:00:22.000 Yeah, so it's been quite a few years.
00:00:25.000 Finally, something's happening.
00:00:28.000 So we launched our first ocean system from San Francisco in September of last year, and we took it out, and roughly two months later, we figured that, first of all, it wasn't catching plastic,
00:00:44.000 so what we saw was that the system was moving at roughly the same speed as the plastic.
00:00:48.000 So maybe just take one step back the idea and how it works.
00:00:52.000 So, of course, we have this Great Pacific garbage patch between here and Hawaii, twice the size of Texas, 100 million kilos of plastic, doesn't go away by itself.
00:01:02.000 And the idea was to have this artificial coastline that is driven by the forces of the ocean.
00:01:08.000 We put it in there and the plastic naturally accumulates against it and kind of stays in there so we can then periodically get it out.
00:01:15.000 Because the big challenge is that although there's a lot of plastic, it's spread out over this vast area.
00:01:22.000 So we first have to concentrate it before we can take it out, because if you were to simply trawl the ocean for plastic with boats and nets, it would just take...
00:01:31.000 Forever, really.
00:01:33.000 So the idea was to have those artificial coastlines.
00:01:36.000 We deployed the first one, and then what we saw was that somehow the system was moving at the same speed as the plastic.
00:01:43.000 So you can imagine if this is like your Pac-Man, and this is your catch, and it's moving at the same speed, it's not going in.
00:01:53.000 And sometimes it did go in, but it went out again.
00:01:56.000 We got a video of it, what it was doing.
00:01:58.000 Oh, that's great, yeah.
00:01:59.000 So...
00:02:00.000 So this is the basic idea.
00:02:02.000 But it wasn't doing that.
00:02:04.000 And then we thought, okay, that's alright.
00:02:08.000 We'll learn from it.
00:02:09.000 We'll try and adjust the systems.
00:02:11.000 And then literally exactly a year ago, the system broke into two.
00:02:15.000 And so it was a structural failure, forcing us to tow the whole thing back to land and go back to the drawing board.
00:02:23.000 So we didn't have the best start of this year.
00:02:26.000 How much time has been lost?
00:02:28.000 Or how much time has been spent, I should say, in the beginning phase, the initial version that you launched versus where you're at now?
00:02:35.000 So we've been going at this since 2013. Oh, wow.
00:02:41.000 Six years.
00:02:42.000 Yeah, so basically after five years, launching it and seeing it break into two, that wasn't the best start of the year I could have imagined.
00:02:50.000 But then, yeah, we went to the drawing board, and the team really took it well, and we...
00:02:57.000 We took those lessons into account, adjusted the design and relaunched really just a few months later, so in June.
00:03:03.000 And this time we made the system a bit more modular so we could try different things to try and adjust the speed, make it go faster, make it go slower.
00:03:13.000 And then what we figured was, well, the system isn't going fast enough.
00:03:15.000 What if we actually turn the problem into a solution?
00:03:18.000 What if we turn it around and actually slow it down so that it goes slower than the plastic?
00:03:23.000 And then we figured that that actually works.
00:03:27.000 And in October, we announced that we're actually catching plastic.
00:03:30.000 And really just last week, the first two shipping containers full of plastic were landed in port.
00:03:36.000 Wow, so it's really recently up and running the way you expected it.
00:03:41.000 Now, how long does it take to accumulate two shipping containers full?
00:03:44.000 So that was roughly a month, month and a half.
00:03:47.000 And how big are these shipping containers?
00:03:48.000 20 foot, so probably two of these rooms.
00:03:51.000 So the only thing that's really stopping it from getting more is the actual size of the net itself.
00:03:56.000 Yes, so that's the next step.
00:03:58.000 So now that we went from zero to one, we have the basic principle of catching plastic confirmed.
00:04:05.000 We're going to have to make it bigger before we can build a whole fleet of them because we reckon we need maybe 50 or 100 of them to really clean up half this patch in five years.
00:04:15.000 That's the objective.
00:04:16.000 Half the whole patch in five years?
00:04:18.000 That's the real objective?
00:04:19.000 That's what we're going to do.
00:04:20.000 Wow!
00:04:20.000 Is that really possible?
00:04:21.000 If you have enough systems, yeah.
00:04:23.000 That's incredible.
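As a rough sanity check of the fleet-sizing numbers in this exchange, here is a back-of-envelope sketch. The catch-rate figures come from the conversation (two 20-foot containers in about a month and a half); the container volume and the packing density of ocean plastic are assumptions, not figures from the transcript:

```python
# Back-of-envelope fleet sizing from the numbers mentioned in the conversation.
PATCH_PLASTIC_KG = 100_000_000   # "100 million kilos of plastic"
TARGET_FRACTION = 0.5            # "clean up half this patch"
YEARS = 5

# A 20-ft container holds ~33 m^3; assume loosely packed ocean plastic
# at ~100 kg/m^3 (both assumptions, not from the transcript).
KG_PER_CONTAINER = 33 * 100

containers_per_year = 2 / 1.5 * 12            # two containers per ~1.5 months
kg_per_system_per_year = containers_per_year * KG_PER_CONTAINER

target_kg = PATCH_PLASTIC_KG * TARGET_FRACTION
systems_needed = target_kg / (kg_per_system_per_year * YEARS)
print(f"~{systems_needed:.0f} systems at System 001's catch rate")
```

Under these assumptions the answer lands in the low hundreds of systems at System 001's catch rate, which is consistent with the plan described here: make the next system bigger first, so that a fleet of 50 to 100 is enough.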
00:04:24.000 Now, where are you at in terms of trying to get these systems made and implemented?
00:04:31.000 So now we just finished this first step with the system number one.
00:04:37.000 That's what we called it.
00:04:38.000 And the next step is to develop what we call system two, which is indeed going to be a bigger version.
00:04:43.000 And the idea is to minimize the amount of vessel use that you need for it.
00:04:50.000 Because boats are really fucking expensive.
00:04:53.000 The boat that we have costs roughly 15,000 euros per day to keep running.
00:04:58.000 Just one boat.
00:05:00.000 You don't want boats.
00:05:02.000 Do you anticipate that it ever gets to a point where the amount of money that you can generate from the actual resource of physical plastic can actually pay for the whole experiment?
00:05:13.000 I hope so.
00:05:14.000 That's what we want to trial next year by making products from the catch that we make.
00:05:22.000 The material itself hardly has any value.
00:05:25.000 It's really the story to it.
00:05:27.000 You should make straws so people don't feel guilty.
00:05:31.000 It's a recycled straw.
00:05:33.000 Yeah, that's a terrible idea.
00:05:38.000 It would be a very sustainable business model.
00:05:42.000 What are the products that you're thinking about?
00:05:46.000 We have a few ideas.
00:05:48.000 It's still under development, so I think in September we should be ready to launch the first one.
00:05:53.000 But I think it's going to be things that are durable, that don't end up as waste, that will retain their value so they can last for a very, very long time, and that you actually want, and ideally carry around so you can talk about it with other people.
00:06:10.000 Actually now, so just last week, with that first plastic on shore, we said, okay, now we welcome our supporters to actually make, well, I shouldn't call it a reservation, but kind of make a down payment so that you can be first in line.
00:06:26.000 So if people go to our website, they can actually put in the 50 bucks and get the right for the first ever products made from the Great Pacific Garbage Patch.
00:06:35.000 So you're just trying to figure out what products will have the most sustainability, what products people will keep for the longest?
00:06:41.000 Yeah, and things that people want, right?
00:06:43.000 You don't want some kind of gimmick that's just going to be this paperweight.
00:06:47.000 Flip-flops seem like an easy one, right?
00:06:49.000 Yeah.
00:06:50.000 Don't they?
00:06:50.000 I mean, people love to buy flip-flops.
00:06:53.000 Especially people that are like sort of like outdoorsy type folks.
00:06:57.000 Sure.
00:06:58.000 Appreciate the beach.
00:06:59.000 Maybe I have to write these things down.
00:07:01.000 Flip-flops?
00:07:02.000 That's a pretty easy one.
00:07:03.000 Any other ideas?
00:07:04.000 What else, Jamie?
00:07:06.000 What else would be a good one?
00:07:08.000 Belts.
00:07:09.000 People like belts.
00:07:12.000 Shoes.
00:07:12.000 Definitely some Yeezys.
00:07:14.000 Some dope recycled sneakers.
00:07:18.000 What is foam?
00:07:19.000 The foam that they make running shoes out of.
00:07:21.000 That's a very specific type of...
00:07:24.000 That's not actual plastic, right?
00:07:25.000 It's probably made out of something else.
00:07:27.000 It's a type of plastic.
00:07:28.000 We might be able to foam this material as well.
00:07:31.000 I think we've done tests with that.
00:07:32.000 Oh, really?
00:07:33.000 Turn it into foam?
00:07:33.000 Yes.
00:07:34.000 Oh, through some sort of process.
00:07:35.000 Yeah, I mean, you could make athletic shoes.
00:07:37.000 That would be easy, right?
00:07:39.000 People, but...
00:07:41.000 You'd have to have a way to incentivize people to recycle them.
00:07:46.000 It would be so ironic if those fucking things wound up back in the ocean.
00:07:49.000 There's got to be a way to do that.
00:07:52.000 If you have your own company drop-off points in cities where when they're done with their stuff, if it's broken down or it's old, you could throw it into this bin and you will ensure that it gets converted back into raw materials and utilized again.
00:08:09.000 Yeah, that'd be a great move.
00:08:12.000 And people would do that if you made it easy for them.
00:08:15.000 Sort of like recycling bins.
00:08:17.000 If you make it easy for them, they'll throw their bottle in there.
00:08:21.000 So you have these two cargo ships or these two cargo containers filled with this plastic stuff.
00:08:30.000 What do you do with it now?
00:08:32.000 Yeah, so now it's going to Europe.
00:08:34.000 Unfortunately, there isn't really any useful recycling infrastructure in the US. So we set up this infrastructure in Europe to be able to first sort it, and then shred it, and then recycle it.
00:08:47.000 And then make those first products out of them.
00:08:50.000 So hopefully, and hopefully with that, then generate the cash needed to continue running the cleanup.
00:08:58.000 And of course, now it's still small scale.
00:09:00.000 Eventually, we should have that many shipping containers every day, probably.
00:09:06.000 So do you have a group of people that's trying to come up with ideas of what to make out of the plastic?
00:09:09.000 Yeah, it's a little team inside The Ocean Cleanup working on that.
00:09:15.000 I think they say that by September they should be ready to launch the first product.
00:09:21.000 That's great, man.
00:09:23.000 The whole idea behind it is beautiful.
00:09:25.000 You have a river system too as well, right?
00:09:28.000 Yes, so that's the other thing, right?
00:09:31.000 So on one hand, we need to clean up what's already in the ocean. It doesn't go away by itself, and basically the only way to deal with that is to just go out there and clean it up.
00:09:45.000 But of course, then there's this other side of the equation, which is there's still huge, huge amounts of plastic flowing into the ocean every day, mostly from countries in Central America, Southeast Asia, where people are kind of at this stage of development or countries are at a stage of development where the people are wealthy enough to consume a lot of things that are wrapped in plastic,
00:10:10.000 yet there isn't any waste infrastructure yet to take care of it.
00:10:13.000 So...
00:10:14.000 You literally see people on scooters just drive to a bridge to dump their municipal waste into the river because that's simply the easiest way to get rid of it.
00:10:26.000 To your point, whatever's easiest, people will do.
00:10:29.000 And so it's not really that people don't care there or that they are less civilized or something, but it's really a combination.
00:10:38.000 There's a lot of people and there is no infrastructure that they can make use of.
00:10:45.000 Back in 2015, we were like, okay, maybe at some point in time this ocean thing will work out.
00:10:51.000 Who knows?
00:10:52.000 But then we're stuck with this problem that there's still so much plastic flowing in that we would just have to keep going forever.
00:10:59.000 That would just be not very motivating and we want to be this project with a beginning and an end.
00:11:06.000 So we're like, okay, so where's the plastic coming from?
00:11:08.000 And then we figured, you know, probably rivers.
00:11:10.000 Rivers are like these arteries that carry the trash from land to sea, because when it rains, plastic washes from streets to creeks to rivers to the ocean.
00:11:21.000 But then we found out that there are 100,000 rivers in the world.
00:11:26.000 So that's kind of a big amount if you want to do something about it.
00:11:30.000 So we started doing measurements in rivers.
00:11:33.000 And then what we found was that just 1% of rivers are responsible for 80% of the pollution.
00:11:40.000 So really just a very tiny amount of rivers, if you were to tackle those, could really address the majority of the plastic going into the ocean.
00:11:50.000 And it's mostly like these relatively small rivers in capital cities like Manila, Jakarta, Kuala Lumpur.
00:12:00.000 Where you have a very high density of people.
00:12:03.000 Near the coast, that's where most of the leakage, most of the emissions occurs.
00:12:08.000 So...
00:12:09.000 So since 2015, we've been kind of as a secret side project, been working on seeing, well, can we actually develop something to intercept the plastic in those rivers?
00:12:19.000 And we just launched it a month ago.
00:12:21.000 We call it the Interceptor.
00:12:23.000 And it's this scalable system that's almost like plug-and-play.
00:12:30.000 So you bring it to a river and you install it, and it just works.
00:12:36.000 It's fully autonomous, solar-powered...
00:12:38.000 So this is all the real plastic that's being pulled out of this river from your machine.
00:12:43.000 That's incredible.
00:12:44.000 This was the prototype.
00:12:46.000 Dude, that's amazing.
00:12:47.000 This was in Jakarta.
00:12:49.000 For people that are just listening, we're looking at this thing pulling enormous amounts of plastic out of this river and stacking it into these bags.
00:12:58.000 It's a large physical quantity of stuff.
00:13:01.000 Oh yeah, and then maybe you can actually pull up the video of Interceptor 2 in Malaysia.
00:13:06.000 So we already have two of them in real life as we speak.
00:13:10.000 How does it avoid doing anything with fish?
00:13:13.000 How do you avoid capturing accidentally?
00:13:16.000 Oh my god, is that real?
00:13:18.000 Yeah, so this is the Klang River in Kuala Lumpur.
00:13:21.000 And it's, according to our model, it's like the fifth most polluting river in the world.
00:13:25.000 So 1% of all plastic going into the world's oceans is coming from that one river.
00:13:30.000 This is unbelievable how polluted this is.
00:13:33.000 This is crazy.
00:13:34.000 It's 10 million kilos per year, roughly.
00:13:36.000 Just looking at it, it looks like a wasteland.
00:13:39.000 That's so sad.
00:13:41.000 Yeah, so we now have four interceptors.
00:13:46.000 Two of them have already been deployed.
00:13:48.000 Here's the one going to this Klang River.
00:13:53.000 And we kind of wanted to make it look like a spaceship, just so people would like it.
00:14:00.000 And so it has this barrier that concentrates the plastic to the mouth of the interceptor where you have a conveyor belt that then scoops it out of the water.
00:14:07.000 Again, fully solar and battery powered, and then deposits it onto this moving shuttle conveyor, which then distributes it across these big dumpsters, which can hold roughly 50 cubic meters of trash, and it just works by itself,
00:14:25.000 so that's what it does.
00:14:28.000 That's an insane amount of garbage that you're pulling out of there.
00:14:31.000 When you look at it visually, folks, you can watch the video.
00:14:34.000 What is the name of this video, Jamie, so people can find it?
00:14:37.000 Rivers Interceptor 002 Cleaning in Malaysia is the title of the YouTube video.
00:14:43.000 It's crazy.
00:14:44.000 Now, you're not catching any fish in this?
00:14:47.000 Oh, yeah.
00:14:48.000 So, because this barrier is non-permeable, the current just flows underneath it.
00:14:54.000 Basically, the sea life can just pass it, actually.
00:14:57.000 At one point, we had this giant lizard, which was probably two meters.
00:15:01.000 We should probably post that photo.
00:15:03.000 It was actually kind of climbing onto the barrier, and then it just swam around it.
00:15:09.000 Two meters?
00:15:10.000 Really?
00:15:10.000 Yeah, one and a half.
00:15:11.000 It was...
00:15:12.000 What kind of lizard is that?
00:15:15.000 I don't know the name, but it was...
00:15:18.000 Some kind of monitor or something?
00:15:19.000 Monitor, you're right.
00:15:21.000 Where is that?
00:15:22.000 Do you see an image of that?
00:15:25.000 No?
00:15:26.000 I'll post that next week.
00:15:27.000 Oh, you haven't posted it before?
00:15:28.000 Oh, okay.
00:15:29.000 He was looking for it.
00:15:30.000 I thought you were saying it was out there.
00:15:32.000 So, it's safe for fish.
00:15:34.000 What about the stuff that doesn't float on the very surface?
00:15:38.000 Right.
00:15:39.000 So the system goes down one meter.
00:15:41.000 What we measured is that really almost all the plastic is in that top layer.
00:15:46.000 So sure, it won't be 100% efficient, but I think it's really about having this pragmatic thing that catches most of it.
00:15:53.000 And it most importantly leaves wildlife alone because everything else can just swim underneath that.
00:15:58.000 Exactly.
00:15:59.000 Yeah, that's great.
00:16:01.000 And so this plan that you had when it's been six years running, how much of your daily time is devoted to this?
00:16:13.000 Oh, I don't think there's much free time at all.
00:16:18.000 Especially past year, I've not had a single free day.
00:16:24.000 Not a single?
00:16:25.000 Yeah, just 9am to usually 9pm in the office.
00:16:32.000 It's been busy, but I think it was worth it, looking where we were at the beginning of the year to where we're now.
00:16:40.000 Well, now that you've actually pulled these cargo containers filled with plastic out of the ocean, that must give you an extreme feeling of satisfaction, right?
00:16:48.000 You've actually made it work.
00:16:51.000 It's moving now.
00:16:53.000 So I was kind of hoping for that feeling, but then when you get to that point, you're like, okay, but you can really only see the amount of work that's still ahead of you.
00:17:02.000 So it's actually really hard to enjoy successes in a way.
00:17:09.000 Mm-hmm.
00:17:10.000 I should probably get better at that.
00:17:12.000 It's hard.
00:17:12.000 Well, particularly what you're doing, you have a monumental task in front of you.
00:17:17.000 And what you're doing is rightly being applauded by so many people, but I don't know how many people are actually helping you.
00:17:22.000 You have a crazy thing that you're doing.
00:17:24.000 You're trying to pull the plastic out of the ocean.
00:17:26.000 When people find out about the Great Pacific Garbage Patch, they get panicky.
00:17:31.000 They're like, what?
00:17:31.000 How long has this been going on?
00:17:32.000 How do I not know about this?
00:17:34.000 Because so few people...
00:17:36.000 I mean, I would think like maybe 40% of the population understands that there's a gigantic patch of garbage in the middle of the ocean.
00:17:41.000 And it was discovered 20 years ago.
00:17:43.000 Yeah.
00:17:43.000 Exactly 20 years ago.
00:17:44.000 That's crazy.
00:17:45.000 And still it's there, just been growing.
00:17:47.000 So, 1998, no one had a goddamn clue.
00:17:49.000 Nope.
00:17:50.000 And then they went, wait, hey, what?
00:17:52.000 What's going on here?
00:17:53.000 What is all this garbage?
00:17:54.000 And it keeps getting bigger and bigger, right?
00:17:56.000 Yeah, it's actually quite a good story.
00:17:57.000 This is a sailor called Charles Moore who was participating in a sail race between Hawaii and California.
00:18:05.000 And while others would go further north, he thought, well, let's try and cut off this piece.
00:18:12.000 And then he was looking at the water and he just saw all that trash.
00:18:17.000 Then he went back, he was so shocked about it, and then he decided to take some measurements, publish the results, and that kind of popularized that whole concept of the Great Pacific Garbage Patch.
00:18:30.000 It's a weird thing to see when you see drones flying over it, and you see the footage of it.
00:18:35.000 And it's also, a lot of it is a lot smaller than people think of it, because it's broken down by the ocean, right?
00:18:41.000 Yeah, so that's what happens over time, is that these larger objects that enter the ocean, due to the working of the waves as well as the sun, break down into these smaller and smaller pieces, which is actually not really a good thing, because these smaller pieces are then easier to ingest for fish and other wildlife.
00:19:02.000 So the smaller it gets, in a way, the more harmful it gets as well.
00:19:08.000 Fortunately, what we see is that still 92% of the plastic is still non-microplastic, so big stuff.
00:19:16.000 But of course, if we don't clean it up over the next few decades, all of that big stuff will also become microplastics, and then we're in a much worse state.
00:19:23.000 Is the cleanup of those microplastics possible, or is it just something that needs to be sort of rethought out?
00:19:30.000 Well, so that was actually one of the positive surprises that we had this year, is that the cleanup system in the patch wasn't just catching the big plastic.
00:19:40.000 It was also catching most of the microplastics.
00:19:43.000 So down to one millimeter.
00:19:45.000 Because it all gets clogged up with all the other stuff?
00:19:47.000 Is that what it is?
00:19:48.000 We're not exactly sure how it was able to do that, but we just saw huge amounts of those microplastics in the system.
00:19:56.000 It probably has something to do with the radiation of the waves. You have that big pipe that keeps the system together.
00:20:03.000 And because waves are kind of crashing against it, it reflects waves as well.
00:20:08.000 And almost like a lens, it was concentrating those microplastics into one patch in the middle of the system, which just kind of stayed in the system.
00:20:20.000 So that was really, we weren't expecting to collect microplastics, but there we were.
00:20:25.000 That's pretty cool.
00:20:28.000 So now, where are you at in terms of trying to expand it to a point where you could, you know, really get this goal of half of the plastic of the ocean in five years?
00:20:39.000 Yeah, so it would probably be easier if we had one goal, but we now set two goals for ourselves.
00:20:45.000 One is the 50% in five years for the patch, but the other one is that we want to have Interceptors in the 1,000 most polluting rivers, the ones that do the 80%, in the next five years.
00:20:58.000 So we'll be pretty busy.
00:21:02.000 Really, really busy.
00:21:03.000 Yes.
00:21:04.000 As if you're not busy enough.
00:21:06.000 How many people are working in your organization?
00:21:09.000 About a hundred.
00:21:10.000 I think the team is now better than it's ever been.
00:21:16.000 Fortunately, there is definitely a lot of help, but we're still recruiting.
00:21:22.000 I think another 20, 30 people in the coming half year.
00:21:27.000 Definitely, there is a lot of work to do.
00:21:30.000 What has this ride been like for you from being this really young guy when you figured this out?
00:21:35.000 How old were you when you came up with the idea?
00:21:37.000 The first idea was when I was 16, but I really founded the organization when I was 18. Yeah, so that's really young.
00:21:44.000 18 to where you are now, just being constantly involved in this process.
00:21:48.000 What has that been like for you?
00:21:51.000 Educational, I would say.
00:21:53.000 So really when I started and when I look back at when I started, I really didn't have a bloody clue what I was doing.
00:22:00.000 And I suppose that was a good thing because if I would have known how complicated and how big it would have to become in order to actually...
00:22:09.000 Take some plastic out of the ocean.
00:22:11.000 I probably wouldn't have started it.
00:22:13.000 It was just too big.
00:22:15.000 I remember giving my first presentation back in 2012 and somebody approached me and said, okay, it's a great idea.
00:22:25.000 It's going to cost tens of millions of dollars.
00:22:29.000 You're going to need a team of maybe 100 people to get this to a point that it could actually work.
00:22:35.000 And I thought this guy was crazy.
00:22:37.000 No way.
00:22:38.000 So that's why I started a $2 million crowdfunding campaign to get it started.
00:22:44.000 And yeah, he was closer to the truth than I could have ever imagined.
00:22:51.000 When you think about the amount of time that you're investing in this, how do you see yourself ever getting off this ride?
00:22:57.000 Is this what you want to do for the rest of your life?
00:23:00.000 No, so I simply want to solve problems, and I think this is kind of a good starter problem.
00:23:09.000 I think it's very feasible to actually solve it.
00:23:15.000 It's, by the way, hilarious that you're talking about the Great Pacific Garbage Patch as a starter problem.
00:23:21.000 Oh, well.
00:23:22.000 What kind of ambition do you have?
00:23:25.000 That's one of the most perplexing problems with garbage and waste today.
00:23:31.000 Yeah, but I do think it's solvable.
00:23:35.000 I think it is too, according to you and what you're saying, but to call it a starter problem is hilarious.
00:23:41.000 Yeah, and I think the exciting thing for me is that I picked this problem as the first one because I believed it would not just be solvable, it's solvable by a relatively small group of passionate people.
00:23:57.000 Yeah, so of course what I hope is that with The Ocean Cleanup we can kind of create this blueprint of how you solve a problem and how you make civilization a bit more sustainable, so that hopefully with that blueprint we can not only solve other problems in the future but also inspire others to do the same thing.
00:24:18.000 Well, that's a beautiful sentiment.
00:24:20.000 And do you have any other things that you want to try to solve once you've sort of stepped away from this?
00:24:26.000 Yeah, so I really...
00:24:29.000 There's definitely not going to be a shortage of ideas.
00:24:34.000 So I keep this little booklet that's kind of overflowing, but...
00:24:39.000 What I realized is that to be successful with the cleanup, I really need razor-sharp focus, and I can only do one thing at a time.
00:24:52.000 Ideas are like viruses: when one enters your mind, it kind of expands and evolves, and it's really quite dangerous actually to have new ideas.
00:25:04.000 I forgot who said that, but somebody recently I heard saying, the best thing you can do is having one great idea and then never having any other ideas in the rest of your life.
00:25:15.000 So just because, you know, to be...
00:25:17.000 Your resources.
00:25:18.000 Right.
00:25:19.000 To achieve something, you need full focus.
00:25:22.000 I think it was Stuart Brand, by the way.
00:25:24.000 It's a good thing to say.
00:25:24.000 It's accurate.
00:25:25.000 What do you think, though? What I was going to get at was, do you ever conceive of the possibility of coming up with something that removes carbon from the atmosphere?
00:25:33.000 That's a giant issue with us, right?
00:25:36.000 Right.
00:25:37.000 Carbon emissions.
00:25:38.000 So, definitely, I believe negative emissions, as I think you referred to them, will be required to meet the goals, to kind of keep the warming in check.
00:25:55.000 However, it's a much more difficult problem because if you think of the ocean, it's basically a two-dimensional problem.
00:26:03.000 It's plastic on the surface and fortunately it's not even the whole ocean.
00:26:08.000 It's kind of concentrating in these accumulation zones.
00:26:11.000 So the garbage patch...
00:26:12.000 Although it's twice the size of Texas, it's still 1.6 million square kilometers, while the ocean is like 300 million square kilometers.
00:26:20.000 So it's really just maybe less than a percent of the ocean which needs to be cleaned.
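That "less than a percent" figure checks out against the rounded areas quoted just above; a quick verification:

```python
# Fraction of the ocean surface covered by the patch's accumulation zone,
# using the rounded figures from the conversation.
patch_km2 = 1.6e6   # Great Pacific Garbage Patch, "1.6 million square kilometers"
ocean_km2 = 3.0e8   # total ocean surface, "like 300 million square kilometers"

fraction = patch_km2 / ocean_km2
print(f"{fraction:.2%}")  # about 0.53%, i.e. less than a percent
```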
00:26:26.000 And again, it's a two-dimensional problem.
00:26:28.000 Well, the atmosphere is three-dimensional.
00:26:32.000 So it's just this one-dimensional increase is just...
00:26:37.000 Yeah, it's just a huge, huge challenge.
00:26:40.000 So I do think it needs to be tackled, and it's definitely an exciting problem to think about.
00:26:47.000 I do think that's definitely not a good startup problem to work on.
00:26:53.000 No.
00:26:54.000 Wasn't there something, Jamie, that we had talked about where they had figured out a way to make these building-sized, essentially vacuum cleaners they were going to put in the center of certain cities?
00:27:03.000 I believe it was in Asia, maybe perhaps China.
00:27:06.000 They'd come up with this.
00:27:07.000 I don't know if they implemented it yet, but the idea was to have these enormous things in place that look like a skyscraper.
00:27:13.000 And really, it was just a huge vacuum cleaner for carbon.
00:27:17.000 Sure.
00:27:17.000 I know there are a few companies that work on it.
00:27:20.000 I believe Carbon Engineering is one.
00:27:22.000 There is also one out of Switzerland.
00:27:25.000 I forgot the name.
00:27:27.000 But definitely good, smart people are working on that problem.
00:27:32.000 I'm not sure where they are in terms of the economics and scale.
00:27:36.000 Is it right here?
00:27:36.000 When he mentions Carbon Engineering, is this the one?
00:27:39.000 Okay, so that looks like giant fans, like a huge building filled with fans.
00:27:45.000 We believe humanity can solve climate change.
00:27:48.000 Yikes.
00:27:49.000 Imagine, like, we have filters for air the same way we have filters for water.
00:27:53.000 Direct air capture technology.
00:27:56.000 Carbon engineering.
00:27:58.000 More than 10 years in the making that can capture carbon dioxide directly from the atmosphere.
00:28:03.000 And look at that machine.
00:28:04.000 Try to get a close-up on what that thing looks like.
00:28:06.000 It looks like giant fucking washing machines.
00:28:09.000 Right?
00:28:09.000 Like it's washing the air.
00:28:11.000 Doesn't it look like giant washing machines?
00:28:13.000 Wow.
00:28:14.000 I mean, it seems like it's feasible.
00:28:16.000 It doesn't seem like it's something that's impossible.
00:28:19.000 Yeah.
00:28:19.000 I think it's the scalability that's the main challenge.
00:28:23.000 Well, it's also funding.
00:28:24.000 Like if you drive over, or fly over Manhattan rather, and see the density of the structures and how many buildings are in there, you know that people can make some pretty insane shit.
00:28:33.000 Right.
00:28:33.000 Why couldn't they make some giant insane vacuum cleaner for the air that's, you know, as big as a city block?
00:28:38.000 Of course, a lot of it comes down to economics.
00:28:41.000 Our system is not very good at valuing things that are long-term or that don't directly benefit ourselves.
00:28:49.000 So definitely...
00:28:51.000 People tax it.
00:28:53.000 They'll find a way to make it profitable.
00:28:55.000 Is this another one?
00:28:55.000 Rendering of what one would look like to capture 1 million tons of CO2 per year.
00:29:00.000 Whoa!
00:29:01.000 It looks like it would be noisy.
00:29:03.000 Oh yeah, probably annoying as fuck.
00:29:05.000 Look at those fans.
00:29:06.000 That's so weird though.
00:29:08.000 Like the whole array of fans.
00:29:10.000 Like, okay, that seems like a way to do it.
00:29:13.000 Looks like somebody built a giant computer and tried to cool it or something.
00:29:16.000 And then they'll have all this carbon.
00:29:18.000 What the fuck did they do with that?
00:29:19.000 Burn it?
00:29:20.000 What do they do?
00:29:21.000 Yeah, I don't know.
00:29:22.000 You make shit out of it?
00:29:23.000 What do you do?
00:29:23.000 Make diamonds?
00:29:24.000 Imagine that.
00:29:25.000 Diamonds are a girl's best friend.
00:29:26.000 You make it out of the carbon that you pull out of the air.
00:29:28.000 That would be a good business model.
00:29:29.000 That would be a great business model.
00:29:30.000 This would be like a green diamond.
00:29:32.000 A diamond that's actually made and it's all pressured by solar power.
00:29:36.000 They use solar power to fucking smash it.
00:29:38.000 Carbon fiber too.
00:29:40.000 Is that the same carbon fiber?
00:29:41.000 I don't know.
00:29:42.000 Is it the same shit?
00:29:43.000 Yeah, why not, right?
00:29:44.000 It must be.
00:29:44.000 It's carbon, right?
00:29:46.000 So, carbon is how they make...
00:29:48.000 It's coal, right?
00:29:49.000 Which is essentially carbon, right?
00:29:52.000 Do they make diamonds from carbon, or do they make diamonds just from coal?
00:29:55.000 What is coal?
00:29:56.000 Coal is like burnt shit, right?
00:29:59.000 There's many forms in which carbon exists.
00:30:03.000 Yeah.
00:30:03.000 So, different crystal structures.
00:30:05.000 They are doing that now, where they are making commercially made diamonds.
00:30:09.000 Diamonds are made of carbon.
00:30:11.000 So they form as carbon atoms under a high temperature and pressure.
00:30:15.000 They bond together to start growing crystals.
00:30:17.000 That's why a diamond is such a hard material because you have each carbon atom participating in four of these very strong covalent bonds that form between carbon atoms.
00:30:28.000 I've never read that word out loud.
00:30:30.000 Covalent?
00:30:31.000 Have you ever read that word out loud?
00:30:33.000 Covalent, I believe.
00:30:34.000 Is that how you say it?
00:30:35.000 I've never even seen that word.
00:30:36.000 So these bonds that form between carbon atoms.
00:30:39.000 So I know they're doing that now.
00:30:41.000 They're making diamonds with certain machines.
00:30:44.000 High pressure, high heat.
00:30:45.000 That would be hilarious.
00:30:48.000 That would be a good thing, too, because they would put a dent in the actual diamond market, which is this weird lockdown fucking strange market.
00:30:55.000 Because diamonds aren't nearly as valuable.
00:30:57.000 As they're set out to be.
00:30:59.000 De Beers takes these diamonds and they stockpile them and they only release a certain amount of them and they keep the price very high.
00:31:05.000 But it's all engineered.
00:31:08.000 Diamonds used to be far more rare than they are now.
00:31:11.000 But with the innovation in mining technology and their ability to get to diamonds they couldn't get to before, they have a lot of diamonds.
00:31:19.000 It's not as valuable as it appears when you go to buy one.
00:31:23.000 Didn't know that.
00:31:24.000 So we can make carbon diamonds, bro.
00:31:27.000 And actually, plastic, again, it's just carbon chains.
00:31:31.000 Oh, that's right.
00:31:31.000 We could even make diamonds out of ocean plastic.
00:31:34.000 Whoa, that would be the ultimate green diamond.
00:31:37.000 Imagine if you were a really ecologically minded rapper.
00:31:41.000 You could wear it. All your ice could come from the ocean.
00:31:44.000 Let everybody know.
00:31:46.000 From trash to treasure.
00:31:47.000 Yes.
00:31:48.000 Dude, that's the signature of the company.
00:31:52.000 In quotes.
00:31:53.000 From trash to treasure.
00:31:54.000 Write these things down.
00:31:56.000 Boyan Diamonds.
00:31:57.000 How about that?
00:31:58.000 I like it.
00:31:59.000 Yeah.
00:31:59.000 Dude, you could be the first guy to do this.
00:32:03.000 Here we go.
00:32:03.000 Plastic.
00:32:04.000 This is an ocean diamond.
00:32:06.000 Whoa!
00:32:07.000 Earth is crushing the ocean into salty diamonds.
00:32:11.000 That's a dope-looking diamond, too.
00:32:13.000 What is that?
00:32:14.000 Salt, I guess.
00:32:16.000 Recreated salty diamond deposits in a high-pressure, high-temperature experiment suggesting that many of Earth's diamonds form when the mantle crushes ancient seabed minerals.
00:32:30.000 Isn't science and the Earth cool?
00:32:34.000 I mean, if you do get to do this, here's another problem, okay?
00:32:38.000 Here's a big one for the ocean.
00:32:40.000 We're depleting it of seafood, of life.
00:32:43.000 I mean, you know, I had, how do you say his name again?
00:32:47.000 Psihoyos, right?
00:32:48.000 Louie Psihoyos, who directed The Cove, on.
00:32:51.000 Yeah.
00:32:51.000 And we were talking about the depletion of the wildlife in the ocean.
00:32:56.000 And when you start looking at it on a grand scale, like how much fish they're pulling out of the ocean, it's very sobering.
00:33:03.000 Maybe you can come up with a way to replenish fish in the ocean so we can continue eating sushi.
00:33:10.000 What do you think?
00:33:12.000 So maybe just zoom out a bit.
00:33:16.000 Because of course plastic pollution, climate change, overfishing, I think it's all part of one big problem to make civilization sustainable.
00:33:26.000 The way I look at it is that, of course, over the past 200 years, humanity has made tremendous progress.
00:33:34.000 So, of course, since the agricultural revolution 10,000 years ago, humanity had been kind of stagnant, no progress, just very, very slow progress; number of people, lifespan, it was all kind of flat,
00:33:50.000 nothing really happened.
00:33:52.000 Since the dawn of the Industrial Revolution, when we learned how to utilize science and our knowledge, collective knowledge, to turn that into progress, basically every possible metric for humanity has improved tremendously.
00:34:11.000 If you think of wealth, health, violence, education, rights, all these things.
00:34:18.000 I know you've had Steven Pinker on.
00:34:19.000 He's much more knowledgeable on that topic than I am.
00:34:24.000 Yet, so truly, at this point in time, it has never been a better time to be alive for humans than today.
00:34:33.000 Not saying that it can't get better, but we have made tremendous progress.
00:34:38.000 On one hand, imagining things that don't exist yet, so inventing technologies and also inventing institutions.
00:34:47.000 And on the other hand, our human ability to collaborate effectively in large numbers, which includes the corporation, which is a very effective way for people to work together.
00:35:00.000 Now, all that progress has also had its negative side effects, which are most pronounced, of course, in the area of the environment, where we put things into the environment that don't belong there, and we take more out of it than nature can replenish,
00:35:17.000 which includes the fish, and on the other hand, you have the plastic going into the environment, etc. So then the question is, well, how do we solve that?
00:35:27.000 And, of course, one hand is to say, okay, it's kind of the...
00:35:30.000 Maybe "Luddites" is a bit of a negative way to phrase it, but the reactionary approach of saying, okay, we should...
00:35:41.000 Consume less.
00:35:43.000 Corporations are bad.
00:35:44.000 Technology is bad.
00:35:45.000 We should all get rid of all those things.
00:35:48.000 And I think the modern environmental movement, which is really kind of this romantic movement, has this image of back in the day, everything was great and we lived in harmony with nature.
00:35:59.000 So let's get rid of all this modernity and try and return to that pure original state.
00:36:06.000 What I, however, believe is that, first of all, I don't think it's a very realistic thing.
00:36:11.000 People want to keep their iPhones and their cars and people want to move forward.
00:36:17.000 And at the same time, I don't think it's really the most effective way to solve these problems, because it would be like fighting a Leopard tank with a bow and arrow.
00:36:27.000 Technology is nothing more than an enabler of human capabilities.
00:36:32.000 It enhances our power.
00:36:34.000 Why not use that power to also try and solve these problems as well?
00:36:39.000 Rather than try and reject business, reject technology, I truly believe that we should embrace those forces that make us human and has created this amazing world to also try and solve these negative side effects as well.
00:36:55.000 That's why I believe the overconsumption of fish is not going to end by people all becoming vegan, but rather through fake meat.
00:37:06.000 I think that the transport emissions are not going to be solved by people not flying anymore or not going anywhere anymore.
00:37:15.000 Realistically, people are going to fly more, so we better invent technologies that allow people to do that without harming the environment.
00:37:24.000 The same thing, I think, would be the case for plastic and really other energy uses as well.
00:37:31.000 No, I think that's a very wise way of looking at it, and it's a hopeful way of looking at it.
00:37:39.000 Today, even though you're dealing with statistics and factual information, like the fact that it's safer to live today, there's less violent crime, it's easier to get by, there's more technology, more innovation, medical technologies have improved radically,
00:37:54.000 all these things are true, but you still have to say, it's not where we want it to be.
00:38:00.000 I'm not saying that the world's perfect.
00:38:01.000 You have to say that, even you.
00:38:04.000 It's the worry about people barking at you.
00:38:09.000 It's still terrible in parts of the world.
00:38:11.000 It's still terrible for people of color.
00:38:14.000 It's still terrible for trans.
00:38:15.000 I get it.
00:38:16.000 I get it.
00:38:16.000 I get it.
00:38:17.000 No one's saying that there's not room for improvement.
00:38:19.000 But you have to say that.
00:38:21.000 Even though you felt compelled, it's still not perfect.
00:38:24.000 Yeah.
00:38:24.000 And I wonder why it's so controversial.
00:38:29.000 I think it's important to learn from the things that we do well and then apply that.
00:38:35.000 I don't think it is that controversial.
00:38:37.000 I think it's a trick.
00:38:38.000 I think there's just a lot of people looking for every single opportunity to complain, even to someone like you who has objectively done nothing but good.
00:38:47.000 Right.
00:38:47.000 You say one thing.
00:38:49.000 I mean, Steven Pinker took a ton of heat for saying that.
00:38:53.000 And even though he's talking about actual scientific statistics, he's not saying the world's perfect and everyone should shut up.
00:39:00.000 What he's saying is we should look at this from a bird's eye view, look down and understand that although there's much work to be done, we're in a great place in comparison to the rest of human history.
00:39:11.000 And it's hopeful to realize that progress is possible.
00:39:15.000 I can imagine that there's something that feels intuitively right, as if every step forward would also have to equal a step backward elsewhere.
00:39:26.000 Yes.
00:39:27.000 I don't think that's the case.
00:39:29.000 There's plenty of things that you can invent that are not that.
00:39:35.000 And we see it, for example, with carbon right now that there's countries where, like Sweden, GDP has grown a lot past 20 years, carbon emissions has gone down.
00:39:46.000 So they call that the decoupling.
00:39:49.000 And I think what's really the main challenge in this century is to decouple human progress from those negative side effects.
00:40:00.000 I think the way to do that is not reactionary.
00:40:03.000 It's really, again, through innovation and through collaboration.
00:40:10.000 I agree with you, and I think that a lot of times people just assume that these are the consequences of innovation, that there's a pro and a con to everything, because there has been so many things.
00:40:20.000 There have been so many things that are inventions that there are a pro and a con to it.
00:40:25.000 But that doesn't necessarily mean it has to be that way.
00:40:28.000 No, and even if things have a pro and a con, it doesn't mean the pro is as big as the con.
00:40:35.000 And if that would be the case, all the technology, every technology would be neutral, and it wouldn't matter what you invent, but it would mean that an atomic bomb is...
00:40:50.000 I think?
00:41:04.000 There is a certain use that you prescribe with your invention.
00:41:10.000 I mean, you don't use nuclear bombs to wash your car, right?
00:41:14.000 I mean, you use it not for benign uses, unless maybe you want to terraform Mars, which some people...
00:41:27.000 I don't think technology is neutral.
00:41:30.000 It has a morality.
00:41:32.000 So what that means is that as long as we consistently develop net positive technologies...
00:41:40.000 Eventually, the world does get better and better.
00:41:44.000 If, say, a technology is 60% good and maybe has 40% downside, okay, but then we can invent a solution for that 40%, and maybe that's, again, net positive, and you kind of get this cascade of ever-improving world.
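The cascade he describes can be sketched as a toy model (the 40% carry-over and the five generations are illustrative assumptions, not real data): if each new technology cancels its predecessor's downside but itself carries a downside worth 40% of what it fixed, the residual downside shrinks geometrically.

```python
# Toy model of the "cascade of net-positive technologies" idea:
# each fix removes the previous downside but introduces a new downside
# equal to 40% of what it fixed, so the residue shrinks by 0.4 per step.
residual = 1.0  # normalized downside of the original technology
for generation in range(1, 6):
    residual *= 0.4
    print(f"generation {generation}: residual downside {residual:.4f}")
# the residual trends toward zero, so the net effect keeps improving
```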
00:42:02.000 No, I think what you're saying sounds beautiful.
00:42:04.000 And if more people thought the way you're thinking, I think the world would be a better place.
00:42:08.000 I like the positivity.
00:42:11.000 I like the optimism in what you're thinking, particularly in terms of what's possible with innovation.
00:42:18.000 Yeah.
00:42:19.000 Well, I just don't think that being against something is very productive.
00:42:28.000 It doesn't really move us forward.
00:42:32.000 Right.
00:42:34.000 Rather than, you know, protesting against the things that I don't agree with, and there's certainly things that I don't agree with, but again, I don't think it's very helpful.
00:42:44.000 Rather than doing that, I'd much rather build towards the future that I do agree with.
00:42:49.000 Listen, man, I think what you're saying is very, very logical.
00:42:53.000 I wish more people thought like you.
00:42:55.000 You're a great role model for a lot of kids to use your energy in a positive direction.
00:43:01.000 You know, it can be done.
00:43:05.000 Yeah, sure.
00:43:07.000 I mean, I think we agree with your turn.
00:43:09.000 Yeah.
00:43:10.000 Do you think that, I mean, when you're looking at this possibility of getting 50% of the ocean's garbage out in five years, do you think that's realistic in terms of, like, the resources that you have, the funding that you have and all that stuff?
00:43:25.000 And if not, how can people help?
00:43:27.000 Do you have a website where people can...
00:43:30.000 Yeah, so definitely, you know, we're not there on both yet; we don't have the technology ready yet to really clean up the patch.
00:43:40.000 On the river, it's a different story.
00:43:42.000 We're really ready to scale.
00:43:44.000 But on the ocean, we still need some, you know, need some iteration.
00:43:48.000 And, of course, the funding isn't there yet.
00:43:51.000 So, of course, on our website, theoceancleanup.com, people can donate.
00:43:54.000 People can also...
00:43:55.000 Okay, cool.
00:43:56.000 There it goes.
00:43:56.000 Support the cleanup.
00:43:57.000 Yeah.
00:43:59.000 Is it like PayPal or what is it?
00:44:02.000 Yeah, anything you want and you can join the queue for the products.
00:44:07.000 That's excellent.
00:44:09.000 Beautiful.
00:44:11.000 Are you getting a lot of success with that?
00:44:13.000 People are contributing?
00:44:16.000 So definitely so far has been enough to keep the development going.
00:44:22.000 So it hasn't really been the limiting factor.
00:44:26.000 But of course, if we want to scale, we're going to need a lot more resources.
00:44:30.000 So definitely a lot of help will be required there in the coming years.
00:44:34.000 It's amazing that this is taking you so long and that you've been working on it so hard that you have all this energy to be able to pursue something like this.
00:44:43.000 I mean, was there ever a time while you're doing this to be like, Jesus Christ, I don't know how long I'm going to be able to do this?
00:44:49.000 Is it sustainable?
00:44:50.000 Like this every day?
00:44:52.000 No days off?
00:44:52.000 Constant?
00:44:54.000 Yeah.
00:44:54.000 So probably I should take a few days off.
00:44:56.000 Hell yeah.
00:44:57.000 End of the year.
00:44:57.000 Yeah.
00:44:58.000 This year has been tough.
00:44:59.000 Do you feel guilty if you take time off?
00:45:01.000 Ah, yeah.
00:45:03.000 Yeah.
00:45:05.000 So, I usually feel, and that's probably the case for everyone, that a lot of my strengths are at the same time also my weaknesses.
00:45:17.000 So, I think I'm pretty creative, so it's good.
00:45:20.000 But at the same time, it means that, you know, I really have to force myself to not be distracted by new ideas.
00:45:29.000 I think I'm...
00:45:31.000 I have a good work ethic, but the downside is that it's also very hard to slow down.
00:45:38.000 And I do realize that taking breaks, eventually it is better.
00:45:47.000 The best ideas that I've had were during times off.
00:45:52.000 Even the ocean cleanup idea: I was 16, scuba diving in Greece, and saw more plastic than fish.
00:45:59.000 That was during a break.
00:46:01.000 So I should probably take a few days.
00:46:05.000 Yeah, man.
00:46:06.000 Just go somewhere where you could just take a few naps.
00:46:12.000 Just relax, recharge.
00:46:14.000 Get your system back online perfectly.
00:46:17.000 Yeah.
00:46:18.000 Though, I think one side note to make is that, you know, with everything that I've ever done in my life, I've always been very obsessed about it.
00:46:27.000 And I think when, you know, it's something that you cannot really stop thinking about it, it never really feels like work either.
00:46:38.000 So, it's… A calling.
00:46:42.000 Well, I just wouldn't be able to imagine just having a normal job doing something you're not passionate about.
00:46:53.000 So, you know, I never really...
00:46:55.000 How miserable would it be to just be in an office and have to stare at the clock waiting for 6 p.m.
00:47:03.000 until you can go home?
00:47:04.000 Right.
00:47:05.000 That must be...
00:47:06.000 That's, like, probably my biggest nightmare.
00:47:08.000 For a lot of people, that's their life, you know?
00:47:10.000 Well, yeah, I don't mean to offend anyone here, but...
00:47:14.000 Look at you, being nice again.
00:47:15.000 Covering your tracks.
00:47:17.000 But, yeah...
00:47:18.000 No, I agree with you.
00:47:20.000 I understand exactly what you're saying.
00:47:22.000 A lot of people don't realize that the biggest asset they have in their life is their time and to spend that wisely.
00:47:29.000 You have these 80,000 hours, which I believe is 40 years at 40 hours a week.
00:47:37.000 It turns out that's 80,000 hours.
00:47:39.000 That you can use for anything.
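The arithmetic behind the 80,000-hours figure, for the record (he only mentions 40 years and 40 hours a week; the 50 working weeks per year is my assumption, since that is the standard way the figure is derived):

```python
# Career time budget: the usual derivation of the "80,000 hours" figure.
years = 40            # working years, as mentioned
hours_per_week = 40   # as mentioned
weeks_per_year = 50   # assumed (not stated in the conversation)

career_hours = years * hours_per_week * weeks_per_year
print(career_hours)  # 80000
```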
00:47:42.000 And I do believe that people often have a lot more potential than what they turn out to be doing if they were to realize how valuable that time is.
00:47:58.000 And sort of the classic model also for...
00:48:02.000 More wealthy people is to work very hard and then to kind of donate here and there.
00:48:09.000 But probably you could be a lot more effective if you were to just use your brain, use your time directly on, you know, working something that matters.
00:48:20.000 Well, I think what you just said is legitimate inspiration talk.
00:48:25.000 You know, there's a lot of people that...
00:48:28.000 Should I write a book?
00:48:29.000 You should.
00:48:30.000 Well, I mean, just a video, I think, is good enough.
00:48:32.000 Just a video of you explaining your philosophy.
00:48:34.000 I mean, you have accomplished so much, and your idea that you're doing is so noble and actually effective.
00:48:41.000 There's something that people need to hear sometimes about...
00:48:44.000 Different people's philosophies on how to spend their time and their energy.
00:48:47.000 And your perception of instead of wasting in on other things, just concentrate on something that you think is going to make an impact, something that you're drawn to, something that...
00:48:56.000 And yeah, when you do that, then you have a cause.
00:48:59.000 Then you have a thing that you're working towards.
00:49:00.000 It's not just simply showing up and doing something that someone's paying you to do that you don't necessarily want to do, which is a trap.
00:49:07.000 A lot of people find themselves in that trap.
00:49:09.000 Yeah.
00:49:10.000 They need to hear people like you talk sometimes.
00:49:12.000 That's as inspirational as anyone who's like a professional, inspirational, or motivational speaker for a lot of folks.
00:49:20.000 Maybe more so because you're actually doing something.
00:49:22.000 Yeah.
00:49:23.000 Well, probably my words carry more credibility once the oceans are actually clean.
00:49:28.000 Yeah, it'll help.
00:49:29.000 But just the fact that you've got two cargo holds filled with two cargo containers.
00:49:34.000 I mean, who the fuck has that?
00:49:35.000 No one.
00:49:36.000 You know?
00:49:36.000 A couple of people have some bags.
00:49:38.000 Yeah.
00:49:39.000 And on the rivers, we have that amount every day now.
00:49:43.000 Really?
00:49:44.000 Yeah.
00:49:44.000 Every day on the rivers.
00:49:45.000 Wow.
00:49:46.000 When you see that horrible...
00:49:50.000 Pollution drifting downriver, when you see that stuff, does that feel almost like impossible to capture all of it or pointless because people keep throwing it in there?
00:50:00.000 Well, of course, that's the thing.
00:50:01.000 So we don't position these interceptors as being the ultimate solution for the whole plastic.
00:50:08.000 So, of course, eventually, you have to make sure plastic doesn't end up in the ocean or in the rivers in the first place, right?
00:50:15.000 But I was standing.
00:50:18.000 So a few weeks ago, I was in Indonesia and Malaysia to see the machines and talk with government people there.
00:50:24.000 And I was standing on the interceptor, and you see this constant, literally, torrent of plastic going into the interceptor, and I was looking upstream, and I realized, well, there's more than 5 million people living in the catchment area of this river, and they have limited infrastructure,
00:50:46.000 they consume so much, and just trying to imagine all that plastic not ending up in the river, with such a diffuse source, 5 million people,
00:51:01.000 was just so hard to imagine.
00:51:04.000 And of course, that's where we have to go to.
00:51:07.000 But realistically speaking, it's just going to take a while.
00:51:13.000 It's going to take maybe two decades, three decades, something like that.
00:51:18.000 So, I think rather than kind of staring at kind of the perfect solution and really just working on that, which of course is very important, I think we also need to be a bit more pragmatic and also realize,
00:51:35.000 well, okay, it may take 20-30 years, let's at least make sure that during those 20-30 years we don't have 10 million kilos of plastic flowing out of this river.
00:51:46.000 You'd have to have some sort of cooperation with the people that are doing that and chucking that plastic into the river.
00:51:52.000 Someone's got to figure out a way to get to them.
00:51:54.000 I do hope that the interceptors can have a positive influence upstream as well.
00:52:00.000 Or people say, I don't have to worry about it.
00:52:03.000 They've got a thing now.
00:52:04.000 It's the best place to throw your plastic because they've got a thing that scoops it out.
00:52:08.000 Guilt-free dumping.
00:52:11.000 That risk is called moral hazard.
00:52:18.000 It's a phrase from economics; in the insurance industry it's a known thing that people cause more damage once they're insured, because they're less worried.
00:52:30.000 I don't buy it that much as an argument for the plastic problem because it's not like it's a conscious cost-benefit analysis whether you're going to throw something on the street or not.
00:52:43.000 It's more of an unconscious thing.
00:52:45.000 You just do it, right?
00:52:47.000 Or at least I hope you don't, but some people do.
00:52:50.000 And then it's...
00:52:55.000 By that same logic, maybe municipalities should also stop sweeping the streets?
00:53:01.000 And maybe we shouldn't even collect garbage at people's houses because it only incentivizes the creation of garbage.
00:53:09.000 You don't believe that though, right?
00:53:10.000 No.
00:53:12.000 And then there's this other effect called the broken window effect, which I think it was back in the 60s in New York, what they found is that in streets and neighborhoods where you have buildings that show obvious sign of decay,
00:53:27.000 like broken windows or litter, that actually would incentivize other unlawful activities.
00:53:37.000 The similar effect has been observed with a park.
00:53:41.000 If a park doesn't have any litter, people litter less than when there is litter on the ground.
00:53:45.000 So rather, I think, it's the opposite.
00:53:47.000 If you truly believe that the ocean is going to be polluted forever and it doesn't really matter, it's already dirty, that's not really a strong motivation to not litter.
00:54:01.000 But if you say, okay, well, the ocean is clean now, there was a lot of effort for that.
00:54:07.000 And once it's clean, I think that would actually be a motivation to not litter.
00:54:14.000 I think you're 100% right.
00:54:15.000 I think some of those videos where you see shorelines that are so thick, I don't know what part of the world it is, but that are so thick with plastic you can't even get into the water.
00:54:25.000 You can't wade out there and swim.
00:54:27.000 Yes.
00:54:28.000 It's so disheartening and you wonder where the world will be if not for people like you that are trying to come up with a solution.
00:54:35.000 Where the world will be in 50 years.
00:54:37.000 50 years ago this wasn't the case.
00:54:40.000 Now it is.
00:54:41.000 If you could see like a time-lapse video of these oceans from like, go back to like 1900 to 2019, and then go back before 1900, it was relatively unchanged for thousands of years, right?
00:54:53.000 And then all of a sudden this massive change very quickly with the Industrial Age.
00:54:58.000 Yeah.
00:54:58.000 But again, I believe this is a transition phase.
00:55:01.000 It's like our modern civilization being in its sort of teenage years and we kind of have to grow out of it.
00:55:12.000 I couldn't agree more.
00:55:14.000 I hope this worry that we all have will translate into improvement and progress.
00:55:22.000 And I always say the same thing, that we're in some sort of adolescent stage
00:55:26.000 of society and evolution, that we're in this weird sort of state where we're aware of how much we can change our environment, but also still contributing to the detriment of our environment in a non-sustainable way, and then eventually it's going to have to come to a head.
00:55:40.000 You know, when you see people screaming about climate change and all these different things, I mean, this is people realizing that there's a lot going on that maybe not everyone is completely and totally aware of, but I'm with you.
00:55:53.000 I think it's...
00:55:55.000 It's good to be optimistic.
00:55:57.000 It's healthier to be positive.
00:55:58.000 And I think it's logical that people will find a way out of this.
00:56:03.000 I really do.
00:56:04.000 Yeah.
00:56:04.000 Well, I'm not sure whether we'll be right.
00:56:10.000 I'm not sure either.
00:56:11.000 I hope we are.
00:56:14.000 I think eventually it's going to be fine.
00:56:17.000 It's just a question of how long is it going to take, how much damage will have been done in that period.
00:56:23.000 But realistically, it's the only way there is a chance that we figure this out, right?
00:56:29.000 Have you faced any opposition to this?
00:56:31.000 Is there anybody that thinks this is a fruitless idea?
00:56:33.000 Because I know there were people that were actually, I was very shocked.
00:56:36.000 I read people that were actually happy that your project didn't work the first time.
00:56:40.000 I'm like, what the fuck, man?
00:56:41.000 There's people that...
00:56:42.000 I think it's a young thing.
00:56:46.000 Like, because you're this really hopeful, young, intelligent guy who comes up with a solution.
00:56:52.000 I think it probably goes, ah, does he think he's so fucking smart?
00:56:54.000 Ah, fuck him, I hope it fails.
00:56:56.000 And when it failed, people actually enjoyed it.
00:56:58.000 Yeah.
00:56:59.000 Was that...
00:57:00.000 Did that hurt?
00:57:02.000 No.
00:57:03.000 So for me, yes, of course, really since the beginning of 2013, there have been people, a relatively small group of people, but there have been people that have been opposing it.
00:57:17.000 And most of them, ironically enough, are people that care about the ocean because they don't feel it's the right way to tackle the issue.
00:57:27.000 But the way I deal with it is, at least what I used to do in the beginning, now unfortunately there aren't many new arguments anymore, but just basically write them out, every single argument, rationally analyze them, no emotions, because emotions only muddle your thinking in that way.
00:57:46.000 And make a distinction, okay, is this something where this person has a point?
00:57:50.000 If so, great, because I'd rather have somebody else pointing it out to me than us having to learn it in the field and having an unnecessary failure.
00:58:00.000 And if the person doesn't have a point and if it's just an assumption or unfounded or whatever, then it's very easy for me to just ignore it. And then the question is, well, what motivates people to be negative?
00:58:14.000 And I think there's probably four reasons.
00:58:16.000 First of all, it's genuine skepticism whether it can be done.
00:58:21.000 And I think that's healthy.
00:58:23.000 And I think we've proved most of those arguments wrong now.
00:58:27.000 But, of course, there's still the whole scale-up thing, which we still have to do.
00:58:31.000 So there's still a bit of that, but it's kind of morphing now to a few other things.
00:58:36.000 I think one thing is...
00:58:40.000 Human risk perception, which sometimes I think is a cause of some opposition where it's very easy for people to ignore the baseline when they look at risks.
00:58:57.000 So, you know, you can, for example, say, okay, nuclear power, super risky, we shouldn't do that.
00:59:05.000 But then if you compare it to the baseline of other sources of energy, that's actually probably the least risky source of energy there is.
00:59:15.000 Even solar energy causes more deaths per megawatt hour than nuclear power because people fall off roofs.
00:59:26.000 So if you ignore the baseline and if you say, okay, doing this cleanup, we shouldn't do it because there's all these potential risks, right?
00:59:38.000 Potentially, there's some sea life that may be caught.
00:59:41.000 Potentially, there are these moral hazards.
00:59:45.000 There's all these risks.
00:59:46.000 And basically, the best thing to do is not do it.
00:59:50.000 What people then are ignoring is sort of the certain hazard of this hundreds of millions of kilos of plastic that's already in the ocean.
00:59:59.000 And if you were to kind of pose the opposite question and say, okay, so what if I were to go to the ocean right now and just dump in the equivalent amount of plastic that we would take out?
01:00:13.000 Would you think that's a good plan?
01:00:15.000 And then, well, probably the answer is no.
01:00:18.000 So I think there's a bit of this, you know, of course what we're doing, it's new, there are risks involved, but as long as we map them well, we take things step by step, I think they're manageable.
01:00:33.000 And there are definitely reasons to do it because, of course, the baseline is that there is already a lot of harm being done by the status quo.
01:00:43.000 So I think that's one argument behind people's opposition.
01:00:48.000 There's also a bit of what I call zero-sum game bias where people are saying, well, you shouldn't do this because the resources would be better spent elsewhere.
01:00:57.000 I saw an op-ed in Wired a few weeks ago, written by just one person actually, where this person said: You shouldn't worry about the plastic pollution issue.
01:01:11.000 You shouldn't do anything about it because climate change is the biggest issue and all our attention should go there.
01:01:17.000 Plastic pollution is just a distraction.
01:01:21.000 That's foolish.
01:01:22.000 Well, yeah, I think, you know, there's seven and a half billion people in the world, and we can do more than one thing at the same time, I think.
01:01:29.000 Yeah, I mean, should you not wash your dishes because your carpet is dirty?
01:01:33.000 I mean, it doesn't make any sense.
01:01:34.000 Both of them are a problem.
01:01:35.000 Clean both of them.
01:01:36.000 This idea that you should only think about climate change.
01:01:39.000 It's like, oh, don't think about the Great Pacific Garbage Patch that's twice as big as Texas?
01:01:43.000 Are you fucking serious?
01:01:44.000 It's a dumb argument.
01:01:46.000 Both of them are important.
01:01:47.000 Thinking about both of them is important, but...
01:01:49.000 A part of writing an article today is writing something that people will get upset about.
01:01:54.000 Part of it is generating outrage, clickbait stuff, having controversial opinions, being contrarian.
01:01:59.000 All those things are profitable today.
01:02:01.000 It's a giant part of why people write articles.
01:02:04.000 They don't write articles to state an objective, well-thought-out perspective always.
01:02:10.000 Sometimes people do, but a lot of times people make some click-baity bullshit and they kind of twist a story and twist an idea of who you are, twist it to sort of make their narrative be more compelling and sell more or click more and get more ad sales.
01:02:28.000 Yeah.
01:02:29.000 I wonder whether that's in part behind the growing tribalism and polarization that you see everywhere.
01:02:36.000 Social media.
01:02:37.000 I mean, the fact that Facebook's algorithms, in a sense, support outrage, right?
01:02:43.000 Like, these things are designed to support...
01:02:46.000 My friend Ari Shaffir tested this, and it's really interesting because he tested it to find out what it actually supports.
01:02:51.000 What it actually supports is what you're interested in.
01:02:53.000 And if you're interested in being outraged, it'll show you things that outrage you.
01:02:57.000 So he decided to just only YouTube puppies.
01:03:01.000 And that's all YouTube would show him.
01:03:02.000 It's puppies.
01:03:03.000 He's like, no, you assholes.
01:03:04.000 This is what you're into.
01:03:06.000 If you're into fucking getting mad about the border and getting mad about the climate and getting mad about abortion and getting mad about whatever the fuck it is, that's what it'll show you because that's what you're interested in.
01:03:16.000 You know, my YouTube feed is mostly muscle cars and fights.
01:03:21.000 Why?
01:03:22.000 Because that's what I'm interested in.
01:03:24.000 And occasionally science things.
01:03:26.000 But that's just because that's what you search for.
01:03:29.000 It'll show you what you search for.
01:03:31.000 I'm sure you're somewhat happy that it shows you those things.
01:03:34.000 Yes.
01:03:35.000 Sure.
01:03:35.000 I don't think it's a sinister thing.
01:03:38.000 As people want to say it is.
01:03:40.000 I think the issue is, human nature, we are compelled to get upset about things, and I think a lot of it is people that feel disempowered in their own existence.
01:03:49.000 The people that you were talking about that are stuck in cubicles and that are staring at that clock, waiting for the buzzer to ring so they can go home.
01:03:56.000 Those people are online.
01:03:57.000 They're tweeting, they're taking a shit and tweeting, fuck this guy, this little kid thinks he's going to fix this fucking shit.
01:04:03.000 There's a lot of what's going on.
01:04:04.000 There's a lot of people that are upset.
01:04:07.000 It's fun to be upset when your life sucks.
01:04:09.000 It's fun to shit on somebody.
01:04:11.000 It's fun to get mad about the border.
01:04:15.000 You're living in fucking South Dakota.
01:04:16.000 You're nowhere near the border.
01:04:17.000 What are you worried about?
01:04:19.000 You're not worried about it, you're just angry.
01:04:20.000 People are just angry.
01:04:21.000 These aren't logical discussions that people are having.
01:04:25.000 They're shout-offs.
01:04:28.000 It's a natural part of human nature to get upset about stuff.
01:04:32.000 Even someone who's doing something as beautiful as your perspective or your idea, instead of just saying, this guy is doing something amazing, we need someone like this who's just as innovative and just as inspired to try to tackle this climate issue.
01:04:48.000 We need more people like him.
01:04:50.000 This is amazing.
01:04:51.000 Instead of that, you're spending your resources incorrectly.
01:04:54.000 Yeah, yeah.
01:04:56.000 Well, yeah.
01:04:56.000 He got you, though.
01:04:57.000 You're talking about it.
01:04:58.000 Yeah.
01:04:59.000 Sure.
01:05:01.000 And, yeah, I suppose it's, you know, from the perspective of the person who writes that, it's...
01:05:07.000 He thinks he's got a point.
01:05:08.000 Yeah.
01:05:09.000 And, indeed, if you're just saying what everyone else says, nobody would see your opinion.
01:05:17.000 Exactly.
01:05:18.000 That's a big part of it.
01:05:19.000 Well, journalists are fucked right now.
01:05:22.000 And it's not their fault.
01:05:23.000 It's just print journalism is almost on the way out in terms of buying things, buying newspapers and buying magazines.
01:05:31.000 Their numbers are radically down.
01:05:33.000 So they resort to online things.
01:05:37.000 Well, in the online world, you have so much competition.
01:05:41.000 You have competition from a million different things that people can choose to look at or read.
01:05:46.000 And to get them to read a fucking article, you've got to have something good in there.
01:05:50.000 So you have to distort.
01:05:51.000 You have to inflame.
01:05:54.000 You have to get people polarized.
01:05:56.000 You've got to get them upset.
01:05:57.000 You've got to paint a picture that makes you want to click on it.
01:06:01.000 Like, what is he doing?
01:06:02.000 That fucking idiot's wasting his time trying to pull – doesn't he know what Greta Thunberg has been saying?
01:06:07.000 How dare you!
01:06:09.000 And that's what's going on, man.
01:06:11.000 But it's just a fun, weird time for humans.
01:06:16.000 There's a lot of negative things, but there's also a lot of positive things.
01:06:20.000 It's a fun, weird time.
01:06:23.000 There's a lot going on, and it's happening very, very, very quickly.
01:06:27.000 And the prognosticators, the people that are trying to...
01:06:30.000 Have some sort of an idea of where this is all going?
01:06:33.000 No one really knows.
01:06:35.000 And change is happening at such a rapid pace that it scares everybody.
01:06:39.000 So they're looking to define things and they're looking for control and they're looking to be the person who's got it figured out.
01:06:45.000 Because nobody's got it figured out.
01:06:47.000 It's madness.
01:06:48.000 The earth is heating.
01:06:49.000 The fucking ice caps are melting.
01:06:50.000 The fish are disappearing.
01:06:52.000 People are eating dolphins.
01:06:53.000 It's madness.
01:06:54.000 It's madness out there.
01:06:56.000 The fucking garbage patch is growing and growing and growing.
01:06:58.000 And if it wasn't for someone like you...
01:07:00.000 Who's actually acting and doing something about it.
01:07:03.000 It would just get worse.
01:07:05.000 You have a workable solution.
01:07:06.000 You should be applauded.
01:07:09.000 Yeah.
01:07:09.000 Fuck that guy.
01:07:10.000 You heard me.
01:07:11.000 Right, Jamie?
01:07:12.000 I don't think it was specifically...
01:07:14.000 Fuck that guy.
01:07:14.000 Fuck him hard.
01:07:15.000 Fuck him hard, right?
01:07:16.000 Yeah.
01:07:16.000 Jamie agrees.
01:07:17.000 Well, consensus.
01:07:19.000 Yes.
01:07:19.000 Consensus is fuck that guy.
01:07:21.000 Yeah.
01:07:21.000 But I get...
01:07:22.000 Consensus doesn't create clicks, so maybe we need some...
01:07:24.000 That guy is trying to make a living.
01:07:26.000 Or she, or them, or they.
01:07:28.000 Yeah.
01:07:28.000 They're trying to make a living.
01:07:29.000 You know?
01:07:29.000 I mean, I understand.
01:07:30.000 It's like, in this day and age, you have to...
01:07:33.000 Things you have to write about.
01:07:35.000 So they write about things, and it might not necessarily be...
01:07:39.000 So how do you incentivize the truth?
01:07:44.000 Again, I think we're in this transitionary phase.
01:07:47.000 And I also think technology is going to make a lot of what we're concentrating on obsolete.
01:07:56.000 I think we are really, really close to some crazy breakthroughs in terms of distribution of information that's going to make it obsolete.
01:08:04.000 And people aren't going to care as much about clickbaity things because, you know, you're going to be able to feel things from digitally created media.
01:08:13.000 I think we're very, very close to augmented reality becoming an essential part of people's lives.
01:08:19.000 You know, the same way your phone has become an essential part of your life.
01:08:22.000 Twenty years ago, no one carried a phone around.
01:08:24.000 It was very rare.
01:08:25.000 And, you know, 1999, I mean, a small percentage of people had phones on them.
01:08:31.000 Now it's 100%, right?
01:08:34.000 All this stuff is happening at this exponentially increasing rate.
01:08:38.000 When they implement augmented reality, and who was telling us that Apple's somewhere around 2021...
01:08:44.000 Man, I've been looking that up.
01:08:46.000 I mentioned it once or twice.
01:08:47.000 You definitely did.
01:08:48.000 Might have been you.
01:08:49.000 But some other folks have brought it up, too, that Apple's really close.
01:08:54.000 And they're in the process right now of developing some sort of augmented reality goggles.
01:08:59.000 And they'll be like glasses.
01:09:01.000 You put on a pair of...
01:09:03.000 Just like this.
01:09:05.000 But you'll be seeing all these things in front of you.
01:09:08.000 You'll be able to move them around.
01:09:10.000 You'll be able to see navigation.
01:09:12.000 You'll be able to turn it on and off.
01:09:14.000 It'll probably work on Siri.
01:09:15.000 You'll be able to talk to it.
01:09:17.000 And you're going to be able to get video and information written, podcasts, all these things.
01:09:21.000 Music.
01:09:22.000 It's going to come through this.
01:09:24.000 And probably this is one step in this ever-increasing trend of us getting further and further immersed in technology.
01:09:34.000 And augmented reality will lead to some sort of virtual reality where it's indistinguishable from regular reality.
01:09:43.000 We're like...
01:09:44.000 We're like 50 years away from literally being in the Matrix.
01:09:48.000 Yeah.
01:09:49.000 Yeah, so I think it's underappreciated how much our behavior is also guided by technology.
01:09:57.000 I mean, of course, we have our genes, our genotype, which kind of lies at the most fundamental level of how our behaviors are formed.
01:10:06.000 That's why there is such a thing as human nature.
01:10:10.000 But then there is this whole sort of cultural layer that we humans created around us, Which I call the technosphere.
01:10:20.000 Maybe other people have different names for it.
01:10:22.000 But it's indeed everything, right?
01:10:25.000 We interact with something like 30,000 inventions or 30,000 technologies through our entire lives.
01:10:32.000 That's a huge amount.
01:10:34.000 And I think that environment that shapes your behavior, it decides what kind of genes are expressed. And the interesting thing is that it's not just a natural environment,
01:10:49.000 but it's an environment we create.
01:10:50.000 So probably, you know, when you think about people being born thousands of years ago, their genes were very, very similar to the people today.
01:11:02.000 Yet...
01:11:03.000 How they behave is completely different.
01:11:06.000 Look again at violence.
01:11:07.000 And why is that the case?
01:11:10.000 It's thanks to these inventions, not just physical inventions, but also cultural inventions and institutions that we created that shapes our behavior.
01:11:23.000 And probably...
01:11:28.000 Human behavior is very hard to change unless it actually benefits what we do.
01:11:36.000 Look at smartphones, how fast that happened versus how long it takes for smoking to go away.
01:11:46.000 One is incentivizing the continued use of it through addictive products, while with smartphones, again, it's something that you want to use.
01:12:00.000 So I just wonder whether that interaction between humans and the technology that we create incentivizes inventors to become morally better and better, because... Did I lose you already?
01:12:17.000 No, no, no.
01:12:18.000 So the question is...
01:12:23.000 Well, people are incentivized primarily by profit, right?
01:12:25.000 Right.
01:12:26.000 But the behavior that people express is kind of shaped by the world they live in.
01:12:35.000 And who knows?
01:12:37.000 Maybe a person today is...
01:12:40.000 more incentivized to do good things because of the environment that has been created than a thousand years ago.
01:12:47.000 No, I think that's absolutely the case.
01:12:50.000 And I hope that people's ability to express themselves through social media, although it's often negative and bitchy, sometimes also can give you a sense of the moral landscape of the culture.
01:13:01.000 Like, not just the people on the far fringes that are the most angry and vehement about things, but people that have objective...
01:13:10.000 Real rational thoughts like the fact that you were able to read that article and objectively assess whether or not someone has any good points or not.
01:13:20.000 If we could all do that about everything, you know, if people had that sort of perspective instead of being so reactionary, instead of being so angry about things, just look at criticisms, look at possibilities, look at all these different things and then shape technology to fit within our ethical and moral boundaries.
01:13:39.000 So there's – and also it's very profitable, right?
01:13:43.000 Because if things don't feel – if you don't have like a guilty feeling about buying something – like every time I get a plastic straw now, I feel guilty, right?
01:13:51.000 If there was something – That people, they innovate to the point where you don't feel guilty supporting products and you feel like this company has the same sort of ethics and ideas that you have.
01:14:01.000 That's all good.
01:14:02.000 And I think we're moving more towards that.
01:14:04.000 But again, we're dealing with a very short window of time where human beings have had to adapt to this incredible amount of change that takes place during a small period of time.
01:14:14.000 Yeah, it's kind of the...
01:14:16.000 One way to look at problems is that it's kind of this chasm between human nature, human behavior, and how we want the world to be.
01:14:26.000 And indeed, social media, that's the case.
01:14:29.000 But similarly for environmental problems.
01:14:32.000 We humans are driven by certain things.
01:14:36.000 Self-interest is definitely...
01:14:38.000 A big part of it.
01:14:40.000 And yet, that's not creating the world right now that we want to live in because the technosphere, the technology that is interfaced between the world, sort of nature and human nature,
01:14:56.000 that interface.
01:14:57.000 It's not compatible with both.
01:15:00.000 So you either have something that's compatible with human nature, so it's like a big car with a V8 engine, but that's not compatible with nature.
01:15:11.000 Or you have something that's compatible with nature, which is probably walking, but it's not really compatible with human nature because we're lazy and greedy.
01:15:20.000 It's cold outside and you've got to get somewhere in a snowstorm.
01:15:23.000 Exactly.
01:15:23.000 So ideally what we do is, rather than trying to change humans, which I think is a rather futile activity because there is such a thing as human nature.
01:15:33.000 We have genes.
01:15:34.000 We have this evolutionary history.
01:15:36.000 Rather than trying to change that, I think it's much more effective to change the technology around us
01:15:42.000 so that it enables our inner desires and behaviors to be positive rather than negative.
01:15:51.000 I agree with you.
01:15:52.000 I think it's going to be difficult, though, to get that same sort of positive...
01:16:00.000 Result when it comes to our addiction to technology, our addiction to smartphones in particular.
01:16:06.000 I mean, for a long time it was like televisions, right?
01:16:09.000 Like people talked about how much kids watch TV. Kids watch TV eight hours a day.
01:16:13.000 It's so much.
01:16:14.000 It's so bad.
01:16:37.000 Yeah, and I suppose that's, again, this sort of infantile stage of that technology, I suppose.
01:16:45.000 Now we're infantile.
01:16:46.000 It was adolescent before, now you're dropping it down.
01:16:50.000 I think you're right.
01:16:54.000 Probably we can engineer social media and our information technology to incentivize people to do good things, but indeed now it's probably incentivizing the use of scrolling through timelines because you watch more ads.
01:17:12.000 Also, I think it's our bodies and our minds and the way we view the world.
01:17:19.000 We're not designed to live in this digital realm.
01:17:22.000 This is a completely new thing for the species, and I think we don't really know how to handle the dopamine rush that we get.
01:17:29.000 From clicking on Instagram and scrolling through your feeds and checking your DMs and reading your emails and constantly interacting with people and checking, did he text me back?
01:17:38.000 Oh, what did he say?
01:17:39.000 Oh, well, that's interesting.
01:17:40.000 What about this and that and this and that?
01:17:42.000 You're just all day, all day interacting with some digital device.
01:17:45.000 We're not made for this.
01:17:46.000 We're supposed to go outside.
01:17:48.000 And then you have very bright engineers somewhere in a big shiny building, A-B testing all day to see whether a red dot on a certain icon in the social media app makes people click more or less.
01:18:04.000 Yes.
01:18:06.000 Well, Instagram's kind of dabbling with this idea of taking away the likes.
01:18:11.000 Right.
01:18:12.000 Like, what if we just didn't show anybody the likes?
01:18:14.000 Right.
01:18:15.000 You don't know how many likes you get.
01:18:16.000 You put up a picture, it's just a fucking picture.
01:18:18.000 Move on.
01:18:19.000 Yeah.
01:18:20.000 No.
01:18:20.000 You put up a picture.
01:18:21.000 He got 70,000 likes for that picture?
01:18:25.000 Right.
01:18:25.000 What the fuck, man?
01:18:27.000 You check it, check it an hour later.
01:18:28.000 74,000.
01:18:29.000 Right.
01:18:30.000 Ooh, it's going viral.
01:18:32.000 Right.
01:18:32.000 Yes.
01:18:32.000 That's weird.
01:18:33.000 That likes thing is one of the weirdest drugs.
01:18:36.000 Nobody saw it coming, and people get addicted to saying things that get likes, right?
01:18:43.000 Putting things up that are socially conscious to let everybody know how virtuous you are.
01:18:51.000 Give me some likes.
01:18:52.000 Give me some likes.
01:18:53.000 Yeah.
01:18:54.000 And it's all making use of, I suppose, the flaws of our human nature.
01:18:59.000 What I'm worried is that one day those likes will actually be a physical feeling.
01:19:03.000 Oh yeah, you get like a...
01:19:04.000 A little jolt, a little love jolt.
01:19:08.000 And they'll engineer the system to get you to seek those constant love jolts.
01:19:13.000 Yes.
01:19:14.000 Why not?
01:19:15.000 Look, if they're going to give you augmented reality, we are how many generations?
01:19:20.000 I don't know.
01:19:20.000 Away from something being embedded in your body.
01:19:23.000 People have already decided to do that.
01:19:25.000 There's some...
01:19:26.000 Was it a guy or a girl embedded a fucking Tesla Model 3 key in their arm so that they didn't ever have to have their key in their pocket?
01:19:36.000 They could just walk up to their Tesla and the fucking door unlocks.
01:19:41.000 Imagine software updates, key doesn't work anymore.
01:19:45.000 But it runs out of batteries.
01:19:46.000 They've got to cut you open like a fish.
01:19:49.000 I mean, what the fuck are people doing?
01:19:51.000 Those are people at the fringes.
01:19:53.000 They are at the fringes.
01:19:54.000 But there's more of them than you think, and if they make it more...
01:19:57.000 If they make it simple, like you just need a flu shot, bang!
01:20:01.000 I wonder whether there's any innate fear or aversion towards crossing that...
01:20:13.000 Interior-exterior boundary with technology.
01:20:15.000 Fair is a good way of looking at it, right?
01:20:18.000 Like, what is fair?
01:20:19.000 Is it fair if you agree to do it?
01:20:20.000 Like, look, is it fair if you decide to get a face tattoo?
01:20:24.000 Right?
01:20:25.000 It's up to you, man.
01:20:27.000 If it's fair, it's like, hey, man, my credit card company told me they'd give me 10% off if I stick this, you know, this credit card chip under my skin somewhere.
01:20:36.000 Yes.
01:20:38.000 I suppose if you, again, incentivize it with selfish interests, maybe it will take off.
01:20:45.000 There's that and there's also the big concern is what if these – I mean we're talking about income inequality in this world.
01:20:54.000 A big one would be, what if there's a jump that you can make in enlightenment, in intelligence, access to information, number crunching, the ability to assess risk versus reward.
01:21:07.000 This is all done computer-wise and it's done through some sort of additional piece of hardware.
01:21:14.000 That they give you or put in your body, but it costs a lot of money.
01:21:18.000 So the people that can afford it initially are the people that have money in the first place.
01:21:22.000 They're the wealthy people already.
01:21:24.000 Because it's very valuable.
01:21:25.000 But then the people that really need it, they can't afford it.
01:21:28.000 So by the time it becomes something, all the money's gone.
01:21:32.000 Everybody's chewed it all up.
01:21:33.000 Everybody's figured out how to hack the system.
01:21:35.000 You should become a writer for Black Mirror.
01:21:37.000 That seems like a Black Mirror episode.
01:21:39.000 It seems like it would work, right?
01:21:41.000 Yeah.
01:21:42.000 Yeah, that's...
01:21:43.000 Well, that's what people are worried about when it comes to longevity too, right?
01:21:46.000 They're worried about technological innovations, you know, nanobots and all sorts of different weird things that are going to repair cells and allow people to live for extended periods of time.
01:21:57.000 But then who are these people going to be?
01:21:59.000 Are they going to be the king class?
01:22:01.000 You know, are they going to be these super duper wealthy people of the future that are going to, you know, hold this over the poor folks who can't afford the technology?
01:22:10.000 Yeah.
01:22:11.000 Yeah, so it truly seems like the technologies that we're developing, or at least are not too far away, our institutions aren't ready yet to really cope with those.
01:22:27.000 No.
01:22:28.000 Because definitely that would probably increase inequality quite a lot.
01:22:33.000 Yes, that is one of the major concerns when it comes to this sort of rapid change that we're facing right now.
01:22:40.000 You know, another one, of course, is artificial intelligence.
01:22:43.000 There's people that I respect very, very much that have a very negative view of what the future of artificial intelligence is going to mean to the human race.
01:22:54.000 Sam Harris.
01:22:55.000 Elon.
01:22:56.000 Elon.
01:22:57.000 Yeah, both of them scare the shit out of me every time I talk about it.
01:23:01.000 Sam and I did an episode where he talked about artificial intelligence and the rise of it and the fact that once it's uncorked, it's really not going to be able to be put back in the bottle.
01:23:14.000 We talked about it for like an hour and a half.
01:23:17.000 After it was over, the rest of the day I was bummed out.
01:23:20.000 I was like, this is inevitable.
01:23:22.000 Yes.
01:23:26.000 I suppose a very optimistic and pessimistic view of technology at the same time.
01:23:31.000 I think on one hand it allows us to improve the world and that's what we've seen and it's gradual and it continues probably because people want to solve their own problems and with that inadvertently solve other people's problems.
01:23:47.000 That's how progress happens I believe.
01:23:51.000 But then at the same time, while the world is getting a lot better, it's also getting riskier.
01:23:56.000 I mean, 2,000 years ago, or even 200 years ago, there was no way to wipe out humanity.
01:24:03.000 There simply wasn't.
01:24:04.000 Even if you wanted it to happen very badly, you know, you could scream, it wouldn't happen.
01:24:10.000 Now, though, there are actually people who have the power to do that.
01:24:15.000 And rapidly.
01:24:17.000 The whole of humanity could be wiped out in a day.
01:24:20.000 Yeah, and now it's fortunately just a few people.
01:24:23.000 But imagine if that goes from a few people to quite a few corporations to maybe even everyone.
01:24:31.000 I think there's this sort of brain teaser or mental experiment that Nick Bostrom came up with that says, well, what if you could have kind of this atomic bomb that you could just make yourself in your microwave?
01:24:48.000 It's like, well, maybe at some point in time it would just not be economically feasible anymore to rebuild cities because it would just… So I don't know.
01:24:59.000 On one hand, I think that's kind of the scary, risky aspect of it.
01:25:08.000 At the same time, when you think of it, I would much rather trust or entrust an average person today with the button for a nuclear detonation device than somebody a thousand years ago.
01:25:24.000 Oh, for sure.
01:25:24.000 One of the Mongols or someone.
01:25:26.000 Some savage.
01:25:28.000 Let's do it.
01:25:29.000 Nick Bostrom freaked me out, man.
01:25:31.000 We had a conversation about probability of life being a simulation.
01:25:37.000 That's a very high probability.
01:25:39.000 It's more probable that we're in a simulation now than we're not.
01:25:43.000 And my puny monkey brain...
01:25:44.000 According to some logic, yeah.
01:25:46.000 Yes, that's where it gets weird when you deal with the...
01:25:50.000 The number of potential civilizations out there, the number of human beings, the amount of time that life has had a chance to evolve, not just here, but everywhere in the entire universe... the possibility that a simulation has occurred already is very high.
01:26:06.000 And the possibility that we're in a simulation right now is also pretty high.
01:26:10.000 Simply because there's only one base layer of simulation, so, yeah.
01:26:15.000 Also, life seems fake.
01:26:18.000 Right?
01:26:19.000 It seems weird.
01:26:21.000 There's so much of it that seems like, boy, this is...
01:26:23.000 A few weeks ago, I saw a duck in a pond just making infinite circles.
01:26:28.000 That's definitely a glitch.
01:26:30.000 A glitch.
01:26:30.000 Yeah.
01:26:32.000 Yeah, someone needs to fix the code.
01:26:34.000 Yeah.
01:26:35.000 Debug the shit.
01:26:37.000 Yeah.
01:26:38.000 Yeah.
01:26:38.000 So, of course, it's lots of fun to think about those things, but, you know, relatively depressing things that we'll likely never know.
01:26:48.000 Well, maybe.
01:26:49.000 Maybe one time somebody will figure out the solution and...
01:26:52.000 This dystopian view of the future, it's, I mean, I get the perspective.
01:26:57.000 I get the dystopian perspective.
01:26:59.000 But right now, as we said, like, you know, according to Pinker, according to statistics, things are really better than they've been before.
01:27:06.000 And my concern is that, my concern is one of the things that Elon said, we're the biological bootloader for artificial life.
01:27:15.000 Right.
01:27:16.000 Look, when a caterpillar makes a cocoon, it doesn't know what the fuck it's doing.
01:27:21.000 It just does it.
01:27:22.000 It just makes a cocoon and becomes a butterfly.
01:27:25.000 We're buying the iPhone 36 and the Cybertrucks and trying to get a...
01:27:31.000 Solar-powered plane off the ground.
01:27:34.000 We're probably giving in to this thing.
01:27:38.000 Look, what we have right now is more than sufficient for survival.
01:27:42.000 If we had just decided, if we got all the people in the world to say, hey, watches, we make watches that keep perfect time.
01:27:49.000 Computers, they get online.
01:27:50.000 It's great.
01:27:51.000 You can download YouTube videos.
01:27:53.000 Cameras, they're very clear.
01:27:54.000 They take very clear pictures.
01:27:56.000 TVs look great.
01:27:58.000 Everything looks great.
01:27:59.000 Internet speeds, pretty fucking good, man.
01:28:02.000 Especially with 5G. Let's stop!
01:28:04.000 Everybody stop!
01:28:06.000 Stop.
01:28:07.000 Stop making stuff.
01:28:08.000 Everything we have right now, just keep making it.
01:28:12.000 No new innovation.
01:28:13.000 Let's just enjoy life together.
01:28:16.000 That sounds so logical but yet also so ridiculous.
01:28:21.000 No one's going to agree to that.
01:28:24.000 That iPhone 37 is already in production, bitch.
01:28:27.000 It's going to be better and faster and it's going to wrap around your dick and keep you comfortable at night.
01:28:31.000 They're going to figure out better stuff no matter what forever.
01:28:35.000 It's part of what makes people people.
01:28:37.000 We have this unquenchable thirst for innovation.
01:28:40.000 That's one of the weird things that freaks me out about this move towards technology: materialism, which seems to be this really standard behavior with a giant percentage of the population. People are really into things, and this desire to have the newest, greatest things is what propels innovation, because there's a financial incentive, because people are making money off of selling you these better watches that you don't really need, or these better cars, or these better computers
01:29:11.000 and all these things just keep getting better and better and better and better, and a lot of it is fueled by this weird desire that people have for stuff, which doesn't make any sense.
01:29:20.000 Like, where'd that come from?
01:29:21.000 Well, that might be the stuff that makes the caterpillar make the cocoon.
01:29:26.000 Yeah, I mean, it's probably just making use of the same biological mechanisms as social media, right?
01:29:32.000 You feel like you need it.
01:29:34.000 Yep, tricks, little tricks.
01:29:36.000 Yeah, and next thing you know, they go, listen, we have two options.
01:29:40.000 Either we let artificial life take over and be the superior life form, or we merge.
01:29:46.000 Let's just make friends.
01:29:47.000 Let's just make friends with artificial intelligence.
01:29:50.000 Take this little chip on, boy.
01:29:52.000 Yeah, well, so probably if it's possible, and it likely is, it's probably pretty inevitable that it's going to happen.
01:30:02.000 I think it was Edison that said that...
01:30:07.000 I never invented anything.
01:30:09.000 I just took elements of what was there.
01:30:11.000 Probably the quote is a lot better than I'm paraphrasing now.
01:30:15.000 But it's kind of, if it's possible, it's kind of there.
01:30:22.000 It's in the air.
01:30:23.000 It just needs to be invented.
01:30:24.000 Yes.
01:30:24.000 You just have to discover it.
01:30:26.000 Marshall.
01:30:28.000 So probably that's going to happen, but what does give me hope is that to my point of the nuclear detonator a thousand years ago versus now, it seems like we are getting more responsible and our ability to foresee the future allows us to invent things,
01:30:48.000 but it also allows us to think about the risks and to try and mitigate the risks before they happen.
01:30:57.000 I don't think there's nearly enough attention given to these existential risks.
01:31:03.000 But the fact that some people are thinking about it is kind of hopeful.
01:31:11.000 No, I agree.
01:31:12.000 I really do.
01:31:13.000 And I'm posing these things about this dystopian potential future just because...
01:31:18.000 Really, it's probably something we should think about, but I am hopeful that as technology improves, our understanding of humans improves along with it.
01:31:28.000 And also that perhaps some technology, like I'm not exactly sure what this neural link thing is with Elon that he's coming up with, but I think some of it has to do with a much more rapid access to information.
01:31:43.000 Sure.
01:31:44.000 You know, that has to do with increasing bandwidth.
01:31:46.000 Yeah, increased bandwidth, yeah.
01:31:49.000 Hopefully, that will become, I mean, you don't want to say hopefully some fucking wires they stick in your brain will become standard because that seems like we are merging.
01:31:58.000 I mean, that is merging, right?
01:32:00.000 That's the merge with technology.
01:32:02.000 Yes.
01:32:03.000 Marshall McLuhan said the greatest thing about this.
01:32:06.000 He said, human beings are the sex organs of the machine world.
01:32:10.000 What a great quote, right?
01:32:12.000 That's one of those quotes you just go, whoa.
01:32:15.000 That is exactly what it is.
01:32:16.000 The machine world can't make itself.
01:32:18.000 Needs us.
01:32:20.000 Like, if we do make artificial life...
01:32:22.000 And McLuhan, I think, wrote that in the 60s.
01:32:26.000 Yeah.
01:32:26.000 I think that's from...
01:32:27.000 What is the book?
01:32:29.000 He's got a book, Media Something.
01:32:32.000 But...
01:32:34.000 What is the book?
01:32:35.000 Marshall McLuhan.
01:32:37.000 So what is it, Understanding Media, or just Media?
01:32:45.000 Understanding media.
01:32:46.000 That's it.
01:32:47.000 1964. Imagine if that guy called it in 64. So definitely, it's a broader point a lot of people make that we are, in a way, enslaved by our technology.
01:33:01.000 I think in the book Sapiens by Harari, he makes the point about grain enslaving us, because with the agricultural revolution 10,000 years ago, we didn't really become better,
01:33:17.000 according to him.
01:33:18.000 It was less nutritious.
01:33:20.000 It was just a worse way of living than the hunter-gatherers did.
01:33:27.000 But it was very good for the population of grain, and there was no way back for us.
01:33:33.000 So I suppose that's the thing with all our inventions.
01:33:39.000 There are these lock-in effects that can kind of lock us into an inferior position.
01:33:48.000 Of course, the risk with artificial intelligence is that a similar thing happens, and that's not very benign.
01:33:59.000 We didn't foresee those consequences that we will be locked in in the year 2065 or whatever it is.
01:34:06.000 It's one of the more fascinating things about people, though, that we have the ability to contemplate the possibilities, that we have the ability to look at this and go, oh, okay, what are we doing here?
01:34:14.000 Hold on.
01:34:15.000 Hold on, we're making a mistake here.
01:34:17.000 Look what wheat's got us doing.
01:34:19.000 Look what rice has got us doing.
01:34:20.000 God damn it.
01:34:21.000 Look how many people there are in this city.
01:34:23.000 There's so many people in this city.
01:34:24.000 We've got to feed all these people.
01:34:26.000 Shit!
01:34:26.000 We didn't think of this.
01:34:27.000 We just kept breeding.
01:34:29.000 And, you know, that's the big concern when people start developing into new areas.
01:34:35.000 When people start expanding the technological or the rather societal sprawl.
01:34:40.000 When you see these urban sprawls just slowly encroaching on new land and pushing out into areas where there were no houses before.
01:34:48.000 It's always weird for me when I drive by a place.
01:34:51.000 Boulder, Colorado does a really good job of limiting the amount of construction that gets done there.
01:34:58.000 They're pretty fierce about it, but even they have been sort of lightening up a little.
01:35:02.000 Things have been getting built, and every now and then I'll drive by.
01:35:05.000 If I'm in Colorado, I'm like, oh, that wasn't there before.
01:35:08.000 Now it's there.
01:35:09.000 Everybody thinks it's harmless.
01:35:10.000 No big deal.
01:35:11.000 Just a new building.
01:35:11.000 Used to be an open field.
01:35:13.000 Who's that helping?
01:35:14.000 It's not helping anyone.
01:35:15.000 And then another building outside of that.
01:35:17.000 And then you have the ability to look in time 50 years from now.
01:35:22.000 You see this spread where this weird wart of humanity starts moving across the globe.
01:35:28.000 There's this cool feature in Google Maps where you can have, or Google Earth, where you can have time lapses from satellite photos for the past six years.
01:35:36.000 And for example, if you look at Dubai, 30 years ago, nothing.
01:35:41.000 Dubai is a crazy example.
01:35:43.000 That's a crazy example.
01:35:45.000 That place is so strange.
01:35:47.000 It's like Las Vegas on steroids.
01:35:50.000 Yeah, Las Vegas on steroids with its own islands.
01:35:54.000 Man-made islands, the shape of the world, like all the different continents of the world.
01:35:58.000 Have you seen that?
01:35:59.000 Yeah, yeah.
01:35:59.000 Some of it, apparently, they have to keep adjusting because the tide rises.
01:36:06.000 Is this the Google Maps thing?
01:36:08.000 There's another one.
01:36:09.000 There's another one in Dubai.
01:36:10.000 Oh, my God.
01:36:11.000 This is insane.
01:36:13.000 This is insane.
01:36:14.000 It's happening so fast.
01:36:16.000 How many years is this spanning?
01:36:18.000 20?
01:36:20.000 Oh my god, that's amazing.
01:36:22.000 So actually some of our engineers used to work for the dredger and they actually helped build the palm islands.
01:36:31.000 And apparently the problem is that the water doesn't really move in the arms of the palms so it kind of gets stinky and algae.
01:36:40.000 So this prime real estate that just...
01:36:43.000 Stinks.
01:36:44.000 Oh no, really?
01:36:45.000 Oh, that makes sense.
01:36:46.000 That the water in between it would get stale.
01:36:49.000 Of course, that makes sense.
01:36:52.000 Fuck, this is bonkers.
01:36:53.000 This video is bonkers.
01:36:55.000 Here you see the...
01:36:55.000 These are actual...
01:36:56.000 Oh, there's the...
01:36:57.000 That's the thing.
01:36:59.000 So all that water inside, yeah, it doesn't get recycled.
01:37:02.000 It doesn't move around.
01:37:03.000 No.
01:37:03.000 How the fuck did...
01:37:04.000 They had to do that to keep the...
01:37:05.000 Like an ocean break, right?
01:37:07.000 I suppose.
01:37:09.000 To keep the water from smashing into it?
01:37:11.000 Like the outside rim?
01:37:12.000 You can't just pump everything, I suppose.
01:37:14.000 So now what do they do?
01:37:15.000 With all the stinky water?
01:37:17.000 You just accept it?
01:37:18.000 Good question.
01:37:19.000 That fucking world's tallest building is bananas.
01:37:22.000 Right.
01:37:23.000 I've seen pictures that people have taken from the top floor.
01:37:26.000 It doesn't even look real.
01:37:28.000 Like all you flat earthers, you need to go to Dubai.
01:37:31.000 Get up there.
01:37:31.000 You can literally see the curve.
01:37:33.000 Yeah, and when you stand at its foot, because of its shape, it looks even taller.
01:37:38.000 Can you see the curve from up there?
01:37:40.000 I don't think so.
01:37:44.000 On an airplane, you can't see it either.
01:37:46.000 You can kind of see it.
01:37:48.000 That's true.
01:37:49.000 Yeah, you've got to kind of see it.
01:37:52.000 You've got to be way up.
01:37:54.000 Would you go on one of those Virgin trips?
01:37:56.000 Do they do a Virgin spaceship?
01:37:58.000 Fly above the earth and look down?
01:38:00.000 Once there's good safety statistics.
01:38:03.000 Right.
01:38:03.000 Good for you.
01:38:05.000 Fuck early adopting.
01:38:07.000 Yeah, it's probably not good to be an early adopter in the space area.
01:38:11.000 What kind of psychos would want to be on that first flight?
01:38:14.000 Well, I suppose Richard would have to go on one of the first ones himself, right?
01:38:19.000 Oh, if I was him, I would clone myself and put a fake me on that and see if that bitch blows up.
01:38:24.000 I wouldn't trust it.
01:38:25.000 Stunt double.
01:38:26.000 Yeah.
01:38:27.000 What is this, Jamie?
01:38:28.000 Is this from...
01:38:29.000 Oh!
01:38:29.000 Oh my god.
01:38:31.000 I'm getting vertigo just looking down.
01:38:33.000 I am freaking out.
01:38:35.000 This is all GoPro, so this is what flat earthers use to show that the earth is actually flat.
01:38:41.000 It's the perspective shift.
01:38:43.000 But just the height of that goddamn thing makes my palms sweat.
01:38:47.000 I was there last year, and there was somebody sticking her phone out of the...
01:38:53.000 No!
01:38:54.000 No!
01:38:56.000 No!
01:38:57.000 No!
01:39:00.000 Goddamn, people are crazy.
01:39:01.000 Yes.
01:39:02.000 Imagine if you dropped it and it fell and just went right through someone's fucking head.
01:39:07.000 Literally like a missile.
01:39:08.000 Boom!
01:39:09.000 Your head would just explode.
01:39:11.000 You imagine getting hit in the head with a cell phone from a mile up?
01:39:15.000 Motherfucker.
01:39:17.000 Motherfucker.
01:39:17.000 People are so crazy.
01:39:19.000 I'm going to take a picture and look down.
01:39:21.000 Look.
01:39:25.000 It would take so long, too.
01:39:27.000 That puts the term "burst mode" in a whole new perspective.
01:39:31.000 They should have an alarm that goes off when someone drops something.
01:39:33.000 Someone can quickly hit an alarm.
01:39:34.000 And by the time it hits the bottom, the alarm at the bottom will have gone off.
01:39:38.000 They can all back away from the building.
01:39:41.000 These guys are on the tip top.
01:39:42.000 No, shut the fuck up.
01:39:43.000 Oh my god, they're hanging on!
01:39:44.000 Are these Russians?
01:39:46.000 Dude, look at my hands.
01:39:47.000 This kid's with his dad, it looks like.
01:39:49.000 Oh, his dad's crazy too.
01:39:50.000 Great, you're both fucking nuts.
01:39:52.000 Dude, my hands are sweating right now.
01:39:54.000 I can't handle these.
01:39:59.000 That dude that we've had on, James...
01:40:01.000 With a K... Kingman?
01:40:10.000 The guy who does those.
01:40:12.000 Is that it?
01:40:13.000 Kingman, right?
01:40:15.000 He does these videos from these fucking places. Kingston.
01:40:19.000 Thank you.
01:40:19.000 Sorry, James. He's American?
01:40:21.000 I smoke a lot of weed.
01:40:22.000 No, he lives in England.
01:40:24.000 Okay.
01:40:24.000 And he takes these videos where he's like hanging on by one hand with a fucking selfie stick.
01:40:29.000 Yes.
01:40:29.000 Looking down at these giant skyscrapers.
01:40:32.000 My hands sweat so hard just watching those.
01:40:36.000 I can't imagine if I was actually doing that.
01:40:38.000 What a terrible fucking instinct or a terrible thing that happens to people.
01:40:42.000 When you see people hanging on to something, your hands sweat.
01:40:45.000 Hmm.
01:40:46.000 So do your hands sweat when you're hanging off?
01:40:47.000 Yeah, that's just not helping you.
01:40:49.000 That's the worst thing that could happen is your hands get all slippery.
01:40:53.000 Fuck, man.
01:40:55.000 I'm just looking at his channel.
01:40:56.000 It looks like he recently had an accident.
01:40:58.000 What?
01:40:58.000 A video posted on September 11th says "The Day I Nearly Died," and he's got a big old stitched-up gash.
01:41:05.000 Well, we talked to him about that when he was on the podcast.
01:41:08.000 Does he think about that?
01:41:10.000 But he's kind of locked into that now.
01:41:12.000 You're sort of married to this idea that you...
01:41:15.000 You know, you're the guy who goes up there and does that.
01:41:17.000 You can't just say, well, I've done it.
01:41:19.000 No more.
01:41:20.000 It's like the grain.
01:41:21.000 Yep.
01:41:22.000 It's like the grain.
01:41:23.000 Yeah, he's married to the thrill.
01:41:25.000 Well, Alex Honnold is the best example of that, right?
01:41:27.000 He's this free solo guy.
01:41:29.000 Oh, yeah.
01:41:29.000 A free solo climber that goes El Capitan with no ropes.
01:41:34.000 Did you watch the...
01:41:35.000 Yes!
01:41:36.000 Fuck that!
01:41:37.000 I've had him on the podcast a couple of times.
01:41:40.000 He freaks me out.
01:41:41.000 I just don't understand how he can do it.
01:41:43.000 I get it.
01:41:46.000 That's his thing.
01:41:47.000 He's passionate about it.
01:41:50.000 Sweat!
01:41:51.000 My hands are so sweaty!
01:41:54.000 It's just a weird thing that people do where they try to activate their adrenal glands.
01:41:58.000 They try to activate their thrill glands.
01:42:01.000 I do respect that.
01:42:04.000 It's his passion and he follows his passion.
01:42:08.000 Some people clean the ocean.
01:42:09.000 Some people climb a 45-degree angle backwards up the side of a mountain.
01:42:15.000 Wouldn't want to be his family, but...
01:42:17.000 No.
01:42:18.000 Right?
01:42:19.000 It looks like he might have stopped doing that.
01:42:22.000 Kingston?
01:42:23.000 I mean, he might be getting paid to still do it as, like, a stunt person for movies because he did one for a movie recently, but he has a video from this year recently that says, like, YouTube demonetized all his videos, and he's just been posting car stuff.
01:42:34.000 YouTube has demonetized his videos?
01:42:36.000 Why?
01:42:37.000 Because they don't want to incentivize?
01:42:38.000 Probably.
01:42:38.000 Oh, wow.
01:42:40.000 That's interesting.
01:42:41.000 He's had a video explaining it.
01:42:42.000 You could probably talk about it.
01:42:43.000 But I guess at the end of climbing: "YouTube demonetized me."
01:42:45.000 August 22nd.
01:42:47.000 Wow.
01:42:48.000 That's interesting.
01:42:50.000 There you go.
01:42:51.000 That's interesting.
01:42:53.000 Huh.
01:42:54.000 He's being more realistic.
01:42:56.000 Yeah, but here's the thing.
01:42:58.000 They demonetize it, but yet they still have it up.
01:43:01.000 And they still have an algorithm that, after his video, will suggest a bunch of other shit they're going to monetize.
01:43:08.000 But it might not have ads on it.
01:43:09.000 Right.
01:43:10.000 His video, I'm sure, doesn't have ads on it, but that doesn't mean it's not effectively a part of their system that sort of gets revenue.
01:43:19.000 Because, you know, they're going to recommend a bunch of videos that do have ads, and you're going to keep clicking.
01:43:23.000 I think as far as YouTube's concerned, it's just like, if an advertiser saw their ad on this video, we can't be a part of that.
01:43:30.000 Well, I think also from YouTube's perspective, almost all of it is illegal.
01:43:35.000 Very good point.
01:43:35.000 I think that's a big one.
01:43:37.000 Like, I don't think you could do illegal shit on YouTube and get money off of it.
01:43:40.000 Then they would be responsible in some sort of a way, right?
01:43:44.000 Make sense?
01:43:45.000 Boy, I'm over here fixing the world, dude.
01:43:47.000 Yeah, done.
01:43:50.000 Anything else?
01:43:53.000 Do we cover it all, basically?
01:43:56.000 Yeah, I think so.
01:43:58.000 Yeah.
01:44:00.000 Of course.
01:44:01.000 Theoceancleanup.com.
01:44:02.000 People, please go there and help.
01:44:09.000 We'll help, too.
01:44:10.000 We'd love to contribute, love to be a part of this, for sure.
01:44:14.000 If there's anything else we can do, if there's anything that you need promoted or you want to let people know, we'd be happy to help.
01:44:20.000 But thank you.
01:44:21.000 Thanks for being you, man.
01:44:22.000 Thanks for inventing this and thanks for pursuing this so doggedly and being so obsessed with what is an incredibly worthy cause.
01:44:30.000 Appreciate you, man.
01:44:30.000 Well, thanks so much.
01:44:31.000 My pleasure.
01:44:32.000 Alright, bye everybody.
01:44:33.000 See you!