The Joe Rogan Experience - March 13, 2017


Joe Rogan Experience #930 - Will MacAskill


Episode Stats

Length: 2 hours and 44 minutes
Words per Minute: 181.14
Word Count: 29,753
Sentence Count: 2,081
Misogynist Sentences: 39


Summary

In this episode of the podcast, I sit down with Oxford philosopher Will MacAskill to talk about the concept of "effective altruism" and why it matters not just to give, but to give effectively. We talk about what it means to be an effective altruist, why Will donates everything he earns above roughly $36,000 a year, how to tell the best charities from merely good ones, and where the conversation goes from there: billionaires and yachts, AI risk, the simulation argument, consciousness, and DMT. Will is a great guy and I really enjoyed our conversation, so I hope you do too. If you like what you hear, please hit subscribe on Apple Podcasts or wherever else you get your podcasts.


Transcript

00:00:00.000 We've got to figure out a way to make it so we just have a one-button thing where everything syncs up with one button.
00:00:07.000 Is that possible one day?
00:00:08.000 Maybe.
00:00:10.000 We're live.
00:00:11.000 Right now we're live?
00:00:12.000 We're live live.
00:00:12.000 Cool.
00:00:13.000 How are you, sir?
00:00:14.000 Yeah, I'm not too bad.
00:00:14.000 Welcome.
00:00:15.000 Thanks for coming, man.
00:00:16.000 Appreciate it.
00:00:16.000 Yeah, no worries.
00:00:17.000 Thanks so much for having me.
00:00:18.000 Sam Harris was going to be with us, but he flaked out at the last minute.
00:00:20.000 He's a busy man.
00:00:21.000 Yeah, he's a busy man.
00:00:24.000 So, I'm interested to talk to you about a bunch of things, but one of the big ones is this idea of effective altruism.
00:00:31.000 And this is something that you really promote to the point where, I don't know if this is true, but I read this about you, that everything that you make over $36,000 a year, you donate?
00:00:40.000 Yeah, that's right.
00:00:41.000 Wow.
00:00:42.000 Yeah, so everything...
00:00:44.000 Technically, it's everything above £20,000 in 2009 Oxford prices.
00:00:49.000 So it's adjusted for inflation, cost of living changes and stuff.
00:00:52.000 But that's about $36,000.
00:00:54.000 So you've just sort of decided that, which is, by the way, the top 1% for the whole world.
00:00:59.000 Yeah, not quite.
00:01:00.000 About 2%.
00:01:01.000 Yeah, I'll be in the top.
00:01:02.000 Still be in the top 2%, even despite...
00:01:04.000 I thought it was $34,000.
00:01:05.000 I think $34,000 puts you in the top 1%.
00:01:07.000 I think it's $55,000.
00:01:09.000 Oh, has it changed?
00:01:10.000 Maybe since Trump's been in office.
00:01:12.000 Yeah, that's right.
00:01:13.000 But it's a...
00:01:14.000 You know, what you're doing is...
00:01:17.000 If that's really the case, that's a very charitable thing.
00:01:20.000 Yeah, and it's also...
00:01:21.000 I mean, it's most of my income over the course of my life.
00:01:24.000 Like, especially as an academic, you're not going to earn tons.
00:01:27.000 Though...
00:01:29.000 Since effective altruism blew up, you end up getting things like speaking fees and, you know, I give all that away as well.
00:01:35.000 So it's gonna end up probably being, like, the large majority of income over the course of my life.
00:01:39.000 Do you ever, like, want to buy something and be like, shit, if I wasn't so goddamn generous, I'd be able to get this?
00:01:44.000 You know, I never do.
00:01:45.000 Really?
00:01:45.000 I, like, basically never think that, yeah.
00:01:47.000 I think, like...
00:01:50.000 I feel like in contemporary society, we just get bombarded with marketing stuff all the time, saying like, oh, you really need this thing if you're going to have a good life.
00:01:59.000 And I think in almost every case, that's just not true.
00:02:02.000 I think the psychological evidence just shows that once you're above a certain level of income, additional money just has a very small impact on your happiness.
00:02:12.000 And in my own case, like...
00:02:14.000 The things that make me happy are being surrounded by friends, that's free.
00:02:17.000 Gym membership, that's like $40 a month or something.
00:02:20.000 It's not very much.
00:02:21.000 I can afford that.
00:02:22.000 Being able to work on what I really am passionate about, and I already have that.
00:02:26.000 So my life is just so good in so many ways, and I feel like there's so much of a focus on money and how money is the key to happiness, and I think it's just all bullshit, basically.
00:02:36.000 Well, there's definitely some bullshit in it.
00:02:38.000 And I see that a lot in my neighborhood because I live where white people go to breed.
00:02:44.000 And they go to breed and they sit down and they just talk about things.
00:02:50.000 They talk about Range Rovers and certain watches and certain purses and shoes.
00:02:57.000 And it becomes this constant...
00:02:59.000 The amazing thing is just how you adapt.
00:03:01.000 It's called the hedonic treadmill.
00:03:03.000 The richer you are, the richer you need to be.
00:03:05.000 Oh, yeah.
00:03:06.000 So I was once part of a conversation.
00:03:09.000 I was going to give a talk, and I was traveling with a family, and I was on a private jet, in fact.
00:03:16.000 And the conversation was a discussion of different private jets and which private jets are better than others, and so on.
00:03:21.000 This other person has this really nice private jet.
00:03:24.000 And it just means that, like, at no stage do you ever lose the, like, oh, I could just have this nicer thing.
00:03:29.000 No, because you can get to the point where you want a jumbo jet, like one of those Qantas Airbuses and deck that out like a house.
00:03:36.000 Yeah, I mean, I'm sure that one of those Richard Branson-type characters probably has something like that.
00:03:42.000 Yeah, that's probably right.
00:03:43.000 Well, it seems to get to this...
00:03:45.000 You hit this critical mass stage where you, you know, like these billionaire characters, where they start buying $100 million yachts and $400 million yachts.
00:03:54.000 And what is the most expensive yacht?
00:03:56.000 I believe it's a half a billion dollars or more.
00:03:59.000 That's incredible, yeah.
00:04:00.000 And you need to have a staff to...
00:04:02.000 Take care of it the whole time.
00:04:04.000 And if it ever...
00:04:04.000 The thing is, I think if I had a yacht, that would make my life worse.
00:04:07.000 Because now I'd be stressing about this yacht, like, what if it gets damaged, like, I feel bad that I'm not using it.
00:04:13.000 Mmm, yeah.
00:04:14.000 Yeah, I would imagine.
00:04:16.000 Unless...
00:04:16.000 Well, I guess not, though, because if you kind of...
00:04:18.000 Look at this.
00:04:19.000 Oh, Jesus Christ!
00:04:21.000 It's a billion?
00:04:22.000 Billion dollars on a yacht.
00:04:23.000 The Streets of Monaco is what it's called, and it is one billion dollars.
00:04:27.000 Go to that thing.
00:04:28.000 That's it?
00:04:29.000 Thanks.
00:04:30.000 Oh my god, it's a neighborhood!
00:04:34.000 It's a floating neighborhood!
00:04:36.000 I think on all of these things you should replace the cost with how many bed nets you could buy for children in sub-Saharan Africa.
00:04:43.000 Oh, well, that's just ridiculous.
00:04:45.000 Hold on, go up.
00:04:46.000 Did one say $1.2 billion?
00:04:48.000 Scroll up.
00:04:50.000 Estimated price.
00:04:51.000 Oh my god, the Eclipse.
00:04:52.000 Oh, 450 million to 1.2 billion.
00:04:55.000 That's like when you go to get it made, and you go like, how much is it gonna cost me?
00:04:59.000 Like, between 450 million and 1.2 billion, you're like, ah, you know, normal money.
00:05:05.000 Yeah, yeah.
00:05:05.000 Normal shit.
00:05:06.000 Fuckin' change.
00:05:07.000 That is fucking insane.
00:05:09.000 Look at that goddamn thing.
00:05:12.000 I mean, oh, it's a replica of the Monaco Grand Prix track.
00:05:18.000 Oh my god, that's insane.
00:05:21.000 So you can drive around on your yacht at a ridiculous rate of speed.
00:05:26.000 So this guy probably has like a Ferrari that goes all over the surface of his crazy yacht.
00:05:34.000 He's got a fake beach!
00:05:36.000 But it hasn't been sold yet.
00:05:38.000 Oh, it hasn't?
00:05:38.000 So it's not actually owned yet, I don't think.
00:05:40.000 Mmm.
00:05:41.000 Oh, okay.
00:05:41.000 It's gonna be interesting who buys that.
00:05:42.000 They're gonna get a lot of attention.
00:05:44.000 Well, there are enough people.
00:05:46.000 There's a bunch of those people.
00:05:48.000 Yeah.
00:05:49.000 I mean, I don't know how many...
00:05:51.000 If it's 1.2 billion, that's probably, there's only a couple of thousand people in the world who are worth that much.
00:05:55.000 Really?
00:05:55.000 Yeah.
00:05:56.000 Even if they're willing to sink their whole fortune.
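To put rough numbers on the bed-net comparison Will floats above, here is a minimal sketch. The roughly $5-per-net figure is an assumption on my part (a commonly cited estimate for insecticide-treated nets distributed by the Against Malaria Foundation), not a number from the episode; the yacht prices are the ones discussed in the transcript.

```python
# Rough bed-net arithmetic for the yacht prices discussed above.
# COST_PER_NET is an assumed figure (~$5 per insecticide-treated net).
COST_PER_NET = 5.00  # USD, assumption

yacht_prices = {
    "Streets of Monaco (concept)": 1_000_000_000,
    "Eclipse (low estimate)": 450_000_000,
    "Eclipse (high estimate)": 1_200_000_000,
}

for name, price in yacht_prices.items():
    print(f"{name}: ~{price / COST_PER_NET:,.0f} nets")
```

At $5 a net, even the low estimate for the Eclipse buys on the order of 90 million bed nets.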
00:05:57.000 How many billionaires do you think there are worldwide?
00:06:00.000 Let's guess.
00:06:00.000 Three and a half thousand billionaires.
00:06:01.000 3,500?
00:06:02.000 You sound very confident.
00:06:04.000 I think it's about that, yeah.
00:06:05.000 Oh, that's a large number.
00:06:07.000 That is kind of crazy.
00:06:08.000 3,500 people that have more than $1,000 million.
00:06:12.000 Yeah.
00:06:14.000 And there's old Will MacAskill.
00:06:16.000 I know.
00:06:17.000 3,500?
00:06:18.000 Cuts it off.
00:06:20.000 Half that.
00:06:20.000 1,800.
00:06:21.000 1,800 people that are billionaires?
00:06:24.000 Oh, you're happy.
00:06:25.000 Well, no, I'm just happy we've got a fact checker on here.
00:06:27.000 Oh.
00:06:28.000 Correct all my false statistics.
00:06:30.000 Well, that's a lot of money, man.
00:06:32.000 But it is one of those weird things where I do not think that money equates to happiness.
00:06:36.000 One of the things that money does do is it alleviates the stress of bills.
00:06:40.000 But a lot of those stress of bills can be alleviated by not buying as many things, right?
00:06:44.000 It's like a lot of the stress of bills that people have is sort of self-imposed stress.
00:06:48.000 Like you get a mortgage for a very large house, you have car payments, you have all these different things that you're paying for.
00:06:53.000 So that kind of money stress that some people put themselves under is actually not really necessary, right?
00:07:00.000 Yeah, absolutely.
00:07:01.000 So if you broke it down to what do you actually need?
00:07:04.000 Just need a nice place to live where it's not crime-ridden and it's safe.
00:07:07.000 You need a bed.
00:07:08.000 What else do you need?
00:07:10.000 Food?
00:07:10.000 Yeah, you need food, exercise, obviously.
00:07:12.000 Are you one of those no TV dudes?
00:07:14.000 Do you have a TV? No.
00:07:15.000 Well, I watch, you know, Netflix, HBO. Oh, okay.
00:07:17.000 All right.
00:07:17.000 Just finished Veep, which I love.
00:07:19.000 Is it good?
00:07:20.000 Yeah, it gets better.
00:07:21.000 Really?
00:07:21.000 The first seasons aren't so good, but then it gets really good.
00:07:23.000 I don't have that kind of patience for not-so-good seasons.
00:07:26.000 Oh, yeah, I just get addicted.
00:07:28.000 Even if I watch something and I think it's awful, I still just, I will get addicted.
00:07:31.000 Right away?
00:07:32.000 I have to watch all of it.
00:07:33.000 Yeah, I have, like, the most compulsive personality.
00:07:35.000 Have you seen House of Cards?
00:07:36.000 I've not seen, deliberately not started House of Cards.
00:07:38.000 Oh, that's a good show.
00:07:40.000 That's a good show.
00:07:41.000 I'm deep into that.
00:07:42.000 Yeah, like, Game of Thrones makes my life worse.
00:07:44.000 I, like, hate it.
00:07:45.000 Really?
00:07:46.000 I think it's amazing television, but I find it just so distressing.
00:07:49.000 Because it's so good?
00:07:49.000 I still have to watch it all the time.
00:07:51.000 Why do you find it distressing?
00:07:52.000 The violence?
00:07:53.000 Yeah, the violence, people getting their heads popped and stuff.
00:07:56.000 Oh, that one with the mountain?
00:07:57.000 Yeah, that's the one that really stays with me.
00:07:59.000 Woo!
00:08:00.000 That's rough, yeah.
00:08:02.000 It gives me a lot of anxiety because I know there's only two seasons left.
00:08:06.000 And the next season, this one coming up, is only seven episodes, and the final season is only six.
00:08:10.000 I'm so happy about that.
00:08:12.000 It's like...
00:08:12.000 It's not making me happy, Will.
00:08:14.000 I'm not very happy about that at all.
00:08:15.000 It's like someone saying they're going to stop selling heroin or something.
00:08:18.000 And then you're like, well, I'm going to have to get hooked on OxyContins then.
00:08:22.000 That's what I feel.
00:08:23.000 I'm going to have to watch the whole season all over again, or the whole series.
00:08:27.000 So you have a television.
00:08:29.000 You have a computer, I'm sure.
00:08:30.000 Yeah, of course.
00:08:31.000 I have a computer.
00:08:33.000 Yeah, I move around a ton, so, like, I don't have a house, and it wouldn't be convenient to have a house because I'm traveling so much.
00:08:40.000 So you rent an apartment or something?
00:08:41.000 Yeah, I rent an apartment.
00:08:42.000 You live in England?
00:08:43.000 I live in Oxford most of the time.
00:08:45.000 I spend quite a chunk of my time out in the Bay Area.
00:08:48.000 Like, a significant part of the staff at our non-profit is out there.
00:08:51.000 I've got lots of contacts, sister organizations out there.
00:08:54.000 So most of your time, it seems like you're spending working for charitable organizations or... Yeah, so I have kind of three hats.
00:09:03.000 So one is an academic, so I'm a professor at Oxford.
00:09:06.000 Second is this kind of more public figure where I'm talking about these ideas through books or on this podcast and so on.
00:09:12.000 And then third is I run a nonprofit called the Center for Effective Altruism, which is more about like finding the best charities, the ones that are doing the most good, going to help other people the most, and trying to promote them and try and get people to give more and to give more effectively.
00:09:26.000 Yeah, we've gone over ineffective charities, or I shouldn't say ineffective, but charities that are, the way they're structured, when you look at how much money is actually going towards the charity itself, and how much is going towards the structure of the organization, it's kind of crazy.
00:09:42.000 Yeah.
00:09:43.000 Yeah, I mean, I think, so normally the focus with ineffective charities is on, like, yeah, how much is spent on overheads.
00:09:49.000 Right.
00:09:49.000 But I actually think that's not the most important thing.
00:09:52.000 The most important thing is, what's the charity actually doing?
00:09:55.000 Like, what's the actual program?
00:09:57.000 So, one charity, for example, that I'm sure, like, you'll find funny is a charity called Homeopaths Without Borders.
00:10:03.000 And it goes to Haiti, in particular, and distributes homeopathic remedies, which don't work.
00:10:09.000 They don't provide any health benefit.
00:10:11.000 And even if it had a 0% overhead cost, so it spent nothing on admin, everyone was volunteers, it would still be a bad charity.
00:10:18.000 You still shouldn't be giving to that charity.
00:10:20.000 Right.
00:10:20.000 That's a hilarious one.
00:10:21.000 I didn't know that that one existed.
00:10:23.000 Yeah, yeah.
00:10:23.000 It's kind of small.
00:10:24.000 I would imagine.
00:10:27.000 Thankfully.
00:10:28.000 Homeopaths without borders.
00:10:30.000 Jeez.
00:10:31.000 God.
00:10:47.000 What you should just care about is how much money you're putting in and what you're getting as an outcome.
00:10:51.000 Right.
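Will's point here, that overhead ratio and impact per dollar are different questions, can be made concrete with a small sketch. All the figures below are hypothetical, invented purely to illustrate the shape of the comparison.

```python
# Cost per unit of real-world outcome, given a charity's overhead
# ratio and what its program spends to deliver one unit of benefit.
# All numbers are hypothetical illustrations, not real charity data.

def cost_per_outcome(overhead_ratio: float, program_cost_per_unit: float) -> float:
    """Donor dollars needed to produce one unit of outcome."""
    return program_cost_per_unit / (1 - overhead_ratio)

# Charity A: lean (5% overhead) but runs a weak program ($500/unit).
# Charity B: higher overhead (20%) but a strong program ($5/unit).
print(cost_per_outcome(0.05, 500))  # ~526 donor dollars per unit
print(cost_per_outcome(0.20, 5))    # ~6.25 donor dollars per unit

# A 0%-overhead charity whose program delivers no benefit at all
# (the Homeopaths Without Borders case) has no finite cost per outcome.
```

The lean charity here costs roughly eighty times more per unit of benefit, which is the kind of spread between merely good and best charities that Will describes shortly after.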
00:10:52.000 Well, I think it's impossible for you to give $10 and all $10 is going to go directly to the charity because there's got to be overhead.
00:11:02.000 There's got to be infrastructure.
00:11:03.000 There's got to be a bunch of people working there, rent.
00:11:06.000 There's costs.
00:11:08.000 But the question is, at what point does it become kind of a scam?
00:11:13.000 Because there are most certainly some organizations that appear to be charitable organizations but are really kind of a scam.
00:11:22.000 Yeah, there's definitely some.
00:11:23.000 So like the Kids Wish Network, for example, kind of like the Make-A-Wish Foundation, similar idea.
00:11:28.000 And they spent 99% of their budget on fundraising.
00:11:31.000 So they were just like this kind of charitable Ponzi scheme, basically.
00:11:35.000 So they spent all their money on fundraising itself.
00:11:39.000 Yeah, to then invest in more fundraising.
00:11:42.000 And 1% somehow or another gets out there.
00:11:44.000 Maybe it's not as high as 99%, but it was about 90%.
00:11:47.000 Something crazy.
00:11:48.000 So what does that money get to?
00:11:50.000 What do they do with the actual money itself?
00:11:52.000 And then the idea behind that was granting wishes for sick children.
00:11:56.000 Do you remember the San Francisco thing with Bat Kid?
00:12:02.000 There was a big event, lots of publicity around it.
00:12:05.000 Was Bat Kid a child that had some strange disorder?
00:12:08.000 Yeah, so the child...
00:12:10.000 I don't know the details.
00:12:11.000 I think the child had leukemia.
00:12:13.000 Their wish was that they wanted to be Batman for the day.
00:12:15.000 Oh, okay.
00:12:15.000 This is a different thing.
00:12:16.000 Yeah, okay, cool.
00:12:17.000 So the Make-A-Wish Foundation set up this amazing story where they got to drive in a Batmobile and have this fantastic day where they're basically Batman for the day.
00:12:28.000 Kids Wish Network is doing basically the same thing.
00:12:30.000 They find seriously sick kids, often terminally ill kids, and say, what one thing would you want?
00:12:35.000 And we'll make it happen.
00:12:38.000 But there is a lot of focus on particularly bad charities.
00:12:43.000 You know, the ones that are just really corrupt or completely dysfunctional.
00:12:45.000 I think that's not actually the most important message.
00:12:48.000 What's most important is just even among the charities that are kind of good, even the ones that are making a difference, there's still a vast difference in the impact that you have.
00:12:57.000 Difference of hundreds or thousands of times between the charities that are merely good and the ones that are really the very best.
00:13:03.000 And that's primarily dependent on what program are they focusing on.
00:13:07.000 Hmm.
00:13:08.000 So, is there any charity that people should avoid spending their money on?
00:13:13.000 Like, are there charities that you feel like are just so ridiculously ineffective?
00:13:19.000 Yeah, I mean, like, the ones we mentioned of Kids Wish Network or Homeopaths Without Borders.
00:13:24.000 The Homeopaths Without Borders is just ridiculous.
00:13:26.000 It's like voodoo on parade.
00:13:28.000 Just stop.
00:13:29.000 Yeah, I mean, there's another one, I can't remember it, but it does...
00:13:34.000 Astrology Without Limits?
00:13:35.000 Astrology Without Limits.
00:13:36.000 No, it does dolphin therapy for autistic children, which has no evidence of working, but does actually just have some, like, risk of the children drowning.
00:13:47.000 Oh, Jesus Christ.
00:13:49.000 Yeah, so you can, like, cherry-pick these examples, but the thing is that these are just, like, not really representative.
00:13:53.000 In general, I think charities are doing good, but the question is just, like, in the same way as if you're buying a product for yourself, you don't just want to get, like...
00:14:03.000 You know, a laptop, as long as it works.
00:14:05.000 You want to find, like, what's the best laptop I can get with my money?
00:14:08.000 Right.
00:14:08.000 Or if you're investing, you want to not just get, like, an okay return.
00:14:11.000 You want to see, well, what's the best return I can get?
00:14:13.000 Right.
00:14:14.000 So in that sense, I think, like, the number of charities that you think are just, yeah, this is really competing for being the most effective charity in the world, that's actually very small.
00:14:22.000 So GiveWell, for example, is an evaluator.
00:14:25.000 It looks at all sorts of different global health and global development charities.
00:14:30.000 And its list of charities that's like, yeah, this is just super good.
00:14:33.000 You should really be donating to them.
00:14:34.000 It's only seven charities long at the moment.
00:14:37.000 Wow.
00:14:37.000 And that's up from last year when it was only four charities long.
00:14:39.000 Wow.
00:14:40.000 Seven charities out of how many?
00:14:41.000 I mean, what is the overall total of active charities?
00:14:45.000 It's got to be in the thousands.
00:14:46.000 Hundreds of thousands.
00:14:47.000 Yeah, I'm sure.
00:14:48.000 What got you involved in this?
00:14:50.000 You're a young guy.
00:14:51.000 You seem like you should be playing video games and skateboarding or something.
00:14:55.000 I spent a lot of my teenage years playing video games.
00:14:58.000 Yeah?
00:14:58.000 Yeah.
00:14:59.000 It was, again, compulsive personality.
00:15:01.000 Yeah.
00:15:02.000 I need to ban myself from doing it.
00:15:04.000 So your compulsive personality is now going towards good things.
00:15:06.000 Yeah, the key was managing my life so that the things I get really focused on and addicted to were good things rather than bad.
00:15:14.000 So yeah, it all started back in...
00:15:16.000 So I was back in high school, kind of undergraduate, and became very convinced by the arguments of this philosopher, Peter Singer.
00:15:24.000 Oh, I know Peter Singer.
00:15:26.000 He's like a radical animal rights activist as well, right?
00:15:29.000 Yeah, he has a few things.
00:15:31.000 And he had this argument, which is that, you know, the way I tell the story is: imagine someone is walking past a shallow pond, and they see a child drowning in that shallow pond.
00:15:45.000 And they could run in, and they could save the child.
00:15:48.000 But they're wearing a really nice suit, a suit that costs like $3,000.
00:15:52.000 And so they say, no, I'm not going to save that child.
00:15:56.000 I'm just going to walk by and let it drown, because I don't want to lose the cost of this suit.
00:16:00.000 I normally say, look, in moral philosophy, we have a technical term for people like that.
00:16:04.000 They're called assholes.
00:16:07.000 And this is how I convey it in my seminars.
00:16:13.000 And obviously we all agree, like, yeah, come on, if it's just you could clearly save this child that's right in front of you, you ought to do that.
00:16:20.000 The cost of $3,000 does not count.
00:16:23.000 But then what Peter Singer's insight is, he says, well, what's the difference between that child that's right there in front of you and that child that's in sub-Saharan Africa who you could save?
00:16:32.000 You'll never meet them, for sure.
00:16:35.000 But you could still save their life with just a few thousand dollars if you donate it to a really effective non-profit.
00:16:41.000 And he considers all the different ways in which these cases might be disanalogous, but decides ultimately, like, no, there's actually just no morally relevant difference.
00:16:49.000 And so, yeah, we do just have an obligation to give away at least a very significant proportion of our income.
00:16:56.000 And I was really convinced by this kind of on an intellectual level for many years, but I never really did anything about it.
00:17:04.000 And not until I went to Oxford to do a postgraduate degree in philosophy.
00:17:09.000 And in the summer between then I needed some money, I worked as a fundraiser for Care International, a global development charity.
00:17:15.000 So I was one of those annoying people in the street who would kind of get in your way and then ask you to donate $10 a month.
00:17:22.000 And it meant that all day, every day, I was talking about, like, look, this is the conditions of people in extreme poverty.
00:17:27.000 We can do so much to help people at such little cost to ourselves.
00:17:30.000 You know, why are we not doing this?
00:17:33.000 And I was just over and over again kind of getting these apathetic responses.
00:17:37.000 And I was just getting so frustrated because I just thought, look, these people are just not living up to their own values.
00:17:42.000 People clearly do care, but there's some sort of block going on.
00:17:46.000 And then I thought, well, I'm going to do philosophy.
00:17:48.000 And at the time, I was planning to do philosophy of language, logic, very esoteric stuff.
00:17:53.000 And so I thought, well, I'm not living up to my own values.
00:17:56.000 I should really try and make a change.
00:17:59.000 And so I went to Oxford, and I started asking a whole bunch of different academics, well, what's the impact of your work?
00:18:06.000 What kind of a difference have you made?
00:18:08.000 And normally they were like, I'm not really in it to make an impact.
00:18:11.000 I'm just kind of interested in these ideas.
00:18:13.000 And that was pretty disheartening.
00:18:15.000 But I kept persisting until I met another postgraduate student called Toby Ord.
00:18:20.000 And he just blew me away.
00:18:21.000 Because he had also been convinced by these ideas, but he'd gone one step further.
00:18:25.000 And he'd said, yep, I've made a commitment to give away almost all of my income over the course of my life, about a million pounds.
00:18:31.000 At the time, he was living on 9,000 pounds, saving 2,000 pounds, and donating 2,000 pounds.
00:18:37.000 So he was like really hardcore.
00:18:39.000 But the thing, as well as actually taking these ideas and putting them into practice, what really blew me away was just how positive he was.
00:18:45.000 And it was not that he was kind of wearing this hair shirt, flagellating.
00:18:49.000 Instead, he was saying, look, this is an amazing way to live.
00:18:53.000 We have this amazing opportunity to do a huge amount of good, to help so many other people, thousands of people, at what's actually a very low cost to ourselves.
00:19:02.000 And having that one person who also kind of shared my worldview, shared my ambitions, just meant that little psychological block was lifted.
00:19:14.000 And it meant that I was like, okay, cool, I'm on board.
00:19:18.000 First, I kind of committed 10%.
00:19:19.000 Then I was like, no, actually, I think I can do this further pledge.
00:19:23.000 And then that meant I had this question of, well, I'm planning to give away like a million pounds over the course of my life.
00:19:29.000 Where should that money go?
00:19:30.000 You know, I want to make sure it has as big an impact as possible.
00:19:33.000 And that meant I started digging into, well, how can we compare between different charities?
00:19:38.000 I found there was a ton of work from health and development economics that could help us to answer this.
00:19:43.000 And what began as this kind of side project between these two, you know, ivory tower academics, me and Toby, we found that loads of people just were really taken by this idea, both of giving more, but in particular of giving more effectively.
00:19:59.000 And over time, this kind of global movement called effective altruism started to form around these ideas and started to broaden in a couple of ways.
00:20:07.000 So, one is that it broadened away from just charitable donations to also thinking about, well, what should I think about with respect to my personal consumption?
00:20:16.000 What should I think about with respect to my career?
00:20:18.000 If I'm really aiming to do as much good as possible, what should I do?
00:20:22.000 And then secondly, also starting to think about cause areas other than just global poverty as well.
00:20:45.000 And then also preservation of the long-run future of humanity and worrying about risks of global catastrophe, things that may be fairly unlikely but would be very, very bad if they did happen, especially relating to new technology like novel pathogens,
00:21:03.000 viruses you could design in a lab and so on.
00:21:05.000 Well, you're also very concerned with AI as well, right?
00:21:08.000 Artificial intelligence?
00:21:09.000 Yeah, that's exactly right.
00:21:10.000 And that's, I think, in this category of... If you look at the history of human progress, technological change just creates these huge step changes in how humanity progresses.
00:21:22.000 So it was only 12 years, from 1933 to 1945, between Leo Szilard first coming up with the idea of the nuclear chain reaction.
00:21:33.000 And that was just a purely conceptual idea on a bit of paper.
00:21:36.000 12 years from that to the deployment of the first nuclear bomb.
00:21:41.000 And think how radical a change that is, suddenly being in the nuclear age.
00:21:46.000 That was only 12 years.
00:21:47.000 We went from the invention of the airplane to dropping an atomic bomb out of an airplane.
00:21:53.000 I believe it was 50 years, right?
00:21:56.000 Somewhere in the neighborhood of 50 years?
00:21:57.000 Give or take a few?
00:21:59.000 Yeah.
00:21:59.000 So technological progress can suddenly go in these huge leaps.
00:22:03.000 That we're not prepared for.
00:22:04.000 That we're often very not prepared for.
00:22:06.000 And I think artificial intelligence is in this category where we're really making radical progress in AI, especially over the last five years.
00:22:15.000 It's really one of the fastest developing technologies, I think.
00:22:19.000 And yet has huge potential in so many different ways.
00:22:22.000 And as with any new technology, huge positive potential.
00:22:26.000 Really, if you get AI right, you can solve almost any other problem.
00:22:29.000 But also potential risks as well.
00:22:32.000 Where there's risks that might be more familiar, you know, worries about automation, unemployment.
00:22:41.000 Worries about autonomous weapons, which I think should be taken seriously.
00:22:45.000 And then also just worries about, well, what if we really do manage to make human-level artificial intelligence?
00:22:50.000 There are very good arguments that it would then quickly move to superhuman-level artificial intelligence.
00:22:55.000 And what then?
00:22:57.000 Are we now in a situation like the Neanderthals versus Homo sapiens where we've suddenly created this intelligence that is greater than our own?
00:23:08.000 Are we able to control that?
00:23:10.000 Are we able to ensure that transition is positive rather than negative?
00:23:15.000 Have you ever considered the possibility when you look at all the impoverished people in the world, all the cruelty, all the people that are so just concerned with material possessions and shallow thinking and war and just the evil that men do?
00:23:30.000 Is it possible that we're sort of an outdated concept, that what we are as these biological organisms that are still slaves to the whole Darwinian, evolutionary, survival-of-the-fittest, natural-selection sort of paradigm that we've operated under for all these many thousands and hundreds of thousands of years as humans... is it possible that we're giving birth to the next thing?
00:23:57.000 That just like we don't long for the days when we used to be monkeys throwing shit at each other from the trees, one day we will be something different, whether it will be a combination of us and these machines, or whether we're going to augment our own intelligence with some sort of artificial...
00:24:18.000 whether it's some sort of an exo-brain or something that's going to take us to that.
00:24:23.000 Or it's going to be simply that we create artificial intelligence.
00:24:28.000 Artificial intelligence no longer has use for us because we're illogical.
00:24:31.000 And then that becomes the new life form.
00:24:33.000 And then we're hiding in a cave somewhere, hoping the Terminators don't get us.
00:24:38.000 Yeah, I mean, I think, like, over the long term, I mean, with all of these things, the question of kind of timelines is very hard.
00:24:44.000 And sometimes people want to reject this sort of discussion because, oh, this is so far in the future.
00:24:50.000 Whereas I think, like, if something's sufficiently important, we should be talking about it even if maybe it's, you know, decades or generations hence.
00:24:58.000 It might not be, right?
00:25:00.000 I mean, it might not be that far away.
00:25:01.000 But who knows?
00:25:02.000 Like with the atomic bomb, that was hugely fast progress.
00:25:06.000 Just, you know, 12 years.
00:25:07.000 So we want to be prepared.
00:25:10.000 But then as for, like, yeah, is it going to be Homo sapiens that's around in, you know, a thousand years' time?
00:25:16.000 I think that would just be extremely unlikely.
00:25:18.000 That will be around?
00:25:19.000 You think we're not going to be around anymore?
00:25:21.000 Yeah, I mean, I think if intelligent creatures are still around in a thousand years' time, it's going to be something that's not...
00:25:29.000 Homo sapiens, like you said, there's kind of three...
00:25:32.000 Or it's like not what we would consider kind of typical humans now.
00:25:36.000 Well, we're obviously severely flawed, right?
00:25:39.000 I mean, if you ask people, if you ask the average person, do you think that in your lifetime you can imagine a world without war?
00:25:47.000 Most people say no.
00:25:48.000 Like the vast majority of people say no.
00:25:50.000 A world without crime, a world without violence, a world without theft.
00:25:55.000 Most people say no.
00:25:57.000 That just shows you how inherently flawed most people view the human species.
00:26:02.000 We know that we can do it in small groups.
00:26:04.000 Like if the three of us were on an island, I'm pretty sure we wouldn't be stealing from each other and murdering each other, right?
00:26:09.000 Just a few of us.
00:26:10.000 But when you get to large-scale humanity, it becomes very easy to...
00:26:15.000 Disassociate, or create this diffusion of responsibility, where there's, you know, enough people so you don't really value them as much, and you're allowed to get away with some pretty heinous stuff. Especially when you consider drone warfare, things that we're able to do from long distance, where we're not seeing the person that we're having the effect on. It's a very flawed thing, the human species. Wouldn't it be better if something better came along?
00:26:41.000 I mean, I think there's, yeah.
00:26:43.000 Sorta.
00:26:44.000 Not good for you and I, though.
00:26:46.000 We'd be obsolete.
00:26:47.000 Yeah, I mean, well, we're going to be obsolete in a hundred years anyway.
00:26:51.000 I mean, as in, we'll be dead.
00:26:54.000 Right.
00:26:54.000 So the question is just, will our kind of, you know, generations hence, will, you know, the question's not really about us, it's about our grandchildren.
00:27:03.000 What really forces the idea...
00:27:06.000 To be considered, what is valuable about life?
00:27:10.000 Is it the experience?
00:27:12.000 Is it happiness?
00:27:13.000 Is it shared fun?
00:27:16.000 Is it love?
00:27:17.000 What's valuable about being a person?
00:27:20.000 And how much of that is going to change if we're made out of something that people have created, or maybe we're made out of something artificial intelligence has created, because we've created something that's far superior to us?
00:27:33.000 So yeah, I mean, I have a view on this, as you might expect.
00:27:36.000 I mean, in my view, the thing that's valuable and the only thing that's valuable ultimately is conscious experience.
00:27:44.000 So that's good conscious experiences, happiness, joy, and so on.
00:27:50.000 That's positive.
00:27:51.000 That's good for the world.
00:27:52.000 Negative conscious experiences, suffering, pain, distress, those are bad for the world.
00:27:56.000 And so that's why it's a good thing for me to do some service to you to benefit you, but I can't do anything good to benefit this bottle of water.
00:28:05.000 Right.
00:28:06.000 And so then the key question, in terms of what we should think about, is: supposing it is the case that, you know, in a thousand years' time it's synthetic life, it's artificial intelligence or something like that,
00:28:22.000 that's in charge and there are no longer any humans, would this be good or bad?
00:28:25.000 The question for me is, you know, are they having conscious experiences and are those conscious experiences good or bad?
00:28:32.000 So that's it.
00:28:33.000 Just conscious experience.
00:28:35.000 That seems so selfish.
00:28:37.000 It's a controversial view.
00:28:39.000 There's a thought experiment which is often used to challenge this view.
00:28:42.000 Do you want to hear it?
00:28:43.000 Yes.
00:28:44.000 So it's called the experience machine.
00:28:46.000 And the idea is, supposing that tomorrow you could plug into this machine.
00:28:53.000 It's like the most amazing VR you could ever have.
00:28:56.000 And in this machine, you will live, let's say you'll live 200 years, and you'll be in the most amazing bliss.
00:29:03.000 You'll have the most amazing experiences, you know, and your experiences will involve incredible relationships, incredible creative achievement, and so on.
00:29:12.000 And it'll just be like the perfect life that you could live experientially for the next 200 years.
00:29:18.000 And the question is...
00:29:22.000 Insofar as you are self-interested, so put aside considerations you might have about wanting to make the world a better place, but just insofar as you care about yourself, would you plug into this thing?
00:29:32.000 Bearing in mind that in a certain sense, all of these experiences are going to be fake.
00:29:36.000 You're going to have experiences of having amazing friendships, writing great works of art and so on.
00:29:44.000 But they're not going to be real.
00:29:46.000 It's just sensory inputs provided by a computer.
00:29:51.000 So the question is, would you, or ought you, insofar as you're self-interested, plug into this machine?
00:29:57.000 What would you answer?
00:30:00.000 That's a very good question.
00:30:01.000 I might already be plugged into it, right?
00:30:04.000 Oh, so this is a great question.
00:30:07.000 And I think a good argument against it is this question: well, supposing you were already plugged in...
00:30:11.000 Would you unplug?
00:30:12.000 Supposing I told you that actually you're a banker in Monaco and...
00:30:16.000 Fuck Monaco.
00:30:18.000 I'm not interested in that.
00:30:19.000 No.
00:30:20.000 I want to stay right here.
00:30:21.000 Yeah.
00:30:22.000 Can I stay plugged in, please?
00:30:23.000 Do I have to pay more?
00:30:24.000 What do I have to do?
00:30:25.000 You would have to do nothing, but...
00:30:27.000 It's interesting, then, if people think...
00:30:29.000 So most people...
00:30:31.000 And it seemed like maybe you yourself would, intuitively, say, no, I wouldn't plug into this machine.
00:30:36.000 I don't know if I would say that.
00:30:39.000 I would have to really deeply consider it, because right now, it's just so abstract, this idea that that could be possible.
00:30:47.000 It's fantasy.
00:30:48.000 We're having fun.
00:30:49.000 But...
00:30:50.000 When you talk to the leading minds when it comes to virtual reality or artificial reality or simulation theory, when they start talking about what will be possible one day, they're going to, without a doubt, within 100 years or 500 years or whatever the number is,
00:31:08.000 they're going to be able to create an artificial reality that's indiscernible from this reality.
00:31:12.000 You're going to be able to feel things.
00:31:14.000 There's going to be emotions that come to you.
00:31:16.000 They're going to be able to recreate every single aspect of an everyday life.
00:31:20.000 It's just a matter of time.
00:31:21.000 I mean, they're really close now.
00:31:23.000 And not really close in terms of, like, they don't give you emotions and they don't give you feeling.
00:31:28.000 But if you put on an HTC Vive and go through some of those virtual reality games, I mean, it's bizarre how real it feels.
00:31:36.000 Yeah, yeah.
00:31:37.000 And when you go back to like playing Pong, did you ever play Pong?
00:31:43.000 You know, it's such a weird thing that that happened inside of our...
00:31:47.000 When I was a kid, Pong came along and we were blown away.
00:31:50.000 We couldn't believe that we could actually do something on the television.
00:31:54.000 You could see it move.
00:31:55.000 It was so fantastic.
00:31:58.000 And if you gave that to one of my kids, they'd spit on it.
00:32:00.000 They'd be like, what kind of piece of shit video game is this?
00:32:03.000 They would think it's just so ridiculous.
00:32:05.000 But to me, at the time, it was amazing.
00:32:08.000 You go from that to one of these HTC Vive games, which has all taken place within my lifetime, and you go, well, a lifetime from now, if you follow the exponential increase in the ability, the technological innovation, it's going to be spectacular.
00:32:22.000 It's going to be...
00:32:23.000 So when that does happen, how will you be able to know...
00:32:28.000 If it's indiscernible, how will you know if you're in it?
00:32:30.000 And how do you know if you're not in it right now?
00:32:32.000 That's the real question, right?
00:32:34.000 Yeah, I mean, there are actually some arguments, you know, this is Nick Bostrom, a colleague of mine, his simulation argument, for thinking we are in a simulation right now.
00:32:42.000 In fact, that it's very likely that we are.
00:32:44.000 Yeah.
00:32:46.000 Do you buy that?
00:32:48.000 I actually, I'm kind of agnostic.
00:32:49.000 I think you should take the hypothesis seriously.
00:32:53.000 But I think the...
00:32:56.000 The argument doesn't quite go through for...
00:32:58.000 What's attractive and what's not attractive about that theory to you?
00:33:02.000 His version of it.
00:33:04.000 Yeah, so the argument is that...
00:33:05.000 Frame it, if you could, like his version of it.
00:33:07.000 Yeah, so his argument is that in the future, supposing we believe that the human race doesn't go extinct, or post-humans don't go extinct over the next few thousand years...
00:33:19.000 And secondly, that the people in the future have an interest in recreating their past, just for kind of historical interest or for learning, that they're going to be interested in running simulations, because they're now going to have huge, amazing computing power.
00:33:31.000 They're going to be able to create simulations of the past.
00:33:34.000 That they're going to have some interest in running simulations of the past.
00:33:40.000 Well, if that is true, then the number of simulations that these future people are going to be running will vastly outnumber the number of actual timelines, the kind of base universe, as it were.
00:33:54.000 So for the one real universe where history kind of unfolds, there's also, let's call it, 10,000 simulations of that universe.
00:34:05.000 And if that's true, then...
00:34:09.000 It's the case that, well, given that these things really are indiscernible for the people who are inside them, it's overwhelmingly likely, just on the base rates, that I'm going to be in a simulation rather than in the real world.
00:34:24.000 And what Nick Bostrom says actually is not that we definitely are in a simulation, but he just points out the conflict between these three kind of beliefs that we would seem to hold.
00:34:34.000 One is that we're not going to go extinct in the near future.
00:34:38.000 Two is that, you know, people in the future will have some interest in simulating the past.
00:34:43.000 And thirdly, that we're not living in a simulation.
00:34:46.000 And he himself gives, you know, a reasonable degree of belief.
00:34:49.000 Maybe he thinks it's like 10% likely, 15% likely that we're in a simulation.
00:34:54.000 Other people who understand the argument vary a bit more, but I think it's something you should at least be taking seriously.
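The base-rate step of the argument as Will lays it out can be written down in one line. This is a sketch of that step only, under the simplifying assumption (mine, not the episode's) that the base universe and each simulation contain comparable numbers of observers.

```python
# If one base universe runs k indistinguishable simulations of its
# history, and observers can't tell which they're in, a randomly
# chosen observer is simulated with probability k / (k + 1).
# Assumes roughly equal observer counts per universe (a simplification).
k = 10_000  # the transcript's illustrative figure
p_simulated = k / (k + 1)
print(f"P(simulated) ~= {p_simulated:.4f}")  # ~0.9999
```

Which is exactly why Will's infinite-universe objection below matters: once the count of base-universe observers is infinite, this ratio is no longer well defined.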
00:35:03.000 The reason I reject it is kind of even weirder, I think, or it's somewhat technical.
00:35:12.000 But the basic thought is just that, according to the best guesses from cosmologists, we're actually in an infinite universe.
00:35:22.000 The universe is infinitely big.
00:35:24.000 Now, we can't affect an infinitely big universe.
00:35:27.000 We're restricted by the speed of light to what we can affect and to what we can see.
00:35:32.000 But the best idea, according to the best theory we have, the universe just kind of keeps on going.
00:35:38.000 But if so, then there's already, like, an infinite number of observers, of people, kind of in that base universe.
00:35:47.000 And that means that you've now got kind of an infinite number of people kind of experiencing things, and then you've got the simulations, and you've got like 10,000 simulations.
00:35:57.000 But you can't say there's 10,000 times as many simulated beings as there are real beings, because there's already an infinite number of real beings.
00:36:08.000 You're looking so consternated.
00:36:11.000 No, no, no, go ahead, keep going.
00:36:12.000 But that means if you've got...
00:36:14.000 So the key of Bostrom's argument was that...
00:36:17.000 You've got 10,000 times as many simulated beings as you have real, like, non-simulated beings.
00:36:23.000 But the problem is an infinite number of real beings because the universe is infinite.
00:36:28.000 Yeah, that's right.
00:36:29.000 And so if you've already got an infinite number of real beings, the fact that you've got 10,000 times infinite, that's still infinite.
00:36:36.000 Right.
00:36:36.000 And you can't...
00:36:37.000 It's kind of a case where, like, our best methods of assigning degrees of belief to things kind of run out.
00:36:43.000 If you think it's, you know, there's an infinite number of...
00:36:47.000 simulated beings, an infinite number of real beings, then what's the chance of you being one or the other?
00:36:52.000 I mean, like, we don't actually have the, like, tools to be able to answer that.
00:36:56.000 Neil deGrasse Tyson was trying to explain this to me a couple of weeks ago, that there are infinities that are bigger than other infinities.
00:37:05.000 Yeah, so that's also the case, but...
00:37:07.000 Yeah, that was right.
00:37:08.000 Broke my brain again.
00:37:11.000 So the key is, we're all talking about the lowest of what's called cardinality, the smallest infinity, which is the size of the infinity of all the integers: one, two, three, four, the counting numbers.
00:37:27.000 Right.
00:37:27.000 And if you take that size of infinity and multiply it by 10,000, let's say, you just get the same number, which is infinity.
00:37:34.000 Right.
00:37:35.000 And then what Neil was saying was, yeah, there are these even bigger levels of infinity.
00:37:39.000 So if you look at not just all the counting numbers, but all of the real numbers, every possible decimal expansion, including the ones you can't write as fractions, that's just more numbers than the infinity of the counting numbers.
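The multiplication point is worth seeing explicitly: 10,000 copies of the counting numbers can be matched one-to-one with the counting numbers themselves, so multiplying the smallest (countable) infinity by 10,000 leaves it the same size. A minimal sketch of such a pairing:

```python
# Bijection between N and 10,000 labelled copies of N, showing that
# 10,000 x (countable infinity) is still the same countable infinity.
COPIES = 10_000

def to_copy(n: int) -> tuple[int, int]:
    """Map n to (which copy, position within that copy)."""
    return n % COPIES, n // COPIES

def from_copy(copy_index: int, position: int) -> int:
    """Inverse map: every (copy, position) pair comes from exactly one n."""
    return position * COPIES + copy_index

assert from_copy(*to_copy(123_456_789)) == 123_456_789
```

The reals escape this trick: no such enumeration of all decimal expansions exists, which is Cantor's diagonal result and the sense in which that infinity is strictly bigger.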
00:37:51.000 I've spent a lot of time trying to understand why human beings are so obsessed with innovation, why human beings are so obsessed with technological progress.
00:38:01.000 And one of the things that I continue to come to is that we think of everything in this world as being natural, but the behavior of butterflies and wolves and the way rivers run down from the mountain.
00:38:13.000 But we don't think of ourselves and our own behavior as natural.
00:38:17.000 We don't think of our own thirst for conquest and innovation and even materialism.
00:38:23.000 I think materialism is probably a very natural reaction to our need to somehow or another fuel innovation.
00:38:33.000 And that one of the ways to ensure that innovation is constantly fueled is that people are constantly obsessed with buying new things, constantly obsessed with the latest and greatest, which fuels innovation.
00:38:45.000 And when you look at the universe itself, and you look at all the various things that we know to be natural processes in the universe, like in order to make a human being, a star has to explode.
00:38:58.000 When you literally are made out of stardust, which is...
00:39:01.000 When you run that by people for the first time, they go, wait, what?
00:39:05.000 In order for you to have a carbon-based life form, that has to be created inside a burning, dying star, and that's the only way you make this thing, what you are right now.
00:39:14.000 And then that thing makes artificial reality, and then that thing makes...
00:39:20.000 Perhaps even crazier.
00:39:22.000 I mean, if you follow the ideas of technological progress, if something gets to a point where it's indiscernible from reality, how do you know it's not a new reality?
00:39:32.000 How do you know it's not a new kind of reality?
00:39:46.000 Yeah, yeah.
00:39:58.000 You're in these gigantic fake worlds where you're traveling from place to place, but right now we're looking at it in a very two-dimensional way.
00:40:06.000 You're looking at it on a flat screen.
00:40:08.000 One day it's not going to be two-dimensional.
00:40:10.000 One day it's going to be something that you're interfacing with.
00:40:14.000 Your consciousness is interfacing with it.
00:40:17.000 Is it only real if we can take it and drop it on something?
00:40:23.000 If we can hit it with a hammer?
00:40:24.000 If we could put it on a scale?
00:40:25.000 If we can use a measuring stick and measure it?
00:40:28.000 Is it only real there?
00:40:30.000 Or is it real if it follows every single check?
00:40:35.000 Like if you check off every single item on the list of conscious reality and conscious experience?
00:40:41.000 Yeah, I think that's a great question, because I think the dichotomy that a lot of people think in terms of natural, non-natural, I think it's just meaningless.
00:40:50.000 I mean, people firstly think this is natural and this is not.
00:40:53.000 I mean, in a sense, everything we're doing is natural because homo sapiens are part of a natural process.
00:41:01.000 And maybe in another sense, everything we're doing is not natural.
00:41:05.000 But then why does that matter?
00:41:06.000 What's the moral relevance of something being natural versus not natural?
00:41:10.000 Lots of stuff that happens in the natural world is just really awful.
00:41:14.000 Huge amounts of cannibalism, murder, suffering.
00:41:20.000 So it's not clear why we would care about something being natural rather than non-natural.
00:41:26.000 But then the second question is, yeah, let's consider this virtual reality again, this experience machine that you could plug yourself into.
00:41:36.000 And as part of the description, I said, oh, none of this would be real.
00:41:40.000 You'd have all of these interactions with people that you think are friends and so on, but that wouldn't be real.
00:41:45.000 And I think you could very well push back on that and say, why should something be physically instantiated?
00:41:53.000 Like...
00:41:54.000 In order for it to count as a real experience.
00:41:57.000 Why is it not the case that in this virtual reality you're interacting with algorithms, but that's just as much...
00:42:04.000 At least it's possible for that to be just as much friendship as if you're interacting with people who are, you know, flesh and blood.
00:42:12.000 And I think it's hard to explain kind of what the difference would be.
00:42:16.000 Because, you know, if you think about Star Trek...
00:42:19.000 Jean-Luc Picard can be friends with Data, an android.
00:42:24.000 He's not biological, but we think that you can still have moral worth and friendships and so on with creatures that are not made of human biology.
00:42:36.000 In which case, why does the fact that something merely lives on silicon,
00:42:41.000 or exists seemingly as mere software, mean that you couldn't have a genuine friendship with that thing, if it acts in a sufficiently sophisticated way, perhaps?
00:42:53.000 Isn't there also an issue with our incredibly limited ability to view reality itself?
00:42:57.000 Because we're only viewing the dimensions that are relevant to us in this current state of carbon-based life form, this talking monkey clinging to the spaceship flying through the universe, right?
00:43:10.000 This is what's important to us.
00:43:12.000 But when you pay attention to those, the dudes who write on yellow legal pads, and they get into quantum physics, and they have all those crazy equations that nobody but them understands... Maybe you do.
00:43:21.000 I look at that shit and I go, what the fuck are they writing?
00:43:24.000 But they believe, I mean, what is the current model?
00:43:27.000 They believe there's at least 11 dimensions.
00:43:30.000 There perhaps could be more.
00:43:31.000 What if there is a dimension that you can plug into that it's purely consciousness-driven, meaning there's no physical experience, there's no touching the ground, there's no gravity, but you exist in a conscious state and it's perpetual.
00:43:45.000 Like, if you take a rocket ship, and it gets past our gravity and shoots off into distant space, and you have a clear shot of, you know, 14 billion years back to the beginning of the universe itself with nothing in the way, you're just gonna keep going for 14 billion light years.
00:44:01.000 You're just gonna keep going.
00:44:02.000 Like, what if there is a place that your consciousness can go to like that, where it...
00:44:07.000 It's no longer burdened by biology, by the timeline of birth to death.
00:44:12.000 By the limitations of the flesh, but consciousness itself can exist in some bizarre dimension that we just haven't access to.
00:44:22.000 So yeah, I mean, I think consciousness is probably just ultimately a physical process.
00:44:26.000 Why do you think that?
00:44:27.000 Ultimately, because of conservation of energy.
00:44:33.000 The reason being, so, you know, there's this age-old philosophical debate between the monists and dualists.
00:44:42.000 People who think, is consciousness just ultimately some sort of physical process?
00:44:47.000 Or is it something special?
00:44:48.000 So Descartes thought there was this...
00:44:51.000 Pineal gland, this little bit of your brain, and your conscious kind of soul was just kind of steering your monkey body through this pineal gland.
00:45:02.000 But the question is just for why...
00:45:05.000 I think the strongest argument about why that couldn't be right is it seems to be...
00:45:10.000 It would have to be creating energy out of nowhere.
00:45:12.000 And we've never...
00:45:14.000 It seems to be just fixed law of the universe that that just can't happen.
00:45:19.000 Because in order for, you know, this conscious mind to, if it's not merely a physical process, if it's not just the brain, in order for it to be able to affect what this physical entity is doing, it would have to use energy to be able to do that.
00:45:34.000 So the energy would have to be coming from somewhere, and if it's not coming from just the physical realm, then suddenly we've got this counter-example to all the rest of science.
00:45:43.000 Sort of, but are you aware of the theories of human neurotransmitters being pathways to other dimensions like dimethyltryptamine?
00:45:51.000 Do you know about all that?
00:45:53.000 I mean, I know about DMT. Do you know it's produced in the pineal gland?
00:45:57.000 Where Descartes thought that all that stuff was going on, the seat of the soul, what the Egyptians called the Eye of Horus, and the reason why the Catholics and so many ancient religions were so focused on pine cones in...
00:46:07.000 their art and their imagery, that's the pineal gland.
00:46:12.000 That's the image of it.
00:46:13.000 That's what it's supposed to represent.
00:46:15.000 And for people who've had these intense transformative psychedelic experiences by consuming exogenous dimethyltryptamine, which is also produced by the brain, you have these insane transformative experiences where you feel like you are traveling to other dimensions.
00:46:30.000 Yeah, so I think...
00:46:31.000 I mean, I do want to say, like...
00:46:32.000 Have you done any of that?
00:46:34.000 I've never done DMT, no.
00:46:35.000 Oh, you son of a bitch.
00:46:36.000 Why not?
00:46:36.000 What are you doing?
00:46:37.000 You're wasting your time.
00:46:38.000 I know.
00:46:38.000 I'm such a good boy.
00:46:42.000 But it's something that's in the brain.
00:46:43.000 I mean, it's a natural product of human biology.
00:46:46.000 I mean, whether it's natural or not isn't the question.
00:46:48.000 Just, you know, if I'm going to have a career based on my brain, I want to be very careful to...
00:46:54.000 To not break it?
00:46:55.000 To not break it, yeah.
00:46:56.000 Yeah, but it's one of the most transient drugs ever observed in the body.
00:46:59.000 Your body brings it back to baseline in like 15 minutes.
00:47:02.000 Okay, because, I mean, I do think people very often greatly overestimate the risks of non-legal drugs; like, MDMA is super safe and so on.
00:47:15.000 Overestimate the risk, is that what you're saying?
00:47:17.000 Of MDMA? Yeah.
00:47:20.000 MDMA is weird, right?
00:47:21.000 That's a weird one.
00:47:22.000 It's not a natural drug.
00:47:25.000 Dimethyltryptamine, I think the real concern would be psychological, because what you face is so bizarre.
00:47:30.000 Terence McKenna had the best quote about it, I think he said, that you would risk death by astonishment.
00:47:38.000 Yeah.
00:47:40.000 It's so bizarre that it's almost a sin for a guy as smart as you to not experience it.
00:47:46.000 But you just come right back and even when you're there, you're there.
00:47:49.000 It's you.
00:47:49.000 It's not like your consciousness dissolves into some bizarre quasi-living state and then you have to work your way back to being you again.
00:48:00.000 No, you're you.
00:48:01.000 You're Will MacAskill in the dimension, whatever the fuck it is.
00:48:05.000 But what's crazy about it is that this is produced in the very area where Descartes believed the seat of the soul is, and so many different Eastern religions, all these different...
00:48:22.000 religions and all these different cultures, they were all convinced that that one gland had some massive significance in terms of the spirit and the soul, whatever that means, whatever the spirit means.
00:48:35.000 So yeah, so then the question is just in these experiences, is it the case that you're like genuinely seeing into another dimension?
00:48:44.000 Right.
00:48:44.000 Or is it the case that you just have a new kind of perspective on consciousness?
00:48:47.000 So one thing I do think is that, in terms of conscious experience, there's the sort of conscious experiences that humans have access to.
00:48:55.000 And I think that must just be 0.001% of the entire landscape of possible conscious experiences.
00:49:01.000 So if you think, imagine if you were a bat and you could echolocate.
00:49:05.000 That's just a radically different conscious experience.
00:49:08.000 I don't think that maps onto any sort of conscious experience that humans could have.
00:49:11.000 Have you seen people do that?
00:49:12.000 You see blind people?
00:49:13.000 Some blind people can do that?
00:49:14.000 It's pretty amazing.
00:49:15.000 It is amazing.
00:49:16.000 Very effectively, too.
00:49:17.000 It's like shockingly effectively.
00:49:19.000 Yeah, I think you're absolutely right.
00:49:22.000 I mean, but there's also experiences, human experiences, that are available without drugs that some people have achieved through radical states of meditation and kundalini yoga, where they could achieve natural psychedelic states.
00:49:35.000 Holotropic breathing, people that have done that have experienced, like, really radical psychological transformations and incredible psychedelic experiences from that as well.
00:49:44.000 Yeah, and so I think, like...
00:49:47.000 These sorts of experiences are very important, very interesting.
00:49:52.000 I said that maybe we experience 0.001% of all possible conscious experiences, and that just allows you to see a little bit more of this potential vast landscape.
00:50:03.000 Whereas I think there's nothing unmagical about saying ultimately that's all explained in terms of physics, in terms of different sorts of neurons firing and different sorts of transmitters and so on.
00:50:15.000 We don't need to say, oh, and it's also this other thing which breaks all the known laws of physics that you're seeing into some other dimension in order for that to be an incredibly important thing.
00:50:27.000 And nor is it unscientific to say we know almost nothing about consciousness.
00:50:32.000 In terms of the areas of scientific inquiry, we have no understanding at all about the relationship between conscious experiences and, you know, what we would think of as physical processes.
00:50:46.000 We really have no idea, you know, if you give me any sufficiently complicated physical processes, which ones are conscious and which are not. All we can go on is really this: well, I'm conscious, and so I know that things that are kind of like me are probably conscious too.
00:51:01.000 And that's the best we've got, really.
00:51:04.000 And this is known as the hard problem of consciousness.
00:51:07.000 And philosophers often say that they've solved it with something, and I think it's always begging the question.
00:51:12.000 I think we should be very open to the fact that, just as in, you know, 3000 BC, people had no idea about the laws of physics.
00:51:20.000 This was just completely unexplored territory.
00:51:23.000 We should think that for contemporary science, this is just a big black gap in our scientific understanding.
00:51:32.000 And perhaps it's something maybe 21st century science, maybe 22nd century science can really get to grips with.
00:51:38.000 It does seem like the ultimate question.
00:51:41.000 Like, what is it for?
00:51:43.000 Why is it here?
00:51:45.000 What controls it?
00:51:46.000 Is it in the mind?
00:51:47.000 Is it external?
00:51:48.000 Is the brain just an antenna that tunes into consciousness?
00:51:53.000 The dimethyltryptamine question is so bizarre because it's the most potent psychedelic drug known to man and your brain makes it.
00:52:01.000 What's it in there for?
00:52:03.000 I don't know if this is a myth, but I've heard it's what gets made when you die.
00:52:07.000 Yeah.
00:52:07.000 They believe that during high rates of stress, your body believes you're going to die.
00:52:11.000 And when you're dreaming, when you're in heavy REM sleep, your body produces larger amounts of it than baseline.
00:52:19.000 But they don't know.
00:52:21.000 It's really difficult.
00:52:22.000 They've only just now, within the last few years, the Cottonwood Research Foundation, which...
00:52:29.000 Dr. Rick Strassman has a big part of it.
00:52:31.000 He's the guy who wrote the book DMT, The Spirit Molecule.
00:52:35.000 He did a bunch of the first FDA-approved drug trials with civilians where they took people and they gave them a Schedule I drug, dimethyltryptamine, which is so crazy that it's a Schedule I drug that your body produces.
00:52:51.000 And he documented
00:53:01.000 the different commonalities that these people had in their experiences.
00:53:05.000 And he's working very closely with the Cottonwood Research Foundation.
00:53:08.000 And one of the things that they've recently discovered is that it's produced by the pineal gland. Before, that was just anecdotal evidence.
00:53:15.000 We knew that DMT was produced by the liver and the lungs, but now they know for sure, because they've isolated it in rats.
00:53:22.000 So in living rats, they know that the pineal gland produces DMT.
00:53:25.000 So that explains a lot of ancient Eastern mysticism and all the symbology, all these symbols that people had to represent this gland.
00:53:36.000 Now they know, okay, well this gland definitely does produce this incredibly potent psychedelic drug.
00:53:40.000 But now the question is, at what levels, during what periods of stress, do you have to bring someone to the point of death before they experience this?
00:53:50.000 And if that is the case, is it possible that consciousness itself, since we haven't really figured out what exactly it is, is it possible that consciousness can travel through this chemical pathway? That maybe these intense dimethyltryptamine experiences are in fact a gateway to what people have assumed exists from the beginning of time,
00:54:14.000 like an afterlife, or a sea of souls, or some stage of existence other than this physical existence that we all experience right now?
00:54:24.000 Yeah, so, I mean, I feel like I'd be...
00:54:27.000 Sounds like crazy talk, right?
00:54:28.000 It sounds pretty crazy.
00:54:30.000 It's coming out of my mouth and I'm going, what the fuck are you talking about, dude?
00:54:34.000 I think I'd just be surprised if consciousness was just this one chemical.
00:54:38.000 I think it's much more likely that it's this emergent phenomenon from this incredibly complex system of billions of different neurons firing in a certain way.
00:54:48.000 And when you have a certain process that's sufficiently complex in the right way, somehow, and this is just this big black box that we've got no idea about, somehow subjective experience comes out of that.
00:54:59.000 But it would seem...
00:55:00.000 I mean, otherwise the issue is you could have DMT just traveling in a test tube or a Petri dish or something.
00:55:06.000 And it would seem like, oh, is this Petri dish conscious?
00:55:10.000 That would seem really strange.
00:55:11.000 Why would that be the case?
00:55:12.000 If you're breathing air and the air keeps you alive, like you're breathing in and bringing out, you don't think that air carries the life with it to another place, right?
00:55:22.000 Air is just a component of life.
00:55:23.000 It's something that your body requires.
00:55:25.000 Yeah.
00:55:25.000 So, I mean, it's possible.
00:55:26.000 Maybe it's the case.
00:55:28.000 Though, again, I feel I'd be surprised if it was like this chemical is necessary for consciousness in some way.
00:55:33.000 I'm not saying it's necessary.
00:55:34.000 But I am curious as to how consciousness varies.
00:55:37.000 You know, consciousness and the actual feeling of being alive varies depending upon your health, depending upon stress levels.
00:55:46.000 Depending upon love and happiness. All these different factors change the way you view the world, which is really interesting, because in effect that changes consciousness, and you can be more, you know, more elevated. Like, I guarantee you, all this effective altruism that you're concentrating on is somehow or another elevating your consciousness, because you're putting out so much love and so much happiness and you're helping so many people.
00:56:14.000 There's so many positive benefits to your very existence.
00:56:17.000 I've got to believe that somehow or another that manages to come back to you.
00:56:22.000 I mean, it definitely comes back to me in kind of how I feel about my life.
00:56:27.000 I mean, when we were talking about how money is just not the key to a happy life, the question is, well, what is?
00:56:34.000 And the answers are having a great community, having a greater purpose in life, feeling like you're making a difference.
00:56:42.000 So all of these reasons are why.
00:56:44.000 So we've built up this kind of community around effective altruism.
00:56:47.000 You know, people all around the world who are making a significant change.
00:56:51.000 So for example, donating 10% of their income to the charities they think are most effective or pursuing a career that they think is really effective.
00:56:59.000 And one thing I wasn't expecting from the outset, but am so happy happened, is that this strong community has formed.
00:57:05.000 It's kind of like a little global village or something.
00:57:07.000 And people have found that actually, far from being a sacrifice, as you might have expected, this is actually incredibly rewarding.
00:57:16.000 Because you've now got this community of people who have shared aims with you, and you're all working towards this greater goal.
00:57:23.000 And that's something that I think is very lacking in the world today.
00:57:26.000 So many people just...
00:57:29.000 They work 9 to 5, and they have a nice time on the weekend, but they're like, where is all of this going?
00:57:35.000 At the end of my life, am I really going to think, yeah, I made the most of this?
00:57:40.000 Whereas if you think at the end of your life, like, yep, I dedicated my life to helping others, and I had this transformative impact on thousands of people, you're not going to think at the end of your life, gee, I really wasted that.
00:57:52.000 It's just not something I think you can really look back on and regret.
00:57:54.000 If you go deep, though, down the philosophical rabbit hole, you really consider that life is this temporary experience, and even benefiting someone through this temporary experience is still a temporary experience. It's like you gave them a pillow for the ride, and it's a temporary ride. The ride comes to an end, and then what? What is the point of all this? What is the point of effective altruism if you're just helping people during this temporary ride? That doesn't seem to mean anything.
00:58:25.000 Yeah, so I think there's two things.
00:58:27.000 I like your eyebrow.
00:58:28.000 It's really cool.
00:58:29.000 I can't help myself.
00:58:30.000 I can do that too.
00:58:31.000 Just raise up.
00:58:32.000 I just go, what the fuck is this?
00:58:35.000 Freak myself out.
00:58:36.000 Well, we do get freaked out at this, you know, when you think of existential angst.
00:58:42.000 The angst of existence.
00:58:45.000 So I think there's two answers here.
00:58:49.000 The first is that the ride is the goal, ultimately.
00:58:54.000 Again, if you think the purpose of life is to increase the amount of happiness and reduce the amount of suffering, the final goal is good experiences, and the kind of anti-goal is bad experiences.
00:59:04.000 So when we're sitting here talking, having a great time, this is us kind of achieving.
00:59:08.000 This is us getting points on the win counter.
00:59:12.000 Because we're having a good time.
00:59:13.000 That's right, yeah.
00:59:14.000 If we were really hating this, then we'd be losing.
00:59:16.000 Well, even more so because we're broadcasting this live and millions of people are going to hear it.
00:59:21.000 And hopefully they're enjoying it.
00:59:23.000 Hopefully.
00:59:24.000 And maybe if they're not, at least there's a little stress relief.
00:59:26.000 Like maybe they're at the gym and they go, these fucking idiots!
00:59:29.000 And they're doing squats and they're getting angry.
00:59:31.000 Yeah.
00:59:31.000 So I think that's the first thing.
00:59:33.000 But then the second thing relates to this idea of cosmic significance.
00:59:41.000 Where what often motivates...
00:59:43.000 So you say, oh, we're just along for a ride.
00:59:45.000 We're all going to get eaten up by the sun eventually, and so on.
00:59:47.000 What's the kind of greater purpose of life?
00:59:50.000 But I actually think there are some ways that our actions now can have much greater cosmic significance.
00:59:57.000 And that's because, I think, if you think that the human race survives for the next few centuries, it seems kind of inevitable that we're going to spread to the stars.
01:00:10.000 And I think that would be good.
01:00:11.000 Again, from this perspective, we can go into more arguments if you want, of just saying what we want to do is promote happiness and reduce suffering.
01:00:20.000 If that means we can live on other planets as well and have kind of thriving civilizations there, not only where the people are having great lives, but also making scientific, artistic contributions and so on, then that's a good thing to do as well.
01:00:34.000 Well, there's no technological reason for thinking that we won't be able to do that in the future, given current rates of technological progress, unless something really bad happens along the way.
01:00:44.000 And this kind of gets back to one of the things we talked about right at the start: one of the focus areas of the effective altruism community is trying to reduce risks of human extinction, of global catastrophic risks.
01:00:59.000 These are the sorts of things that could imperil the human journey, as it were.
01:01:07.000 And I think that if you're working to mitigate some of these things, Then you're increasing the chance that we do get to the sort of level where humanity can have a thriving future, not just on this planet, but on other planets as well.
01:01:22.000 And that actually means your actions really do have this huge cosmic significance.
01:01:27.000 So the conscious effort to be a kind person, a generous person and effective altruism spreads and it impacts people.
01:01:37.000 There's this ripple effect, and your good deeds could perhaps fuel enough people with this thought and with effective altruism, and more people might act on that, to the point where we reduce the amount of suffering, to the point where we extend the lifespan of human beings,
01:01:55.000 we extend the areas where we have no war, we reduce the amount of violence to the point where we can successfully innovate to the point where we can get off this planet.
01:02:04.000 And then start from scratch with a new dictator on Mars.
01:02:07.000 Donald Trump on Mars.
01:02:09.000 How about that?
01:02:10.000 Yeah, I mean, so I think...
01:02:14.000 Putin on Mars.
01:02:14.000 Well, if he could become president of Mars, I'd be pretty happy with that.
01:02:17.000 It'd be fascinating.
01:02:17.000 We'd have to go to war with Mars.
01:02:19.000 Do you think, though, I mean, I've wondered about this many, many times.
01:02:23.000 I wonder if it's an outdated idea, this idea of traveling to the stars.
01:02:26.000 And again, I go back to this whole interdimensional thing.
01:02:30.000 I wonder if that's the reason why we have never been visited by other planets, by species from another planet.
01:02:36.000 Maybe that's not what happens.
01:02:38.000 Maybe they develop artificial realities.
01:02:40.000 Like what Jamie was talking about to me with these artificial computer realities.
01:02:45.000 If someone develops some sort of a matrix-like world where you can plug into it and experience an infinite number of things, an infinite number of artificially created dimensions that are indistinguishable from this, why would you want to, like, risk a six-month trip in a metal tube to another planet?
01:03:04.000 I mean, maybe that's really retro.
01:03:05.000 Maybe that's a really ancient way of looking at things.
01:03:08.000 Maybe it's like zeppelins, like big flying balloons instead of, you know?
01:03:12.000 So, yeah, the question you've raised is called the Fermi Paradox.
01:03:15.000 Right.
01:03:16.000 Which is, just given there's so much: 100 billion stars in our galaxy, 8 billion galaxies in the affectable universe, 100 billion in the observable universe. The universe is also pretty old, 15 billion years old.
01:03:32.000 So if it was the case that life is very common, that it's very easy for us to, life to then develop to a level of advanced technological ability, we should expect to see evidence of aliens all over the place.
01:03:46.000 But yet we see absolutely none.
01:03:49.000 And that means that somewhere along the path from a habitable planet to a spacefaring civilization, there must be some big filter.
01:04:01.000 There must be some step that's just incredibly hard, some step that it's incredibly unlikely for civilization, or life, to move through.
01:04:11.000 And one hypothesis is this, yeah, like, people just...
01:04:15.000 Civilization gets to a sufficiently advanced level and they just chill out.
01:04:20.000 Or they go internally.
01:04:21.000 Yeah, they go internal.
01:04:22.000 The issue with that explanation, I think, is it's just not strong enough.
01:04:26.000 Because...
01:04:28.000 You'd have to think that that's, for this kind of filter to work, it has to be a really strong filter.
01:04:33.000 Filter?
01:04:34.000 Yeah, as in, because there's just so many stars, so many Earths, so many seemingly habitable planets, it has to be the case that it's exceptionally unlikely at some stage or other.
01:04:46.000 Like, not just really unlikely, but, you know, one-in-a-trillion unlikely,
01:04:53.000 somewhere on this path from habitable planet to spacefaring civilization.
01:04:58.000 And so you'd have to think, of a trillion civilizations that get to this level of technological ability, they all choose to turn inward.
01:05:06.000 And that seems just very unlikely.
01:05:08.000 It seems like, well, at least one would really try and spread out.
01:05:11.000 And if so, then we'd see evidence of that.
01:05:16.000 Because, cosmically speaking, the time from getting to the level of technological capability where you can spread to the stars and the level where we'd be able to kind of see real evidence of that is kind of small.
01:05:30.000 So I actually think that the reason that we can't see aliens is because the very first stages of life are incredibly unlikely.
01:05:39.000 The move from nothing to kind of basic replication, and then secondly, the move from single-celled organisms to multi-celled organisms.
01:05:48.000 And the reason for thinking this is very unlikely is it took an incredibly long time on Earth, billions of years before this happened.
01:05:57.000 And in particular, in the move from single-celled to multi-celled life, that's only ever happened once.
01:06:02.000 And so, given that we don't see any aliens, we should think some part of this is really hard.
01:06:08.000 Our best guess is that that move from single-celled to multi-celled, and perhaps from the creation of the first cells as well, that was incredibly difficult.
01:06:18.000 And that means that we're just exceptionally lucky to be alive, as it were.
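A back-of-the-envelope sketch of that filter logic in Python; every probability below is an illustrative assumption chosen to show the shape of the argument, not a figure from the conversation:

```python
# Back-of-envelope Great Filter arithmetic. All per-step probabilities here
# are illustrative assumptions, not numbers anyone cites on the show.
STARS_IN_GALAXY = 100e9       # the rough figure cited in the conversation
HABITABLE_FRACTION = 0.1      # assumed fraction of stars with a habitable planet

# Assumed per-step probabilities along the path described above.
steps = {
    "abiogenesis (nothing -> basic replication)": 1e-10,
    "single-celled -> multi-celled": 1e-3,
    "multi-celled -> intelligence": 1e-2,
    "intelligence -> spacefaring technology": 1e-1,
}

p_path = 1.0
for p in steps.values():
    p_path *= p

expected = STARS_IN_GALAXY * HABITABLE_FRACTION * p_path
print(f"Expected spacefaring civilizations per galaxy: {expected:.6f}")
# With one very hard early step (here 1e-10), the expectation falls below 1,
# matching a silent galaxy. A "turn inward" filter instead needs every one
# of the civilizations that do arise to abstain from spreading, which means
# multiplying many independent near-1 probabilities and is far harder to sustain.
```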
01:06:23.000 But if the universe is infinite, that means that this has happened an infinite number of times.
01:06:30.000 That's right.
01:06:31.000 Though it might be very far away, sufficiently far away that we can't contact each other or observe each other.
01:06:39.000 But there's an infinite number of those infinitely far places.
01:06:45.000 So there would be some clusters of the universe.
01:06:49.000 And again, the idea of the infinite universe is only a hypothesis.
01:06:52.000 And I'm just deferring to other people who say it's the leading hypothesis.
01:06:55.000 Well, the most puzzling hypothesis to me was the evidence of supermassive black holes being at the center of every galaxy.
01:07:04.000 And that the hypothesis was that the supermassive black holes are exactly one half of one percent of the mass of the entire galaxy.
01:07:12.000 And that if you go through those supermassive black holes, you may in fact go into a completely new universe, filled with hundreds of billions of galaxies, each with supermassive black holes at the center of those galaxies, which will take you to hundreds of billions of galaxies in another universe.
01:07:29.000 It's never-ending, and that's what the real infinity is.
01:07:33.000 It's not just the mass of all the things that we can observe in the 14 plus billion light years that we know of from the Big Bang to today.
01:07:41.000 It's all of those things being portals to incredibly different, totally new universes.
01:07:49.000 Okay, yes, it's turtles all the way down.
01:07:51.000 Turtles all the way down.
01:07:52.000 So the real question to me, and I proposed this to Brian Cox and I didn't get a sufficient answer, is: why would we assume that there's someone more advanced than us?
01:08:06.000 It is possible that someone, some species, something is the tip of the spear.
01:08:13.000 That something is the first.
01:08:16.000 That something is the most advanced life form in the universe.
01:08:20.000 Why would we assume that someone would be more advanced than us if we are the most advanced thing that we can find?
01:08:27.000 The only logic that I could point to was that we are relatively young in terms of the history and the age of the universe itself.
01:08:36.000 The universe itself being roughly 14 billion years old.
01:08:38.000 We are 4.6.
01:08:41.000 What is the age of the earth?
01:08:43.000 Somewhere in there?
01:08:44.000 Somewhere in the neighborhood, right?
01:08:46.000 Relatively young when you consider that 10 billion years of existence, give or take, happened before we came along.
01:08:55.000 But why would we assume that there's anything out there that's more advanced?
01:08:59.000 And why would we assume that this isn't as far as anybody's ever gotten?
01:09:04.000 In terms of infinity, right?
01:09:06.000 14 billion years seems like a long time.
01:09:09.000 But in terms of infinity, it's a blink.
01:09:12.000 So I think we should believe that in the...
01:09:15.000 And again, let's now just ditch the infinity and just think about the observable universe, which is finite.
01:09:20.000 Because people pulled over sweating in their car right now.
01:09:22.000 Yeah, yeah, exactly.
01:09:24.000 Infinity...
01:09:25.000 Have you ever heard of Graham's number?
01:09:26.000 This is now a total digression.
01:09:28.000 Graham's number...
01:09:30.000 I don't believe so.
01:09:31.000 What is Graham's number?
01:09:32.000 It's known as the largest number ever seriously used in a mathematical proof.
01:09:38.000 And Tim Urban of Wait But Why has this amazing post trying to explain just how big Graham's number is.
01:09:45.000 And you have to use a special notation in order to be able to explain it.
01:09:49.000 And numbers just get really big.
01:09:51.000 And once you really start to think this through, you're just like, you're left just kind of walking back and forth.
01:09:58.000 Yeah, not like just totally freaked out.
01:10:01.000 Yeah.
01:10:01.000 For our little monkey minds.
01:10:02.000 Because you think like trillion is so big.
01:10:03.000 Yeah.
01:10:04.000 Trillion is just a speck of dust compared to Graham's number.
01:10:07.000 Right.
01:10:07.000 Even a trillion years is a speck of dust.
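For anyone curious about the "special notation" mentioned here, it's Knuth's up-arrow notation; a minimal runnable sketch (the function and the small test values are illustrative, and Graham's number itself is far beyond any computation):

```python
import sys
sys.setrecursionlimit(100_000)  # the recursion gets deep even for tiny inputs

def up(a: int, n: int, b: int) -> int:
    """Knuth's up-arrow operator a ↑^n b: one arrow is exponentiation,
    and each extra arrow iterates the operation one level below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 1, 3))  # 3↑3  = 3^3 = 27
print(up(3, 2, 3))  # 3↑↑3 = 3^(3^3) = 7,625,597,484,987
# Graham's number starts at g1 = 3↑↑↑↑3 (four arrows), then g2 uses g1 arrows,
# and so on 64 times. Even g1 cannot be written down or stored.
```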
01:10:09.000 When you consider the possibility of the universe itself being infinite, or the possibility that it's a continuous cycle of big bangs, to expansion, to contraction back to an infinitely small point, back to another big bang, which is a plausible possibility.
01:10:24.000 Yeah, I mean, I think, yeah, I'm also very worried, you know, I'm not Neil deGrasse Tyson, I'm sure I'm butchering tons of the science.
01:10:34.000 I think my understanding at the moment is that we currently think that the universe is just expanding and it just keeps expanding further.
01:10:41.000 I know it was definitely a leading theory that it was going to expand and slow and then kind of crunch.
01:10:45.000 Yes.
01:10:46.000 But you mentioned humans being the most advanced kind of creature.
01:10:57.000 I think that probably is correct in the observable universe, or certainly in our galaxy, let's say.
01:10:57.000 Well, we know it is in our solar system, right?
01:11:00.000 Yeah, that's right.
01:11:00.000 But I think we know it is in our galaxy as well.
01:11:02.000 You think so?
01:11:04.000 It's so far.
01:11:06.000 But the thing is that it's like 100,000 light years.
01:11:11.000 Oh, nothing.
01:11:12.000 But when you're thinking about 15 billion years of the age of the universe, that's actually just a very short period of time.
01:11:18.000 Right.
01:11:18.000 But why would you assume that 100,000 light years from now, there's not something exactly like us?
01:11:24.000 So it's possible.
01:11:25.000 But the thing is that if it was...
01:11:28.000 Somewhat easy, or if it was just not incredibly difficult for intelligent life to evolve, then it would have happened in the past already and we would see evidence of it.
01:11:37.000 And the fact that we don't see any evidence at all of intelligent life in other solar systems suggests that it's incredibly difficult for that to happen.
01:11:47.000 But isn't that like being in the woods and unzipping your tent and sticking your head out and saying, I don't see anything.
01:11:54.000 This must be empty woods.
01:11:57.000 It's more like...
01:12:00.000 I mean...
01:12:01.000 You're talking about a very small area that you've observed and we've taken account of.
01:12:08.000 So I think it's more like...
01:12:10.000 Because I think...
01:12:13.000 If an alien civilization, or us in the future, goes to start spreading to the stars, then in the course of, cosmically speaking, just a few seconds, a million years, let's say, there will be really significant evidence.
01:12:29.000 You'd see Dyson spheres being constructed around suns, you know, to harness the sun's energy.
01:12:35.000 You'd see some evidence of, like, galactic engineering projects and so on.
01:12:39.000 It would be like a really big impact.
01:12:41.000 Do you think you'd see that with hundreds of thousands of light years between us and the observable objects?
01:12:47.000 But again, 100,000 light years is just not very long compared to the kind of 15 billion.
01:12:54.000 So it would just be this amazing coincidence if it's the case that
01:12:59.000 a life that's as advanced or more advanced than us has evolved at just the same time as us, where 100,000 years, give or take, is basically just the same time, but hasn't evolved more than a million years ago, when we would start to see the major impacts of that.
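The coincidence claim is simple arithmetic; here is a rough sketch with assumed numbers (the five-billion-year span in particular is an illustrative assumption, not something stated on the show):

```python
# Rough timing-coincidence arithmetic; both constants are illustrative assumptions.
WINDOW_YEARS = 100_000   # "basically the same time as us," per the discussion
SPAN_YEARS = 5e9         # assumed span over which such a civilization could arise

# If the other civilization's arrival were uniformly random over that span,
# the chance it lands within +/- WINDOW_YEARS of ours:
p = 2 * WINDOW_YEARS / SPAN_YEARS
print(f"Chance of arising at 'the same time' as us: {p:.4%}")  # 0.0040%
# Anything that beat us by much more than a million years should already be
# leaving visible traces, which is the point being made above.
```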
01:13:16.000 So if something within the observable universe...
01:13:19.000 But we've observed so little.
01:13:20.000 We don't even have really adequate photographs of anything outside of our solar system.
01:13:24.000 I mean, everything is just radio spectrum.
01:13:28.000 You know, the analysis is that they're getting off of light waves of what the components of the atmosphere is.
01:13:35.000 So using your analogy, what I'm suggesting is that if it was the case that intelligent life was not that hard to come by, you'd stick your head out of the tent and it would look like Tokyo rather than looking like the woods.
01:13:49.000 But why does it have to look like Tokyo?
01:13:51.000 Why can't it look like Kansas?
01:13:53.000 Why can't it be like really spread out and very little life?
01:13:57.000 Because I think if life is spreading out, then it's just going to want to, what does life do?
01:14:02.000 It just tries to harness resources and tries to grow more of itself.
01:14:05.000 Maybe it reaches a point where it realizes that's futile.
01:14:08.000 It just concentrates on effective altruism at home.
01:14:11.000 So that's the turning inward suggestion again.
01:14:14.000 And so maybe it's the case that like, yeah.
01:14:18.000 Like, is it more important to get your shit together at home or to go all over the world with the same bullshit ideas?
01:14:26.000 Right?
01:14:26.000 And if that's the case...
01:14:28.000 Wouldn't that be the same thing that you could turn towards interstellar travel?
01:14:32.000 Like, wouldn't it be more important for these communities to concentrate on taking care of their planet and figuring out a way to work in some sort of harmonious fashion with the very nature of the planet itself rather than travel to the stars?
01:14:48.000 I mean, possibly.
01:14:48.000 But now imagine there's...
01:14:50.000 So on this alien planet, there's 10 billion aliens, and they're like, let's say they're a thousand years more advanced than humans are at the moment.
01:15:01.000 In order for this argument to work, it'd have to be the case that every single one of them makes that decision to just turn inwards and focus on...
01:15:08.000 Why would that be the case?
01:15:09.000 Because not all those people would be the ones that would innovate in the first place.
01:15:12.000 It wouldn't have to be everyone that makes a decision, but it would have to be everyone of a high enough consciousness to figure out how to make these interstellar machines decides not to harness this nuclear power and jet off into space.
01:15:23.000 But I think over time that would just be everyone.
01:15:27.000 Really?
01:15:28.000 Well, yeah, I mean, just technological progress just keeps going, and eventually, like, I mean, obviously we're doing this, like, weird thought experiment.
01:15:35.000 Right, right, right.
01:15:36.000 Speculating on, like, economics and sociology of a hypothetical alien world.
01:15:41.000 But, uh...
01:15:44.000 I mean, just at some point, as a civilization progresses, then there's going to at least be many, many actors with sufficient power and capability to spread to the stars.
01:15:58.000 And you need to say that every single one of them decides to turn inwards.
01:16:02.000 So it's sort of like technology becomes very rare and then ultimately over time becomes very common, like the cell phone.
01:16:09.000 Like the cell phone, yeah.
01:16:10.000 Right.
01:16:10.000 So when a cell phone was first invented, it was extremely rare and very expensive.
01:16:15.000 Now everyone has one and the capabilities of those cell phones have greatly, greatly improved.
01:16:21.000 Yeah.
01:16:21.000 And that this will happen with everything, including space travel.
01:16:24.000 Yeah, I mean, but also it doesn't need to be the case that it gets out to 10 billion people, even if it's just like 1,000 people or something.
01:16:30.000 Again, it would just seem unlikely that in every civilization, every one of those people chooses not to. That not a single person thinks,
01:16:40.000 hey, I just want us to spread out.
01:16:44.000 Now, that obviously is dependent upon there being a more advanced civilization than human beings on planet Earth.
01:16:51.000 Because if there weren't, if they were a few years behind us, like if they were stuck in the 1950s, or maybe they're stuck in ancient Greece, then obviously they don't have the capabilities yet.
01:17:01.000 We might be the very most advanced.
01:17:04.000 We might be the very tip of the spear, right?
01:17:06.000 Yeah.
01:17:06.000 And I just think, yeah, because I think it would be unlikely that...
01:17:12.000 something more advanced happened just a little bit faster than us, but not, say, 100 million years ago, which is not very long ago in cosmic terms.
01:17:22.000 But it's still possible.
01:17:23.000 I mean, it's still possible that something happened 100 years quicker than us, or that they haven't had the same setbacks that we've had in terms of, like, asteroid impacts and natural catastrophes, supervolcanoes and the like.
01:17:37.000 It's a real weird thought experiment, because you start thinking, and you start extrapolating, okay, well where are we gonna be?
01:17:43.000 You know, where are we gonna be?
01:17:45.000 And why would we do that?
01:17:46.000 Like, that's one of the things that always gets me about this whole trip to Mars. I have a joke about it in my last comedy special. Somebody actually said this to me, before California had solved its drought, or Mother Nature solved our drought for us, rather. People were like, hey man, we should really consider going to Mars,
01:18:02.000 because, I mean, look at our environment, California's almost out of water. And my joke was like, we're right next to the fucking ocean.
01:18:09.000 Like, there's so much water, you can't see the end of it.
01:18:11.000 We have a salt problem, we don't have a water problem.
01:18:14.000 Like, what are you gonna do?
01:18:15.000 You gonna bring water to Mars?
01:18:16.000 Like, that's the stupidest thing I've ever heard in my life.
01:18:18.000 Yeah, there's this weird, when people start talking about Mars, I mean, I think, so there's the project of going to Mars, setting up a colony.
01:18:26.000 Now, like, the aim of doing that, because it's awesome.
01:18:29.000 Totally on board with that.
01:18:30.000 In the same way as, like, going to the moon, it's like, look what we can achieve.
01:18:33.000 This is an exciting, like, global human project.
01:18:35.000 Even just the space shuttle going into orbit, it's pretty badass, right?
01:18:38.000 Yeah, exactly, exactly.
01:18:39.000 But then this talk of like, oh, well, we need this in order to be able to survive as a species.
01:18:44.000 I'm like, look, if you want to have this kind of refuge or colony in order to make the Earth more robust, Mars is just not a great place to pick.
01:18:54.000 There's so many different ways that, I mean, Mars is like really inhospitable.
01:18:59.000 And if you wanted to build a refuge, why not go under the sea?
01:19:03.000 That's, like, going to be protected from, you know, viruses or asteroid impacts and so on.
01:19:10.000 Not really, though.
01:19:11.000 If one of those big things that's slammed into the Yucatan slams into where your village is in the sea?
01:19:15.000 I mean, if you had this underwater village with, you know, 10 years of food supplies and so on, then you could, like, come back.
01:19:22.000 Because the impact from the asteroid wasn't just that it, like, shook everyone up.
01:19:26.000 It's that the sky is gone.
01:19:27.000 Mm-hmm.
01:19:28.000 The skies get clouded over with ash.
01:19:30.000 The Earth rang for a million years.
01:19:33.000 Oh, what is that?
01:19:34.000 As in like...
01:19:34.000 From the impact.
01:19:35.000 Like...
01:19:35.000 Yeah.
01:19:37.000 That's so interesting.
01:19:38.000 That's so insane.
01:19:39.000 Yeah.
01:19:39.000 When you think about how big that thing was that killed the dinosaurs 65 million years ago, and that there's hundreds of thousands of those things floating around in space.
01:19:47.000 So yeah, I was asking some people at NASA just two days ago, actually, about how many of them we've managed to identify.
01:19:56.000 Because they're serious about kind of scanning the skies to find them all.
01:20:04.000 And the answer... I thought we had it covered.
01:20:04.000 I thought this was something that NASA was like, yeah, yeah, we know where all the Earth killers are.
01:20:09.000 And their response was like, no, we've got no idea.
01:20:12.000 We don't know how many of them are out there, and so we don't know how many we've managed to track.
01:20:17.000 There's a guy named Randall Carlson that I've had on the podcast a few times, and he's obsessed with the idea that asteroidal impacts were probably what ended the Ice Age, you know, 10 to 12,000 years ago.
01:20:28.000 And there's a significant amount of physical evidence that points to this.
01:20:34.000 Both in evidence of impact and in nuclear glass.
01:20:37.000 I think it's called trinitite.
01:20:39.000 I forget the exact word.
01:20:41.000 But it appears all throughout Europe and Asia at around that same timeline, between 10,000 and 12,000 years ago, when they do core samples.
01:20:50.000 And it points to this idea that there were significant...
01:20:55.000 impacts from asteroidal objects all over Europe and all over Asia around that time. They think some of them slammed into the ice caps. You know, a giant chunk of North America was covered in as much as two miles of ice just 10,000 years ago. And he points to an incredible amount of physical change in the environment that looks like it took place over a very short period of time.
01:21:21.000 Like catastrophic change over an incredibly short amount of time that he believes points to these impacts melting the ice caps, creating massive amounts of flooding, killing off who knows how many people, resetting civilization in many different parts of the world.
01:21:39.000 This evidence of the nuclear glass, of these micro-diamonds that also exist, they find them during nuclear test sites when they blow off bombs, and they also find them at asteroid impact sites.
01:21:53.000 And when you know that we have been hit many times in the past, and they do have evidence of that, and then you see the moon and all the different impact craters on the moon, you know that this is just what he calls a cosmic shooting gallery, essentially.
01:22:06.000 He's like, it's very likely that that was the cause of the end of the Ice Age.
01:22:11.000 There's a lot of this climate data that sort of seems to point to that as well.
01:22:15.000 So this is now, like, really outside my area of expertise.
01:22:20.000 I'll send you some links to some of his stuff, because he's been obsessed with this for about 30 years.
01:22:25.000 Fascinating guy.
01:22:26.000 The two things that would really surprise me about that are, firstly, just that there were so many ice ages, and it just seems to be this cycle: it comes on, it goes off.
01:22:34.000 Oh, sure, yeah.
01:22:37.000 You know, fairly dynamic, predictable process, whereas asteroid impact, super random.
01:22:42.000 So you wouldn't expect to have this kind of back and forth dynamic if it was asteroids that was doing it.
01:22:47.000 And then secondly, my understanding would be that asteroids would cool the planet because asteroid hits, ash just spreads out all over the sky.
01:22:55.000 That just blocks out sunlight.
01:22:57.000 So it would surprise me if it had this kind of warming effect.
01:23:01.000 Well, I think the idea is that, first of all, when it hits, the impact is massive, and it melts off a huge chunk of the ice that is covering North America, right?
01:23:13.000 And that's one of the things that causes this massive flooding and this massive changing of the topography.
01:23:19.000 And as far as, like, what causes the natural...
01:23:21.000 I don't know if it interrupts it temporarily, and then it comes back and gets warmer.
01:23:27.000 But, yeah, that natural cycle of...
01:23:29.000 warming and cooling has been going on from as far back as they can measure it.
01:23:34.000 What he's talking about is significant quick changes.
01:23:37.000 Also the extinction event that killed somewhere around 65% or more of all of the large mammals in North America.
01:23:47.000 Really quickly, like woolly mammoths, really quickly.
01:23:50.000 Sabertooth tigers, really quickly.
01:23:52.000 They don't know about that.
01:23:53.000 There's a lot of speculation back and forth about that.
01:23:55.000 Because they think that humans did it, but then they found these mass death sites where they're not consumed.
01:24:02.000 There was one that he showed where these woolly mammoths, they found them where their legs were broken, and it looked like just the impact of something had knocked them flat, and they had found like thousands of them in these mass death sites.
01:24:17.000 Interesting.
01:24:17.000 But I thought that the...
01:24:18.000 So firstly, it just seemed to me like the homo...
01:24:22.000 The idea that it was humans killing them all just seems like...
01:24:24.000 Crazy.
01:24:25.000 Oh no, I thought it just seems like such a good explanation.
01:24:27.000 But they didn't even have...
01:24:28.000 They had atlatls.
01:24:29.000 That was like the best weapon they had at the time.
01:24:31.000 They weren't even riding on horseback at the time.
01:24:34.000 But then with respect to the death sites, I thought the mechanism for killing a woolly mammoth is you've got like 200 humans and you just chase the woolly mammoth off a cliff.
01:24:41.000 That does work if you can get them near a cliff.
01:24:43.000 But the idea of getting them all near cliffs and killing them all off by a bunch of people that hadn't figured out the wheel seems a little unlikely.
01:24:51.000 It's just...
01:24:52.000 It's possible.
01:24:53.000 Like, over thousands of years.
01:24:54.000 Because that's the thing, like, we often tell these stories about, you know, pre-civilization humans.
01:25:00.000 It's like, oh, and then they migrated and made this great journey to Europe and so on.
01:25:05.000 And often that's like, they moved a mile every year.
01:25:08.000 Right.
01:25:08.000 So it's like, great journey is actually just this very gradual thing.
01:25:11.000 Yes, yes, very gradual.
01:25:12.000 And similarly, if you've got this grave site and it's got, wow, hundreds of woolly mammoths in this one place, that might be over thousands of years.
01:25:18.000 I mean, again, this is just something I... No, that's the thing.
01:25:20.000 They're talking about carbon dating, that it's all within the same time period.
01:25:24.000 You'd have to really go over his stuff with a fine-tooth comb and talk to him about it, because I'm not the right guy.
01:25:29.000 I just listen to him and go, whoa, and then try to relay it as much as possible.
01:25:34.000 There's a podcast that I actually retweeted today, because somebody brought it up on YouTube.
01:25:39.000 It's available, so I'll send you to it afterwards and see what you think about it.
01:25:43.000 But this is something, yeah, if you know the book Sapiens...
01:25:46.000 No, you're like the fifth person to talk about it.
01:25:49.000 I've got to get it.
01:25:49.000 Everyone talks about Sapiens.
01:25:50.000 Sapiens is like THE book.
01:25:53.000 Pull on up to that.
01:25:54.000 Pull that a little closer to you because it makes a big difference in the sound.
01:25:58.000 But yeah, one of the things that most blew my mind there was how much megafauna there was in the early days of Homo sapiens.
01:26:05.000 You know, moving across North America, there were two-ton sloths.
01:26:10.000 Huge giant sloths.
01:26:12.000 And these are one of just very, very many massive megafauna that we just don't have anymore.
01:26:17.000 Yeah, the blitzkrieg hypothesis is what they call the human animal killing off all of the other animals.
01:26:24.000 It's a really troubling hypothesis because we don't want to think that we're capable of doing that.
01:26:29.000 But obviously we do do that.
01:26:31.000 I mean, we're doing it right now.
01:26:32.000 We did it to the buffalo.
01:26:33.000 I mean, we almost brought the bison.
01:26:35.000 Did it to the dodo.
01:26:36.000 Yeah.
01:26:36.000 We're doing it just...
01:26:37.000 Tasmanian tiger.
01:26:38.000 There's a lot of different animals that within our lifetime have gone extinct.
01:26:41.000 I mean, in terms of extinctions, I'm not sure I'll get the number right, but it would be pretty accurate to describe this as a mass extinction, the fourth, or maybe it's not the fourth, because the number of species that have gone extinct as a result of human activity is just huge.
01:26:57.000 And it's also one of those things where we don't think of it as being significant because it happens slowly over the course of many years, but if you look at it on a timeline, you're like, oh my god, look, everything's dying right now.
01:27:08.000 Yeah, yeah, exactly.
01:27:09.000 So it's...
01:27:09.000 Slow by human standards, but very quick by geological standards.
01:27:14.000 It's a fascinating subject, the end of the Ice Age happening so quickly, the animals dying off so quickly, and so many large mammals dying off so quickly.
01:27:24.000 When you think about what we know people have done, like when we almost killed off the bison, we know why they did that, we know how they did that, and they did it with extraordinary weapons.
01:27:35.000 I mean, they did it with high-powered rifles.
01:27:37.000 They could shoot things from a far distance.
01:27:39.000 They did it by shooting off trains.
01:27:41.000 I mean, they did a lot of crazy shit back then.
01:27:43.000 So we understand, I mean, and there's a lot of physical evidence.
01:27:46.000 There's photographs of the actual piles of bones and all that crazy shit.
01:27:51.000 When you take away those physical capabilities, the extraordinary physical capabilities, like even riding on horseback, there's a guy named Dan Flores, a fascinating guy, a scholar, who believes that even without the Europeans coming over here and market hunting and killing off all the bison,
01:28:10.000 he thinks just the firearm and the horse with the Native Americans, it's entirely possible that they were going to eradicate the bison on their own.
01:28:19.000 I mean, again, it just depends about timescales.
01:28:23.000 So even if you're just killing like slightly more of the species, like killing just enough of the species that they're now below the, you know, two children for every two parents.
01:28:35.000 Right.
01:28:35.000 Viability stage.
01:28:37.000 Yeah, exactly.
01:28:37.000 Then just over sufficient time.
01:28:39.000 Yeah.
01:28:40.000 And remembering that Homo sapiens spent something like 190,000 years in the hunter-gatherer age.
01:28:46.000 It's very long time spans.
01:28:48.000 Again, very short geologically, but...
01:28:50.000 Yeah, very long time spans.
01:28:52.000 So again, you don't have to be killing that many woolly mammoths to drive them to extinction over the course of several thousand years.
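A quick sketch of that slow-overkill arithmetic in Python; the starting population and the harvest rate are assumed, illustrative numbers rather than figures from the conversation:

```python
# Slow-overkill arithmetic: a kill rate just 0.1% above replacement,
# applied to an assumed starting population, still guarantees extinction.
START_POPULATION = 1_000_000   # assumed herd size across a continent
ANNUAL_DECLINE = 0.001         # deaths exceed births by 0.1% per year

pop = float(START_POPULATION)
years = 0
while pop >= 1:
    pop *= 1 - ANNUAL_DECLINE
    years += 1

print(f"Functionally extinct after ~{years:,} years")  # roughly 13,800 years
# A harvest nobody at the time would even notice still ends the species
# within the multi-millennium window being discussed.
```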
01:29:00.000 What are your thoughts when it comes to the ethical reintroduction of animals that have gone extinct?
01:29:06.000 Like, there are some people in Russia that are currently...
01:29:10.000 Working on some form of a woolly mammoth.
01:29:13.000 They're going to take woolly mammoth DNA from some of these frozen bodies that they've gotten.
01:29:19.000 I mean, they've gotten some pretty intact woolly mammoths now, and they're going to try to clone one.
01:29:25.000 Yeah, so I don't know the details of how this will work.
01:29:29.000 I guess they have to gestate it in an elephant.
01:29:32.000 But I mean, I think it's like scientifically interesting.
01:29:35.000 I don't think there's anything wrong with it.
01:29:36.000 I don't think there's anything...
01:29:39.000 Like where you have woolly mammoths everywhere.
01:29:41.000 Yeah, I mean, I think...
01:29:42.000 I don't think there's any ethical imperative to do it.
01:29:46.000 I think there's not an imperative not...
01:29:51.000 Like, I would think just if there's more woolly mammoths, that's the same as there just being more elephants.
01:29:55.000 And it might be of scientific interest.
01:29:57.000 I heard...
01:29:59.000 While we're on, like, hypotheses that we've heard and gone, oh, that's cool, but that sound ridiculous...
01:30:06.000 Yeah, I heard the idea was reintroducing woolly mammoths to, like, stomp down snow in order to prevent...
01:30:12.000 Yes.
01:30:13.000 ...prevent...
01:30:13.000 Global warming.
01:30:15.000 Yeah, to slow it down somehow or another.
01:30:17.000 Yeah.
01:30:18.000 There's definitely things of the sort of thing that people say over dinner, but...
01:30:22.000 Yeah, well, the idea wasn't just stomp down snow, but also to eat the foliage.
01:30:27.000 Okay.
01:30:27.000 Yeah, there's like some exfoliating thing that they're doing where they would consume so many trees and so many plants that it would actually lower the temperature of the earth.
01:30:37.000 Like, what in the fuck?
01:30:41.000 Seems that you're skeptical of that.
01:30:42.000 But, I mean, there is this philosophical question of whether you should...
01:30:46.000 So, the question of biodiversity loss.
01:30:50.000 Which has been huge.
01:30:52.000 How do you value that?
01:30:53.000 So is it the case that loss of a species, you can just cash that out in terms of impacts on individuals?
01:31:01.000 Because obviously it's bad for the animals that die in the course of that, and we maybe have a loss of information that we can just not get back.
01:31:12.000 But is there something intrinsically bad about just having fewer species?
01:31:21.000 People act in a way that suggests they seem to believe yes, but it's hard.
01:31:27.000 I think it's hard philosophically to cash that out.
01:31:30.000 I think it's hard to explain why we would care so much about losing species when we don't seem to care about, you know, deliberately randomizing breeding and so on, so that we get more species.
01:31:44.000 It seems like we're only just conservative about not losing them.
01:31:48.000 But if it really is of value to have greater diversity of species, why do we not actively try and promote a greater amount of biodiversity rather than merely preventing loss of biodiversity?
01:31:58.000 I think the reintroduction of species, if you have an environment that's stable, if you have some sort of an ecosystem that's stable, and then you reintroduce a predator or prey or some animal that's going to eat up all the foliage,
01:32:14.000 you're running this big risk, and you're taking these big chances that you can sort of predict the future.
01:32:21.000 You could look at A plus B, well, that's going to equal C. But it doesn't always work that way, and there's been disastrous results when they've introduced species to other environments where they're not native.
01:32:33.000 You know what's going on with places like Australia?
01:32:36.000 Australia is kind of hilarious in that regard.
01:32:38.000 Yeah, so they introduced a type of frog to Australia.
01:32:42.000 I'm going to butcher this as well.
01:32:44.000 They introduced a type of frog to Australia.
01:32:45.000 It took over.
01:32:46.000 So they introduced rabbits to try and eat these frogs or something to eat the frogs.
01:32:51.000 And then they took over and didn't kill the frogs.
01:32:54.000 Well, then they introduced foxes to try to kill the rabbits, and they killed all the ground-nesting birds, and they introduced cats to kill the foxes, and cats to kill the rabbits.
01:33:05.000 Well, especially back then.
01:33:08.000 You know, when they were doing this in the 1800s in Australia, they really didn't know what the fuck they were doing.
01:33:12.000 They were thinking short-term, right in front of them.
01:33:15.000 They also brought in a bunch of animals that don't have natural predators, so they have to gun them down from the fucking sky.
01:33:20.000 I mean, they have all these deer and stags and all these majestic beasts.
01:33:25.000 I mean, have you ever seen a stag?
01:33:27.000 They're incredible.
01:33:28.000 They roar.
01:33:28.000 They sound like a lion.
01:33:29.000 And they have so many of them in Australia and particularly in New Zealand, but they don't have any natural predators.
01:33:34.000 Zero.
01:33:35.000 No predators.
01:33:36.000 So they have to fly over in helicopters and gun them down.
01:33:39.000 And they leave them.
01:33:41.000 They just leave them to rot.
01:33:42.000 They just have too many of them.
01:33:43.000 It's the same with kangaroos as well.
01:33:45.000 Have you seen those herds of kangaroos?
01:33:48.000 Have you ever seen that?
01:33:48.000 No, I haven't actually.
01:33:49.000 Oh my god, there's a video that some guy took somewhere in Australia, and it is thousands and thousands of kangaroos running across this field, and it looks like some apocalypse, some apocalyptic kangaroo invasion.
01:34:06.000 See if you can find that, Jamie, in a video, because it's worth seeing to realize, oh, this is what can happen when there's no predators.
01:34:15.000 Animals just get completely out of control.
01:34:17.000 Yeah, so I'm vegetarian and have been for a long time now.
01:34:21.000 But with some other vegetarian friends, we had the conversation of, yeah, what would be the most ethical meat to eat?
01:34:27.000 And I think we concluded that kangaroo would be the most ethical because it's being killed anyway, because they just need to, like, you've got this population explosion.
01:34:36.000 It's on land that wouldn't be otherwise used for anything.
01:34:40.000 They're roaming free.
01:34:41.000 They've got pretty good lives.
01:34:43.000 The environmental impact is therefore going to be low to non-existent as well.
01:34:50.000 Obviously, kangaroo meat is very unusual in almost all of those regards.
01:34:55.000 It's not, though.
01:34:55.000 Yeah, I mean, it's very nutritious, apparently.
01:34:58.000 Kangaroo is actually a type of deer, believe it or not.
01:35:01.000 Yeah, I don't believe that.
01:35:02.000 I thought it was a marsupial, which is a totally different...
01:35:04.000 It is, but it's related to the deer in some...
01:35:07.000 Look at these fuckers.
01:35:08.000 Just hanging out.
01:35:10.000 This is not the one I'm talking about, though.
01:35:12.000 There's a bunch of them running across a field.
01:35:14.000 This is just a large population of kangaroo.
01:35:17.000 Yeah, they're somewhere in the deer family in some strange way.
01:35:22.000 See if Jamie can find that, too.
01:35:23.000 Do you know that we have wallabies in Scotland?
01:35:25.000 Yeah, I know.
01:35:26.000 Yeah, on an island called Inchconnachan.
01:35:28.000 Yeah, I've heard of that.
01:35:28.000 I've visited them a number of times.
01:35:30.000 And were they introduced to Scotland?
01:35:33.000 Yeah, so Lady Arran Colquhoun had...
01:35:36.000 Oh, that bitch.
01:35:36.000 Yeah, well, no need for that.
01:35:40.000 Who is she?
01:35:41.000 So she, I actually don't know, but she owned the island.
01:35:44.000 She owned a zoo on the island, like a personal zoo.
01:35:49.000 And she died, I think.
01:35:54.000 The zoo went to rack and ruin, so it just kind of...
01:35:57.000 The wallabies just got out?
01:35:59.000 And the wallabies took over, yeah.
01:36:01.000 And the first evidence, because people wouldn't regularly visit this, was they would find these dead wallaby carcasses on the mainland.
01:36:07.000 And that was because during the winter the loch, Scottish for lake, would freeze over, and the wallabies would hop across the ice and then get hit by a car.
01:36:17.000 But they're now very tame.
01:36:18.000 It was a shame because I first found out about them back when it was still a bit of a secret.
01:36:24.000 That's fascinating.
01:36:25.000 Now it's become a bit of a tourist hotspot.
01:36:28.000 Wow.
01:36:29.000 It says that kangaroos are marsupials and more closely related to possums than deer.
01:36:35.000 Oh, okay.
01:36:35.000 So they're not related to deer, correct?
01:36:37.000 Yeah.
01:36:38.000 Someone had told me that they were in some way in the deer family, or cousins of deer, or something like that.
01:36:43.000 Early explorers said that they were like deer without antlers and that they stood upright like men. That's what their descriptions were. But I saw some, I mean, it's a Quora question, so I didn't find, like, an official scientist saying, here's the citation on it, but yeah.
01:36:59.000 Yeah.
01:36:59.000 I wish I had this my whole life.
01:37:02.000 Someone who could just follow me around and correct me every time I say something.
01:37:05.000 Well, this is an amazing time.
01:37:07.000 Somebody put something up on Instagram today and it was a quote from the 1800s saying that an ancient philosopher, or an ancient scholar rather, would have given his life for the information that's available to the common schoolboy today.
01:37:22.000 And this is a quote from 1888. Wow, okay.
01:37:27.000 Which is nothing now compared to what we can do.
01:37:30.000 Yeah, I think there's another statistic.
01:37:33.000 And again, it's unclear how do you measure this, but in terms of written information at least, one newspaper has more written information in it than a typical person in the 1700s would be exposed to for their entire lifetime.
01:37:47.000 I wonder what was the natural predator of kangaroos?
01:37:51.000 Because kangaroos, they're a native animal to Australia, and if they didn't...
01:37:56.000 Do you know, there was a giant predator in New Zealand, at least, at one point in time.
01:38:01.000 It was called the Haast's eagle, and it was an enormous eagle, the biggest eagle they think ever lived.
01:38:06.000 It had something like a 10-foot wingspan, and they believed they'd even hunted people.
01:38:11.000 A huge, huge eagle.
01:38:12.000 And it's a part of the...
01:38:14.000 I guess it's the Maori?
01:38:16.000 It's a part of their ancient mythology, and they found out that it was actually a real animal.
01:38:22.000 Somewhere around the 1400s it was made extinct through hunting.
01:38:27.000 My understanding was in Australia, before humans invaded...
01:38:32.000 Proconyles.
01:38:33.000 My understanding was that it was just no major predators for...
01:38:38.000 That's the Tasmanian tiger.
01:38:40.000 The thylacine?
01:38:42.000 Yeah, they call that thing the Tasmanian tiger.
01:38:44.000 That died during human, like, modern times.
01:38:48.000 That's a crazy looking picture.
01:38:50.000 Look at its face.
01:38:51.000 Look at that mouth on that thing.
01:38:52.000 Jesus Christ.
01:38:53.000 But that, I believe those things died off in the 1930s.
01:38:57.000 I just typed this in here.
01:38:58.000 Now it's extinct, but the dingo is probably the closest related predator they have.
01:39:03.000 When did it die?
01:39:04.000 Thylacine is now extinct.
01:39:05.000 However, humans arrived in Australia at least 50,000 years ago and introduced the dingo about 5,000 years ago.
01:39:11.000 Hmm.
01:39:12.000 So maybe those things were eating kangaroos.
01:39:14.000 A big part of preying on kangaroos, I guess, would probably be catching them when they're not with their young, but they carry their young inside their body in that pouch, which makes them different from any other kind of animal that would be prey, because they can take care of their young and bounce away quickly.
01:39:30.000 Well, this is why, so in terms of large mammals, humans killed every single type of large mammal other than kangaroos in Australia.
01:39:40.000 I think there were kind of hundreds of different types originally.
01:39:42.000 Oh, there's a bunch of different things other than kangaroos?
01:39:45.000 Yeah, yeah, yeah.
01:39:45.000 Like what?
01:39:47.000 Again, I don't know.
01:39:48.000 Maybe giant koalas, let's say.
01:39:51.000 But yeah, and my understanding was the reason for that was because they didn't have natural predators.
01:39:58.000 And so they just didn't know what to do with people.
01:40:00.000 Yeah, exactly.
01:40:01.000 Yeah, that makes sense.
01:40:02.000 You know, have all of these like defensive mechanisms.
01:40:05.000 Right.
01:40:05.000 And also have wolves and coyotes and bears and all these different things that are chasing them down.
01:40:11.000 That's interesting, the concept of what's the most ethical thing to eat.
01:40:14.000 I would think you would think it would be like mollusks.
01:40:18.000 Okay, so I do think it's totally fine to eat anything.
01:40:24.000 Well, what I say is I don't eat anything with a brain.
01:40:27.000 So that means that oysters, mussels, clams, they're okay.
01:40:33.000 So I got convinced.
01:40:35.000 I didn't used to be like this.
01:40:36.000 I got convinced by an advocate for what's called bivalve veganism.
01:40:41.000 I mean, it doesn't make a big difference.
01:40:42.000 I don't really like these things.
01:40:43.000 I eat them occasionally, but...
01:40:44.000 You don't like, like, mussels?
01:40:46.000 No.
01:40:47.000 Really?
01:40:47.000 Yeah, no.
01:40:48.000 Have you ever had linguine with mussels, like at a good Italian restaurant with a nice red sauce?
01:40:53.000 Yeah, I mean, so when they're good, they're fine.
01:40:56.000 And when they're bad, they're really bad.
01:40:58.000 Well, isn't that the case with everything?
01:41:01.000 No.
01:41:01.000 Some things, when they're good, they can be.
01:41:03.000 Gross hamburgers.
01:41:04.000 I mean, as you go down the line, you know, food can rot.
01:41:08.000 No, but, like, you know, good pizza is just amazing pizza.
01:41:11.000 Or, like, I feel like the very best mussels I'm, like, meh towards.
01:41:16.000 Really?
01:41:17.000 Yeah.
01:41:17.000 Oh, man, you need to go to a really good Italian restaurant.
01:41:19.000 Yeah.
01:41:20.000 Have you ever had linguine with clams?
01:41:22.000 Do you like clams?
01:41:23.000 Yeah, I think so.
01:41:24.000 Again, I just feel pretty indifferent about them.
01:41:26.000 Oh, you're crazy.
01:41:27.000 You just need to go to a really good restaurant.
01:41:29.000 You guys are eating in England, man.
01:41:30.000 That's the problem.
01:41:31.000 They don't know how to make Italian food there.
01:41:33.000 Yeah, that is true.
01:41:35.000 I mean, there's a few people right now that are screaming in England, I make good Italian food, you son of a bitch!
01:41:40.000 I'm generalizing and I'm aware I'm ignorant in saying that.
01:41:44.000 Look, I can't defend English cuisine.
01:41:46.000 Oh, there's some great...
01:41:47.000 Having been out to New York, San Francisco...
01:41:49.000 Well, London has some amazing restaurants now.
01:41:52.000 London does, yeah.
01:41:53.000 But it was always the generalized, stereotypical knock was that the food in England was terrible.
01:41:59.000 The first time I went there was pretty bad.
01:42:01.000 But yeah, with respect to what's the most ethical meat, I think it is a really interesting question because I think...
01:42:07.000 You know, the debate on vegetarianism and so on is normally phrased as this either-or thing, like not doing anything or just go vegetarian or vegan.
01:42:16.000 But I was interested in this question of just, yeah, well, supposing you only want to go halfway or of the different foodstuffs, like what are the ones that are going to do the most in terms of animal welfare if you cut them out?
01:42:29.000 Because most people, when they go halfway to being vegetarian, they might cut out red meat to cut beef and so on.
01:42:35.000 And I actually think that's, if you care at least about the animal welfare side of things, I think that's just wrong.
01:42:40.000 And I think there's two reasons for that.
01:42:42.000 One is respect to the amount of suffering that the animal has in the course of its life, where...
01:42:51.000 The way that chickens are currently treated, if you look at just average, and again, we're talking about most chickens, though.
01:42:58.000 You're talking about factory farming conditions?
01:43:00.000 Factory farming conditions, which is well over 90%, I think like 99% of chickens that are eaten are in these conditions.
01:43:07.000 Their lives just, I think they're the worst off creatures on the planet, basically.
01:43:12.000 And cows, I think, often don't have great lives, but it's just nothing really compared to chickens.
01:43:18.000 And I think pigs are similar, like pigs also have really terrible lives.
01:43:23.000 Whereas larger animals, cows, sheep, just in general aren't being treated as badly.
01:43:30.000 And then the second question is, how many animals are you affecting?
01:43:37.000 Where if you consume a steak or something, that's like a thousandth of a cow on average.
01:43:44.000 Whereas you can easily eat kind of half a chicken.
01:43:47.000 And that's a factor that people normally don't consider as well.
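A quick sketch of the per-meal arithmetic being described here. Both fractions are the speakers' rough estimates from the exchange above, not measured values:

```python
# Animals implicated per meal, using the rough figures quoted in the conversation.
cow_fraction_per_steak = 1 / 1000    # a steak is ~1/1000 of a cow (quoted estimate)
chicken_fraction_per_meal = 1 / 2    # a meal can be ~half a chicken (quoted estimate)

ratio = chicken_fraction_per_meal / cow_fraction_per_steak
print(f"A chicken meal implicates ~{ratio:.0f}x more animal-lives than a steak")
# -> ~500x, before even weighing in the worse conditions chickens are kept in.
```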
01:43:50.000 And obviously, maybe you value a cow's life greater than a chicken's life or something.
01:43:54.000 We do in some strange way.
01:43:55.000 There's a hierarchy that humans have almost inherently, or at least we do in the Western world.
01:44:01.000 Yeah, I think it's really hard to know.
01:44:03.000 Like, this is one of the hardest...
01:44:05.000 Philosophical questions I've thought about for ages and have eventually given up on: you've got an unhappy cow-day and an unhappy chicken-day. Which is worse?
01:44:13.000 How do you weight those two?
01:44:15.000 You can't.
01:44:15.000 Or an unhappy fish.
01:44:17.000 People have very few feelings about fish.
01:44:19.000 Yeah.
01:44:19.000 Like you see a dead fish, people don't feel the same way they feel if you see a dead lamb.
01:44:23.000 Yeah.
01:44:23.000 But in general, I've become more sympathetic.
01:44:26.000 I think there's a bias where, you know, we tend to sympathize more with things that look like us.
01:44:31.000 Fish have these weird, you know, kind of look alien.
01:44:33.000 They don't take care of their young, too.
01:44:34.000 That's a lot of differentiation.
01:44:36.000 Yeah.
01:44:37.000 And so over time, I've definitely become a lot more sympathetic to taking suffering of chickens and fish fairly seriously.
01:44:45.000 But I think when you combine these two factors, then fish, I think, although there's less good information on them, might be in that category.
01:44:55.000 But certainly chickens and pigs compared to beef.
01:44:58.000 I actually think if you just want to, like, take out most of the suffering from your diet, removing chickens, caged eggs, and in the US I think that's basically all eggs unless you keep hens yourself.
01:45:12.000 And pork, I think you're removing, and maybe fish, I think you're removing most of the suffering from your diet.
01:45:18.000 Vastly more than when it comes to beef or milk.
01:45:21.000 Yeah, well in terms of like the amount of individuals that get impacted, you're right.
01:45:25.000 And that one cow can feed much more people obviously than one chicken can.
01:45:29.000 So if you're taking one life in that form.
01:45:32.000 What disturbs me most about factory farming, well, one thing that disturbs me, is that it sort of existed and then I found out about it and it was already there.
01:45:39.000 And I had been eating it all along.
01:45:41.000 And that shocked me in that I was...
01:45:43.000 I remember sitting back, I'd watched some documentary on it, and I remember sitting back thinking, like, this happened because we weren't paying attention.
01:45:52.000 Because I was a grown man when I found out about it.
01:45:54.000 I hadn't been paying attention.
01:45:56.000 And when you leave people alone and you say, hey man, do you think you can get us some beef?
01:46:01.000 The guy's like, yeah, yeah, yeah, I got it.
01:46:03.000 Don't worry about it.
01:46:03.000 You just stay over there in your city.
01:46:05.000 I'll take care of it over here, out of sight, out of mind.
01:46:07.000 And then when we find out about it, and then you hear about, in America we have these things called ag-gag laws.
01:46:13.000 I'm sure you're aware of those.
01:46:15.000 Unbelievable.
01:46:16.000 Like, no possible justification for this.
01:46:19.000 Terrifying.
01:46:19.000 It's just because...
01:46:20.000 So, yeah, where it's...
01:46:22.000 They're hiding information.
01:46:23.000 Hiding information, yeah.
01:46:24.000 And there was a case where there was an animal welfare activist goes into a factory farm, is filming instances of animal cruelty for a kind of documentary film that gets presented.
01:46:36.000 And she got tried and had to go to prison for not intervening in the animal cruelty.
01:46:44.000 That was just happening all the time, and she was the person just actually...
01:46:47.000 So she got tried for not intervening, not stopping the animal cruelty?
01:46:51.000 Yeah, which is happening all of the time.
01:46:54.000 I thought she would get tried for violating the ag-gag laws.
01:46:58.000 No, well, she was...
01:46:59.000 Because it's an invasion of privacy on a corporation and corporate secrets.
01:47:02.000 Yeah, I think it was prior to the ag-gag laws.
01:47:04.000 Oh, so they found another way to try her, to discourage...
01:47:07.000 That's right, yeah.
01:47:09.000 That's so insane.
01:47:10.000 But the thing you said earlier, when you were talking about the ways in which humans are broken, I think if you just look at, yeah, suffering...
01:47:34.000 People would be outraged.
01:47:36.000 People would just think I'm this kind of despicable person.
01:47:39.000 And that's the natural reaction, because I'm just caught inflicting unnecessary suffering on this creature.
01:47:45.000 But then you can just modify the circumstances such that this natural emotional reaction of sympathy just fades away, where now it's this huge warehouse, and it's not just one chicken, it's hundreds of thousands of chickens, and it's all mechanized,
01:48:01.000 and it's all taken out of sight.
01:48:04.000 Suddenly, yeah.
01:48:06.000 I mean, Joseph Stalin said, yeah, a single death is a tragedy, a million deaths is a statistic.
01:48:11.000 I don't generally like to take life lessons from Stalin, but it's an extremely good quote.
01:48:18.000 But he was talking about humans.
01:48:20.000 And, you know, any death of a human will be tragedy.
01:48:24.000 And when they get to large numbers, it's sort of...
01:48:28.000 It's very difficult to calculate because it's hard for people to understand or grasp the concept of a million people dying in a war.
01:48:35.000 What's bizarre about factory farming is that it's all kind of done behind these warehouse walls.
01:48:46.000 It's all undercover and it's all incredibly common and it's all not discussed.
01:48:53.000 Like if a war is happening, I was going to say if a war is happening and 100,000 people a month are dying, we're discussing, you know, how do we mitigate this?
01:49:01.000 How do we stop this?
01:49:02.000 How do we bring peace?
01:49:04.000 There's so few people wondering how to stop chicken suffering.
01:49:09.000 Yeah, absolutely.
01:49:10.000 I mean, because we've looked into this, and one of the reasons it's such a priority area is just the amount of just philanthropic money going into this, when the focus is really on factory farming, not stray dogs and so on.
01:49:22.000 It's in the low tens of millions of dollars.
01:49:24.000 Of trying to stop factory farming?
01:49:26.000 Yeah, or trying to mitigate it.
01:49:28.000 What is the solution?
01:49:29.000 Like, other than going vegetarian, have we reached this point, sort of like an unmanageable point, where the population centers like Los Angeles, New York, whatever, that don't grow their own food have gotten so massive that in order to fuel these people with food, especially with animal protein,
01:49:46.000 you almost have to have these setups?
01:49:48.000 Yeah, I mean, I think if you've got the constraint of animal protein, I mean, I think the answer is probably still no, but the other thing is you just don't need that constraint of animal protein.
01:50:00.000 We eat radically more meat than we did, you know, 50 years ago, 100 years ago.
01:50:06.000 Far more than we need to have a healthy diet.
01:50:08.000 I mean, I've been vegetarian 11 years.
01:50:10.000 Do you eat eggs, though?
01:50:11.000 Free-range eggs?
01:50:12.000 Yeah, I do.
01:50:13.000 That's a, for me, I don't understand why people don't.
01:50:16.000 Like, when PETA had that whole campaign about eggs being chickens' periods, I'm like, look...
01:50:21.000 I can understand you not wanting to eat factory farm chickens' eggs because these animals are tortured and they're confined and it's horrific, but you can definitely find eggs.
01:50:30.000 And I have my own chickens.
01:50:32.000 I have 22 chickens.
01:50:33.000 And they lay eggs and I eat their eggs all the time and I ate five of them this morning.
01:50:37.000 They're great.
01:50:38.000 But when you're talking about those eggs, it's like...
01:50:43.000 There's no suffering.
01:50:44.000 The eggs come out.
01:50:46.000 They don't become a chicken.
01:50:47.000 You take them.
01:50:48.000 It's free.
01:50:49.000 And those chickens, by the way, they're a bunch of little murderers.
01:50:52.000 They run around my yard.
01:50:54.000 I've seen them eat a mouse before.
01:50:56.000 If they found a bird that was down, like a nesting bird that had fallen out of a nest, they'll fuck that bird up.
01:51:02.000 They eat anything that's on the ground.
01:51:04.000 The only thing they don't seem to like, they don't seem to like slugs.
01:51:08.000 Okay, you've tried to feed them slugs?
01:51:09.000 No, they eat them.
01:51:11.000 We pick up a rock in my garden.
01:51:13.000 I'll pick up a rock and the chickens come over and just jack anything that's under the rock.
01:51:17.000 They figured out that when I lift up the rock, there's bugs under there.
01:51:20.000 They're little murderers, man.
01:51:22.000 They're ruthless.
01:51:22.000 They don't like slugs.
01:51:23.000 They try them and then they start shaking their head.
01:51:27.000 They try to get the slime off their beak and they kind of freak out.
01:51:30.000 So yeah, I mean, within the animal welfare activist world there's actually this quite big divide between, you could call them, maybe the abolitionists on one side and the welfarists on the other.
01:51:41.000 And the abolitionists' view is just, you know, the way we treat animals is like how we treated slaves.
01:51:48.000 This is just, this is kind of the equivalent of slavery of our time.
01:51:52.000 And, you know, imagine if we'd been in...
01:51:56.000 Imagine if slave-owning Americans had said, like, hey, well, why don't we just cut down the number of slaves we hold?
01:52:01.000 It's just not treating it with enough moral seriousness.
01:52:04.000 The welfarists, in contrast, are more like, look, if we're going to quantify the suffering caused by the way humans treat animals now, 99% of it comes from factory farms.
01:52:16.000 If we could eliminate factory farms, sure, there's still something left.
01:52:20.000 Even if you agree we're not at the kind of final stage, this is where the vast majority of both the animals used and the worst conditions are.
01:52:28.000 And so the welfarists would instead say, look, let's really just focus all of our attention on this.
01:52:32.000 And things like free-range eggs or circuses or fur are just, these are really kind of not the main issue.
01:52:40.000 And, you know, I'm naturally most sympathetic to the kind of welfarist perspective.
01:52:47.000 But it is interesting.
01:52:49.000 I know lots of people who were in the welfarist camp and then moved to the abolitionist camp on welfarist grounds, where the kind of worry is just that if you're just trying to get people to do a little, then you're not actually going to move them at all.
01:53:02.000 Whereas you need to have this hard moral line, and then people kind of see the integrity of that and follow it.
01:53:07.000 Well, it seems to me that there was a slippery slope when agriculture and civilization were introduced, that someone was going to exploit it to the nth degree.
01:53:15.000 And figure, well, there's just got to be a better way to squeeze money out of this situation.
01:53:18.000 And then next thing you know, you've got these factory pig farms.
01:53:21.000 I'm sure you've seen the horrific one where they fly the drone over the lakes of pig piss and pig shit.
01:53:28.000 Absolutely.
01:53:29.000 And that these animals are living just completely confined where they can't even turn around.
01:53:33.000 And they're just pumping them up with whatever the fuck they need to keep them alive until they get to a certain point where they can kill them.
01:53:38.000 Yeah.
01:53:39.000 And it is true.
01:53:40.000 So many people would be absolutely...
01:53:43.000 If that was right there in front of them, they would be sickened.
01:53:45.000 Yeah, hence the ag-gag laws.
01:53:47.000 In order to keep that money coming in, they have to keep people in the dark of these situations.
01:53:51.000 And unless they go online and seek it out and watch these videos, and those videos are very polarizing too, because, you know, when you come to a lot of these animal rights organizations, a lot of them have roots in the Animal Liberation Organization, which doesn't even believe that you should have pets.
01:54:07.000 They think that your pets are all, you know, prisoners.
01:54:11.000 Yeah, it's so interesting, going back to Peter Singer, where he wrote Animal Liberation, which was the name of his book, and, you know, kind of a founding text for what became the animal rights movement.
01:54:22.000 And what's interesting is that Singer doesn't believe in rights.
01:54:25.000 He's a consequentialist.
01:54:26.000 He's a utilitarian.
01:54:27.000 He never used the word "rights" once.
01:54:29.000 So his approach would just be thinking, yeah, what's going to do the most good?
01:54:34.000 And on the pets question...
01:54:37.000 I don't want to speak for Peter, but he's going to think, well, if they have a good life and they're well treated, it just seems fine.
01:54:43.000 And again, he'd want to say, like, the focus should be on...
01:54:46.000 Suffering.
01:54:47.000 Yeah, on the vast magnitudes of suffering that go on the factory farms.
01:54:51.000 That's priority one, two, three, and four.
01:54:52.000 Yeah, I have a hard time even entertaining the conversation that there's something wrong with a healthy pet dog.
01:54:58.000 Like, that dog loves the owner, the people love the dog, and the dog has obviously gone through an incredible evolutionary process where it's gone from being a wolf to being a chihuahua.
01:55:09.000 Like, if you think that thing should be out fending for itself in the forest, boy, you're dooming that little fucker to death.
01:55:15.000 I mean, well, the question, the dog, in all of these cases, like, the animals wouldn't exist otherwise.
01:55:20.000 And they wouldn't exist without people.
01:55:22.000 I mean, if it wasn't for people breeding them and making them this bulldog, like this thing that can't even hardly breathe and walks with a waddle, like, we're weird that we've done that in the first place.
01:55:32.000 Yeah, I mean, I find, especially the pets, like dogs that have difficulty breathing, genetic diseases, I find it kind of gross.
01:55:40.000 It is gross.
01:55:41.000 That we've kind of done that.
01:55:42.000 I've got one.
01:55:43.000 What type of dog do you have?
01:55:44.000 It's a Shiba Inu English Bulldog mix.
01:55:46.000 Poor little fucker.
01:55:47.000 He's a mess.
01:55:48.000 We got him as a puppy, you know, because he was cute and, you know, he just seemed like he needed a home and we took him in.
01:55:54.000 God, he's all messed up, man.
01:55:56.000 I mean, I've had him for 10 years.
01:55:57.000 He's had all these surgeries and can't walk right.
01:56:00.000 His hips are all fucked up.
01:56:02.000 It's just, like, they breed him to the point where, well, he's half Shiba Inu, so he's actually better off than a lot of bulldogs.
01:56:07.000 Because he's 12 now.
01:56:09.000 I don't think Bulldogs usually live that long.
01:56:11.000 I don't think they live to that age.
01:56:13.000 But he's got all sorts of, like, difficulties.
01:56:16.000 He can't really run.
01:56:17.000 You know, he's lazy.
01:56:18.000 He just likes to lay down, snore.
01:56:20.000 But the poor little things, like if you look at an actual, like, legit English bulldog with their flat faces, like, they have massive respiratory problems.
01:56:30.000 Yeah, so I find the fact that we, like, engage in this process of breeding them kind of weird.
01:56:36.000 But then, like, yeah, if you're going to have a dog and look after it, well, like...
01:56:39.000 It's not the problem, right?
01:56:41.000 It's definitely, yeah.
01:56:42.000 And so there's this question of just, if you're talking about that, are you just, like, distracting from the main issue, which is...
01:56:47.000 Right.
01:56:48.000 Well, it also seems to me that this is just like everything else in life.
01:56:52.000 Like, as you go down the rabbit hole and you look at it deeper and deeper and deeper, you go, God, this is a complicated issue.
01:56:58.000 How do you get all these people to stop eating so much meat so that you don't need so much meat, so that you don't need factory farming and have to get people aware of what is the consequences of going and buying a chicken sandwich?
01:57:11.000 Well, do you know where that chicken came from?
01:57:13.000 Here, check this out.
01:57:14.000 Are you happy now?
01:57:15.000 And a lot of people, they watch those videos and then they go, ah, fuck it.
01:57:19.000 I'm hungry.
01:57:19.000 I want a chicken sandwich.
01:57:20.000 Yeah, lots of people do.
01:57:21.000 I do think, though, like, so in the UK, at least, if you buy a pack of cigarettes, you get these pictures on them showing, kind of, this is what your lungs will look like if you smoke 20 a day, and there's warnings and things.
01:57:32.000 Yeah, that doesn't stop people in some weird way.
01:57:34.000 I mean, people are addicted to cigarettes.
01:57:36.000 I think it must have some impact.
01:57:38.000 I don't think it does.
01:57:39.000 But I wonder if you could buy a pack of chicken and it would say, well, this is this field of piss and shit that this chicken grew up in.
01:57:46.000 Right.
01:57:47.000 Like the opposite of an ag-gag law.
01:57:49.000 Like force it in your face.
01:57:50.000 Yeah, because look, I mean, you're just giving the consumer more information.
01:57:55.000 How can that be bad?
01:57:56.000 Like if you went to the butcher shop, went to the butcher section of the grocery store, and there was videos that were playing constantly above the packaged meat that showed these animals getting like a piston through the head and hanging by their ankles and getting bled out while they bucked and kicked.
01:58:13.000 How many people... That would be a fucking conveyor belt of baby male chicks falling into this...
01:58:18.000 Yeah, getting ground up.
01:58:20.000 Yeah, that would be a fascinating psychological examination, to watch people walk up to that butcher shop and see those videos playing, if that became the law. And I mean, there's an amazing...
01:58:34.000 There's a comedy show, a sketch show, that did something kind of similar, which was, you know, they would go up to the butcher's counter and say, okay, I'd like some sausages.
01:58:44.000 They'd go, okay, pick up a little baby pig and put it into this box.
01:58:49.000 It's obviously fake.
01:58:50.000 And just, like, do this action and sausages would come out.
01:58:52.000 Obviously, they're not actually killing a pig.
01:58:54.000 Right, right, right.
01:58:55.000 And people would be outraged.
01:58:57.000 Don't do that.
01:58:58.000 Like...
01:58:59.000 And it's like, do you not know where pork comes from?
01:59:05.000 So yeah, that's the thing that's just amazing is how people can call themselves animal lovers.
01:59:10.000 Well, there's also people that love animals and eat meat.
01:59:14.000 Like they'll eat steak and then get mad at people for hunting animals.
01:59:17.000 I've experienced that personally.
01:59:20.000 This is a good case, though, of the salience issue where...
01:59:24.000 I mean, so I, like, oppose hunting.
01:59:27.000 I think it's bad for the animal that gets killed.
01:59:29.000 But the thing is, it's just so salient compared to factory farming.
01:59:34.000 And it's like, you know, would I prefer that people hunt meat rather than, like, factory farming?
01:59:38.000 It's like, of course.
01:59:39.000 And, like, you do the math, it's like, not only am I behind that, I'm behind that, like, a thousand times.
01:59:44.000 But again, the hunting is just, it's this very salient thing.
01:59:47.000 You know, in the UK, it's huge about fox hunting and so on. That's a different thing, because with fox hunting you're not eating it. No, I mean, it's supposed to be, yeah, it's kind of like vermin control. Yeah, and there's some logic to that: if you don't have natural predators, you need to figure out some way to control certain populations that can be damaging, like fox, or in some places black bear. There's a bunch of different animals that you do have to control because they don't have a natural predator.
02:00:15.000 Yeah, but the thing that's incredible for me is just how people can have such strong views on that and such strong views on hunting, and then just no reaction to factory farming.
02:00:26.000 They just don't see it.
02:00:28.000 It is just because we are very manipulable as humans in terms of our moral reactions.
02:00:34.000 That's really worrying because we can't go far with that.
02:00:37.000 Yeah, but there's certain animals that you have to control the populations of, especially invasive species, like pigs.
02:00:43.000 Like, wild pigs are a huge problem in America in getting bigger and bigger.
02:00:46.000 I know you guys don't have them as much in the UK, but in America, particularly in Texas, and now in Northern California, there's just massive, massive populations of wild pigs.
02:00:57.000 And they give birth two to three times a year.
02:01:00.000 And they can give birth to as many as three to six piglets.
02:01:05.000 And then six months later, those piglets are ready to give birth.
02:01:08.000 So they just boom, boom, boom, boom.
02:01:10.000 And if you don't control their populations, what are you going to do?
02:01:13.000 Are you going to let wolves loose to control their populations?
02:01:16.000 I mean, they have to figure out how to do it.
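A toy compounding model of the numbers Joe quotes: two to three litters a year, three to six piglets each, with piglets breeding at around six months. The parameters below are those quoted figures treated as assumptions; real populations are checked by food, disease, and hunting pressure:

```python
# Crude wild-pig growth sketch using the litter figures quoted above.
females = 1.0                # start from one breeding sow
litters_per_year = 2         # low end of the quoted 2-3 litters per year
piglets_per_litter = 4       # roughly the middle of the quoted 3-6
female_fraction = 0.5        # assume half of piglets are female

for year in range(1, 6):
    # females added this year start breeding in the next iteration (~6-month maturity)
    new_females = females * litters_per_year * piglets_per_litter * female_fraction
    females += new_females
    print(f"year {year}: ~{females:.0f} breeding females")
# Even this conservative version multiplies the population ~5x per year,
# which is the "boom, boom, boom" Joe is describing.
```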
02:01:27.000 Mm-hmm.
02:01:32.000 They do wind up donating the meat of those pigs to homeless shelters and people who need it.
02:01:38.000 It's actually very nutritious and very healthy and very good for you.
02:01:42.000 And that's probably way better than buying pig from someone who's raised it in some horrific factory farming environment.
02:01:50.000 For people that just want the animals to live and be unchallenged and, you know, unpreyed upon, I get it.
02:01:56.000 It all seems very disturbing.
02:01:58.000 But you've got to control the populations because you're not going to have any agriculture.
02:02:02.000 I mean, they're going to find out where the farms are and they tear them apart at night.
02:02:06.000 They're nocturnal animals.
02:02:07.000 You can't stop them with fences.
02:02:09.000 They go right through fences.
02:02:10.000 They're huge, huge animals.
02:02:12.000 Wild pigs create millions of dollars of damage in Riverside County.
02:02:15.000 Wow!
02:02:16.000 Yeah, I didn't know about this at all.
02:02:18.000 Riverside County is super populated, but this is an enormous, enormous problem in this country.
02:02:24.000 And by the way, when you look at that animal, what's really cool about pigs is that they morph.
02:02:29.000 When you see that animal, it looks very different than a domestic pig, but it's the exact same animal.
02:02:34.000 They're all the same genus.
02:02:36.000 It's called Sus scrofa.
02:02:38.000 And when you take a domestic pig and you let it go, within months, within months of being free, their hair starts to change, their snout starts to elongate, their tusks start to grow longer.
02:02:51.000 Once they become feral, once they realize they have to fend for themselves, there's an actual physiological change in the structure of their body.
02:02:59.000 So interesting.
02:03:00.000 It's fascinating.
02:03:01.000 It's really crazy.
02:03:03.000 Their hair gets thicker.
02:03:04.000 They develop a thicker plate, the males do, around the chest to protect themselves from other males when they fight.
02:03:11.000 It's bizarre.
02:03:12.000 So those wild pigs that people see, there's a bunch of different kinds.
02:03:15.000 Some of them are Russian boars.
02:03:17.000 They're wild, you know, different kind of pig.
02:03:21.000 But ultimately, they all interbreed with each other.
02:03:23.000 Yeah, yeah.
02:03:24.000 This is so interesting as well, coming back to the question of what's natural and not and so on.
02:03:29.000 And people often think this about meat they're eating as well, where if you look at the, you know, chickens can barely stand because they've been so engineered to have these huge breasts.
02:03:39.000 The pigs that you're talking about are not meant to be pink, they're meant to be brown.
02:03:45.000 Cows, can you really imagine a cow evolving in the wild?
02:03:50.000 Of course not.
02:03:51.000 All of these things are incredibly unnatural through thousands of years of selective breeding.
02:03:57.000 Well, cows don't live in the wild, but here's where it gets interesting.
02:04:01.000 In Australia, when cows have gotten wild, they've gotten loose from these pens that people held them in, and then they become what they call scrub bulls, and they're out there in the wild, and people hunt them like they would hunt a wild animal, and they're very wary,
02:04:17.000 and they run from people, they see people, they get the fuck out of there, and the bulls are incredibly violent.
02:04:23.000 Like, the male cows, these scrub bulls, are some of the most dangerous things to hunt in the world.
02:04:29.000 Because they'll actively chase you down.
02:04:32.000 Like a bull.
02:04:32.000 Like, you know, if you see, like, people trying to ride bulls, how bulls kick and, you know, they go crazy.
02:04:37.000 Well, these scrub bulls are essentially those bulls, but many, many, many generations wild.
02:04:42.000 So they're feral bulls.
02:04:44.000 Man.
02:04:45.000 Yeah, so they sort of were bred to be this domestic thing, and then they got loose, and then they became this wild thing.
02:04:55.000 And so they look slightly different, like that's what they look like.
02:04:58.000 That's a scrub bull.
02:04:59.000 So they're becoming slowly, over the course of many generations, a more wild animal.
02:05:06.000 So they have these hunts for these scrub bulls.
02:05:08.000 And if that thing sees you, by the way, that crazy looking bull, they will fuck you up.
02:05:15.000 They're some of the most dangerous animals that you can encounter in the wild, apparently.
02:05:20.000 But I have a buddy, my friend Adam.
02:05:22.000 Yeah, they look different.
02:05:23.000 Like, look at the hump on its back.
02:05:25.000 I mean, that looks like some crazy wild African animal.
02:05:28.000 And it was originally, a long time ago, a regular domestic cow.
02:05:33.000 But yeah, so it shows just how artificial they are.
02:05:37.000 If that's the sort of changes you get over just the course of a few generations.
02:05:41.000 Natural selection, as opposed to what we're doing with dogs, you know, when we create a bulldog.
02:05:46.000 I mean, that is, those are the animals that have survived, and they've changed their coloration, their physical structure looks different, you know, over many, many generations.
02:05:55.000 It's quite, quite fascinating.
02:05:58.000 It's like we have to figure out where we stand, I think, in terms of the entire ecosystem, because we're certainly not viable.
02:06:06.000 We can't go out there and live amongst those animals.
02:06:09.000 I mean, we won't.
02:06:10.000 We'll get killed.
02:06:10.000 We'll get eaten.
02:06:11.000 So we have to stay inside of our homes.
02:06:14.000 We have to stay inside of our environments.
02:06:16.000 And then we have to figure out, like, how much of an impact should we have on those things around us?
02:06:20.000 Should we be like all the other animals, like the wolves and all these other animals, the coyotes that have this impact on the environment?
02:06:28.000 Or should we try to lessen our footprint to the point where we have zero impact on any of the animals and we just live inside of these sustained areas that grow vegetation?
02:06:37.000 It's an interesting question because those animals prey on each other.
02:06:41.000 They all do.
02:06:42.000 And should we be a part of that?
02:06:45.000 Should we take part in that?
02:06:47.000 I definitely don't think we should factory farm.
02:06:49.000 I definitely think that that was a huge mistake.
02:06:52.000 And I also definitely think that that huge mistake is what led us to be able to have these gigantic cities.
02:06:57.000 And I don't think necessarily cities are a huge mistake, but man, trying to figure out how to feed those people...
02:07:04.000 In the way that they're accustomed to eating right now, that's a massive battle.
02:07:10.000 Yeah, but I think the kind of question of large populations and how you feed them, that massively tells in favor of lower meat consumption or vegetarianism.
02:07:18.000 Sure.
02:07:18.000 Because you've got this 10 to 1 rule where to create a calorie of meat, you need 10 calories or more of grain or soy or whatever you're feeding them.
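A minimal sketch of that feed-conversion arithmetic. The 10:1 ratio is the round figure quoted above, and the 500-calorie daily meat intake is an illustration value, not a sourced number; real conversion ratios vary a lot by species:

```python
# Feed calories needed to supply a given amount of meat calories, per the 10-to-1 rule.
feed_ratio = 10                   # calories of grain/soy per calorie of meat (quoted)
meat_kcal_per_person_day = 500    # hypothetical daily meat intake, for illustration

feed_kcal = feed_ratio * meat_kcal_per_person_day
print(f"{feed_kcal} feed calories per person per day")   # 5000
# Swapping those 500 meat calories for plant calories would free up
# roughly 4500 crop calories per person per day.
```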
02:07:28.000 Unless you're dealing with people just consuming wild pigs.
02:07:32.000 Yeah, wild pigs or kangaroos or something.
02:07:34.000 There are exceptions to that as well.
02:07:36.000 But I don't think there's enough to feed people, though.
02:07:38.000 That's the other thing.
02:07:39.000 There's 350 million people in this country.
02:07:41.000 There's not 350 million wild pigs.
02:07:43.000 Yeah, exactly.
02:07:45.000 But it means that in the future, just as populations get larger, then...
02:07:50.000 Yeah, again, we're just going to need to use land and energy more efficiently.
02:07:55.000 So this is yet another argument in favor of plant-based diets.
02:08:00.000 Yeah.
02:08:01.000 Well, in America, at least, the majority, the vast majority of the money that goes towards conservation, towards keeping wild animal populations high, is actually from hunting.
02:08:11.000 Hmm.
02:08:12.000 It's a real strange contradiction that makes people really uncomfortable once they find it out, is that the vast majority of the money that goes to protect habitat, to preserve wild lands, it comes from hunting.
02:08:25.000 In fact, hunters voluntarily agreed, I believe it was in the 1930s, to give up 10%. I want to make sure that number's right, too.
02:08:34.000 But the money, in terms of the percentage of sales of hunting equipment, goes directly towards conservation.
02:08:43.000 Interesting.
02:08:44.000 Yeah.
02:08:44.000 There's all these different entities, like the Rocky Mountain Elk Foundation, that have repopulated elk into all these areas, but done so specifically so that people can hunt them.
02:08:54.000 So it gets really weird.
02:08:56.000 Yeah, yeah.
02:08:56.000 It might be an uneasy alliance between...
02:08:59.000 In many people's eyes, but they're the ones that are giving up the money.
02:09:02.000 The money is not coming from altruistic organizations that just want to preserve these animals so that they can exist in the free, wild way that they did before people got here.
02:09:11.000 But there's more white-tailed deer in America today than when Columbus landed.
02:09:16.000 And that's all because of conservation, because of hunting.
02:09:20.000 So it's another one of those weird things where when you look at the whole picture...
02:09:25.000 Yeah, there's a solution that I've heard suggested for reducing species loss, which is to allow basically ownership of species.
02:09:37.000 So you could copyright the panda.
02:09:39.000 Oh, yeah.
02:09:41.000 Now, isn't this a weird idea?
02:09:42.000 But the idea is that now, suddenly, like at the moment, no one has a financial incentive to ensure that pandas don't go extinct.
02:09:50.000 Whereas if someone were able to say, no, if you want to use a panda in video or so on, you have to pay the owner.
02:09:59.000 Yeah.
02:10:00.000 Well, I'm very uncomfortable with the idea of this laboratory-created meat.
02:10:05.000 As to where that's going to go.
02:10:07.000 Why are you uncomfortable?
02:10:08.000 I'm very, well, positive about it.
02:10:10.000 I mean, the science is kind of tricky, but...
02:10:12.000 I'm positive about it in that it's not going to be any animal suffering.
02:10:15.000 It's going to be fascinating in that regard.
02:10:17.000 But what's going to happen if we find out that...
02:10:22.000 Well, there's a bunch of different things, right?
02:10:23.000 First of all, we have to make sure it's healthy.
02:10:25.000 We have to make sure that it doesn't cause some sort of a weird disease, because you're not eating something that's living and moving, and when you eat sedentary creatures maybe there's some sort of an adverse impact on our biology. Because I think there's an adverse impact when you eat protein from an animal that is, like, weak and sick. And they've actually shown, there was a study that Dr. Rhonda Patrick sent me recently that showed that animals that eat older,
02:10:53.000 sick animals die quicker.
02:10:56.000 They have a shorter lifespan and exhibit worse health characteristics, I believe it was, than animals that ate younger animals.
02:11:06.000 And there seems to be some sort of a direct correlation between eating younger healthy things and having a positive healthy impact on physical life itself, the animal that's consuming it.
02:11:18.000 And that if you're eating something that never existed in the first place, like...
02:11:23.000 Unless they're able to recreate the characteristics of a healthy animal, like a strong muscle tissue, like maybe they could do that with electrical impulses, like some sort of electrical muscular stimulation.
02:11:34.000 Yeah, I don't see why that would be a problem.
02:11:36.000 I mean, at least you'd think you'd be able to get past that, in fact, where, you know, the meat that we've currently got, stuffed full of antibiotics, you know, there's often viruses that...
02:11:50.000 Sure.
02:11:51.000 Yeah, viruses that kind of arise.
02:11:52.000 Swine flu, avian flu.
02:11:53.000 Avian flu, exactly.
02:11:54.000 You could avoid all of them.
02:11:56.000 All that stuff comes from factory farming.
02:11:58.000 And in fact, yeah.
02:11:59.000 And in fact, then, once you start to engineer meat, perhaps you could engineer exactly the healthiest sort of meat.
02:12:05.000 You've got so much more control over the product.
02:12:09.000 That would be crazy.
02:12:10.000 And so, of course, you've got to, yeah, with the development of any new technology, you've got to be cautious about it.
02:12:17.000 But ultimately...
02:12:19.000 It seems like we should be able to get to the point where we have tastier, cheaper, more healthy meat that has far less carbon dioxide as a side effect, uses far less land area.
02:12:34.000 It's going to be better in every single way.
02:12:37.000 I think the science, it does seem hard, in particular, just to get the costs down low enough.
02:12:42.000 Well, I think they've got it down pretty low.
02:12:44.000 I mean, there was a recent article about it where they were talking about the original one was worth hundreds of thousands of dollars, and now they've got it down to like 20 bucks.
02:12:50.000 Yeah, I think that was misleading, actually.
02:12:53.000 Was it?
02:12:53.000 Yeah, it's a shame.
02:12:54.000 Like, it seemed to me that there's been a little bit too much hype around in vitro meat, where, yeah, there's some stories of, like, the costs radically going down.
02:13:04.000 Right.
02:13:05.000 Whereas it's definitely much lower than that.
02:13:07.000 I think it's definitely still decades away.
02:13:10.000 Decades?
02:13:11.000 Yeah, I think so.
02:13:11.000 Really?
02:13:12.000 Yeah.
02:13:12.000 What makes you think decades?
02:13:14.000 So the argument is that currently the...
02:13:16.000 So it depends on what you're talking about.
02:13:19.000 Like egg white, I think, is pretty easy comparatively.
02:13:23.000 Milk is comparatively easy, but structured meat.
02:13:26.000 So, you know, steak or chicken, it has a structure.
02:13:31.000 That's, I think, very difficult.
02:13:33.000 And I think...
02:13:34.000 Apparently part of what the difficulty is, there's a certain solution that you need to grow this meat in, and that solution is currently very expensive.
02:13:43.000 And the key part of the cost, even once we get to the point of being able to develop this, is getting the cost down low enough so it's competitive. You're going to need to take this fluid that currently costs, I don't know how much, but like a thousand dollars a litre, and get it down to the cost of soda.
02:13:59.000 And we don't currently, it seems, have like a clear kind of scientific path towards that.
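For scale, here is the cost gap being described, taking the thousand-dollars-a-litre guess at face value and assuming a soda-like one-dollar-a-litre target:

```python
# Rough size of the required cost reduction for the growth medium.
current_cost_per_litre = 1000.0   # the figure guessed in the conversation, not a verified price
target_cost_per_litre = 1.0       # "cost of soda", roughly

reduction_factor = current_cost_per_litre / target_cost_per_litre
print(f"~{reduction_factor:,.0f}x cost reduction needed")   # ~1,000x
# Three orders of magnitude with no clear scientific path is what
# makes the "decades away" estimate plausible.
```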
02:14:04.000 It would be the ultimate conundrum.
02:14:06.000 If they found out that the only way to make that fluid and to make it financially viable was to make it out of ground-up pets that get killed anyway, euthanized pets.
02:14:17.000 Like, would people be upset if they took euthanized pets and they used it to make the fluid to grow the artificial meat in?
02:14:22.000 Or would they prefer those euthanized pets just be cremated?
02:14:26.000 So, at the moment, that fluid does have to come from animals.
02:14:30.000 There's a certain part of it that is animal-based.
02:14:32.000 I was just guessing.
02:14:34.000 So, it's not exactly ground up pets.
02:14:38.000 Puppy brains.
02:14:39.000 Only puppy brains.
02:14:40.000 As I understand, it's still currently not vegan.
02:14:43.000 But it's interesting.
02:14:44.000 I think it's going to change.
02:14:45.000 I mean, I do think given the level of just moral cognitive dissonance that's currently going on between people's attitudes to animals, pets, any animal they can see, and consumption of meat, once you take self-interest out of the equation, once you've got meat that is cheaper and just as tasty,
02:15:03.000 I think just everyone's going to switch.
02:15:04.000 And then within a generation, people will look back at the current generation and just think, how did anyone ever engage in such...
02:15:12.000 abominable activity as factory-farmed meat.
02:15:16.000 Yeah, well, it's probably one of the darkest things that we as a civilized humanity do.
02:15:23.000 When you think about, other than war, which is obviously the most horrific thing, or one of the most horrific things, I mean, it's arguable that in terms of suffering, it's the next thing.
02:15:35.000 Because, I mean, it has to be, it is the next thing, right?
02:15:38.000 Other than poisoning people for profit, you know, other than companies that have polluted environments that have wound up poisoning people.
02:15:45.000 Yeah, so in terms of animals, so 50 billion animals are killed every year for human consumption.
02:15:50.000 Worldwide.
02:15:50.000 Worldwide.
02:15:51.000 Most of them have kind of short lives, so...
02:15:56.000 No, broiler hens have six-week lives.
02:15:59.000 That's crazy.
02:16:00.000 Six weeks.
02:16:01.000 So from the time they're incubated to the time they're in an oven.
02:16:05.000 Six weeks.
02:16:06.000 That's nuts.
02:16:07.000 That's the point at which they die.
02:16:09.000 That's the highlight of their life, in my view.
02:16:11.000 Because their life is just filled with suffering.
02:16:16.000 So that means at any one time, there's seven billion animals in factory farms right now.
02:16:20.000 Living, basically being tortured for the entirety of their short lives.
02:16:23.000 So the entire population of the human race.
02:16:26.000 It's basically one-to-one, yeah.
02:16:27.000 At any one time.
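The seven-billion figure is roughly consistent with the other numbers in this exchange. Treated as Little's law (standing population equals throughput times average lifetime), with the 50-billion-a-year and six-week figures quoted above as assumptions:

```python
# Back-of-the-envelope check of the "seven billion at any one time" claim.
slaughtered_per_year = 50e9    # animals killed for food each year (quoted)
avg_life_weeks = 6             # broiler-hen lifespan (quoted)

standing_population = slaughtered_per_year * (avg_life_weeks / 52)
print(f"~{standing_population / 1e9:.1f} billion alive at any one time")
# -> ~5.8 billion if everything lived six weeks; longer-lived species push
#    the true figure higher, so "about one-to-one with humans" is plausible.
```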
02:16:29.000 Isn't it nuts that that's less than 100 years old?
02:16:32.000 Yeah, much less than that.
02:16:33.000 Less than 50 years old, really.
02:16:37.000 Who's the first crazy asshole that jammed those chickens into those little cages?
02:16:41.000 Henry Field, I think?
02:16:43.000 That's the guy?
02:16:44.000 Is that his name?
02:16:44.000 So he, fascinatingly, was part of the rise of the free marketeers.
02:16:49.000 So back in the 50s, I'm going to go on a digression, but it's not as bad as Infinity, I promise you.
02:16:53.000 That's all right.
02:16:54.000 They're all awesome.
02:16:56.000 So back in the 50s, free market economics was just completely dead.
02:17:01.000 It was just not a mainstream idea at all within academic economics.
02:17:06.000 But it really rose to prominence across the end of the 60s, certainly the 70s, and then Thatcher and Reagan getting in power.
02:17:13.000 Huge uptake in this intellectual movement.
02:17:16.000 And so the question is kind of where did it come from?
02:17:20.000 And it was actually very significantly driven by a small number of people in the 50s and early 60s, like very deliberately saying, okay, we want this ideology to become really dominant.
02:17:32.000 And one of the most important first organizations was the Institute for Economic Affairs based in London, a think tank.
02:17:39.000 And it was funded by the person who brought factory farmed chicken to Britain.
02:17:45.000 So it's weird because I promote this idea of earning to give as something that young people should consider.
02:17:54.000 Not as the only thing they should do, but as one of many things they should do, or should consider doing.
02:18:00.000 If you want to do good, you could go and, like, directly have an impact.
02:18:03.000 But there is also another option, which is doing something you're really passionate about that maybe has less of a direct impact, earn a lot, and donate what you're making, at least a significant part of that, to the things you think are most important.
02:18:17.000 And then I think of this, I think it's Antony Fisher, sorry.
02:18:20.000 This is like the most perverse instantiation of that, where the guy went and became a factory farming entrepreneur, basically on earning-to-give grounds.
02:18:30.000 Jesus Christ.
02:18:31.000 Isn't that just so indicative of how humans are so contradictory?
02:18:37.000 We're so complex and so strange.
02:18:39.000 Mm-hmm.
02:18:40.000 In that we will find all these justifications for all these bizarre behaviors, and then we're never, like, totally pure.
02:18:48.000 Like, there's so many people that are so...
02:18:51.000 This is the terrible example, but it's the one I use all the time.
02:18:54.000 Bill Cosby made so many people laugh, and he raped about a hundred, or whatever, allegedly.
02:19:00.000 You know?
02:19:00.000 Like, he was...
02:19:02.000 Helping and putting out so much love to so many people, and then being fucking evil to a bunch of people that he drugged. It's like, yeah, this duality exists. Think about Nazi Germany, think about the number of people who were involved in the Holocaust who loved their children, and if you had talked to them you would have had, like, a great conversation.
02:19:24.000 They would have been very caring and so on. I mean, yeah, it's a very powerful idea, the banality of evil, Hannah Arendt's phrase, where, yeah, the worst crimes committed are not because people are bad.
02:19:42.000 It's because...
02:19:44.000 Not bad or evil in the way that you think of a James Bond villain, like this person's plotting something.
02:19:50.000 It's just because they have some goal that is...
02:19:55.000 Some goal on which they are indifferent to suffering, and they cause that as a side effect.
02:19:59.000 And so it's the same.
02:20:00.000 If you ask people, do you want animals to suffer horrifically in factory farms?
02:20:04.000 They'll say, no, of course not.
02:20:05.000 It's just that I don't care.
02:20:07.000 Casualties of war.
02:20:08.000 Exactly.
02:20:09.000 And casualties of civilization.
02:20:10.000 Yeah.
02:20:11.000 And the same insight, actually, when we talk about AI as well, is...
02:20:15.000 You know, sometimes in the media, people say, oh, the worry about AI is a Terminator that's going to want to kill humans.
02:20:20.000 But that's not the worry at all, the idea.
02:20:22.000 Or when you think about Homo sapiens and Neanderthals, again, it's just having some other entity that has goals on which you're just not very important.
02:20:31.000 Right.
02:20:32.000 And that means that, yeah.
02:20:34.000 Well, and they're also going to judge us.
02:20:36.000 I mean, if they are intelligent and they are superior to us, they're going to judge us based on the entire whole of our behavior.
02:20:42.000 And they're going to go, look at this messy species.
02:20:45.000 This fucking species is crazy.
02:20:48.000 Elon Musk has my most terrifying quote.
02:20:52.000 His quote is the most terrifying to me, that he thinks that with AI we are summoning the demon.
02:20:58.000 Summoning the demon, yeah.
02:20:59.000 I love that quote.
02:21:00.000 It's just like...
02:21:01.000 I mean, I worry...
02:21:02.000 Yeah.
02:21:02.000 Like, I think a lot of the media attention around AI is, like, has been really unfortunate because it suggests, like, it's coming next year and it's going to control its...
02:21:11.000 Like, the demon, I think, anthropomorphizes it more than is necessary and so on.
02:21:15.000 Sort of.
02:21:16.000 I think...
02:21:16.000 But of its ultimate goal is the extinction of the human race.
02:21:19.000 That's very demonic, as far as we're concerned.
02:21:21.000 Yeah.
02:21:21.000 I mean, it's more...
02:21:24.000 Indifferent?
02:21:25.000 Yeah, indifferent.
02:21:26.000 Sort of the way we think about mollusks?
02:21:28.000 Yeah.
02:21:28.000 Yeah, exactly.
02:21:29.000 Or the way we think of, like...
02:21:31.000 You know, mosquitoes.
02:21:32.000 Yeah.
02:21:33.000 Mosquitoes are my favorite because vegans will slap mosquitoes.
02:21:36.000 Oh, yeah.
02:21:37.000 I mean, are mosquitoes sentient or not?
02:21:40.000 They're alive.
02:21:42.000 Yeah, but I think clams and mollusks aren't sentient.
02:21:45.000 Then insects, I'm like...
02:21:46.000 Well, there's some weird arguments about that then.
02:21:49.000 I mean, why not eat crickets?
02:21:50.000 Because cricket protein is excellent.
02:21:52.000 I've had cricket bars before.
02:21:53.000 They're covered in chocolate.
02:21:55.000 They taste really good.
02:21:55.000 They're high protein.
02:21:57.000 Yeah.
02:21:57.000 I mean, many...
02:21:59.000 I do know many people who do advocate for that.
02:22:02.000 My view is just like, if you're unsure, then play safe.
02:22:05.000 And you'd be eating a lot of crickets.
02:22:07.000 Yeah, but there's a lot of crickets out there to eat.
02:22:11.000 Well, if you're growing them, like, you'll be hunting crickets with a tiny little spear.
02:22:16.000 I don't think that's how you do it.
02:22:18.000 You're a lot more brutal than that.
02:22:20.000 I mean, I think factory farming for crickets would be a horrific institution.
02:22:23.000 You know, and you just, what would you do?
02:22:26.000 Just fucking swarms of them and smash them down to a protein bar.
02:22:31.000 What I worry about is, what's the current percentage of all the species that have ever existed that are now extinct?
02:22:38.000 It's fucking huge.
02:22:39.000 It's like 99.99% or something.
02:22:41.000 Why not us?
02:22:42.000 Why not us?
02:22:43.000 And if we do give birth to artificial intelligence, if we are the caterpillar that gives birth to the butterfly that winds up taking over the world, some artificial butterfly.
02:22:53.000 Yeah, I mean, I think the thing that worries me is that, you know, it's...
02:22:58.000 AI is kind of its own thing.
02:23:01.000 And I think, you know, we do...
02:23:04.000 Because it's, like, potentially extremely beneficial as well.
02:23:06.000 Right.
02:23:08.000 Even if supposing it goes well, then it's a huge thing.
02:23:12.000 We should care about it whether or not we're worried about the extinction risk.
02:23:16.000 One of the rare cases, I think, where we can really see into the future and think, yes, this is going to be a transformative technology.
02:23:23.000 We don't know when it's going to happen, but when it does, it's going to be transformative.
02:23:27.000 And it's going to be very powerful.
02:23:29.000 And that means we should have some kind of careful thought about it.
02:23:32.000 But it seems to me there's a variety of ways that the human race could kill itself now.
02:23:37.000 So novel pathogens being one example.
02:23:41.000 Large Hadron Collider.
02:23:43.000 I mean, so my colleague, Toby, actually wrote a paper on the Large Hadron Collider because there was all this, you know, talk about, oh, we could create black holes and so on.
02:23:55.000 And so he wrote an academic paper where he just talked about the risk analysis that they did.
02:24:01.000 And they said, oh, the chance of the Large Hadron Collider creating a black hole or something else that was, like, really dangerous is 10 to the power negative 63. Yeah.
02:24:10.000 You know what that's not?
02:24:11.000 Go on.
02:24:12.000 Zero.
02:24:12.000 It's not zero.
02:24:13.000 Firstly, it's not zero.
02:24:14.000 Those motherfuckers.
02:24:16.000 The odds were really long.
02:24:19.000 We didn't know.
02:24:20.000 But the second thing also is that you shouldn't think that anything's 10 to the negative 63, really, unless you have very, very strong models behind it.
02:24:29.000 Because what's the chance that you just made some mistake in your calculation?
02:24:33.000 It's like, you know, maybe it's as low as one in a million.
02:24:35.000 But that mistake completely swamps the probability.
02:24:38.000 And so that was the point that he was making.
02:24:40.000 Just statistical point saying, look, I'm not commenting on whether this is dangerous or not.
02:24:46.000 It's just that you've made a mistake in your methodology.
02:24:49.000 With respect to your risk assessment.
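A minimal sketch of the statistical point, with invented numbers (the 10^-63 figure is from the conversation; the one-in-a-million mistake probability and the conditional risk are assumptions purely for illustration): once you admit any chance the analysis itself is flawed, that term swamps the headline figure.

```python
# Sketch of the risk-swamping argument, with illustrative numbers.
headline_risk = 1e-63          # the stated probability: 10 to the power negative 63
p_mistake = 1e-6               # assumed: one-in-a-million chance the analysis is flawed
p_bad_given_mistake = 1e-3     # assumed: small residual risk if the analysis is flawed

# The effective risk is dominated by the possibility of a flawed analysis,
# not by the headline figure itself.
total_risk = (1 - p_mistake) * headline_risk + p_mistake * p_bad_given_mistake

print(f"headline figure: {headline_risk:.0e}")   # 1e-63
print(f"effective risk:  {total_risk:.0e}")      # ~1e-09, 54 orders of magnitude larger
```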
02:25:06.000 And so it was really funny because then he was there, this very calm, sensible, you know, philosopher from Oxford in a press meeting at the Large Hadron Collider surrounded by all the, you know, aluminum, like tinfoil.
02:25:06.000 Aluminum foil, tinfoil hat people?
02:25:08.000 Yeah, aluminum foil.
02:25:10.000 Aluminium.
02:25:10.000 I love the way you guys say aluminum.
02:25:11.000 I know.
02:25:12.000 Apparently, I was so annoyed when I found this, but apparently your way is correct.
02:25:17.000 Of course it is.
02:25:18.000 We're American.
02:25:19.000 How dare you?
02:25:22.000 Tire with an I, really.
02:25:23.000 And then you start saying things like foyer, click.
02:25:27.000 Oh, there's a nice click of people in the niche in the foyer.
02:25:31.000 Oh, niche.
02:25:33.000 Niche, clique, foyer.
02:25:35.000 Foyer.
02:25:36.000 We say foyer, though.
02:25:38.000 Do people say foyer?
02:25:39.000 I've definitely heard foyer.
02:25:41.000 They're Walmart people.
02:25:42.000 Those are white trash.
02:25:43.000 Where are you from, originally?
02:25:45.000 Where am I from?
02:25:46.000 New Jersey is where I was born.
02:25:47.000 Interesting, because I thought it was a New York thing, but...
02:25:49.000 Maybe.
02:25:50.000 Maybe not.
02:25:51.000 I didn't really grow up there.
02:25:52.000 I grew up all over the place.
02:25:53.000 Boston, mostly.
02:25:55.000 Yeah, but no, you do so many things wrong.
02:25:57.000 It's very disgusting.
02:25:57.000 Oh, how dare you.
02:25:58.000 It's very disgusting to me.
02:25:59.000 I'll tell you what we don't do.
02:26:00.000 We don't do queens.
02:26:01.000 You guys still have queens.
02:26:02.000 I love the queen.
02:26:03.000 Get out of here with that shit.
02:26:04.000 She's still going.
02:26:05.000 Ridiculous.
02:26:06.000 It's so funny.
02:26:07.000 I feel like every time I go from...
02:26:09.000 No, I think the queen is hilarious.
02:26:12.000 It's ridiculous.
02:26:15.000 That's her job.
02:26:16.000 Her job is to wave in this very particular way.
02:26:18.000 She doesn't even really wave.
02:26:19.000 She just kind of rocks her hand back and forth.
02:26:21.000 Some sort of weird semi-Vulcan stance.
02:26:25.000 It's kind of funny, yeah, talking to especially some of the kind of progressive friends I have in America, and they're like, you've got a monarchy, isn't this?
02:26:33.000 Like, isn't everyone talking about it?
02:26:34.000 But you guys think it's quaint.
02:26:36.000 Yeah, we're just like, no one really thinks about it.
02:26:38.000 Well, she doesn't really have power, right?
02:26:41.000 But she still lives in a fucking castle.
02:26:42.000 She lives in a castle.
02:26:43.000 She lives off the dime.
02:26:45.000 But if you do an economic analysis, she brings in more money than she...
02:26:49.000 Um, sort of, but she takes a lot.
02:26:51.000 She's sort of the anti-Will MacAskill, if you ask me.
02:26:53.000 Yeah, that's right.
02:26:54.000 I mean, the charities they support.
02:26:55.000 She just gets all this free money, and that bitch just wears gold and shit and drives around in a limo.
02:27:00.000 It's kind of ridiculous.
02:27:01.000 So you could, yeah, you could definitely get the same tourism benefits.
02:27:06.000 People are mad that I say bitch.
02:27:07.000 I'm not really meaning the word bitch.
02:27:09.000 It's like, all due respect, folks.
02:27:11.000 It's just a figure of speech.
02:27:12.000 It's a humorous figure of speech.
02:27:14.000 Okay, well, I appreciate the...
02:27:15.000 Yeah, I don't want to disparage your ruler.
02:27:18.000 Yeah, I appreciate the caveat.
02:27:20.000 Such a strange ruler, though.
02:27:22.000 Kings and queens and Prince Charles.
02:27:24.000 It's a really funny part of British culture.
02:27:29.000 It's so funny because I spend a lot of time in California, but every time I come back, it seems to be on some major event to do with royalty.
02:27:37.000 So one was the queen's birthday, one was the event of the queen being the longest ever running monarch, one was her jubilee.
02:27:44.000 We don't know about that at all.
02:27:46.000 We don't know about any of those things.
02:27:47.000 For you, it's these massive events.
02:27:49.000 Well, it just means I come off the plane, been in America for a while, and there's just pictures of the Queen everywhere.
02:27:54.000 Ah, I see.
02:27:55.000 Okay, yeah, I'm definitely back in Blighty now.
02:27:57.000 Now, what's going on now?
02:27:59.000 Anything crazy?
02:28:00.000 What's happening now?
02:28:00.000 In the UK, yeah.
02:28:01.000 Huge news today.
02:28:03.000 What?
02:28:04.000 Well, for me, as a Scot, Nicola Sturgeon, the First Minister, so like the leader of Scotland, I kind of think of Scotland to the UK as like state to federal, but it's a little bit different.
02:28:18.000 Announced that she's planning a second Scottish independence referendum.
02:28:22.000 So because Britain is taking itself out of the European Union, and they expect that announcement to be made, maybe Tuesday, maybe end of the month, but very shortly.
02:28:36.000 Scotland did not want to leave the EU, voted overwhelmingly in favour of remaining.
02:28:43.000 So Scotland in general tends to lean a lot further left than the rest of the UK. And previously had an independence referendum, it was very close actually.
02:28:52.000 52% were in favour of staying part of the union, so they stayed part of the union.
02:28:59.000 There's now going to be a second referendum, at least this is what Nicola Sturgeon is saying.
02:29:06.000 And because of the Brexit vote, I think it's much more likely that Scotland will say, yes, we're going to leave, and then they remain part of the European Union, whereas the rest of Britain will leave.
02:29:18.000 And it's interesting for me, because I was very kind of pro the Union, against independence, in the previous referendum.
02:29:37.000 Well, I think there's two things.
02:29:39.000 One is that...
02:29:41.000 I think that now the case for Scotland being part of the EU but not part of Britain, the economic case makes a bit more sense now than it did in the past.
02:29:50.000 But then secondly, I would worry: Britain leaves the EU, does that spark a much larger movement where the EU as a project just breaks down?
02:30:06.000 And if it's the case, like, well, UK leaves the EU, but as a result, the country just falls apart.
02:30:12.000 So you wanted that to happen?
02:30:14.000 You wanted England to fall apart, to be punished for leaving the EU? I mean, I think it would be like a very major signal.
02:30:20.000 But what if they prospered?
02:30:23.000 And they were correct.
02:30:24.000 Oh, yeah.
02:30:25.000 I mean, then if I was convinced that the Brexit was the right decision, it was actually best for the world, then I would change my mind.
02:30:34.000 I don't know enough about it, but I do have a friend who's very knowledgeable, and he's from England, and his take on it was the real issue with the EU is that you're dealing with a bunch of people that aren't even elected.
02:30:47.000 They're just sort of running the European Union.
02:30:50.000 And he's like, we don't have to tell you when you just look at history what happens when people have a great amount of power and aren't even elected to their position.
02:30:59.000 And you're allowed to just go to any part of the European Union and move into it.
02:31:07.000 He's like, that was very detrimental and very bad in terms of the way England's financial structure was set up.
02:31:15.000 They were like, this would be detrimental to England, but beneficial to other places.
02:31:18.000 And the idea was that we were supposed to accept the fact that it would be detrimental to England and beneficial to other countries.
02:31:24.000 And many people in England did not want to do that.
02:31:27.000 And in making that decision, they were thought to be xenophobic, they were thought to be nationalistic, and that it was racist.
02:31:35.000 So I think there's, yeah, two things.
02:31:38.000 I mean, one thing is, yeah, I don't like...
02:31:40.000 Yeah, I mean, so there's kind of two things.
02:31:42.000 One, with respect to the kind of sovereignty question.
02:31:45.000 I mean, like, European Union, like, it has its own parliament and so on.
02:31:49.000 You can vote on that.
02:31:50.000 You each get a number of...
02:31:52.000 And the reason, insofar as it's undemocratic, it's mainly just because people don't care.
02:31:58.000 They don't care whether or not it's democratic?
02:32:00.000 As in voters.
02:32:02.000 So turnout to elections for members of the European Parliament.
02:32:06.000 The turnout is very low.
02:32:07.000 I think it's like 30% sort of thing.
02:32:09.000 Maybe it'll be larger now that they realize the consequences of it.
02:32:12.000 Well, I mean, there's not going to be any more because it's going to leave.
02:32:15.000 It's going to implode?
02:32:16.000 Well, Britain's leaving.
02:32:17.000 Britain's leaving.
02:32:18.000 So you're no longer voting for members of European Parliament.
02:32:22.000 So that's one question.
02:32:23.000 And then, like, is this good or bad for Britain?
02:32:26.000 I think, like, the economic case is just incredibly strong for Europe being kind of good for Britain.
02:32:34.000 The reason being just, like, free trade in general benefits both parties.
02:32:37.000 You want to really maximize the amount of free trade.
02:32:40.000 But then the bigger thing for me, with respect to unity between countries, is the tail risk, the risk of war, which we don't really think about because we haven't had a world war since, you know,
02:32:57.000 the early mid 20th century.
02:33:00.000 But Europe had had, like, a long period of comparative peacefulness, like before the First World War, people thought, no, it's unthinkable, given the level of interconnectedness between the countries that a world war could break out, and then two did.
02:33:12.000 Right.
02:33:13.000 And so, and I think those sorts of things would be, you know, that's the tail outcome, but can be very bad indeed.
02:33:20.000 And we don't often think about it because it's just this occasional thing.
02:33:23.000 And so that's why, in general, I'm just almost always more pro-closer relations between countries.
02:33:34.000 That makes sense to me.
02:33:35.000 What he said makes sense to me as well, though, when he was saying essentially it was like, think of the United States, but now think of each state being a country.
02:33:44.000 You're allowed to elect a leader of that country, but you can't elect a leader for the United States.
02:33:49.000 And so that's essentially how he was looking at the European Union.
02:33:52.000 He was saying the European Union is, they're not elected, and yet they're controlling all these other elected officials and elected states, all grouped together.
02:34:02.000 Instead of thinking of them as, like, Germany and England, think of them as states.
02:34:06.000 And think of the European Union and the officials, the people that are in control of the European Union aren't even elected.
02:34:13.000 Yeah.
02:34:13.000 So, I mean, you do elect the parliament.
02:34:15.000 And then it's also the case that the analogy breaks down, because the amount of power that Europe has over the member countries is, you know, nothing like the amount of power the federal government has over the states.
02:34:34.000 You know, so of the powers the EU has, one of the things that got lots of attention was bendy bananas.
02:34:43.000 This became, like, a real focus area for people's ire.
02:34:46.000 Bendy?
02:34:47.000 What does that mean?
02:34:48.000 So according to EU regulations, so EU has a single market, so that means you have just the same standards across all countries.
02:34:56.000 But then that means you just start to have these standards for things, like bananas.
02:35:00.000 And so there was one EU regulation which was that a banana couldn't be too bendy, otherwise it would count as a defective banana.
02:35:08.000 And so people were like up in outrage about this, like how can the EU dictate to us the shape of our bananas?
02:35:16.000 But I think the case is like a good one.
02:35:19.000 It's like, it's really not that important.
02:35:20.000 It's just a banana.
02:35:21.000 Why do they even try to regulate it then?
02:35:23.000 Well, it's because if you want to have like a free, like single market, you need to have common standards across.
02:35:30.000 But doesn't the market dictate those standards where like the bendy bananas don't sell and then the straighter ones do?
02:35:37.000 Yeah, I mean, I don't know more of the detail about the bananas.
02:35:40.000 It seems to me like any time the government steps in on something as fucking ridiculous as the bend in the shape of a banana, they'll be like, hey, fuckface, why don't you go take care of poverty?
02:35:49.000 You know?
02:35:50.000 Why don't you handle something real instead of dealing with bendy bananas?
02:35:53.000 Look, so on the bendy bananas case, yeah, I can't, off the top of my head, think of why you'd want to not allow the sale of overly bendy bananas.
02:36:00.000 But that's what people worry about when they worry about bureaucracy, when they worry about too much control.
02:36:04.000 Yeah, so...
02:36:05.000 That's a great example, in fact, of why people don't want micromanaging of our culture.
02:36:10.000 Yeah, but then the question is, do we want to leave over bananas?
02:36:15.000 Well, there's a lot of other factors.
02:36:17.000 It's not the bananas that caused it, right?
02:36:20.000 But the thing is, the UK, as part of the European Union, has sovereignty over its income taxes and all of its laws, as long as they don't conflict with the UN Declaration of Human Rights, which was first invented by the UK. It has control over all of its internal legislation.
02:36:38.000 It can go to war if it wants, and it did.
02:36:42.000 So the loss of sovereignty seems pretty mild from my perspective, and I feel like they focus on these examples, which is like, okay, maybe, yeah, it's a cost.
02:36:55.000 Like, maybe it would be better if Britain could make decisions about bananas.
02:37:00.000 Maybe the bananas was the bad call.
02:37:02.000 Well, it definitely doesn't seem like a universal reaction.
02:37:04.000 I mean, there's a large percentage of the people in England that are very upset about Brexit.
02:37:11.000 It's a really interesting sort of a divide between people.
02:37:15.000 Yeah.
02:37:15.000 I mean, the thing that I find fascinating is that we would make, and I think this in general, I think this with elections as well, because I studied a bunch of voting theory while doing my PhD, and we make these momentous decisions as a country where we get everyone in the population to try and go to a specific place and then get the smallest possible information out of them that you can,
02:37:39.000 which is just a single tick, like yes or no.
02:37:42.000 Whereas there's so much more you could be doing.
02:37:44.000 Right.
02:37:44.000 In one case with a referendum, rather than just at a particular date, where the turnout is affected by things like the weather, it's affected by what happened in the week before, instead you just have three referenda.
02:38:09.000 And given the momentousness of the decision, spending more money on actually getting the accurate views of the people is super important.
02:38:09.000 So instead, yeah, you have three over the period of six months and choose the best, you know, best out of three, basically.
02:38:17.000 That would be like a more accurate representation of what people think over time.
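A rough simulation of the best-of-three idea, under the toy assumption that each referendum reflects a stable 52% majority plus transient noise (weather, the week's news) of up to four percentage points either way; all numbers are invented for illustration.

```python
import random

def one_referendum(true_support=0.52, noise=0.04):
    """One vote: stable support plus a random transient shift."""
    observed = true_support + random.uniform(-noise, noise)
    return observed > 0.5  # True if the stable majority side wins

def best_of_three(**kwargs):
    """Majority verdict across three independent referenda."""
    return sum(one_referendum(**kwargs) for _ in range(3)) >= 2

trials = 100_000
single_wrong = sum(not one_referendum() for _ in range(trials)) / trials
triple_wrong = sum(not best_of_three() for _ in range(trials)) / trials

print(f"single referendum misreads the stable majority: {single_wrong:.1%}")  # ~25%
print(f"best of three misreads it:                      {triple_wrong:.1%}")  # ~16%
```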
02:38:22.000 Sure, but isn't there also a gigantic issue with people not being informed about what they're voting on?
02:38:27.000 You don't have to be informed.
02:38:29.000 About what you're voting on, you certainly don't have to be accurate.
02:38:31.000 You could easily be misled.
02:38:33.000 And the actual hard, provable facts could be completely outside of your grasp, and yet you still make a big decision.
02:38:40.000 Yeah, I wondered before about having a...
02:38:45.000 A test?
02:38:45.000 A test, yeah, you go.
02:38:47.000 But really, really basic.
02:38:50.000 I think it would still...
02:38:52.000 There's this question of just, why do we care about democracy?
02:38:54.000 What's the point?
02:38:56.000 Who questions that?
02:38:58.000 It seems like a really important thing.
02:39:00.000 Oh yeah, political philosophers talk about this all the time.
02:39:03.000 So they kind of agree, like...
02:39:05.000 Democracy seems good.
02:39:06.000 Other forms of government that we know so far seem terrible or worse.
02:39:11.000 But why?
02:39:12.000 Why is democracy good?
02:39:13.000 Is it just that democracy gives us this way to boot out dictators and the risk of a single person taking power is just really, really bad and so we just need some mechanism to get rid of that?
02:39:25.000 Is it that it's intrinsically valuable?
02:39:27.000 Is it that people just have a right to have equal representation and that's just this fundamental thing?
02:39:34.000 Or is it justified just in terms of the consequences?
02:39:37.000 Is it because if everybody's able to contribute, then people will make better decisions?
02:39:42.000 I don't necessarily think it's an either-or.
02:39:44.000 I think there's also that people like to feel like they play a part.
02:39:48.000 Like they don't want to feel like they're being ruled over by some monarch.
02:39:51.000 They want to feel like they have some sort of a play in their decision-making.
02:39:54.000 It's also one of the gross things about Trump winning in this country, how many people gloated. You know, how many people gloat upon victory that their side won, and then you're dealing with this whole team mentality that people adopt when it comes to any sort of an issue.
02:40:08.000 Yeah, well, I mean, this is...
02:40:10.000 Including Brexit, right?
02:40:11.000 Yeah, no, in general, this is one of the things I'm really worried about with...
02:40:17.000 It's increasing levels of partisanship.
02:40:19.000 This is just this really robust phenomenon that we're seeing.
02:40:23.000 And it's really worrying because it means that we're just undermining any chance of people changing their minds.
02:40:28.000 Like, Trump won.
02:40:28.000 Like, people say, well, of course, it's Comey, etc.
02:40:31.000 But, like, the vast majority of Trump's votes were, and similarly for Hillary's votes, were from people who just always vote Republican or always vote Democrat.
02:40:39.000 Well, not necessarily, because Trump won by so many votes that a good percentage of them had to have voted for Obama, just statistically.
02:40:47.000 Oh, but I'm still thinking, of Trump's votes, what proportion of people have only ever voted Republican?
02:40:58.000 That's a good question.
02:40:59.000 And I would, like, definitely bet greater than 80%.
02:41:02.000 Really?
02:41:02.000 Probably better than 90%.
02:41:04.000 Yeah, that's right.
02:41:04.000 I mean, if you look at the polls, like, it's always that, in terms of expected number of votes, like, oh, it's only 46% in favor of Trump.
02:41:13.000 Well, there's also the issue that the independents in the swing states, whether it's Gary Johnson or whether it's Jill Stein, those independents, the amount of votes they got would have swung the other way towards Hillary.
02:41:25.000 Yeah, I remember looking into this for Jill Stein in particular, and actually it was the case that Hillary would have won the popular vote by even more, but in none of the swing states did Stein get enough of a percentage.
02:41:36.000 Not just Jill Stein, but Gary Johnson as well.
02:41:38.000 Yeah, though Gary Johnson, it seemed to me, was split almost evenly between Trump and Hillary.
02:41:44.000 Right.
02:41:45.000 But this is an interesting case, so...
02:41:47.000 The thing that people don't think about so much is I think the process, we call this a democracy, but one single checkbox every four years is the smallest amount of information you can be getting from people.
02:42:03.000 And it's susceptible to all sorts of different things.
02:42:06.000 And this happens on both sides.
02:42:08.000 So supposing Jill Stein became really popular, took 10% of the vote, she would have just killed Hillary.
02:42:16.000 Like, absolutely.
02:42:17.000 Or supposing that Evan McMullin, is that his name?
02:42:21.000 Who's that?
02:42:22.000 Yeah, he was a Republican independent.
02:42:24.000 Okay.
02:42:25.000 Did well in Utah.
02:42:26.000 Anyway, supposing a far-right candidate does really well.
02:42:30.000 Again, takes all of the votes away from Trump.
02:42:33.000 The fact that that's possible shows that, like, first past the post is a very bad voting system.
02:42:39.000 It's not accurately representing the will of the people.
02:42:42.000 And we could do so much better. It would mean that...
02:42:47.000 As a democratic process, you'd be much closer to representing what people actually believe or feel about things.
02:42:54.000 Because right now, it means that, yeah, you can be influenced by stuff like how much support does a third party get?
02:42:59.000 That's a terrible system.
02:43:01.000 It's a terrible system.
02:43:01.000 It lasts too long.
02:43:02.000 The decisions last for four years.
02:43:04.000 This person gets locked into position unless you impeach them and then remove them from office.
02:43:08.000 They're stuck.
02:43:09.000 It sucks.
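A small illustration of the spoiler effect described above, with invented ballot counts: under first past the post only first choices are counted, so a third party can hand the win to the candidate most of its voters like least, whereas a top-two runoff over the same ballots does not.

```python
from collections import Counter

# Hypothetical ranked ballots, most preferred candidate first.
# 45 voters prefer A outright; 40 prefer B; 15 prefer third party C
# but would take B over A. Numbers are invented to show the mechanism.
ballots = [("A", "B", "C")] * 45 + [("B", "A", "C")] * 40 + [("C", "B", "A")] * 15

# First past the post: count first choices only.
fptp = Counter(b[0] for b in ballots)
print("first past the post:", fptp.most_common(1)[0][0])  # A wins, 45 to 40 to 15

# Top-two runoff: keep the two leaders, recount each ballot for
# whichever finalist it ranks higher.
finalists = {c for c, _ in fptp.most_common(2)}
runoff = Counter(next(c for c in b if c in finalists) for b in ballots)
print("top-two runoff:", runoff.most_common(1)[0][0])  # B wins, 55 to 45
```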
02:43:10.000 I wish I could talk about it more, but I can't.
02:43:11.000 I've got to get the fuck out of here.
02:43:13.000 But that's the least interesting thing we talked about today.
02:43:16.000 But the AI and all the other stuff is just fascinating stuff.
02:43:21.000 If people want to know more about your effective altruism movement and more about you, where should they go?
02:43:28.000 They should go to effectivealtruism.org.
02:43:30.000 That's got tons of information about effective altruism.
02:43:32.000 If there's one takeaway that you really want to act on, if you think, wow, actually, this was kind of cool.
02:43:37.000 I do want to make more of a difference.
02:43:39.000 We've just launched a set of funds, so it just means you can donate within one of these core areas of global development, animal welfare, or preservation of the long-run future against global catastrophic risks.
02:43:50.000 You can just donate and have it ensure that it will go to the very most effective non-profits.
02:43:55.000 0% overhead, depending on how you donate.
02:43:59.000 And we don't take any money along the way.
02:44:01.000 And just means that, yeah, super easy to donate as effectively as possible.
02:44:06.000 Alright, beautiful.
02:44:07.000 Thank you, Will.
02:44:08.000 Appreciate it, man.
02:44:08.000 It was fun talking to you.
02:44:09.000 We'll be back tomorrow with Jim Norton.
02:44:11.000 See ya.
02:44:13.000 That was fun, man.
02:44:15.000 Cool, no, that was great.