Why You Can Not Allow Nerds to Congregate with Austin Chen
Episode Stats
Words per Minute
222.3
Summary
Austin Chen is a co-founder of the prediction market Manifold, a platform where anyone can create any question and anyone can bet on it. In this episode, he talks about the founding of the platform, why it exists, and what it's all about. He also talks about a sex party that took place at the Manifest conference last year, and why he thinks there might be one at Manifest this year.
Transcript
00:00:00.000
Hello, everyone. We are very excited today to be joined by Austin Chen. He is one of the co-founders
00:00:05.480
of the prediction market, Manifold, though now he is transitioning to work on Manifund,
00:00:10.740
which is their sort of independent grant-making entity. We're just so excited. But what he's
00:00:15.560
doing right now as he's transitioning is prepping and ramping up for basically the conference of
00:00:21.220
the year for us. We're really excited for it, aside from Natalism Conference, which we're also
00:00:24.400
very excited for because of pronatalism, right? But last year was the first ever Manifest
00:00:29.700
Conference. This is the flagship conference and the one and only of Manifold. And it was
00:00:36.340
one of the best gatherings we've ever been to in terms of the caliber of people, in terms of
00:00:41.460
execution. So you're planning for this. We wanted to bring you on to talk about this,
00:00:45.660
to talk about Manifold, to talk about Manifund. And I just wanted to kick this off with...
00:00:49.840
Why? No, hold on. We got to kick it off as a good question, okay? Not a generic question.
00:00:56.600
I think that was a great question. The good question is around the betting pool that was
00:01:01.740
put around when a sex party would form at last year's event. Oh my God.
00:01:06.340
An orgy, yes. Yeah, can you speak to that? What happened there?
00:01:11.480
So this is, I think, an outgrowth of Manifold's very libertarian philosophy. To take very many steps
00:01:17.520
back about what is special about Manifold, right? We're a prediction market platform where anyone
00:01:21.140
can create any question. And this was basically, from the very beginning, we wanted to be a place
00:01:26.520
where instead of the more standard platforms, Metaculus, Polymarket, Kalshi, which all have
00:01:30.920
prediction markets or forecasts, but all of their forecasts and prediction markets are gatekept.
00:01:35.560
They're approved by the moderators of the platform. We were like, we don't want this.
00:01:39.580
We want more of a decentralized, anyone can ask any question kind of system. So I think even from
00:01:45.480
the very early days, we had a lot more of the out there, sketchy, raunchy, degenerate crowd of
00:01:50.800
people. But also a lot of people who are just very interested in prediction markets for their
00:01:54.360
own sake, like prediction market nerds. When you put those two together, you get all kinds of weird
00:01:58.060
questions. Will there be an orgy at Manifest? And I think the New York Times famously covered this
00:02:02.480
maybe in a little bit more depth than I would have hoped because me, I'm like, I'm so excited to be
00:02:06.760
on the New York Times, except that my name is now associated with this orgy that happened
00:02:09.540
in my comments. But yeah, I don't know if there's like too much more about that than what they
00:02:15.660
covered. It was like somebody who's one of our users was like, there's a bunch of like really
00:02:19.700
spicy things happening. The like rationalist EA crowd is famous for the very loose like norms on
00:02:24.020
like sexuality, like polyamory, that kind of thing. Maybe there'll be an orgy. It's a thing that has
00:02:28.600
probably happened before at other rationalist parties or something like that.
00:02:32.200
And there's just like a market for it. And every prediction market can also be viewed as like an
00:02:37.220
incentive market, where if you have inside information, or especially if you have the
00:02:41.260
ability to make the outcome of the market happen in this case, if you have the ability to make an
00:02:44.620
orgy happen, then it's a very natural thing to bet up "will there be an orgy?" and just arrange an orgy.
00:02:49.640
And I think that's basically what happened here. Some like enterprising like user in our community
00:02:53.860
was like, it probably wouldn't be that hard to get together the minimum viable orgy. If we can get
00:02:58.120
three people to agree to come over and have sex. I think they did this off the Lighthaven campus,
00:03:03.300
then we can resolve this market correctly. So that's my best guess of what happened.
00:03:06.160
And Aella is one of the speakers. She's one of the speakers this year, too.
00:03:09.200
Yeah, that's right. And also before we go further to get this all at the beginning,
00:03:12.580
if you happen to live in the Bay Area, and you are a fan of ours, and you're not going to one of
00:03:16.820
the NatalCons or something, because I think that's going to be in Austin again, this is a great place
00:03:20.040
to meet up because we're going to be there speaking and the types of conversations that were happening
00:03:25.060
at the event last year are very similar to what goes on in our Discord. Speaking of which, if you
00:03:29.960
haven't seen our Discord, go check that out. With the one caveat that because you have people like
00:03:35.220
Eliezer Yudkowsky again coming this year, you get more of the anti-AI side, whereas generally our
00:03:40.740
audience is very pro-AI accelerationist, which we also get a lot of accelerationists there.
00:03:45.300
Yeah, we have, for example, Brian Chau coming, who is of the Alliance for the Future, a 501(c)(3).
00:03:50.120
He's been on our podcast a number of times. And we've done his a number of times too. Yeah,
00:03:54.260
he's great. So Simone, you can now ask the question that you want.
00:03:57.000
Yeah, you've partially answered the question, which I think is, so we're getting there. And I think we need to
00:04:01.160
dig deeper here. A lot of the people who follow this podcast and who like this podcast are really starved
00:04:07.460
for intellectually interesting conversations, not just heterodox ones. A lot of people throw around
00:04:13.540
the term heterodox thinker a lot. And that tends to sometimes just mean that someone's a contrarian and
00:04:19.660
they just like to troll people or say things that sound controversial. Whereas the people who follow this
00:04:24.140
podcast who really love it are like, I just want to really discuss something from first principles with people
00:04:29.220
and really have an intellectual conversation. And these people are all over the place at Manifest.
00:04:33.900
And one thing that you pointed out was that the people who are drawn to Manifold, because it's more
00:04:37.980
open, are like a mixture of like, very analytical people, like super autistic, and then like also
00:04:44.700
just degenerates and like people who are just willing to be very playful as well. So that seems to be one
00:04:51.200
element of this is that you have a bunch of people who are both fun and smart.
00:04:56.220
And I think the bigger element, Simone, is that prediction marketplaces create a status hierarchy
00:05:01.880
around real world information. Because a lot of quote unquote, being smart in our society today
00:05:07.840
is based around being able to parrot back a dominant societal narrative. Whereas in a prediction market
00:05:13.880
environment, your knowledge about the state of the world today actually has to be predictive.
00:05:19.900
That's genuine intelligence. And so it sorts for genuine intelligence. And then within prediction marketplaces,
00:05:25.020
it sorts for degenerates with genuine intelligence, which is our audience.
00:05:29.980
So Austin, aside from those things, are there any other common characteristics of Manifold users and people
00:05:36.540
who attend Manifest? And as a second question to this, for those who are starved for these kinds of
00:05:42.640
interactions, who couldn't go to Manifest, and who just like really want to meet people like that,
00:05:49.360
So the first question, what they have in common,
00:05:53.200
I think, taking a step back, a lot of who comes to the conference is: where does the
00:06:02.600
conference get marketed? And who does this seem appealing to? And then the next level, who,
00:06:07.520
like looking at the conference, thinks that like the kinds of people who would want to come to this
00:06:10.540
conference are the kinds of people I want to talk to, right? It's like a multi-level, like kind of
00:06:13.880
model, an "is this the crowd for me" vibe. And where we're trying to position Manifest is really hard
00:06:19.780
to say. I sometimes say Manifest is, in one sense, a conference that is like Austin's
00:06:23.780
like ideal, like dream conference, where I think like a lot of the like people who I follow online,
00:06:29.840
who like, I think are really interesting. I've tried to create a thing that they would very much
00:06:33.760
like to attend. And I think it's actually borne out a little bit in that, in the process of running
00:06:37.200
this conference, like I like try to invite a lot of them and a surprising number say yes.
00:06:39.920
I think roughly about half of the speakers and guests of honor who I've asked to
00:06:44.760
come have accepted. And for me, this feels incredibly high. I'm so happy that
00:06:49.900
they are all like interested in coming to this kind of thing. There's an effect where with Manifest,
00:06:54.040
I'm also trying to map out the boundaries of some different online communities. I think forecasting
00:06:57.940
it obviously, but also the natalism, fertility, culture crowd, which is you two, obviously,
00:07:03.700
and Richard Hanania and like Robin Hanson, who talked about these kinds of topics,
00:07:06.720
Alexander Tesla, people like that. They often talk to each other. I think they sometimes
00:07:10.600
reference each other. You go on each other's podcasts all the time. Yeah. That's also
00:07:14.220
like a crowd of people who I like learn a lot from, and I think have some kind of like natural
00:07:18.760
affinity for like markets, economics, prediction markets, especially they're like, they're in full
00:07:23.180
force. Startup crowd, of course, the tech startups, the, I think there was one other.
00:07:28.060
There are tons of machine learning people also.
00:07:30.640
Machine learning people. I view that as a bit of an outgrowth of like startups slash
00:07:33.660
rationality. There's a biotechnology. Sorry. It was the one that was on my mind,
00:07:38.380
especially in the, like the polygenic screening.
00:07:40.740
Yeah. Lots of that. Yeah. We met the guy doing the tooth thing at the last one.
00:07:45.580
Yes. Yes, that's right. And he's coming again. I got the tooth thing on me personally, actually.
00:07:50.440
Lantern Bioworks is the name of the company. Yeah. It's Aaron, who's a friend of Aella. He was
00:07:57.100
like one of her secretary type people, I think for a while.
00:07:59.860
Yeah. You know, they've worked together for a while, but for those who are not familiar with
00:08:03.260
Lantern Bioworks, this is basically a swab you can do on your teeth that should give you
00:08:09.620
an oral microbiome that makes it less likely, severely less likely,
00:08:18.620
to develop cavities. So Austin, you're theoretically cavity-free still, I take it?
00:08:22.960
Theoretically, it might take another like a few months because it takes about a year to fully
00:08:26.980
colonize your mouth. And the other thing is that we don't really know if it's taken yet. Cause I
00:08:30.740
haven't gone in and swabbed again to test it. So I think I got swabbed like about six months ago and
00:08:37.440
like now definitely I should be able to figure it out, but I haven't yet. I'm getting, going in for
00:08:41.760
a dental appointment, like in another week or so. So maybe I can tell you then if it like hasn't
00:08:45.400
worked. We'll check in with you at Manifest. You should have a prediction market around
00:08:49.960
your dental. Oh, that definitely makes sense. Yes. I'll do that. So something I wanted to dig in here
00:08:56.480
is this nonprofit thing that you're building. Yes. Can you tell us more about Manifund?
00:09:01.480
Sure. Oh, sorry. There's another question. I don't know if you want to go back to the question.
00:09:04.720
Oh no. Yeah. Where can people, yeah. Where else are these people congregating? In the Bay Area,
00:09:08.840
we've heard a lot of people say it's harder to find. On our Discord, Simone, on our Discord. And I'm
00:09:12.760
going to put the link below. But aside from our Discord, where can people find these people?
00:09:17.420
Especially when it comes to finding partners, because people email us all the time saying,
00:09:21.920
yeah, you know, this is great. Well, Manifold tried to create that dating
00:09:24.780
market that we promoted on our podcast. Yeah. Yeah. But then there weren't enough
00:09:28.700
people. It didn't have a big enough sample. So yeah. What are your thoughts there?
00:09:33.200
Yeah. And there's like two different questions, even like one is like a question for just hanging
00:09:37.400
out and finding a community. And one is like finding a partner. I guess there's a sense in which
00:09:41.160
like you don't necessarily want these to be different. You might want them to be pretty aligned.
00:09:44.340
It's like nicer to just find love like in your community that you're in anyways.
00:09:47.780
Also famously, we had like our own dating product, manifold.love and it still exists. You can go,
00:09:53.120
still go to the site. I think it's still a pretty good resource. The pitch for manifold.love right
00:09:56.720
now is it's like an open database of like date me docs, which is like pretty in-depth profiles with
00:10:03.480
photos and rough bios, and then answers to a bunch of questions that are
00:10:08.120
pretty key to understanding the personality of a person. So check it out. That's one place I
00:10:12.760
would immediately recommend people to look a little bit for potential partners, manifold.love.
00:10:16.900
But yeah, on the question of where to hang out, if you're looking to meet people like this in
00:10:21.460
person, I run an event called Taco Tuesday. And I actually haven't really advertised or pitched
00:10:27.040
it before to people broadly. So I'm unsure if this is a good or bad thing to do.
00:10:31.600
But anyways, it's like every Tuesday at my house, more or less like cooked tacos, usually like 20 or
00:10:36.100
like 40 people will show up. We'll do some kind of event afterwards, like karaoke.
00:10:42.280
I'll link to a recent invite with the address on it, so people can look at it and see if
00:10:46.880
it's close by. It's basically in the middle of San Francisco. Yeah. I would love it, Simone and
00:10:50.300
Malcolm, if you two are ever in town on a Tuesday, feel free to come.
00:10:53.300
Oh, I would love that. Yeah. Next time I'm in the city, I was just there for my GSB reunion.
00:10:56.620
And yeah, I'd actually say that our audience, weirdly, we are at the stage of middling fame at
00:11:02.680
the moment where pretty much almost everyone I've met from our audience is really intellectual and
00:11:07.480
high quality. Like we haven't gotten that many idiots yet. A couple, but not a lot.
00:11:12.280
Which is fortunate in terms of not accidentally inviting too many weirdos to your house.
00:11:17.560
Yeah. For Taco Tuesday, I also put the invite on a Manifold market. Every week, it is like how many
00:11:22.280
people will show up and you can bet on more than 10, more than 20, more than 30. And it is just like
00:11:26.360
an open invite on the internet where my address is on there. But like the people who are Manifold users
00:11:30.360
who think, oh, I might enjoy coming to talk to you. They tend to also just be like quite like intellectual,
00:11:34.680
like fun to talk to. So that's why I would promote that as like maybe a great place to merge
00:11:39.480
the Manifold and the like Simone and Malcolm, like audience. Other places like this, I think
00:11:44.120
there's definitely like a, like rationality community in like Berkeley, which you're probably
00:11:47.960
like familiar with, or maybe your audience is in San Francisco. There's actually like less of one,
00:11:51.960
I'd say probably because San Francisco is a lot more like tech heavy. It's like bigger. So like the
00:11:57.960
rationalist EA people and people who are like interested in topics like this are just like less,
00:12:02.040
there's less of a concentration of them. They're drowned out by the like tech world, I'd say.
00:12:08.840
One thing I'm picking up from you though, which I think is underrated and which I'm sick
00:12:13.960
of people complaining about because we get this a lot, is: you are making all this happen. You are hosting
00:12:21.720
Taco Tuesdays. You are organizing conferences. You are reaching out to your intellectual heroes and
00:12:28.040
the people whose content you like to consume and the people who you think would be interesting and
00:12:31.560
you're inviting them to your conference. And I think that is the key. Being popular isn't about
00:12:36.680
like somehow magically being popular. It is about being the one who organizes the thing and takes
00:12:42.200
the initiative and does the work. And it's a lot of freaking work. You're putting a lot of work into
00:12:45.640
all these things. So I think I just want to point that out to people that like, guess what? This is
00:12:51.080
going to involve a lot of legwork on your end. And if you just want to sit on your ass and show up to
00:12:54.680
something, you're not going to have any of this. Yeah. I very much agree with that. I want to
00:12:58.520
double down on that. I think a lot of people, or a lot of people I know, like friends, say: I want
00:13:03.080
to have a more vibrant social network. I want to go to the cool events. And I often try to tell them
00:13:07.400
like, you could just organize a cool event or you could just organize an event and it won't start
00:13:10.760
out cool, but that's okay. Like you'll get better at organizing better and better events. I think
00:13:14.680
Taco Tuesday started as me and two of my friends during COVID who just wanted to hang
00:13:19.880
out more often. So we're like, okay, we're going to show up and we're going to eat tacos. And we just kept
00:13:22.440
doing it and we did it for three years. And that's how it's grown to this thing at this point,
00:13:25.720
which 30 or 40 people come to every single week. And yeah, I think like social
00:13:32.280
events, like starting from as small as like a gathering to as large as like a conference are
00:13:36.520
still pretty underprovided. The market demand for them is a lot higher than
00:13:41.720
the number of events that people can actually go to right now. So I think it would be a great thing if
00:13:45.000
people listen to this podcast and say, maybe I'll try. Manifest, to me, last year was:
00:13:49.240
can I put on a thing the size of a hundred people, 200 people showing up?
00:13:53.320
And I had never put a conference together before, but I think I learned a lot along the way. And
00:13:57.080
this year I'm like very excited for manifest the second time, right?
00:14:00.440
I actually, Simone, I'm going to take a side here and say that I think that people like Austin and
00:14:06.040
us are, it's a genetic thing. I really don't think you can motivate somebody to be this type of
00:14:11.640
person by just being like, Oh, if you, because I've seen what happens when I try to,
00:14:15.800
because like when I am mentoring young people or some fans, I'm like, Oh, you could put something
00:14:20.920
together. Like here's a market space where you can put something together and they'll do the initial
00:14:25.240
bit of work, but they don't really follow through. They don't really put in the effort to make it
00:14:29.720
actually happen on a big scale. And then when it peters out, they're like, that's why I never try
00:14:34.120
anything because they don't really immerse themselves in making it happen. But I think that's a
00:14:39.080
dispositional thing where if you're dispositionally one of those people, you're just going to do it anyway.
00:14:44.520
So it's a weird sort of optimistic perspective. And then I'm like, look, anyone
00:14:49.080
can do this, but not everyone is anyone. And because of that, the people who have the
00:14:56.200
disposition to just tackle life this way are already going to be tackling life this way.
00:15:03.000
That's funny. I think that's exactly what actually I was going to say, or it was on my mind as well.
00:15:06.760
This brings me back to one of my favorite pieces from Scott Alexander, the Parable of the Talents,
00:15:11.320
I think. He went last year and he's coming again this year, by the way.
00:15:14.920
That's right. Yes. But I think it was this piece. If not, it was a different like Scott Alexander
00:15:18.760
piece that talks about Scott's own reflections on his ability to write really well. And conversely,
00:15:24.360
his inability to be really good at other things. He barely got a D in calculus, I think, according to
00:15:28.200
the piece. And this goes to show that different people are very good at different things.
00:15:32.760
But most importantly for Scott, it's not like he like tried really hard to become a good writer.
00:15:36.680
It was more that becoming a good writer came naturally. He was just goofing off and then
00:15:40.440
his random English essays would become the best essays or win state
00:15:44.760
competitions, that kind of thing. So maybe the way that like possibly me and possibly the two of you
00:15:49.640
are is that we feel more naturally drawn towards inviting people to things, hosting events well, and making
00:15:59.880
people happy. I think empathy is probably a really key part of how to make an event run
00:15:59.880
really well because you have to really understand how your participants are feeling and what like
00:16:04.120
changes you can make to give them a better experience. And I don't know if this is like
00:16:07.400
Oh, it's fascinating. We take the exact opposite perspective.
00:16:11.400
You host events for the exact opposite reason we host events. So when we host events,
00:16:15.800
we generally do it because we do not like spending time around people. We do not like meeting people.
00:16:21.640
And I want to lower the amount of time in my year that I am spending socializing. And so to do that,
00:16:29.640
like one of the things we used to do is every other month, we'd put together
00:16:31.880
like a party or event in New York to do that. We were just like, Oh, we'll put together these
00:16:37.160
events because we need friends. We need a high powered social network to achieve the things we
00:16:42.440
want to in life. So that required some level of socializing. But what it meant is we basically
00:16:46.840
needed to condense our socializing to be as refined and pure as possible, not have to do it that frequently.
00:16:57.000
That is so interesting, actually. For me, my motivations came very differently. It's,
00:17:01.240
I'm also not an extrovert. I actually don't particularly enjoy spending time in social
00:17:04.760
situations. I think maybe this is similar to you two then.
00:17:07.960
So that's not why I put on events. I put on events actually, because
00:17:10.840
I think most other people are pretty extroverted, but also are bad at making their own events
00:17:15.480
happen or something like that. Or I just like egoistically think I can run a better event than you
00:17:19.320
can. So I'm going to run the event. I actually clearly can though.
00:17:21.800
Yeah. He's also at a unique time within the marketplace of events right now because the
00:17:28.280
EA community and the rationalist community have been overcome by AI apocalypticism and just gone
00:17:33.800
crazy. And so they're just not fun to be around. No one wants to be around that nihilistic BS.
00:17:38.280
And then the pronatalist community, like the one fun community that's adjacent to those circles,
00:17:43.960
it's still not overcome with that. But they're too busy raising their kids, so they're not hanging out.
00:17:48.440
And then many of them don't have kids, but they're just like, whatever. But your event,
00:17:54.600
because it's drawing on these marketplaces, it's able to take, if you're in this, and the reason I
00:18:00.120
talk about the EA community and the rationalist community is this community, regardless of how
00:18:04.840
polluted the overall ideology has become, is still collecting most of the world's highest agency,
00:18:11.640
highest intelligence people. There's just not a lot of good conferences anymore for high agency,
00:18:17.800
high intelligence people since the corruption of the old EA and rationalist communities,
00:18:22.200
which really only happened after Sam Bankman-Fried.
00:18:24.920
I don't know if I like agree with you, like all the way on the idea that like
00:18:28.360
EA and rationality have been corrupted per se, or you definitely use stronger words than I would
00:18:31.720
use. But I do think there's a true element, which is, I think EAG, for example, is not very fun.
00:18:35.720
It is like very automatic or that's not even the right word. Like it feels very good-hearted,
00:18:40.760
but EAG explicitly has a metric around this. Effective Altruism Global is the premier
00:18:46.280
conference of effective altruists; it runs two or three times a year, and a lot
00:18:50.120
of them like get together and talk about the different causes they're working on. It is
00:18:53.560
actually like a pretty good conference, like all, like all around, if that's like a thing that you're
00:18:57.720
interested in. I enjoyed going, but again, to me, going there feels like a chore, like a job, like
00:19:02.600
work. And I think most people who go feel this way. It is a professional event, focused
00:19:06.920
around networking and learning, but it's not fun. I think like when I was crafting manifest,
00:19:11.560
I tried to make it like halfway between the EAG effective altruism global and Vibecamp. I'm not
00:19:17.240
sure if either of you are familiar with Vibecamp. Yeah, we've been to Vibecamp. I was not a fan.
00:19:20.520
It was well organized, but the people that it draws were not intellectually additive for me.
00:19:27.320
Interesting. So my disclaimer is that I've actually not been to Vibecamp. I was just like
00:19:31.720
using the vibes of Vibecamp, so to speak, like what it felt like people enjoyed from Vibecamp.
00:19:35.960
We were trying to combine those two: have the intellectual caliber of people who go to an EAG,
00:19:40.840
but the like fun elements of Vibecamp. The playfulness of Vibecamp.
00:19:44.920
Playfulness. Yes. Yes, exactly. Those are the things that I wanted to have in full force
00:19:49.160
at Manifest. I think we did okay last year and hopefully we'll do even better this year.
00:19:53.000
Yeah. So now let's talk about the nonprofit before we end, because I really want to understand this
00:19:56.760
better. What is this? Yeah. Manifund is an independent 501(c)(3) charity,
00:20:02.040
which is primarily a grant maker and a website that people can apply for grants on.
00:20:06.920
So it's a little bit similar to Kickstarter. You can like list your application there and like submit
00:20:11.560
and then it'll be hosted on the internet. So that's one thing that's pretty different between like
00:20:15.080
Manifund and most other grant makers, where like most other grant applications are done,
00:20:18.360
like in private behind closed doors; you'll fill out a Google form or something,
00:20:21.800
and then somebody will read it. And like in a few weeks time, you'll get like a yes or no response.
00:20:25.240
Instead, on Manifund, when you apply for a grant there, anyone can leave a comment,
00:20:29.640
like talk about what they like, what they didn't like about your application. And then anyone can
00:20:33.480
donate to it via our Kickstarter-like platform. We've also received some amount of funding from
00:20:38.120
various like individual donors and other like EA sources so that some people who are on the site
00:20:42.600
just already have a pot of money like allocated to them to give out. For example, we have 1.5 million
00:20:47.240
dollars' worth of AI safety regranting budget. Six different experts in the field of AI safety
00:20:51.560
on our site can choose to directly fund applications that they find particularly compelling.
00:20:56.280
So yeah, this is the thing that I'm spending more of my time working on. And I'm worried
00:21:01.160
that in your head, some of you are like, oh, this seems just worse than the Manifold thing.
00:21:05.240
Like, why are you doing this instead? So I'm going to try to preempt that criticism.
00:21:08.600
So Manifold was a really compelling idea around prediction markets. And we had a long time to
00:21:13.720
test out like specifically how good is a prediction market. And I think like we found it's good for
00:21:18.680
some classes of things, but it's not like the panacea that like we thought it could be, which is to say,
00:21:23.720
they're like often good for topics that like are very broadly popular because they can draw a lot
00:21:28.200
of attention, with people getting in and asking: will Trump win the election? That's
00:21:31.800
an example of a question that was, one, very popular, two, really easy to understand,
00:21:35.800
and, three, resolves relatively soon. These things make for one particular kind of good
00:21:39.480
prediction market. But I think when I started on Manifold, there was some idea that, oh, we can use
00:21:43.080
prediction markets for everything. I can make a hundred choices in my day-to-day life via prediction markets.
00:21:47.560
And I don't think we've quite gotten there. I think even in the like scope of trying to decide
00:21:51.960
what features Manifold should work on, which is pretty important and has a high amount of
00:21:55.320
uncertainty and has a lot of trying to figure out what the future will hold, where it's pretty
00:21:59.480
hard to operationalize a prediction market to like get that question answered for us.
00:22:02.920
So I think at the end of that, I was like, oh, I think prediction markets might be like a good
00:22:06.760
business, might be very popular, but it might be a good business in the way that the New York
00:22:10.840
Times is like a good business or something like that, but it's not like enough to change the world.
00:22:13.880
So now I'm like trying to find a thing that will change the world. And it's not exactly clear if
00:22:17.080
this grant making thing is like on the path there, but I'm hoping that it will be.
00:22:21.000
I actually think it's a good idea. I think it's a really good idea. And I think it's a better idea
00:22:24.360
than Manifold. So I'll explain why. It actually comes from what I was
00:22:29.480
saying. You have to look at the landscape of intellectually alive people and where young ones or
00:22:38.520
up and coming ones in the pipeline are aggregating. And there's just, after the corruption of,
00:22:47.640
we have an episode that I haven't posted yet, because I'm a little hesitant since it calls
00:22:50.680
out a few too many people, on the death of the EA movement. But Sam Bankman-Fried,
00:22:57.160
because the EA movement really overinvested in appeasing Sam Bankman-Fried for a while,
00:23:02.600
because he was such a major donor. He was like 90% of all the funds in the space.
00:23:06.760
And because he was really just using the movement to, like, whitewash
00:23:11.640
his reputation with Democratic politicians, it meant that really everything that they were drawn
00:23:16.760
to was mainstream and Democratic. And it led to those sentiments growing and growing within the
00:23:22.920
movement until recently, like even Nick Bostrom's institute at Oxford got shut down. What was
00:23:29.880
this Institute that got shut down? It was like a couple of weeks ago.
00:23:32.600
Yeah. Sorry, very quickly, I'll note a disagreement, which I don't think there's
00:23:38.760
time to get into. I actually, like, strongly support Sam Bankman-Fried, even now.
00:23:42.360
I don't know, I think I wrote about it, but yeah, it's a disagreement. But I don't care about him
00:23:47.480
as a person. So I guess what I'd say is I actually have no beef with him as a person,
00:23:51.560
but I think the downstream effects of his prioritization had a major memetic effect
00:23:58.600
on the movement as a whole. Not even his prioritization, just the fact that there was
00:24:02.440
any single entity in the space that was giving out what felt like infinite money at the time
00:24:07.160
caused a virtue signal spiral. And that's more what we're referring to.
00:24:10.520
Then when that one force disappeared from the movement as a motivator, the movement went off
00:24:17.240
kilter, like it no longer had its ballast, and then began to spin out into weird, culty side
00:24:23.720
projects, which, we would argue, is what a lot of the more extreme AI safety stuff has become.
00:24:27.880
Obviously you don't agree, since you are concerned about AI safety. And as a result,
00:24:32.520
there's a lot of intellectually active people who are like, I want to dedicate my life to making a
00:24:37.720
the world a better place. But I don't feel that will be achieved by going through the mainstream EA
00:24:45.400
organizations. And as a result, you are creating an alternative, which is prestigious, which is able
00:24:52.680
to host conferences that get people as diverse as us and Eliezer and Scott Alexander. And I think
00:25:01.080
like Richard Hanania or something like that in a recent one. You are getting the whole spectrum
00:25:06.600
here, right? Which creates what otherwise doesn't exist, which is a mainstream nonprofit fund that is
00:25:16.600
not ideologically captured for the intellectually active in our society. And so I don't actually
00:25:22.680
think that you're like, however your fund works, of course, it would need to work through some sort of
00:25:27.720
selective, weird mechanism, or it wouldn't appeal to this crowd. But I think that really more what
00:25:33.320
you're capturing here is just a hole in the market right now that was created by the current situation.
00:25:39.640
You could say, why don't you guys at the Pronatalist Foundation fill that hole? It's because we are too
00:25:44.120
explicitly right leaning for somebody who wants to stay vanilla to donate to. That's why yours doesn't
00:25:51.880
directly compete with ours either. I hadn't quite put it in those words before, but I think, like,
00:25:58.120
the way you just said things right now, like Malcolm, it all fits into place. I do think
00:26:01.640
there's a lot of things. I think I'd probably disagree with you on the like causes of like
00:26:06.120
why the EA movement is the way it is. But I like roughly agree with your like assessment. It is
00:26:09.480
the case that like, I think there's lots of like really smart, agentic, like cool people who are
00:26:13.320
just looking at what the EA movement is right now and being like, that's not exactly the place for me.
00:26:16.920
And I would very much like Manifund to be the place where, like, they come and try to work on the...
00:26:20.920
Well, I like that too. It seems like one of the most common elements of someone who's
00:26:26.440
associated with effective altruism is that they first very vehemently insist that they're not
00:26:30.600
effective altruists. And so there's this kind of need for a community and affiliation, but there's
00:26:37.080
not yet one that seems to adequately represent things. And what I like about Manifold-associated
00:26:48.600
projects and events and things is that they're very much independently driven.
00:26:48.600
They're in this case, we're looking at largely crowdsourced grants, which is really cool.
00:26:53.960
And I know we have to wrap up soon, but I did want to ask how the mechanics work. So if
00:26:59.400
someone is interested in once this goes live, and I'd love a timeline from you on that too,
00:27:03.880
if possible, putting their project on this, how does it work? Is there a threshold that has to be
00:27:09.400
met before they could get a grant? And do you think it's going to end up just being kind of an AI
00:27:14.520
safety platform? Because that's something I worry about with all EA adjacent grant making platforms.
00:27:20.040
Absolutely. So to answer your question one, it's already live and it's actually been live for a
00:27:23.800
year and a half. We've already moved about a couple of million dollars, I think about
00:27:28.680
two million dollars worth of like grants to date. How it works, the most basic version is anyone can
00:27:34.360
come on and submit an application for like any project that they want to. We have a lot of AI safety
00:27:39.880
projects, but we're interested in variety. For example, Lumina: we were one
00:27:44.600
of the earliest funders for Lumina. And that was as a result of them applying on our platform.
00:27:49.160
And I think that's actually how Aaron... actually, that's not true. Aaron was originally an investor
00:27:53.480
in Manifold as well. So sometimes, like, people who are interested in similar things
00:27:57.400
get to know each other already, but anyways, we're interested in finding things like Lumina
00:28:00.680
probiotics and a lot of other cool, like, techie things. I would love to find somewhat more
00:28:08.280
pronatalist initiatives. They don't really apply to Manifund. So hopefully some of your listeners
00:28:08.280
will look at this and think, oh, so you can go to manifund.org, M-A-N-I-F-U-N-D.org,
00:28:12.920
to check it out, look at some of the existing applications and apply for funding. So again,
00:28:17.800
go for it. I was going to say, there were actually a number of projects that got funded recently
00:28:21.480
and they were all out of the Scott Alexander Fund. That's right. And that was a little bit
00:28:26.440
different. We have two standard grant types. One is direct grants, which is the more typical 501(c)(3)
00:28:30.440
funding. Another is impact certificates. Impact certificates, I'm not sure if you're familiar
00:28:34.280
with, or your audience is. They're like equity for charitable projects. A very rough sketch is that
00:28:38.840
somebody might put up like a large prize, let's say a hundred thousand dollars for the best
00:28:42.360
pronatalist projects, maybe a pronatalist impact prize, to be awarded at the
00:28:45.640
end of 2024. And then in the meantime, somebody can put up a project saying, I'm going to work on this
00:28:50.040
project. I think it has a good chance of winning. If you think, if you agree, my project has a good
00:28:53.960
chance of winning, you can invest. I will sell you 20% of my project for a thousand dollars. And then if I
00:28:58.680
end up winning $5,000 or more, your investment will have made a return. That's
00:29:02.920
roughly an impact certificate. It separates out the, like, assessing whether the project was
00:29:07.560
good, which is done at the end of the year, from the upfront funding, which is the job of something like
00:29:11.080
an angel investor as opposed to a grant maker. That's a system that we ran for Scott Alexander
00:29:14.840
on our platform. And it's like a concept that we're very excited about, which I don't think
00:29:19.160
I'll go into in more detail now. But that's an impact certificate. I love that. That's super cool.
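For the curious, the payoff arithmetic Austin sketches can be written out in a few lines of Python. This is just an illustration of the hypothetical numbers from the conversation (sell 20% for $1,000, a possible $5,000+ prize); the function name is ours, not anything from Manifund:

```python
def investor_payout(prize: float, equity: float, invested: float) -> float:
    """Profit for an impact-certificate investor.

    The investor buys `equity` (a fraction of the project) for `invested`
    dollars; if the project later wins `prize` dollars, the investor's
    share is equity * prize, so profit is that share minus the cost.
    """
    return equity * prize - invested

# Austin's example: sell 20% of the project for $1,000. The investor
# breaks even if the project wins $5,000, and profits above that.
print(investor_payout(prize=5_000, equity=0.20, invested=1_000))    # 0.0 (break-even)
print(investor_payout(prize=100_000, equity=0.20, invested=1_000))  # 19000.0
```

The point of the mechanism, as Austin says, is that the end-of-year judge only has to assess impact, while investors take on the angel-style job of picking projects up front.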
00:29:24.280
Yeah, that is super cool. And if you are looking for people to be like experts, like you said,
00:29:28.200
you had the AI experts. If you want pronatalist people to help distribute dollars on the platform,
00:29:32.680
let us know. And we're happy to take those roles. Absolutely.
00:29:37.000
Oh, this is really cool. Okay. Where should people go if they want to register for Manifest
00:29:42.200
while there's still time? There is still time. Manifest.is is the website. You can go there,
00:29:47.960
check out who's coming to speak. We'll have a schedule up, like probably in the next couple of
00:29:51.800
weeks. You can buy your tickets now and save a hundred dollars. It'll be a great time.
00:29:55.160
And Manifest is not just the two and a half day conference that was last year. Beyond
00:29:58.840
the two and a half day conference, there's also a summer camp: a week of
00:30:02.760
people like us just hanging out, working by day and talking with each other at night.
00:30:06.600
And then it starts with a two and a half day unconference that's mostly just, like,
00:30:10.360
online bloggers. I don't know if you two are going to be attending the rest of those.
00:30:13.720
LessOnline, right? Yeah. Alas, no. It's hard to get away from the kids for that long.
00:30:19.320
I think we're offering childcare. It's a thing for LessOnline, the summer camp, and Manifest itself.
00:30:26.920
So in case you two want to bring your kids, I would love to meet your kids, but I don't know
00:30:30.760
if that's, like, practical for you two. We're getting close to the stage where
00:30:34.760
we might start bringing our oldest to things, but I don't think we're going to this time,
00:30:37.960
probably next year, because I do want them, as they are developing, to be able to go.
00:30:43.240
I actually think it'd be really fun, like when they hit seven or so and they can talk,
00:30:48.520
for them to be giving speeches at some events like this. Um, it would be not like a speech,
00:30:54.200
but like an audience questioning, because I think being able to interact with the mind of a young
00:30:59.480
person growing up in the next generation can provide people with a perspective that cannot
00:31:05.320
be easily gained from other avenues in our society right now.
00:31:10.120
Yeah. There's actually a conference or retreat series called Renaissance weekend that was first
00:31:14.200
popularized because the Clintons attended it back in, like, the Clinton era. And
00:31:20.040
it was always very pronatalist, very family oriented, and it was
00:31:24.040
invite only, focused more on elite intellectuals. And they had a kid portion of
00:31:29.880
it where they did have childcare. They'd have a Camp Renaissance thing that went on, and luminaries
00:31:35.800
from the main events would come out and do sessions for the kids, but kids were also
00:31:39.960
encouraged to be involved, not just attending.
00:31:42.920
One of the most popular things was all the kids would get together and they would, uh,
00:31:47.960
prep for a speech they were going to give as a group. And then they would be grilled by
00:31:51.960
famous people. And actually, uh, my responses were so good that Bill Clinton did an entire speech
00:31:57.640
just about me at one of the Renaissance Weekends, saying that if this is what the future is going to
00:32:02.440
be, our country is going to be great, if we have more people like this. But the...
00:32:08.680
The important thing about this is that it helps kids like Malcolm normalize that they can aspire
00:32:14.680
to be like presidents. They can aspire to be like these leaders, these people who start these amazing
00:32:19.720
startups and who do these incredible technical things and who are in biotech and AI and all these
00:32:24.040
other. So I really love that you already are doing childcare. I love that you're doing that. And I strongly
00:32:28.440
encourage you to continue because you will produce kids who end up like Malcolm, who have the balls
00:32:33.560
to do stuff that other people aren't willing to do. It has been fantastic catching up with you.
00:32:40.680
And I hope we draw some traffic to this event and we'll meet some fans when we go this time.
00:32:45.320
Yeah, exactly. So everyone, please remember: if you don't know about
00:32:49.720
these things already, you basically now have three amazing things to check out, not just Manifest, which is happening
00:32:53.960
June 7th to June 9th, 2024. So hopefully you see us there in Berkeley. That's in the Bay Area.
00:33:04.760
Not a group house. It's on this, like, sprawling campus. It's absolutely gorgeous and super modern
00:33:08.760
with crazy cool furniture and all these cool niches.
00:33:14.920
This one, yeah. Lighthaven campus is amazing. And we're so grateful to be able to host it there again.
00:33:19.880
Yeah, I've never really seen a place quite like Lighthaven before. It's like Never Neverland,
00:33:26.360
You never socialized in the Bay Area. I do not think this is true.
00:33:29.080
Yeah, I don't think that's true. I grew up in the Bay Area, Malcolm. I don't know what you're...
00:33:31.960
You weren't invited to all the crazy rationalist parties, Simone.
00:33:35.960
I missed some kind of golden age that only Malcolm had access to.
00:33:40.040
This is true though. I actually did. So that is one thing. The other is Manifold,
00:33:43.480
obviously an incredible prediction market, and the more fun one, because you actually can do the fun stuff.
00:33:47.960
And I can say that, playing around with the other prediction markets, I was
00:33:51.320
turned off by the fact that a lot of fun, weird stuff couldn't be there and that it felt really
00:33:55.640
structured and more gated. Whereas Manifold feels more open and fun and playful. And then,
00:34:00.120
of course, also Manifund. A great place to both look at cool philanthropic projects,
00:34:05.320
but also contribute to them and maybe get contributed to. So Austin, thank you so much for
00:34:10.040
your work, all the initiative you're taking. Can't wait to see you in a couple of weeks. And yeah,
00:34:13.640
can't wait to see hopefully some of the people following this podcast as well.
00:34:16.520
Thank you so much for having me on. It was great getting to chat with you. I'll see you at Manifest.